
THE COOK REPORT ON INTERNET PROTOCOL

JULY 2010

How Dr. David Zakim Is Developing His Clinical Expert Operating System

Introducing a Global Internet and Fourth Paradigm Based Digital Framework
for the Practice of Medicine and Conduct of Clinical Research

Volume XIX, No. 4


July 2010
ISSN 1071-6327


Contents

How Dr. David Zakim Is Developing His Clinical Expert Operating System

Editor's Introduction   p. 5
Why Computers Are Necessary to Economically Sustainable and High Quality Medical Practice   p. 6
Medicine and Science: Two Different Cultures   p. 8
How Do You Manage the Knowledge Base?   p. 10
Investment via a Foundation of Social Conscience   p. 13
An Expert System to Address the Chronic Illnesses of Adults   p. 16
Building Decision Trees for Collection of Patient History   p. 18
Large Databases Likely Will Offer the Key to What Should be Measured   p. 21
Goals for the Next Five Years   p. 24

Part 2  April 2, 2010

We Are Trying to Show What Can Actually Be Done   p. 27
The Platform Emulates Thinking of Panels of Experts


Viewing the Patient as a Living Complex System   p. 30
The Structure of the Knowledge System   p. 32
Strategy by Which to Seed and Grow the Idea   p. 33
Role of the “Live” Physician   p. 35
Whither the Physician?   p. 36
Trials of CLEOS in Germany in 2010   p. 37

Appendix:
Underutilization of information and knowledge in everyday medical practice:
Evaluation of a computer-based solution   p. 40

Executive Summary

The Progress of the Medical Record from a Mainframe Hierarchy to
Decentralized, Edge-Based Intelligence

Larry Weed and the Coupler - The Path from Promis to the Personal Computer
and Increasing Decentralization   p. 45
How to Structure and Formalize the Knowledge Base in a Pre-Internet Era   p. 46
David Zakim and the Transition to CLEOS as a Next Generation Internet
Knowledge Commons Based System   p. 48
Afterword   p. 51


Symposium Discussion, April-May 2010

The New Internet Architecture: Cloud and Content Providers Versus the
Incumbent's Last Mile   p. 53
An Abbreviated Summary of Bill St Arnaud's Architecture Paper   p. 54
How ACIs Benefit the Rest of the Internet Ecosystem   p. 56
ACI, Network Neutrality Challenges, and Last Mile Networks   p. 57
Some State Level Regulatory Implications   p. 58
Conclusion   p. 59
Postscript   p. 60

Putting Duct Tape on Telecom Regulation

The Obama FCC Speaks its Mind   p. 61
How Does this Translate 18 months after Obama?   p. 62
The Unbearable Lightness of Being or Title II 'lite'   p. 64
Executive Summary   p. 77
Future Issues   p. 78


Editor's Introduction
At the beginning of December 2009, after about 15 minutes of conversation with David Zakim, I knew that I was talking to a man who was determined to transform the practice of medicine on a global basis. Like Larry Weed 40 years earlier, David realizes that the prevailing paradigm of medical practice is fundamentally broken because the knowledge base of medicine has grown several orders of magnitude too large for any human mind to parse unassisted. Having written about Dr. Weed's work in 1978 and having revisited it last summer in the November 2009 COOK Report, I wanted to understand how Dr. Zakim was carrying on what Larry began so long ago.

This extended piece is the result of more than three hours of interviews with David. After a few minutes, when the realization hits that this man really envisions the creation of an expert system that integrates all medical knowledge into a patient-centered interface, the mind, if receptive, gasps: how is it possible? The task is too enormous and too complex. Science fiction. I hope that this essay offers satisfactory answers to these questions.

David has been working for ten years on his project. Given the complexity and magnitude, as one
would expect, he has not found the time to stop and try to explain it to a broader audience. This is
the goal that I have set in what follows. I have pushed, probed, queried and read to the point
where I believe this interview coupled with extensive excerpts from two of David’s papers will en-
able the interested non-medical policy person to grasp the very large policy significance of what David
is doing.

A paradigm-shifting work of this scope depends on sound science and technology but, without
equally sound political and economic judgment, it will fail. It has been encouraging to me to see
how David understands the overwhelming economic forces that will either force medicine down the
CLEOS path or into draconian rationing to allow nations to avoid being shoved into bankruptcy by
the current broken system.

With the help of Eva Waskell, whom I must also thank for the introduction to David, I fired off the most difficult political, policy, and economic questions I could think of. As readers will see, he handled them all with equanimity. Certainly a grand strategy is called for because, in the current climate that is dominated by finance, there will be many corporations that would like to privatize
what David is doing. Such would be, I believe, a disaster of the first order. It is therefore fully in
keeping with the 21st century world of the internet that he has established the Institute of Digital
Medicine to act as legal owner and protector of the medical intellectual property and preserve it in
an open knowledge commons.

Our interview was conducted on January 24, 2010, with a follow-up on April 2, 2010. I am also including extensive excerpts from David's October 2009 technical paper and an earlier 2008 report, so that readers may wade into both as I have and gain a better understanding of how the software behind CLEOS works.


COOK Report: Your medical background is apparent from your CV [soon to be available at a new website, www.insdm.org], but could you start by giving me some of the highlights of what happened in your earlier career that started you down the road of examining the obvious failures of medicine? Your material is really wonderfully succinct in stating these failures. But what happened that compelled you toward the investigation of adopting computers to assist the human brain in medical practice?

Why Computers Are Necessary to Economically Sustainable and High Quality Medical Practice
David Zakim: That was something I had never thought about until my formal academic career
was very close to its end. As you can see from my CV, my interest in what I’m doing now occurred
very late in my medical career. I went to medical school thinking, as most students who go to
medical school think, that they’re going to come out the other end a physician. While you’re in
medical school, you have very little understanding of what that really means. I had friends in medi-
cal school whose fathers were physicians. And they had no better idea than I what to expect, something that became clear after all of us had graduated and were house officers in training.
I know from talking to them that they were as surprised by what transpired as I had been. When
you enter medical school, it’s very difficult to understand what your life is going to be like as a phy-
sician.

But early in my training, I got interested in laboratory-based approaches to medical science. I think
that principally occurred because of the people who had been my mentors when I was in medical
school and when I was a young house officer in training. For example, Prof. C. Y. Kao, who was the
head of the lab where I worked during summers in medical school and through most of my senior
year, told me, when I was a sophomore in medical school, to go to the library every day and look at
the journals that came in that day. These were laid out on a table. I should sit down and read
those articles I thought were interesting.

He also told me I should start by reading the work of Beadle and Tatum, who had just shared the Nobel Prize in Physiology or Medicine for their work on the relationship between genes and enzymes.
nobelprize.org/nobel_prizes/medicine/laureates/1958/

I did what he told me. He was someone I admired and we remained life-long friends until he died,
unfortunately at an early age. So I went to the library every day for the next 40 years and read
what was interesting to me in the journals that came in that day. I only stopped doing this when I
became emeritus from Cornell because I didn’t have a library that was available. Of course some of
the journals can now be read in their totality on the Internet.

At any rate, I read Beadle and Tatum and got very, very interested in biochemistry. I found it ex-
tremely interesting, perhaps because I started my university life with the idea I would be a chem-
ist. I majored in chemistry but switched to medical school because I didn't understand
quantum mechanics. I thought that if I didn’t understand quantum mechanics, there’s no way I can
be a physical chemist. By the way, I didn’t understand that what I failed to understand in quantum
mechanics was what everyone failed to understand in quantum mechanics. I was looking for an in-
tellectual resolution to problems which fundamentally didn’t have one.


COOK Report: At Columbia University in 1961, I thought I would be a chemistry major. But unlike
yourself, I couldn’t do the basic calculus. So I went into Russian history and politics instead.

Zakim: I had no problems with the math, and I had loved chemistry since I had taken it in high
school. I had a superb teacher and I really liked the subject. And the biochemistry really captivated
me.

What started me on a research career was that I was a first-year assistant resident, which in those
days was your second year of training in internal medicine, and I saw a patient with an enormous
liver. We presented this case to the attending Prof. Dhod Kowlessar, who was a very bright young
guy. I thought the patient had hepatoma. I can remember Dhod telling me: “Zakim, what’s wrong
with you? This guy has an alcoholic fatty liver. He’s a drunk and he’s lying through his teeth about
his alcohol consumption.” I was very embarrassed by that because a huge alcoholic fatty liver was
a condition that I didn’t know about.

As a result, I went to the library and read about it and came back to Dhod and said to him, “The
biochemistry and the medical research on the etiology of alcoholic fatty liver are not consistent.
What the medical literature says is that the cause of this condition cannot be so.” “That’s very in-
teresting,” Dhod said. “I just saw a poster on a bulletin board that the licensed beverage industry is
interested to fund research on the metabolism of ethanol. Why don’t you see if they’ll give you
some money?”

COOK Report: What did you mean when you said that this condition cannot be so?

Zakim: It means that the biochemical mechanism for the cause of alcoholic fatty liver as reported
in the medical literature could not be valid because it was incompatible with what was known about
the regulation of the synthesis of fatty acids at the time—which remains fundamentally true.

I wrote a one or two-page grant application and the licensed beverage people sent back a check for
$12,000 I think, which in 1961 or 1962 was a significant amount of money. It was enough to buy
some small laboratory items and get some laboratory space. That set me off on the rest of my aca-
demic career. Initially I did biochemistry with a focus on metabolic pathways. Then I moved to en-
zymology and then to the biophysics of lipid-protein interactions within the plane of membranes.
So I moved away from everyday clinical medicine. But my appointments at the University of Cali-
fornia and at Cornell Medical College were always primarily in the Department of Medicine. And I
was always involved in the teaching of students and residents at both places, as well as fellows in
gastroenterology because I also had specialty training in gastroenterology.

So I always had some interest in medicine, and in the last two or three years of my formal academic ca-
reer I came to see that it was essentially futile to try to teach medicine because there was too
much to teach. And not only was there too much to teach, there was no way a student could learn
what the faculty, I think glibly, expected the students at all levels to learn, especially at the medical
school level and less so at the residency level where, not unexpectedly, the residents were clinically more sophisticated. But at the medical school level the students would always ask, “What will be on the test?” by which they meant, “What should I learn? What is it you want me to know?”
Now this is a very legitimate question for the students to ask. But generally the faculty would be in
an uproar over that question. They would say, “I’m not going to tell you. You should be able to fig-
ure it out.”

COOK Report: Isn't the issue here not really one of what you need to learn, but, on a more basic level, of how to cope with what is thrown at you? Yet the implicit assumption was that you also need to learn all of the knowledge—which is impossible.

Zakim: That’s true.

COOK Report: But Larry Weed would tell them, and what you would also tell them I imagine given
your life’s experience, is that you need to learn how to approach matching the conditions in the da-
tabase of the patient when he walks in the door to the knowledge contained in the medical litera-
ture.

Let me ask you about this. Weed was in the chemistry laboratory probably ten or twelve years be-
fore you were. He was doing chemistry in the morning and medicine in the afternoon if I recall cor-
rectly, and he reported that he noticed this huge difference in the way the chemists or the scien-
tists approached their methodology and record keeping and the way physicians did it. It was the
problem versus the source orientation. How did this apply to your experience or did it?

Zakim: I really didn’t experience the two worlds in the way you described that Weed did. I didn’t
work in the lab in the morning and go to the clinic in the afternoon. I worked in the lab in the
morning, afternoon and evening.

COOK Report: If you had been going to the clinic, do you think you would have had the experi-
ence that Weed had?

Zakim: I don’t think so. I had very good mentors in science throughout my career. A lot of people
viewed me as a self-made scientist. No one is a self-made anything. At Cornell I had a mentor by
the name of Abe Mazur who had a lab at Cornell and was very, very helpful to me when I got
started with the work on alcohol metabolism. Abe was professor of chemistry at City College.
When I came to the University of California, I had a very senior person in the department of bio-
chemistry who was very interested in me. His name was Tom Singer. I learned enormously from
him about the intellectual processes of being a scientist. What those intellectual processes come
down to is if you want to do science, you have to work on a tractable problem. You also have to
formulate a problem that’s tractable and a set of hypotheses that can be proved or disproved with
experimental approaches. Singer was a superb teacher in clarifying one’s thoughts about how you
did that. So I became embedded in that sort of culture of laboratory science as contrasted with clini-
cal medicine.

Medicine and Science: Two Different Cultures

I realized that medicine was a different culture. When the physician sees a patient, the physician,
myself included, doesn't take apart the clinical problem in the way I would take apart a problem in
a very small piece of the world of metabolic pathways. The physician tries to match the facts of a
patient’s, for lack of a better word, case, with what he or she learned and can remember from the
world’s literature.

COOK Report: However, one additional reason you could do what you did in biochemistry is be-
cause you can work with atoms and molecules. The physician, of course, has to work with the en-
tire human being, which sets up a different battlefield in a sense, right?

Zakim: Yes, for certain. What the physician cannot do is control anything. The purpose of the sci-
entist is to control everything but one variable. Over the course of my career in science, what I
strove to do was to work on increasingly simpler and simpler systems. Ultimately, I was working on
systems with one or two components.

COOK Report: That’s interesting. I would have almost expected you to say the opposite, i.e. that
you went from simple systems to more complex ones.

Zakim: No. The fact of the matter is that if you really want to understand the complexity of physi-
cal systems, you’ve got to work on the most simple possible physical system. In medicine, not only
are you working in a system with enormous complexity, you’re working in a layered system of mul-
tiple systems of multiple internal complexities, all of which interact over the layers. You cannot do
experimental science in that setting. Now there is clinical research, but I don’t consider that ex-
perimental science. It’s a different kind of discipline where you sweep away most of the complexity
and you try to reduce it to simple sets of issues. So again, you try to isolate one variable in a clini-
cal experiment knowing that you have a lot of things underneath that you don’t know about.

But I was interested in physical science and experimental physical science where you have to work
increasingly on simpler and simpler systems, or on systems that are more simple because you un-
derstand each of the components and their interactions in a precise way. So my life was functioning
on two levels, the complex level and the more simple intellectual level. I didn’t see the intellectual
richness that I was experiencing as an experimental scientist in a physical science as being possible
to achieve in a more clinical discipline. So I stayed doing simpler things. And I think this is true of
most people educated in medical school, who move away from clinical medicine to laboratory work.

COOK Report: What do you mean by intellectual richness? Do you mean that it’s the satisfaction
of achieving a kind of profound understanding of how hugely important basic mechanisms really
are?

Zakim: No. It’s coming to appreciate that your experience of the world is through your mind. I
think that for both theoretical and experimental scientists in the physical sciences that’s the hook.
You come to see, “I’m experiencing the world through my mind!” And that’s a tremendously excit-
ing and rich feeling. And I didn’t see the possibility of that kind of experience in a more clinical do-
main.

But I came to a point where I didn’t want to keep doing, after 40 years, the same kind of things I
had been doing. I had started working in a laboratory in 1958 and by the year 2000 I didn’t want
to work in a laboratory any more. Plus, I was very tired of the chase for money to support labora-
tory work. So in my inner turmoil regarding how to continue with life in a satisfying way intellectu-
ally, I was in a Cornell bookstore one day. Standing in front of me in the checkout line was a stu-
dent with a very large basic tome, a textbook on medicine by either Harrison or Cecil. The student
held up the two or three thousand-page book and asked me, “Which part of the book do I have to
know?” “You have to know it all,” I replied. I could see the blood drain from the student’s face. I
didn’t know in what way he wanted to engage me but he didn’t like my answer. So I told him to
look at it this way. A patient walks into your office. Which page of the book is your patient on? That
was the end of the interaction.

So I patted myself on the back and thought that that was a pretty insightful retort to the student’s
question. But wait a minute! There’s no way he can know everything in the book. Plus, the book is
a completely inadequate set of ideas and knowledge about medicine for really understanding each
of the problems discussed in the book. These books are really meant more as an introduction to
each of the areas. And they’re very dense. So while the student’s question was legitimate, what I
told the student to do was impossible. And then I got started asking, how do you manage this
knowledge base?

How Do You Manage the Knowledge Base?

COOK Report: So this was a thought process unleashed by that encounter with the student. And
when did this process begin? In the following days? Or weeks?

Zakim: Sometime in that frame. And then it occurred to me that when I was a house officer, I read
15 to 20 medical journals on a daily basis, in addition to going to the library. I used to subscribe to
about 10 journals. It was the same in my peer group. And as a ward attending, I was very dis-
mayed by the fact that the house officers, i.e. the residents, didn’t read. They relied on hearsay to
learn. I was very concerned about that but it’s not unrealistic because no matter what you read,
you don’t read enough. You get overwhelmed. So that was also on my mind.

Then by chance I saw a lawyer who was no longer practicing law and was at a software development company, developing software mostly for the management of ideas related to patent law. He
had an expert system that lawyers or inventors or entrepreneurs or businesses would use to input
ideas and the expert system would analyze them. I don’t know what knowledge base was behind
this system. But in the spring of 2000 he showed me a demo of how to buy a bicycle. I was not
very sophisticated when it came to the use of computers. I used one to type basically. I didn’t
know there were such programs as expert systems. But I said to him, “Fred, if I had software like
this, I could formalize all of medical knowledge so a machine could read it.”

COOK Report: If I understand it correctly, the patent literature in its diffuse complexity is vaguely
analogous to the medical literature. And what the patent lawyer needs to do is find the right series
of needles in the haystack, right?

Zakim: That’s partly it but medicine is far more complex. But at any rate, Fred said, “That sounds
like a noble purpose. If you have a laptop, I’ll give you a free license to use the software. But when
you use it, let me just give you a word of advice. Start on something about which you know a great
deal. Don’t start on something in which your knowledge is not in-depth. Pick a field of medicine in
which you have a great depth of knowledge. Besides, I'll give you the name of a software pro-
grammer you can call if you get stuck.” So I went to Fred’s office the following day, he gave me the
program and that’s how I got started.

COOK Report: This is an expert system for patent law and it sounds like you’re saying there was a
way to apply it to medicine.

Zakim: No. What he was trying to sell was the barebones software. He was building the expert
system part. The user could build in any knowledge content. His customer base, as I recall, was
related to patent law. So he gave me the barebones expert system, which was designed to enable
the user to build trees, to put in questions and so on.

COOK Report: What a wonderful circumstance, and fortuitous coincidence.

Zakim: It was providential because I was introduced to this fellow by a neighbor who was also a
lawyer in a different software business. He called me up and said to come down to his office be-
cause he had a friend there with a really interesting piece of software. “You’ll really, really be inter-
ested in this,” he said.

COOK Report: And where were you physically at the time?

Zakim: I was still in Manhattan at Cornell. It must have been the spring of 2000. I knew I was go-
ing to become emeritus at the end of the school year that year. But when my neighbor first invited
me down to his office, I kept telling him that I wasn’t going to do it. I wasn’t interested. But he
kept arguing with me until I finally went. That’s the part that I find really providential because my
neighbor had no idea of what I was trying to wrap my mind around.

When I saw the software, I immediately knew that the place to start to build an expert system that
has an application to medicine was the medical history because medical history remains the basis
for achieving quality outcomes in medical care.

COOK Report: I totally understand that. But what was it about the software he had that allowed
you to make the connection with patient history?

Zakim: No. What the software made the connection to was the possibility of formalizing disparate
knowledge using software. This software had decision trees and it could draw inferences. I had no
idea one could do that with software. But I eventually built a small demo that dealt with a very
small part of liver disease, hepatitis C. It was 70 or 80 questions. My son Tom, who was fairly
knowledgeable about computing, said, “Why are you building a demo? Why don't you build a real
system?” This query became an impetus to think in larger terms.

But if I were going to build a real system, I realized that whatever piece I undertook I would never
finish because any part of medicine is too huge and it changes continuously. Consequently, I would
never get to the end of it. Since I can’t get to the end of it and I wanted to do something that’s
valuable, I would have to conceive of what I was doing as something that produces a useful prod-
uct for a useful set of rules. Going down this path, I think that people must understand that while what they produce is incomplete, it nevertheless has value because it can be applied to the so-
lution of problems.


I knew I wanted to begin with history taking and I knew it was an endeavor that would take an
enormous effort by a large number of people. I had only myself. And I wanted to start on some-
thing that would have value. Therefore I started with an emphasis on chronic diseases that were poorly managed, that had high morbidity and high premature mortality, and that could be managed more effectively if the existing knowledge base applicable to them were used by
physicians in day-to-day practice. So that became the initial thrust of what I wanted to do.

I had two family friends who put up a significant amount of money to do this as a commercial ac-
tivity. I saw this in the context of a commercial activity, not in the context of an academic activity
because I did not think, and I think I was absolutely right about this, that this would be an activity
that would be welcomed inside the academy of medicine. And I did not see it as an activity that the
National Institutes of Health (NIH) would fund.

COOK Report: I have an historical question. This was 30 years after NIH funded Weed for PROMIS
in the 1970s I believe. So while your system is not precisely the same, in general was it due to a
change in the environment after 30 years that NIH wouldn’t fund it?

Zakim: It was my judgment that I would not get funded by NIH—if for no other reason, by the
way, than my age. There’s a very strong age bias in NIH funding. And not only was it that someone
my age was starting in a completely new career, but I also had no bona fides in this area. It would
have been very difficult. And you can’t achieve anything within the academy unless you have inde-
pendent funding. In fact, they want you to pay rent for your space. Therefore, I thought it had to be
done as a commercial entity. We had enough money and I had some collaborators and employees.
I moved back to Mill Valley, California. And I was able to pay some people I contacted within the
academic community to write specific expert programs in some of the areas in which I wanted em-
phasis, like asthma, diabetes, coronary disease, management of lipoproteins, and depression. The
expert system program I had been using was inadequate for what I needed and the guy who origi-
nally gave the software to me wouldn’t allow modifications to his program to be made. But I had
now specified a set of things that I wanted that I thought I needed from a medical perspective.

COOK Report: So it was a source code issue?

Zakim: Yes, we wanted to change the source code, i.e. the instructions that tell the computer what
to do, that determine the basic interface and that determine the limits of what can be done. But we
couldn’t. So we went out and had a company in Foster City, California named Stottler Henke Asso-
ciates, Inc. (SHAI) build us an expert system that met my needs as I saw them then. The amount
of money we had was insufficient. I think that’s always the case. But soon after we got started, the
Internet software bubble imploded and that meant that more money might not be available.

COOK Report: Can you tell me something about the money that your friends contributed? Was it
money related to medical science? Or the arts? What?

Zakim: They came from the business world.


COOK Report: So they were just successful businessmen who had capital to invest and were in-
terested in putting it to good use?

Investment via a Foundation of Social Conscience

Zakim: Right. Both of them had, and still have, a very strong social conscience. One especially
invested only in socially useful endeavors. So we started and ran out of money but I got addicted
to it. I was convinced that this was the future of medicine, i.e. that if one is going to control the
issue of equity and the cost of health care, especially in the Western world, the approach has got to
be through improving the quality of care because the cost of healthcare and the lack of equity in
access to healthcare, which is a cost issue, are driven by poor quality outcomes. What I mean by
poor quality outcomes is outcomes that don’t achieve for the patient what it is possible to achieve if
the patient benefited from the fruit of existing medical knowledge.

Also in this light, let me say that I started from the beginning with the idea that a computer had to
replace human beings in the cognitive aspects of healthcare.

COOK Report: So the whole issue is that you have the physician and his practice, you have the
vast library of medical knowledge, and the only way this can be applied in a way that is useful is if
the physician has the ability to take a very good medical history. But that’s not enough. Once that
history has been taken, you have to have some means of guiding the physician over a period of
time in the interaction with the patient in such a way that the patient is connected with the medical
knowledge. In a sense, the greatest skill of the physician is to think about how to apply these tools
in the most logical and rigorous way as a problem-solver.

Zakim: You’re exactly correct. The diagnosis is the problem. In fact, I think it was Larry Weed who
created the term “problem-oriented record.” When I graduated from medical school in 1961, we
would write a note, which would be our “impression,” and then we’d write a “commentary,” which
was our analysis of the problem. Weed changed all of that. Our “impression” was really the active
problem list. It served to focus the physician on specific problem issues, which should then have
the effect of the physician applying the knowledge base in this specific problem to the intricacies
and the uniqueness of the problem as it exists in patient A, patient B and patient C. That is exactly
how medicine ought to be practiced. But medicine cannot be practiced that way by a human being
for two reasons. First, no human being can learn and remember enough of the knowledge base and
carry it around with them to use it. The alternative is that the human being says, “I acknowledge I
cannot learn it and remember it to use it. But I’ll define the problem and then I’ll go look up what I
need to know for this patient exactly.”

COOK Report: But the whole paradigm of medical education and medical training has been and
still is based on the false assumption that people somehow can learn all of this and carry it
around in their heads, right?

Zakim: Absolutely. Ask any doctor. “I can handle any problem that walks in the door.” But physi-
cians are trying to do something that is humanly impossible. About the time I got started on this in
the spring of 2000, I read an article in Endocrinology Review on expert systems that could be used
to help physicians manage diagnosis and management issues. The article said that at the time
there were 1,500 extant expert systems, each for a different specific area of medicine. The prob-
lem with these systems was that no one used them. They existed but no one used them.

COOK Report: But even if people did use them, you would need to standardize them so they could
interoperate among themselves.

Zakim: That’s true, too. So I could see that there were unsolvable issues with the approach to ex-
pert systems as they were conceived. Consequently, I knew from the beginning that I had to be
involved with the system, which would be built from the bottom up with a long-term perspective
that anything that was built would fit into a system that would manage the totality of problems,
and would not require at any point any input from physicians except for the physical examination.

COOK Report: That is a hugely profound statement you just made.

Zakim: But that’s the only way you’re going to solve the problem. I’m convinced that’s absolutely
true, more so now than I was 10 years ago. But I started with that in mind. And I had an enor-
mous advantage—I had no idea how big an undertaking that was. Because if I did, I would not
have started. But I’ve had that advantage in everything I’ve done in my life. I was just rash. I dove
in and did things and didn’t realize what an enormous mistake I’d made by getting involved until it
was too late.

I know that some of the other people who have tried to do some of the things that the current sys-
tem tries to do started with a different long-term outlook, and they built something of very limited
value. To achieve the long-term goal that I have in mind, they would have to start from zero. I do
not. I’m building it in such a way that more knowledge and more software functionality can be
added to it. And everything in the system is useful because it started with the goal that it would
eventually become a system that needs to interact only with the patient, and that needs to interact
with laboratories only online by knowing where to go get laboratory data for patient X or patient Y,
and it needs to interact with physicians only to the extent that a physical examination is required. I
firmly believe, however, that one does not always need physicians to do physical examinations.

In a lot of physical examinations, the patient can be instructed to self-examine a lot of simple
things that are medically very significant. The program already has some of these things built in.
But I think that with a software system, you can have nurses who care do a very fine physical ex-
amination. They just have to be instructed regarding what to look for and how to enter it. I’ve built
a software system that I haven’t implemented yet because I don’t have the money, and the soft-
ware contains instructions or the formalization of the medical knowledge. In other words, I have
formalized a textbook of physical diagnosis as software and ultimately we will put this into the sys-
tem. So I know that that can be done.

COOK Report: What you’ve just expressed is really profound! And now I have two questions. One
is regarding Holland and the other is perhaps for the next few minutes of our conversation. To what
extent can you verbally describe how you’re approaching this because you’ve just swallowed the
whale so to speak. The Holland problem is this. I’ve been working with Bob Hertzberger who lives
in Holland and was on the team that discovered the elementary particle the W boson in 1982. The
team leader got a Nobel Prize but went into bioinformatics. Hertzberger has done 5 years' worth of work in creating a Virtual Laboratory for e-Science and I talked with him at length until I under-
stood what he was doing from A to Z. At the end of these conversations he said that he owed a pa-
per to his government funding agency explaining what he had done over the past 5 years, and he
thought that our conversations could actually be turned into that paper and submitted to the gov-
ernment agency. I was flattered, of course, but the point is that a lot of Hertzberger’s stuff is theo-
retical knowledge and experimental science. There’s a book called The Fourth Paradigm. Twenty
years ago at the John von Neumann National Supercomputing Center, computational science was
called the third paradigm. The fourth paradigm is so-called data-intensive science.

Zakim: That book was published by Microsoft. I have it and I did read the part that had to do with
medicine.

COOK Report: Hertzberger had me read the introduction to The Fourth Paradigm as a way to con-
nect to the work he was doing.

Zakim: Was Hertzberger in that book?

COOK Report: I’m not sure, but he personally knows all of these people in the book. What I de-
scribe in the paper I’ve just written with him (and will send you) is the approach he’s taken over
the last five years and the approach that’s continuing in Holland. They have an e-science center
now. It’s e-science middleware—a generic, stack level, open source way of approaching it. He even
explains the details of how he’s structuring it in bioscience, bioinformatics and so on. I think that
part will completely correspond to your interests. These guys in Holland are developing a collabora-
tive, cooperative e-science middleware approach that can be used in bio-banking, astronomy, phys-
ics, chemistry, etc. on these absolutely state-of-the-art computer systems for collaborative work.

Now I’m interested in how you wound up in Stuttgart. One could take into consideration the cliché
about the German personality as being sort of authoritarian, and that may be a total mistake in the
context of where you are. But the Dutch situation is so incredibly profound in the way that nation
handles these problems. It’s just awesome ... Phillips Electronics is now Phillips Medical Electronics
Health Services. And all of their teaching hospitals are on the optical network. The Dutch have the
most advanced optical network in the world. I’ll bet that the hospital in Stuttgart could pretty easily
be connected to that network. All I’m saying is that as a connector I see some real potential oppor-
tunities here.

When we met each other in person for the first time about six weeks ago, I had been in Portland,
Oregon a few weeks earlier where I had done the interviews with these Dutch people. And since
December 6, 2009 I’ve been immersed in this seven days a week. As a result, I now understand it
at a level I didn’t then. My mind sees a major correspondence between your work and what the
Dutch are doing with their knowledge infrastructure. I’ve already told them a little bit about you,
and I’m now even more convinced that you need to meet each other at some point in the not too
distant future.

Let’s get back to the intellectual framework, the analytical concept and the decision-tree structure
that you’re now adopting to your system because here you’re going off into something utterly pro-
found.


An Expert System to Address the Chronic Illnesses of Adults

Zakim: The system we have addresses illness in adults and in particular the chronic illnesses of
adults. So the emphasis is on asthma, diabetes, chronic obstructive pulmonary disease and a vari-
ety of heart diseases. The system we have today interacts directly with the patient to acquire their
life-long medical history up to the point in time when they’re interacting with the program. This is
CLEOS (Clinical Expert Operating System).

CLEOS addresses the acute and chronic manifestations of adult illness, and especially adult ill-
nesses associated with chronic disease. (It is not designed to interact with a patient who has just
suffered trauma. The system will not interact effectively if a patient comes in from a car accident,
for example. The system will not take a history of the car accident.)

In the context of chronic manifestations of adult illness, it addresses the acute complications of
these chronic illnesses and a lot of acute illness that occurs in patients without chronic illness, like
pneumonia and a variety of problems. The system does this by asking the patient what’s wrong.
And it does it by emulating the thinking of a dozen expert physicians with detailed expert knowl-
edge in the different aspects of medicine. And it is integrated as a single system so that it asks the
patient what’s wrong, which the patient answers from a set of menus I’ll talk about in a moment.
On the basis of what the patient answers, the system will then ask the patient any number of ques-
tions required to take a complete history of the acute illness, past medical illnesses, and a review of systems, which in a systematic way asks a set of leading questions with regard to each organ system and, depending on the patient's answers, looks in more detail within the organ system.

COOK Report: Is this the decision-tree structure that’s described in the two papers you sent me
and introduced in the following text box?

From “An expert system for high-throughput collection and analysis of clinical data”
(Presentation at the Annual Meeting of the German Informatics Society, Lübeck, Oct 1-2, 2009)

[After the paper’s introduction is finished we read:]


Significance of the medical history
The patient’s medical history is the most important of the sources of clinical information in Figure
1. Diagnoses can be made at least 80% of the time from history alone, assuming of course that an
adequate history is available. When history alone is not diagnostic, it nevertheless suggests possi-
ble diagnoses and points to tests that could confirm a possible diagnosis. History is critical not
only for identifying what a patient perceives as their main problem but for delineating co-morbid
conditions, which can be important for treatment decisions. There are no chemical or genetic tests
or imaging studies that substitute for the information in the history; and there is nothing to suggest
this will change in the foreseeable future. Despite its importance, history-taking usually is done
poorly in routine medical practice because it is time-intensive and knowledge-intensive. There is a
compelling case, therefore, that the medical history is the starting point for addressing the quality
of care issue. It is as well the starting point for modernizing the fundamental approach to the con-
duct of clinical research.
Computerization of history-taking
History-taking is a rules-based activity that can be formalized in machine-readable form. We have
created a web-based history-taking program that interacts directly with people seeking medical
care. The program with the acronym CLEOS® [Clinical Expert Operating System] begins by ask-
ing why the individual is seeking medical care and proceeds, based on this answer, to acquire a
history of their main complaint and a history of their health experience up to the present point in
time. Figure 2 is an example of a formalization of the pathophysiology for collecting information
from a patient complaining of pain and specifically for routing patients with different pain syn-
dromes to collect information in the context of possible neurologic, renal, vascular or gynecologic causes.

[Figure 2: Formalization of pathophysiology as machine-readable instructions to acquire a history directly from patients.]

Nodes of the type HITHIA01 ask the patient specific questions. Nodes of the
types ROB or PHTHIID are sets of trees representing distinct sets of different pathophysiologies.
Nodes coded with XC are inferences that interpret the clinical significance of any set of prior an-
swers to direct further questioning. Arrows indicate the direction of questioning for a given answer
value. The value z on some arrows does not represent an answer but an instruction not to ask the
question if it has not been asked. The z therefore is a function that allows the program to look
backward to “see where it has been” without writing an inference. They are not shown in this tree,
but nodes also can represent laboratory results so that the direction of questioning can be based in
part on laboratory findings. The same applies to physical examination fields. The trees leverage
the knowledge of committees of physicians with clinical expertise in different fields of medicine
in that any patient with access to the Internet can receive the benefit of the knowledge of the set of
experts, who authored the content of the program. The current interview program has more than
13,400 decision nodes and more than 26,000 data fields that cover common, acute illnesses in
adults and all chronic illnesses in adults. The detail of coverage varies for different conditions; but
the program can be expanded indefinitely.
At the end of an interview, the program uses several thousand inferences to extract the clinical
meaning of the information collected and to generate a narrative report of the findings, including
possible diagnoses, co-morbid states and recommended actions in regard to diagnosis and treat-
ment. The number of inferences for interpreting the clinical information is infinitely expandable.
Indeed, the program can be enriched more quickly at the moment by expanding the number of
inferences that interpret the data than by extending the range of data collection. The program out-
performs physicians in terms of the accuracy and completeness of historical information. The pro-
gram is easy for patients to use, and it is highly acceptable to patients.

And later, from section 6.1, Data collection:

So-called “observational studies” based on data from everyday medical care delivered to millions
of patients can substitute for controlled trials if one can ensure that patient records are collected in
a standardized manner and that potentially confounding variables can be controlled by the robust-
ness of the data sets. Technology like CLEOS® thus is an important advance for clinical research
because the controlled, prospective clinical trial is seriously flawed. Its expense makes it inappli-
cable to the myriad of unresolved clinical issues; and as mentioned, results from prospective stud-
ies often do not apply in practice because they exclude the natural heterogeneity of patients with
apparently singular diagnoses. Technology like CLEOS® operates in the stream of everyday prac-
tice and can capture for clinical science the 900,000,000 out-patient visits, 35,000,000 hospitaliza-
tions, 64,000,000 surgeries and 3.5 billion prescriptions per year in the U.S., all of which is currently
wasted information. Technology like CLEOS® can assemble extremely large clinical databases at
marginal cost because professional time is not required for generating most of the clinical data.

[End of text box]

Zakim: In a sense it is. But remember that everything the system does is either based on
a decision tree or an inference. What the system outputs as reports of what it found is arrived at by simple inferences: if X is so and Y is so, then Z is so.
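
To make the mechanics concrete, the following is a minimal sketch, in Python, of the two building blocks just described: decision nodes whose next question depends on the patient's answer, and simple if-then inference rules run over the collected answers. The node names, questions and the single rule are hypothetical illustrations, not CLEOS content, and CLEOS itself (with more than 13,400 nodes and thousands of inferences) is of course not written this way.

    # Hypothetical node IDs, questions and rule; not CLEOS content.
    DECISION_NODES = {
        # node_id: (question, {answer: next node, or None when this line of questioning ends})
        "PAIN01": ("Do you have chest pain at rest?",
                   {"yes": "PAIN02", "no": "PAIN03"}),
        "PAIN02": ("Does the pain spread to your left arm or jaw?",
                   {"yes": None, "no": None}),
        "PAIN03": ("Does the pain come on with exertion?",
                   {"yes": "PAIN02", "no": None}),
    }

    def take_history(start_node, answer_fn):
        """Walk the tree from start_node, asking only the questions the answers lead to."""
        findings, node_id = {}, start_node
        while node_id is not None:
            question, branches = DECISION_NODES[node_id]
            answer = answer_fn(question)      # in CLEOS the patient answers directly
            findings[node_id] = answer
            node_id = branches.get(answer)    # branches never reached are never asked
        return findings

    def draw_inferences(findings):
        """'If X is so and Y is so, then Z is so' -- one illustrative rule."""
        notes = []
        if findings.get("PAIN01") == "yes" and findings.get("PAIN02") == "yes":
            notes.append("Pain pattern warrants evaluation for a coronary syndrome.")
        return notes

    # Example run with scripted answers:
    scripted = {"Do you have chest pain at rest?": "yes",
                "Does the pain spread to your left arm or jaw?": "yes"}
    print(draw_inferences(take_history("PAIN01", lambda q: scripted.get(q, "no"))))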

COOK Report: How is this either similar or dissimilar to Weed’s concept of the problem
knowledge coupler? The concept is to enable the physician to couple the problems pre-
sented by patients to the relevant medical knowledge about those problems.

Building Decision Trees for Collection of Patient History

Zakim: It probably is similar. I don’t know the dissimilarities. But let me say this. In
Weed’s system you can pick one or another coupler I think. You don’t do that in CLEOS.
CLEOS asks what’s wrong and then it does all of the picking from there. No one is involved
except the patient.

© 2010 COOK NETWORK CONSULTANTS 431 GREENWAY AVE. EWING, NJ 08618-2711 USA PAGE 18
THE COOK REPORT ON INTERNET PROTOCOL JULY 2010

COOK Report: That is profound. I see why you do this. But can you describe the meth-
odology, the thinking, the flow charts, the road map behind this?

Zakim: Basically, if you take a textbook of medicine in coronary disease and it contains all
the complications, all the differential diagnoses, and all the facts involved—CLEOS has
formalized all those facts so it knows what to ask. Let’s say the patient enters that he has
pain and it’s the main complaint. The patient then gets a picture of the body and clicks
where the pain is. Say that the patient clicks the retrosternal area. The system then says,
what is the differential diagnosis of retrosternal pain? The number one diagnosis is coro-
nary disease so it starts there. If it doesn’t seem to fit coronary disease, it could be peri-
carditis. So it will look there. It could be esophageal disease. So it will look there. It could
be pneumonia with pleural irritation. It could be pulmonary infarct. The system will look at
all these things, including that it could be gall bladder disease. Based on the patient’s an-
swers, it will decide where to go.
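
One compact way to picture that routing is an ordered differential per complaint: for the body area the patient clicked, the program works down the candidate diagnoses in order, exploring the question set attached to each and moving on when a candidate does not fit. The list and questions below are a hypothetical sketch in that spirit, not actual CLEOS content or ordering.

    # Hypothetical sketch: an ordered differential drives which question sets
    # are explored for a complaint localized to the retrosternal area.
    RETROSTERNAL_PAIN_DIFFERENTIAL = [
        ("coronary disease",        ["exertional?", "relieved by rest?", "radiates to arm or jaw?"]),
        ("pericarditis",            ["worse lying flat?", "relieved sitting forward?"]),
        ("esophageal disease",      ["related to meals?", "heartburn or acid taste?"]),
        ("pneumonia with pleurisy", ["fever?", "cough?", "worse on deep breath?"]),
        ("pulmonary infarct",       ["sudden onset?", "short of breath?", "recent immobilization?"]),
        ("gall bladder disease",    ["brought on by fatty food?", "right upper abdominal pain too?"]),
    ]

    def explore(differential, still_fits):
        """Work down the differential in order, asking each candidate's questions;
        still_fits(dx, questions) reports whether the answers keep the candidate in play."""
        return [dx for dx, questions in differential if still_fits(dx, questions)]

    # Toy run: pretend only the coronary-disease questions fit the patient's answers.
    print(explore(RETROSTERNAL_PAIN_DIFFERENTIAL, lambda dx, qs: dx == "coronary disease"))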

COOK Report: How do you get the basic knowledge or thinking or information that is in
the relevant medical textbook into CLEOS?

Zakim: In two ways. You can rely on the knowledge of an expert. Or I can sit with a text-
book in an area I don't know a lot about, plus journal articles, and formalize that knowledge by asking: What is the information one needs to have to make a diagnosis or
to exclude a diagnosis? What is the order in which the information should be accrued? And
as you’re accruing this information, how should the information be interpreted as each
new fact arrives to direct what the new accrual facts should be? That’s how an expert does
it.
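
Those three authoring questions, namely what information is needed, in what order it should be accrued, and how each new fact should redirect the questioning, can be pictured as a small declarative record that an expert (or a reader working from a textbook) fills in for each condition before it is turned into decision trees and inferences. The record below is a hypothetical sketch of that idea, not the format actually used for CLEOS.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    # Hypothetical authoring record: what an expert (or a reader with a textbook)
    # writes down for one condition before it is compiled into question trees.
    @dataclass
    class ConditionSpec:
        name: str
        required_findings: List[str]   # information needed to make or exclude the diagnosis
        ask_order: List[str]           # the order in which that information should be accrued
        # how each new fact, given everything known so far, redirects further questioning
        reinterpret: Dict[str, Callable[[dict], List[str]]] = field(default_factory=dict)

    angina_spec = ConditionSpec(
        name="stable angina (illustrative only)",
        required_findings=["exertional chest pain", "relief with rest", "cardiac risk factors"],
        ask_order=["exertional chest pain", "relief with rest", "cardiac risk factors"],
        reinterpret={
            "exertional chest pain":
                lambda known: ["relief with rest"] if known.get("exertional chest pain") else [],
        },
    )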

COOK Report: Give me some sense of the human resources needed over what period of
time to accomplish this type of input into CLEOS.

Zakim: I’ve been doing this in areas in which I have a fair amount of knowledge. And I
read a lot. I’m also building it out internally now.

COOK Report: And you’ve been doing this input primarily on your own?

Zakim: Yes, for the last eight years. I’ve had to. And if I read something that I think I
didn’t know in an area the system addresses, I will look to see whether or not that’s ad-
dressed. If it’s not, I’ll add it.

COOK Report: So there are two things going on here. You’ve taken on some fraction of
the total work of taking the patient history and you’ve put it into writing. The result has
been two or three of the files you've sent me. The next inference is that there's an awesome amount of material that exists either in rough files, rough papers, rough diagrams or just in your brain, and that has not yet been put down on paper in human-readable words.


Zakim: There are two levels. There is a lot of stuff that hasn’t yet been formalized into
the software programs but it’s sketched out.

COOK Report: So you’re not only building a software program but you also need to build
something in words that humans can grasp in parallel to the computer, right?

Zakim: Right. I had no formal idea of this but I’m convinced. We have about 5,400 infer-
ences that the system draws based on the data that it collects. My guess is that it could
probably draw about 50,000 inferences from that same data set. I don’t have the scope
[of knowledge] to extract that from the data. Other people will have to do that. We do re-
port the facts of the history. I think we capture all of that as text. What we probably do
not capture as text are all the interpretations that can be made from those facts. That’s
an ongoing process.

COOK Report: From what you’ve said so far, my gut level knowledge leads me to believe
that a desirable thing to do would be to get you to meet Bob Hertzberger and some of the
bioinformatics people in Holland because I strongly suspect that if it make sense from
your point of view, what they would have to offer you is the framework, the foundation,
the infrastructure by which you could perhaps begin your own project within their bio-
system, for lack of a better word. So this could turn out to be the software kernel by which you might be able to get something going where you could get collaboration from other people with an interest in this and really begin to build the necessary structures in a
collaborative way. In other words, what the Dutch are doing could help you get from
5,400 inferences to 50,000.

Zakim: No, I don’t think so because medicine is a completely empirical activity. There are
no theoretical principles that can be applied at this point in time to the knowledge to de-
rive conclusions on the basis of theory or some computational algorithm. The inferences
which have to be drawn have to be drawn on the basis of known relationships between
known phenotypic expression and diagnosis. I call that brute force.

The system as it exists is not e-science. It doesn't meet the standard of the fourth para-
digm concept. It is the key element in moving medical science—I should say clinical sci-
ence, which is the interpretation of signs and symptoms, to the level of the fourth para-
digm. The fourth paradigm as it applies to what I’m doing, or as it applies more broadly to
clinical science, is to make sense out of very large, complex medical databases. And I’m
very interested in the idea that if we had ten million people with their phenotypes in the
CLEOS database, computational methods could be used to derive new clinical insights,
which cannot be derived on the basis of our empirical knowledge of medicine.

Our empirical knowledge of medicine is quite limited. I’ll give you an example. Coronary
disease remains the number one cause of death in the Western world. We know that the
incidence of coronary events and death from coronary disease can be lowered by man-
agement of low-density lipoprotein cholesterol level. And to that end there are predictive
algorithms for who should and shouldn’t be treated and to what levels they should be
treated to lower LDL cholesterol. Well, it turns out that the algorithms used for primary
prevention require the treatment of 200 people to prevent one coronary event.

COOK Report: In other words, for every 200 people you do treat, you can expect one fa-
vorable outcome that you wouldn’t have had if you didn’t do it this way?

Zakim: Right. Now cost-benefit analyses, which have been published in the medical lit-
erature in English, indicate that that’s worth it. What the data says to me is that while it
may be worth it, the data are no good and the algorithms are no good. We have to do
better! So how do we do better? Nobody knows. Maybe it relates to eye color. Maybe it
relates to hearing acuity. It certainly doesn’t relate to the algorithm that’s being used. The
algorithm that’s being used may be better than nothing but it’s not a whole lot better than
nothing.

COOK Report: Is the inference from this that you ultimately drive it down to the genome
almost?

Zakim: No. Well, there’s an interesting article in this week’s British Medical Journal. It
looks at the value of genotype for predicting the incidence of diabetes versus the value of
phenotype. Genotype is useless; phenotype is much better. Combining genotype and phe-
notype has no value beyond phenotype alone. Now one could have guessed at that result
from other things that have been published. But the side-by-side analysis is very power-
ful. It used to be said, and it’s true— genotype, so what? Or sequence, so what? It doesn’t
tell you enough, and increasingly over the last five years people have come to see that.
Another interesting thing, not in the medical literature but in the business news in the last
couple of weeks, is that Decode is bankrupt. Decode, which was a commercial activity,
determined the genome of Icelandic people with the hope that it would provide all
kinds of insights into the genetic basis of disease. Their goal was also to use this database
to develop new medications. Well, it was a failure. So it’s being reorganized to keep up the
databank of the genome of Icelandic people. But a follow-up article in this week’s New
York Times says that Decode is going to give up the idea of developing any new medica-
tions because the linkage between the genome and the phenotype is too weak.

Large Databases Likely Will Offer the Key to What Should be Measured

So we don’t know what we should be looking for. Large databases, and this is where the
fourth paradigm comes in, are the only clue we are probably going to get as to what we
should be measuring. Number one, we’re at the limit of what rationality says we should
look at. Number two, yes, high blood pressure may be very important for the generation
of coronary disease. But what may also be important, and is not built into the algorithm
because no one can do it and apply it, is the following. At what age does hypertension be-
gin? What’s the patient’s age now? And so on and so on.

Another important thing is smoking history. Do you smoke? Did you used to smoke? You
don’t smoke? No one asks if your father or your mother or your spouse smokes. There’s
all of this stuff that is undoubtedly extremely important which is missing. It takes enor-
mous time to get, it’s very hard to put it into an analyzable form, and then it’s very hard
to analyze the data. The system I built does all of that. At the moment we don’t ask about
secondhand smoke but we're going to start doing so.

COOK Report: If we were sitting somewhere in Holland with the bioinformatics and e-
science group of people, and you made that statement, and they asked you to explain to
them how your system does that, or how one would verify the reality of that statement,
what would you do?

Zakim: I’d say this. The system asks the patient, do you smoke? Did you used to smoke?
Did you never smoke? The patient says, I never smoked. The system asks, did your father
smoke? Did your mother smoke? If you’re married, does your spouse smoke? How heavily
do they smoke? It’s a set of nodes in a decision tree to collect those data. Those data are
then all stored in unique data files in a relational database. Now you have a million people
in your database. And you know who’s had a coronary event at what age and who has
not.

COOK Report: And the million people in the database have all gone through the same
decision tree of your software?

Zakim: Right. Every history is identical because if a question is not asked, it’s not asked
for a reason. A question didn’t go unasked because someone forgot to ask it. It’s unasked
for a verifiable and traceable reason, which is built into the logic of the trees.
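
To make the tree idea concrete, here is a minimal sketch, in Python, of how such a gated set of question nodes might be represented and traversed. The node names, answer values and storage layout are invented for illustration; they are not CLEOS's actual schema.

    # Minimal sketch of a gated decision tree for a smoking history.
    # Node names and answer values are hypothetical, not taken from CLEOS.
    TREE = {
        "SMOKE_STATUS": {
            "question": "Do you smoke, did you used to smoke, or have you never smoked?",
            "answers": {"current": "SMOKE_AMOUNT", "former": "SMOKE_QUIT_AGE", "never": "FATHER_SMOKED"},
        },
        "SMOKE_AMOUNT":   {"question": "How many cigarettes do you smoke per day?", "answers": {"*": "FATHER_SMOKED"}},
        "SMOKE_QUIT_AGE": {"question": "At what age did you quit?", "answers": {"*": "FATHER_SMOKED"}},
        "FATHER_SMOKED":  {"question": "Did your father smoke?", "answers": {"*": "MOTHER_SMOKED"}},
        "MOTHER_SMOKED":  {"question": "Did your mother smoke?", "answers": {"*": "SPOUSE_SMOKED"}},
        "SPOUSE_SMOKED":  {"question": "If you are married, does your spouse smoke?", "answers": {"*": None}},
    }

    def run_interview(get_answer, start="SMOKE_STATUS"):
        """Walk the tree, record every answer, and record why unvisited nodes were skipped."""
        record, node, visited = {}, start, set()
        while node is not None:
            visited.add(node)
            answer = get_answer(TREE[node]["question"])
            record[node] = answer
            node = TREE[node]["answers"].get(answer) or TREE[node]["answers"].get("*")
        # Every node not reached went unasked for a traceable reason: the path of prior answers.
        record["_skipped"] = {n: "not on path given prior answers" for n in TREE if n not in visited}
        return record

    if __name__ == "__main__":
        scripted = iter(["never", "yes", "no", "yes"])   # a never-smoker whose father and spouse smoke
        print(run_interview(lambda question: next(scripted)))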

COOK Report: Is there some way to have a visual representation of this logic pathway so
people could more easily see for themselves how the decision-tree works in practice?

Zakim: The concepts are simple. Let me just say that if you had a million people in the
database, they all have been interviewed in exactly the same way. Exactly. So the data-
base is complete for everybody. Now you get a mathematician and say, just use abstract
mathematical algorithms, the same kind you use for analyzing micro-array data, and see
what you come up with. They'll come up with stuff. That's the fourth paradigm. That's
analyzing complexity using methods for looking at very large databases that a human
cannot do; only computational algorithms can do it.
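
As an illustration of the kind of computational pass over a complete, uniformly collected database that Zakim is describing, here is a rough sketch. The column names, the synthetic placeholder data and the choice of a simple logistic regression are assumptions made only so the example runs; they are not the methods CLEOS or its collaborators actually use.

    # Illustrative sketch: mine a complete, uniformly collected phenotype table
    # for predictors of a coronary event. Column names and the model choice are
    # assumptions; the data below are synthetic placeholders so the code runs.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000
    df = pd.DataFrame({
        "age_hypertension_began": rng.integers(25, 70, n),
        "current_age":            rng.integers(30, 85, n),
        "smokes":                 rng.integers(0, 2, n),
        "spouse_smokes":          rng.integers(0, 2, n),
        "ldl_cholesterol":        rng.normal(130, 30, n),
    })
    # Synthetic outcome column, only so the example runs end to end.
    df["coronary_event"] = (rng.random(n) < 0.05).astype(int)

    X, y = df.drop(columns="coronary_event"), df["coronary_event"]
    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Because every patient answered the same questions, every column is complete,
    # and the fitted coefficients can be ranked to suggest what is worth measuring.
    for name, coef in sorted(zip(X.columns, model.coef_[0]), key=lambda t: -abs(t[1])):
        print(f"{name:25s} {coef:+.3f}")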


We rely on the idea that the knowledge we have, the pathophysiology we have, is suffi-
cient to meet all of our needs. That’s an assumption that may or may not be true. In cer-
tain areas of medicine we know it’s not true. Look at the example I gave of lowering LDL
cholesterol. We do not have sufficient knowledge to predict who’s at high risk for a coro-
nary event until a patient has one. Then of course we know in the sense that someone
who has had a coronary event has very high risk for another event.

COOK Report: It sounds like you’re saying that you have the basic software structure
worked out, along with the patient history questionnaire and the basic infrastructural
framework such that if you had the resources to start having large numbers of people in-
terviewed by the software, people that may have to be pre-qualified in some way?

Zakim: No. Any adult fits.

COOK Report: What I’m getting at is that it sounds like you have the software but not
the resources to put it on a bunch of computers and get people to interact with the com-
puters to accumulate a database.

Zakim: Right. What we need for that, and we’re doing some of it on a shoestring, is the
underlying runtime system. We have software for that but it has to be replaced because
it’s not stable enough. I have had the wonderful good fortune to be involved with a superb
programmer, Andy Noori, who like me got addicted to the program and who is dedicated
to seeing it through because he believes in its social significance. Andy just finished 18
months of work, done without compensation, rewriting the program that runs the inter-
view, which was necessary to stabilize the program as it interacted with patients. I’ve
been enormously lucky to be involved with Andy. We would be nowhere as far along with-
out him. But we have other needs that a single IT guy cannot provide. We need a server
network, we need server security. We need a call center to help people out. We need
these kinds of things. In Holland it would have to be in Dutch. Right now it’s only in Ger-
man and English. We have large infrastructure needs for maintaining the system. By infra-
structure I mean software needs. The people in Holland have probably worked all of this
out, and at that level they could be enormously helpful. We are having a workshop on
February 26 in Stuttgart that begins to talk with software experts about the details of
what we need and how we should go forward. We need a different database structure for
saving all of the information. We need all kinds of things. But the basic medical part is the
part we have. Nobody has that. And it’s built in such a way that it’s infinitely expandable.

COOK Report: You have the patent for that medical part and it’s assigned to the Institute
for Digital Medicine?

Zakim: Yes, the Institute for Digital Medicine is a German charity. And the February work-
shop we’re doing is on content development and it’s by invitation. We have some software

© 2010 COOK NETWORK CONSULTANTS 431 GREENWAY AVE. EWING, NJ 08618-2711 USA PAGE 23
THE COOK REPORT ON INTERNET PROTOCOL JULY 2010

architects that my key colleague in Germany has invited and we're looking to recruit some
of them.

COOK Report: It’s now absolutely clear to me where you’re going and it’s mind-boggling.
As far as the Dutch are concerned, I don’t think there’s anything that would prevent them
from doing international collaboration with you. I don’t think you have to be a Dutch citi-
zen to use their infrastructure.

Zakim: The Dutch have begun to build large medical databases in specific diseases. That
is closed off.

COOK Report: Could you elaborate on what you mean by its being closed off?

Zakim: The purpose is to develop a database that the Dutch will own. I don’t believe that
a specific commercial entity will own it but the Dutch government is developing it for the
purposes of generating money.

COOK Report: The politicians may be saying, what are we going to do when our gas re-
serves run out?

Zakim: I think they intend to keep it. I think it is very Dutch. But it’s worth exploring be-
cause we could license what we’re doing to them. They could license what they’re doing to
us. There are all kinds of ways to do it. But I don’t see it as a really open academic kind of
activity based on what I know they’re doing in inflammatory bowel disease. They’re build-
ing nationwide databases. They have some very good databases in rheumatology. I think
their long-term goal is partly commercial, but certainly to the benefit of the Dutch as a
country and not to the benefit of a small number of Dutch.

COOK Report: You could look at it this way. If you took all of the nations on the face of
the earth, I think the Netherlands is probably the nation that, of all nations,
would be most intellectually and philosophically equipped to listen carefully to what we’ve
been talking about and to follow up and collaborate. And if they didn’t collaborate, it
would probably be for political reasons.

Zakim: Yes, I agree.

Goals for the Next Five Years

COOK Report: Let’s try to wrap this up by your telling me what your goals are in the
short term and the long-term, like the next five years. What do you have in mind?

Zakim: We have a set of goals that are to some extent technical, to some extent clinical
research, and to some extent content development. The technical goals we have are re-
lated to content development in the following way. We want to create a platform through
which multiple authors can build content into the same version of CLEOS, with the soft-
ware program on which everyone works existing somewhere on the web, in a cloud. Let's
say that one version of the program is worked on by someone remotely. Because of the
complexity of the program, that requires a significant amount of software development to
arrive at a level of stability in such a way that one person doesn’t screw up the whole pro-
gram because of what he or she does.

So further development and maintenance depend on the development of a software platform
so multiple authors can work on the same single version of the program. Tied to that, of
course, we’ll need to have multiple authors. Therefore I’m very interested in beginning to
recruit physicians who are clear thinkers and interested in developing content. I also have
to begin to look for somebody to replace me in the long run. So we have the platform and
the people to develop. And we want not only to continue to expand the content but
also to continue to improve the content that already exists.

We want to build a community of like-minded people who believe as we believe that the
major problem in the delivery of health care is poor quality. Let me say in this regard what
I mean by we. The we is Prof. Dominik Alscher, who is at the Robert Bosch Krankenhaus,
Stuttgart and me. The only way to remediate poor quality is through information technol-
ogy of the kind we are developing. So we want a broad community of people involved.
Their involvement would be in a variety of ways. We want to involve the academic com-
munity in such a way that it begins to use the program as a clinical research tool. They will
use it as they see fit in specific clinical areas. We expect to establish the value of the pro-
gram as an indispensable tool for collecting clinical information and building databases for
analysis and so on. One way to establish the value of the program is through publication
of peer-reviewed clinical research work that utilizes the program to achieve ends that are
otherwise not achievable.

We are also engaging the idea that we need a commercial partner who will develop a
commercial entity which will be a software services business that will build the server
network and take care of the security and do whatever it takes to distribute and support
everyday use of the program because the Institute for Digital Medicine (IDM) does not
have the skills to do that. We’re not interested in acquiring those skills. We want to make
what we do simply the medicine part and leave the commercial parts to people with com-
mercial interests and commercial skills. What IDM would do for the commercial entity is
license the use of the software and the content that we’ve developed. I’ve been advised
that any existing company would not be a suitable partner because of the nature of large
corporations. So we’re looking more toward venture capital.

COOK Report: So you’re saying that IDM would have a component that would be respon-
sible for the servers and security and information technology part of this apart from medi-
cine, right?


Zakim: What you just said would describe a separate commercial component or entity.
IDM would share in that commercial entity but most of the revenues of IDM would come
from licensing the software content and its proprietary software programs.

COOK Report: How would this commercial entity come into existence?

Zakim: We have an advisor from the venture world who’s looking for venture people to do
that. There would be a business plan in which it would be clear what part they would have
to subsume and what part IDM would subsume, and there would be a legal agreement be-
tween the two entities. While I’ve had limited discussions with people on the VC (venture
capital) side, they like that idea very much and see the sense of it. Of course, when you
get down to the details there may be all kinds of problems with it. But as a broad concept
it’s not a problem.

We would also like to be involved on a worldwide basis with other not-for-profits that are in-
terested in health care as part of the community. There are many, many issues in the
health care realm that IDM’s approach doesn’t solve, but to which we can contribute a
significant part of a solution. One of the very dangerous myths about health care is the
idea people have that if they break it, health care can fix it. If you break it, you have to
live or die with it. But the medical community in its broadest sense likes to use the hidden
message—go ahead and eat, drink and be merry and if you break it, don’t worry because
we can fix it.

COOK Report: You’re speaking on multiple levels, but a part of this is the whole idea of
education of the non-medical population regarding what's possible and what's not.

Zakim: I think it falls under the area of consumer empowerment. When they talk about a
patient-centered health care system, what you’re talking about is a health care system in
which people take care of their health. You don’t need a doctor to take good care of your
health. Unfortunately, there will always be illness; some people will always die prema-
turely. And for some people, no matter what we do, they will die tragically or die prema-
turely. But we need to be able to afford care for those people. But most of what we spend
on health care does nothing but enrich the already rich.


Part 2 April 2, 2010

We Are Trying to Show What Can Actually Be Done
Editor’s Note: Wanting more detail on the working of David’s technology and on his take
on the economic and political drivers affecting it, as well as strategy for overcoming po-
tential obstacles, I requested a follow up interview which took place on April 2, 2010.

COOK Report: Please tell me what you mean by an expert system.

Zakim: An expert system is just a rules-based system in which the rules have been for-
malized so a machine can read them and act upon them in two ways. First, the machine
can collect information from an individual or a laboratory with which it interacts and can
interpret the meaning of the information collected. Second, it can output an actionable
recommendation, which in a medical context can be a diagnostic, a differential diagnostic,
a recommendation for a certain kind of therapy, a change in dose, or a recommendation
for collecting additional data like laboratory data.

COOK Report: A differential diagnostic would be recommending some other line of ques-
tions?

Zakim: No. The machine would not recommend some other line of questions because it
fundamentally asks all questions.

COOK Report: What exactly is a differential diagnostic? [Wikipedia explains: "This
method, essentially a process of elimination, is used . . . by physicians, physician assis-
tants, and other trained medical professionals to diagnose the specific disease in a pa-
tient. . . . careful differential diagnosis involves first making a list of possible diagnoses,
then attempting to remove diagnoses from the list until at most one diagnosis remains."
http://en.wikipedia.org/wiki/Differential_diagnosis ]

Zakim: Let’s say a patient has chest pain and the patient is 40-years-old. The differential
could be anywhere from esophageal reflux to coronary artery disease to a swallowing dis-
order to gall bladder disorder. So the differential is all reasonable possible causes. The
physician-patient interaction seeks information needed to eliminate possible causes – one
by one.

In an emergency room setting it would be, get an EKG. That’s a very simple example. The
number one cause to rule out is coronary disease. In another example further along the
differential process, the physician may find that the EKG shows Q waves or other essen-
tially pathognomonic evidence of an acute myocardial infarction. Or the EKG is done under
stress and it shows evidence for angina. Our expert system can output a differential diag-
nosis, it can give a diagnosis and say, get this patient to angioplasty, or whatever is most
applicable, i.e. whatever the best evidence-based guidelines say is what the machine can
recommend.
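
To make the output side concrete, here is a minimal sketch of mapping interpreted findings to an actionable, guideline-style recommendation. The findings, rules and recommendation wording are illustrative assumptions only; they are neither CLEOS content nor clinical guidance.

    # Sketch of the recommendation step: interpreted findings in, actionable
    # recommendation out. Findings, rules, and wording are illustrative only
    # and are not clinical guidance.
    def recommend(findings: dict) -> list[str]:
        recs = []
        if findings.get("chief_complaint") == "chest pain":
            recs.append("Obtain an EKG now.")
        if findings.get("ekg") == "ST elevation":
            recs.append("Findings consistent with acute myocardial infarction; "
                        "refer for emergency reperfusion per current guidelines.")
        elif findings.get("stress_ekg") == "positive":
            recs.append("Findings suggest angina; refer to cardiology.")
        if not recs:
            recs.append("No actionable recommendation yet; continue history-taking.")
        return recs

    print(recommend({"chief_complaint": "chest pain"}))
    print(recommend({"chief_complaint": "chest pain", "ekg": "ST elevation"}))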

COOK Report: So obviously the machine will function somewhat differently in an emer-
gency room application compared to a regular patient visit to an office, right?

Zakim: It would function somewhat differently but not that differently. For example, the
machine might say, get to the emergency room immediately.

COOK Report: How then is this expert system updated? How does it evolve? It is com-
posed of patient intake data. This intake data is in a huge relational database of patient
histories and this database is continually updated. I see how that can happen. But how is
the medical knowledge in the literature represented or formalized in the database of pa-
tient histories?

Zakim: The database of patient histories is a database of fact. The patient has chest pain.
The patient has chest pain that is retrosternal. The chest pain does not radiate to the
shoulder. The chest pain severity on a scale of one to ten is a three. Those are all facts,
and that is what is in the relational database for the history of a single patient. With every
intake of patient data the size and usefulness of the database increases. It enables the
summation of histories for hundreds of patients and then thousands and eventually mil-
lions of patients.
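
A minimal sketch of how such a database of facts might be laid out relationally, using SQLite so the example is self-contained; the table and field names are hypothetical, not CLEOS's actual schema.

    # Sketch of a relational store for interview facts: one row per patient per
    # data field, so histories from thousands of patients can be summed over.
    # Table and field names are hypothetical.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, age INTEGER);
        CREATE TABLE history_facts (
            patient_id INTEGER REFERENCES patients(patient_id),
            data_field TEXT,      -- e.g. 'chest_pain_location'
            value      TEXT,      -- e.g. 'retrosternal'
            PRIMARY KEY (patient_id, data_field)
        );
    """)
    con.execute("INSERT INTO patients VALUES (1, 40)")
    con.executemany("INSERT INTO history_facts VALUES (?, ?, ?)", [
        (1, "chest_pain",           "yes"),
        (1, "chest_pain_location",  "retrosternal"),
        (1, "chest_pain_radiation", "none"),
        (1, "chest_pain_severity",  "3"),
    ])

    # Because every interview is identical, aggregate queries are straightforward:
    # how many patients in the database report retrosternal chest pain?
    (count,) = con.execute("""
        SELECT COUNT(*) FROM history_facts
        WHERE data_field = 'chest_pain_location' AND value = 'retrosternal'
    """).fetchone()
    print(count)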

That’s a process completely separate from how the system operates to collect the infor-
mation. Medical practice is rational; it’s rules-based. If it were not rules-based and ra-
tional, there would be no reason to go to a doctor. You could go to anybody. Anyone’s
guess would be as good as anyone else’s. But the sense is that disease presents according
to certain sets of empirical observations, which are rules, and will respond to a different
set of empirical observations, which represent another set of rules.

For example, if a patient is 23 years old and enters the chief complaint as chest pain, the
machine knows because it’s built in this way (to consider the patient’s age), that the like-
lihood that it’s coronary disease is not a number one diagnosis. And it will start looking at
another possibility. On the other hand, for a 40-year-old male the number one possibility
amongst all possibilities will be coronary disease. Number two, a patient presenting with
pain has to be characterized in several domains. Where is the pain? When did the pain
start? Is the pain intermittent? If the pain is intermittent, what are the things that pro-
voke the pain? What are the things that alleviate the pain? What is the severity of the
pain? Those are just simple clinical rules that apply to the patient with pain. And you build
into the tree that if the patient complains of pain, you find out where the pain is.
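
A toy rendering of those rules in code follows; the question list, the age cut-off and the ordering of the differential are illustrative assumptions, not CLEOS content or clinical guidance.

    # Toy rendering of the clinical rules for a chief complaint of chest pain:
    # characterize the pain across the standard domains, and let patient age
    # reorder which diagnostic line the program pursues first.
    # The age cut-off and question list are illustrative assumptions.
    PAIN_DOMAINS = [
        "Where is the pain?",
        "When did the pain start?",
        "Is the pain intermittent?",
        "What provokes the pain?",
        "What alleviates the pain?",
        "On a scale of one to ten, how severe is the pain?",
    ]

    def lines_of_inquiry(age: int) -> list[str]:
        """Order the differential by prior likelihood given age alone."""
        if age >= 40:
            return ["coronary artery disease", "esophageal reflux", "swallowing disorder", "gall bladder disease"]
        return ["esophageal reflux", "swallowing disorder", "gall bladder disease", "coronary artery disease"]

    def interview_chest_pain(age: int, get_answer):
        answers = {q: get_answer(q) for q in PAIN_DOMAINS}   # always characterize the pain
        return {"pain_characterization": answers, "pursue_first": lines_of_inquiry(age)[0]}

    if __name__ == "__main__":
        demo = interview_chest_pain(23, lambda q: "demo answer")
        print(demo["pursue_first"])   # not coronary disease for a 23-year-old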


COOK Report: So what you’re saying is that, as far as current medical knowledge is con-
cerned, you’re trying to construct a system of history taking or knowledge acquisition that
accumulates over time and then is shaped into a knowledge system of its own. And what
do you do with the libraries filled with existing knowledge? Am I making any sense?

Zakim: Your last question makes good sense but the first part of what you said isn’t
wrong but it’s also not right. The system interacts with a single patient. It can only ac-
quire clinical data from the patient with whom the program interacts. It has a formalized
set of rules for acquiring clinical information from a patient.

For example, if you went to someone who didn’t go to medical school and you say, I have
chest pain. That individual could think, oh, gee. You could be having a heart attack. That
individual probably doesn’t have enough to know to ask specific questions about the pain,
what is the severity of the pain, what is the duration of the pain and so on. That individual
probably also doesn’t understand that this pain could be from the gall bladder, from the
esophagus, from the stomach. It could be from the pericardium; it could be from the
heart. If it’s from the lung, it can have multiple causes. If it’s from the pericardium, it can
have multiple causes.

Another thing a layman is unlikely to know is, if I ask you what the severity of the chest
pain is on a scale of one to ten and you tell me ten, you’re not having myocardial disease
because the pain of angina and the pain of a myocardial infarction is mild to moderate and
never really severe. There are other forms of chest pain that can be severe. And all of this
knowledge is available.

Now the machine can be built to collect knowledge on the basis of what is known and
what is in libraries. And the machine also incorporates the knowledge for interpreting the
facts for two reasons. One is deciding what’s the next question to ask. The second is de-
ciding how to interpret the constellation of clinical data that has been collected around
each of the medical issues in each individual patient that has been queried.

COOK Report: It sounds like the decision trees have to be built by incorporating very
fundamental and key architectural elements of existing knowledge, right?

Zakim: Absolutely. Otherwise it’s a waste of time. But there’s also the need to build infer-
ences which interpret the data as it’s collected and after it’s collected, and can reinterpret
the data as more data becomes available. For example, the patient has chest pain and a
normal electrocardiogram ten minutes after the history is collected. That has to be inte-
grated into the inferences. A decision tree is just one way of formalizing medical knowl-
edge. There are many other ways to do it. But physicians think in the way the trees are
constructed. They’re taught to think that way and they do think that way. The medical

© 2010 COOK NETWORK CONSULTANTS 431 GREENWAY AVE. EWING, NJ 08618-2711 USA PAGE 29
THE COOK REPORT ON INTERNET PROTOCOL JULY 2010

texts think that way and I think that way. So when I saw a decision tree, I said, “Oh,
boy!”

COOK Report: Because you ask questions based on the data you input and then you rule
things out, yes?

Zakim: You make them less likely. But sometimes you can seem to rule them out or make
them more likely. In this case you change the nature of the questions. Here’s a simple
example. If you have chest pain and a fever, that’s not myocardial infarction or angina
because fever isn’t a part of coronary disease. So chest pain and fever is more likely
something else in the chest other than coronary disease, and what that would be would
depend on the constellation of facts that the machine will try to collect.
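
One way such likelihood-adjusting inferences might be expressed is sketched below, with rules, weights and field names invented purely for illustration; reinterpretation as new data arrives, such as the normal electrocardiogram ten minutes later, is simply a re-run of the same rules over the enlarged set of facts.

    # Sketch of inferences that re-weight a differential as facts accumulate,
    # and that can re-run when later data (e.g. an ECG) arrives.
    # Rules, weights, and field names are invented for illustration.
    RULES = [
        # (condition over collected facts, diagnosis affected, multiplier on its weight)
        (lambda f: f.get("fever") == "yes",              "coronary artery disease",     0.1),
        (lambda f: f.get("pain_severity", 0) >= 9,       "coronary artery disease",     0.2),
        (lambda f: f.get("ecg_10min") == "normal",       "acute myocardial infarction", 0.3),
        (lambda f: f.get("pain_on_swallowing") == "yes", "esophageal disorder",         3.0),
    ]

    def reinterpret(facts, weights):
        """Apply every rule whose condition holds; later facts simply trigger a re-run."""
        weights = dict(weights)
        for condition, diagnosis, factor in RULES:
            if condition(facts) and diagnosis in weights:
                weights[diagnosis] *= factor
        return weights

    facts = {"chest_pain": "yes", "fever": "yes", "pain_severity": 4}
    weights = {"coronary artery disease": 1.0, "acute myocardial infarction": 1.0, "esophageal disorder": 1.0}
    print(reinterpret(facts, weights))     # fever pushes coronary disease down
    facts["ecg_10min"] = "normal"          # new data ten minutes later
    print(reinterpret(facts, weights))     # the same rules reinterpret the enlarged fact set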

The Platform Emulates Thinking of Panels of Experts

Viewing the Patient as a Living Complex System

What CLEOS is basically built to do is to emulate the thinking of panels of experts in dif-
ferent domains of medicine. So within adult internal medicine there’s endocrinology.
Within endocrinology there’s a thyroid domain, a diabetes domain and so on; within cardi-
ology there are several domains; within pulmonary several domains.

COOK Report: Can you say something about the process of selecting and using the ex-
perts?

Zakim: In general experts build expert systems that address their domain of expertise.

COOK Report: so this is already going on independent of what you’re doing. However
what you’re doing also depends on beginning to set up these panels of experts yourself?

Zakim: Yes and no. There are hundreds of so-called medical expert systems. They’re de-
signed to interact with the doctor in the following way. The doctor collects the history,
does a physical, and has some laboratory information. The doctor then sits down at a
computer interface, and the interface queries the doctor to input what clinical data the
doctor has acquired. The system will then output diagnostic recommendations and thera-
peutic recommendations.

There are multiple problems with this approach. First, each of these expert systems deals
with a single domain and doesn’t overlap other domains. The bigger problem is that no
one uses them because doctors don’t have the time. The third problem is that these sys-
tems usually contain more data fields than the doctor can fill. They impute answers and
take shortcuts. But the biggest drawback to them is no one uses them because they are
built first of all to interact with the doctor. These systems have been around almost 20
years or maybe more. Some of them are excellent. There’s one, it is an infectious disease
system which has been reviewed critically, and it’s very good.

We differ. We would have experts build expert domains with the purpose of interacting
with the patient and not with the doctor.

COOK Report: So you’re constructing a philosophy or a worldview within a framework


within which things operate. You have to begin somewhere, so you’re beginning with a
gradual process of accretion of educating people about what you’re doing, getting like-
minded people to agree to join you in building this so that it is integrated, it does work,
and it treats the body and the patient as a complex and interacting system. And as you
said before, most of these systems exist within silos of specialized knowledge domains.
Am I going in the right direction?

Zakim: Yes. For example, let's say that I've isolated the diagnostic expert for infectious
disease, and he deals with a patient who has elevated cholesterol. He needs to collect the
information to determine whether such a patient meets the criteria that determine, at
different cut-off points, what an elevated cholesterol means for that patient. But that doesn't
address global issues. Now my philosophy is based on the idea that, of all the stakeholders in the health
care system, there is only one participant that really cares about the outcome. You can
guess what that participant is.

COOK Report: I would think it’s the patient.

Zakim: Yes. The hospital administrators fundamentally don’t care; they’re running busi-
nesses. Physicians are paid no matter what the outcome. I don’t think physicians are ma-
lign but they’re overworked and it’s impossible for any human being to learn, remember
and recall the scope of knowledge required for interacting in a daily way with patients with
general medical issues. The British Medical Journal suggests that a competent physician
has to have approximately one million facts at his or her command to deal with an aver-
age load of patients.

COOK Report: So what you’re looking at is the motivation to pay attention to what’s go-
ing on and the interest in dealing with all of these uncertainties. I’m assuming that the
patient is the key motivated person.

Zakim: The patient holds the keys to the whole process because only the patient knows
what’s happened to him or her. And why the program starts with the medical history is
because the medical history is the sine qua non for a good outcome. Diagnosis can be
made almost all of the time from the facts of the patient history alone; and when they
can’t, the facts of the history point to the diagnosis and they also point more significantly
to the co-morbidities which can impact not only testing but also therapeutic decisions and
so on. In the Western world, we have divided the whole knowledge base of the individual's
experience among a panel of physician specialists. As a result of this substitution, the
patient has a gastroenterologist, he has a cardiologist, he has a pulmonologist, he has an
allergist, he has a dermatologist.

COOK Report: So ultimately you will recouple that segmentation of knowledge into the
system by way of interweaving the diagnostic processes of these panels? You do
this as expressed by the tools of Boolean knowledge representation described in the
second text box from your 2008 paper below. You are weaving basic knowledge into a
skeletal framework of diagnostic decision trees along with the inferences made as the un-
limited memory and branching capability of the computer system couples the patient’s
presentation into the existing corpus of medical knowledge.

The Structure of the Knowledge System

Zakim: I would describe the representation of knowledge in the following way. The Insti-
tute of Digital Medicine will be like the editorial board of a treatise of medicine with one or
two senior editors and panels of editors in subcategories who can then assign certain
parts of their domain to other people. It’s a worldwide effort. Ultimately it will be a very
large worldwide effort to secure experts from around the world in different areas of medi-
cine to contribute and also to maintain different parts of the domains.

COOK Report: Hopefully you will be able to act as a catalyst to enable this to begin to
happen.

Zakim: That’s the purpose, or rather one of the purposes, which is to show that it can be
done. Now some people may be taken aback by my saying that the machine is going to
replace doctors in the context of framing a response to the query about the knowledge
base and its relationship to the computer. The idea of replacing physicians is a long way off
at some point in the future. Currently that’s a metaphor. On the other hand, if the pro-
gram doesn’t “replace physicians”, it won’t work. You can’t do it half way. The system ei-
ther does what the best panel of physicians can do or it won’t work. If it doesn’t do what
the best panel of physicians can do, it’s not worth the effort because it won’t work. It
won’t improve the quality of care.

COOK Report: And eventually over time, the feedback loop that’s part of the design will
help the system evolve and remain accurate and keep up-to-date.

Zakim: It’s partly feedback but the system is also driven by clinical and fundamental re-
search as new knowledge becomes available, as new drugs become available, as old drugs
are shown not to be efficacious or to have toxicity in certain settings... knowledge has to
be incorporated.


COOK Report: It’s a complex and interacting system that evolves and updates itself over
time?

Zakim: It does not update itself. No, it does not.

COOK Report: But the process of operating the system should result in its evolution,
right?

Zakim: Yes, but not without the interposition of humans.

Strategy by Which to Seed and Grow the Idea

COOK Report: How then do you plan to involve humans in a manner that would seem
less threatening to them and where they can see the benefits? And to what extent do you
think there will be people who self-select to support the system? And what are your
thoughts on some of the near-term kind of political and sociological issues that you’ll be
bumping up against?

Zakim: Right now I’m only interested in preaching to the choir. I’m not interested in con-
vincing people of the value of this.

COOK Report: How are you finding out who and where the choir is?

Zakim: I’m not actively doing that at the moment because I’ve got my hands full with
what I’ve got. But I’ll give you some examples. I don’t have a choir yet but I have several
choir members and the membership is expanding continuously, although I’m not pur-
posely working at it unless I have a brainstorm when I hear about somebody.

People I’ve spoken to who have contributed to the building of content have generally cut
me off after I’ve talked to them for about 30 seconds with the remark, “Don’t tell me any
more. You’re preaching to the choir. You’re absolutely right. I agree with you completely.
I’ll do it!”

A close family friend, who is German and a trustee of the hospital where I’m now working,
asked whether I would make it available for examination by someone at the hospital. This
was about six years ago. I said, sure. So we gave this person web access and the medical
director of the hospital asked a younger physician [Prof Alscher] to have a look at it. Do-
minik looked at it and came back with a written report in which he said that if this system
were deployed in Germany today, the quality of health care would be better tomorrow. We
should do this. So there are people who see the fundamental value. Actually there are
more people who see the fundamental value and want to start to use it than we can sup-
port. And we can’t support them because funds are too tight and we’re too limited in
manpower. So you can find the choir.


We also know from discussions with a global company that 8% of physicians in Germany
would use CLEOS or a similar program now. The eight percent was ascertained by a PR
firm that surveyed German physicians. Eight percent is an interesting number because
I’ve heard marketing people say that worldwide no matter what product you have, 10% of
people are early adopters. And this survey just supports that rule or that conjecture. Now
8% for a project like this is a very large number. And so there are innovators. And we’re
not ready to support 8% of the physicians in Germany for a whole lot of reasons...mostly
to do with money.

COOK Report: So the worldview that choir members seem to have is that they’re the
early adopters; they’re some X portion of a population with a new way of doing things or a
new way of seeing technology, for a variety of reasons, faster than the others. These are
the early adopters, right?

Zakim: Yes, we would look for early adopters. But we know from talking to many people
here in the United States that the reaction of patients here is, boy! How can I get my doc-
tor to use it? Patients in the US see the advantage faster than patients in Germany. Pa-
tients in Germany are much more under the sway of the myth of the physician. They do
whatever the doctor tells them to do. They’re unquestioning and they have more faith in
the system.

COOK Report: We think in somewhat clichéd terms that people in Germany are perhaps
used to a more constrained system with more authority and may question authority more
reluctantly than in the United States. Does this make any sense in explaining the differ-
ence in patients you’ve observed?

Zakim: That’s an interesting idea that the Germans question some authorities less than
Americans but they’re much more suspicious of their government than Americans are. So
it cuts across complex lines. All I know for certain is that Germans have a different rela-
tionship, as compared with Americans, with the medical community. They’re more trusting
of the medical community.

However, because I’m not fluent in German I don’t interact with German patients. In the
United States I know that the patients are certainly very interested in this sort of ap-
proach. I don’t know that that’s true in Germany, although patients who have chronic ill-
ness who have been interviewed by the program think it’s very valuable for their health
because we’ve questioned them.


Role of the “Live” Physician

COOK Report: Let’s talk about the role of the live physician. As the system collects the
history from the patient by going through the decision-trees, is there anything a live phy-
sician could observe that the computer input process would miss?

Zakim: Absolutely. Except that by and large the physician doesn’t make those observa-
tions.

COOK Report: The system probably doesn’t allow him or her enough time in one sense?

Zakim: Partly, but now physicians in the US are mandated to use electronic medical re-
cords. Have you ever gone to a doctor who uses an electronic medical record to record
your history? They don’t even look at you; they’re looking at the computer. We can pro-
vide an output to the physician that the physician can read in a minute or two before the
physician sees the patient, and he or she can then spend time with the patient -- actually
looking at the patient.

COOK Report: And isn’t there the possibility that even before the patient visits the office,
he or she could input this form over the web?

Zakim: It always works via the Internet. In the hospital it’s an intranet.

COOK Report: And the idea is that the physician would already have some input from the
patient before the patient ever appeared in person at the office, right?

Zakim: The patient’s entire history will be available to the physician along with the in-
terim history and the follow-up as well.

COOK Report: It would be really good to see the inherent logic of what you have just ex-
plained gain traction. Meanwhile: a question about where other aspects of your work are
headed.

Natural language and speech recognition programs have evolved over time and are cer-
tainly much better now than they were 10 or 20 years ago. Do you see a similar path for
the system you’re working on, in that the diagnosis and treatment plan of a live physician
can be compared as the system evolves - in clinical studies I suppose - to the one gener-
ated by the computer? If the system works, it eventually produces a better quality out-
come. Do you foresee a process of just gradual use and education via the physician and
the system over a period of years where there will be a growing symbiosis between the
process of the physicians and the patients and the system to where the physician will not
be psychologically thrown for a loop when the computer “outsmarts” them? The self-
selecting physicians by definition won’t be thrown for a loop when this happens, right?

Zakim: They may be. But the system will be tested by its outcome where it will be com-
pared with routine care. Pieces of the system will be developed and licensed as devices in
the same way that drugs are developed and licensed as drugs. In other words, consider
care of hypertension, we don’t have to show we can take better care. If we can take care
that is as good as current standards, we have to get a license as a device. That’s a law in
the United States as I understand it; it’s an FDA (Food and Drug Administration) guideline.

As one physician said, the idea of providing better outcomes than physicians is not a very
high hurdle. This is a long, ongoing process. This will not happen overnight. We are not a
commercial endeavor so it doesn’t have to happen overnight. We are trying to show what
it is possible to do.

Whither the Physician?

COOK Report: When you say you’re going to replace the live physician in the long run,
the role of the live physician is used here as a metaphor. They will never be eliminated en-
tirely, but there’s a complex process here that one is trying to shape and metamorphose
into a more productive process for everyone concerned, right?

Zakim: There are very large-scale social and economic issues. The number one cause of
bankruptcy in the United States is health care costs. There’s no question that health care
costs are going to bankrupt the whole darn country. In fact, if you go to nejm.org (The
New England Journal of Medicine) you can download for free basically a perspective writ-
ten by an economist on the impact of health care on the federal budget.

http://healthcarereform.nejm.org/?p=3170&query=home "The Specter of Financial Arma-
geddon — Health Care and Federal Debt in the United States"

German’s spend about half per capita than the US does. They’re just further from the
edge, but the federal government there understands, like in the US, that they’re getting
closer and closer to the edge. And they’re very interested in controlling the costs of health
care there before health care overtakes their entire budget. And as
Senator Ron Wyden said probably six to eight years ago in a speech he made in Oregon, if
we continue as we are, health care will consume 100% of the GDP (Gross Domestic Prod-
uct). So something has got to be done!

The alternatives are better paradigms or draconian rationing. Consequently, there is a
need to replace a paradigm that no longer works. And the paradigm is that a human being
of above average intelligence can learn all there is to learn to practice medicine on an ar-
ray of patients who will appear at his or her door on any given day. Not only can that per-
son not learn it, but they cannot remember it and use it to make rational decisions.

Each part of the paradigm is broken. The medical knowledge base can’t be learned. It
can’t be remembered. And if it can be remembered, it can’t be used rationally because it’s
too complex. And the reason is simple. There’s a real limit to human cognitive function;
it’s not infinite. It has serious limitations. Now our ability to think is fairly unconstrained,
but our facility to remember and simultaneously juggle a number of balls in the air is quite
limited. Computers can’t think but they remember. And if a computer has to juggle one
million variables and come up with a singular answer, that’s no problem for it if you tell it
how to do it.

Now a human being has to tell the computer how to do it. And if you look back over the
history of medicine, the idea that the role of the physician in a professional way will
change is nothing new. It’s been changing continuously. This is just going to be a natural
evolution. Some people won’t like it, but the economic realities will drive it. There is no
alternative. Well, there is an alternative but it’s socially unacceptable and politically diffi-
cult, i.e. rationing. But unless the paradigms change, rationing will happen.

Trials of CLEOS in Germany in 2010

COOK Report: Could you talk about and tell me as much as possible about the ongoing
trials that will be happening between now and the end of the year?

Zakim: We’re trying to deploy in an emergency room setting, which is a very hostile en-
vironment for this kind of a program.

COOK Report: What compelled you to go into the emergency room setting?

Zakim: Political reality. In Germany, health care is two systems—a hospital system and a
non-hospital system. The academic community is involved in the hospital system. I’m in-
volved here in the hospital system. (I don’t know if there’s a similar system in the Nether-
lands.) My key colleague here in Germany is director of the emergency room. And there
are 20-30,000 visits a year. That’s a lot of people. In the beginning we’re not going to do
outcomes studies. We have compared the historical data collected by the program versus
the historical data collected by the physicians in the hospital. And the program outper-
forms them. The published manuscript has the details of the comparative performance,
i.e. what the doctor finds out and what the computer finds out—the number of problem
areas that go undetected by the physicians. [See BMC Medical Informatics and Decision Making 2008,
8:50 http://www.biomedcentral.com/1472-6947/8/50]

COOK Report: So you have a foundation base of outcomes up to this point in time.


Zakim: Not outcomes, just identifications of problems of patients. That’s a start. I am in-
terested in outcome studies but I need different partners for that. The other thing that
is happening here is that we'll soon start to use the program to build very large data-
bases on the longitudinal behavior of the disease that presents as acute renal injury and
of the disease that presents as curable breast cancer in women. The databases will con-
tain the clinical parameters before, during and after these patients present.

COOK Report: If you look at the next 60 to 90 days, would it make any sense for you to
be more specific about the things you hope to accomplish?

Zakim: Just start to enroll patients, to actually have the program deployed, in other
words to start interviewing patients with these issues on an ongoing basis.

COOK Report: So this will enable you to collect data that can be used effectively accord-
ing to these new paradigms over time in following up and tracking these people, right?

Zakim: To better understand how the patient got to where they are, and what will happen
to them in the future, and to evolve information that will predict for other people so that
one can begin to interdict the natural course of the disease.

COOK Report: Expand a little upon what you just said.

Zakim: The Framingham study was designed to predict who was at risk for coronary
events. The Framingham study gave us guidelines, especially for treating elevated choles-
terol, for reducing the incidence of coronary events. What were the markers that predicted
good outcomes correctly? Those markers are now used worldwide. We hope to achieve the
same kind of long-term result as was modeled in the Framingham study of cholesterol,
but we expect to do it better because we can enroll much larger numbers of patients, and
we can analyze much broader clinical databases.

COOK Report: How do you expect the enrollment of larger numbers of people to pro-
gress?

Zakim: It doesn’t require any physician time. There’s no cost.

COOK Report: It’s just there and it gets used automatically, and as it gets used it proba-
bly gets used in more places. But as it gets used over time, the databases automatically
grow, right?

Zakim: It grows as a consequence of its operation with patients so there’s very little mar-
ginal cost.


COOK Report: Can you conclude by explaining what you would bring to the efforts of the
key clinical directors involved in the various disease database efforts? Such as the Dutch
String of Pearls project?

And how might the process of collaboration unfold?

Zakim: It would just be a collaboration in clinical research. I have something they don’t
have.

COOK Report: That’s the CLEOS software.

Zakim: Yes, I would meet with them and demonstrate how it operates and see if they’re
interested. If they are, we can probably provide some kind of limited access to the server
in the US where it runs and see how it develops. But what would have to be developed
would be very specific clinical research protocols that are very focused.

COOK Report: So you have the servers and the basic framework and the architectural
structure which you could explain. And this is something that the Dutch could begin
to use for the development of their own disease database in each of the seven areas that
the String of Pearls projects intends to develop. But the Dutch are talking about seven dif-
ferent diseases that aren’t necessarily at all what you’re doing in the emergency room.

Zakim: I’m only interested in two of their areas, the two most frequent which are rheu-
matologic disease, which we have very well represented, and what’s called inflammatory
bowel disease, which is also well represented in the program as it exists.

COOK Report: So you’ve done the early knowledge acquisition and the decision tree de-
velopment in these two specific areas.

Zakim: Yes, because they’re very common diseases and these two areas fit with their in-
terests. I know from the literature that the Dutch are building a national database in
rheumatology and they’re trying to do the same thing in IBD. And those two areas inter-
est me. But the other areas in which we have in-depth coverage are areas the Dutch
aren’t that interested in, at least not now. And obviously the Dutch interests reflect the
strengths of their academic medical centers, which are not uniformly strong. At any rate,
I’m open and we’ll see what happens. And the collection of the clinical data would belong
to the system. So if they contributed trees to expand the expertise of the system, what-
ever agreement we had, they would become part of the system and be available to any-
body.


Appendix:
Underutilization of information and knowledge in everyday
medical practice: Evaluation of a computer-based solution
http://www.biomedcentral.com/1472-6947/8/50

Editor's Note: I close with this extended excerpt from David's 2008 BioMed Central pa-
per in order to give readers a clearer idea of the functioning and output of CLEOS

Acquisition of clinical data in the history-taking program is based on the principles of pathophysiology formalized as software algo-
rithms representing medical knowledge as branched chain decision trees. Figure 1 is an example of a typical decision tree, which is
the initial tree in the evaluation of a patient presenting with chest pain. The tree in Figure 1 is one of a family of 29 trees used to
evaluate patients with chest pain and then to acquire a complete cardiovascular history.

The arrangement of nodes in Figure 1 is based on a logical flow of questions in response to answers concerning the pathophysiology
of coronary artery disease as it applies to acquiring a history of a patient with suspected coronary disease, as for example the sever-
ity, nature, location and radiation of pain, associated cardiac and/or vagal symptoms, precipitators of pain and so on. Nodes coded
with 4 letters followed by a Roman numeral and ending in an Arabic number represent questions. Small case letters are answer val-
ues for the parent question. A specific answer or set of answers points to a specific follow up node, indicated by the direction arrows.
Question nodes are of several types: yes/no; yes/ no/uncertain; single selection multiple choice question; multiple choice questions;
graphic representations of anatomy, e.g. to select affected joints in a patient with joint pain; and user entries, which can be entry of
free text or entry from pull down menus. Nodes consisting of 3 letters represent sets of trees. Node PPU, for example, is a set of
trees representing the pathophysiology of pulmonary disease as it applies to history-taking. The tree in Figure 1 will switch from
acquiring a history concerning cardiac disease to focus on possible pulmonary disease when, for example, a patient's chest pain has
the properties of pleuritic pain. The node PGL is a set of trees representing the pathophysiology of gastrointestinal disease, exclu-
sive of liver disease, as it applies to history-taking. Nodes with 4 initial letters, a Roman numeral and ending in a capital letter repre-
sent single trees, which in turn represent single pathophysiologic issues. Node PGLHID represents a tree that questions the patient
about swallowing disorders. Thus, when the history of chest pain (acquired by transit of the tree in Figure 1) indicates intermittent
chest pain always precipitated by swallowing, the program will direct history-taking to the pathophysiology of swallowing disorders
(node PGLHID) and then to a review of the gastrointestinal history (PGL). Nodes labeled complete end a string of questions; and
depending on the arrangement of nodes at least one "Complete" node ends a tree.

As alluded to already, knowledge as trees for history-taking is broken down by organ system and further as medical issues within an
organ system. The elements acquiring the cardiac history, for example, are organized as separate trees for issues like hypertension,
peripheral vascular disease, heart failure, coronary artery disease, and so on. Trees within an organ system and across organ systems
are integrated as a single interview by a master tree so that patients are asked relevant questions about all systems no matter what
their chief complaint, i.e., where in the set of trees an interview began. Moreover, no matter where the interview begins, e.g., inde-
pendent of the organ system of the chief complaint or a switch in the pathophysiology pursued by history of the present illness, the
program will be directed to acquire a history of all organ systems, a social history, a history of significant past medical events, a
history of medications being taken, a history of known allergies and a family history. The master tree will direct the interview to a
complete cardiac history as part of the review of systems for a patient with pleuritic pain or a patient with a swallowing disorder
manifesting as chest pain. The program thus emulates history-taking by a physician, who may decide as the patient's story develops
to switch a line of inquiry to pursue alternate pathophysiologies. This level of flexibility at run time is achieved through the mecha-
nism of gating entry into every tree in the program.

The program "tests" whether a patient should be interviewed about every issue in the program as that issue is reached in the course of an interview. The "test" is based on answers to prior questions in the interview, which "gate" entry to every tree. Questions HUBHIB60 and HUBHIB33 gate the entry into the tree in Figure 1. One of these questions must be answered "Yes" (answer value a) for the tree to be entered. If the tree in Figure 1 is not entered, the interview proceeds to the next tree in the order of trees within the module NAP. The mechanism of gating enables the program to explore all aspects of the history that are significant for a given patient while avoiding redundancy and issues of no medical significance, as determined by answers to questions at each node in an interview.

Authors of content use the tree structures shown to develop their history-taking programs. The information in trees is compiled as
machine-readable code, which directs the program at run time by calling questions from a database as the program instantiates one
or another node. The program tested in this work comprised 8018 nodes representing approximately 12,000 data fields.
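
Editor's Note: Building on the data-structure sketch above, the gating test and the run-time traversal that the compiled trees drive might look something like the following. Again, this is the editor's illustration under assumed names, not the machine-readable code the authors describe.

# Illustrative sketch of gating and traversal -- not CLEOS source code.
def tree_is_gated_open(tree, answers):
    # A tree with no gating questions is always entered; otherwise it is entered
    # only if at least one gating question was answered with the value that opens
    # the gate (e.g. "a" for Yes).
    if not tree.gates:
        return True
    return any(answers.get(question_id) == value for question_id, value in tree.gates)

def run_interview(master_tree_order, trees, ask):
    # Walk the master tree's ordered list of trees; within each tree that is
    # entered, follow the answer-directed arrows until a "Complete" node ends it.
    answers = {}
    for tree_id in master_tree_order:
        tree = trees[tree_id]
        if not tree_is_gated_open(tree, answers):
            continue  # skip issues of no medical significance for this patient
        node_id = tree.root
        while node_id != "COMPLETE":
            node = tree.nodes[node_id]
            value = ask(node)            # present the question, return the answer value
            answers[node.node_id] = value
            node_id = node.next_node[value]
    return answers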

At the end of an interview, the data acquired by the history-taking software are analyzed by a large set of inferences for which Figure 2 is an example. Rules use the Boolean operators AND, OR and NOT. The rules serve two functions. They organize the data as a clinical narrative; and they identify clinical problems.
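
Editor's Note: A minimal sketch of what one such rule might look like, assuming the coded answers are available as a dictionary of data fields. The field names, values and the exact logic are hypothetical; the actual rule behind "Symptoms compatible with angina" is the one shown in Figure 2 and is not reproduced here.

# Illustrative sketch of an inference rule -- not the rule shown in Figure 2.
# "If statements" cite coded data fields, never free text, and are combined
# with the Boolean operators AND, OR and NOT.
def symptoms_compatible_with_angina(fields):
    # Hypothetical field names and values, for illustration only.
    substernal = fields.get("chest_pain_location") == "substernal"
    exertional = fields.get("chest_pain_precipitator") == "exertion"
    pleuritic = fields.get("chest_pain_quality") == "pleuritic"
    return (substernal or exertional) and not pleuritic

def table_of_active_problems(fields, rules):
    # Second function of the rules: every rule that fires contributes an entry
    # to the report, e.g. the Table of Active Problems in Figure 3.
    return [label for label, rule in rules.items() if rule(fields)]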

Figure 3 is an example of a report generated by the inferences within the program. Much of this report is narrative; but all key elements identified in an interview are reported as entries in Tables, e.g. Current medications, Table of Active Problems (Figure 3). Some of the problems identified in the table of Active Problems can be identified from entries into single data fields, as for example the presence of hypertension. The citation "Symptoms compatible with angina" is based, however, on a rule interpreting data according to pathophysiology. Note in Figure 2 that interpretation of clinical data to generate a possible diagnosis, e.g. "Symptoms compatible with angina," is concept-driven. "If statements" comprising a rule cite data fields, not specific text.

Deployment of software

Prior to deployment, the program was bench-tested extensively for errors in the logic of the trees, for errors in the flow of questions
and gating of entry to trees and for errors in reporting of findings. In addition, the program was used by several hundred out-patients
in the United States.

The software for acquiring the history operated from a central server behind a firewall within the IT department of the Robert-Bosch-Hospital (RBK, Stuttgart, Germany). Patients connected to the history-taking software via the hospital's intranet. Patients selected their preferred language of German or English for the interview. After entering demographic data, they were asked to select a chief complaint from a set of menus. All subsequent questions were determined by the software as illustrated in Figure 1. Patients were able to review and change prior answers by moving the program backward one question at a time along the same branching pathways as the forward loop of questions. Moving backward erased answers to questions traversed in the backward direction. Just as the program determines when a specific tree ends (see Figure 1), the master tree ends on a "Complete" node at the end of the family history. When the interview ends, all data are stored as codes in a structured, searchable database within the central computer. Data from computer interviews were extracted from compiled reports as in the example in Figure 3, not by review of individual data fields. Computer-acquired histories were available only to reviewers, not to patients' physicians. Computer histories were extracted by D.Z.; physician histories were extracted from the hospital chart by N.B.
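
Editor's Note: The backward-navigation behavior described above amounts to a stack discipline over the questions already answered; a minimal sketch with invented names, not CLEOS code:

# Illustrative sketch of backward navigation -- not CLEOS source code.
class InterviewPath:
    def __init__(self):
        self.path = []  # (node_id, answer_value) pairs in the order they were asked

    def record_answer(self, node_id, value):
        self.path.append((node_id, value))

    def move_backward(self):
        # Step back one question along the same branching pathway; the answer
        # traversed in the backward direction is erased, so the forward path
        # must be re-answered from this point.
        return self.path.pop() if self.path else None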

Patient selection
Patients who were interviewed had been admitted emergently. Patient charts were not consulted prior to any interview; and the
study nurse, who picked patients randomly for interview, had no knowledge of these patients before an interview and carried on no
discussions with floor staff about a patient's clinical state. No demographic criteria were used to select patients except that inter-
views were limited to patients on general internal medicine, cardiology and nephrology services, which agreed to participate in the
program. Patients were asked simply to participate in a test of a new program by which computers interview patients about their
past and present health problems. No attempt was made to explain to patients the clinical value of the medical history. No patient
who was asked to participate declined to be interviewed.

Ninety-eight patients were interviewed by the computer program after a physician's history was recorded. Patients could choose between a computer workstation in the study unit of the clinic and a laptop in bed, which had a wireless connection to the intranet of the hospital.

Editor’s Note: IDM has a password-protected test site [www.insdm.org] that allows health care pro-
fessionals to use a run-time version of the CLEOS® program. Interested health care professionals can
contact David Zakim at david.zakim@idm-foundation.org to request time-limited access to the program.
We will have a public site at www.insdm.org in July, with access to the vitae of David Zakim and other
key people, manuscripts describing clinical research results generated by CLEOS® and a video demon-
stration of CLEOS® in operation.

Executive Summary

The Progress of the Medical Record from a Mainframe
Hierarchy to Decentralized, Edge-Based Intelligence

Larry Weed and the Coupler - The Path from Promis to the
Personal Computer and Increasing Decentralization

By the 1980s, when Dr. Weed began the work that ultimately led to the formation of Problem Knowledge Coupler Corporation in 1982, Moore's law and its inherent economics were shifting the availability of the technology from expensive, institutionally based minicomputers to personal computers. These shifts changed the landscape of what was possible for him to accomplish.

At the same time the choice of TCP/IP as a decentralized standard for the NSF funded supercom-
puter centers enabled NSFnet to embrace the internetworking philosophy of the builders of the AR-
PAnet. In the age of the integrated circuit, intelligence and the ability to experiment and innovate
via loosely coupled cooperation moved inexorably downward and outward from the commanding
heights of centralized industrial era organizations to the people at the edges.

As a result, people like Larry Weed had access to technology that, a few years previously, only a huge corporation could have afforded. They were free to experiment and innovate because the costs of failure were orders of magnitude less. Most important, however, was the fact that these
moves also enabled the emergence of a global Internet that would make possible David Zakim’s
approach aimed at building a globally distributed system called CLEOS. [We note that Dr. Zakim’s
system is also being established with the Peer to Peer knowledge commons described by Michel
Bauwens in a future COOK Report. Very likely August 2010.]

Larry Weed set out to do what could be done with the tools at hand. What was possible with the emergence of the personal computer and the parallel decline in price and power of larger computers suitable for a small hospital was to carry forward and refine the 1970s NIH-funded PROMIS
experiment carried out at the University of Vermont Medical School. Dr. Weed saw that for the first
time it would be possible to think in terms of how to bring the multimillion-dollar investment
needed for the 1970s technology into the office of individual physicians for between 5 and 10 thou-
sand dollars per office.

The evolution of patient intake and history was a matter of reasonably straightforward input from the earlier PROMIS work. As Shoshana Zuboff had shown in her 1988 classic, In the Age of the
Smart Machine, the political issues of applying computers to medicine were overwhelming. Larry
had gotten to the point of a clinical trial on the maternity wing at the University of Vermont Medical
School in Burlington. In an era when basic economic costs of medicine were rising rapidly but not
at such a terrifying pace as now, the political and sociological problems combined to derail further
progress. The doctors saw it as an unacceptable threat to their positions. But when his funding
ended Larry refused to give up his dream.

How to Structure and Formalize the Knowledge Base in a Pre Internet Era
During the 1980s, I would argue, Larry Weed's major problem was how to deal with structuring and formalizing the evolving medical knowledge base. Given the economics and tools at hand it was mandatory to find an economically sustainable approach to the daunting task of capturing medical knowledge for the ongoing development of the PROMIS medical expert system.

In 1981, in order to describe his desired direction, he wrote an article: "Physicians of the Future"
in the New England Journal of Medicine.

“In medical practice, expert thinking should be coupled to action through modern tools of commu-
nication (principally computers); we should not rely too heavily on human memory and analytic
capacities at the time of action. … All workers in the basic biologic sciences … should have the out-
put of their research systematically entered into computerized knowledge structures that are cou-
pled directly to the everyday actions of all providers solving problems in health care.” In an April
11, 2010 email Larry added: The second page of the three page article refers to the need “to de-
velop a computerized guidance system that couples the best in current thinking to the patient’s
problems and provides feedback on all actions.” A 1982 fourteen page paper describing Weed’s
thinking is found here: http://pkc.com/papers/philosophy.pdf Larry’s email concluded: “The big
issue is combinatorial thinking enabled by computers and joined with the computerized problem-
oriented record.”

When I asked David Zakim to respond specifically to these quotes by Larry Weed he replied: “They
describe what I am doing with limited sets of knowledge. The idea that all medically related re-
search should be incorporated into some type of formalized knowledge structure is the right idea of
what has to be done. That said, it will take an army of physicians to carry out the task, which will
happen only as the role of physicians shifts from immediate providers of care [in a cognitive con-
text] to "programmers" of medical knowledge into machine-readable databases of knowledge. I
believe this will happen on the order of 35 to 50 years from now. I won't be around when it starts
to happen as Weed envisions it. My role is to show that it can be done and can provide significant
societal benefit.”

Given the capability of the tools at his command between 1980 and 2000, it was only natural
that Weed would have to think and work in terms of what became known as “knowledge couplers”
– recipes for making connections to the knowledge regarding the patient’s problem set found in the
current literature.

I contend that Larry Weed did what was natural given the state of the art in the 1980s and that David Zakim, by not starting until a full generation later in the year 2000, was able to view the opportunity before him from the foundation of a fresh perspective. This new foundation rested on massive increases in computer processing power, storage and database performance, as well as on the new
new platform of the Internet and the change in outlook from closed and proprietary to open and
collaborative.

Larry Weed's approach in 1982 was quite reasonable: establish a private corporation called the Problem Knowledge Coupler Corp. and seek funding in order to build knowledge couplers that
would guide physicians in the treatment of the most common families of complaints presented by
their patients. When I examined PKC Corp. in the second half of 2009 they had built, after ap-
proximately 28 years of existence, a family of about 90 to 100 knowledge couplers that repre-
sented perhaps 80% of the problems with which patients seek medical care.

As I documented in my November 2009 COOK Report, looking for insight into my own issue of ar-
thritis, I very eagerly tried out the most likely knowledge couplers after having completed PKC's
basic patient history coupler. They were useful and informative but did not go nearly far enough.

While PKC seems to have found a useful role in being able to maintain the current family of approximately 100 couplers, Larry Weed at one point would have liked to expand the family to as many as 1000 because the knowledge set was not deep enough with 100. For example, it was easy for me to ascertain that my own problems fell within the realm of arthritis, but the level of the patient's medical history at intake was not detailed enough to arrive at a point of inference involving, in my case, the early onset of the chronic problem. When this decision point was broached in a visit I made to the Utrecht University Medical Center in the Netherlands on 12 March 2010, it enabled a further line of questioning that resulted in a diagnosis of DISH (diffuse idiopathic skeletal hyperostosis) - an event that brought some welcome clarity into a personal 20-year history of uncertainty.
http://www.mayoclinic.com/health/diffuse-idiopathic-skeletal-hyperostosis/DS00740

I clearly would feel better off with 1000 couplers than with 100. The problem with this approach, however, is that the additional 900 couplers would demand a vast increase in the size of the staff of computer-savvy medical librarians working for a private company.

Now let's assume for the sake of argument that 50 medical librarians could keep 100 couplers updated and serve 80% of the population, and that perhaps 500 would be required to keep 1000 couplers updated. The result would be a presumably at least tenfold increase in the cost of operating the corporate entity that kept the system viable in a way that would enable it to serve a hypothetical 100% of the population.

To make the numbers simple, if operating expenses of $10 million a year could serve 80% of the population, serving the remaining 20% would very likely demand a tenfold increase, or an expenditure of $100 million per year. It becomes quickly evident that we are looking at a problem that does not scale. One of the reasons it does not scale is that it has more than 15 years of investment in a prior generation of technology that does not take account of the changes that Dr. Zakim has been able to leverage.
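
Editor's Note: To make the arithmetic of this scaling argument explicit, here is a back-of-the-envelope calculation using only the figures hypothesized above; the numbers are the assumptions stated in the text, not actual PKC financials.

# Back-of-the-envelope sketch of the assumed numbers -- illustrative only.
cost_80_pct = 10_000_000           # $10M/year keeps ~100 couplers updated, serving ~80% of patients
cost_100_pct = 10 * cost_80_pct    # ~1000 couplers and ~10x the librarians -> ~$100M/year

per_point_first_80 = cost_80_pct / 80                   # ~$125,000 per percentage point of coverage
per_point_last_20 = (cost_100_pct - cost_80_pct) / 20   # ~$4,500,000 per percentage point of coverage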

However, there is another problem with the private corporate PKC approach. Management inevitably saw it as a profit-making enterprise that, if it did succeed, would bring in untold sums of money. This was not surprising given that the company that Larry Weed founded in 1982 started with a private, siloed, and proprietary profit-making vision which, given the state of society and technology at the time, was really inevitable.

But by 2005 PKC needed further investment if it was to solve some of the most intractable problems facing our global society. Unfortunately, in seeking this investment there was a split in top-level management, and in June of 2006 its founder, Larry Weed, left. PKC continues doing good work, but with its income based, in substantial part, on contracts with the US Department of Defense, its long-term potential -- when compared to the very different edifice that David Zakim has been building for the last decade -- is limited.

David Zakim and the Transition to CLEOS as a
Next Generation Internet Knowledge Commons Based System
Examination of the differences between the structure of problem knowledge couplers and what Dr.
Zakim has been working on can provide some instructive insights into why we should be optimistic
about the prospects of CLEOS and the Institute for Digital Medicine.

David’s initial funding steps were commercial rather than academic because his age and the
paradigm-breaking message behind his research meant that the chance of something like NIH
funding would be remote. He was able to find successful business friends with strong social con-
science to provide money to catalyze his earliest efforts and allow the first generation code to be
created.

David, starting from the same fundamental premise as Weed, told me: “I could see that there were
unsolvable issues with the approach to expert systems as they were conceived. They were built by
different people for the purpose of dealing with different specialties. They were based on closed
knowledge sets and did not inter-operate.”

Consequently “I knew from the beginning that I had to be involved with the system, which would
be built from the bottom up with a long-term perspective that anything that was built would fit into
a system that would [by design] manage the totality of problems, and [the use of which] would not
require at any point any input from physicians except for the physical examination.”

That is a profound and revolutionary problem statement. Architecture would be key to the viability of the enterprise, and the right architectural choice would both increase the scope and pay dividends in future operating economy. In the digital age, architecture determines the economics of a system, which in turn will determine success or failure. As David puts it: "Now my philosophy is based
on the idea that in all the stakeholders in the health care system, there is only one participant that
really cares about the outcome. [This is the patient.]. . . . The patient holds the key to the whole
process because only the patient knows what’s happened to him or her.”

In 2010, fifteen years after the advent of the commercial Internet, and riding on the ubiquity of the web and the vast increase in the portion of our population that has grown up familiar with the everyday use of computer tools, there is a vastly different and more hospitable environment for patients to take charge of their own health care by interacting with computer-based record systems.

The commoditization of technology means that every user of the Internet has the ability to interact
with David’s system. The motivated and interested patient is given for the first time a tool that can
make a major difference. In David’s words “The system we have today interacts directly with the
patient to acquire their life-long medical history up to the point in time when they’re interacting
with the program. This is CLEOS (Clinical Expert Operating System).”

“CLEOS addresses the acute and chronic manifestations of adult illness, and especially adult ill-
nesses associated with chronic disease. It addresses the acute complications of these chronic ill-
nesses and a lot of acute illness that occurs in patients without chronic illness, like pneumonia and
a variety of problems. The system does this by asking the patient what's wrong."

What CLEOS is basically built to do is to emulate the thinking of panels of experts in different do-
mains of medicine. So within adult internal medicine there’s endocrinology. Within endocrinology
there’s a thyroid domain, a diabetes domain and so on; within cardiology there are several do-
mains; within pulmonary several domains.

“We would have experts build expert domains with the purpose of interacting with the patient”.
The domains are structured into decision trees as described in the October 2009 text box above
where we read: “The trees leverage the knowledge of committees of physicians with clinical exper-
tise in different fields of medicine in that any patient with access to the Internet can receive the
benefit of the knowledge of the set of experts, who authored the content of the program.”

As David says in the COOK Report interview: “The Institute of Digital Medicine will be like the edi-
torial board of a treatise of medicine with one or two senior editors and panels of editors in sub-
categories who can then assign certain parts of their domain to other people. It’s a worldwide ef-
fort. Ultimately it will be a very large worldwide effort to secure experts from around the world in
different areas of medicine to contribute and also to maintain different parts of the domains.”

“I’m building it in such a way that more knowledge and more software functionality can be added
to it. And everything in the system is useful because it started with the ultimate purpose that it
would be a system that needs to interact only with the patient, and that needs to interact with
laboratories only online by knowing where to go get laboratory data for patient X or patient Y, and
it needs to interact with physicians only to the extent that a physical examination is required.”

David adds: “I’ve built a software system that contains instructions or the formalization of medical
knowledge. I also have formalized a textbook of physical diagnosis as software to direct and record
findings of physical examinations; and ultimately we will put this into the CLEOS system. I know
what can be done. I haven’t implemented CLEOS yet on any kind of scale because we don’t have
the money.”

I contend that what David is doing is laying the groundwork for a system that can succeed because it is founded on the ability of his patient history intake system to become available via the World Wide Web on a global scale. In other words we have the ability to gather the data. But once gathered, the data can be subjected to the analytics of the Fourth Paradigm to enable ongoing clinical studies of unprecedented scope and power. You build it like a carefully controlled and monitored Wikipedia and let it benefit from what Yochai Benkler has identified as "commons-based peer production" driven independently of monetary incentives. It can function this way because it will be installed as infrastructure into the operation of ongoing medical activities in order to make these systems function more effectively. Here is how David puts it in his 2009 paper:

Technology like CLEOS® can assemble extremely large clinical databases at marginal cost because professional time is not required for generating most of the clinical data.

It can be especially cost-effective because, as it spreads through the health care system, it becomes a part of the infrastructure. It is just "there" and does its thing as a part of normal operation of the health care system. On a local level, there's no maintenance of the program. Needed maintenance occurs centrally. The patient or physician as users can interact with the CLEOS program in the way they might look something up in the NY Times. These are attributes that could not happen without the Fourth Paradigm's dependence on the optical networks of the Internet and its evolving e-science tools.

Furthermore, when CLEOS spreads to the point that it begins to operate as part of the basic medical infrastructure, it will facilitate physician use of medical records because the physician will have, as a byproduct of everyday practice, the complete patient history data along the lines of the illustrations shown in the 2008 paper excerpts included as the Appendix on pages 37-41 above.

Finally, and most significantly, I contend that the knowledge mapping and coupling process of the Institute for Digital Medicine treatise described by David can look to the evolution of Wikipedia for its proof of concept. Wikipedia seemed an impossible goal when it started in 2001, but it has metamorphosed into a global multi-language resource via the structural capability of Internet technology while being propelled by the desire of knowledge experts to collaborate.

Nothing will ever grow in quite the expected way, and the IDM knowledge input into CLEOS will undoubtedly not be as open as Wikipedia. Nevertheless, Wikipedia provides an organizational proof of concept. Furthermore, its collaborative knowledge commons foundation seems certain to be able to solve the disciplinary and geographical scaling problem that confronted PKC Corp. as a private, for-profit corporation engaged in a task where full success requires encyclopedic scale and global scope.

The environment described by Yochai Benkler in The Wealth of Networks http://cyber.law.harvard.edu/wealth_of_networks/Main_Page makes us optimistic for the success of David Zakim's project. On the OpenBusiness site http://www.openbusiness.cc/2006/04/24/the-wealth-of-networks/ we read:

"But according to Yochai Benkler, traditional economics shaped by industrial norms has failed to explain the emerging pattern of open production. Against this context he introduces the concept of commons-based peer production driven by non-monetary and non-proprietary incentives." OpenBusiness of course had to ask where the money is in this context.

1. In your book you describe an emerging mode of production, which you call "commons-based peer production". What does this mean? Can you give examples?

"By "commons-based peer production" I mean any one of a wide range of collaborative efforts we are seeing emerging on the Net in which a group of people engages in a cooperative production enterprise that effectively produces information goods without price signals or managerial commands."

I expect that by means of hard work and good fortune CLEOS has been designed by David in such a way that the medical community will begin to invest in it to ameliorate ongoing problems and that, because of its open architecture, it can be built on a Benkler peer-production-based foundation. While the task is stunningly large, the technology, economics and politics may be aligning in a way that will enable success.

Afterward
I do not want to suggest that my comparison of David Zakim and Larry Weed presents an exhaus-
tive picture of the use of computers in health care. It does not. There is much work being done on
electronic medical records and on medical expert systems of all kinds. What I wish to focus on is
broader. It is an organizational transformation of medicine that was begun by Dr. Lawrence Weed with his Problem Oriented Medical Record. By means of problem orientation of records, patient history could be charted in such a way that the structure of knowledge and the likely effect of its application could be clearly tracked.

Massive rapid linking and retrieval of stored data by means of computers became possible after Weed gave medical records a structure within which to function. Source orientation kept everything in meaningless vertical silos. Problem orientation enabled lateral communication where for
the first time the power of computers could be applied to coupling or linking the vast amounts of
knowledge that could never be held in real time in the mind of a physician. This would be done
through knowledge structures, be they called couplers or decision trees, navigable via computer-
based expert systems.

Larry Weed laid the foundation for digital record keeping and began to apply computers to the practice of medicine in the trial of PROMIS in the 1970s, and then, starting in 1982, in a small-scale commercial setting with the Problem Knowledge Coupler Corporation.

I contend that David Zakim, who beginning with his retirement from Cornell in 2000 recognized the same medical problems that confronted Larry Weed, has taken a global approach to their solution. Both of these men want to reshape all of medicine. Not just a part. The central meme is use of the computer to organize medical knowledge and couple that organized knowledge to the brain of the physician as he or she formulates diagnosis and treatment plans. Larry understandably started with the physician. While in 1974 Weed published a paperback book for use by patients called Manage Your Own Health Care, David Zakim, enabled by a new generation of computer and network technology, is beginning with the web-based intake of patient history.

Overall, I see the two men as following the same path: creation of an ordered structure of data into which computerized guidance can be built to increase the likelihood that decisions can be captured and tracked. The goal is to use the knowledge acquired to verify the quality of treatment outcomes in the broadest and most global sense. The task is huge and the benefits incalculable. While
there must be tens of thousands of different applications of computers to medicine, if there are any
others that match the universal scale and scope of the work first of Weed and now of Zakim, I am
unaware of them.

Symposium Discussion April May 2010

The New Internet Architecture:
Cloud and Content Providers Versus the Incumbentʼs Last Mile
Editor's Note: It would help, if only to confirm our suspicions that the Internet is an economic infrastructure important to a viable global economy, to have some idea of what the actual architecture of the Internet is. Hidden behind bulwarks of non-disclosures and corporate obfuscation, the Net is now a black box. If you are trying to establish a policy for this major global infrastructure that ensures, as infrastructures generally should, that it functions in the public interest rather than in the interests of firms large enough to siphon off most benefits for their shareholders, you should understand how it is structured. In other words – what is its basic architecture?

The past six years of Net Neutrality squabbles have not helped. The FCC, by making the Internet a Title One Information Service in 2004 and largely dropping regulatory reporting requirements, has deprived us of the ability to see the Internet as anything more than a mysterious blur of "pipes."

Bill St Arnaud, who for 15 years directed the development of Canada’s research network, has
written a very important paper describing the current architecture:

A personal perspective on the evolving Internet and Research and Education Networks

Here is an abbreviated abstract:

“Over the past few years the Internet has evolved from an “end-to-end” telecommunications
service to a distributed computing information and content infrastructure. [snip]. This evolution in
the Internet was predicted by Van Jacobson several years ago and now seems readily evident by
recent studies such as the Arbor report indicating that the bulk of Internet traffic is carried over
this type of infrastructure as opposed to a general purpose routed Internet/optical network. This
evolving Internet is likely to have profound impacts on Internet architectures and business models
in the both the academic and commercial worlds. Increasingly traffic will be “local” with
connectivity to the nearest cloud, content distribution network and or social network gateway at a
local Internet Exchange (IX) point”..

Bill references Van Jacobson's August 2006 talk at Google on a content delivery based architecture for the net. See http://video.google.com/videoplay?docid=-6972678839686672840&ei=iUx3SajYAZPiqQLwjIS7BQ&q=tech+talks+van+jacobson#

A 2007 Van Jacobson talk is here: http://collegerama.tudelft.nl/mediasite/Viewer/?peid=1b65466d7b9b4cbd973b089ca5a590a8

Then there is this very interesting entry, which can be found with a search for "Van Jacobson Google architecture".

[Van Jacobson content-centric networking insights at the 4Ward … - 3:04pm Aug 18, 2009 … Van
Jacobson content-centric networking insights at the 4Ward … In the large and content-dense talk
(around 2,5 hs), Van provided many details of the CCN architecture]

I have read fairly carefully the 96-slide lecture of August 18, 2009 and find it extremely fascinating. What Bill is doing in his talk is describing how content and applications have moved to the edge over the past five years. And what Van seems to be saying is how to construct an overlay for the delivery of content that can run quite smoothly on TCP/IP yet be quite independent of limitations, such as the IP address space shortage, that seem to be popping up all around us.

I wish I could say that I understand in any appreciable level of detail precisely what Van is doing —
I do not. But I think I do see the general implications. With virtually unlimited storage and memory at our desktops and at the intelligent nodes at the edges of the network, the Akamaisation of the Internet becomes almost a foregone conclusion. Indeed it seems to me that Google, to deliver its material, has to be using Van Jacobson-like content-centric networking protocols.

Now imagine running into this story on the front page of the New York Times website a couple of hours ago, where it explains how an increasing number of geeks have downloaded the unreleased source code for Google's Chrome operating system, put it on their laptops or notebooks, added their own enhancements, and are happily using it as an OS to access the Internet.

What I find really intriguing is that very few people seem to comprehend the change in basic Internet architecture since the popping of the Internet bubble: you no longer assume that information is at some specific geographical location and depend on the network to follow the old 1990s rules of going to that point to get the information and bringing it across one or more backbones and through one or more Internet exchange points before delivering it to you.

But the content providers, be they publishers in the old style or websites in the new style, have brought the information to the general vicinity of virtually every MSO head end or telco central office, and it is only the duopoly's last mile that stands in the way of extraordinarily cost-effective access to the data. And it is Google of course that far more than anyone else has created an environment that invites everyone to live in Google's Cloud. Netbooks and the Chrome OS would seem to be the final wrapping on the package. Of course the only inconvenience is the duopoly standing in the way of the average person's access to the Cloud under such circumstances... one must imagine, or at least hope, that it will become increasingly possible to jump over the moat.

What intrigues me is how long it will take before more people understand the change in
architecture. And of course perhaps it’s the strategic significance of this change that leads everyone
involved to be so proprietary about their service and the geographic locations on the surface. The
people without any clothing look to me like the telco and MSO incumbents. They have warned us
that they have no incentive to invest in their networks but at the same time, because they have
fortified their stranglehold over the end-user, the rest of the world has had to build out to the very
edges where the duopolists attach their customers. What kind of investment in what networks one
must ask?

And because they have relied on their respective monopolies and, as Bill St. Arnaud points out,
they have not done the kind of building out that the content and application providers have, they stand there potentially quite naked relying on secrecy, nondisclosure, obfuscation, lobbying and the
whole nine yards to maintain the fiction that they control the kingdom when in reality what they
control are only the drawbridges separating the people from the castle.

That is what I find so important about what Bill has written, and for which you can get plenty of intellectual backup from Van Jacobson's content-centric networking efforts. A Google-controlled Internet running on Chrome and on a Van Jacobson-like content-centric networking overlay of TCP/IP may not be everyone's favorite Internet but it will be a force to be reckoned with. While it is tempting to think that the sooner the reckoning the better, one also must wonder how comfortable a Google-controlled universe with no counterbalance would be. A rather chilling spoof may be viewed here: http://www.vidzshare.net/play.php?vid=16345

An Abbreviated Summary of Bill St Arnaudʼs Architecture Paper

Bill finds that the global Internet has now grown to such a huge extent that the old pretense that the Internet is truly end-to-end in its structure has outlived any claim to economic usefulness. Sending content or applications from one or more edges across the entire global network to all other possibly interested edges simply doesn't scale.

He shows why, with the spread of exchange points, it has become far more cost effective to move content and applications -- software as a service, for example Salesforce -- to places as close to the end user as possible, where as much data as possible is cached – in other words, kept locally.

He speaks in terms of "Application Content Infrastructure (ACIs) [that] are specialized infrastructure providing advanced services to the global Internet community that are delivered
locally. This avoids all the attendant problems of packet loss and congestion on the backbone
network (and also in the last mile). Invariably ACIs connect to the public Internet at the IXs or at
private peering points located in most cities. Their development was driven by several factors, but
primarily it was to enhance the user’s Internet experience, especially when there is significant
congestion and packet loss in the last mile networks.”

ACIs do not seem to be widely defined as such yet. Bill appears to be reasonably conflating application infrastructure (middleware, software as a service, cloud computing, etc.) with content delivery, best understood as websites and related sources of data whose delivery can be enhanced by caching at the network's edge. Akamai pioneered this in 1999. The COOK Report extensively described Akamai in June 2000.

Bill illustrates his ACI concept by pointing out that Facebook runs its own infrastructure where users' data is stored on servers geographically as close to the user as possible and sent to all other servers connected to an ACI infrastructure put together by Facebook. This infrastructure functions in a way similar to a VPN. In this scheme of things Facebook does its thing with little if any data traveling over the public end-to-end Internet.

Bill points out that “Virtual Private Networks (VPNs) are extensively used by Fortune 500
companies, governments and other large organizations for internal traffic purposes.” However
“some ACI networks are global in scale and rival the carrier public Internet networks in terms of
size and complexity. Good examples of such large scale ACIs include Microsoft MSN's network, Amazon's EC2 cloud, Google's Gmail and YouTube infrastructure as well as Level3's Content
Distribution Service. Many Fortune 500 companies like banks and airlines have also deployed their
own ACIs as an adjunct to their own internal networks in order to provide secure and reliable
service to their customers."

He continues: “While an ACI must make extensive investment in router and servers, it doesn’t
need to advertise the full routing table. Rather than maintaining their own web servers and email
many organizations outsource this to ACI specialists to save on costs.”

“Content and application service providers depend on locations in hosting centers for the
distribution, replication and load balancing of content and media to various IXs and other
interconnection points throughout the Internet. The quintessential examples of CDNs are Akamai
and Limelight. There are over a dozen companies in this space including players such as
CDNetworks, Edgecast, Voxel, Mirror Image, Amazon Cloudfront, BitGravity, CacheFly, Edgecast
Networks, etc” [snip]

“While CDN facilities are useful and cost effective for distributing web pages and streaming video
they could not address the need for computing intensive applications or storage. "Clouds" were
developed to address this need. Clouds are essentially a global ACI facility of tightly coupled and
interconnected computing and storage resources. An entire separate paper could be written on
clouds but generally they are broken down into three major applications areas:

· Infrastructure as a Service (IaaS)

· Platform as a Service (PaaS)

· Software as a Service (SaaS)”

“Infrastructure as a service is cloud ACI where users can configure their own virtual computing
infrastructure and deploy their own applications. The Amazon EC2 cloud is the prototypical
example of this type of cloud ACI. Platform as a Service, on the other hand, removes much of the
complexity of creating virtual machines and allows users to focus on deploying an application. Microsoft Azure is built along these lines."

Application infrastructure and content distribution infrastructure come together at exchange points.

“The key component that ties together hosting companies, CDN facilities and clouds into a
collective ACI service is the Internet Exchange (IX) point. They often serve a dual purpose:

(a) This is where media, content and application companies connect to third party ACI facilities for
distribution of their product; and

(b) This is where ACI facilities connect with last mile access providers for delivery of the content
and applications to the consumer.”

“There are several hundred IXs around the world. A reasonably comprehensive list of these
exchange points and the connected ACIs and Internet backbones can be found at http://en.wikipedia.org/wiki/List_of_Internet_exchange_points."

ACIs improve delivery of online services by distributing them to exchange points nearer
the end users.

”Keeping data local and thereby reducing transmission time can significantly enhance a users’
perception of the quality of the responsiveness of an application – even under congestion and
packet loss. As long as there is sufficient bandwidth to prevent egregious congestion in the last
mile virtually all new innovative content and applications can be delivered over a best effort last
mile network if they originate from an ACI. For example, only a short time ago it seemed
unimaginable that one could deliver HD quality movies over the Internet – but this is done
routinely today by many services delivered using ACI, such as Hulu, Netflix, and YouTube.”

The research and education community has its own application content infrastructures in terms of
things like the Optiputer and LHC network.

How ACIs Benefit the Rest of the Internet Ecosystem


Bill continues: "To understand the impact ACIs are having on the evolving Internet one only has to look at a recent study by Arbor Networks [ARBOR]. Arbor Networks – in collaboration with the University of Michigan and Merit Network – presented the largest study of global Internet traffic since the start of the commercial Internet at a recent Internet engineering conference in October 2009." http://www.nanog.org/meetings/nanog47/abstracts.php?pt=MTQ1MyZuYW5vZzQ3&nm=nanog47

“The study included analysis of Internet traffic across 110 large and geographically diverse cable
operators, international end-to end carrier backbones, regional networks and content providers.
Results from the study show that over the last five years, Internet traffic has migrated away from
the traditional Internet core of ten to twelve backbone providers and now flows directly between
large content providers and last mile networks (and then to their end-users). As the study notes
these networks “have become so popular they are changing the shape of the Internet itself such
that many organizations and ISPs are directly peering with content providers rather than
purchasing transit from [backbone] ISPs”.

“As a result the costs of Internet transit have dropped dramatically for all the parties in the Internet
eco-system. Major media and content now often deliver their content directly to third party ACIs at
major IXs where most traffic is exchanged on a settlement free basis. Regional ISPs and last mile
providers have then been able to reduce the amount of Internet transit they purchase from
backbone Internet providers by directly connecting to an ACI at an IX. Backbone ISPs have also
benefited in that their costs are also significantly reduced as they can also access the same traffic
delivered by ACIs rather than carrying it on their backbones.”

“ACIs have not only reduced Internet transit costs but they have also introduced a greater degree
of competition. A content and application provider now has many different ways of delivering their
product to end users. Organizations with a few very small applications and with a limited market
can still choose to host their own servers and purchase Internet transit from a regional or backbone
ISP. Alternatively they can decide to deploy multiple servers at different hosting data centers or
move their content or application to any number of third party ACI companies. As we have seen
there are a plethora of such companies with a wide range of services and options."

"ACIs have also significantly reduced the cost of Internet infrastructure by adopting new network architectures that simplify and reduce the complexity of the traditional backbone ISP facilities.
Again this benefits not only the ACI but reduces cost of Internet delivery to all members of the
Internet eco-system. Because ACIs provide load balancing and redundancy through the wide
distribution of servers at various nodes around the world they do not need complex and highly
redundant network facilities between these nodes.” [snip]

ACI,  Network Neutrality Challenges, and Last Mile Networks

“The growing dominant role of ACIs in the Internet infrastructure may pose future challenges for
regulators to insure that ACIs are able to peer and interconnect to last mile Internet service
providers and that their services are accessible and unconstrained to the ultimate consumer.”

As more and more Internet Exchanges are deployed closer to consumers, the future Internet
regulatory battle ground will be the last mile.

The inadequate investment in last mile infrastructure, particularly in North America, was one of the
principal drivers to deploy ACI facilities so that content and application companies could ensure a
certain degree of responsiveness and interactivity with users. As ACI facilities are increasingly
bypassing the backbone ISPs, despite the drop in Internet transit costs, the last redoubt
for the cable and telephone companies will be ownership and control of the last mile
infrastructure. But as discussed earlier it is in the last mile where most Internet congestion
occurs. Despite the huge investment made by ACIs in building out their infrastructure as close as
possible to the end consumer, it may not be sufficient to overcome the limitations to today’s last
mile infrastructure particularly with the next generation of high bandwidth applications and as the
carriers increasingly rely on traffic management techniques to address the challenges of
congestion.

Historically, building a last mile infrastructure that was part of a national or global end-to-end network was very challenging and you needed large companies to maintain and build this infrastructure. But with the development of ACIs and the disintermediation of the end-to-end network, the last mile is a stand-alone element which does not require the complexity of previous end-to-end facilities.

Concepts like "Customer Owned Networks" are thereby easier to imagine and may represent the future direction of how we deploy and manage last mile infrastructure.

Editor's Note: Christopher Mitchell, who is the Director of the Telecommunications as Commons Initiative at the Institute for Local Self-Reliance, in early May released a comprehensive report on publicly owned broadband networks in the US. He writes: "It covers the philosophy of such networks, the
barriers to building them, and a number of success stories. It addresses Institutional Networks, a
little bit on wireless for mobility, and a number of different approaches to building FTTH networks.
It also offers some lessons learned by the networks that have been operating for a number of
years.

But what might be more interesting for some members on the list is that I look at open access
networks specifically (focusing on Jackson, TN and UTOPIA) and some of the little incentive problems between the owner and service providers that have produced friction and hurt the
feasibility of open access.

The report is available as a free download here:

http://www.muninetworks.org/reports/breaking-broadband-monopoly “

Finally Bill continues: The concept of RPON [RPON] was a further articulation of this idea to allow
consumers to control and manage their peering and multi-home with multiple ACIs at a nearby
Internet Exchange point.

The recent announcement by Google to fund a number of last mile fiber to the home pilot projects
may be the necessary catalyst to enable these types of innovative last mile solutions.

Some State Level Regulatory Implications

On Arch-econ, Brian Harris, who works in the New Mexico Attorney General's Office, wrote:
Reading Bill St. Arnaud’s article on Applications and Content Infrastructure (“ACI”) and the evolving
internet, I started to wonder how these facts about data center and IX deployment, investment
dollar flow and a putative stand-alone last mile network could translate into achievable advocacy
goals of increased investment (meaning tender loving care) for the last mile. After all, the last mile
is a particular concern for states trying to keep their citizens in the game. I ask because obtaining
regulatory fiats is of quite limited utility; what I think really needs to happen is that the owners of
this last mile infrastructure come to realize that their business interests are best served by
disentangling “service” and infrastructure. I think Bill’s paper does an important job in describing
the detail of how the “smart edge/dumb pipe” network looks and will continue to look for some
time. I am trying to gear it down to the point where both public and private decision makers say
“oh course! We need a LoopCo. Summon the M&A drones.”

“It also occurred to me that this was a more detailed and data driven description of what Internet
advocates have been referring to as the “smart edge” for years. Or what could almost be
described as a smart edge, if it wasn’t for that dang dilapidated, outdated and sometimes rotting in
the ground local infrastructure.”

“Gordon asked for an elevator pitch of the article. My elevator pitch would be something along the
lines of:

1. The long haul networks have evolved as a truly competitive market, and large customers who
need to ship lots of bits far distances have a variety of cost-effective choices. For this network,
connectivity is truly a commodity.

2. The local distribution networks have lagged far behind however in terms of true, market
disciplining competition. There, the owners of these local networks attempt to differentiate their
commodity by packaging various connection speeds (using various transmission media), to offer
“services.” Hence the “triple play” option that many local infrastructure owners offer.

3. Businesses that rely on plentiful contact with -- well, call them what you will (end users, consumers, citizens, ratepayers) -- have leveraged the commodity pricing of long haul to diminish the impact of the local bottleneck of poorly maintained local infrastructure. This allows them to
conduct business with a high degree of confidence that connectivity to their information
(applications and/or content) will not be unduly hampered.”

4. As a card-carrying member of the (broken?) regulatorium, I paid special attention to this statement:

“The growing dominant role of ACIs in the Internet infrastructure may pose future challenges for
regulators to insure that ACIs are able to peer and interconnect to last mile Internet service
providers and that their services are accessible and unconstrained to the ultimate consumer.”

Conclusion
In his abstract Bill St Arnaud sums things up: “This evolving Internet is likely to have
profound impacts on Internet architectures and business models in the both the academic and
commercial worlds. Increasingly traffic will be “local” with connectivity to the nearest
cloud, content distribution network and or social network gateway at a local Internet
Exchange (IX) point. Network topology and architecture will increasingly be driven by
the needs of the applications and content rather than as general purpose infrastructure
connecting users and devices.”

I would ask: what is it that the FCC thinks it is dealing with? How can policy be argued if no one
knows what the infrastructure is, where it goes or who owns it because it is all under rigid NDA?

Bill's paper documents what has been happening better than I have seen it done before. “Internet
lines” are transport, Erik Cecil said.

COOK Report: Were it clear that the only thing the incumbents have is the last mile because
everyone else has built their own paths to get to the last mile, would this make any difference? I
think it would. Remember ATT whines that it would like to get rid of the last mile. But because
they are able to hide their infrastructure in secrecy, they can lie all day long and get away with it.
Can this be changed? They whine that they are being deprived of incentives to invest in their
networks but they are allowed to escape any direct examination of what those networks actually
are. How long will this be tolerated?

May 7 Robert Atkinson: “I’ve not been able to catch up with the threads on the FCC’s plans to
reclassify internet access since I’m traveling out of the US where internet access is difficult. But I
finally got a decent connection and found an interesting “Wall Street” reaction (from Raymond
James), the gist of which is not a big deal: “...we do not believe any real impact on these
carriers will emerge for years, if at all....” Since the cablecos and telcos will be (are?) arguing
that the prospect of regulation will inhibit investment, I thought this investor reaction was
revealing.”

COOK Report: Agreed, Bob. Again I'd like to ask: investment in what? Backbones? No. Middle mile networks? NO. Fiber to the home? NO! Verizon quit that. ATT and Qwest never started. Internet lines? Those are smoke and mirrors. The only internet line is the last "x" hundred
meters from the content that has been shoved as close to the edge as possible.

The investment is largely in lawyers and lobbying and maybe in wireless.

Postscript
Canarie and Bill St Arnaud, beginning at least 15 years ago, built one of the globe's leading research networks. Canarie started as a national backbone linking Canada’s leading research networks. However, around the time of the dot-com bust in 2001, the Canadian provinces began to build their own research networks.

Suddenly, in January 2010, Bill St Arnaud departed from Canarie after having pretty much pioneered green IT between 2007 and 2009. As of the summer of 2010, Canarie appears to me to be rudderless, doing not much more than providing light paths and web services for Canadian universities, as it has been doing for at least the last five years.

While I have been promoting his architecture paper http://billstarnaud.blogspot.com/2010/02/personal-perspective-on-evolving.html for quite a while, on May 13 I received this from Bill:

“As I pointed out several months ago in my paper on the Future of R&E networks, traffic patterns are rapidly changing on all Internet networks including R&E networks. Increasingly traffic is becoming more local as these networks start to directly peer with large content, application and cloud providers at major IXs. A good example of this change in traffic pattern is the recent press release issued by the R&E network in the province of Ontario in Canada. They have seen traffic volumes triple over the last 3 years. The overwhelming volume of the traffic is with over 40 content, cloud and application providers such as Google, Limelight etc. Only about 16% of the traffic goes to the national Internet backbone operated by CANARIE.”

COOK Report: From the ORION press release I surmise that the application content infrastructure Bill writes about in his large Architecture Infrastructure paper has had no trouble using IXs in Ottawa and Toronto to link ORION (the Ontario provincial research network) to the rest of the world. Ontario traffic has tripled, but only 16% of it goes via the Canarie backbone elsewhere in Canada.

Can it be that the ascendancy of application content infrastructure providers, as described by Bill in this summary, is making national research network backbones obsolescent? If so, what does this portend for Internet2 and especially for National Lambda Rail? We have the RONs (regional optical networks), which may be analogous to ORION.

What we don't have is any mapping effort worthy of the name. It's hard to imagine that there aren't similarities between ORION and the RONs. Wouldn't it behoove someone to do some studies on traffic from the ACI infrastructure to the RONs and get people thinking about how little Qwest, Verizon and AT&T provide to their customers?

Formulating policy in the dark is a BAD idea.


Putting Duct Tape on Telecom Regulation:
The Obama FCC Speaks its Mind
The COOK Report has read Yves Smith's ECONned: How Unenlightened Self Interest Undermined Democracy and Corrupted Capitalism [http://www.amazon.com/ECONned-Unenlightened-Undermined-Democracy-Capitalism/dp/0230620515]. It is a discouraging book. She presents a persuasive argument that Wall Street has captured Washington and is free to continue what she calls “Looting 2.0”.

Smith writes: "Why is there such reluctance to undertake fundamental reform given the damage
to the global economy wrought by the financiers? Let's review four theories some of which we have
touched on earlier:

Cognitive regulatory capture, meaning that the regulators have adopted the industry world
view, which makes them reluctant to act.

Extortion meaning the financial services industry controls infrastructure that is essential to
capitalism and cannot be displaced except at very high cost.

State capture, meaning the financial services industry now has the status of oligarchs in Third
World countries, having used its economic clout to buy so much political influence that they largely
dictate policy regarding its interests.

Paradigm breakdown, meaning the elements of the current system are no longer viable, but that
is a possibility that no one is prepared to face since the old system seemed to work well for a
protracted period. Thus the authorities reflexively put duct tape on the machinery rather than
hazard a teardown.

All these factors play a role in the hesitance to impose reforms, but the most intractable and least
recognized is the last, the difficulty in seeing that the failings of the current system are deeply
rooted and not amenable to simple remedies. Any resolution of the major problems facing the
financial system require a good deal of time, care and persistent effort and would simultaneously
be highly politicized. It makes it very likely that financial services industry will derail or blunt
reform efforts. That in turn means that the current paradigm will be patched up and restored to
service only to fail again. This pattern will replay until the breakdown is beyond repair." (Page
283 ).

COOK Report: Sadly this reads to me like a lot of our telecom discussion. Certainly regulatory
capture. An inability to do anything except put the duct tape she talks about on the system.

And it is not just telecom. It is the energy industry in spades, as we have just seen with BP and the Gulf of Mexico. No one entity is held responsible, and therefore the losses are socialized.


In some ways Smith’s book, among the many on the collapse that I have read, is among the least filled with narrative facts, but it does educate in very troubling ways about how the system works. She shows how some of the CDO-squared hedge fund “stuff” was leveraged at 500 to one by the beginning of 2008. The real-estate boom had essentially ended two years earlier. But the shadow banking system was like a casino that poured borrowed money into players' accounts as long as they could keep the roulette wheels spinning, which of course they did until finally the entire game blew up. The CDOs represented no new economic activity. They were just repackagings of older credit instruments, sold because there were naive customers who pleaded to keep on buying collections of gambles on loan assets that had no hope of ever being repaid. Since those who ran the casino were paid huge bonuses for keeping it running regardless of the outcome, they simply continued the game until it took down the two weakest operators and the rest went running to Washington for help.

How Does this Translate 18 months after Obama?

Telecom exhibits the same regulatory capture. Consequently the National Broadband Plan mountain roared and gave birth to a mouse, with the FCC announcing a series of rulemakings that will keep the incumbents in charge of their old asset structures.

Recall how The Power of Pull, in last month’s issue (June 2010), emphasized that the changes layered by ICT on top of the industrial infrastructure of the 20th century had rendered the old model unsustainable. Return on assets is down 75 percent over the past 45 years and headed toward zero.

The question becomes: what will humans use to organize their economic activity? The corporation or firm? The book's authors come from the corporate world, so it is not surprising that it is this world to which they speak with power and in a way that makes sense. The Internet is enabling a parallel world of open commons, a world in which Michel Bauwens has made himself the librarian. Very likely next month I will publish an interview that I did with Michel on the way through Bangkok in March. This world is one as yet little seen, but it holds the potential for becoming a promising form of alternative social and economic organization.

I speak in these terms because the FCC is caught in paradigm breakdown. Technology has rendered the old ways unworkable. Policy cannot keep up. Therefore we apply duct tape to the old broken system. Twenty years ago we had good information on who was doing what, and where. After deregulation we have almost none. The result is that it is left to long-term observers like Bill St Arnaud to put together by inference a new picture of Internet architecture.

We have seen the arguments that said the FCC should acknowledge its 2004 mistake and reclassify Internet access under Title II of the Telecom Act. On May 6th, the FCC applied a heavy dose of duct tape and announced reclassification under Title II “lite”.

On May 6 Fred Goldstein commented: I just read through the Genachowski statement and the
more telling Counsel statement. Basically, it boils down to this:


The "Internet" component of a service will not be regulated based on Title I ancillary jurisdiction,
but the transmission component is regulated under Title II only to the extent that it provides the
nexus to apply Title I ancillary jurisdiction to the Internet component carried atop it. No other
rights are conferred. But now USF can apply (Section 254), both to tax ISP's and to give subsidies
to rural ILECs for providing data services on a non-common-carrier basis. Verizon and Comcast
can pay; CenturyTel and Sandwich Isles can collect.

Most explicitly, Counsel states that the Computer II rules are *not* being reinstated. ISP's thus do
NOT gain access to telco wires. So you don't gain a choice of ISP's. It's still a choice of one ISP
per wire. The telcos keep the wires for themselves, and while they must provide
"nondiscriminatory" service at "just and reasonable" rates (i.e., sections 201-202), these
protections only apply to the transmission services that they ONLY provide TO THEMSELVES. So
they can still screw everyone else, but not themselves. I'm sure we can all feel good about that.

And the FCC can't regulate Internet content, but the bit-level telecommunications component (the
one transmitted without regard for content) can probably be regulated so as to ensure that it looks
like what somebody thinks the Internet should look like. But it's not the Internet; it's the
transmission of bits to and from the Internet that must only be provided in a way that the Internet
itself is managed the way the FCC wants it to.

Professor Irwin Corey couldn't have done a better job. This is supposed to withstand legal scrutiny,
because it is so confusing that even a Supreme Court justice is supposed to be snowed.

Also on May 6th Chris Savage: Let me try to state it without irony or mocking. Tell me if I got it
wrong.

1. The FCC wants to impose the 4 or 5 principles that it used to sanction Comcast.

2. It wants to do that under Title 1, not Title 2.

3. The DC Circuit says that the exercise of Title 1 authority must be ancillary to an exercise of an
express authority, in this case Title 2.

4. The FCC, therefore, will assert Title 2 authority over the transmission of Internet
communications (like the logic of the Portland case, to a first approximation).

5. Other than certain minimal regulatory things like "be reasonable" and "don't discriminate", this
will not amount to anything.

6. It will, however, suffice to provide express regulation under Title 2 extensive enough to impose
the Internet "principles" under Title 1.

If that's basically right (I'm on bb and haven't read the statements) it's a bit of a legal kludge but
seems to me to be a nice bit of politico-legal ju-jitsu. Suppose they do this and someone appeals.
The court will affirm it or throw it out. If the latter it will either be because the partial/constricted
Title 2 analysis is too weird, or because the exercise of Title 2 power isn't robust enough to support
the Title 1 exercise.


The obvious consequence of a successful appeal is EVEN MORE regulation. It comes down to
timing. No matter what happens in November there will be a Democratic majority on the
Commission at least until January 2013. If they move fast they can do this in time to get it up to
the DC Circuit and back to take the next step (assuming reversal) before January 2013, even if the
Democrats lose the White House in November 2012.

So opponents of this have to delay it as long as they can, then appeal, then get a Republican into
the White House in 2012.

Should be interesting.

The Unbearable Lightness of Being or Title II ‘Lite’

Editor’s Note: The meaning of “lite” is not easily grasped. As proof we offer here a discussion
that led off from my effort to get the list’s reaction to Bill St Arnaud’s architecture discussion that
pointed out how content and cloud applications have been driven closer to the last mile edge of the
network.

May 13 Fred Goldstein: The discussion of Application Content Infrastructures (ACIs) and the
importance of content caches strikes me as an instance where the "network neutrality" crowd is
running in entirely the wrong direction. If packets are packets and the network is supposed to act
dumb, then who gets to run a cache, where?

This is not a real problem if the content owner decides to hire Akamai, Limelight or another CDN
and points its DNS at them. In such a case, the cache is the named content. This works pretty
well for large ISPs, such as the big telephone and cable companies, since they have the volume to
be at a point where a CDN node is located.
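
Editor's note: as an illustration -- a minimal sketch in Python, using a hypothetical hostname rather than any particular content owner -- here is what "points its DNS at them" looks like from the resolver's side. The content owner's name resolves through a CNAME alias into the CDN's namespace, so the name the user asks for already is the cache, with no packet-layer tricks required.

import socket

# Minimal illustrative sketch: resolve a (hypothetical) CDN-fronted hostname and
# show the alias chain that a CNAME delegation to a CDN produces.
def show_cdn_delegation(hostname):
    canonical, aliases, addresses = socket.gethostbyname_ex(hostname)
    print("requested :", hostname)
    print("aliases   :", aliases)      # the CNAME chain, if any
    print("canonical :", canonical)    # often a name inside the CDN's own domain
    print("addresses :", addresses)    # the edge node(s) chosen for this resolver

show_cdn_delegation("www.example.com")  # hypothetical; substitute any CDN-fronted site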

But if you apply the "@Home" model in which the ISP does the caching, then the game is different.
A rural ISP might want to have a cache, but is too small to have the CDNs pay for it. So they put
up their own cache and substitute the cache contents for the actual requested file. This improves
performance and lowers the cost, and indeed is an action *not* consistent with common carriage
but *quite* consistent with the technical definition of an "information service". This becomes a
competitive decision. Do the customers of this ISP get the Very Latest page, or do they get a
merely-recent copy, faster and cheaper? For a rural provider, it can make a big difference.

But a common carrier doesn't do this. I think back to a mortician, Almon Strowger, who was
concerned that his competitor's wife was a telephone operator. To prevent her from diverting his
traffic, he invented the dial telephone and the automatic switch. True, the phone company was
legally even then a common carrier, and was supposed to deliver calls as requested, but a little
paranoia led to an important invention.

In theory, an ISP-owned cache is like the telephone operator, whose motives and behavior were
naturally suspect even if wrongdoing is not demonstrated. This is not a reason to abolish caches,
but shows a distinction between carrier and content-centric business models.


Savage: On a high conceptual level, one of the best arguments I can think of against classifying ISPs (including here both ILECs and cable) as carriers is that the relevant statutes speak not in terms of what the firms DO, but in terms of what they OFFER their customers.

To a first approximation, what Verizon "offers" me is "going and getting me a copy of the files/web
pages/email messages that I ask for." In effect that's what I pay them to do for me, and any
failure of the requested files to show up on my screen I attribute, in the first instance, to Verizon.

Now you and I and everyone on this list knows that what Verizon (or whomever) actually DOES
that leads to the relevant stuff appearing on my screen is complicated indeed, and -- except
perhaps for the caching model -- doesn't quite fit the "go get me the files" model.

But it does fit what a generic consumer probably thinks they are BUYING.

Under US law, that's an information service, not a telecom service, for sure.

Vint Cerf: 1. An ISP can certainly run a competitive caching service like Akamai, etc.
2. They can charge the supplier of the content or they could charge the user.
3. This is an application that rides on top of the Internet service.
4. They do not need to monkey with the Internet packet layer to implement this service. They would presumably use the same kind of hack that Akamai does - the lookup of the content resolving to the caching server.

Savage: Vint, You are of course technically correct.

Note, though, that I am not describing a “reality-based” argument (that is, one that is based on
what an ISP actually does, as an engineering matter). I am describing an entirely law-based
argument (and therefore, probably an anti-engineering argument). Under the definitions in the
Communications Act, “information service” is defined as “offering of a capability for” creating,
storing, retrieving, manipulating, etc., information. At the same time, a “telecommunications
service” is the “offering of telecommunications” to the public, etc.

From this perspective, it is not insane to say that what my ISP (Verizon) “offers” me is the service
of “retrieving a copy of” the files located at whatever URL I type in. From this perspective I literally
do not care if they have it locally cached, if they go one router hop to some CDN, or if they get it
by sending my request packet out into the void in the best-efforts hope that the correct file comes
back. Also from this perspective, all the best-efforts stuff becomes irrelevant to the classification
question. (I think it’s probably irrelevant anyway, but that’s another story.) If it sometimes takes
Verizon a while to deliver to me the files I’ve asked for, or if sometimes the file doesn’t come back
at all, that may be a commentary on how well Verizon is delivering on its offer. But that doesn’t
change what they are offering.

I think it’s time to go back and re-read Brand X…

Goldstein: Vint, In the "@Home" model, which lives on in some ISPs, the ISP does monkey with
the packet layer. You try to contact www.snorglefratzz.com and start an http connection to its IP address. Your ISP intercepts the IP, which it recognizes, and redirects it to its own local cache. You get a cached copy from the ISP's server, which carries whatever it decided to cache at some point in the past. Your packet never gets to the actual IP address you first requested.

Brett, for instance, runs a cache like this. He even caches pages that are marked do-not-cache.
He does this because upstream capacity is really expensive in Laramie, thanks to monopoly Special
Access rates. So it reduces his cost.

Cerf: The alternative is to do this at the DNS level (which I would prefer). That is the way Akamai
and Google do it.

Goldstein: Of course. But that is easiest when the cached content supplier wants it done. I don't
know if they actually do it via a buggered local DNS, or via IP intercepts (does anyone here
know?). The @Home model is how the ISP saves on redundant backbone hits, without the consent
of the content provider. It's intentionally non-neutral, but very useful in rural, high-cost areas. It
makes little sense for the urban cable systems where it was originally used.
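
Editor's note: the two approaches being contrasted can be sketched in a few lines. The fragment below is illustrative Python only, not any vendor's implementation; the ignore_no_cache flag stands in for the Laramie-style configuration Fred describes, in which even pages marked do-not-cache are served from the local copy to save expensive upstream capacity. A DNS-level CDN, by contrast, never touches the packet at all; the redirection happens before the connection is opened.

import time

class InterceptCache:
    """Illustrative "@Home"-style intercepting cache: the ISP answers from a local
    copy when it can, and goes to the real origin only when it must."""

    def __init__(self, max_age_seconds=3600, ignore_no_cache=False):
        self.store = {}                         # url -> (fetched_at, headers, body)
        self.max_age = max_age_seconds
        self.ignore_no_cache = ignore_no_cache  # True = serve even "no-cache" pages

    def fetch(self, url, origin_fetch):
        entry = self.store.get(url)
        if entry:
            fetched_at, headers, body = entry
            fresh = (time.time() - fetched_at) < self.max_age
            no_cache = "no-cache" in headers.get("Cache-Control", "")
            if fresh and (self.ignore_no_cache or not no_cache):
                return body                     # merely-recent copy, faster and cheaper
        headers, body = origin_fetch(url)       # otherwise go to the requested origin
        self.store[url] = (time.time(), headers, body)
        return body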

Savage: OK, so I just went back and re-read Brand X.

The neurons that were telling me that the case turned, on some level, on what was being “offered” were right. But it was weird. They said that all the stuff done by the ISP’s routers, caches, etc.
was an information service, integrated with the transport. Scalia et al. said no, the transport was
separate. Nobody seemed to argue that what the routers etc. did was part of transmission. Yet if
you want a clean path for your packets from your computer, through the ISP, out to “the world” as
indicated by the IP address resulting from the DNS lookup, then that stuff is “transmission” too.

Hmm. I’m trying to figure out if the FCC’s proposed “Third Way” makes any sense. I’m getting
less sure by the minute.

Dan Sershen [telco strategist]: Sorry for what I am certain is an ignorant question on my part. I am genuine in my seeking to understand, but I am confused as to exactly what problem the FCC is trying to solve, and specifically how the proposed 1.5 solution addresses the problem. I have seen so many posts and articles pointing to a myriad of issues that I am uncertain as to which ones the proposed solution really addresses.

Goldstein: Join the club. It strikes me as intentionally obfuscatory.

Sershen: It seems that the view is that the broadband transport market is not acting in a way that supports the FCC's public policy goals. The proposal seeks to regulate the market players to ensure that they behave as the government wishes. My assumption is that it's not primarily about monopoly or oligopoly power, but about private companies adhering to public policy goals.

Goldstein: You're correct on that last sentence. On the first one, it's about a view that the broadband access-ISP market is threatening not to act in a way that supports their goals.
They want to preserve the vertically-integrated ISP policy, so you can't be an ISP without owning
the wire, which pretty much limits it to the cable and phone companies. With a monopoly or
duopoly, an ISP could thus easily get abusive. So there is monopoly power, but they're explicitly and intentionally not using their clear power to change it. Instead, they're literally taking steps to
tighten up the monopoly, discourage further attempts at ISP competition, and then moving the
regulation up the stack, where their authority is merely, at best, ancillary.

Sershen: As I understand the proposal, by splitting out the transport players from the definition of
information service, the FCC can regulate them separately to ensure they behave according to the
FCC's wishes. At the same time, the proposal attempts to provide a "lighter touch" regarding the
regulatory burden imposed by the full weight of common carrier rules through application of only a
subset of the telecommunications regulation.

This seems reasonable but do I have this close to right?

Goldstein: No. They're only splitting out the transport layer in a very technical way, with little real
impact -- "apply and forbear". It will impact tax under Section 254 (it is taxable; information is
not). But it will not be offered as common carriage, or separately from the vertically-integrated
ISP service. And if you get technical, the "non-neutrality" they're trying to control only happens
above the transmission layer. So they're regulating the information service to meet their wishes,
and leaving the transmission itself unregulated.

Sershen: Secondly, do we think that the proposal will actually result in the transport industry behaving in a way that does achieve the FCC's public policy goals?

Goldstein: Is their goal full employment for lawyers? It is supposed to scare phone and cable
companies into providing support for home-based content distribution networks, and to support
high-volume video downloads. And it is supposed to scare small ISPs out of business entirely. But
it is largely a political feint, so they can say that they are doing something about "network
neutrality", which sure sounds like a great idea if your understanding of the Internet comes from
the back of a cereal box.

Sershen: Thank you. Your comments help a lot.

It seems to me that the FCC might find it actually easier to prove and declare the local transport industry an oligopoly and regulate it as a utility than it is to torture a definition separating out transport from information service.

Goldstein: Of course. That would help the public interest too, and create the right kind of
"neutrality", where people have a real choice.

But the FCC is looking at this as a pure political game. There are four sides. ILEC and cable both
have money, so the deal is meant to keep the ILECs (especially Verizon) happy, without imposing
on cable in a way that can't be overturned after the next election, or that violates their false
equivalence with ILECs (important to Verizon). The third side, Big Content, led by Google, is
pleased, since they want the access-ISPs to be forced to carry their content regardless of cost.
And the fourth side, small ISPs, has little money or lobbying clout, so they're thrown under the
bus.

Law exists to justify the political picking of winners.


Savage: Dan,

In 2005, the FCC asserted the authority to require at least facilities-based ISPs (ILECs and cable)
to adhere to some general “principles” regarding consumer rights for Internet service. These
included some stuff that vaguely looked like network neutrality:

• To encourage broadband deployment and preserve and promote the open and
interconnected nature of the public Internet, consumers are entitled to access
the lawful Internet content of their choice.
• To encourage broadband deployment and preserve and promote the open and
interconnected nature of the public Internet, consumers are entitled to run
applications and use services of their choice, subject to the needs of law
enforcement.
• To encourage broadband deployment and preserve and promote the open and
interconnected nature of the public Internet, consumers are entitled to connect
their choice of legal devices that do not harm the network.
• To encourage broadband deployment and preserve and promote the open and
interconnected nature of the public Internet, consumers are entitled to
competition among network providers, application and service providers, and
content providers.

The legal basis for these principles was a statement by the Supreme Court in the Brand X case,
that the FCC could impose some obligations on facilities-based ISPs under “Title I” of the
Communications Act (which is sort of a catch-all provision). However, when the FCC actually
applied these principles to whack Comcast in the BitTorrent situation, Comcast took the FCC to the
DC Circuit court of appeals, which ruled that whatever Title I might allow, the FCC had not done a
good enough job connecting its action against Comcast to a specific part of the FCC’s express
authority under Title II of the Act (relating to common carriers).

What the FCC claims to be trying to do is to come up with a new, improved legal rationale for what
it thought the situation was before the DC Circuit ruled, i.e., a legal theory under which it can
promulgate and enforce the principles stated above. It claims to have no greater regulatory
ambitions than that. That is, whatever issues there might be about competitive conditions in the
broadband transport market, the FCC is not trying to deal with them now.

Its trial-balloon legal theory is more or less as follows: (1) Reclassify some portion of what
facilities-based ISPs do as “telecommunications service” subject to the FCC’s direct regulation
under Title II. (2) Use that direct Title II authority, plus any necessary Title I “ancillary” authority
attached to it, to impose the principles noted above (and, in a related proceeding, some more
explicitly network neutrality obligations). (3) “Forbear” from applying any other regulatory
obligations to the facilities-based ISPs.

So I think that the FCC’s public policy goals here are sort of orthogonal to the ones you are
identifying.

That said, one of industry’s concerns here is that even though the FCC now says that it will forbear from any regulation other than reinstating the principles and some gentle form of network
neutrality, as time goes on the temptation to impose additional regulation on the industry will
become irresistible. Therefore even if industry could live with what the FCC is proposing to do now,
as a defensive measure they may need to oppose it, because of what the FCC could do, under this
same rationale, later.

I recognize that this kind of gamesmanship and negotiation seems insane from the perspective of
“just establishing good public policy.” As Harold says, though, “welcome to the sausage factory.”

Sershen: I guess I did walk into the back room of the meat store and I am getting a view into how
the sausage is being made! LOL.

I think you may have helped me determine the nature of my confusion. It's got to do with the public policy goals. I believe the stated goals may be subject to wide interpretation by many people. This is one source of the confusion. Your message has helped clarify it.

I remain confused on exactly how the proposed solution will actually achieve the public policy
goals. I suspect it will be difficult to separate transport from information service. Over time as
technology continues to develop, systems become more integrated, and traffic management
becomes more precise, these lines will become even more blurred. Even if definitions can be
agreed to, I am not sure how the regulatory body expects that it will achieve the policy goals listed
by invoking this solution.

I also understand the argument that once a regulatory body gets permission to regulate something, it's never the case that less regulation occurs over time.

While I think the intentions are good, I suspect the road takes us to a place none of us would prefer to go. I can think of a myriad of unintended consequences as a result of arbitrary definitions of telecom service and information service. While likely not the intention, the FCC will find it has picked the winners and losers in the Internet industrial organizational value chain. While pure speculation, if you take this road to its ultimate conclusion, it is possible we will be talking about "rate of return" and parsing cost models again.

Brian Harris: Some already are. But not in the sense of a return to the pre-1996 days; rather, they see RoR as one aspect of a regulatory toolkit. This new approach would focus simply on the raw physical transmission media (fiber optic cable, copper, coaxial cable and antennas), divorced from the myriad services that travel over these media. The goal of RoR in this context is to assure a network that reaches everyone, something that the free market alone will not incent.

Sershen: This is where I get confused, and admittedly it's my ignorance. How do you separate out raw transmission from the rest of the infrastructure, and how will that promote universal broadband?

Sorry for being slow on this.

Goldstein: It's really quite trivial in concept, and has been done elsewhere. A specific legislative
starting point would be this draft bill:


http://www.ionary.com/separationbillproposal.htm

You basically take the loop plant, including the wire center buildings, and put that in one company,
under rate-of-return. They lease dark copper and fiber on a wholesale basis, with no concern
about how it is lit or what is done with it. The ServiceCos who lease it are then for the most part
unregulated. This allows for serious competition both at the light-the-wire layer (ILECs become
like CLECs) and at the information-service layers (ISPs get access too, typically via LECs). And
the same fiber build can be sold to cable as well as telco, so one broadband build can cover an area
that is not profitable for cable or telco alone.

The problem with the hair-splitting over what is "information service" and what is
"telecommunications service" is that the definitions appear to have been written as descriptive, not
prescriptive. That is, they identified who was the ISP and who was the telco in 1996. The "layers"
were implicit, since information service ran atop telecommunications. Trying to build things to
meet the letter of the definitions is backwards, though it fits some American legal traditions.

Sershen: Is this the approach the FCC is taking? It could be, but it's not well detailed in the FCC's description.

Goldstein: No. The direction I'm taking is two quadrants in the opposite direction. (I don't like
degrees; 360 is an arbitrary divisor of a circle. 90 degrees is a quadrant.) The Schlick plan could
well have been written by Verizon.

Sershen: Didn't we try this in 1996? I am so rusty on this that my memory may be way off. I
thought we took this approach with 270, 271 etc and lost the case in the 5th circuit based on
unreasonable search and seizure. I am way out of my area of expertise here.

Goldstein: No. 271 was trying to achieve some of the benefits of this without actually doing it.
The idea of UNEs is to make some elements available, as would happen with Separation, but
ownership would remain with the incumbent. But only for existing stuff, no new construction
(though you could convert things to UNE after they'd paid retail for a while). Then the self-
contradictory "necessary and impair" clause allowed them to withdraw UNEs on grounds that there
was some competition anyway, as interpreted by the FCC and/or a Court of Appeals. So while it
remains on the books, and is explicitly unchanged by the new policy, it has only limited
applicability.

Separation takes out the conflict. The ILEC must buy from the LoopCo, who must sell to others on
the same terms. The proposed LoopCo rate isn't generally TELRIC, though it could vary in either
direction.

Sershen: If we are headed in this direction anyway (and I am not advocating this, but...), perhaps it would be more honest and easier to simply nationalize the transmission plant and run it as a government entity. This way the government can distribute broadband as it sees fit instead of relying on private companies. This seems reasonable and directly addresses the public policy issues.


Goldstein: It does, and it would work -- Australia, hardly known as a bastion of socialism, is doing
just that. However, the US prefers private ownership. So it involves traditional utility regulation.
A utility is, after all, a public function performed by a private company on terms that allow the
externalities of value to be retained by the customer, in exchange for which the utility gets a fair
rate of return. In the current model, former utilities seek opportunities to chase after and retain
those externalities.

Sershen: Attempting to redistribute assets and resources and value through tortured regulation,
universal service funds, and taxation seems more difficult and expensive than simply confiscating
the asset and redistributing it as wished.

Goldstein: Ding! Give that man a cigar. (Preferably the chocolate variety; it's safer.) It's even
worse than you suggest, due to the implicit cross-subsidies maintained in the access charge
mechanism, call classification, and other arcana that make the telco business model so complex.

Sershen: I understand the concept of private property may get in the way, but we just took over a car company, replaced the board and the executive team, redistributed value from the bond holders to the employees, and invested billions of taxpayer dollars, all legally.

Goldstein: Yes, though to be sure the car companies were bankrupt, wiping out their common
equity, and the government, as a major debt-holder, inherited them in exchange for debt. The
LoopCo idea has no takings, though. It simply splits the ownership into two stocks rather than
one. Those who value stability ("widows and orphans" investors) might prefer the LoopCo, while
those who like risk might prefer the ServiceCo. Vertical integration, as done today, creates a sub-
optimal investment profile.

Sershen: If we want to adhere to the concepts of property rights, perhaps another method of
achieving public policy goals would be to give the consumers money or a credit to purchase
broadband. That way carriers would be incented to build out to these people and the consumers
would still be assured of affordable broadband. This could be a very cost effective way to achieve
some of the policy goals.

Goldstein: The "stamps" idea has been suggested as a better way to achieve universal service.
The subsidies would be fully portable. However, it only works when there is a choice, and subsidy-
sucking areas are too expensive to warrant multiple outside plant providers. So the subsidy ends
up in the same monopoly.

Sershen: Am I missing something? Sorry for any naiveté on this. I am way out of my area on the
legal aspects on this and several years behind on the thoughts on public policy.

Goldstein: Separation is an idea widely discussed outside of the US. In the UK, BT was
functionally separated (one stock, two operating entities under separate management) into a
service company (retail) and a LoopCo (OpenReach). I think it has happened elsewhere.
Computer II, in 1980, forced AT&T to functionally separate, though divestiture coincidentally came
along and turned it into structural separation, but that was of terminal equipment.

Tim Cowen: Couple of thoughts strike me about the lack of innovation in the traditional 'separate LoopCo' model. Given that competition drives innovation, even having limited competition from
cable is better than nothing at that level. Exposing loopco to as much pressure from direct
competition, and as much pressure to meet downstream players' needs, ought to be a major goal
of regulatory policy.

If loopco is set up as a basic transmission utility, then as Fred outlines, the customers gain the
benefit of any externalities/value. In loopco there is little incentive to innovate, and every incentive
to meet generic or averaged needs and maximise return on an undifferentiated basis. This isn't
quite the same as investing as little as possible. The issue is lack of any focus on changing
customer needs and no capability in seeking to meet all the different things that customers want in
all the different places and at different prices.

It may be that the assumption of utility makes averaged offerings the inevitable outcome. I also
wonder whether separate entities can drive innovation through contract since loopco has no
incentive to take on any liability for execution and implementation, and even if it does, breach of
contract sounds in damages, paid for out of regulated returns. Functional separation can avoid this
where the culture in a single firm can be managed to promote innovation through a focus on
customer needs.

Much will also depend on the mechanism by which loopco is compensated. Typically loopco is able
to make a "fair" return on capital, Telric, TSLRIC or whatever base is taken. The basic concern of
monopoly is monopoly profit, so that is addressed, but, the incentive to innovate depends on
loopco being driven by customer demands and in any financial model that I have seen, that isn't
dealt with.

In basic utilities such as transporting water, this probably doesn't matter (for general welfare) all that much. However, when you consider that in telecoms the loopco is running a critical component of the world's digital infrastructure, the risk to society is that this last (or middle) mile defines the speed of the system. It's almost like building a worldwide circuit board and intentionally sabotaging the system by using an out-of-date component.

This has become extraordinarily important with the growth of distributed computing platforms. According to Neelie Kroes, 50% of European productivity growth is driven by the ICT sector. This looked high to me, unless it is thought that ICT allows changes to business processes and increases industrial agility and new ways of working, as well as delivering direct cost reductions and efficiency benefits from newer, faster and cheaper technology.

One example I was involved in was a deployment of ICT that changed the speed to market of short-run car production, allowing FIAT to take 6 months out of the traditional 18-month timescale for introducing a new model. Another, in the fashion industry, is Zara, now a pan-EU fashion house that specialises in short runs of clothes, made in the EU, built on an ICT platform that allows the firm to identify trends and changes in demand and meet them very swiftly, giving Zara a competitive advantage over cheaper production in places such as India.

In many different ways these downstream customer-driven systems depend on local access capability. However, upgrading the local access plant to respond to differentiated downstream demands can be determined by different rather than averaged customer requirements. The needs of people watching YouTube (consumer needs) and of stock market trading between algorithmic computers (or of enterprises linked on an enterprise cloud computing platform) are clearly very different.

However, they all depend on shared use of the local access facilities, which should, if possible, be driven by innovation toward the highest levels of quality of service, etc., in the wider public interest. The trouble here is that the regulator can set up incentives, but the incumbent has the power of initiative (or the power not to take the initiative).

Where the upstream access player has economies of scale and scope and demands are placed on it by downstream players, it will often want to take a portfolio risk approach: low-risk, high-return investment in some places and higher-risk, higher-return investment in others. Typically, demand varies with location, so different speeds, capabilities and offerings are wanted in different places. This runs straight into universal service utility thinking, which could act as a 'floor' or baseline level offering, but often acts as an organisational goal, with nothing more.

If the local access monopoly can differentiate and innovate, that may give rise to wider economic benefits, since customer demands are so different.

The core question: How does the local access player allocate capital to meet all the different
customer demands?

Part of the answer is in having information about the market and in identifying demand.

In BT, when I was there about 3 years ago, I suggested that this would be better addressed if the downstream parts of the organisation identified demand in a 'demand forum', which would interface with a 'supply forum' and a 'design council' that would help define products, before going to a capital allocation committee. BT Design and Operate were part of an overall plan called 'Connected World' that included the downstream customer-facing divisions.

The idea was to build on one of the inherent advantages of functional separation over structural
separation. Functional separation has been criticised because the entire firm, when looked at end
to end, with a monopoly at one end, may have an incentive to discriminate in favour of
downstream businesses. To my mind, if regulation can be trusted to solve the monopoly and basic
discrimination issues, then, looked at from the customer driving the firm, the customer-facing
downstream businesses can drive innovation through the firm, and then drive innovation into the
local access layer. This does depend on the firm being exposed to competition at the downstream
level: if competition at that level is fierce enough then information is driven through the firm and it
can become more customer focused than a traditional structural separation; at least that is one
aspect of a better design than the structural separation approach.

Of course there were a lot of people involved in all this: Ben Verwaayen's genius in getting a firm-wide cultural re-alignment and relentless focus on customer needs was enormous; Andy Green contributed deep strategic insight and the development of downstream capability; many people recognised that the process for internal product development was hopelessly slow; and the development of 21CN as an open platform is another part of the story.


Whether it works all depends on execution, which has its ups and downs as we have seen, and the recession obscures the school report.

My basic contention is that as a model, functional separation is potentially an improvement on structural separation, because, amongst other elements, it helps as a vehicle to promote innovation in a way that structural separation could not achieve.

Latest results from BT are pointing to some indications that it can be managed for cash. The jury is
out on whether it can deliver on the promise of innovation and growth, but the model is a good
one.

Savage: Interesting thoughts. I would suggest the following for your consideration:

1. Under the FiberLoopCo model, it matters greatly, I think, what one envisions the "Loop" part to
be. Suppose it was really just "home-run" fiber loops -- individual fibers from every home,
business, etc. to a reasonably convenient aggregation point, with no aggregation point farther from
a premises than a fiber signal can go with no regeneration, etc. Imagine (if it does not already
exist) a mass-market, NID-like, industry standard optical termination doo-dad at the home with a
switch you move for different settings: "5 up/20 down" or "symmetrical gigE" or whatever. Under
this model LoopCo's job would be making sure the (totally passive) fiber remains intact (fixing lines
downed by fallen trees, etc.) and periodically changing out the industry-standard fiber NIDs/ONTs.
They'd run the "real estate" portion of the aggregation points, but for-profit vendors -- cable,
telephone, independent ISPs -- would collocate their own gear at those places and provide their
own services. (Imagine the cross-connect frame from hell... <g>).

Goldstein: That's pretty much what I have in mind. Of course it needn't be a cross-connect from
hell -- there is still room for innovation in the inside plant. And today's wire center is almost
certainly the wrong size; it was based on copper loop economics, pre-1980. I'd expect somewhat
smaller local distribution area huts/cabinets, and somewhat larger (than a wire center) regional
aggregation centers. But the wire centers become carrier-neutral collocation spaces.

Savage: 2. In this model the lack of competition and the lack of normal business incentives to
innovate would not be a problem because we would have defined the role of FiberLoopCo to
precisely those things that are not particularly subject to relevant innovation pressures.

Goldstein: Exactly. One way I look at it is by lifespan. If the lifespan is longer than 20 years
(poles, conduits, buildings, some passive loop plant), then it goes to LoopCo. If it's shorter than
say 10 years, it is owned by the ServiceCos. The LoopCo plant is dark fiber or dry copper, except
where impractical, and in those cases, the muxing is at as low a layer as possible.

Savage: 3. It gets much, much more complicated once you consider having a regulated monopoly
(or government-owned) FiberLoopCo perform routing, caching, etc.

Goldstein: By definition, those are not LoopCo activities. Those are higher-layer services, whether
they meet the American definition of "telecommunications" or "information service". LoopCo is in
the "network element" business.


Savage: As far as I can understand, those are the functions where you have to make the "hard"
decisions -- what capacity to install, what traffic shaping algorithms to use, which CDNs to connect
with, etc., etc. These functions are, indeed, subject to the need to innovate, modify behavior
based on changing customer demands, etc. I am not at all convinced that the monopoly-
regulated or government-owned LoopCo model would work well for these functions.

4. So, at least in the world of Savage Thought Experiments, a FiberLoopCo should provide fiber
only (ok, maybe some NIDs/ONTs), and some convenient real estate for collocation and cross-
connection, but nothing else.

Goldstein: Even the ONTs would probably belong to the ServiceCos who lease the loops. Where
things get a little funky is in shared glass, like PONs. They exist, and are hard to unbundle. WDM-
PON can unbundle at the lambda level, though it's immature and a bit pricey today. GPON doesn't
unbundle well, though it can provide some bitstream (common carriage) services if you push hard
enough. In such cases, we probably fall back on the Computer II rules. They prohibited ILECs from
owning "terminal equipment", but did make a specific "multiplexer exemption", where a mux could
be put on customer premises with the demarc on the back of the mux.

Savage: Make sense?

Goldstein: Yes. Addressing one of Tim's specific questions:

Cowen: Couple of thoughts strike me about the lack of innovation in the traditional 'separate
loopco' model. Given that competition drives innovation, even having limited competition from
cable is better than nothing at that level. Exposing loopco to as much pressure from direct
competition, and as much pressure to meet downstream players' needs, ought to be a major goal
of regulatory policy.

If loopco is set up as a basic transmission utility, then as Fred outlines, the customers gain the
benefit of any externalities/value. In loopco there is little incentive to innovate, and every incentive
to meet generic or averaged needs and maximise return on an undifferentiated basis. This isn't
quite the same as investing as little as possible. The issue is lack of any focus on changing
customer needs and no capability in seeking to meet all the different things that customers want in
all the different places and at different prices.

Goldstein: The incentive to innovate within the LoopCo's scope of work, that being outside plant,
is the same one that existed during the Computer II era. Mostly it's regulatory, since they're
regulated utilities. If the LoopCo's customers make a Bona Fide Request for something, then the
LoopCo and regulators should jointly decide if it's worthwhile, and what the price would be.

Cowen: It may be that the assumption of utility makes averaged offerings the inevitable outcome.
I also wonder whether separate entities can drive innovation through contract since loopco has no
incentive to take on any liability for execution and implementation, and even if it does, breach of
contract sounds in damages, paid for out of regulated returns. Functional separation can avoid this
where the culture in a single firm can be managed to promote innovation through a focus on
customer needs.


Goldstein: I don't see that at all. The culture of American firms, at least, isn't about focusing on
customer needs; it's about extracting as much of the externalities as possible. The competitive
ServiceCos deal with retail customers and will make requests of LoopCo and its regulators. Of
course LoopCo can innovate too, even if mostly to control its costs; as Rollie cited, Loma Linda is
an example.

Cowen: Much will also depend on the mechanism by which LoopCo is compensated. Typically
LoopCo is able to make a "fair" return on capital, Telric, TSLRIC or whatever base is taken. The
basic concern of monopoly is monopoly profit, so that is addressed, but, the incentive to innovate
depends on LoopCo being driven by customer demands and in any financial model that I have
seen, that isn't dealt with.

Goldstein: It's like the electric and gas utilities. They get a fair rate of return, but it can be
adjusted up or down to reflect performance.

Cowen: In basic utilities such as transporting water, this probably doesn't matter (for general welfare) all that much. However, when you consider that in telecoms the LoopCo is running a critical component of the world's digital infrastructure, the risk to society is that this last (or middle) mile defines the speed of the system. It's almost like building a worldwide circuit board and intentionally sabotaging the system by using an out-of-date component.

Goldstein: Glass doesn't go out of date very quickly. Even copper doesn't, though it's possibly
nearing the end (slow phase-down) of a very long lifespan. Actives do, so LoopCo doesn't get the
actives.

American phone companies are to innovation as BP is to clean water. Left to their own vertically-
integrated wherewithal, it's always 1967. Their ordering process (ASR) is still modeled on a 3270
green screen. Competition is the only reason for anyone to ask a customer what they want, and
beyond the simple duopoly, competition in the outside plant only works in a few places, for large
enterprise customers. And the duopoly does not promote innovation; it only helps ensure a supply
of basic cookie-cutter services (POTS and retail ISP) at less-outrageous prices than a pure
monopoly would have.


Executive Summary
David Zakim’s Medical Revolution pp. 1-51
As medical knowledge expanded in the 20th century, medical practice became based on a false premise -- that the physician could somehow keep in his head everything he needed to know in the treatment of his patients. This issue provides an in-depth look at the work of David Zakim, who remarkably, in his retirement ten years ago, began the daunting task of using digital technology to organize and make accessible medical knowledge. In two lengthy interviews done earlier this year we show how Dr Zakim is using a uniform patient complaint form to take patient history and compile what will become massive databases of patient histories, starting with the five most common chronic illnesses of adults in western societies. The first clinical trials of the software are beginning in the summer of 2010 in Stuttgart, Germany.

The computerized questionnaire uses a decision tree format to guide the patient and physician through medically accepted choices. Intake is done over the web, and the physician will have the basic history and complaint in front of him when the patient enters for consultation. Eventually massive databases will be gathered which over time will provide clinically valuable records of outcomes.
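
Editor's note: for readers who want the flavor of the mechanism, here is a minimal sketch in Python -- illustrative only, with invented questions, and not the CLEOS data model -- of a decision-tree intake: each answer determines which question is asked next, and the structured history accumulates for the physician as the patient works through the form.

# Illustrative decision-tree intake (invented questions, not the CLEOS data model):
# each node holds a question and maps the patient's answer to the next node
# (None ends that branch of questioning).
TREE = {
    "start":    {"question": "Do you have chest pain?",
                 "answers": {"yes": "exertion", "no": None}},
    "exertion": {"question": "Does the pain come on with exertion?",
                 "answers": {"yes": "duration", "no": None}},
    "duration": {"question": "Does it last more than a few minutes?",
                 "answers": {"yes": None, "no": None}},
}

def take_history(ask, node_id="start"):
    """Walk the tree, recording question/answer pairs for the physician to review."""
    history = []
    while node_id is not None:
        node = TREE[node_id]
        answer = ask(node["question"])             # e.g. a callback behind a web form
        history.append((node["question"], answer))
        node_id = node["answers"].get(answer)      # branch on the reply
    return history

print(take_history(lambda q: input(q + " (yes/no): ")))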

Dr Zakim is making his software open and publicly accessible, placing all intellectual property in the ownership of his institute for digital medicine to ensure that no single company can claim ownership by attempting to privatize the results.

Before turning to the interview, readers may continue with our much more detailed summary of his work on page 45. There we place what he is doing in the context of Dr Larry Weed's pioneering expert system work of a generation earlier.

Symposium discussion

Changes in Internet Architecture pp. 52-60

Bill St Arnaud posted a paper in Feb 2010 that describes the rise of application content infrastructure. During the past decade, as the incumbents buttressed their control over the last mile, providers of content and applications took advantage of decreases in bandwidth cost to build their own backbones and push their services toward the edge of the network. As the incumbents consolidated their stranglehold over the edge, the importance of the connective backbones diminished as services of all kinds built their mesh in the middle.

St Arnaud's points about the ability to push content and services from exchange points, and the need -- forced by the intransigence of the duopoly -- to move everything as close to the last mile as possible, emphasize the change in the net away from email and the web to all forms of content delivery and cloud-computing applications.


We summarize some of Van Jacobson's architectural thinking that underlies these changes, and we point out that when the providers are allowed to keep their infrastructure confidential and proprietary, sound policy making becomes impossible.

FCCʼs Title II “lite” Disgrace pp. 61-76

Finally we present a detailed tour of the FCC’s equivocation in the face of its visionless broadband
plan.

In looking at what Genachowski has done -- reclassify as Title II but then apologetically say don't worry, we will be "lite" -- we see the significance of Yves Smith's explanation of financial regulatory failure: “Paradigm breakdown, meaning the elements of the current system are no longer viable, but that is a possibility that no one is prepared to face since the old system seemed to work well for a protracted period. Thus the authorities reflexively put duct tape on the machinery rather than hazard a teardown.”

Fred Goldstein well captured the absurd convolution: “The "Internet" component of a service will
not be regulated based on Title I ancillary jurisdiction, but the transmission component is regulated
under Title II only to the extent that it provides the nexus to apply Title I ancillary jurisdiction to
the Internet component carried atop it. No other rights are conferred. But now USF can apply
(Section 254), both to tax ISP's and to give subsidies to rural ILECs for providing data services on
a non-common-carrier basis. Verizon and Comcast can pay; CenturyTel and Sandwich Isles can
collect.

Most explicitly, Counsel states that the Computer II rules are *not* being reinstated. ISP's thus do NOT gain access to telco wires. So you don't gain a choice of ISP's. It's still a choice of one ISP per wire. The telcos keep the wires for themselves, and while they must provide "nondiscriminatory" service at "just and reasonable" rates (i.e., sections 201-202), these protections only apply to the transmission services that they ONLY provide TO THEMSELVES. So they can still screw everyone else, but not themselves. I'm sure we can all feel good about that.

And the FCC can't regulate Internet content, but the bit-level telecommunications component (the one transmitted without regard for content) can probably be regulated so as to ensure that it looks like what somebody thinks the Internet should look like. But it's not the Internet; it's the transmission of bits to and from the Internet that must only be provided in a way that the Internet itself is managed the way the FCC wants it to.”

Future Issues:

August 2010: an interview with Michel Bauwens, done in Bangkok in early March, on Michel's role as the librarian and archivist for his Peer to Peer Foundation, which looks like a more benign and stable form of capitalism than what we now have. Out about June 30. See the extremely fascinating web site at http://p2pfoundation.net/The_Foundation_for_P2P_Alternatives. The interview with Sheldon Renan on netness bubbles along, and there are welcome signs that this also could be completed soon.
