I’m fascinated by the topic of artificial intelligence (AI). This is the third column in a series regarding robots in medicine. (See Robot Anesthesia and Robot Anesthesia II)

AI already influences our daily life. Smartphones verbally direct us to our destination through mazes of highways and traffic. Computers analyze our shopping habits and populate our Internet screens with advertisements for products we’ve ogled in the past. Smartphones perform voice-to-text conversions by pattern recognition of human vocal sounds. Fingerprint scanners learn and then recognize the image of our thumbprints with exacting accuracy. Amazon’s Alexa is an AI-powered personal assistant that accepts verbal commands in our homes.

What about artificial intelligence in medicine (AIM)? AIM is a bold enterprise on the horizon in clinical medicine. Hundreds of AIM scientific publications appear in medical journals each year. I’m not an AIM researcher, but I’m an expert clinician and I love to read. I’ve worked in almost every scenario of medical practice, and because my base is at Stanford University Medical Center in Silicon Valley, many of the advances of the high-tech industry are right here in my backyard. My medical board certifications are in internal medicine and anesthesiology—two fields which have significant overlap in their knowledge base but radically different practice settings. Internal medicine doctors work in clinics, where most diseases are chronic and the most valuable tools for doctors are excellent listening and diagnostic skills. Anesthesiologists work in operating rooms and intensive care units—acute care settings which demand vigilance, steady hands, and quick thinking.

Based on my experience and my reading, I foresee AIM/robots populating three clinical arenas in radically different roles. These arenas will be: 1) diagnosis of images, 2) clinics, and 3) operating rooms/intensive care units. Let’s look at each of these in turn.

  1. Diagnosis of images    This will be the first major application of AIM. We already have electrocardiogram (ECG) machines which interpret a patient’s ECG tracing with high accuracy and print out the diagnosis for the physician to read. This application debuted in the 1980s and is now the industry standard, although physician confirmation remains important for some diagnoses, such as ST-elevation myocardial infarction (STEMI). More than a few physicians have already lost the skill of reading an ECG themselves because of this device. Future applications of image analysis in medicine will be machine learning for diagnosis in radiology, pathology, and dermatology. The evaluation of digital X-rays, MRIs, or CT scans is the assessment of arrays of pixels. Expect future computer programs to be as accurate as or more accurate than human radiologists. The model for machine learning is similar to the fashion in which a human child learns. A child is not given a list of criteria which define what a dog looks like. Instead, the child sees an animal and his parents tell him that animal is a dog. After repeated exposures, the child learns what a dog looks like. Early on the child may be fooled into thinking that a wolf is a dog, but with increasing experience the child can discern with almost perfect accuracy what is or is not a dog. Deep learning is a subset of machine learning, a concept that makes automated decision-making possible. Deep learning is a radically different method of programming computers. It requires massive database entry, much like the array of dogs the child sees in the example above, so that the computer can learn the skill of pattern matching. The program repetitively teaches a machine the identity of certain images, and the system hones this algorithm and becomes faster and more accurate in recognizing similar images.
An AI computer which masters machine learning and deep learning will probably not give yes or no answers, but rather a percentage likelihood of a diagnosis, e.g. a radiologic image has greater than a 99% chance of being normal, or a skin lesion has greater than a 99% chance of being a malignant melanoma. At the present time the Food and Drug Administration (FDA) does not allow machines to make formal diagnoses, and such AI computer applications are only prototypes. But if you’re a physician who makes his or her living by interpreting digital images, there’s real concern about AI taking your job in the future. Some experts believe AIM devices will not replace radiologists, but rather will make their work more efficient and accurate. For example, AI computers can identify MRI or CT scans which are normal, freeing human radiologists to concentrate on scans where an abnormality exists. In this scenario, radiologists would not lose their jobs to AIM computers; instead, radiologists who don’t use AIM machines may lose their jobs to radiologists who do. In pathology, computerized digital diagnostic skills will be applied to microscopic diagnosis. In dermatology, machine learning will be used to diagnose skin cancers, based on large learned databases of digital photographs. Dermatologists must rely on years of experience to learn to discern various skin lesions, but an AI computer can ingest hundreds of thousands of images in a period of months.
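The “percentage likelihood” output described above can be sketched in a few lines of code. What follows is a toy nearest-centroid classifier, not any real or FDA-cleared diagnostic product: the two lesion “features” (diameter, border irregularity) and all the numbers are invented for illustration, and a genuine system would be trained on hundreds of thousands of labeled images.

```python
import math

# Toy training set: each "image" is reduced to two hypothetical features
# (lesion diameter in mm, border irregularity score). Labels and numbers
# are invented for illustration only.
training_data = [
    ((2.0, 0.10), "benign"),
    ((3.0, 0.20), "benign"),
    ((2.5, 0.15), "benign"),
    ((7.0, 0.80), "malignant"),
    ((6.5, 0.90), "malignant"),
    ((8.0, 0.70), "malignant"),
]

def centroid(points):
    """Average feature vector for one diagnostic class."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def likelihoods(features, data):
    """Return a percentage likelihood per class rather than a yes/no answer."""
    classes = {}
    for feats, label in data:
        classes.setdefault(label, []).append(feats)
    # Score each class by closeness to its centroid, then normalize to 100%.
    scores = {}
    for label, points in classes.items():
        dist = math.dist(features, centroid(points))
        scores[label] = math.exp(-dist)   # closer centroid -> higher score
    total = sum(scores.values())
    return {label: 100 * s / total for label, s in scores.items()}

result = likelihoods((7.2, 0.85), training_data)
# The suspicious lesion lands near the malignant centroid, so the program
# reports a high percentage likelihood of malignancy instead of a verdict.
```

The point of the sketch is the output format: the machine hands the physician a graded probability, and the human decides what to do with a 99% versus a 60% likelihood.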
  2. Clinics  In the clinic setting, the desired AI application would be a computer that could input information on a patient’s history, physical examination, and laboratory studies, and via machine learning and deep learning, establish the patient’s diagnoses with a high percentage of success. AI computers will be stocked with information from multiple sources, including all known medical knowledge published in textbooks and journals, as well as electronic health record (EHR) clinical data from thousands of previous hospital and clinic patients. AI machines can remember this vast array of information better than any human physician. AI machines will organize the input of new patient information into a flowchart, also known as a branching tree. A flowchart will mimic the process a physician carries out when asking a patient a series of questions. The flowchart program contains a series of “if . . . then . . .” branches that depend on the patient’s answers. AI will input the information from each new patient and arrive at diagnoses. Once each diagnosis is established with a reasonable degree of medical certainty, an already-established algorithm for treatment of that diagnosis can be applied. For example, if the computer makes a diagnosis of asthma, then an established textbook treatment regimen of bronchodilators will be activated. It’s projected that AIM applications in clinic settings will decrease unnecessary diagnostic tests, lower therapeutic costs, and reduce the manpower needed for outpatient medicine.
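The “if . . . then . . .” branching tree described above can be made concrete with a tiny sketch. The questions, thresholds, and diagnoses here are invented for illustration; a real clinical system would derive its branches from validated data, not from a programmer’s guesses.

```python
def diagnostic_flowchart(answers):
    """Walk a hypothetical fragment of a diagnostic branching tree.

    `answers` is a dict of the patient's yes/no responses; each branch
    mimics one "if ... then ..." question a physician might ask.
    """
    if answers.get("wheezing"):
        if answers.get("symptoms_episodic"):
            return "suspected asthma -> apply bronchodilator protocol"
        return "suspected COPD -> refer for pulmonary function testing"
    if answers.get("fever") and answers.get("productive_cough"):
        return "suspected pneumonia -> order chest X-ray"
    return "no diagnosis reached -> escalate to human physician"

plan = diagnostic_flowchart({"wheezing": True, "symptoms_episodic": True})
# -> "suspected asthma -> apply bronchodilator protocol"
```

Note the final branch: when the tree runs out of matches, the machine defers to a human, which is how the asthma example in the paragraph above would connect to its textbook bronchodilator regimen only after a diagnosis is reached with reasonable certainty.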
  3. Operating rooms  The best current example of robot technology in the operating room is the da Vinci operating robot, used primarily in urology and gynecologic surgery. This robot is not intended to have an independent existence, but rather enables the surgeon to see inside the body in three dimensions and to perform fine motor procedures at a higher level. In my previous essays Robot Anesthesia and Robot Anesthesia II, I described models of robots designed to perform intravenous sedation or intubation of the trachea, products which are futuristic but currently have no market share. The good news for procedural physicians such as anesthesiologists or surgeons is this: it’s unlikely any AI computer or robot will be able to independently replace manual skills such as airway management, endotracheal intubation, or surgical excision. Regarding anesthesiology, I expect future AIM robots will be hyperattentive monitoring devices which follow the vital signs of anesthetized patients, and then utilize feedback loops to titrate or adjust the depth of anesthetic drugs as indicated by these vital signs. Such a robot would not replace a human anesthesiologist, but could serve as an autopilot analogue during the maintenance or middle phase of long anesthetics, freeing up the anesthesia professional so that he or she need not be physically present. This parallels the original role of the nurse anesthetist—to be present during stable phases of anesthetic management—so that the physician anesthesiologist could roam to other operating rooms as needed.
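The feedback loop described above is, at its core, a control problem. Here is a minimal sketch of a proportional controller nudging an anesthetic infusion rate toward a target depth-of-anesthesia index. Every number is invented, and a real closed-loop system would need safety bounds, alarms, and an anesthesia professional supervising the loop.

```python
def titrate(infusion_rate, measured_index, target_index=50.0, gain=0.05):
    """Return an adjusted infusion rate.

    A higher monitor index means a lighter (less anesthetized) patient,
    so the controller infuses more drug; a lower index infuses less.
    The units, target, and gain are hypothetical.
    """
    error = measured_index - target_index
    new_rate = infusion_rate + gain * error   # lighter patient -> more drug
    return max(0.0, new_rate)                 # never a negative rate

rate = 100.0
for index in (70.0, 62.0, 55.0, 51.0):   # simulated monitor readings
    rate = titrate(rate, index)
# rate has been stepped upward while the patient ran lighter than target
```

The design choice matters: the loop only trims the infusion toward a setpoint during the stable maintenance phase, which is exactly the autopilot-analogue role described above, not a replacement for the human managing induction and emergence.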

What will an AIM robot doctor look like? It’s unlikely it will look like a human. Most sources project it will look like a smartphone. I’d expect the screen to be bigger than a smartphone screen, so an AIM robot doctor will likely look like a tablet computer. For certain applications such as clinic diagnosis or new image retrieval, the AIM robot will have a camera, perhaps on a retractable arm so that the camera can approach various aspects of a patient’s anatomy as indicated. Individual patients will need to sign in to the computer software system—this will be done via tools such as retinal scanners, fingerprint scanners, or face recognition programs—so that the computer can retrieve that individual patient’s EHR data from an Internet cloud. It’s possible individual patients will be issued a card, not unlike a debit or credit card, which includes a chip linking them to their EHR data.

How will we define if these medical computers are truly intelligent? The accepted test for machine intelligence is the Turing test, as described by computer scientist Alan Turing in 1950. In the Turing test, a human evaluator interacts with two players via a computer keyboard. One of the players is a human and the other a machine. If the evaluator cannot reliably tell the machine from the human, the machine is said to have passed the test, and is deemed intelligent.

What will be the economics of AIM? Who will pay for it? Currently America spends 17.6% of its Gross Domestic Product on healthcare, and this number is projected to reach 20% by 2025. Entrepreneurs realize that healthcare is a multi-billion dollar industry, and the opportunity to earn those healthcare dollars is a seductive lure. Companies are looking to merge increasing computing power available at steadily decreasing costs, big data from large EHR patient populations, and artificial intelligence, with an aim to drive down the costs of health care while increasing effectiveness. Expect to see the development of increasingly cheaper AIM devices to augment the skills of human physicians, or maybe replace them in some job descriptions. The government’s medical costs may decrease if work currently done by expensive-to-train physicians is instead performed by nurse practitioners or nurses aided by artificial intelligence machines, supervised by relatively few human physicians. Google’s United Kingdom subsidiary DeepMind is working on AIM projects. DeepMind is using machine learning to analyze eye scans from more than a million patients, with the aim to create algorithms which can detect early warning signs of eye diseases that human physicians might miss. Google researchers have also developed an AIM computer to screen for and analyze the spread of breast cancer cells in lymph node tissue on pathology slide images. Scientists at the Memorial Sloan-Kettering Cancer Center in New York have programmed over 600,000 medical evidence reports, 1.5 million patient medical records, and two million pages of text from medical journals into IBM’s Watson computer. Equipped with more information than any human physician could ever remember, Watson is projected to become a diagnostic machine superior to any doctor.

There’s a worldwide shortage of physicians. The earliest a human physician can enter the workforce is age 29, after completing 4 years of college, 4 years of medical school, and 3 years of the shortest residency (e.g. internal medicine, pediatrics, or family practice residency). A major advantage of AIM is that the machines won’t require 24 years of education. Can America afford to train people for nearly a quarter of a century to then sit in a clinic and perform histories and physicals on patients who have chronic illnesses such as hypertension, hyperlipidemia, and obesity? Shifting these jobs to allied healthcare providers such as physician assistants or nurse practitioners is a cheaper alternative, but what could be cheaper than an AIM machine module which either assists one physician to evaluate a vast number of patients, or an AIM module of the future which replaces the physician entirely?

When can we expect to see new AIM tools adopted in clinical practice? Web-based smartphone apps such as Your.MD and Babylon already exist to assist physicians in diagnosis. You can anticipate the application of machine learning in the diagnosis of digital images soon. The DeepMind and Watson computers are blazing a trail toward machine learning in clinical medicine. Expect the FDA to assess the new technologies, and when it is safe and appropriate, to approve machine diagnosis as part of the practice of medicine. Remember how fast we advanced from a cell phone the size of a breadbox to the powerful smartphone that fits in the palm of your hand today. In the ten years since the introduction of the iPhone in 2007, who could have imagined the vast array of applications we carry in our pocket or purse in 2017?

AIM is coming. It will arrive sooner than we think, and in all likelihood it will be more powerful and more wonderful than we can imagine. A brave prediction: AIM will change medicine more than any development since the invention of anesthesia in 1846.

I can’t wait to see it.


Recommended reading:

Hsieh P. AI in Medicine: Rise of the Machines. Forbes, April 30, 2017.

Mukherjee S. A.I. Versus M.D.: What Happens When Diagnosis Is Automated? The New Yorker, April 3, 2017.

Maney K. How Artificial Intelligence Will Heal America’s Sick Healthcare System. Newsweek, May 24, 2017.

Omni staff. Artificial Intelligence in Medicine. Omni, 2016.

Bhavsar N, Norman A. Artificial Intelligence Is Completely Transforming Healthcare. Futurism, April 3, 2017.

Dickson B. How Artificial Intelligence Is Revolutionizing Healthcare. May 2017.

Russell S, Norvig P. Artificial Intelligence: A Modern Approach, 3rd Edition. Prentice Hall, 2010.







