What is Telemedicine?

Telemedicine allows healthcare professionals to evaluate, diagnose and treat patients at a distance using telecommunications technology. The approach has evolved strikingly over the last decade, and it is becoming an increasingly important part of the American healthcare infrastructure.

What we recognize as telemedicine today started in the 1950s, when a few hospital systems and university medical centers began exploring ways to share information and images via telephone. In one of the first successes, two health centers in Pennsylvania were able to transmit radiologic images over the phone.

In the early days, telemedicine was used mostly to connect doctors working with a patient in one location to specialists somewhere else. This was of great benefit to rural or hard-to-reach populations where specialists aren't readily available. Over the following decades, the equipment needed to conduct remote visits remained expensive and complex, so use of the approach, while growing, stayed limited.

The rise of the internet age brought profound changes to the practice of telemedicine. The proliferation of smart devices capable of high-quality video transmission opened up the possibility of delivering remote healthcare to patients in their homes, workplaces or assisted living facilities as an alternative to in-person visits for both primary and specialty care.