This article appears in the September 2012 issue of HealthLeaders magazine.
The healthcare industry, which has long trailed other industries in its use of analytics, is developing into a seedbed for research on advanced analytics, including topics such as natural language processing, artificial intelligence, and genomics.
The healthcare analytics world last year took particular note of IBM's Watson project, a supercomputer able to answer questions posed in natural language that defeated human competitors on the TV quiz show Jeopardy! IBM then announced it was working with Columbia University and others to commercialize Watson for clinical analytics and decision support.
At HCA, Jonathan Perlin, MD, chief medical officer and president of the clinical and physician services group, says, "We're doing some advanced work in terms of looking at how we might use natural language processing to detect subtleties in data and, in the future, even better support for precision medicine and personalized care."
In March, Memorial Sloan-Kettering Cancer Center announced its partnership on the Watson research. "We're still in the early work, very actively working on this project with IBM," says Patricia Skarulis, vice president of information systems and CIO. "All I can tell you is they are devoting a lot of resources and we are devoting a lot of resources. Database people, analytics people, our senior physicians. It's been a fun project."
Indeed, the future for analytics in healthcare only gets bigger and more ambitious from here.
For one thing, the amount of data being collected for analysis is exploding. "We're bringing quite a bit of genomic data right now into our clinical warehouse," Skarulis says. With the consent of its patients, the organization is studying pathology information from colon cancer patients and certain lung cancer patients, currently importing results from 60 different tests.
"What makes our people very excited is to have the genomic data, and to have this pristine clinical data to be able to combine with it, we think, will be an extraordinarily useful resource tool," Skarulis says.
Researchers are tapping into even newer analytics technologies to analyze structured and unstructured healthcare data. At UW Health in Madison, Wis., chief research information officer Umberto Tachinardi, MD, is creating an advanced data warehouse built in part on technology from I2B2 (Informatics for Integrating Biology and the Bedside), an NIH-funded National Center for Biomedical Computing at Boston-based Partners HealthCare System.
UW Health represents the academic healthcare entities of the University of Wisconsin-Madison: UW Medical Foundation, the 566-bed UW Hospital and Clinics, UW School of Medicine and Public Health, American Family Children's Hospital, and UW Carbone Cancer Center.
The open-source software powering I2B2 is used by 60 large academic medical centers, including UW Health, that have received Clinical and Translational Science Awards from the National Institutes of Health, Tachinardi says. I2B2 accelerates the process of copying parts of electronic health records into deidentified data sets so researchers can more easily identify interesting phenotypes within the data, he adds.
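The deidentification step Tachinardi describes can be illustrated with a minimal sketch. The field names and hashing scheme below are hypothetical, chosen only to show the idea of stripping direct identifiers while keeping records linkable; real i2b2 pipelines are far more sophisticated.

```python
import hashlib

def deidentify(record, salt="study-secret"):
    """Copy a patient record, dropping direct identifiers and
    replacing the medical record number with a one-way hash."""
    identifiers = {"name", "address", "phone", "mrn"}
    clean = {k: v for k, v in record.items() if k not in identifiers}
    # A stable pseudonym lets researchers link one patient's records
    # across tables without exposing the real MRN.
    clean["pseudo_id"] = hashlib.sha256(
        (salt + str(record["mrn"])).encode()
    ).hexdigest()[:12]
    return clean

record = {"mrn": "00123", "name": "Jane Doe", "address": "1 Main St",
          "phone": "555-0100", "diagnosis": "E11.9", "age": 54}
print(deidentify(record))
```

The clinical fields survive; the identifying ones do not, which is what lets researchers hunt for phenotypes without handling protected information.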
"Curating information is a very expensive part of our business, regardless of being a research or clinical organization," Tachinardi notes. For instance, even a seemingly simple data type such as gender becomes complicated given the growing transgender population.
"In the research world we're already used to living with those complications. But the clinical world is just starting to learn about them right now," Tachinardi says.
My daughter wonders why it would be preferable for everyone to listen to the same music in her art class. Surely it would be better if everyone could listen to their own music on their own iPods or other music players.
The art teacher points out that the point of listening to the same music is to have a shared moment, not merely to inspire a painting. But in a society populated now with people listening to their own private music or media, it's like swimming against a current.
When you think about it, those shared moments can be few and far between these days. Whether we're watching TV, or surfing the Web, or listening to music, it's all too easy to slip into solitary, passive receptivity.
Within healthcare it's all too easy to slip into that same kind of passivity, even (or especially) when information in the IT system is flowing fast and furious. Sure, patients can demand their medical records, but can they read them and act upon the deluge of info they may find there?
"We silence the conversation with technology," says Steven D. Freedman, MD, PhD, chief of the division of translational medicine and professor of medicine at Harvard Medical School. Now, Freedman is about to begin a clinical trial to change that, and it starts at the very moment that patient meets doctor.
Freedman's initiative, called Passport to Trust, will provide greater structure to doctor-patient interactions through a technology that will ultimately sit on top of any electronic medical record.
It won't be the first time technology has jumped from the financial world into healthcare; this time the technology comes from NexJ Systems. "Their approach brings together all the different digitized information and data, but leaves the original source at the original database in the hospital, which is very important to hospitals and security," says Camilia Martin, MD, MS, a member of the Passport to Trust team who developed the initial electronic prototype.
"These decision points and the knowledge must be formed around the patient, not so much formed around what the physician, or practice, or hospital, is doing," Martin says.
The trick is to allow the physician, within the workflow of a typical day, to generate a care plan and give a copy to the patient before the end of the visit, while putting a copy of the plan in the medical record, Martin says.
"Once we have that, then we're going to move forward in expanding that so that then it's accessible through the Internet and the Web by anybody and everybody that's given permission," she says.
Beth Israel Deaconess Medical Center, where Freedman practices, was just named by InformationWeek as the number one technology innovator this year, beating out IT powerhouses such as John Deere, Zynga, and Acxiom Corp.
And yet, it's a reflection of many hospitals, having more than one EMR: in this case, its homegrown Online Medical Record, as well as cloud-based eClinicalWorks out in the practices. Passport to Trust will have to work with both.
In the Passport to Trust model, office visits get structure by inserting a time-out into the visit, a little like the time-outs in operating rooms where patient identity and operational details are reviewed, Freedman says.
"I go over, okay, so what is your problem, what at least is in my mind as to potential causes of that problem, and then how would any tests or treatments truly change management, and then what's our stepwise plan?" he says. "What are we doing, each week, until I have you better? And then lastly, what are your thoughts and concerns, from the patient side? And so that's how I've structured things."
"It's interesting that everyone has a slightly different take on what they think would be an important outcome measure to show," Freedman says. "Insurers want whatever the trial is by whomever to show decreased cost, and that at least you don't worsen healthcare quality." But the project won't truly succeed unless it shows patient engagement, he adds.
A paper-based pilot study found patient satisfaction with having a plan of action from their doctor rising from 34 percent to 94 percent. "People were actually astounded that they had a simple plan that they could then follow," Freedman says.
"The EMRs are essentially a one-way archive," Freedman says. "We dump in lots of data—your lab results, imaging—and the patients really don't have very good access to it, and if they do, it's in a language that's, for the most part, not understandable to them, and that's why we're trying to create this."
"No one ever thinks about the fact that the whole relationship that's built around the patient and the physician starts at that office visit or that encounter," Martin says.
"No one's trying to structure that, yet there's been plenty of literature to suggest that patient satisfaction, literacy, and understanding the issues actually dictates their compliance. It dictates their follow-through. It dictates everything they do after leaving that office, that drives cost. That plan that's generated in the office is the biggest driver of cost, but no one's just looking at that simple relationship."
In a CYA healthcare culture, just getting doctors to justify tests and procedures to patients during those consultations will save money. The clinical trial, still being designed, will focus on conditions such as abdominal pain and lower back pain, comparing the number of lab tests and radiological scans ordered by participating doctors against those ordered by a control group of doctors not using the Passport to Trust techniques. But Freedman admits it won't be easy showing changes in cost in a short time.
So the first trial will almost certainly be followed by others. NexJ is providing the current funding for the digitization of Passport to Trust, but Freedman envisions applying to the Robert Wood Johnson Foundation or the Commonwealth Fund for additional funding. The clinical trial will last a year. "We have had many conversations with different stakeholders—insurers, employers, other healthcare groups, physicians, PBMs," he says.
Imagining a patient population that is truly informed, engaged, and passionate about their care is surely one of the brass rings of everything we write about at HealthLeaders. In the rush to implement Meaningful Use, patient engagement is not at the top of many health executives' agendas today. But sooner or later, it will be. The office encounter should be a conversation, not just a data download. Efforts like Passport to Trust will be leading the way.
Ever since the first experiments with telemedicine, providers have been taking steps to move healthcare closer to where patients live and work. Now, mobile technology—epitomized by the millions of such apps already downloaded to smartphones, but also appearing in nearly unlimited form factors—is accelerating those steps.
At Boston's Partners HealthCare, a system with 2,700 licensed beds, 45 employees scrutinize these developments at the Center for Connected Health. One early effort to equip cardiac patients with remote monitoring technology resulted in a 50% drop in readmissions, says Joseph Kvedar, MD, founder and director of the center.
"We're all committed to a healthcare delivery model that moves care out of the hospital, out of the office, and directly and continuously into the lives of patients," Kvedar says. "We find that the best technologies to facilitate that vision are monitoring and communications technologies properly applied."
Kvedar says his team sees "two reproducible value propositions over and over again" regarding mHealth. One is improved patient self-care. "That to me is the most exciting one, that we can arm patients with data about themselves in context, and they manage it not dissimilar to the way a baseball manager manages a lineup of batting averages. They can see what they're hitting and if they need to improve something. They can do that and watch their numbers change. It's very, very powerful."
The second value proposition, Kvedar says, is just-in-time care. "We give providers a dashboard view of their population, informed by all of these sensor data, connected health data that are streaming in from those patients, and then enable those clinicians to reach into the lives of individuals who need the most at that moment in time," he says.
The sheer power of smaller, cheaper, and faster healthcare is evident in today's mHealth solutions; dermatology, Kvedar's specialty, has been an early beneficiary. "When I started doing this work, the camera we used was a $12,000 device that was about the size of a shoebox, and it had less than one megapixel resolution," he says. "Now you can do everything on your iPhone or your Android smartphones, so teledermatology is coming into its own. We finally are at the point where the technology to effect image capture and history entry is so easy that anyone can do it on the fly, and the amount of incremental time that the referring provider needs to put in is almost zero."
The cardiac patient remote monitoring app Kvedar mentions happens to be delivered in a tabletop device made by McLean, Va.–based ViTel Net, which is owned by the Bosch Group, but only because that patient demographic is less comfortable using a tablet or phone interface. But more and more, tablets and phones are the form factor of choice, Kvedar says.
Despite some continuing data breaches, security of health data on mobile devices is improving, Kvedar says. "I think it's a solved problem, to the degree that any information these days is always subject to being hacked," he says. "You can never say glibly, ‘That'll never happen,' but we use secure sockets, we use all kinds of authentication tools, we make sure our vendors pass a very rigorous security audit, so we're very particular about privacy and security and take it very seriously so that we can protect and maximize our patients' privacy. I don't see it as a big barrier."
In an age where "there's an app for that" is a catchphrase, healthcare is grabbing its share of the spotlight. At the Consumer Electronics Show in January, Eric Topol, MD, chief academic officer of Scripps Health, a five-hospital, $2.5-billion nonprofit health system in San Diego, made headlines by appearing during a keynote speech and recounting his use of an experimental device, the AliveCor iPhone ECG.
Doubling as an iPhone case, the AliveCor ECG includes two outward-facing sensors. "You just make a circuit with your heart," Topol told the CES audience. The phone displays the patient's cardiogram. "I use it in clinic now all the time for my patients instead of a regular cardiogram."
Topol went on to tell the audience that he was on a cross-country flight while carrying the device. "They called for a doctor on the plane for a passenger in the back," he said. "With this phone, I could make a diagnosis of a significant heart attack, which led to an emergency landing, and fortunately the fellow did very well."
As of mid-August, the device was not for sale and its website stated that it was "Not cleared by the FDA for sale in the United States." A spokesman for Topol said that patient privacy prevents Topol from revealing any other details about the incident.
But the ripple effect from stories such as Topol's is being felt throughout healthcare, and the "cool" factor of providers developing their own apps is thrusting them into entirely new spotlights.
At the Apple Worldwide Developer Conference in San Francisco in June, the new Mayo Clinic Patient App— which lets Mayo Clinic patients access their personal medical record, appointment schedule, and other services—was highlighted by Apple.
With more than 4,000 physicians and scientists, a 140-year history, and more than 1 million patients seen annually, Mayo has its own team of 60 people—including designers, project managers, physicians, and nurses—working in its Center for Innovation.
"We have a declared mission to touch in a meaningful way 200 million lives," says Michael Matly, MD, director of business development and new ventures at Mayo's Center for Innovation.
"Unlike other innovation centers in academic medical centers, we focus strictly on health delivery, so our motto is transforming the experience and delivery of health and healthcare," Matly says. Mayo Clinic also partnered with Rock Health, a San Francisco–based incubator for startups led by CEO Halle Tecco. Many of the startups are building on mobile platforms.
"You really have a multitude of industries intersecting, so [Tecco] was able really to bring all these different groups around the table," Matly says. "You have payers and the venture community and technology companies and providers all coming around looking at these new technologies and health tech companies.
"We sift through this large list of startups and we pick a handful of them where we think there's an opportunity for Mayo to add value, so we basically match these companies with clinical champions within the practice that can work with these technological entrepreneurs to build better products."
Both Partners and Mayo are also wrestling with how mHealth apps will be financially supported in the post-fee-for-service world now unfolding.
A part of the solution may be to leverage mHealth to reduce the number of doctor visits and hospital admissions. Matly points to technology from CellScope, a San Francisco–based startup that provides a small iPhone attachment that turns the phone's camera into an otoscope. Sixty percent of pediatric visits are due to ear pain, but CellScope's technology would allow parents to snap what Matly calls "beautiful" in-home pictures of their kids' eardrums, then send them to doctors for diagnosis.
"A payer would be very interested, because if I can reimburse you $40 instead of going in for a $200 visit, it's probably better for me and better for the patient," Matly says.
Providers see mHealth as a way to keep out of the hospital patients upon whom they are losing money, such as Medicare patients, says Christopher Wasden, global healthcare innovation leader at PwC, a leading advisor to public and private organizations across the health industries.
According to a study published in March by the Geisinger Center for Health Research (part of the Danville, Pa.–based, $2.7 billion integrated health system), the Geisinger Monitoring Program interactive voice response protocol reduced 30-day hospital readmission rates by 44%. Wasden says this lightweight approach succeeds where more tech-intensive telemonitoring solutions have proven inconclusive.
Still, there is resistance to mHealth. Wasden mentions a large healthcare system that told him that such change would require a substantial change management program that includes educating physicians and nurses on how to deliver this type of care, and then changing their work flow so they can do it.
Beyond these leadership challenges, many providers are not prepared to accept the massive amount of new data generated spontaneously by sensors, then uploaded from apps on mobile devices to their data centers, Wasden says. "Doctors are already overwhelmed by the data they have. They don't want more data. And they especially don't want more data generated by a patient where they don't even trust the data that the patients generate. So while there's a lot of promise associated with the data, we don't actually have the tools and the capabilities and the applications necessary to really know what to do with the data to actually have it be of any value or meaningful use within the practice."
The disruption that truly mobile telemonitoring will inflict on healthcare goes deeper, Wasden says. "Doctors are very comfortable making gut decisions practicing medicine on an empirical and an anecdotal basis, but they're not comfortable saying that, ‘I made this decision based upon an analysis of the data that says that you have a 98% chance of doing better if we do this than doing that,' " he says.
Speaking for himself and not for Mayo, Matly expects direct-to-consumer payment to closely track mHealth technology adoption, while Kvedar sees the system working itself out somehow.
"Providers of all stripes, dermatologists included, are now open to different models of care delivery," Kvedar says. "They're more likely to say, ‘Okay, what we're going to do is concentrate on doing the right thing, and because these reimbursement models are changing, we have the faith to figure out that we'll be able to get paid for our work.' "
For now, mHealth has yet to approach the fullness of its promise. But the way forward may be getting clearer with time. "In Massachusetts we are going at risk with every single payer in our system," Kvedar says. "We also already have a signed contract with BlueCross of Massachusetts that puts us at risk, and we're negotiating with all of our other payers, so we're changing the mind-set of our organization to be less focused on volume and more focused on value. As we do that, tools like connected health become very appealing."
As worrisome as the final deadlines for use of ICD-10 codes are, it's time to devote significant resources to getting ready for them.
Belittled in some quarters as a make-work, vendor-enriching government regulation, ICD-10 actually gets right to the heart of improving the quality of care.
Don't take my word for it, even if you read my recent story in HealthLeaders magazine. Listen to Sharon Korzdorfer, director of information management at St. Luke's Hospital of Kansas City.
St. Luke's is a not-for-profit, acute-care, tertiary academic teaching institution with more than 600 beds, and one of 11 hospitals in its health system. Eight of the facilities run McKesson EMRs, and the three critical-access hospitals use CPSI, a platform designed specifically for critical-access hospitals, with a completely different billing process from McKesson's.
It is the all-too-common bifurcations like this within healthcare IT that make the rich coding provided by ICD-10 so important, not just for satisfying the demands of payers who want ever more billing detail, but to exchange anatomically precise clinical data between different types of hospitals in the same system.
Korzdorfer plans to deploy ICD-10 in both systems at the same time. The work starts with an extensive evaluation of the skill levels of staff in areas of pharmacology, physiology, anatomy, and basic terminology. The skill set ranges from coders with one or two years' experience to some who have coded for more than 30 years, she says.
The education component seems like the biggest piece of ICD-10 to me. Hospitals might have some staff who are familiar with the cardiovascular system, but need further help mastering the integumentary system. Korzdorfer says it will take this whole year to get staff the kind of supplemental education needed to prepare for ICD-10.
Like so many others I've spoken with, Korzdorfer says preparation is proceeding as if the original 2013 go-live with ICD-10 were still the case, rather than the 2014 date recently formalized by CMS. But that sound you hear is the feet still being dragged at too many provider institutions.
It's time to stop the foot dragging.
Now, the question is, how.
A simple, clean cutover from earlier coding systems to ICD-10 may not be possible.
"I don't think there's a perfect solution out there, but I think that payers being ready first may make the most sense," says Janice Jacobs, director of regulatory affairs at IMA Consulting, a national independent healthcare management consulting firm working with more than 700 hospitals and health systems throughout the U.S.
And yet, CMS was mute on this suggestion, which was promoted by the Medical Group Management Association among others. So the payers and the health plans get just as long as the providers to get their ICD-10 coding systems up and running. (Aetna, for one, has pledged to be fully ready to process ICD-10 claims by October 1, 2014.)
Jacobs is the experienced voice of reason in a corner of information technology fraught with claims and counterclaims. She is vehemently dismissive of vendors who claim to have ways to generate ICD-10 codes automatically from other coding systems. For instance, I asked her about a suggestion from the American College of Physicians that ICD-10 codes could be generated automatically from SNOMED-CT terms.
"SNOMED has about 300,000 codes, so even with ICD-10 expanding [by] 168,000 codes, you're still looking at almost double the codes SNOMED would have over ICD-10," Jacobs says. "So how are you going to accurately crosswalk 300,000 codes into 168,000 codes accurately and automatically? That's where I see the problem there. They are two different systems. They serve two different purposes, and there's double codes in SNOMED, so that's what I see as the issue with anything automated."
Another problem, according to Jacobs: Neither SNOMED nor ICD-10 group codes into a Medical Severity Diagnosis Related Group, or MS-DRG. "You'll still need coding personnel that will take the codes, even if they are somehow mapped accurately, and group them into the appropriate MS-DRG codes for reimbursement. So you're kind of looking at a clinical terminology system, a clinical tracking system, versus what you need to get a bill out the door."
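Jacobs' crosswalk objection is easy to demonstrate with a toy mapping. The codes below are invented for illustration, not real SNOMED or ICD-10 entries; the point is that when one source concept maps to several candidate targets, or to none, no lookup table can choose correctly without a human coder applying clinical context.

```python
# Toy crosswalk: each source concept may map to zero, one, or many targets.
crosswalk = {
    "SRC-001": ["TGT-A"],           # clean one-to-one: automatable
    "SRC-002": ["TGT-B", "TGT-C"],  # one-to-many: needs a coder to choose
    "SRC-003": [],                  # no equivalent: needs a coder entirely
}

def auto_map(concepts):
    """Map what is unambiguous; flag everything else for human review."""
    mapped, needs_review = [], []
    for c in concepts:
        targets = crosswalk.get(c, [])
        if len(targets) == 1:
            mapped.append(targets[0])
        else:
            needs_review.append(c)  # ambiguous or unmapped
    return mapped, needs_review

mapped, review = auto_map(["SRC-001", "SRC-002", "SRC-003"])
print(mapped)   # only the one-to-one case maps automatically
print(review)   # the rest lands back on a coder's desk
```

Scale this toy up to hundreds of thousands of codes and the review pile, not the automated pile, dominates, which is exactly Jacobs' point.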
One thing that bears additional scrutiny is the possibly disingenuous proposal earlier this year by the American Medical Association to study ICD-10's successor, ICD-11, with the thought of going directly to ICD-11.
"ICD-11 is in the beta phase right now," Jacobs says. "So we really don't know what it is going to look like when it is finally released in 2015. To say that we'll just forego ICD-10 and go to ICD-11 when we don't even know what the final product is going to look like is really frivolous, I think."
According to another consultant I spoke with who prefers to remain unnamed, the AMA has a hidden agenda when it comes to slowing down adoption of ICD-10.
This consultant says the AMA derives significant revenue from providing codes for a somewhat competitive procedure coding system known as Current Procedural Terminology, or CPT.
"The AMA licenses CPT to the federal government and agrees to use it exclusively for characterizing physician work," the consultant says. "That's why in the ICD-10 uptake, doctors didn't have to characterize their work with ICD-10 PCS procedure codes. They only had to do diagnosis codes. They were going to continue to use CPT."
Each year, companies pay the AMA license fees for the current version of CPT to embed in the practice management software piece of electronic health records, the consultant says. "CPT owned by the AMA is the AMA's major source of income, because the AMA updates CPT every year," says the consultant.
AMA critics on this CPT issue have been pushed to the margins, and my raising this concern may place me on the margins as well. But it does provide one plausible theory of why the AMA wants to continue to throw a spanner in the ICD-10 works, ostensibly in favor of ICD-11.
And Janice Jacobs' words of concern about the infeasibility of crosswalk technology from any coding system to any other coding system still echo. For sanity's sake, our industry's implementation of the government's ICD-10 mandate needs to hurtle forward, and now.
Shameless plug: If you're ready to roll up your sleeves and get started, or even if you're already under way, a perfect place to start or continue would be my upcoming HealthLeaders Webcast, "Reboot for ICD-10: Lessons from UnitedHealth Group & North Shore-LIJ," scheduled for Monday, October 22 from 1:00 to 2:30 p.m. Eastern time.
In honor of National Health IT Week next week, here are my top 10 predictions for healthcare IT for the next 12 months—none of them involving Meaningful Use or ICD-10!
1. Patients ask, where's my data? Patients will organize a single-day national event called Where's My Medical Data, in which providers and payers will be besieged by emails and phone calls from patients wanting their medical records. Patients will complain loudly at the slowness of the responses, the outright refusal by some providers, and the complexity of the records received.
While the scenario might not play out exactly in this form, I heard this proposal floated at the recent Healthcare Unbound conference in San Francisco, where it received the encouragement of Farzad Mostashari, the National Coordinator for Health Information Technology. It hasn't yet become an ONC initiative (the office is a little busy right now), but patients might lead the way.
2. Higher software prices allow EMR makers to staff up. Providers in turn will call upon software makers of electronic medical records to redesign their products to allow easy generation of records for patient use. A rise in the cost of such products, due to a supply squeeze, will enable EMR software makers to raid the ranks of other high-tech companies such as Google and Microsoft in order to staff up. But the principles embodied in Fred Brooks' timeless book, The Mythical Man-Month, will slow progress; adding designers and programmers still doesn't produce linear progress in software.
3. The human touch becomes a major tech issue. A bumper sticker spotted where I live in Berkeley, CA, says, "It's become appallingly clear that our technology has surpassed our humanity." We are running a risk of losing the human touch in an age of health tech marvels. Teams may be communicating better than ever, but from the patient's point of view it's a blur of emails, messages, phone calls, and faces. The medical home is one response to the depersonalization of medicine. Can tech provide other "repersonalizing" experiences? Examples include videoconferencing, social networking, technology-mediated support groups, and simple time on the phone with a physician.
4. Tablets replace expensive videoconferencing gear. Too much of the videoconferencing gear in hospitals today looks like the giant screens of the original Star Trek series. If you want to know where it's going, look at what Captain Jean-Luc Picard used in Star Trek: The Next Generation: a small screen in his quarters, for more confidential communications without losing that face-to-face factor. A telecommunications executive recently told me he had informally checked the usage logs of expensive videoconferencing systems at hospitals—and found them woefully underutilized. Now that tablets are proliferating, look for those to be employed, perhaps even in group settings, as the videoconferencing system of choice.
5. Identity crisis. Information flows at the speed of trust. If massive EHR use is to avoid massive fraud, a national patient ID (and provider ID) system is a requirement. For example, there is a huge number of women named Maria Gonzales in Los Angeles County. With multiple payers, providers, and government agencies trying to keep track of all of them, there's also huge potential for fraud as medical records are automated.
Will a standard U.S. healthcare ID happen, and if so, how? What are the risks to healthcare leaders if it doesn't happen soon? We have President Bill Clinton to thank for signing a law that prohibits establishment of a national healthcare ID system, but we'll either need to amend that or use some technological tricks to achieve the effect of an ID system without violating the existing law. Patients themselves may have to assert their digital identities. For more on this concept, check out the Personal Data Ecosystem Consortium, an industry effort that brings together the best thinking over the past 20 years about how to get identity management right. And if we're lucky, it won't take a Department of Homeland Security to do it.
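Without a national ID, systems fall back on probabilistic record linkage over demographic fields, and the Maria Gonzales problem shows why that is fragile. The scoring below is a deliberately crude sketch with made-up weights; real master-patient-index software uses many more fields and statistically derived probabilities.

```python
from datetime import date

def match_score(a, b):
    """Crude record-linkage score on demographic fields.
    Invented weights, for illustration only."""
    score = 0
    if a["name"].lower() == b["name"].lower():
        score += 2
    if a["dob"] == b["dob"]:
        score += 3
    if a.get("zip") == b.get("zip"):
        score += 1
    return score

p1 = {"name": "Maria Gonzales", "dob": date(1970, 3, 5), "zip": "90011"}
p2 = {"name": "Maria Gonzales", "dob": date(1970, 3, 5), "zip": "90022"}
p3 = {"name": "Maria Gonzales", "dob": date(1982, 7, 9), "zip": "90011"}

# p1 vs p2: same name and birth date -- but is it the same woman?
print(match_score(p1, p2))  # high score, yet possibly two different people
print(match_score(p1, p3))  # same common name only
```

Whatever threshold you pick, common names produce both false merges (dangerous) and missed matches (costly), which is the opening that fraud exploits and that a trusted identifier would close.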
6. A systematic fix for alert fatigue. Devices bombard clinicians and executives with alerts, for everything from life-threatening errors to suggestions from purchasing on how to save money. Clinicians say enough! But quality mavens insist on many of the alerts. IT systems can be redesigned around human factors, but a systems approach is also needed. In the technology world, the network management folks unified all alerts starting in 1988 with the Simple Network Management Protocol, or SNMP for short. SNMP and its successors are why computer networks today are manageable even though hardware still fails. Healthcare IT needs its own SNMP. Maybe this year it will get it.
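To make the SNMP analogy concrete, here is a minimal sketch of what a unified alerting layer does: translate each vendor's private alert codes into one common severity scale, then apply a single suppression policy across all devices. Every device name, alert code, and severity mapping here is invented for illustration.

```python
# Common severity scale shared by every device feed.
SEVERITY = {"CRITICAL": 3, "WARNING": 2, "INFO": 1}

# Per-device translation tables into the common vocabulary (hypothetical).
VENDOR_MAP = {
    "infusion_pump":   {"OCCLUSION": "CRITICAL", "BATTERY_LOW": "WARNING"},
    "cardiac_monitor": {"ASYSTOLE": "CRITICAL", "LEAD_OFF": "WARNING",
                        "SELF_TEST": "INFO"},
    "pharmacy_system": {"DOSE_RANGE": "CRITICAL", "FORMULARY_SWAP": "INFO"},
}

def normalize(device, code):
    """Translate a device-specific alert into the common schema."""
    severity = VENDOR_MAP[device][code]
    return {"device": device, "code": code,
            "severity": severity, "rank": SEVERITY[severity]}

def route(alerts, floor="WARNING"):
    """One suppression policy for all devices: drop anything below the floor."""
    return [a for a in alerts if a["rank"] >= SEVERITY[floor]]

raw = [("cardiac_monitor", "SELF_TEST"), ("infusion_pump", "OCCLUSION"),
       ("pharmacy_system", "FORMULARY_SWAP"), ("cardiac_monitor", "LEAD_OFF")]
delivered = route([normalize(d, c) for d, c in raw])
print([a["code"] for a in delivered])  # → ['OCCLUSION', 'LEAD_OFF']
```

The design point is the same one SNMP made for networks: once everything speaks one schema, suppression and routing become a single policy decision instead of a per-vendor fight.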
7. Patient adherence for fun and profit. Technology is poised to make sure that patients take their meds as directed, get exercise, lose weight, and report changes in their conditions promptly. Lives will be saved. Accountable care won't work without it. And healthcare is starting to deliver it. More patients will see savings on their health insurance premiums if they comply with these guidelines. Clever software developers may deliver bonus secret levels of Angry Birds to successful weight-loss patients, which could be even more motivating to some than cash.
8. Medical homes and medical neighborhoods lead to medical cities. Technology these days is geo-this and geo-that. Population health efforts have liberated tons of health data, which is being analyzed at every geographic level. Look for lots more analysis of what makes entire cities healthy or sick. Walkability scores will take their place alongside other factors and could begin to factor into health insurance premiums. The data is all out there, waiting to be tapped.
9. Social network–powered, peer-to-peer training replaces older company-based, HR-style training. Executives do this already. If you are a CMIO, you go to AMDIS conferences and learn from your peers. If you are a CIO, you go to CHIME and HIMSS events. The AMA takes care of doctors, and various specialties have their own events. Distance learning is becoming dominant in universities.
There's no reason the rest of the healthcare line staff has to sit in rooms training, or retraining, on their EMRs when they could be part of a virtual classroom, mentored by a peer from somewhere else on the planet who knows exactly what they're going through and can answer their questions. Instructors will become more like resource personnel or librarians. The didactic lecture as a method of HR-driven training will become rare, and training will cease to be a nonproductive cost center.
10. People trump technology.
Healthcare leaders are all around us. But it takes dogged determination to not get swept up in the bits and bytes. Too much technology is surrounded by a candy coating of hype. To cut through it, check out Quantified Self, a new movement of people determined to use technology in a fundamental way to track what they are doing. I attended my first Quantified Self meetup in San Francisco last week, and it was a fascinating mix of geek-love and the kind of excitement generated by a successful Weight Watchers meeting. Quantified patients, whose enthusiasm and collective tech already outweigh those of the government, may be emerging as the cutting edge of medical research a year from now. They are also figuring out some of the finer points of patient privacy versus sharing in a way that vendors, and providers, are only now getting around to understanding.
If half or more of these predictions actually come true in the next year, healthcare will be better for it. I was inspired to write this column by HIMSS, which is running a Blog Carnival to commemorate National Health IT Week.
Five weeks ago I wrote that the then-imminent release of the Meaningful Use Stage 2 final rules did not mean vendors could turn a crank and produce software code that healthcare providers would be able to seamlessly implement in order to meet deadlines looming in 2013 and 2014.
On Friday, during an online briefing about the just-released 2014 final rules, the squeeze on vendors seemed to tighten during an exchange with ONC federal policy division director Steve Posnack.
Someone asked him when certification of vendor code for Meaningful Use 2014-compliant software would begin.
"It's hard for me to give a definitive timeline," Posnack responded. "The test procedures have to be published for all the new certification criteria, and those are in the process of being finalized right now. Then they'll be subject to the period of public comment, so I think there is a window of variability in terms of when that would start.
"It's in everyone's interest to have it start as soon as possible," he continued, "and that's what we're shooting for, because I think just like you saw with the final rule, there's intense commitment from everyone in the department to get these rules out as soon as possible, recognizing the timing concerns that commenters had expressed, and I think the same is going to be true with the test procedures and the other elements that still need to fall in place."
To me, this casts a lengthy shadow over all the other issues looming in the 474 pages issued by ONC and the other 672 pages issued by HHS. Providers are already seeing lengthy delays from vendors as they rush to implement EHR systems, according to Charles E. Christian, CIO of Good Samaritan Hospital in Vincennes, Ind., who spoke at length about his concerns at the American Hospital Leadership Summit in July.
Rewind to the date the final rules for Meaningful Use Stage 1 were released. Now ask yourself this: How long did it take to get from that final rulemaking to the availability of software certified for Meaningful Use Stage 1? I wasn't monitoring the issue back in the summer of 2010, but I am guessing it was at least three months.
Given the number of new features in this round for both Stage 1 and Stage 2 attesters, it could take as long or longer for certification of software this time.
The AHA shares my concerns. "There are many activities that need to happen before we will have certified products that can be purchased and installed by providers, hospitals, and physicians," says Chantal Worzala, AHA director of policy.
Each year, providers must confirm that their software is still in compliance with current Meaningful Use guidelines, and that's adding to the squeeze on providers to implement and vendors to facilitate those implementations.
In its comments to ONC this spring, the AHA asked that any provider attesting under the 2011 Meaningful Use guidelines be allowed to stay on that release of software until it moved to the Stage 2 set of Meaningful Use functions.
But ONC refused to introduce this level of complexity into the regulations. While I can't blame ONC for trying to keep a complex set of regulations a little bit simpler, the result is that all providers have to get updated software from their vendors at once, whether they're attesting at 2014-level Stage 1 or Stage 2.
"The notion that they're putting everybody into that upgrade cycle in the same year absolutely is problematic," Worzala says.
The question I can't get an answer for yet is this: were things just as bad for vendors when the 2011 rules were finalized? I'm sure someone reading this who's been following Meaningful Use longer than I can enlighten us in the comments below.
If that were the case, it could provide some inkling of how long the wait for certified code for the 2014 rules will be. Then again, the 2014 rules are a different beast than the 2011 rules, so as they say, your mileage may vary.
Vendors will shed light on all of this in the coming days and weeks. Along with that somber news may also come an unwanted but basic economic effect of the law of supply and demand.
The AHA's letter of comment on the proposed rule in May reported that its members see "aggressive pricing of individual certified functionalities" by software vendors. "We expect that these distortions are exacerbated by limited vendor capacity to meet accelerated demand and workforce shortages."
We may be about to see that economic impact in spades. It remains to be seen whether the short time left to produce certified running code, and to help providers meet the 2014 deadlines for attestation, makes that impact even worse.
"Patient safety is impaired by the failure to quickly fix technology when it becomes counterproductive, especially because unsolved problems engender dangerous workarounds."
Those words might have been spoken at last week's Contra Costa County Board of Supervisors meeting in Martinez, CA, where two nurses went public with concerns about the safety of their county health system's July 1 Epic EMR implementation.
In fact, those words were published in 2008 by the Joint Commission. What healthcare leaders have to ask this week is, how seriously have these problems been addressed, since they are still occurring in 2012?
Laura Easley, RN, and Lee Ann Fagen, RN, are nurses at a Contra Costa County hospital clinic at the West County Detention Facility in Richmond, CA. Here's what they had to say last week.
"We went through the process of meetings where Epic showed health services the workflow they had created," said Laura Easley, RN. "Numerous times our health team told the Epic group there were many concerns. When we went live with the system, the problems we addressed were even more obvious. Health professionals are being told by non-health professionals how we should conduct our health practice. This cannot be."
"We're unable to document our medication administration correctly," said Lee Ann Fagen, RN. Last week's testimony centered on the possible harm averted when a nurse realized that the Epic system was recommending a possibly fatal dosage of a heart medication.
The county system also operates the 164-bed Contra Costa Regional Medical Center, and although the testimony last week centered on outlying clinics at three county detention facilities, sources inside the hospital tell me that EMR-related safety concerns are high within the hospital as well.
Because the health system has a policy forbidding unauthorized contact with the media, I cannot identify them.
Communication woes
Both management and workers agree that communication at the hospital about the new EMR has been inadequate. "Epic is a tool, but it does not replace the skill and the knowledge and the experience of our staff, who are, in fact, the most powerful components of our safety system," says Anna Roth, RN, CEO of Contra Costa Health Services. "I'm thankful that our staff is diligent."
That diligence comes at a cost, however. I'm told that some documentation that used to take two hours has been known to take four hours now. That translates into either less time spent at the bedside or a labor force that is more overworked than before the EMR went live.
Either way, it's bad news.
"The overall safety and effectiveness of technology in health care ultimately depends on its human users, ideally working in close concert with properly designed and installed electronic systems," the Joint Commission wrote in Sentinel Event Alert Number 42, dated December 2008.
Later on there's this: "If not carefully planned and integrated into workflow processes, new technology systems can create new work, complicate workflow, or slow the speed at which clinicians carry out clinical documentation and ordering processes."
So no one can say there hasn't been sufficient, serious warning of the risks of rushing EMRs into use. But with stimulus money for EMR deployment available for a limited time only, the rush to grab that money is clearly at odds with careful planning. This is true to a greater extent now than when the Joint Commission issued its warning.
The Joint Commission report also speaks of the impairment of patient safety by the failure to quickly fix technology. "Unsolved problems engender dangerous workarounds," the 2008 report warns. "Systems not properly integrated are prone to data fragmentation because new data must be entered into more than one system."
System integration issues
At Contra Costa Regional Medical Center, my sources tell me two such major systems are not properly integrated—not an unusual situation for hospitals. The hospital's lab systems are still running software from Meditech, even as the hospital-wide EMR is now Epic's.
If an alarming lab result shows up in Meditech, it will not appear in the patient's Epic-based medical record without extra integration work by vendors and IT staff; until then, lab results must be communicated by some other means.
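For a flavor of what that integration work involves, here is a minimal sketch that scans HL7 v2 OBX result segments (the pipe-delimited format lab systems commonly emit) and pulls out abnormally flagged values for forwarding. The message below is invented, and a real interface engine handles far more: acknowledgments, encoding rules, repeating fields, and error queues.

```python
# A made-up HL7 v2 lab result message; segments are separated by carriage
# returns and fields by pipes.
HL7_MESSAGE = "\r".join([
    "MSH|^~\\&|LAB|GENERAL|EMR|WARD|202012011200||ORU^R01|00001|P|2.3",
    "PID|1||123456||DOE^JANE",
    "OBX|1|NM|2823-3^Potassium^LN||6.2|mmol/L|3.5-5.1|H",
    "OBX|2|NM|2951-2^Sodium^LN||140|mmol/L|136-145|N",
])

def abnormal_results(message):
    """Return (test name, value, units, flag) for every flagged OBX segment."""
    hits = []
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        flag = fields[8] if len(fields) > 8 else ""
        if flag and flag != "N":            # OBX-8: abnormal flags ('N' = normal)
            name = fields[3].split("^")[1]  # OBX-3: coded observation name
            hits.append((name, fields[5], fields[6], flag))  # OBX-5 value, OBX-6 units
    return hits

print(abnormal_results(HL7_MESSAGE))  # → [('Potassium', '6.2', 'mmol/L', 'H')]
```

Wiring that extraction into the receiving EMR, reliably and for every result type, is the "extra integration work" at issue.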
David Runt, CIO of Contra Costa Health Services, acknowledges that the lab runs Meditech while the EMR is Epic's. "We've had some issues with the lab requests flowing over to the laboratory, but those are addressed and fixed as they are raised," Runt says.
The laboratory module is the newest of the Epic suite of products, and has "not a large installed base," Runt says. "We're formerly a Meditech shop, so those Meditech shops that were converting to Epic, many of them have made the decision to stay on the Meditech lab or stay on whatever lab system they were on at the time, so we're not unique in that respect."
Roth says no patients have been seriously adversely affected by anything that can be attributed to the transition to Epic. But my sources say that Epic was concerned enough last week to fly support staff to Contra Costa. I do not, however, have confirmation of this from either Epic or Contra Costa Health Services.
Ulterior union motives?
Because a union raised the red flag in this case, it's reasonable to consider the recent history of northern California nurses' unions. During a nurse lockout at nearby Alta Bates Hospital last September, a patient died while under the care of a temporary nurse hired by the facility during the lockout. Charges and counter-charges flew between labor and management.
It's tempting and perhaps cynical to question the motives of unions that raise safety issues during tense labor negotiations. But since a two-year contract was just signed in June between the California Nurses Association and the Contra Costa system, this newest concern doesn't appear to be a negotiation ploy.
DeAnn McEwen, vice president of National Nurses United, of which CNA is a founding member, argues that management is moving toward an over-reliance on technology at the cost of essential caregivers. "We're not Luddites, we're not saying that we don't see the value of some of it, but when it interferes with or overrides our practice, it puts patients at risk of harm."
McEwen, who still works two 12-hour nursing shifts per week at a tertiary care medical center in Los Angeles County, talks at length about healthcare labor issues aggravated by technology. At her own hospital, she says, flaws in the Epic implementation have required workarounds for five years, with no end in sight.
"Technology should be assistive to the direct care providers and enhance therapeutic communication," McEwen says. "Cash register-styled electronic documentation systems may serve insurers' and billing department purposes, but overall, they have not been shown to improve patient outcomes or enhance professional communication."
McEwen emphasizes that the problem is bigger than Epic, and so it is. A lot of people like to take and give credit for the rapid automation of healthcare in the U.S. today. Vendors are feted in financial publications. We properly credit the vision of government, provider, and payer leaders in pushing hospital technologies forward.
But in the end, the reason the entire proposition isn't completely running off the rails at this point is the effort of a whole lot of undertrained, overworked line staff and a whole lot of workarounds.
Let's hope that the efforts now underway at the Office of the National Coordinator for Health Information Technology to address usability of EMRs lead to some major reforms soon. Meanwhile, the Joint Commission's words of warning continue to echo.
A multimillion-dollar "go-live" implementation of the EpicCare EMR from Epic Systems Corp. came under intense scrutiny Tuesday when two nurses approached the governing body of a California hospital with patient safety concerns.
Those concerns stem from an incident at a Contra Costa County hospital clinic at the West County Detention Facility in Richmond, CA, where one nurse says the Epic system's recommended dosage of a heart medication "could have killed the patient."
"We're unable to document our medication administration correctly," said an emotional Lee Ann Fagan, speaking to the Contra Costa County Board of Supervisors in Martinez, CA.
A nurse familiar with the patient's medical history was able to override the system and adjust the amount of medication.
The story immediately spread via local media in the San Francisco Bay area, and highlighted concerns throughout the county hospital system about the Epic implementation.
"The Epic system decision support technology interferes with the RN's duty and right to advocate in exclusive interest of their patients," said Jerry Fillingim, a labor representative of the California Nurses Association.
Since the July 1 go-live date at two jail-based clinics—the West County Facility and the Martinez Detention Facility—and one based at the John A. Davis Juvenile Hall in Martinez, nurses have filed 142 notices of Assignment Despite Objection, a form used by CNA.
CNA requires that members notify supervisors and seek a solution before filing the form. Many of the forms' complaints speak of improper or incomplete training.
In an interview Tuesday night with HealthLeaders, Contra Costa Health Services executives responded that fixes had been made to the system since July 1, but communication of those fixes to line staff has been inadequate.
"We need to look at the process that we're engaged in now, and we need to improve that," says Anna Roth, RN, CEO of Contra Costa Health Services, whose 164-bed county hospital as well as ambulatory services and clinics at detention facilities switched to Epic on July 1.
"The introduction and implementation of an EHR in and of itself is a huge change in workflow, but I do know we put a great deal of effort into working with front line staff designing the workflow," Roth says.
The question isn't should you use cloud computing. The question is how.
First tip: Don't go all in without some sort of disaster recovery plan. Disasters do happen in the cloud. Earlier this month, dozens of hospitals temporarily lost access to patient records due to a cloud outage.
They should have asked tougher questions of their cloud service providers earlier. In this case, the provider was Cerner Corporation, which attributed the outage to human error. The outage affected Adventist Health, which reverted to paper-based records during the five-hour interruption in service.
Having just written a story about the cloud for HealthLeaders magazine, I was startled by just how many hospitals appear to have entrusted their EHRs to someone else's data center. Dell Healthcare recently told me it hosts more than 500 hospitals' EHRs in its cloud. Many Cerner customers run their own data centers, but an increasing number do not, leaving the hosting to Cerner.
As federal and state dollars continue to flow to automate healthcare information systems, expect an accompanying stream of federal and state oversight and investigation into outages such as these. There have been no reports of serious incidents caused by the recent outage, but the risk of adverse events triggered by the unavailability of electronic records will only grow.
Don't expect vague explanations such as "human error" to mollify regulators or the public. Cerner should have gotten out in front of the coverage of this month's outage by posting a detailed explanation through its blog or a press release. It did neither.
One reason CIOs cling to their own data centers is that they gain a totally transparent view into their IT infrastructure. All the way back to the dawn of personal computing, a great benefit of data centers has been a view under the hood.
The cloud changes all that. Service providers sometimes let users in on all the inner workings of their operations, but often they do not. Diligent CIOs extract as much knowledge as they can during the evaluation process, but cloud-based services are often so complex as to overwhelm even the best-trained CIOs. There's a booming business in consultancies that can help make sense of cloud offerings, so CIOs can anticipate and head off trouble.
Clearly, some organizations are making great use of the cloud, and are finding ways to fund and grow adoption. It's been suggested that hospitals get together and share as many resources as they can without depending entirely on vendors, and one entire state has done just that.
The Colorado Hospital Association expects to achieve statewide medical image sharing in the first quarter of 2013, with all those images being served up by a private cloud.
Creative financing for the Colorado Telehealth Network came in the form of a grant from the Federal Communications Commission, which also allowed the association to equip 205 facilities throughout the state with sufficient bandwidth to serve up those images when they start flowing over the network.
The set-up also potentially obviates the need for costly health information exchanges to move those images about, says Colorado Hospital Association president and CEO Steven Summer.
(Health information exchanges are different from health insurance exchanges, even though sloppy marketers and journalists call them both HIEs. For the record, the correct abbreviation of health insurance exchange is HIX.)
HIEs and cloud computing seem to be a natural fit. Private HIEs appear to be taking off faster than public ones, but don't you think set-ups like Colorado's are more cost-efficient and resource-savvy—not to mention less chaotic—than a plethora of private HIEs all over the state would be? Yet Colorado is the only state that has gone this route.
I understand the continuing tension over private versus public in healthcare. Private systems often innovate more rapidly. Cloud technology will proliferate in private networks as well as public. But accountability will be radically different. If the Colorado Telehealth Network has an outage, there will be immediate explanations, and they won't leave us wondering what really happened.
Which is not to say that private technology companies won't build the public-financed clouds. In the case of Colorado, Summer says the organization started by evaluating the offerings of five companies: GNAX Health, Merge Healthcare, TeraMedica, Iron Mountain, and Dell Healthcare. So far, they've narrowed the selection down to GNAX Health and Merge Healthcare. There will be only one winner, which hasn't been announced yet.
One other very interesting aspect of the Colorado cloud is that Summer describes it as a vendor-neutral archive, or VNA. "It doesn't matter what [EMR] system a hospital may use," he says. "The VNA will be neutral in terms of drawing down the images and using [them]. So we've eliminated any of what often become proprietary impediments."
The key to making VNA happen is getting all the participants at the table. That means all the public, private, urban, rural, and academic hospitals; the physician practices; and payers as well.
Meanwhile, the state provides bandwidth to all at a fraction of the cost it might have been, had this service been provided through private bandwidth service providers. That might be anathema to a provider that wants to sell lots of extra services on top of bandwidth, but it also helps preserve the neutrality of the shared cloud infrastructure.
As healthcare leaders sort through the opportunities and pitfalls of the cloud, it will be more important than ever for them to share their experiences. The dialogue is just beginning.
This article appears in the August 2012 issue of HealthLeaders magazine.
The debate continues to rage: Are meaningful use requirements too specific or too vague? On target or wide of the mark? It still depends on who you ask.
"If these guidelines remain this rigid, this inflexible, this one-size-fits-all, there may well be a number of physicians who try in good faith and fail," says Steven J. Stack, MD, chair of the board of trustees of the American Medical Association and an emergency physician with Lexington, Ky.–based Saint Joseph Health System.
"It actually ends up creating a lot of unnecessary overhead to offer options," counters John D. Halamka, MD, MS, the CIO of the Beth Israel Deaconess Medical Center in Boston.
During the meaningful use rollout in 2011, allowable options in the specification diluted the interoperability between systems. The rules governing Stage 2, also now known as the 2014 edition of meaningful use, specify one way to implement content, domain vocabularies, and transport, Halamka says. "Every time you offer options, it's actually more work and less interoperability."
Attesting for the meaningful use 2014 edition requires providers to interoperate, and with health information exchanges still in their infancy, providers may be hard-pressed to meet those interoperability requirements.
"We recognize that to really get meaningful meaningful use takes time," says Farzad Mostashari, MD, ScM, national coordinator for health information technology within the office of the U.S. Department of Health and Human Services secretary.
Previously, Mostashari said that meaningful use would be successful this year if CMS paid 100,000 providers in 2012 for attesting compliance with the 2011 meaningful use guidelines. "Now it looks like we're going to smash that 100,000 mark," he says. In a June 19 press release, HHS reported that more than 110,000 eligible professionals and over 2,400 eligible hospitals have been paid by the Medicare and Medicaid EHR Incentive Programs.
Some attestations have been more easily won, as institutions that have built their EMRs for years can fine-tune them to meaningful use requirements. Others are dramatic come-from-behind affairs occurring in recent months.
One example is the 722-licensed-bed University of Mississippi Medical Center in Jackson, Miss., which on June 1, went live with an implementation of the EpicCare EMR from Verona, Wis.–based Epic. In true 21st-century fashion, UMMC tweeted and issued posts to a public Facebook page before, during, and after go-live. Moving to meaningful use engaged the entire institution, says John Showalter, MD, MSIS, the CMIO at UMMC. "We have more than 900 people who signed up to be super-users on our training," Showalter says. "All those people had to be released from their regular clinical duties or administrative duties or lab duties to come and take extra training, and at go-live they were pulled out of their regular jobs just to be support staff."
Institutions take a financial and operational hit during the transition to meaningful use. UMMC has a $90 million implementation budget spread over five years, Showalter says. Because it is a state institution, the money came from a state-backed bond. "We whittled away a considerable portion of our cash while we were waiting for the bond to get through and get the money back from the bond. The bond has gone through and the money is coming in."
Other numbers tell their own story. In the past two and a half years, UMMC deployed more than 7,000 devices, counting barcode scanners and printers. In the end, UMMC brought up 15 applications in five hospitals and more than 20 clinic locations in a single day.
Staff had to pass a proficiency test or did not get access to the system. IT staff were able to learn valuable lessons by attending a similar go-live previously at Ochsner Medical Center in New Orleans, La., Showalter says.
Because approximately 35% of the UMMC patient population is eligible for Medicaid, the health system has already attested in 2012 and will receive more than $9 million in meaningful use reimbursement through the Medicaid Adopt program, Showalter says.
Those who attested in 2011 are breathing a sigh of relief over CMS' decision to allow them an extra year to achieve the Stage 2 thresholds.
"Common sense told us they were going to have to relax and push it out a year, so that was good news and I'm glad they did," says Ed Ricks, vice president of information systems and CIO at Beaufort (S.C.) Memorial Hospital. "We would have had to have been at Stage 2 by October 1 of this year, and now it's October 1, 2013. We can do that."
The American Hospital Association and the AMA came to at least one agreement on meaningful use. Commenting on the Notice of Proposed Rulemaking for Stage 2, they say the spirit of the law shouldn't allow CMS to peek back at 2012 data in order to get a full year's worth of data in each meaningful use attestation.
"It deprives providers of desperately needed time to comply," says the AMA's Stack.
As of this writing, it was unclear whether appeals for a 90-day recordkeeping period in 2013 would be substituted in the final rule for the 2014 edition of meaningful use.
The 197-licensed-bed Beaufort Memorial has done meaningful use in phases, Ricks notes. Three-quarters of its physicians are independent, and of those, not all are yet using computerized physician order entry. As of late May, Beaufort was about to bring its OB-GYN and pediatricians live on CPOE. So certain rollouts vary by specialty.
"The biggest challenge may be the quality measures and collecting the discrete data from areas that historically were not pieces of discrete data, that we abstracted out," Ricks says. "Luckily we were migrating to a new version of Meditech software over the past couple of years, and so we tried to build in the collection of that data in the work flows up front for the clinicians so that we didn't have to do any kind of double work as we went on."
Two requirements in the meaningful use 2014 edition loom large and pose bigger challenges for all providers: information exchange and patient engagement.
Mostashari says the 2014 edition moves from tests of data exchange to actual exchange of information across organizational and vendor boundaries. "Patients are certainly quite aware that their care is not as coordinated as it could be," he says. "They certainly know there are tests repeated unnecessarily."
Outweighing even the cost of repeated tests is the patient safety danger from those tests—for example, excessive ionizing radiation, Mostashari adds.
The 2014 edition as proposed requires discharges to be accompanied by a care summary for 65% of transitions of care or referrals, and for that to be done electronically 10% of the time across organizational and vendor boundaries.
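Those two percentages are easy to check arithmetically. A sketch with made-up numbers, reading both thresholds against the same denominator of total transitions, shows how a hospital could clear the 65% summary bar and still miss the 10% electronic bar.

```python
# Hypothetical numbers for one reporting period; the two floors are the
# proposed 2014-edition thresholds described above.
def meets_transition_measures(transitions, with_summary, sent_electronically,
                              summary_floor=0.65, electronic_floor=0.10):
    """True only if both care-summary thresholds are met."""
    return (with_summary / transitions >= summary_floor
            and sent_electronically / transitions >= electronic_floor)

# 1,000 transitions of care: 700 summaries (70%, passes) but only
# 80 sent electronically across boundaries (8%, fails).
print(meets_transition_measures(1000, 700, 80))  # → False
```

The electronic-exchange leg, not the summary itself, is where the proposed rule would pinch providers whose exchange partners aren't ready.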
Not all physicians agree that the data exchange requirements in the 2014 edition are a win-win.
"This is a great example of where the system potentially is going to shift the work toward the doctor," says Lyle Berkowitz, MD, medical director of IT and innovation at Northwestern Memorial Physicians Group, a Chicago-based multisite practice of 100-plus primary care physicians who are on the medical staff at Northwestern Memorial Hospital and faculty members of Northwestern University's Feinberg School of Medicine. "Meaningful use isn't saying exactly how it's going to be done from a work flow perspective. We actually need to train doctors how to run a team. It's not a skill that's traditionally taught."
As for patient engagement, many providers remain unsure how their organizations will achieve this goal of the 2014 edition.
"I don't know how a healthcare provider can be responsible for making sure that a patient that receives information is going to look at it and use it in a meaningful way," says Jackie Lucas, FACHE, vice president and CIO of Baptist Healthcare System, a seven-hospital system with approximately 2,000 licensed beds headquartered in Louisville, Ky.
In discussing these meaningful use requirements, one of her staff suggested that a provider could offer a nonmonetary incentive to patients to access their information electronically. "We've got to be very careful," Lucas says. "I suggested anything a provider did would have to meet HIPAA regulations.
"That's a lot to ask the provider, not only to ensure that the information is available to the patient electronically, but also to require the provider to meet a targeted percentage of patients who have actually accessed their information electronically," she adds.
On the health information exchange front, both NMPG and Baptist deployed McKesson's RelayHealth technology to provide a secure patient portal. This eliminated the need for time-consuming, labor-intensive faxes and phone calls to affiliated physician practices, Lucas says.
"You improve quality, and it's also more secure," she says. "You don't have a fax lying around on a fax machine, curled up on the floor, or somebody punches the wrong number accidentally and sends that fax to the wrong location."
Other providers are still sorting out exchange strategies. "South Carolina has a statewide initiative, and we are participating with that," says Beaufort's Ricks. "But we're looking at building our own sort of health information exchange for the community, so a clinician can see a current picture of what's going on with that person across the community." For now, fax machine referrals will continue to some degree, he adds.
Ricks is determined to move ahead on patient engagement not just because of meaningful use regulations, but also because "we just know that somehow over the next five or seven years, there's going to be an evolution of the way we're reimbursed. It has to happen.
"We're looking at some solutions, very new in the marketplace, that are an adjunct to a patient portal. The intent is to engage patients with real-time monitoring of things," Ricks says. "I hate to use the word gamification, but that's the word I keep hearing, a social media aspect to some degree."
For example, to cut down on 30-day readmissions, discharged congestive heart failure patients might be sent home with a Bluetooth-enabled scale, Ricks says. The system would alert staff if measurements stopped or if the patient gained weight for two or three days in a row.
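The alerting logic Ricks describes is simple enough to sketch. The snippet below is a hypothetical illustration, not any vendor's actual implementation; the function name, thresholds, and data shape are all assumptions, and real thresholds would be set by the care team.

```python
from datetime import date, timedelta

# Illustrative thresholds only -- not clinical guidance. A real deployment
# would tune these with clinicians.
CONSECUTIVE_GAINS = 3   # day-over-day weight increases that trigger an alert
MAX_GAP_DAYS = 2        # days with no reading before staff are notified

def check_readings(readings, today):
    """readings: list of (date, weight_lbs) pairs, oldest first.
    Returns a reason string if staff should be alerted, else None."""
    if not readings:
        return "no readings received"
    last_date, _ = readings[-1]
    # Alert if the patient has stopped weighing in.
    if (today - last_date).days > MAX_GAP_DAYS:
        return "measurements stopped"
    # Count consecutive day-over-day increases at the end of the series.
    gains = 0
    for (_, w1), (_, w2) in zip(readings, readings[1:]):
        gains = gains + 1 if w2 > w1 else 0
    if gains >= CONSECUTIVE_GAINS:
        return "sustained weight gain"
    return None
```

For instance, four daily readings rising from 180 to 186 lbs would return "sustained weight gain," a pattern associated with fluid retention in heart failure patients, while a week-old last reading would return "measurements stopped."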
"Meaningful use is less about a technology implementation and more about policy and work flow implementation," says Beth Israel Deaconess' Halamka. "Think of the medical record as Wikipedia for each patient, where it's the collective editing of the entire institution that results in a record that's complete enough."
Once the options Halamka alluded to earlier are eliminated, the cost of interfaces between the remaining disparate systems will drop from thousands of dollars per interface to hundreds of dollars. "There's still customization, but it's minor."
Although everyone speaks of meaningful use as a platform upon which to build tomorrow's innovations in medicine, there are complaints today about the lack of ease of use of EMRs.
"The cost to us in lost efficiency by implementing a health information system is somewhere between $300,000 and $750,000 in the first year," says Prentice Tom, MD, chief medical officer of CEP America, an Emeryville, Calif.–based provider of hospitalist services that sees 4.5 million patients annually.
To make up for the inefficiency, CEP hires scribes who cost about $27,000 per physician annually.
EMRs would be adopted more rapidly, and wouldn't require as many government incentives to acquire, if they included algorithms that surface the useful information in the record, Tom says.
"It's a fair indictment," Halamka says. "If it takes 47 clicks to write an e-prescription, as opposed to one click to order a CD on Amazon, something's wrong." Future meaningful use rules will require "some measure of usability" to address this, he says.
"The conversation has really got to be more about value for the money we are spending," Mostashari says. "Electronic health records are a necessary, though probably not sufficient part of that equation."
Reprint HLR0812-5