
Electronic Health Records: Promises and Realities

Part III: Information Privacy and Accuracy: Zero and GIGO Won't Do
      In debates over information technology (IT) and individual privacy, the devil's advocate has a specific identity: it's Scott McNealy, cofounder and former chief executive officer of Sun Microsystems. Speaking to reporters from Wired and other publications in 1999, McNealy dismissed privacy concerns raised over Sun's Jini software as a “red herring” and claimed, “You have zero privacy anyway. Get over it.”
      (Sprenger P. Sun on privacy: “Get over it.” Wired, January 26, 1999.)
      Few statements in the IT era have provoked as much opposition. If the expansion of electronic health records (EHRs) under the 2009 Recovery Act's Health Information Technology for Economic and Clinical Health section turns McNealy's position from a cynical extreme to a systemwide norm, patient privacy advocates predict a backlash that will entangle IT's potential clinical benefits within a thicket of perverse incentives.
      On July 13, the Department of Health and Human Services (DHHS) Centers for Medicare & Medicaid Services announced regulations defining “meaningful use” as required for Health Information Technology for Economic and Clinical Health incentive payments.
      (Centers for Medicare & Medicaid Services Office of Public Affairs. Secretary Sebelius announces final rules to support “meaningful use” of electronic health records. Press release, July 13, 2010.)
      The conditions include specific core objectives, along with a menu of additional objectives from which providers must implement at least 5 during the first 2 years. Privacy protection and timely patient access to records are included in the core set, as outlined by National Coordinator for Health IT David Blumenthal, MD, MPP.
      (Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records.)
      The Office of the National Coordinator works with Centers for Medicare & Medicaid Services to develop standards and procedures for implementation and certification.
      With both IT and federal regulations, however, the devil is inevitably in the details of implementation. The core measure for the patient access objective, for example (“More than 50% of requesting patients receive electronic copy within 3 days”), is relatively undemanding and gives no guarantee to an individual patient. And a line buried in the comments section of the meaningful-use statement (McGraw D. HHS releases rules for electronic health records. Center for Democracy and Technology, July 14, 2010.) left privacy advocates with the impression that Centers for Medicare & Medicaid Services was missing an opportunity to go beyond the Health Insurance Portability and Accountability Act (HIPAA) and modernize the privacy rules: “We do not see meaningful use as an appropriate regulatory tool to impose different, additional, and/or inconsistent privacy and security policy requirements from those policies already required by HIPAA.” (Department of Health and Human Services, Centers for Medicare & Medicaid Services. 42 CFR Parts 412, 413, 422, and 495: Medicare and Medicaid Programs; Electronic Health Record Incentive Program.)

      The Murky Tides of Data and Cash

      Among the acronymic terms permeating the IT world, “garbage in, garbage out” (GIGO) expresses the principle that a processing system requires purposefully structured input data, subjected to scrupulous quality control, to produce useful results. A more recent variant of GIGO, “garbage in, gospel out,” expresses the tendency of technophiles in any position to place inordinate trust in computer-generated output, sometimes for no better reason than its orderly, authoritative appearance. A further permutation foreseeable in the EHR era might invert that formula as “gospel in, garbage out”: information beneficial to patients and clinicians, so long as it is well organized and contained, may become a kind of toxin harming a patient irreversibly if it is contaminated, scrambled, or directed into the wrong hands.
      All 3 iterations of GIGO threaten to derail the EHR bandwagon. For health IT to fulfill its potential as an aid to clinical practice, patients and physicians need an information flow they can trust: from initial input to all potential outputs, data must be accurate and must end up in the right places. In an EHR-intensive atmosphere, emergency physicians, who may treat a patient only once, will be unusually dependent on the quality of data entered into records by others. Although the time dedicated to record creation as a distinct operation (see parts I and II of this series) may be minimal in emergency department settings, emergency physicians logically have an incentive to be strong advocates for data integrity safeguards.
      The most appropriate party to control the quality and security of protected health information (PHI), many commentators believe, is the individual patient. If, as another familiar cyberspace slogan has it, “information wants to be free”—ie, tends to get loose—the person most directly affected by its movements has the strongest incentive to oversee it. A few patients have exerted extraordinary effort to manage, protect, and upgrade their own records. “To get good-quality data into a system, you need first of all good rigorous controls on how the data gets in there,” says Dave deBronkart, Jr., whose struggles with both kidney cancer and an error-riddled EHR have made him a nationally recognized spokesman for patients' rights and responsibilities. “Those controls absolutely do not exist in health care.”
      Patients such as deBronkart have drawn attention to the problem, but to date they are exceptions, swimming against a powerful tide of institutional incentives. The traffic in personal data for targeted marketing, not limited to the medical realm (Angwin J. The Web's new gold mine: your secrets.), is lucrative; one estimate predicts that the clinical component of the data-mining industry will reach $5 billion by 2020 (Singer N. When 2+2 equals a privacy question.). The appearance of EHR vendor contracts specifying ownership, exclusive access, and sales rights over medical data (Zetter K. Medical records: stored in the cloud, sold on the open market.) implies that some organizations view this information as a profit generator and relegate privacy concerns, with McNealyesque fatalism, to the dustbin of history. One source of irony in that scenario is that advanced technologies and practices capable of protecting PHI—not with absolute certainty, but well enough to establish a trustworthy balance in which EHRs' benefits decisively outweigh their risks—already exist.

      Paper Records Aren't Airtight Either

      Security breaches involving PHI, of course, acquired a high profile long before health IT raised the stakes. The 1972 presidential campaign is a case in point: after a leak of Senator Thomas Eagleton's mental health records, amid the strong stigma then associated with psychiatric treatment, Democratic candidate Senator George McGovern dropped him as a running mate after 18 days. Ginger McCall, staff counsel with the Electronic Privacy Information Center in Washington, DC, points to the recurrent problem of corruptible hospital employees selling celebrities' medical records to tabloids, the Farrah Fawcett case (Ornstein C. Fawcett's cancer file breached.) being a well-reported recent example. Such temptations are independent of physical formats.
      Personal privacy in general, one should also note, is a chronically contested legal area. Privacy protection laws covering library information (generally state statutes) have been on the books since the McCarthy era, McCall observes, when Communist-hunting prosecutors used book withdrawal records to imply that certain citizens were ideologically suspect. Some electronic-age privacy laws extend that principle to the federal level, but in unusual, perhaps ungeneralizable factual contexts. “The Video Privacy Protection Act was actually passed in reaction to the disclosure of Supreme Court nominees' video rental records,” McCall says (Wrongful disclosure of video tape rental or sale records. The Video Privacy Protection Act of 1988, 18 US Code § 2710 [2002].). (The specific nominee in question, Robert Bork, had claimed (Bork R. The Tempting of America: The Political Seduction of the Law.) that the Constitution gives no privacy guarantees except for specific rights conferred by legislation, rejecting the interpretation of “penumbral” privacy support that Justice William O. Douglas found throughout multiple Bill of Rights amendments in Griswold v Connecticut, 381 U.S. 479 (1965). Publishing Bork's video records in the Washington City Paper in 1987, reporter Michael Dolan claims, was a semischolarly prank turning Bork's own opinion of privacy against him.)
      Rather than an either/or, pure-or-porous model, it may be more realistic to view PHI privacy as a matter of degrees, contingencies, and tradeoffs. The same patient who prefers to conceal references to past treatment of a sexually transmitted disease during routine office visits may also expect a first responder or emergency physician to have access to a full medication history in a time-sensitive situation in which drug reactions or interactions could amplify hazards. “It's a matter of risk,” says Robert M. Kolodner, MD, Dr. Blumenthal's predecessor as National Coordinator. “I need to understand what the benefit is to me of sharing the information; what the risk is [to] my health if I don't; and … that there is a nonzero risk of it being discovered by somebody, if I'm really worried about that. Many people aren't worried about that, and it comes down to personal choice at this point in time.”
      As a psychiatrist in the Veterans Administration system since 1978 and a participant in the development of both Veterans Health Information Systems and Technology Architecture and the online personal health record (PHR) system known as My HealtheVet, Dr. Kolodner is now working with the nonprofit group Open Health Tools to develop open-source, cross-platform interfaces and infrastructural elements that different IT vendors can use to “share those costs, those pain points” and streamline operations. Fine-tuning access permission is a critical challenge; too many current EHR systems, he finds, have “an all-or-nothing kind of release that [makes it] hard to slice and dice certain content out of records.”
      Since financial information has been transmitted electronically for years, some EHR advocates see that field as a case in which technology eventually earned popular trust. “We depend on electronic devices for our entire defense establishment, for almost everything you do, our utilities grids, our whole banking system,” notes David Mechanic, PhD, of the Institute for Health, Health Care Policy, and Aging Research at Rutgers University. “The notion somehow that we can't provide reasonable protections for EHRs, I think, is one I just don't buy into.”
      “Each of us made our decision when we were ready to have our information flow over the Internet,” says Dr. Kolodner. “Financial information, credit card, or others. And there are still some people who don't put any credit cards on it, and there's some other people who really have no particular concern; they figure they can contain the cost. In health care, it's going to be similar, because there's a cost to not having your information available.” He looks to EHR advances as ways to put more choice and control in patients' hands: “The question is, how do we help the industry to mature to the next stage, where it's really a user-driven industry and not a vendor-driven industry?”
      Dr. Kolodner notes that paper records are in certain respects less secure than EHRs, at least with respect to unauthorized access to a single record. “I'm not saying it's still this way, but in most hospitals if you put on a white coat, and you act authoritative enough, you can often just walk into a nursing station, or into a record room, and get a record. The other is that you can pay somebody on the staff to get that record, and there's no trace as to how that got released.” The auditing functions of EHRs can track who has obtained access to a record (or tried to), decreasing the likelihood that insiders will agree to participate in such a scheme.
      Auditability is one of several strategies that EHR developers are using to improve privacy performance. Metadata tagging allows sorting of information into more granular categories; combined with role-based access and content retrieval filtering, this segmentation can let patients designate which providers can view sensitive PHI (genetic, psychiatric, or gynecologic history; information related to sexually transmitted diseases or substance abuse; or conditions the patient believes may affect employment or insurability). For undesignated providers, this information is suppressed—what the psychiatrist and the emergency physician may know, the podiatrist need not—but categories of broad clinical importance such as allergies remain accessible. Anonymization strips personal identifiers out of records for research or biosurveillance applications. In emergency care, a “break-the-glass” function creates an audit trail specifying reasons and conditions, as well as personnel involved, notifying institutional privacy officers about each override request.
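The segmentation, role-based access, and break-the-glass ideas described above can be sketched in a few lines of code. This is a minimal hypothetical illustration, not any vendor's actual system: the categories, roles, and record entries are invented for the example.

```python
# Hypothetical sketch of EHR segmentation: entries carry category metadata,
# a role-based filter decides what each provider sees, and a "break-the-glass"
# emergency override grants full access but always leaves an audit trail.
from dataclasses import dataclass

@dataclass
class Entry:
    text: str
    category: str  # eg "allergy", "medication", "psychiatric", "podiatric"

# Which categories each role may view; allergies stay broadly visible.
ROLE_VIEWS = {
    "psychiatrist":        {"psychiatric", "medication", "allergy"},
    "emergency_physician": {"medication", "allergy", "psychiatric"},
    "podiatrist":          {"podiatric", "allergy"},
}

audit_log = []

def visible_entries(record, role, break_glass=False, reason=None):
    """Return the entries a role may view; overrides are recorded for auditing."""
    if break_glass:
        audit_log.append({"role": role, "action": "break-the-glass",
                          "reason": reason})
        return list(record)  # full access, but traceable
    allowed = ROLE_VIEWS[role]
    return [e for e in record if e.category in allowed]

record = [
    Entry("penicillin allergy", "allergy"),
    Entry("sertraline 50 mg", "medication"),
    Entry("major depressive disorder", "psychiatric"),
    Entry("plantar fasciitis", "podiatric"),
]

# The podiatrist sees only podiatric entries plus allergies...
print([e.text for e in visible_entries(record, "podiatrist")])
# ...while an emergency override exposes everything and logs the event.
print(len(visible_entries(record, "emergency_physician",
                          break_glass=True, reason="unconscious patient")))
print(audit_log)
```

A production system would of course persist the audit log, notify privacy officers of each override, and let patients rather than a hard-coded table define the category-to-role mapping; the sketch only shows why granular tagging makes "slice and dice" release possible at all.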
      This past summer, at meetings of the DHHS Privacy and Security Tiger Team (materials from recent meetings are archived by DHHS), a work group organized by the Office of the National Coordinator and representing industry, hospitals, academia, patient organizations, and others concerned with the privacy implications of system design, the technologic feasibility of privacy protection was on public display in fine-grained detail. Firms such as e-MDs, Private Access, Health Information Protection and Associated Technologies, and Tolven Healthcare Innovations demonstrated various consent-management systems designed to interface with health information exchanges in the Nationwide Health Information Network, under development with Office of the National Coordinator guidance since 2008. In some of these systems, rules and metadata for authentication and access control attach to EHRs and PHRs much as digital rights management code piggybacks on audio and video files. Pilot projects in the Netherlands and Singapore are troubleshooting such systems before testing occurs on larger scales. Speakers recurrently noted that demand for these PHI-protective advances is not yet driving high-volume adoption, but that realistic technologies exist to serve the US market when it matures.

      Recourse Once the Horses Escape the Barn

      A layperson observing the Tiger Team sessions could plausibly infer that PHI protection is a readily soluble problem, like the high-level encryption that (usually) guards financial data. Not all types of information, however, behave identically within networked environments. Drawing inferences from other categories to PHI is difficult because PHI breaches can involve unique issues of scalability, irreversibility, and personal consequences that are immeasurable in a literal sense. “If your record with your credit card number gets out,” Electronic Privacy Information Center's McCall says, “you can get another credit card and cancel that one, but if a record with a list of your medications gets out, that's pretty personal information, and you can't get it back.” Calculating financial damages from lost privacy is difficult enough, she notes, that the Video Privacy Protection Act establishes a statutory damage scheme, an approach she believes should also extend to sanctions for unauthorized PHI disclosure. Writing statutory damages into protective laws “sets up a scheme for people to enforce their own rights,” she adds. “One of the things to think about with medical privacy and genetic privacy … is that whenever you have a federal law, it should be a floor and not a ceiling. The federal government, if it creates a law to protect consumers, should also allow the states to create even higher protections for consumers.”
      Settlements in large class action suits, McCall comments, are an inadequate remedy for privacy breaches (eg, the suit in US District Court for the Northern District of California, San Jose Division, over privacy breaches by the Facebook Beacon feature, Lane et al v Facebook, Inc et al; McCall's organization, EPIC, is among those filing complaints about both the feature, now defunct, and the settlement). One recent case involving Facebook and Blockbuster Video, she says, is “a perfect example of why these rights sometimes can't work out, because there you have a couple of plaintiffs' attorneys who have proposed this settlement … where Facebook essentially sets up a $9 million foundation that Facebook helps to run; the attorneys get a lot of money, the named plaintiffs get a little bit of money, and everyone else in the class gets nothing.”
      Some information scientists also claim that anonymization cannot be foolproof. Working with nonmedical data sets from social networks (Narayanan A, Shmatikov V. De-anonymizing social networks.) and film ratings from the Netflix Prize competition for improving recommendation algorithms (Narayanan A, Shmatikov V. Robust de-anonymization of large sparse datasets.), Arvind Narayanan and Vitaly Shmatikov at the University of Texas have developed systems capable of reidentifying individuals by matching patterns in the anonymized data with publicly posted information, including personal identifiers. According to these authors, such information, though relatively innocuous or trivial in itself, can expose users of the systems to “targeted de-anonymization” (stalking), abusive marketing, phishing, spamming, and surveillance. A fortiori, the implications for the security of PHI, which is distinctly more valuable than the data used in the Texas studies, are alarming.
      Deborah C. Peel, MD, a psychiatrist who founded the national organization Patient Privacy Rights in 2004, looks askance at current federal PHI protection measures (Peel DC. Your medical records aren't secure.).
      Dr. Peel is no Luddite; she strongly favors smart EHR and PHR systems that include sophisticated granularity and enable patients to customize consent directives. “This is a moment where doctors can really be their patients' advocates,” she says, “because a system where the patient is in control of the data really is a system where the patient can trust the doctor. If not, the doctor is the agent of giant corporations and data thieves and the government, and people are not going to want to see you if you have these bad products.”
      Recent history, she says, is not encouraging. “The right of consent was eliminated from the [HIPAA] privacy rule in 2002. They finally now say this publicly, but it's not on the DHHS Web site,” she notes. “The Web site's very misleading … . We have pretty much been singlehandedly the ones that have been telling everybody, the privacy rule isn't a privacy rule, it's actually a disclosure rule. It's a data miner's dream, because it lets all of the covered entities decide when to use your information for treatment, payment, or health care operations—not you! And you can't refuse, and you can't object. You can beg them; you have the right to beg the covered entities to not disclose your data. But guess what? They can say ‘No, we're going to do it anyway.'” (Former president George W. Bush, Dr. Peel says, did initially implement the HIPAA rule requiring individual consent for disclosure in 2001 (Standards for privacy of individually identifiable health information, 65 Fed. Reg. 82,462); when DHHS reversed this provision in 2002 (Final amendments to Federal Privacy Rule, 67 Fed. Reg. 53,182), she recalls, “we never were able to figure out if Bush knew or he didn't know what was going on down below.”) Dr. Peel's group also points out that since the Gramm-Leach-Bliley Financial Services Modernization Act of 1999 (Pub.L. 106-102, 113 Stat. 1338) broke down the Glass-Steagall firewalls separating the banking, securities, and insurance sectors, the channels through which unprotected and personally destructive information can now silently travel are unprecedentedly broad.
      At the June 29 Tiger Team session on protective technologies, where Dr. Peel was an invited panelist, moderators used a World Cup–style yellow card warning system to confine discussion to technical details rather than policy. Speaking afterward—having snuck in a late half-minute of comment on who should have decisive authority about privacy controls, drawing a yellow card—Dr. Peel bridled at this limitation of debate and noted that the panel was stacked with industry representatives.
      In many cases, she says, physician-driven EHR products that allocate decisions to patients are less costly than unreliable, ungranular billing-oriented systems. She finds that most discussion “has been co-opted by those who want records to be totally open and accessible to all, not so that doctors can treat people, but so that these stakeholders can get the records and use them against people to discriminate and all the rest. Most people would freely be open with their physicians if the information didn't leak out and prevent them from getting jobs, promotions, and coverage.”
      Emergency scenarios, she also states, sometimes appear in anticonsent arguments, with little reference to important contextual facts such as how often patients actually appear in uncommunicative condition. “The data mining industry, including the insurers, all used that issue—that ‘What if you’re unconscious in Alaska, how are you going to get your data?'—to say that's why your data should be open to all doctors all the time. And it's just a damn lie … . I know that as a [resident] physician in the ER, we would violate the privacy of the unconscious person every time to save their lives … it's a matter of ethics; the highest ethic is, save a life. Privacy goes out the window. So I don't believe there's a conflict there at all. The problem is, that issue, ‘Emergency doctors need to know everything if you’re unconscious,' has been used to justify the fact that they never built these systems in accord with our rights to control our information at all. It's a great flag to wave to make it seem like you're going to put yourself in danger unless you let everybody and their dog see your medical records at all times.”
      Dr. Peel bases her watchdog activities on her observations, in 35 years of psychiatric practice, of patients' dread of PHI disclosure. “People used to say, ‘Oh, Deborah, you're a fear monger; all this stuff is theoretical.' Well, I found, and use in my presentations now, slides from DHHS findings in 2000 that about 60,000 people a year refuse to get early diagnosis and treatment for cancer, because they know that [the information] won't stay private. And about 2 million a year, the same thing for mental illness treatment. And then I also use a slide from a RAND Corporation study that found there's 150,000 Iraqi vets with PTSD, and because there's no privacy [in] mental health treatment for the military, at least active duty, that these soldiers, our current soldiers, don't get treatment, and so as a consequence we have the highest rate of suicide among active-duty military personnel in 30 years. So I'm able to say that, look, this isn't theoretical. These are real problems. People refuse to get treatment when they think it's going to harm them. You shouldn't have to choose between a job and health care, but we do today.”
      If poor IT choices breed distrust, she cautions, the backlash will hit physicians, not vendors. “People are not thinking about how the technology can be used to improve and strengthen the doctor-patient relationship, and I believe that people are going to get very, very angry at their physicians when they realize that so many of the EHRs sell data … . I mean, how many people are going to be happy when they find out their doctor sold all this sensitive information that could cause generations of discrimination without their knowledge? So I think [with] a lot of the defects of the health IT products that are out there today, the public's never going to go to Kansas City and knock on Neal Patterson's door, the CEO of Cerner, and say ‘What did you do to my life, you jerk?' They're going to go to Dr. Peel, or to you, and say, ‘You destroyed my life!'”

      Glaring GIGO Even at the Leading Edge

      “E-Patient Dave” deBronkart, a technology marketing executive in Nashua, NH, who was unexpectedly diagnosed with stage IV, grade 4 renal cell carcinoma in January 2007, found out the hard way that EHRs can be contaminated with surprising amounts and forms of garbage. As a longtime early technology adopter and Internet enthusiast, deBronkart responded to his diagnosis by seeking as much information as he could find, including his own EHRs at Boston's Beth Israel Deaconess Medical Center. He also became active in the online community e-patients.net and the “participatory medicine” movement. The good news in deBronkart's case is the clinical course: given a median survival estimate of 24 weeks at diagnosis, he underwent successful laparoscopic excision, then investigative treatment with high-dosage interleukin-2 in a clinical trial. His last treatment took place in July 2007, and his remaining lesions have continued to shrink. He and his physicians have declared victory.
      In the course of reinventing his professional life as a prominent e-patient, however, he also made alarming discoveries. Transferring his records from Beth Israel Deaconess to Google Health, a voluntary PHR system, he found a proliferation of howling errors: miscoding, upcoding, misdated reports and alarms, documents not dated at all, and his misidentification as “a 53-year-old woman” on a 2003 lung radiograph (his age, at least, was accurate). Medication for chemotherapy-induced emesis resulted in a record giving deBronkart “anxiety disorder.” An entry for volvulus, entirely fallacious, remains unexplained.
      A “history of aortic aneurysm” was apparently upcoded from a minor and transient observation. As he recalls, “it turns out, in one of my scans when I was sick, it reported a 1/4-inch enlargement in the base of my aorta. Now, to any thinking physician, that's no big deal, especially since it wasn't present in the next scan. But to a billing clerk, who is encouraged to submit the highest-priced thing they can legitimately charge for, bingo! Enlarged aorta: that qualifies as an aortic aneurysm.”
      Another dramatic error was a finding of brain and spine metastases. He didn't have them, but his physicians had performed magnetic resonance imaging to rule them out. “There's no way in the billing data to say ‘and it came back negative,'” he discovered; for insurance purposes, what matters is procedures performed, not results. Sins of omission appeared as well, serious ones: there was no medication history at all (even the interleukin-2), and the section on drug sensitivities included no warning about steroids, which would interfere with his lifesaving immune treatment and are thus permanently contraindicated for him.
      His blog entry about the experience (deBronkart D. Imagine someone had been managing your data, and then you looked. The new life of e-patient Dave [blog].) led to a Boston Globe feature (Wangsness L. Electronic health records raise doubt.) and multiple invitations to address physicians, technologists, and others about the problem. At the heart of these errors, he explains, lies the structural and conceptual mismatch between International Classification of Diseases, Ninth Revision billing codes, with their built-in incentives and lack of nuance, and subtler, more precise diagnostic information. “I had no idea this issue existed until I tried to move my data over,” he says, “and then I got schooled pretty quickly by some friends. Billing data is categorically wrong for use as a proxy … for clinical reality.”
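The structural problem deBronkart describes can be made concrete with a small sketch. The records and code strings below are invented for illustration (not real ICD-9 entries); the point is that a claims row records the code that justified a procedure and has no field for its result, so a naive claims-to-problem-list import turns a ruled-out condition into an apparent diagnosis.

```python
# Hypothetical illustration of billing data as a lossy clinical proxy:
# each claim carries a justifying diagnosis code but no result field,
# so "rule-out" studies import as confirmed conditions.

claims = [
    {"code": "189.0", "description": "malignant neoplasm of kidney",
     "procedure": "laparoscopic nephrectomy"},   # genuinely treated
    {"code": "198.3", "description": "metastasis to brain/spine",
     "procedure": "MRI"},   # ordered to RULE OUT metastases; result negative
]

def naive_problem_list(claims):
    """What a naive PHR import sees: every billed code becomes a condition."""
    return sorted({c["description"] for c in claims})

print(naive_problem_list(claims))
# Both descriptions appear as "conditions": the negative MRI is
# indistinguishable from a confirmed metastasis, because the claims data
# has no place to say "and it came back negative."
```

Any faithful transfer would have to draw on clinical source data (results, notes, medication lists) rather than reimbursement records, which is precisely the distinction deBronkart says he "got schooled" on.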
      Perhaps even more disturbing than the errors themselves was the absence of a mechanism for resolving them: after deBronkart reported the problems, it took 6 months for hospital staff to correct them. “It's not just the technology,” he concludes; “what we know from other industries is, it's the business process.” After his saga came to light, however, Beth Israel Deaconess acted quickly to end the practice of sending billing data to PHR systems (Wangsness L. Beth Israel halts sending insurance data to Google. Boston Globe, April 18, 2009.). “This is not a slam at Beth Israel Deaconess,” deBronkart hastens to emphasize. “They are ahead of the curve; they are better than most … . The potential of good-quality information being available to a skilled professional at the moment when they need it, it's not even a matter of ‘I believe in it’; I just know with certainty that when we get to that point, it will result in better health care. Right now, we are in a situation that's more like when the Web was brand new, and every now and then you'd stumble across an airplane listing that was just plain wrong: that flight doesn't leave at that time. So you learned you had to check everything. That's the situation that an emergency physician or anybody else will be in until we've matured in our IT practices in health care.”
      A system constructed to combine privacy and quality considerations with the patient's overriding interest in the clinical outcome, deBronkart believes, is an urgent priority. In the “free replay” time he has received by overcoming his own illness, he sees progress toward such a system as his life's work; he has written a book (deBronkart D. Laugh, Sing, and Eat Like a Pig: How an Empowered Patient Beat Stage IV Cancer [and What Healthcare Can Learn From It].) and become a full-time patient advocate and consultant. Recognizing that health care economics currently create “pressure for everybody to do more with less,” he envisions patients actively managing their own information as a powerful and underused resource. As he stresses in public and professional appearances, “One of the most fundamental human rights must be the right of a desperate person to try to save himself … . Whatever we do on these privacy regulations, let's not interfere with the efforts of a desperate person who is trying to get treatment.” Putting the patient in control of PHI through a granular, interoperable, and secure PHR would align motivation, efficiency, and equity. “The key enabler for data quality,” he summarizes, “is, give patients 100% visibility to everything in their own record and let them point out mistakes. That is the fastest path, faster than anything else, to improving the quality of that data.”

      The PHR as Potential Disruptor

      The McNealyist “zero privacy” position, like the knowledge that one is to be “hanged in a fortnight” in Samuel Johnson's famous words, has powerfully concentrated the attention of IT specialists. “When people say you have to choose” between privacy and utility, said Latanya Sweeney, PhD, a computer scientist and director of the Data Privacy Lab at Carnegie Mellon University, in a Scientific American profile (Walter C. Privacy isn't dead, or at least it shouldn't be: a Q&A with Latanya Sweeney.), “it means they haven't actually thought the problem through or they aren't willing to accept the answer. Remember, it's in [McNealy's] interest to say that, because he very much shares that attitude of the computer scientist who built the technology that's invasive; who says, ‘Well, you want the benefits of my technology, you'll get over privacy.' It's exactly the kind of computer scientist we don't want to be graduating in the future. We want the computer scientist who will resolve these kinds of barriers in conflict, identify them and resolve them in their technology design.”
Technologists of the latter kind now have the opportunity to shape health care IT so that the benefit of the doubt goes to patients and physicians rather than to nonclinical entities. As the EHR rollout period brings critical decisions in design standards, legal safeguards, certification regulations, and hospital practices, the status quo is not a serious option. Dr. Sweeney has testified before a Congressional health care caucus
      • Sweeney L.
      Designing a trustworthy Nationwide Health Information Network (NHIN) promises Americans privacy and utility, rather than falsely choosing between privacy or utility. Statement before the 21st Century Healthcare Caucus Roundtable, April 22, 2010.
      that HIPAA does not protect patients from harmful effects of data storage and sharing; that secondary uses of PHI by business associates are “unbounded, widespread, hidden, and difficult to trace”; and that the models of the Nationwide Health Information Network currently being developed are vulnerable to multiple incursions, including malicious insider surveys (eg, identification of a domestic violence victim by a stalker).
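The insider-survey risk Dr. Sweeney describes is commonly countered with access auditing: every record lookup is logged and compared against a documented care relationship. The sketch below is a simplified illustration of that safeguard, with hypothetical identifiers throughout.

```python
# Hypothetical audit check: flag record accesses that fall outside a
# documented care relationship -- a standard safeguard against the
# "malicious insider survey" scenario Sweeney describes.

care_team = {                   # patient id -> staff authorized to view
    "pt-001": {"dr-lee", "rn-ortiz"},
    "pt-002": {"dr-lee"},
}

access_log = [                  # (staff id, patient id) pairs from the audit trail
    ("dr-lee", "pt-001"),
    ("clerk-9", "pt-002"),      # no care relationship -> suspicious
    ("rn-ortiz", "pt-001"),
]

def suspicious_accesses(log, team):
    """Return log entries where the viewer is not on the patient's care team."""
    return [(staff, pt) for staff, pt in log if staff not in team.get(pt, set())]

print(suspicious_accesses(access_log, care_team))  # [('clerk-9', 'pt-002')]
```

Auditing does not prevent the first improper lookup, but it makes insider browsing traceable — precisely the property Sweeney's testimony says current secondary uses of PHI lack.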
      She also suggested at the June 29 Tiger Team session that the utility of any consent management process, whether records are paper or electronic, may be “limited by how much the patient could get his mind around. Technology allows more granularity [than paper], but granularity introduces complexity that we as humans aren't good at dealing with.” This human-IT interface, experience suggests, will always be a source of trouble. If all patients were as optimistic, energetic, analytically incisive, and tech-savvy as deBronkart, one might confidently dismiss the risk that the EHR era will resemble the unsettling conditions of “Hospital B” and its Defensively Exhaustive Full-Employment Act for Techies and Scribes (DEFEATS) system in the hypothetical scenarios offered in part I. It will more likely take prodigious efforts at patient-physician communication (if not patient-physician-technologist-attorney) to foster an atmosphere in which the preferable future, that of “Hospital A” and the brave new world of finely entrained automatic and teleclinical services (FEATS), can take shape.
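Dr. Sweeney's point that granularity introduces complexity "we as humans aren't good at dealing with" can be made concrete with simple arithmetic: if a consent form offers an independent allow/deny switch for each pairing of data category and recipient class, the number of distinct policies a patient could express grows exponentially. The categories below are illustrative, not drawn from any actual consent framework.

```python
# Illustration of consent granularity vs. human comprehension: each
# (data category, recipient class) pair is an independent allow/deny
# switch, so the policy space doubles with every switch added.

categories = ["medications", "mental health", "genetics", "billing"]
recipients = ["treating physician", "researcher", "insurer"]

switches = len(categories) * len(recipients)  # individual allow/deny decisions
policies = 2 ** switches                      # distinct overall consent policies

print(switches)  # 12
print(policies)  # 4096
```

Even this toy example yields 4,096 possible configurations from 12 choices — a hint of why paper-era blanket consent persists despite the finer control electronic records make technically possible.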
      Both deBronkart and Dr. Kolodner speak of health IT as a classic disruptive technology, capable of changing a field and displacing previous products and processes. “The nature of disruptive technologies is that, when done right, more people with a lower level of skill are able to do more things for themselves more conveniently, more easily, and more quickly,” Dr. Kolodner says. “The PHRs are certainly positioned in that disruptor position and can help to drive the transformation of health and health care.” At this stage, it is an open question just what else, besides older technologies, they will end up disrupting: the privileges of large organizations to profit from patients' PHI regardless of the effects on clinical practice, or the patient's traditional right to have his or her most intimate information respected.

      References

        • Sprenger P.
        Sun on privacy: “Get over it.” Wired. January 26, 1999.
        (Accessed August 2, 2010)
        • Centers for Medicare & Medicaid Services Office of Public Affairs
        Secretary Sebelius announces final rules to support “meaningful use” of electronic health records.
        (Accessed August 4, 2010)
        • Blumenthal D.
        • Tavenner M.
        The “meaningful use” regulation for electronic health records.
N Engl J Med. 2010;363:501-504. (Accessed August 4, 2010)
        • McGraw D.
        HHS releases rules for electronic health records. Center for Democracy and Technology, July 14, 2010.
        (Accessed August 5, 2010)
        • Department of Health and Human Services, Centers for Medicare & Medicaid Services
        42 CFR Parts 412, 413, 422, and 495.
Fed Reg. 2010;75:44369. (Accessed August 5, 2010)
        • Angwin J.
        The Web's new gold mine: your secrets.
        Wall Street Journal. July 30, 2010; (Accessed August 4, 2010)
        • Singer N.
        When 2+2 equals a privacy question.
        New York Times. October 17, 2009; (BU4) (Accessed August 5, 2010)
        • Zetter K.
        Medical records: stored in the cloud, sold on the open market.
        Wired. October 19, 2009; (Accessed August 5, 2010)
        • Ornstein C.
        Fawcett's cancer file breached.
        Los Angeles Times. April 3, 2008; (B1) (Accessed August 3, 2010)
      1. Wrongful disclosure of video tape rental or sale records. The Video Privacy Protection Act of 1988, 18 US Code § 2710 (2002).

        • Bork R.
        The Tempting of America: The Political Seduction of the Law.
Simon and Schuster, New York, NY; 1990
      2. 381 U.S. 479 (1965).

        • Dolan M.
        The Bork tapes saga.
        (Accessed August 5, 2010. Includes the text of Dolan's original piece “The Bork Tapes,” Washington City Paper, September 25 to October 1, 1987, p 1 ff. The Bork records case involved a newspaper, but not, as has sometimes been claimed, any congressional use of subpoena powers: see Ackerman S. Hit the rewind: Bork's video subpoena was a story too good to check. Extra! Update, Fairness and Accuracy in Reporting, April 1999. Available at: http://www.fair.org/index.php?page=1456. Accessed August 5, 2010; and Rosen J. The myth of Biden v. Bork. New York Times, August 26, 2008. Available at: http://www.nytimes.com/2008/08/27/opinion/27rosen.html. Accessed August 5, 2010)
        • Privacy & Security Tiger Team: Past Meetings
        Materials from recent Privacy and Security Tiger Team meetings are archived by HHS.
        (Accessed August 5, 2010)
      3. Eg, the suit in US District Court for the Northern District of California, San Jose Division, over privacy breaches by the Facebook Beacon feature, Lane et al v Facebook, Inc et al. McCall's organization EPIC is among organizations filing complaints about both the feature, now defunct, and the settlement.

        • Narayanan A.
        • Shmatikov V.
        De-anonymizing social networks.
        in: 30th IEEE Symposium on Security and Privacy, 2009: 173-187 (Accessed August 5, 2010)
        • Narayanan A.
        • Shmatikov V.
        Robust de-anonymization of large sparse datasets.
        in: Proceedings of the 2008 IEEE Symposium on Security and Privacy, 2008: 111-125 (Accessed August 5, 2010)
        • Peel D.C.
        Your medical records aren't secure.
        Wall Street Journal. March 23, 2010; (Accessed August 5, 2010)
      4. Department of Health and Human Services. Standards for privacy of individually identifiable health information. 2001; 65 Fed. Reg. 82,462.

      5. Department of Health and Human Services. Final amendments to Federal Privacy Rule. 2002; 67 Fed. Reg. 53182.

        • Patient Privacy Rights
        Zones of privacy (graphic).
        (Accessed August 4, 2010)
      6. The Gramm-Leach-Bliley Act, a.k.a. the Financial Services Modernization Act of 1999, codified at Pub.L. 106-102, 113 Stat. 1338 (1999).

        • deBronkart D.
        Imagine someone had been managing your data, and then you looked.
        (Accessed August 6, 2010)
        • Wangsness L.
        Electronic health records raise doubt.
        Boston Globe. April 13, 2009; (Accessed August 6, 2010)
        • Wangsness L.
Beth Israel halts sending insurance data to Google.
Boston Globe. April 18, 2009; (Accessed August 6, 2010)
        • deBronkart D.
        Laugh, Sing, and Eat Like a Pig: How an Empowered Patient Beat Stage IV Cancer (and What Healthcare Can Learn From It).
Changing Outlook LLC, Media, PA; 2010
        • Walter C.
        Privacy isn't dead, or at least it shouldn't be: a Q&A with Latanya Sweeney.
Scientific American, New York, NY. June 27, 2007 (Accessed August 3, 2010)
        • Sweeney L.
        Designing a trustworthy Nationwide Health Information Network (NHIN) promises Americans privacy and utility, rather than falsely choosing between privacy or utility. Statement before the 21st Century Healthcare Caucus Roundtable, April 22, 2010.
        (Accessed August 4, 2010)