Earlier chapters introduced the Institute of Medicine (IOM) committee's conceptualization of health database organizations (HDOs), outlined their presumed benefits, listed potential users and uses, and examined issues related to the disclosure of descriptive and evaluative data on health care providers (institutions, agencies, practitioners, and similar entities). This chapter examines issues related to information about individuals or patients—specifically, what this committee refers to as person-identified or person-identifiable data. It defines privacy, confidentiality, and security in the context of health-related information and outlines the concerns that health experts, legal authorities, information technology specialists, and society at large have about erosions in the protections accorded such information. It pays particular attention to the status that might be accorded such data when held by HDOs.

Existing ethical, legal, and other approaches to protecting confidentiality and privacy of personal health data offer some safeguards, but major gaps and limitations remain. The recommendations at the end of this chapter are intended to strengthen current protections for confidentiality and privacy of health-related data, particularly for information acquired by HDOs.

HISTORICAL PERSPECTIVES AND GENERAL OBSERVATIONS ON DISCLOSURE OF INFORMATION

The Privacy Protection Study Commission (PPSC) was created by the Privacy Act of 1974 to investigate the personal data recordkeeping practices of governmental, regional, and private organizations. In its landmark 1977 report, Personal Privacy in an Information Society (PPSC, 1977a), the commissioners noted that:

Every member of a modern society acts out the major events and transitions of his life with organizations as attentive partners. Each of his countless transactions with them leaves its mark in the records they maintain about him.

The report went on to point out that:

... as records continue to supplant face-to-face encounters in our society, there has been no compensating tendency to give the individual the kind of control over the collection, use, and disclosure of information about him that his face-to-face encounters normally entail.

The warnings implicit in the commissioners' statement are even more pertinent today. The emergence of HDOs in the 1990s comes at a time when the American public is expressing growing concern about threats to personal privacy. A 1993 Louis Harris poll found that 79 percent of the American public is "very" (49 percent) or "somewhat" (30 percent) worried about the threat to personal privacy (Harris/Equifax, 1993).1 This response has remained stable since 1990, when it rose sharply from a figure of 64 percent cited for 1978. Eighty percent of respondents agreed that "consumers have lost all control over how personal information about them is circulated and used by companies." The 1992 survey also asked about the effect of computers on privacy. Sixty-eight percent agreed strongly or very strongly that "computers are an actual threat to personal privacy," and almost 90 percent agreed that computers have made it much easier to obtain confidential personal information improperly (Equifax, 1992).

Many privacy experts have described the ready availability of personal information (e.g., see Piller, 1993). Rothfeder (1992) asserts that about five billion records in the United States describe each resident's whereabouts and other personal information. He also claims that such information is moved from one computer to another about five times a day (pp. 22-23):

Information about every move we make—buying a car or a home, applying for a loan, taking out insurance, purchasing potato chips, requesting a government grant, getting turned down for credit, going to work, seeing a doctor—is fed into ... databases owned by the credit bureaus, the government, banks, insurance companies, direct marketing companies, and other interested corporations. And from these databases it's broadcast to thousands ... of regional databanks as well as to numerous information resellers across the country.

Rothfeder believes that such pervasive data acquisition and exchange can lead to a feeling of powerlessness in the face of privacy intrusion. His language is evocative (p. 30):

Increasingly, people are at the whim of not only pressure groups, but also large organizations—direct marketers, the credit bureaus, the government, and the entire information economy—that view individuals as nothing but lifeless data floating like microscopic entities in vast electronic chambers, data that exists [sic] to be captured, examined, collated, and sold, regardless of the individual's desire to choose what should be concealed and what should be made public.

It may be that the increasing aggregation of personal data documenting the details of our physical attributes and defects, behaviors, desires, attitudes, failings, and achievements creates a virtual representation of us. Some have called this a "computerized alter ego" or a "digital version of each of us to go with our public personae" (Rothfeder, 1992, p. 16, citing Miller). To the extent this is so, the privacy of this "virtual person" requires protection.

Recently the U.S. Congress has given serious attention to reform of the Fair Credit Reporting Act (Public Law [P.L.] 102-550; see below). It has also looked at technology-driven privacy issues: most pertinent are legislative proposals to restrict caller I.D. programs (S. 652; H.R. 1305; see also House Report No. 102-324, 102nd Congress, 2d Session) and to curb junk telephone calls and junk faxes (P.L. 102-243, "Telephone Consumer Protection Act of 1991"). Some congressional efforts, such as bills related to DNA testing and genetic profiling (S. 1355, "DNA Identification Act of 1991"; H.R. 2045, "Human Genome Privacy Act"), were intended to protect individuals against threats posed by medical technologies or initiatives. In October 1991, the Committee on Government Operations of the U.S. House of Representatives, Subcommittee on Government Information, Justice, and Agriculture, held hearings on genetic privacy issues, and in April 1992 it issued a report calling for reforms related to the privacy of genetic information.

Both the U.S. Congress and the Administration have undertaken activities related to the protection of medical information. In October 1993 the Senate Committee on the Judiciary held hearings on High Tech Privacy Issues in Health Law, and in November, the Subcommittee on Government Information, Justice, and Agriculture of the Committee on Government Operations held a hearing on a report prepared by the Office of Technology Assessment (OTA, 1993) at the request of that subcommittee and the Senate Subcommittee on Federal Services, Post Office, and Civil Service. The former committee has also been drafting legislation to protect the privacy of health information.2

A Task Force on Privacy was established in 1990 by the Assistant Secretary for Planning and Evaluation to report on the privacy of private sector health records. Another DHHS group established at the same time, the Workgroup on Electronic Data Interchange (WEDI, 1991), also addressed the protection of information when medical insurance claims are handled electronically. The recommendations of that workgroup are discussed later in this chapter.

Two of President Clinton's Health Care Reform Task Forces met during the spring of 1993. They considered the implications of and generated plans for the protection of health-related data that would be acquired and held under the administration's proposal for health reform. The legislative proposals in the Health Security Act contain specific privacy protection provisions.3

forms governing disclosure of such information; and (3) the development of technology to implement standards regarding such information. It should also establish education and awareness programs, foster adequate security practices, and train personnel of public and private entities in appropriate practices.

Sec. 5122 calls for a proposal, not later than three years after enactment of the HSA, to provide a comprehensive scheme of federal privacy protection for individually identifiable health information that would include a Code of Fair Information Practices and provide for enforcement of the rights and duties created by the legislation. (Health Security Act. Title V. Part 2. Privacy of information.)

State legislatures have also been active. In the past three years, for example, many states have adopted legislation that prohibits employers from discriminating against applicants and employees on the basis of off-the-job, lawful activity or some specific subset of lawful activity, such as cigarette smoking.

SOURCES OF CONCERNS ABOUT PRIVACY AND THE CONFIDENTIALITY OF HEALTH RECORDS

Two somewhat distinct trends have led to increased access to the primary health record and subsequent concerns about privacy. One has to do with primary health records regardless of how they are created and maintained; the other involves health records stored electronically.

Health Care Records

The quantity and type of health care information now collected have also increased dramatically in recent years. The participation in health care delivery of many different individuals and groups of providers exerts strong pressures to document in ever greater detail. The expanding numbers of available technologies for diagnosis and therapy mean that details that a provider could at one time recall must now be recorded and thus become available for inspection by others. Further, information on lifestyle (e.g., use of tobacco or alcohol), family history, and health status has become of greater interest and relevance as we learn more about the relationship of these factors to overall health and well-being. In addition, genetic data are becoming more readily available, not only for prenatal testing but also for assessing an individual's degree of risk for an inherited condition.4

The more detailed the information about an individual or class of individuals, the more appropriate, one hopes, is the treatment they will be given. Further, documentation of care and risk factors is essential to promoting continuity of care over time and among providers. It is also a first defense against charges of malpractice.

The primary health record is no longer simply a tool for health care providers to record their impressions, observations, and instructions. Rather, it serves many purposes beyond direct health care. Third-party payers access patient record information to make payment determinations, and managed care organizations access patient records for precertification and case management.

Other parties external to the healing relationship seek person-identified information and assert socially beneficial reasons for access. What was once the "business" only of patients and possibly their physicians has now become the business of such groups as: (1) officers of government entitlement programs checking on eligibility, and on patient and provider fraud and abuse; (2) agencies granting security clearance; (3) attorneys bringing criminal or civil charges; and (4) social service workers protecting possibly abused children, to name only a few. Others access secondary health records or obtain portions of the medical record when making decisions about hiring, granting a license, or issuing life, health, or disability insurance.

Electronic Records

Other trends give rise to particular concerns about the confidentiality of health information that is stored electronically. First is the ability to access, transmit, and copy large volumes of data easily. Photocopying paper records is, of course, possible, but it is hardly feasible for large numbers of geographically dispersed medical records. Electronic storage and transmittal of data, by contrast, enable interested parties to aggregate information for individuals over time and across institutions and providers of care.

Second, databases were at one time discrete—often held in physically secure rooms on tape drives—with identifiers that were unique to a given institution or insurer. Now, however, data from diverse sources can be combined and linked. Once data are stored electronically, networks of databases can be explored almost imperceptibly from remote locations. Unless security systems are designed to record access, the curious, entrepreneurial, or venal can enter databases without leaving evidence of having done so.

Third, computer-based health data have become a very valuable commodity. Some companies obtain information from physicians' computers and pharmacy records for sale to pharmaceutical companies in return for incentives such as low-cost computer hardware and software. These companies gather such identifying variables as age, sex, and Social Security numbers even if patient names are either not taken or are later stripped off (Miller, 1992).

Other companies resell information from prescription or claims databases to companies that sort it by physician for marketing purposes. For example, Health Information Technologies, Inc., helps automate private physicians' insurance claims. When it transmits claims and payments between the insurance company and the physician, it retains electronic copies of these records, and it can later sell them (presumably without physician or patient names) for pharmaceutical and other related kinds of marketing (Miller, 1992).

In August 1993, Merck & Company purchased Medco Containment Services, a mail-order prescription firm. The purchase price, $6 billion, was based in part on the value of the information in its databases to influence physician prescribing practices (Tanouye, 1993). HDOs will control a gold mine of information, and they may find it difficult indeed to resist economic benefits from allowing access to their data files by third parties.

Finally, because developers of HDOs have compared claims transmittal to electronic funds transfer (EFT), it is helpful to examine how the Privacy Protection Study Commission regarded confidentiality in EFT. The commissioners were alert to problems that might result if records created by EFT could not be controlled by institutions. Noting that automated clearinghouses centralize information that would otherwise be segregated among diverse depository institutions, their report (PPSC, 1977a) expressed worry about threats posed by the accumulation and centralization of the financial information that flows through such clearinghouses. The commissioners also recognized that the resulting pools of information would become attractive sources of person-identifiable information for use "in ways inimical to personal privacy" (p. 121). They urged that adequate protections be established for person-identifiable information flowing through an EFT data communications network and that such account information be retained for as limited a period of time as was essential to fulfill operating requirements of the service provider. Thus, in contemplating EFT, the commissioners did not foresee, and certainly did not encourage, the creation of an information repository now contemplated under the concept of an HDO.

DEFINITIONS

Below, the committee offers definitions of critical terms—privacy (especially informational privacy), confidentiality, security, and health-related information.

Privacy

The most general and common view of privacy conveys notions of withdrawal, seclusion, secrecy, or of being kept away from public view, but with no pejorative overtones. By contrast, an invasion of privacy occurs when there is intentional deprivation of the desired privacy to which one is entitled. In public policy generally and health policy in particular, privacy takes on special meanings, some derived from moral theories, others from legal doctrine, and one from the widespread use of health information.

Privacy is sometimes characterized as the "right to be left alone" (Cooley, 1880; Warren and Brandeis, 1890; Elison and Nettiksimmons, 1987; Turkington, 1987; Herdrich, 1989). Many experts, however, have objected that such a definition is too broad to be helpful in the health context. There are innumerable ways of not being left alone that arguably have nothing to do with privacy (Thomson, 1975; Reiman, 1976; Parent, 1983), such as when an individual is subjected to aggressive panhandling on a city street. Consequently, theorists have sought to refine their conceptions of privacy. Their aim has been to isolate what is unique about privacy, to identify what constitutes its loss, and to distinguish among a variety of conceptually related but separable senses of privacy (Gerety, 1977; McCloskey, 1980; Schoeman, 1984).

The development and application of the concept of privacy in American law encompasses three clusters of ideas.5 First, privacy embodies autonomy interests; it protects decisions about the exercise of fundamental constitutional liberties with respect to private behavior, such as decisions relating to marriage, procreation, contraception, family relationships, and child-rearing. This is frequently characterized as decisional privacy (Tribe, 1978). Second, privacy protects against surveillance or intrusion when an individual has a "reasonable expectation of privacy." Examples include protections against unlawful searches of one's home or person and unauthorized wiretapping. Third, privacy encompasses informational interests; this notion is most frequently expressed as the interest of an individual in controlling the dissemination and use of information that relates to himself or herself (Shils, 1966; Westin, 1967), or to have information about oneself be inaccessible to others. This last form, informational privacy, is the main subject of this chapter.

Informational Privacy

Informational privacy—"a state or condition of controlled access to personal information" (Schoeman, 1984; Allen, 1987; Powers, 1993)—is infringed, by definition, whenever another party has access to one's personal information by reading, listening, or using any of the other senses. Such loss of privacy may be entirely acceptable and intended by the individual, or it may be inadvertent, unacceptable, and even unknown to the individual.

This definition of privacy thus reflects two underlying notions. First, privacy in general and informational privacy in particular are always matters of degree. Rarely is anyone in a condition of complete physical or informational inaccessibility to others, nor would they wish to remain so. Second, although informational privacy may be valuable and deserving of protection, many thoughtful privacy advocates argue that it does not, in itself, have moral significance or inherent value (Allen, 1987; Faden, 1993).

Nonetheless, informational privacy has value for all in our society, and it accordingly has special claims on our attention. In his pivotal book, Privacy and Freedom, Westin (1967) described it as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others" (p. 7). This definition served as the foundation for the Privacy Act of 1974 (P.L. 93-579; 5 U.S.C. § 552a). This act, arguably the most significant step to protect privacy in recent decades, was enacted to control use of personally identifiable information maintained in federal government databases.

Recordkeeping Privacy

In recent decades, discussions about privacy have almost exclusively addressed the use of information about people to make decisions about some right, privilege, benefit, or entitlement—so-called "recordkeeping privacy." This focus was of particular interest to those framing the Privacy Act of 1974.

More recently the desire for informational privacy has become an important expectation, not because of a benefit or entitlement sought, but for its own sake. Information may be created as a byproduct of some event—for example, an individual's geographic location becomes available when he or she uses a bank card for a financial transaction; similarly, one's preferences are known when one buys goods by mail order or uses a check-verification card at the local supermarket. In yet other cases, information derives from aggregating data from many sources, including public records; such aggregation can also include data that have been derived from computer processing (e.g., buying profiles or dossiers).

Data subjects want informational privacy to be respected in such contexts as well. Many people in the United States would like to believe that data collected about them legitimately, in connection with some transaction or incidentally through participation in the general activities of society, will not be exploited for secondary purposes such as advertising, soliciting, telemarketing, promotional activities, or other actions that are distinct from and unrelated to the activities for which the data were originally collected (see Harris/Equifax, 1993). As should be clear from the discussion in this chapter, however, these hopes are often not realized in general or in relation to health information.

Privacy Rights

To assert a right is to make a special kind of claim. Rights designate some interests of the individual that are sufficiently important to hold others under a duty to promote and protect, sometimes at the expense of maximizing, or even achieving, the social good (Raz, 1986). Two interests are widely cited as providing the moral justification for privacy rights: the individual's interest in autonomy and the instrumental value that privacy may have in promoting other valuable human goods.

With respect to autonomy, privacy fosters and enhances a sense of self (Reiman, 1976). Respecting privacy enhances an individual's autonomy (Westin, 1967; Benn, 1971; Bloustein, 1984). It allows the individual to develop the capacity to be self-governing or "sovereign," a notion analogous to the sense in which autonomous states are sovereign (Beauchamp and Childress, 1989). The loss or degradation of privacy can enable others to exercise an inordinate measure of power over the individual's economic, social, and psychological well-being (Gavison, 1980; Parent, 1983).

With respect to the value of privacy to promote other ends, its instrumental value, privacy permits the development of character traits and virtues essential to desirable human relationships. These include trust, intimacy, and love. Without some measure of privacy, these relationships are diminished or may not be possible (Fried, 1968; Rachels, 1975).

The existence of informational privacy rights means that someone is under a duty either not to disclose information or to prevent unauthorized access to information by others. Dworkin (1977) has argued that for a right to be meaningful, any policy or law overriding such duties must withstand rigorous scrutiny, and that considerations of social utility alone are inadequate grounds to override it. That is, to take rights seriously is to recognize some limits on the prerogative of government or others to mandate the common good at the expense of the individual. This is not to say, however, that rights function as an absolute barrier to the pursuit of collective goals; indeed, the tension between individual and social goals is reflected in the issues raised in Chapter 3, as well as in this chapter.

Balancing Benefits of HDOs Against Loss of Informational Privacy

There cannot be much doubt that HDOs will serve legitimate societal interests as described in Chapter 2. Nevertheless, because HDOs will represent one of the most comprehensive and sensitive automated personal record databases ever established, they inevitably implicate interests protected by informational privacy principles. Accordingly, HDO advocates will be well served, from an ethical as well as a legal viewpoint, if they consider what social goods justify possible loss of privacy and how such loss can be minimized or prevented.

Whether HDOs can achieve their potential for good in the face of their possible impact on privacy will likely turn on the interplay of three considerations. First, to what extent do the HDOs provide important (and perhaps irreplaceable) health care benefits to their regions and perhaps to the nation? Second, do the societal benefits resulting from the implementation of HDOs outweigh the privacy risks? Third, to what extent have adequate privacy safeguards been incorporated into the HDOs?

Federal and State Privacy Protection

No explicit right to privacy is guaranteed by the Constitution of the United States; in fact, the word "privacy" does not appear. The right to privacy as the basis of a civil action traces to the influential 1890 law review article by Samuel D. Warren and Louis D. Brandeis, and its constitutional status derives from various amendments in the Bill of Rights.

The issues surrounding the constitutional status of privacy protection are too numerous and controversial to explore in detail here. Most constitutional scholars agree that federal constitutional protections are unlikely to provide the first line of defense for privacy of health information. The Constitution generally has not provided strong protection for the confidentiality of individual health care information; the constitutional protection for informational privacy is thus very limited and derived from case law interpreting the Constitution.

The courts have made clear that, at least theoretically, information privacy principles based on the Constitution limit a government agency's collection and use of personal information to situations in which the use bears a rational relationship to a legitimate governmental purpose. The government's interest in the information program must outweigh the threat to personal privacy posed by the program.6

In Whalen v. Roe (429 U.S. 589 [1977]), for example, the Supreme Court balanced the privacy threat posed by a New York State law against the statute's benefits. The statute required pharmacists and physicians to report sensitive health record information to state officials—in this case, the names of patients receiving prescriptions for certain controlled drugs. The court concluded that the statute was constitutional on two grounds: the societal interests served by the statute (combating the illegal use of otherwise legal drugs) and extensive privacy and confidentiality protections in the law (redisclosure of the drug information, for example, was prohibited). The court suggested that if the statute had lacked these confidentiality protections it would have been found to violate constitutional privacy principles (Chlapowski, 1991). Thus, privacy rights are to be considered derived, not explicit, rights.

In United States v. Westinghouse Electric Corp. (638 F. 2d 570, 578 [3rd Cir. 1980]), the Third Circuit identified seven factors that should be weighed in determining whether to permit a government agency to collect personal information and thus undertake a program that infringes privacy. These were the type of record requested; the subject matter of the information; the potential for harm in a subsequent nonconsensual disclosure; the damage to the relationship in which the record was generated; the adequacy of safeguards to prevent unauthorized disclosure; the degree of need for access; and whether there is an express, statutory mandate, articulated public policy, or other recognizable public interest tilting toward access.

Various state constitutional provisions offer more protection. For one to have a claim for a violation of a constitutional privacy right, however, the individual generally must show that state action caused the violation. California's constitution (Cal. Const., Art. 1, § 1) is an exception to this general rule because it makes privacy rights explicit. California courts have held that the state's constitutional privacy provision can be asserted against private parties who infringe on citizens' privacy; see, for instance, Heda v. Superior Court, 225 Cal. App. 3rd 525 (Cal., Dist. Ct., App. 1990) and Soroka v. Dayton Hudson Corp., 1 Cal. Rptr. 2nd 77 (1991). Other common law and statutory remedies, as well as institutional policies and practices, will be of greater immediate importance. These remedies and the relevance of existing laws to HDOs are discussed in the next section.

Confidentiality

Confidentiality relates to disclosure or nondisclosure of information. Historically a duty to honor confidentiality has arisen with respect to information disclosed in the context of a confidential relationship, such as that between an individual and his or her physician, attorney, or priest. In such relationships, the confidante is under an obligation not to redisclose the information learned in the course of the relationship. Now the law applies such duties to some holders of information who do not have a confidential relationship to a patient. In the health sector, this includes such holders as utilization management firms in many states and local, state, or federal health agencies that receive reports of communicable diseases.

When one is concerned about data disclosure, whether or not any relationship exists between a data subject and a data holder, an essential construct is that of data confidentiality. Data confidentiality is the status accorded data indicating that they are protected and must be treated as such. In the federal Freedom of Information Act (FOIA, 5 U.S.C., Section 552), certain categories of data are specified as confidential and thus not disclosable; for instance, Exemption 6 states that FOIA is not applicable to "personnel and medical files and similar files, the disclosure of which would constitute clearly unwarranted invasion of personal privacy." Data confidentiality is discussed in more detail in a later section.

Confidentiality Obligations in Health Care

Professional obligations regarding privacy and confidentiality. The importance of confidentiality to the medical profession is reflected in the physician's "Oath of Hippocrates." Adopted in roughly the fourth century B.C.E., it remains a recognized element of medical ethics:

Whatsoever things I see or hear concerning the life of men, in my attendance on the sick or even apart therefrom, which ought not to be noised abroad, I will keep silence thereon, counting such things to be as sacred secrets (Bulger, 1987).

In similar fashion, the American Medical Association Principles of Medical Ethics (AMA, 1992, Section 5.05) states that "The information disclosed to a physician during the course of the relationship between the physician and patient is confidential to the greatest possible degree . . . The physician should not reveal confidential communications or information without the express consent of the patient, unless required to do so by law."

Within the healing relationship, four justifications may be offered for medical confidentiality (adapted from Faden, 1993). First is a respect for privacy and patient autonomy. In the earliest practice of medicine, physicians treated patients in their homes, and medical privacy was an extension of the privacy of the home. The Hippocratic Oath, for instance, does not justify confidentiality on any ground other than respect for privacy. If information concerning a patient's mind and body is viewed as an extension of the patient, then the concept of autonomy requires that the patient be able to control disclosure and use of that information. The value placed on personal autonomy gives rise to the notion of informed consent. As Justice Benjamin N. Cardozo wrote in his opinion in Schloendorff v. Society of New York Hospital, 211 N.Y. 125, "Every human being of adult years and sound mind has a right to determine what shall be done with his body."

A second justification related to respect for privacy is the implicit and sometimes explicit expectation or promise of confidentiality. Third is the special moral character of the doctor-patient relationship, which is characterized by trust and intimacy. Confidentiality can be instrumental in fostering patients' trust in their physicians; when this trust encourages patients to speak freely and disclose information they would otherwise keep secret, it facilitates diagnosis and treatment. Fourth, respecting confidentiality protects patients from harm that might befall them if the information were to become widely available and indiscriminately used.

Legal obligations of confidentiality. Various federal and state laws impose a duty to preserve the confidentiality of personal health information. These laws can be divided into two categories: those imposing confidentiality obligations on recordkeepers and those protecting health information that is deemed highly sensitive. Examples of the former include general confidentiality statutes about health care information such as the Uniform Health Care Information Act (National Conference, 1988) and the California Confidentiality of Medical Information Act (Cal. Civil Code §§ 56-56.37 [1992]), as well as various state laws and Medicare and Medicaid regulations. Laws and regulations imposing confidentiality requirements for sensitive personal health information include those related to alcohol and drug abuse records and laws governing nondisclosure of records of patients with acquired immunodeficiency syndrome (AIDS), the results of antibody tests for human immunodeficiency virus (HIV), psychiatric and developmental disability records, and information concerning results of genetics screening and testing.

Courts have also recognized a legal obligation to maintain the confidentiality of personal health care information. In response to harm resulting from unauthorized release of personal health information, courts have granted legal relief under a number of theories: breach of trust, breach of confidence, breach of implied contract, invasion of privacy, defamation, and negligence (Waller, 1992).

Disclosure of Health Information

As one looks beyond the protected sphere of the patient-provider relationship, it is not always clear who is rightly in the community of "knowers," nor is there universal agreement on principles that ought to control disclosure. With the growth of managed care, utilization review, third-party payment systems, and claims administration for self-insured health plans, information sharing for purposes of adjudicating claims and managing high-risk or high-cost cases has become part and parcel of the provision of health care. Westin has described these supporting and administrative activities as "Zone 2" in comparison to "Zone 1," which refers to information flow to support direct medical care (Westin, 1976; Harris/Equifax, 1993). These wide-ranging claims of need for sensitive health information, which are emblematic of modern health care, raise difficult problems for the preservation of privacy and maintenance of confidentiality.

Patients generally understand that, with consent, information in their medical records will be shared widely within a hospital and for insurance and reimbursement purposes. They also expect that data collected about them will be used only for the purpose of the initial collection and that such data will be shared with others only for that same purpose. Outside the health care institution, patients expect that confidential data will not be shared with people or organizations not authorized to have such information and that legitimate users of the data will not exploit such access for purposes other than those for which the information was originally obtained (e.g., see Harris/Equifax, 1993).

Consent. Such exceptions to the rule of confidentiality as described above are rationalized as being conducted by consent of the patient or a patient representative. A patient may be asked to accede to disclosure by signing a blanket consent form when applying for insurance or employment. In such cases, however, consent cannot be truly voluntary or informed. Such authorizations are often not voluntary because the patient feels compelled to sign the authorization or forego the benefit sought, and they are not informed because the patient cannot know in advance what information will be in the record, who will subsequently have access to it, or how it will be used.

Although consent may be the best-recognized way to permit disclosures of private information, consent is so often not informed or is given under economic compulsion that it does not provide sufficient protection to patients. As will be seen in the recommendations section of this chapter, this committee generally does not regard "consent" procedures as sufficient to protect sensitive information from inappropriate disclosure by HDOs, although they are a necessary adjunct to other autonomy protections.

Mandatory reporting and compulsory process. Other situations exist in which sensitive health information about individuals must be disclosed to third parties. Such sharing of health information for socially sanctioned purposes may be truly voluntary; it may also be required through mandatory reporting or coerced by court order.

Mandatory reporting requirements are justified by society's need for information; these include filing reports of births and deaths, communicable diseases, cancer, environmental and occupational diseases, drug addiction, gunshot wounds, child abuse, and other violence-related injuries. Some statutes require that records be retained for 10 to 25 years, making past diagnoses retrievable long after they no longer accurately describe the patient. Another type of reporting requirement involves the expectation that third parties be warned about threats to their lives.7

Physicians and others may also find themselves compelled to divulge patient information when they would otherwise choose not to do so. Such requirements—sometimes termed "compulsory process"—may take the form of subpoenas or discovery requests and may be enforced by court order. In some instances personal health care information may be protected from disclosure in court and administrative proceedings by virtue of the physician-patient privilege, which may be mandated by statute or derive from the common law. Information that is so privileged cannot be introduced into evidence and is generally not subject to discovery.

Weaknesses of Legal Protection for Confidentiality

Legal and ethical confidentiality obligations are the same whether health records are kept on paper or on computer-based media (Waller, 1992). Current laws, however, have significant weaknesses. First, and very important, the degree to which confidentiality is required under current law varies according to the holder of the information and the type of information held.

Second, legal obligations of confidentiality often vary widely within a single state and from state to state, making it difficult to ascertain the legal obligations that a given HDO will have, particularly if it operates in a multistate area. These state-by-state and intrastate variations and inconsistencies in privacy and confidentiality laws are well established among those knowledgeable about health care records law (e.g., see Powers, 1991; Waller, 1991; WEDI, 1992; Gostin et al., 1993; OTA, 1993; for examples ranging across many types of professionals, institutions, and ancillary personnel). This is important because some HDOs will routinely transmit data across state lines. Interstate transmission already occurs with data such as claims or typed dictation. When confidential data are transmitted across state lines, it is not always clear which state's confidentiality laws apply and which state's courts have jurisdiction over disputes concerning improper disclosure of information.

Third, current laws offer individuals little real protection against redisclosure of their confidential health information to unauthorized recipients for a number of reasons. Once patients have consented to an initial disclosure of information (for example, to obtain insurance reimbursement), they have lost control of further disclosure. Information disclosed for one purpose may be used for unrelated purposes without the subject's knowledge or consent (sometimes termed secondary use). For instance, information about a diagnosis taken from an individual's medical record may be forwarded to the Medical Information Bureau in Boston, Massachusetts (MIB, 1989; and see Kratka, 1990) and later used by another insurance company in an underwriting decision concerning life insurance. Redisclosure practices represent a yawning gap in confidentiality protection.

As a practical matter, policing redisclosure of one's personal health information is difficult and may be impossible. At a minimum, such policing requires substantial resources and commitment. With the use of computer and telecommunications networks, an individual may never discover that a particular disclosure has occurred, even though he or she suffers significant harm—such as inability to obtain employment, credit, housing, or insurance—as a result of such disclosure. Pursuing legal remedies may result in additional disclosure of the individual's private health information.8

Fourth, in some instances federal law preempts state confidentiality requirements or protections without imposing new ones. For example, the Employee Retirement Income Security Act (ERISA) preempts some state insurance laws with respect to employers' self-insured health plans, yet ERISA is silent on confidentiality obligations. Because 74 percent or more of employers with 1,000 or more employees manage self-insured health plans (Foster Higgins, 1991, in IOM, 1993e), such preemption is particularly troublesome.

Last, enforcing rights through litigation is costly, and money damages may not provide adequate redress for the harm done by the improper disclosure.

Security

In the context of health record information, confidentiality implies controlled access and protection against unauthorized access to, modification of, or destruction of health data. Confidentiality has meaning only when the data holder has the will, technical capacity, and moral or legal authority to protect data, that is, to keep such information (or the system in which it resides) secure (NRC/CBASSE, 1993). Data security exists when data are protected from accidental or intentional disclosure to unauthorized persons and from unauthorized or accidental alteration (IOM, 1991a).

In computer-based or computer-controlled systems, security is implemented when a defined system functions in a defined operational environment, serves a defined set of users, contains prescribed data and operational programs, has defined network connections and interactions with other systems, and incorporates safeguards to protect the system against a defined threat to the system, its resources, and its data. More generally, protective safeguards include:

  • hardware (e.g., memory protect);
  • software (e.g., audit trails, log-on procedures);
  • personnel control (e.g., badges or other mechanisms to control entry or limit movement);
  • physical object control (e.g., logging and cataloging of magnetic tapes and floppy disks, destruction of paper containing person-identifiable printouts);
  • disaster preparedness (e.g., sprinklers, tape vaults in case of fire, flood, or bomb);
  • procedures (e.g., granting access to systems, assigning passwords);
  • administration (e.g., auditing events, disaster preparedness, security officer); and
  • management oversight (e.g., periodic review of safeguards, unexpected inspections, policy guidance).

The collective intent of these safeguards is to give high assurance that the system, its resources, and information are protected against harm and that the information and resources are properly accessed and used by authorized users.
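Two of the software safeguards listed above, log-on procedures and audit trails, can be illustrated together in a short sketch. This is a minimal toy example, not an implementation from the report; all names (`RecordStore`, `dr_lee`, the patient identifier) are hypothetical.

```python
from datetime import datetime, timezone

class RecordStore:
    """Toy record store illustrating two software safeguards:
    an authorization check at access time (log-on control) and an
    audit trail that records every access attempt, allowed or denied."""

    def __init__(self, authorized_users):
        self._records = {}                     # patient_id -> record text
        self._authorized = set(authorized_users)
        self.audit_log = []                    # the audit trail

    def _audit(self, user, action, patient_id, allowed):
        # Every attempt is logged, including refused ones, so that
        # later administrative review (auditing events) is possible.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user, "action": action,
            "patient": patient_id, "allowed": allowed,
        })

    def write(self, user, patient_id, text):
        allowed = user in self._authorized
        self._audit(user, "write", patient_id, allowed)
        if not allowed:
            raise PermissionError(f"{user} is not authorized")
        self._records[patient_id] = text

    def read(self, user, patient_id):
        allowed = user in self._authorized
        self._audit(user, "read", patient_id, allowed)
        if not allowed:
            raise PermissionError(f"{user} is not authorized")
        return self._records[patient_id]

store = RecordStore(authorized_users={"dr_lee"})
store.write("dr_lee", "p001", "note text")
store.read("dr_lee", "p001")
try:
    store.read("intruder", "p001")             # refused, but still logged
except PermissionError:
    pass
print(len(store.audit_log))  # 3 attempts logged, including the denied one
```

The point of the sketch is that the audit trail is written before the authorization decision takes effect, so denied attempts leave the same evidence as successful ones.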

Health-Related Information

In a study that focuses on the protection of health-related data about individuals, defining which items are health related is more difficult than one might initially think. The most obvious categories are medical history, current diagnoses, diagnostic test results, and therapies. Other pieces of information are more distantly related to health—because of what one might infer about a person's health. Examples include type of specialist visited, functional status, lifestyle, and past diagnoses. Nevertheless, not everything in a medical record is relevant to health status or is health related.

Insurance coverage and marital status are cases in point. Some elements could nevertheless be considered sensitive because of the social stigma that could result if they are revealed. Examples include sexual preference, address, or the receipt of social services.

The same disclosure might be harmful to one individual but not another, or harmful to an individual in one circumstance but not in another. Personal data, particularly health-related personal data, are not inherently sensitive, but they become so because of the harmful way(s) in which they might be used. Thus, any data element in medical records, and many data items from other records, could be considered either health-related or sensitive, or both. Where the boundaries for the protection of personal health information lie is not at all obvious. In considering the actions of HDOs, this committee takes a relatively broad view of health-related data; it proceeds from an assumption that all information concerning an individual and any transactions relating directly or indirectly to health care that HDOs access or maintain as databases must be regarded as potentially requiring privacy protections.9

EXPANDED DEFINITIONS

The foregoing discussions of confidentiality are based on historical, ethical, and legal usage and have served to guide legislators and practitioners. Legally and medically, confidentiality has been treated as arising from a relationship such as that between physician and patient or attorney and client. Such usage may not be as useful to administrators, vice presidents for data processing, or system designers who must design HDO systems and are working not with relationships but with access to secondary records.

The committee suggests, therefore, that an expanded interpretation using a taxonomy that is not derived from interpersonal or interprofessional relationships might be more helpful to those responsible for protection of information in these HDOs. In this taxonomy one begins with data confidentiality, defined as the status accorded data that have been declared to be sensitive and must be protected and handled as such. The rationale for the statement about sensitivity is based on potential harm to people, potential invasion of privacy, and potential loss of entitlements or privileges.

Two consequences flow from defining data as sensitive and requiring protection. First, the data must be made secure; second, access must be controlled. As described earlier, data security includes system and network protection and assures the integrity of data, such that they are not altered or destroyed accidentally or intentionally. Some system security safeguards (e.g., control of personnel) also assure data integrity.
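The integrity requirement, that data not be altered without detection, can be sketched with a stored cryptographic digest. This is an illustrative assumption, not a mechanism described in the report; the record strings are invented.

```python
import hashlib

def fingerprint(record_text: str) -> str:
    # A digest stored separately from the record lets a data holder
    # detect accidental or intentional alteration (integrity, not secrecy).
    return hashlib.sha256(record_text.encode("utf-8")).hexdigest()

original = "p001: myringotomy, 1993-04-12"
stored_digest = fingerprint(original)      # kept apart from the record

tampered = "p001: mastectomy, 1993-04-12"
print(fingerprint(original) == stored_digest)   # True: record unchanged
print(fingerprint(tampered) == stored_digest)   # False: alteration detected
```

A digest of this kind detects alteration but does not prevent it; prevention still depends on the access controls and personnel safeguards listed earlier.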

The second consequence of declaring data sensitive—the need for access control—is related to the concept of informational privacy described above. Access control can be operationalized by HDO planners and legislators in a form that this committee would term "information-use policy." Information-use policy in the automated system context gives rise to decisions about who can do what, with which data, and for what purpose. It leads to policymaking about who may be allowed to use health-related information and how they may use it. It also requires decisions about how health information can be used as a matter of social policy and might also include consideration of whether some data should be collected at all.
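An information-use policy of the kind described, decisions about who can do what, with which data, and for what purpose, can be sketched as a default-deny lookup table. The roles, data categories, and purposes below are hypothetical illustrations, not categories drawn from the report.

```python
# Hypothetical information-use policy: (role, action, data category,
# purpose) -> allowed. Entries absent from the table are refused.
POLICY = {
    ("treating_clinician", "read", "clinical", "treatment"): True,
    ("billing_clerk",      "read", "claims",   "payment"):   True,
    ("employer",           "read", "clinical", "personnel"): False,
}

def permitted(role, action, category, purpose):
    # Default-deny: any combination not explicitly allowed is refused,
    # which also covers uses no one anticipated when the policy was set.
    return POLICY.get((role, action, category, purpose), False)

print(permitted("treating_clinician", "read", "clinical", "treatment"))  # True
print(permitted("employer", "read", "clinical", "personnel"))            # False
```

Expressing the policy as explicit tuples makes the social-policy decisions visible: adding a new use requires a deliberate new entry rather than a silent default.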

The three issues—data confidentiality, security, and information use—are obviously related (Figure 4-1). They overlap to some extent and collectively represent the area of direct concern in this report. One reason to keep the three issues separate is that different remedies are relevant to each.

FIGURE 4-1

A new taxonomy of data confidentiality, security, and informational privacy.

Data confidentiality is a matter of law and regulation. Legislation would be required to establish that health-related information is confidential, to spell out the rationale for the position, and to clarify the ramifications and consequences of attaching protection to health data.

Security is a matter of technology, management controls, procedures, and administrative oversight. In the public sector, the action agents are regulators; in the HDO, a policy and oversight board could establish security policies. Implementation and management would be provided by technical and system design personnel.

Informational privacy (information use) is the most difficult to sort out.

The nation needs to agree on the proper use of health-related information. It is not yet clear how this will or can be done, nor is it obvious who the action agents will be. At the level of the HDO, information use would be decided by the governing board. At a regional or national level, federal agencies, legislators, professional bodies, consumer advocates, and industrial lobby groups are all likely to be involved.

In the remainder of this chapter, the terms confidentiality (rather than the more cumbersome data confidentiality), privacy (rather than informational privacy), and security continue to be used, but the committee intends that they be understood in the context just described. The committee believes this conceptualization will make it easier to translate recommendations into policies and procedures that can be implemented and enforced.

HARM FROM DISCLOSURE AND REDISCLOSURE OF HEALTH RECORD INFORMATION

Very little systematic or empirical evidence supports the widespread perception of the threat or the reality of harm from disclosure and redisclosure of health-related information. This is because the origins of the information may not be known by the person harmed, because of the natural preference not to further publicize confidential information about oneself, and because inquiry in this area has been to date more anecdotal than systematic.

This section presents examples of the potential confidentiality and privacy problems that might be encountered with health data, either in patient record form or in databases accessed or held by HDOs. Although these concerns cannot easily be quantified, reports to the committee during this study, cases mentioned in media such as the Privacy Times and the Internet-based Privacy Forum Digest, incidents known to or recounted to committee members, and similar inputs make clear that the threats and potential harm are real and not numerically trivial.

Among health leader respondents to the 1993 Harris/Equifax survey, 71 percent were somewhat or very concerned about threats to the confidentiality of medical records, and 24 percent were "aware of violations of the confidentiality of individuals' medical records from inside an organization that embarrassed or harmed the individual." Respondents identified test results and diagnostic reports as the most frequently disclosed information.

Of the responding public, some 27 percent believed that their own medical records had been improperly disclosed. That group identified health insurers most often as having been responsible (15 percent). Fewer respondents identified hospitals or clinics (11 percent), public health agencies (10 percent), and employers (9 percent).

This section describes three categories of disclosure of patient information common today and the problems and harm that may result: (1) common disclosures that are breaches of confidentiality; (2) covert, illegal, or unethical acquisition and use of information; and (3) harm from disclosure of inaccurate data. It also raises questions about unforeseen uses of databases accessed by HDOs.

"Common" Disclosures

Three types of common disclosures pose threats: inadvertent, routine, and rerelease to third parties.

Inadvertent Release

A form of disclosure that the committee has termed "unthinking" often occurs within medical institutions. Examples include discussions with or about patients within earshot of other patients in waiting areas and discussions of cases in elevators, halls, cafeterias, and social settings. Disclosure related to the human penchant for gossip and carelessness in leaving medical records "lying around" or leaving information displayed on computer terminals is common. Westin (1972) concluded that such disclosures (sometimes to patients' relatives or friends) were less likely to be related to automated databases than to common indiscretion by hospital workers and health care providers. As the nation moves into yet more sophisticated telecommunication systems, such disclosure can include leaving detailed patient information on answering machines, sending information on fax transmissions that accumulate in common areas, or holding conversations about patients or dictating patient histories or notes about patient visits over cellular telephones.

If the diagnosis stigmatizes or indicates a disabling or fatal condition, harm can be especially serious. The harm can be great both to the famous and "VIP" patient and to noncelebrities, especially for coworkers or patients in a small community. One well-known case involved a staff physician whose HIV status became known in his hospital when his diagnosis was discussed by hospital personnel (Estate of Behringer v. Medical Center at Princeton, 249 N.J. Super. 597, 592 A.2d 1251 [1991]).

The committee believes many safeguards exist that can and should be put in place in any health care institution or HDO to anticipate and prevent disclosures of this sort. Preventing disclosure requires greater sensitivity to confidentiality issues and better training of health care workers. The Mayo Foundation, for example, has successfully developed and maintained a culture of adamant protection of the confidentiality of its patients' health information (Mayo Foundation, 1991).

''Routine" Releases or Uses in Accordance with Prevailing Practices

Health information is frequently shared without the knowledge of the individual on the basis of "uninformed" or "blanket" consent. In addition to the consent to disclose information routinely obtained from a patient before care is administered or when enrolling in a health insurance plan, another example of data disclosure is the wholesale photocopying of medical records that are forwarded to insurers, when much of the information does not pertain to the given insurance claim. The committee believes that the ability to prevent inappropriate release and use (misuse) must be strengthened. Such protections for data in HDOs are at the heart of this report and its recommendations.

Rerelease to Third Parties Without the Subject's Knowledge or Consent (Secondary Use)

The "secondary use" principle is an important component of fair information practices. It reflects the notion that when personal information is collected for a particular purpose the information should be used for only that purpose or a compatible one.10

An especially troublesome problem is the difficulty of confining the migration of information to third, fourth, or fifth parties without the individual's knowledge or consent. Examples include the sharing of health record information within organizations in one industry (e.g., between the health insurance and life insurance division of a company or between the personnel benefits division and the personnel or supervisory unit of an employer). Other examples include sharing between organizations in one industry (e.g., between the Medical Information Bureau and a second insurer). Yet a third sort of sharing can occur between organizations in two different industries (e.g., between insurer and credit bureau or between a current employer and a potential employer). A final example involves sharing genetic information with relatives who are at risk of an inherited disorder.

A major concern among commentators writing about the collection and storage of genetic information is that there will be increased pressure on the holders of such information to reveal to other patients and their physicians information gained about family members. These individuals might want to assess their own genetic risks of inheritable disease or use the information when making reproductive decisions. Some indeed have argued that there is an exception to physician duties of confidentiality analogous to that of duties to warn (or protect) people at risk from those with psychiatric disorders or HIV infection.

Given the growth in fringe benefits offered by employers and their consequent stake in managing the costs of such benefits, there are few limitations on information that can be gathered for use in administering health, disability, and pension plans. Committee members were told repeatedly that self-insured employers are given access, when they insist, to patient-identified health claims information. Indeed, some third-party administrators (TPAs) provide human resources personnel with dial-in capability to perform their own analyses of data concerning a firm's employees and dependents (personal communication, third-party software and services vendor, 1993). Whether employers have the right to data incident to the health care for which they have paid is highly debatable, but this rationale is commonly accepted by TPAs under the pressure of competition, and there is great risk that data will not be partitioned from use in personnel actions. The 1993 Harris/Equifax survey confirmed the public's concern about this problem. Forty percent were somewhat or very concerned that their job might be affected if their medical claims information was seen by their employer. Another example of information that is in some ways mandated and also creates a database of problematic information is that compiled by medical review officers in connection with employee drug-testing programs.

Although corporate and professional ethics tend to discourage abuse, few barriers exist to an employer's use of its employees' medical and insurance claims records. The threat of liability under the Americans with Disabilities Act has served as a brake on some employers' access to and use of their employees' health records. In addition, some state laws limit access. Employers, however, may be required by federal or state regulations to access records in order to identify employees who pose threats to security. Information available under such permission may pertain to spouses and dependents as well as employees.

The committee believes secondary use of medical information by employers is common and may be increasing as employers seek to find ways to manage high-cost cases, to adjust their benefit packages to control their health care exposure,11 and perhaps even to identify or terminate high-cost employees or those with high-cost dependents. Real or potential harm ranges from the inconsequential to the calamitous. It is likely that the ability to limit secondary use can be strengthened, and ways to accomplish this are at the heart of the committee's concerns.

Covert Acquisition and Use of Data for Illegal or Unethical Purposes

Another problem involving acquisition and use of medical information occurs covertly through illegal or unethical means. Examples include information brokers who tap into computerized systems by using false names or by bribing database employees to supply information about celebrities or the names of individuals with certain characteristics. In health care institutions, there is also a risk that employees will browse through medical records out of curiosity (as tax and credit bureau employees have done).

The aggregation of databases alters the character of the threat to confidentiality. Celebrities have long been vulnerable to loss of privacy through both paper and computerized searches, as documented by Rothfeder (1992). The new vulnerability posed by computerized searches is to those who until now have been (relatively) anonymous. That is, information brokers seek to identify not information about an identified individual but the identities of individuals with given characteristics (e.g., those with a diagnosis of AIDS or women who have had an abortion).

Isikoff (1991) describes the growth of the information-broker industry, which boasts instant access to a range of confidential computer data—credit reports, business histories, driver's license records, Social Security records, and criminal history backgrounds. Some of these records are public, but some are in government and private computer databases; in the latter, illegal access may involve insiders (e.g., employees of the Social Security Administration, police and other law enforcement employees). Of particular concern is the problem of unauthorized disclosure by often low-paid individuals who have legitimate access to information but who use it to facilitate illegitimate searches or to profit from the sale of records—a practice some have termed "insider information trading" and known to data system security specialists as the "insider threat."

Hendricks (1992) described a recently published hacker's manual for penetrating TRW's credit bureau database; it was complete with dial-up numbers, codes, and methods for persuading credit bureau subscribers to divulge their passwords over the phone. He described how the traditionally youthful hackers have been supplanted by profit-oriented criminal enterprises and by individuals who proclaim the right to conquer and destroy the "system" and its laws and to damage individuals for excitement and profit. Those who are determined to break into a system can be thwarted only with thoughtful and comprehensive system safeguards.

Although harm from this source is likely to occur rarely in comparison with others, the harm can be great because so many individuals are affected. Further, the data holder can be severely damaged in the public's eye. One goal for an HDO must be to assure the public of reasonable, if not absolute, safety.

Release of Inaccurate Data

A different harm can result from release of information when data are incomplete, inaccurate, or out of date. Examples are medical records or insurance claims on which diagnoses are listed or coded incorrectly (e.g., mastectomy for myringotomy). Other problems involve diagnoses that were considered at one time and ruled out but are still listed as a final diagnosis, incorrect inferences drawn from diagnostic tests, and clinical distortions that result from coding limitations. Data inaccuracies also arise from actions that are intended to be beneficial—for example, to protect the patient from a stigmatizing diagnosis, to permit insurance reimbursement for a test or procedure that might otherwise not be covered (as in the case of preventive and screening tests), or to allow a frail patient to be treated on an inpatient rather than outpatient basis.

The committee does not know how often these irregularities occur. Studies of the accuracy of medical records consistently show unintentional and sometimes intentional errors (Burnum, 1989), and medical records personnel and researchers report that errors and omissions are extremely common in all health records. Harm from such problems may range from trivial to severe. Any reliance on databases for such social benefits as credit ratings or life insurance means that data that are incomplete, inaccurate, or false (for example, when records of several different people are combined) are not merely useless but pernicious. Such errors and omissions were not a major focus of the study committee. It should be noted, however, that the converse of this problem is that the more accurate and comprehensive the databases, the more pressure there will be for access, which in turn raises the chances of harm in the other categories already discussed.

Harm resulting from inaccurate or out-of-date data can be mitigated or prevented in a number of ways, including adequate and regular attention to the reliability and validity of database contents as described in Chapter 3. Allowing individuals to obtain, challenge, and correct their own records can also help to improve their accuracy.

PRIVACY INTERESTS AND HDOs

HDOs may pose a threat to privacy interests in four ways. The first arises through harm from secondary use. This includes the potential for stigmatizing and embarrassing patients; adversely affecting their opportunities for employment, insurance, licenses, and other benefits; undermining trust and candor in the health care provider-patient relationship; and defeating patients' legitimate expectations of confidentiality. Second is the unpredictable effect that will be produced by the mere existence of HDOs as described. Third, HDOs may exacerbate societal concerns about the emergence of national, centralized personal record databases, which may be perceived as a national identification system or dossier. Issues concerning the Social Security number and its analogs are especially pertinent here. Finally, HDOs will need to be mindful of the possible effect that research uses may have on privacy.

Foreseen and Unforeseen Circumstances

In addition to the current risk of breaches of confidentiality and the risk of harm from inaccurate data inherent in the paper record, the existence of any accumulation of valuable data will spawn new users, new demands for access, and new justifications for expanded access.

HDOs may unintentionally create a heightened risk of disclosure arising from the new forms of data becoming available through the HDOs, new inquiries and types of inquirers, new uses, and new legal and governance structures. The mere presence of the HDO may, over time, encourage new practices or changes that harm at least some segments of the population.

An HDO must also realize that the more information it holds or can access, and the more valuable that information, the greater the temptation will be for others to acquire and covertly use it. An HDO database becomes, in some sense, like a swimming pool or an abandoned refrigerator to a child: an overwhelming opportunity or, in legal terms, an attractive nuisance.

Computerization poses problems for the protection of privacy and confidentiality, but it also offers new opportunities for protection. For example, access to records and to defined parts of records can be granted, controlled, or adapted on a need-to-know (or function-related) basis; this means that users can be authorized to obtain and use only information for which their access is justifiable. It will also be possible to implement authentication procedures (discussed below) and to implement and publicize the use of methods to permit the HDO to know if anyone has browsed in the databases, who has done so, and which data were accessed. Automation could also greatly mitigate the disclosure that now occurs when, for instance, an entire medical record is copied to substantiate a claim for a single episode of care, and software could prevent the printing or transfer of database information to other computers.
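The need-to-know access and audit-trail ideas above can be sketched in a few lines of code. This is a minimal illustration only; the roles, field names, and registry below are invented for the example and do not describe any actual HDO system.

```python
# Hypothetical need-to-know filter: each role sees only the record fields
# its function requires, and every access is written to an audit trail.
ROLE_FIELDS = {
    "billing":   {"patient_id", "procedure_code", "charge"},
    "clinician": {"patient_id", "diagnosis", "medications", "procedure_code"},
    "research":  {"diagnosis", "procedure_code"},  # no direct identifiers
}

AUDIT_LOG = []  # lets the HDO know who browsed and which fields were released

def view(record, user, role):
    allowed = ROLE_FIELDS.get(role, set())
    released = {k: v for k, v in record.items() if k in allowed}
    AUDIT_LOG.append((user, role, sorted(released)))
    return released

record = {"patient_id": "MRN-0001", "diagnosis": "J45.20",
          "medications": ["albuterol"], "procedure_code": "94010",
          "charge": 180.00}

print(view(record, "clerk-17", "billing"))  # no diagnosis or medications released
```

The same filter table doubles as documentation of access policy: a request for a field outside a role's set is simply never released, and the audit log records what was.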

A National Identification System or Dossier

Privacy advocates can be expected to express sharp concerns about the potential for HDOs to be linked with one another or with other types of personal databases such as the financial, credit, and lifestyle databases maintained by consumer reporting agencies and information services. One particular threat is the possible contribution of linked databases to development of a de facto national identification (and data) system. Such a system would comprise a comprehensive, automated dossier on virtually every citizen.

Conventional wisdom holds that after a personal information database is established, some consequences are inevitable (Gellman, 1984): expansions in permissible uses of the database; demands to link the database with complementary databases to improve the database product; and relaxations of confidentiality restrictions. With respect to HDOs, privacy advocacy groups and the media are likely to be concerned that over time various regional HDOs will establish telecommunications links and that these entities will become a national network linkable to other financial and government records such as those serving the Social Security Administration or the Internal Revenue Service.

As potential users of HDO data files, many persons in these groups might regard this scenario as desirable and beneficial; as potential record subjects, however, most would probably be uncomfortable with this threat to their privacy. Were such a system in place, some fear, those both with and without bona fide access would be able to call up a remarkably comprehensive and intrusive dossier comprising detailed biographic, family history, employment, financial, and insurance information and, unless prevented, medical record information about every citizen participating in the system.

In the view of many, this development would bring the nation perilously close to a national identification database. Indeed, at that point the "national" network would lack only a means of positive identification and a requirement that all citizens participate to constitute such a national identification system.12 HDOs will need to take steps to be certain that they do not contribute to these developments.

Although many people carry credit or health insurance cards and have no objection to doing so, others would view with alarm any requirement that participating consumers carry a special identification card, because such an instrument is thought to connote totalitarian values. In the former Soviet Union, for instance, all citizens were required to carry an internal passport and to produce it upon request. In this way the passport served not only to regulate internal travel but also as a means of identification and social control (Pipko and Pucciarelli, 1985). The Harris/Equifax survey found that the great majority of the public (84 percent) is willing to accept a personal identity card but has mixed feelings about being assigned a number—perhaps reflecting concern about whether such a number could be used to link their health information to other databases.

Admittedly, a few Western democracies employ national population registries and automated and centralized personal record data banks, but virtually all these systems are principally statistical and research systems, rather than systems that are used for administrative or investigative purposes. Moreover, even these primarily statistical and research systems "inspire fears about the expanded power of central government, vis-à-vis the legislature, the local administration, the private sector and most especially the citizen" (Bennett, 1992, p. 49). In Sweden, for example, the press harshly criticized the linkage of cancer registry databases and abortion record databases for medical research purposes (Stern, 1986).

Moreover, the European democracies that assign each citizen a unique personal identification number at birth have a long history with personal numerical codes. Even in Sweden and Germany, two countries that make extensive use of personal identification numbers to track individuals and link databases, such numbers are not used as a standard universal numeric identifier for participation in all aspects of society. In addition, reports of popular resistance in these countries to the use of universal numeric identifiers are increasing (Stern, 1986). As Bennett (1992) comments in Regulating Privacy:

The issue in these countries [Sweden and Germany] has been the incremental and surreptitious use of these numbers for ends unrelated to those for which they were created. Where proposals have been introduced for a new universal identifier accompanied by a personal identification card, such as in Germany and Australia, they have been met with strong resistance because of the belief that non-uniformity and non-standardization with all the attendant problems for administration, are vital to the maintenance of personal privacy. (p. 51)

Over the years the Congress, the press, and privacy advocates have fiercely resisted any proposal for the development of databases that appeared to facilitate establishment of a national identification or database system.

Some observers urge that entities like HDOs eschew the use of any type of positive identification, such as a biometric identifier, and avoid the use of the Social Security number. The aim is to minimize the likelihood that HDOs will contribute intentionally or unintentionally to a national identification system or to the development of a standard universal identifier (USDHEW, 1973). If the HDO initiative is viewed by opinion leaders as a precursor to the establishment of any type of automated, national identification or dossier system, the initiative will likely fail.

HDO proponents should take every practicable step to assure advocacy groups, the media, legislators, and the American people that the emergence of HDOs will not contribute to the development of a centralized, automated national dossier system or a national identification system through linkage with non-health-related databases or the gradual relaxation of confidentiality policies.

Personal Identifiers and the Social Security Number

The personal identifier (ID) an HDO uses to label each individual in its database is a crucial issue. The choice is shaped not only by past practices; it will also be strongly influenced, if not mandated, by the health care reform actions now under way in the nation.

Of necessity, identifiers are used in present health care systems. For practical purposes the identifier in many systems is either the person's Social Security number (SSN) or, as in Medicare, the Social Security number of an individual with a letter appended. Issues relating to the Social Security number are examined below (see also USDHEW, 1973).

An Ideal Personal Identifier

The ideal personal identifier must, whatever its design, minimize or eliminate the risk of misidentification. An ideal identifier would meet certain requirements, including the six discussed below.

First, it must be able to make the transition easily from the present recordkeeping environment to one that will prevail in HDOs (and under national health care reform). Further, organizations will need to know where to apply for new numbers, to verify numbers that patients give verbally, to track down uncertainties in identification, to find current mailing addresses, and to be able to backtrack errors and correct them. This requirement also has technical dimensions. For example, if a new identifier contains more digits or characters (or both) than the 10 used for the Medicare identifier, there will be software repercussions in many systems, and redesign of data-capture forms may be necessary.

Second, the identifier must have error-control features to make entry of a wrong number unlikely. Control implies that errors of many kinds are detectable and possibly correctable on the basis of the digits and characters in the ID alone. Ideally it will protect against transpositions of characters and against single, double, or multiple errors. At a minimum the error-control features must be able to indicate whether the ID is valid and to do so with high confidence (USDHEW, 1973).

Error control is certain to be a system-wide requirement in any automated system. It will involve not only the structure of the ID itself, but also the processing software (or the residual manual processes) in every system that will have to use and verify the ID.13

Third, the ID will have separate identification and authentication elements. The distinction between identification and authentication is made where strong security is required. "Identification" implies that (in this case) an individual indicates who he is, but authentication is a separate process with different parameters (known only to the individual) that allow the system to verify with high confidence that the identification offered is valid. Banks, for example, sometimes require an individual to provide his mother's maiden name; a personal identification number (PIN) is another authenticator. In many, perhaps most, medical systems this distinction is not made, and a simple identifier (e.g., insurance plan identification number) is presumed to be correct. In the future, however, some consideration should be given to separating these functions.
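The separation of identification from authentication can be sketched as follows. The registry, identifiers, and PIN scheme here are illustrative assumptions, not a description of any existing medical system; the point is only that the public identifier and the secret authenticator are handled differently.

```python
import hashlib, hmac, os

# Hypothetical registry: the identifier is public; the authenticator (a PIN
# known only to the individual) is verified without being stored in the clear.
_records = {}

def enroll(patient_id, pin):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    _records[patient_id] = (salt, digest)

def authenticate(patient_id, pin):
    if patient_id not in _records:      # identification: who the person claims to be
        return False
    salt, digest = _records[patient_id]
    candidate = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # authentication: proof of the claim

enroll("MRN-0001", "4729")
print(authenticate("MRN-0001", "4729"))  # accepted: identity claim verified
print(authenticate("MRN-0001", "0000"))  # rejected: wrong authenticator
```

Note that knowing the identifier alone (as with today's insurance plan numbers) gains an impostor nothing in this design; the system accepts only the pairing of identifier and authenticator.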

Error-control schemes can operate at any of several levels:

1. Identifying that something is wrong, but not indicating the error;
2. Detecting and locating a single error;
3. Detecting and locating multiple errors;
4. Detecting, locating, and correcting single errors; and
5. Detecting, locating, and correcting multiple errors.

The number of additional characters needed depends on the degree of error detection and correction that designers think is necessary for the circumstances. For example, a single check digit can signal that an error exists but cannot locate which digit is wrong or how. Moreover, it would not catch the common manual error of transposed characters, and it can sometimes report a bad number as good. For such reasons, a single check digit is not a very strong error-control mechanism.
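The weakness of a plain sum-based check digit can be demonstrated directly. The digit values below are arbitrary examples, and the mod-10 scheme is one of the simplest possible; stronger schemes weight the digits by position precisely to catch the transposition error shown here.

```python
def check_digit(digits):
    # Simple mod-10 scheme: choose the final digit so the total sum is 0 mod 10.
    return (-sum(digits)) % 10

base = [1, 2, 3, 4, 5, 6, 7, 8]
number = base + [check_digit(base)]

def is_valid(number):
    return sum(number) % 10 == 0

# A single mistyped digit changes the sum, so it is caught ...
corrupted = number.copy()
corrupted[0] = (corrupted[0] + 3) % 10
print(is_valid(corrupted))   # False: error detected (though not located)

# ... but transposing two adjacent digits leaves the sum unchanged,
# so the bad number is reported as good.
transposed = number.copy()
transposed[1], transposed[2] = transposed[2], transposed[1]
print(is_valid(transposed))  # True: the common manual error slips through
```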

The issue of designing error codes becomes complex rapidly. It is essential, however, to realize that any error-correction feature added to an established number such as the SSN will have hardware or software consequences—or both. The data fields in storage will have to be longer; the software will have to be modified to handle the longer fields; and additional software or hardware, or both, will have to be added to do whatever calculations the particular error-detection and -correction scheme requires.

Error control is a system-level problem, not just an issue of the identifier per se. If one can arrange procedures so that the identifier is always known to be correct at the time it is entered into the automated system, then the problem within the system itself becomes simpler. The method of providing the identifier will result in higher or lower assurance of its accuracy: an individual's memory is probably the least assurance of correctness; an embossed card is better; and an electronic reader for the card is better still.

The present health care information infrastructure runs largely without externally visible error controls. Although mainframes and communications equipment almost certainly have error controls to catch equipment malfunctions and communication faults, there is no error control on some, possibly much, of the data in the database. The system relies instead on people outside it—providers and patients—to detect errors, and it risks major mistakes in processing.

Fourth, the ID must work in any circumstance in which health care services are rendered, whether or not the situation was anticipated in the design of the system. At a minimum, the ID must never be an impediment to the prompt, efficient delivery of health care. For example, it must work when the patient requiring health care is not able to cooperate (e.g., is unconscious or does not speak the same language as the health care personnel) and regardless of the patient's mental and physical abilities.

Fifth, the ID must function anywhere in the country and in any provider's facilities and settings. By extension, it must also be able to link events that have occurred at multiple providers.
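In the simplest case, linking events that occurred at multiple providers reduces to grouping encounter records on the shared identifier. The provider names and encounter data below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical encounter lists from two unrelated providers, keyed by one ID.
provider_a = [("ID-42", "2024-01-03", "office visit")]
provider_b = [("ID-42", "2024-02-11", "lab panel"),
              ("ID-77", "2024-02-12", "x-ray")]

# Group all encounters by identifier to assemble one longitudinal record.
history = defaultdict(list)
for pid, date, event in provider_a + provider_b:
    history[pid].append((date, event))

print(sorted(history["ID-42"]))  # one patient's events from two sources
```

The linkage is only as good as the identifier: if the two providers had recorded different numbers for the same person, the grouping above would silently produce two partial histories.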

Sixth, the ID must help to minimize the opportunities for crime and abuse and perhaps help to identify their perpetrators.

Issues Relating to the SSN

When the initial Social Security law was passed in 1935, the SSN was called the SSAN, the number of one's "account" with the Social Security Administration (SSA). The SSA has always held that the SSN is not to be used as a personal identifier.14

In 1943 President Franklin Roosevelt signed an executive order requiring federal agencies to use the SSN whenever a new record system was to be established. The Department of Defense adopted it as a military identification number during World War II, and in 1961 the Internal Revenue Service (IRS) adopted it as the taxpayer identification number. When Medicare legislation was passed in the 1960s, the government adopted the SSN plus an appended letter as the Medicare health insurance number. Many experts regard this as a serious undermining of privacy protection because the many recordkeeping activities associated with health care delivery act to disseminate a piece of information that differs from the SSN by only an appended letter.

In the Privacy Act of 1974—largely in response to the position of a Department of Health, Education, and Welfare (USDHEW, 1973) committee that had studied the issue—Congress prohibited states from using the SSN for enumeration systems other than by authority of Congress; however, states already using it were allowed to continue. The Tax Reform Act of 1976 then undermined this position by authorizing the states to use the SSN for a variety of systems: state or local tax authorities, welfare systems, driver's license systems, departments of motor vehicles, and systems for finding parents who are delinquent in court-imposed child-support payments (OTA, 1986).

In short, the government has caused the proliferation in the use of the SSN, sometimes by positive actions but sometimes by indifference or congressional failure to act. Some government decisions, notably to use the SSN as the taxpayer identification number and as the basis of the Medicare number, forced its wide diffusion throughout the private sector through financial transactions and benefits payments. In this way—partly deliberately and partly inadvertently—a very sensitive item of personal information has become widely disseminated.

SSN Uses for Other Than Medical Payments

Organizations that use the SSN as a personal ID and that most citizens will deal with frequently include federal government agencies (e.g., the Social Security Administration for benefits, the Internal Revenue Service for taxes and withholding, the Health Care Financing Administration and its contractors for Medicare payments, and the Securities and Exchange Commission); educational institutions, which frequently use it as a student identifier for campus-wide purposes; state governments (e.g., for state taxes, property and other local taxes, driver and vehicle registrations, real property records, financial transactions, and Medicaid payments); and private organizations (e.g., providers for health care services, industry-support databases such as the Medical Information Bureau, mortgage and loan agencies, credit reporting organizations, real property records, and banks).

Organizations, especially those in the private sector, choose to use the SSN for a number of pragmatic reasons and for expediency. Organizations already hold the number legally in connection with tax, financial, and wage matters. Moreover, there are no prohibitions against its use as a personal identifier in the private sector. Individuals usually have an SSN, or they can get one easily. In addition, people have become accustomed to willingly providing an SSN when asked; hence, its acquisition is a matter of merely asking, not legal compulsion.15 Finally, administration of an enumeration system can be burdensome, and the choice of the SSN shifts that burden onto the government.

Although federal, state, or local governments usually require the SSN under law, private-sector requests serve the purposes and motivations of the organization. The essential point is that the SSN is in extraordinarily wide use as a personal identifier. As a result, any given person is indexed in a huge number of databases by his or her SSN, and an unknown number of linkages and data exchanges among such databases are routine business.

If health care reform were to mandate a patient ID that is either the SSN or a closely related number, it would in effect force the last step of making the SSN a truly universal personal ID. This is the issue that launched the DHEW committee in 1970 (PPSC, 1977a).

Shortfalls of the SSN as an Identifier

The choice of a personal ID that satisfies the operational needs of health care delivery but at the same time assures the confidentiality of medical data and the privacy of individuals is neither easy nor casual. Superficially, the choice would be the SSN, the Medicare number, or something similar, simply because people are accustomed to using them and systems are already built to handle them. The government would bear the burden of administering the enumeration system, and the HDO would avoid the cost of creating a new one. For information management, however, the shortfalls of the SSN are well known. The following list is representative of the problems.

1. Any 9-digit number rendered with hyphens in the appropriate places—that is, XXX-XX-XXXX—has a high likelihood of being a legitimate SSN that belongs to someone, or at least appears to. This provides little security, and data commingling can occur that would result in erroneous records, mistaken conclusions and actions, and incorrect payments.

2. The allowable entries in each of the three groups in an SSN are well known. Thus, it is easy to counterfeit an SSN and have a high probability that it will not be challenged.

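The first two shortfalls can be demonstrated in a few lines. The filtering rules below (area 000, 666, and 900 and above; group 00; serial 0000 treated as never assigned) reflect commonly cited SSA conventions and should be read as assumptions for the sketch, not an authoritative validation procedure.

```python
import re

SSN_RE = re.compile(r"^(\d{3})-(\d{2})-(\d{4})$")

def plausible_ssn(s):
    """Format check only: any well-formed XXX-XX-XXXX string passes."""
    return SSN_RE.fullmatch(s) is not None

def not_obviously_invalid(s):
    """Also reject ranges assumed never to be assigned (area 000/666/900+,
    group 00, serial 0000). A counterfeit within allowable ranges still passes."""
    m = SSN_RE.fullmatch(s)
    if not m:
        return False
    area, group, serial = (int(g) for g in m.groups())
    return area not in (0, 666) and area < 900 and group != 0 and serial != 0

print(plausible_ssn("123-45-6789"))          # True: looks legitimate
print(not_obviously_invalid("123-45-6789"))  # True: counterfeit survives filtering
print(not_obviously_invalid("666-12-3456"))  # False: an assumed-unassigned area
```

The lesson is that nothing in the number itself distinguishes a genuine SSN from a fabricated one drawn from the allowable ranges; validity can be established only by checking against the issuing authority's records.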