The planning of cryptographic solutions for messaging and electronic commerce applications in the United States during the past few years has been motivated by a high level of interest in the technology on the part of potential users, and it has been marked by a high level of controversy over algorithms, patent rights and escrow policy. The diverse needs of the government and commercial sectors have led to mutually exclusive solutions based on different algorithms and policies, a phenomenon largely unique to the United States. Because of the strong requirement to preserve the differences that suit these solutions to their respective environments, the near-term evolution of a single standard appears unlikely. At the same time, some government agencies and some commercial establishments need to operate in both environments. This paper deals with the technical definition of and design approach to a dual-use cryptographic device, and with the migration paths to the dual-use device from both environments. Such a device is further considered as a component of a secure cryptographic translation facility.
During the past few years a debate has been raging over the conflict between the right of individuals and industry to keep information private, in order to protect personal privacy and foster competitiveness, and the Government's need to access information for national security and law enforcement purposes. At the heart of the debate is the difficulty of arriving at a position whereby a robust implementation of an encryption process can be accomplished that protects sensitive private or industrial data yet ensures that the government can have access to information if that information is part of a criminal conspiracy or enterprise, or other action hostile to the United States. The U.S. Government initially tendered a cryptographic scheme known as the Clipper escrow key management plan. Using communication encryption technology, the government wanted to mandate an encryption process for which it maintained the key used for decrypting information transiting any communications path. This key would be split and distributed to escrow agents, and the splits would have to be combined before the government could use the key to decrypt and monitor criminal or other such activities. That methodology met with howls of protest from much of U.S. society (industry and private citizens alike) due to a certain mistrust of the government and its handling of private information (for example, various IRS scandals). The debate has since shifted from the technical solution provided by the Clipper initiative to alternate methods that define key escrow in terms of a commercial or private entity. A NIST-sponsored key escrow meeting was held on August 17, 1995 to hear the government's proposal to work towards a solution which industry and the international community would accept and which would provide needed security to private information. The meeting was a positive step towards resolving the conflicting issues surrounding cryptography.
The government's proposal to extend to 64 bits the key length of cryptographic algorithms eligible for export is a small step towards industry's desire for "good" cryptography. As a result, the government has set the stage to extend cryptography into the broader international field of electronic commerce. Privacy is still an issue and must be included in the resultant key escrow solution. Since the very onset of the debate, TECSEC has espoused private key escrow as the only method of managing cryptography that individuals, industry and the international community would accept. A split key method has been developed that could satisfy many of the issues. The technology is called Constructive Key Management ("CKM"); the resultant product using CKM is called VEIL. VEIL is a software key management design that uses multiple key splits, with labels as cryptographic triggers. It offers complete administrative control of the key, and it includes an inherent method of constructing the key used for encrypting a file or database that results in a fixed header and audit information. By defining the roles of the escrow agent as a mix between government and private parties as necessary, VEIL can be applied to solve the private key escrow question.
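As a rough illustration of the split-key idea, and not TECSEC's proprietary CKM design (the function names, labels, and the hash-based combining step here are all assumptions made for the sketch), a working key can be derived from several labeled splits held by different parties, so that no single holder can reconstruct the key alone:

```python
import hashlib
import secrets

def make_splits(n):
    """Generate n independent random 256-bit key splits."""
    return [secrets.token_bytes(32) for _ in range(n)]

def combine_splits(splits, labels):
    """Derive a working key by hashing every split together with its
    label; all splits (and all labels) are required, so no single
    holder can reconstruct the key alone."""
    h = hashlib.sha256()
    for label, split in zip(labels, splits):
        h.update(label.encode())
        h.update(split)
    return h.digest()

# Example: a three-way split among an organization, a user, and an
# escrow agent (hypothetical roles for illustration).
labels = ["org", "user", "escrow"]
splits = make_splits(3)
key = combine_splits(splits, labels)

assert key == combine_splits(splits, labels)           # deterministic
assert key != combine_splits(make_splits(3), labels)   # splits matter
```

A real escrow design would add controlled recovery of the splits themselves; this sketch only shows why escrowed splits must be recombined before any decryption is possible.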
A variety of lessons learned while working to protect sensitive but unclassified data are discussed, and some methods of protecting this data are mentioned. Sensitive data is present in every organization, and is usually not given adequate protection. This paper defines sensitive but unclassified data and discusses where it can be found. A variety of threats against this data are then detailed, followed by some solutions to the problem of protecting against these threats. The authors' experiences with implementing these solutions are considered.
Protection against hostile algorithms contained in Unix software is a growing concern without easy answers. Traditional methods used against similar attacks in other operating system environments such as MS-DOS or the Macintosh are insufficient in the more complex environment provided by Unix. Additionally, Unix presents a special and significant problem in this regard due to its open and heterogeneous nature. These problems are expected to become both more common and more pronounced as 32-bit multiprocess network operating systems become popular. Therefore, the problems experienced today are a good indicator of the problems, and the solutions, that will be experienced in the future, no matter which operating system becomes predominant.
This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global `Infosphere' presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, and applications stores, transmits, and processes information. This information is now recognized as an asset to governments, corporations, and individuals alike, and it must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast Local Area Network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks on their systems and networks.
As organizations consider connecting to the Internet, the issue of internetwork security becomes more important. There are many tools and components that can be used to secure a network, one of which is a firewall. Modern firewalls offer highly flexible private network security by controlling and monitoring all communications passing into or out of the private network. Specifically designed for security, the firewall becomes the private network's single point of attack for Internet intruders. Application gateways (or proxies) that have been written to be secure against even the most persistent attacks ensure that only authorized users and services access the private network. One-time passwords prevent intruders from `sniffing' and replaying the usernames and passwords of authorized users to gain access to the private network. Comprehensive logging permits constant and uniform system monitoring, and `address spoofing' attacks are prevented. The private network may use registered or unregistered IP addresses behind the firewall. Firewall-to-firewall encryption establishes a `virtual private network' across the Internet, preventing intruders from eavesdropping on private communications and eliminating the need for costly dedicated lines.
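The one-time-password defense against sniffing and replay can be sketched with a Lamport-style hash chain, the general technique behind schemes such as S/KEY (this is an illustration of the principle, not any particular firewall vendor's implementation; the chain length and names are arbitrary):

```python
import hashlib

def hash_chain(seed: bytes, n: int) -> bytes:
    """Apply SHA-256 to the seed n times."""
    h = seed
    for _ in range(n):
        h = hashlib.sha256(h).digest()
    return h

# Setup: the user keeps the seed; the server stores only the end of
# the chain, H^100(seed), and never sees the seed itself.
seed = b"user-secret-seed"
server_stored = hash_chain(seed, 100)

def login(server_state: bytes, otp: bytes):
    """Accept otp if hashing it once yields the stored value, then
    roll the stored value back one step.  A sniffed otp is useless:
    replaying it fails because the server state has moved on."""
    if hashlib.sha256(otp).digest() == server_state:
        return True, otp          # otp becomes the new stored value
    return False, server_state

# First login presents H^99(seed).  The server can verify it, but an
# eavesdropper cannot invert the hash to derive the next password.
otp99 = hash_chain(seed, 99)
ok, server_stored = login(server_stored, otp99)
assert ok
ok, server_stored = login(server_stored, otp99)   # replay attempt
assert not ok
```

Each successful login consumes one link of the chain, so a captured password authenticates exactly once and is worthless afterwards.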
This note describes six key pitfalls in the deployment of popular commercial firewalls. The term `deployment' is intended to include the architecture of the firewall software itself, the integration of the firewall with the operating system platform, and the interconnection of the complete hardware/software combination within its target environment. After reviewing the evolution of Internet firewalls against the backdrop of classical trusted systems development, specific flaws and oversights in the familiar commercial deployments are analyzed in some detail. While significantly costlier solutions are available that address some of these problems, the analysis is applicable to the overwhelming majority of firewalls in use at both commercial and Government installations.
In this report we review case histories of industrial espionage publicized in the media and in Congressional hearings. The threat to the United States as the world's largest investor in R&D is magnified by the transition from a cold war military confrontation of the superpowers to an economic competition in global markets. To sustain their market share, France, Japan and Russia have initiated national programs to acquire U.S. technical know-how. Former intelligence staff now distill fragments of sensitive information into meaningful knowledge to guide industrial and national efforts to attain dominance. This threat is amplified by the exponential proliferation of global communication networks, like the Internet, that reach into corporate America and permit unseen adversaries to probe the vast U.S. data stores for unprotected intelligence. Counterintelligence against industrial espionage by the United States on a national level is virtually impossible because of public scrutiny in our open society. On the positive side, the upheaval of a rapid transition from high tension and high economic stability to low tension and high economic instability is prompting international collaboration against international terrorism. On the corporate level, strategic alliances with foreign firms are expanding to sustain competitiveness and innovation in areas of specialty. A national security plan to protect U.S. information resources is needed, as is a viable policy to operate our information highways as safe conduits for electronic business. The well-being of the global economy, not just that of our nation, is at stake and should not be left to chance and provocation.
"In order to form a more perfect Union . . ." The Constitution of the United States of America, September 17, 1787. "We remain unwilling to impose any discipline upon ourselves that demands a change in lifestyle." David Halberstam, The Next Century, Avon Books, 1992. On May 14, 1787, George Washington, James Madison and Benjamin Franklin joined 52 other men representing twelve of the thirteen states to, in the words of Alexander Hamilton, "render the Constitution of the Federal Government." Through September of that year, the Constitutional Convention sought to write the instructional manual on how to run the United States of America. Two hundred years ago, the world was a very different place. The War with England was over and the United States (plural) were preparing for peace. They needed to forge unity amongst themselves, establish the new country as a viable international partner and build a strong defense for a secure future. Weaknesses in the original constitution, the Articles of Confederation, prompted the new nation's leaders to come to Philadelphia for four months of debate, emotional speeches and the honing of the Constitution. What they sought was a balance between monarchy and democracy. Despite the vast differences between the participants, and their diverging views of how to fashion a new nation and by what rules it should operate, they succeeded at their task. Signed by only 39 of the original 55 delegates, the Constitution of the United States became the supreme law of the land on June 21, 1788, with its ratification by nine of the thirteen states. But due to deficiencies in state and individual rights, the First Congress in 1789 submitted to the states twelve amendments to the Constitution. The ten surviving amendments are known as the Bill of Rights and are in many ways viewed as the bastions of personal freedom and liberty, perhaps more so than the original body of the Constitution itself.
These two documents - the Constitution and the Bill of Rights - have carried this nation forward for over two hundred years, suffering only 17 changes. Guided by their principles, America and its citizens have been able to navigate through good times and bad. Who would have thought they could have survived two centuries? Today, however, we see that our national migration into Cyberspace represents such a fundamental change in national lifestyle and culture that the existing tentacles of laws and legislation do not automatically apply.
The technology available today makes the design of the subject card both possible and practical. The latter is extremely important for the end item to achieve widespread use. Only a few years ago the design was possible, but implementation on a wide scale was not feasible or practical because the manufacturing cost would have made the production of the cards too expensive. Today, though, by employing the right techniques, one can design a PC Public Key Card that provides confidentiality (encryption), authentication (integrity), and non-repudiation in a tamper-resistant envelope. Note that the term 'resistant' is used because the author believes that the perfect design for a truly tamper-proof card is nonexistent: one will always be able to reverse engineer a design given time, money and talent. The intent of the design discussed in this paper is to make it extremely difficult, and therefore very expensive, for the person or persons attempting to reverse engineer the PC card. It is essential that the design consider tamper-proofing from the very beginning of the project. This means that the individual chips and/or storage devices that will comprise the design of the Public Key PC card must be designed or chosen for their inherent resistance to reverse engineering. This implies that the designer is extremely familiar with the technology available and with its ability to be manufactured on a wide-scale basis. One technology that comes to mind immediately is vROM, for its inherent resistance to reverse engineering. Remember the assumption that anything can be reverse engineered given the money, time and talent; the goal is to choose a technology that demands the most of each from the person or persons attempting to reverse engineer the design. vROM has this potential: it has proven itself through repeated attacks by one of the most sophisticated labs in the country.
Another desirable characteristic is the density of the cell family used in the design of the Application Specific Integrated Circuits (ASICs) that form the cryptographic engine, the heart and soul of the Public Key PC card. Fortunately, this characteristic can be measured by examining the cell structures of the foundries and going with the foundry that will meet the objective discussed above. Once the memory elements and the ASIC design are stabilized, the next level of tamper design must be addressed: the design and layout of the PC card itself. Again, the rule of picking the most difficult combination of techniques to frustrate the attacker is paramount. But one also has to be careful that the design is not so complex that it will be difficult to manufacture on a large scale; otherwise no one will be able to afford to use the PC card and it will become another expensive experiment. Fortunately, today's technology provides a way to achieve an extremely tamper-resistant PC card layout that lends itself to common wide-scale manufacturing techniques. Using techniques such as Tape Automated Bonding (TAB), flip-chip mounting, and solder bumping in the right combination, followed by encapsulation techniques similar to injection molding, one can provide a product that can not only be manufactured on a wide scale but is also inherently resistant to tampering and to the most sophisticated attempts to reverse engineer the end product. A side benefit of this type of design is that it is extremely rugged and hence can be deployed in the most hostile environments. The above is a very brief discussion of how one could go about the design of a tamper-resistant Public Key PC card that provides confidentiality, authentication and integrity for the user's data.
Such a card can be, and is being, built today by a team of highly talented high-tech companies, each of which brings its own expertise to bear on a particular aspect of the design mentioned above; together they produce a card that has no equal among products available now or in the near future. Because this is an ongoing effort, and due to the proprietary information involved in the design, additional details cannot be discussed until the required design protection is in place.
Security standards help users implement adequate protection in their systems. Independent, third-party conformance testing to security standards provides those users with a metric beyond vendor affirmation in determining conformance, and gives manufacturers the opportunity to claim conformance to a standard using a strong metric. However, neither standards nor testing programs can be successful without the support of the manufacturers and users they are intended to serve. This paper discusses a standard that was developed by NIST in conjunction with industry, federal and private sector users, and the validation program that provides users the necessary metric to determine conformance.
This report proposes a `Consumer Protection Act for Digital Products' to support electronic commerce and to control the increasing abuse and lack of security on the national information highways. Patterned after the `Food and Drug Act of 1906 (21 USC)' and subsequent legislation, a new agency similar to the FDA would have the authority `to develop administrative policy with regard to the safety, effectiveness, and labeling of digital products and their communications for human use, and to review and evaluate new applications of such products.' Specifically, it is proposed that standards, originally developed by the defense industry for the labeling, enveloping, and authentication of digital products delivered to the Government, be extended to promote global electronic commerce by protecting the intellectual property rights of producers, establishing their liability for the end-use of digital products, and giving consumers the means for informed decision making and purchase.
Recent efforts at the United Nations and in the United States to define the legal structures for electronic commerce are providing insights into the component elements of a global legal structure for commercial digital products. Critical among these elements are the role of originators in defining the rules for the transport, storage and use of standards-based digital messages, and the function of intermediaries as third-party resources for assuring the integrity of transactions in digital products.
This paper addresses the underlying legal issues that must be considered prior to engaging in the development of multimedia and Internet applications. The often overlooked topics presented in this paper will show the audience, both developers and end users alike, what is required to help pave the information highway. Topics that are covered are: who owns the software, the developer or the party that commissioned the work; whether the current intellectual property laws create a gap in protection of multimedia software; what are the rights and obligations of the software developer and the commissioner; what is one's exposure to liability and how it can be minimized; and, whether one can be held legally responsible for merely selling multimedia software. Multimedia law is a burgeoning area, and in many ways, our laws have not kept pace with technology. Therefore, those involved with developing multimedia applications must be extremely careful to protect their rights, and to avoid liability.
Cryptography is a set of mathematical techniques used to protect the secrecy of information sent over unprotected or undefendable channels. Although cryptography is thought to be as old as writing itself, developments over the past 20 years have greatly expanded its use and need. Today a variety of new cryptographic techniques, including public key cryptography and digital signatures, promise virtually unlimited privacy for our communications--and near certain proof when fraudulent information is sent masquerading as legitimate communications. Nevertheless, despite these advances in cryptography and communications systems, we seem to have less privacy now than ever before. Indeed, as we prepare to exit the 20th Century, our society seems determined to replace the protective value of personal privacy with a new regime that promises positive identification and authentication, and absolute accountability for our actions. Ironically, cryptography and digital signatures may play a strong role in bringing about this dystopian future as well.
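The "near certain proof" a digital signature provides can be illustrated with textbook RSA at toy scale. The 16-bit modulus and the bare hash-then-exponentiate scheme below are for arithmetic illustration only; real signatures use keys of 1024 bits or more plus a padding scheme, and the specific parameters here are assumptions chosen so the numbers are easy to check:

```python
import hashlib

# Toy RSA parameters: p = 61, q = 53, n = p*q = 3233, phi = 3120,
# public exponent e = 17, private exponent d = 2753
# (since 17 * 2753 = 46801 = 15 * 3120 + 1).
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    """Signer: hash the message, reduce mod n, raise to the
    private exponent (known only to the signer)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    """Verifier: undo the signature with the public exponent and
    compare against a fresh hash of the message."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

msg = b"pay $100 to Alice"
sig = sign(msg)
assert verify(msg, sig)                 # a genuine signature checks out
assert not verify(msg, (sig + 1) % n)   # a forged signature is rejected
```

Because only the holder of d can produce a value that the public exponent maps back to the message hash, a valid signature both authenticates the sender and binds the message content, which is exactly the double-edged accountability the paragraph above describes.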
This paper introduces the technology of two pioneering patents for the secure distribution of information and intellectual property. The seminal technology has been used in the control of sensitive material such as medical records and imagery in distributed networks. It lends itself to the implementation of an open architecture access control system that provides local or remote user selective access to digital information stored on any computer system or storage medium, down to the data element, pixel, and sub-pixel levels. Use of this technology is especially suited for electronic publishing, health care records, MIS, and auditing.
This paper describes a digital image archiving, retrieval and sales system. The system employs natural language processing technology to retrieve images based on their captions, and can be integrated into enterprise applications that include usage tracking and cropping information.
The types of losses organizations must anticipate have become more difficult to predict, both because of the eclectic nature of computers and data communications and because of the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized, intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause.
On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of crypto without control), Internet abuse (antisocial use of data communications), and international industrial espionage (governments stealing business secrets). A wide variety of safeguards are necessary to deal with these new crimes. The most powerful controls include (1) carefully controlled use of cryptography and digital signatures with good key management and overriding business and government decryption capability and (2) use of tokens such as smart cards to increase the strength of secret passwords for authentication of computer users. Jewelry-type security for small computers--including registration of serial numbers and security inventorying of equipment, software, and connectivity--will be necessary. Other safeguards include automatic monitoring of computer use and detection of unusual activities, segmentation and filtering of networks, special paper and ink for documents, and reduction of paper documents. Finally, international cooperation of governments to create trusted environments for business is essential.