Review of Federal Agency Computer Security and Privacy Plans

NCSL BULLETIN
OCTOBER, 1990

REVIEW OF FEDERAL AGENCY
COMPUTER SECURITY AND PRIVACY PLANS (CSPP): A SUMMARY REPORT

Sensitive information and information resources have become
increasingly important to the functioning of the federal
government. The protection of such information is integral to
the government serving the public trust. Concern that federal
agencies were not protecting their information caused Congress to
enact Public Law 100-235, “Computer Security Act of 1987” (the
Act). The Act reaffirmed the National Institute of Standards and
Technology’s (NIST) computer security responsibilities. These
responsibilities include developing standards and guidelines to
protect sensitive unclassified information, as well as providing new
governmentwide programs in computer security awareness training and
security planning.

The Act required federal agencies to conduct educational programs
to increase staff awareness of the need for computer security.
In the first year, agencies identified their computer systems
containing sensitive information and prepared and submitted security
plans for those systems to a joint NIST/National Security Agency (NSA)
review team for advice and comment. This document summarizes a report
on the review of the computer security and privacy plans submitted by
federal agencies.

How The Reviews Were Conducted

The Office of Management and Budget (OMB) issued OMB Bulletin 88-
16, “Guidance for Preparation and Submission of Security Plans
for Federal Computer Systems Containing Sensitive Information,”
to guide agencies on preparing and submitting computer security
plans. The bulletin specified the information that was to appear
in each plan. The bulletin further requested that agencies
identify systems as major application or general ADP support
systems. Finally, the bulletin gave each agency the option of
identifying any needs for guidance or technical support and of making
any other comments it thought appropriate. Although a four-part format
was specified, agencies were given latitude in presentation as long as
all pertinent information was present. This permitted agencies with
existing programs to submit current related documents. Submission of
an agency overview was optional, and most agencies chose not to
provide one.

The joint NIST/NSA review team examined 1,583 plans from 63
federal civilian agencies and 27,992 plans from 441 Department of
Defense (DoD) organizations. Most DoD submissions consisted
mainly of accreditation documentation prepared for other computer
security planning purposes. During the review process, the
review team recorded data about the systems for analysis. The
conclusions made in this report stem principally, but not
exclusively, from the civilian agency submissions.

Major Findings

The review team arrived at a number of conclusions about the
plans and the plan review process, seeing both many positive
signs and some areas for improvement. These findings include:

o The civilian agency CSPPs generally conformed to the
guidance given in OMB Bulletin 88-16. Many controls to
protect sensitive systems were already in place or
planned. These controls appeared consistent with
identified system functions, environment, and security
needs. However, some respondents appeared to have just
“checked the boxes,” perhaps presenting a falsely
optimistic picture.

o Many agencies appeared to report on isolated systems
rather than all systems subject to the Computer
Security Act and OMB Bulletin 88-16.

o Agencywide guidance on how to prepare the plans was not
clear. There was also some question as to whether a high-
level official had reviewed the plans, and the distribution
of agency-level computer security policy and guidance was
also unclear. Further, most plans did not reflect the
joint involvement of ADP, computer security, and
applications communities in computer security planning.

o Significantly, the plans rarely addressed the security
concerns associated with networking, interfaces with other
systems, and the use of contractors and their facilities. This
may reflect a general confusion about the boundaries
and limits of responsibility for a given system.

o Many plans equated sensitivity only with privacy or
confidentiality and did not fully address requirements
for integrity and availability.

o Most plans did not communicate an appreciation for the
role of risk management activities in computer security
planning.

o Although most agencies said they had computer security
awareness and training programs, many did not show that all
applicable employees received periodic training.

o Finally, the CSPP submission and review effort raised
the level of federal awareness regarding the need to
protect sensitive information and the importance of
computer security planning.

Recommendations for Agencies

Based on the needs that became apparent during the plan review,
the review team recommends the following:

o Agency management should ensure that computer security,
including the computer security planning process, has the
highest level of management involvement. Computer security
benefits from the multiple perspectives of, and input from,
agency information resources management, computer security,
and functional, user, and applications personnel.

o Agency management should identify and describe the
security needs of their systems that contain sensitive
information.

o Agency management should recognize the importance of
computer security and of computer security planning, and
should aggressively communicate this recognition to their
staffs, perhaps using their computer security awareness and
training programs as one of the vehicles.

o Agencies should incorporate computer security planning
with other information systems planning activities.

o Agencies should consider the protection requirements
for integrity and availability on an equal basis with
those for confidentiality.

o Agencies should assess risks, and select and implement
realistic controls throughout the system life cycle.
This requires an awareness of changes in system hardware
and software technology, as well as a knowledge of new
methods for protecting against and recovering from system
threats. In addition, agencies should fully document
in-place controls to ease periodic reevaluation,
internal audit, and oversight agency review.

o Agencies should implement certification and
accreditation programs. There is a lack of awareness
of guidance regarding certification and accreditation,
including FIPS PUB 102, “Guideline for Computer
Security Certification and Accreditation.” There is
also a lack of knowledge of the certification
requirements in OMB Circular A-130, “Management of
Federal Information Resources.” Agencies may use OMB
Circular A-130 as the basis for these programs.

o Agencies should clarify the boundaries and limits of
responsibility for each system, and should include, in
any planned risk assessment activity, full
consideration of the telecommunications and networking
environment and relationships with contractors and
other organizations.

o Agencies should stress security awareness and training
for their employees. This includes all employees
involved in the design, management, development,
operation, or use of federal computer systems
containing sensitive information.

o Agencies should develop computer security policy and
operating guidance. Such policy and guidance should
comprehensively address the broad view of computer
security reflected in the Computer Security Act, OMB
Circular A-130, and OMB Bulletins 88-16 and 89-17,
“Federal Information Systems and Technology Planning,”
and their successors. The policy should directly address the full
scope of computer security planning and risk management
activities. It must incorporate an application system
perspective and give more detailed consideration to
confidentiality, integrity, and availability protection
requirements.

What NIST is Doing

NIST is developing a strategy to help federal agencies identify and
protect sensitive information systems. This strategy shifts emphasis
to the implementation of computer security plans, particularly those
developed under OMB Bulletin 88-16, and provides for visits by OMB,
NIST, and NSA staff, who will offer direct comments, advice, and
technical aid focused on each agency’s implementation of the Act.

In addition to the agency visits described above, NIST has
initiated the following computer security projects to help
agencies more easily and effectively comply with the Computer
Security Act:

o NIST will develop standardized specifications and
language for federal government computer security
services contracts.

o NIST will develop a guidance document on computer
security in the ADP procurement cycle.

o NIST has recently published guidance on the use of
Trusted Systems.

o NIST will develop guidance on computer security
planning.

o NIST has developed, and will continue to operate, a
computer incident response center in order to address
viruses, worms, and other malicious software attacks.

o NIST will support and coordinate computer security
resource and response centers nationwide.

o NIST will enhance and operate the National Computer
Systems Laboratory (NCSL) Computer Security Bulletin
Board System.

o NIST will operate the NIST/NSA Risk Management
Laboratory and prepare further guidelines on risk
management.

o NIST will develop guidance and recommendations on
assuring information integrity in computer systems.

In addition to the above plans, NIST has already developed a
number of guidelines and other resources to help federal managers
secure their computer systems.

Future Directions

Federal managers have computer security requirements that are
similar to those of their counterparts in the private sector. We believe
that private sector organizations can learn and benefit from the
federal experience in implementing the Computer Security Act. In
both environments, a vigorous computer security awareness program
is important at all levels in the organization. Also, in both
environments, the active involvement of user, management, ADP,
and computer security communities in computer security planning
could help remove some of the existing and potential barriers to
effective computer security. Such collective involvement would
also help ensure cost-effective control measures commensurate
with system function, system sensitivity, security requirements,
and analyzed and considered risks.

Agencies need to be aware of developments taking place in the
national and international standards arena on system
interoperability and data interchange. These developments will
affect information system product availability, protection
requirements, and protection alternatives as agencies conduct their
near-, mid-, and long-term IRM and computer security planning.

Finally, because agency awareness of problems is fundamental to
the solution, this project has been valuable. Computer security
officers say that the CSPP preparation and review activity has
raised the level of awareness in all parts of their organizations
and has made it easier for them to promote computer security.
The CSPP review project significantly raised the level of federal
awareness about the protection of sensitive information and the
importance of computer security planning. In the final analysis,
this contribution may be among the most meaningful results of the
project.

The complete report of the CSPP review project will be published
as an NIST Interagency Report (NISTIR), and will be available
from the National Technical Information Service (NTIS), U.S.
Department of Commerce, 5285 Port Royal Road, Springfield,
VA 22161. Telephone: (703) 487-4650; FTS 737-4650. For
information about the report findings, contact Dennis Gilbert,
National Institute of Standards and Technology, A216, Technology
Building, Gaithersburg, MD 20899. Telephone: (301) 975-3872.


The Department of Defense Trusted Computer System Evaluation Criteria (August 15, 1983) (The Orange Book)

CSC-STD-001-83
Library No. S225,711

DEPARTMENT OF DEFENSE

TRUSTED COMPUTER SYSTEM EVALUATION CRITERIA

15 August 1983

CSC-STD-001-83

FOREWORD

This publication, “Department of Defense Trusted Computer System Evaluation
Criteria,” is being issued by the DoD Computer Security Center under the
authority of and in accordance with DoD Directive 5215.1, “Computer Security
Evaluation Center.” The criteria defined in this document constitute a uniform
set of basic requirements and evaluation classes for assessing the
effectiveness of security controls built into Automatic Data Processing (ADP)
systems. These criteria are intended for use in the evaluation and selection
of ADP systems being considered for the processing and/or storage and
retrieval of sensitive or classified information by the Department of Defense.
Point of contact concerning this publication is the Office of Standards and
Products, Attention: Chief, Computer Security Standards.

____________________________ 15 August 1983
Melville H. Klein
Director
DoD Computer Security Center

ACKNOWLEDGMENTS

Special recognition is extended to Sheila L. Brand, DoD Computer Security
Center (DoDCSC), who integrated theory, policy, and practice into and directed
the production of this document.

Acknowledgment is also given for the contributions of: Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, Col. Roger R. Schell,
Marvin Schaefer, DoDCSC, and Theodore M. P. Lee, Sperry UNIVAC, who as
original architects formulated and articulated the technical issues and
solutions presented in this document; Jeff Makey and Warren F. Shadle,
DoDCSC, who assisted in the preparation of this document; James P. Anderson,
James P. Anderson & Co., Steven B. Lipner, Digital Equipment Corp., Clark
Weissman, System Development Corp., LTC Lawrence A. Noble, formerly U.S. Air
Force, Stephen T. Walker, formerly DoD, Eugene V. Epperly, DoD, and James E.
Studer, formerly Dept. of the Army, who gave generously of their time and
expertise in the review and critique of this document; and finally, thanks are
given to the computer industry and others interested in trusted computing for
their enthusiastic advice and assistance throughout this effort.

TABLE OF CONTENTS

FOREWORD
ACKNOWLEDGMENTS
PREFACE
INTRODUCTION

PART I: THE CRITERIA
Section
1.0 DIVISION D: MINIMAL PROTECTION
2.0 DIVISION C: DISCRETIONARY PROTECTION
2.1 Class (C1): Discretionary Security Protection
2.2 Class (C2): Controlled Access Protection
3.0 DIVISION B: MANDATORY PROTECTION
3.1 Class (B1): Labeled Security Protection
3.2 Class (B2): Structured Protection
3.3 Class (B3): Security Domains
4.0 DIVISION A: VERIFIED PROTECTION
4.1 Class (A1): Verified Design
4.2 Beyond Class (A1)

PART II: RATIONALE AND GUIDELINES

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS
5.1 A Need for Consensus
5.2 Definition and Usefulness
5.3 Criteria Control Objective
6.0 RATIONALE BEHIND THE EVALUATION CLASSES
6.1 The Reference Monitor Concept
6.2 A Formal Security Policy Model
6.3 The Trusted Computing Base
6.4 Assurance
6.5 The Classes
7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA
7.1 Established Federal Policies
7.2 DoD Policies
7.3 Criteria Control Objective for Security Policy
7.4 Criteria Control Objective for Accountability
7.5 Criteria Control Objective for Assurance
8.0 A GUIDELINE ON COVERT CHANNELS
9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES
10.0 A GUIDELINE ON SECURITY TESTING
10.1 Testing for Division C
10.2 Testing for Division B
10.3 Testing for Division A
APPENDIX A: Commercial Product Evaluation Process
APPENDIX B: Summary of Evaluation Criteria Divisions
APPENDIX C: Summary of Evaluation Criteria Classes
APPENDIX D: Requirement Directory

GLOSSARY

REFERENCES

PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection. They provide a basis for the evaluation of effectiveness of
security controls built into automatic data processing system products. The
criteria were developed with three objectives in mind: (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other sensitive
information; (b) to provide guidance to manufacturers as to what to build into
their new, widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide a basis for
specifying security requirements in acquisition specifications. Two types of
requirements are delineated for secure processing: (a) specific security
feature requirements and (b) assurance requirements. Some of the latter
requirements enable evaluation personnel to determine if the required features
are present and functioning as intended. Though the criteria are
application-independent, it is recognized that the specific security feature
requirements may have to be interpreted when applying the criteria to specific
applications or other special processing environments. The underlying
assurance requirements can be applied across the entire spectrum of ADP system
or application processing environments without special interpretation.

INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the Defense
Science Board to address computer security safeguards that would protect
classified information in remote-access, resource-sharing computer systems.
The Task Force report, “Security Controls for Computer Systems,” published in
February 1970, made a number of policy and technical recommendations on
actions to be taken to reduce the threat of compromise of classified
information processed on remote-access computer systems.[34] Department of
Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published
in 1972 and 1973 respectively, responded to one of these recommendations by
establishing uniform DoD policy, security requirements, administrative
controls, and technical measures to protect classified information processed
by DoD computer systems.[8;9] Research and development work undertaken by the
Air Force, Advanced Research Projects Agency, and other defense agencies in
the early and mid 70’s developed and demonstrated solution approaches for the
technical problems associated with controlling the flow of information in
resource and information sharing computer systems.[1] The DoD Computer
Security Initiative was started in 1977 under the auspices of the Under
Secretary of Defense for Research and Engineering to focus DoD efforts
addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to define
problems and solutions for building, evaluating, and auditing secure computer
systems.[17] As part of this work NBS held two invitational workshops on the
subject of audit and evaluation of computer security.[20;28] The first was
held in March 1977, and the second in November of 1978. One of the products
of the second workshop was a definitive paper on the problems related to
providing criteria for the evaluation of technical computer security
effectiveness.[20] As an outgrowth of recommendations from this report, and in
support of the DoD Computer Security Initiative, the MITRE Corporation began
work on a set of computer security evaluation criteria that could be used to
assess the degree of trust one could place in a computer system to protect
classified data.[24;25;31] The preliminary concepts for computer security
evaluation were defined and expanded upon at invitational workshops and
symposia whose participants represented computer security expertise drawn from
industry and academia in addition to the government. Their work has since
been subjected to much peer review and constructive technical criticism from
the DoD, industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15] A major goal of the Center as given in its DoD Charter is to
encourage the widespread availability of trusted computer systems for use by
those who process classified or other sensitive information.[10] The criteria
presented in this document have evolved from the earlier NBS and MITRE
evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document apply
to both trusted general-purpose and trusted embedded (e.g., those dedicated to
a specific application) automatic data processing (ADP) systems. Included are
two distinct sets of requirements: 1) specific security feature requirements;
and 2) assurance requirements. The specific feature requirements encompass
the capabilities typically found in information processing systems employing
general-purpose operating systems that are distinct from the applications
programs being supported. The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments from
dedicated controllers to full range multilevel secure resource sharing
systems.

Purpose

As outlined in the Preface, the criteria have been developed for a number of
reasons:

* To provide users with a metric with which to evaluate the
degree of trust that can be placed in computer systems for
the secure processing of classified and other sensitive
information.

* To provide guidance to manufacturers as to what security
features to build into their new and planned, commercial
products in order to provide widely available systems that
satisfy trust requirements for sensitive applications.

* To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the first purpose for development of the criteria, i.e.,
providing users with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be performed on a computer
product from a perspective that excludes the application environment; or, (b)
it can be done to assess whether appropriate security measures have been taken
to permit the system to be used operationally in a specific environment. The
former type of evaluation is done by the Computer Security Center through the
Commercial Product Evaluation Process. That process is described in Appendix
A.

The latter type of evaluation, i.e., those done for the purpose of assessing a
system’s security attributes with respect to a specific operational mission,
is known as a certification evaluation. It must be understood that the
completion of a formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific application
environment. On the contrary, the evaluation report only provides a trusted
computer system’s evaluation rating along with supporting data describing the
product system’s strengths and weaknesses from a computer security point of
view. The system security certification and the formal approval/accreditation
procedure, done in accordance with the applicable policies of the issuing
agencies, must still be followed before a system can be approved for use in
processing or handling classified information.[8;9]

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process. Along with applicable policy, it
will be used directly as the basis for evaluation of the total system and for
specifying system security and certification requirements for new
acquisitions. Where a system being evaluated for certification employs a
product that has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation. Technical data
will be furnished to designers, evaluators and the Designated Approving
Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system “secure.”
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to read,
write, create, or delete information. Six fundamental requirements are
derived from this basic statement of objective: four deal with what needs to
be provided to control access to information; and two deal with how one can
obtain credible assurances that this is accomplished in a trusted computer
system.

POLICY

Requirement 1 – SECURITY POLICY – There must be an explicit and well-defined
security policy enforced by the system. Given identified subjects and
objects, there must be a set of rules that are used by the system to determine
whether a given subject can be permitted to gain access to a specific object.
Computer systems of interest must enforce a mandatory security policy that can
effectively implement access rules for handling sensitive (e.g., classified)
information.[7] These rules include requirements such as: No person lacking
proper personnel security clearance shall obtain access to classified
information. In addition, discretionary security controls are required to
ensure that only selected users or groups of users may obtain access to data
(e.g., based on a need-to-know).

Requirement 2 – MARKING – Access control labels must be associated with
objects. In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible to
mark every object with a label that reliably identifies the object’s
sensitivity level (e.g., classification), and/or the modes of access accorded
those subjects who may potentially access the object.

ACCOUNTABILITY

Requirement 3 – IDENTIFICATION – Individual subjects must be identified. Each
access to information must be mediated based on who is accessing the
information and what classes of information they are authorized to deal with.
This identification and authorization information must be securely maintained
by the computer system and be associated with every active element that
performs some security-relevant action in the system.

Requirement 4 – ACCOUNTABILITY – Audit information must be selectively kept
and protected so that actions affecting security can be traced to the
responsible party. A trusted system must be able to record the occurrences of
security-relevant events in an audit log. The capability to select the audit
events to be recorded is necessary to minimize the expense of auditing and to
allow efficient analysis. Audit data must be protected from modification and
unauthorized destruction to permit detection and after-the-fact investigations
of security violations.

ASSURANCE

Requirement 5 – ASSURANCE – The computer system must contain hardware/software
mechanisms that can be independently evaluated to provide sufficient assurance
that the system enforces requirements 1 through 4 above. In order to assure
that the four requirements of Security Policy, Marking, Identification, and
Accountability are enforced by a computer system, there must be some
identified and unified collection of hardware and software controls that
perform those functions. These mechanisms are typically embedded in the
operating system and are designed to carry out the assigned tasks in a secure
manner. The basis for trusting such system mechanisms in their operational
setting must be clearly documented such that it is possible to independently
examine the evidence to evaluate their sufficiency.

Requirement 6 – CONTINUOUS PROTECTION – The trusted mechanisms that enforce
these basic requirements must be continuously protected against tampering
and/or unauthorized changes. No computer system can be considered truly
secure if the basic hardware and software mechanisms that enforce the security
policy are themselves subject to unauthorized modification or subversion. The
continuous protection requirement has direct implications throughout the
computer system’s life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class. The interested
reader is referred to Section 5 of this document, “Control Objectives for
Trusted Computer Systems,” for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.

Structure of the Document

The remainder of this document is divided into two parts, four appendices, and
a glossary. Part I (Sections 1 through 4) presents the detailed criteria
derived from the fundamental requirements described above and relevant to the
rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to: mandatory access control rules
implementation, the covert channel problem, and security testing. It is
divided into six sections. Section 5 discusses the use of control objectives
in general and presents the three basic control objectives of the criteria.
Section 6 provides the theoretical basis behind the criteria. Section 7 gives
excerpts from pertinent regulations, directives, OMB Circulars, and Executive
Orders which provide the basis for many trust requirements for processing
nationally sensitive and classified information with computer systems.
Section 8 provides guidance to system developers on expectations in dealing
with the covert channel problem. Section 9 provides guidelines dealing with
mandatory security. Section 10 provides guidelines for security testing.
There are four appendices, including a description of the Trusted Computer
System Commercial Products Evaluation Process (Appendix A), summaries of the
evaluation divisions (Appendix B) and classes (Appendix C), and finally a
directory of requirements ordered alphabetically. In addition, there is a
glossary.

Structure of the Criteria

The criteria are divided into four divisions: D, C, B, and A ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security. Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information. Within divisions C and B there are a
number of subdivisions known as classes. The classes are also ordered in a
hierarchical manner with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess. Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system. The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB). Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure. Increased assurance that the required features are
operative, correct, and tamperproof under all circumstances is gained through
progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed. The first three
represent features necessary to satisfy the broad control objectives of
Security Policy, Accountability, and Assurance that are discussed in Part II,
Section 5. The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
first read Part II, before continuing on with Part I.

PART I: THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not contained
in a lower class or changes and additions to already defined criteria. Where
there is no highlighting, requirements have been carried over from lower
classes without addition or modification.

1.0 DIVISION D: MINIMAL PROTECTION

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

2.0 DIVISION C: DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1 CLASS (C1): DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity. The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1 SECURITY POLICY

2.1.1.1 Discretionary Access Control

THE TCB SHALL DEFINE AND CONTROL ACCESS BETWEEN NAMED USERS AND
NAMED OBJECTS (E.G., FILES AND PROGRAMS) IN THE ADP SYSTEM. THE
ENFORCEMENT MECHANISM (E.G., SELF/GROUP/PUBLIC CONTROLS, ACCESS
CONTROL LISTS) SHALL ALLOW USERS TO SPECIFY AND CONTROL SHARING
OF THOSE OBJECTS BY NAMED INDIVIDUALS OR DEFINED GROUPS OR BOTH.

2.1.2 ACCOUNTABILITY

2.1.2.1 Identification and Authentication

THE TCB SHALL REQUIRE USERS TO IDENTIFY THEMSELVES TO IT BEFORE
BEGINNING TO PERFORM ANY OTHER ACTIONS THAT THE TCB IS EXPECTED
TO MEDIATE. FURTHERMORE, THE TCB SHALL USE A PROTECTED
MECHANISM (E.G., PASSWORDS) TO AUTHENTICATE THE USER’S IDENTITY.
THE TCB SHALL PROTECT AUTHENTICATION DATA SO THAT IT CANNOT BE
ACCESSED BY ANY UNAUTHORIZED USER.

2.1.3 ASSURANCE

2.1.3.1 Operational Assurance

2.1.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
RESOURCES CONTROLLED BY THE TCB MAY BE A DEFINED SUBSET
OF THE SUBJECTS AND OBJECTS IN THE ADP SYSTEM.

2.1.3.1.2 System Integrity

HARDWARE AND/OR SOFTWARE FEATURES SHALL BE PROVIDED THAT
CAN BE USED TO PERIODICALLY VALIDATE THE CORRECT OPERATION
OF THE ON-SITE HARDWARE AND FIRMWARE ELEMENTS OF THE TCB.

2.1.3.2 Life-Cycle Assurance

2.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
TESTING SHALL BE DONE TO ASSURE THAT THERE ARE NO OBVIOUS
WAYS FOR AN UNAUTHORIZED USER TO BYPASS OR OTHERWISE
DEFEAT THE SECURITY PROTECTION MECHANISMS OF THE TCB.
(SEE THE SECURITY TESTING GUIDELINES.)

2.1.4 DOCUMENTATION

2.1.4.1 Security Features User’s Guide

A SINGLE SUMMARY, CHAPTER, OR MANUAL IN USER DOCUMENTATION
SHALL DESCRIBE THE PROTECTION MECHANISMS PROVIDED BY THE TCB,
GUIDELINES ON THEIR USE, AND HOW THEY INTERACT WITH ONE ANOTHER.

2.1.4.2 Trusted Facility Manual

A MANUAL ADDRESSED TO THE ADP SYSTEM ADMINISTRATOR SHALL
PRESENT CAUTIONS ABOUT FUNCTIONS AND PRIVILEGES THAT SHOULD BE
CONTROLLED WHEN RUNNING A SECURE FACILITY.

2.1.4.3 Test Documentation

THE SYSTEM DEVELOPER SHALL PROVIDE TO THE EVALUATORS A DOCUMENT
THAT DESCRIBES THE TEST PLAN AND RESULTS OF THE SECURITY
MECHANISMS’ FUNCTIONAL TESTING.

2.1.4.4 Design Documentation

DOCUMENTATION SHALL BE AVAILABLE THAT PROVIDES A DESCRIPTION OF
THE MANUFACTURER’S PHILOSOPHY OF PROTECTION AND AN EXPLANATION
OF HOW THIS PHILOSOPHY IS TRANSLATED INTO THE TCB. IF THE TCB
IS COMPOSED OF DISTINCT MODULES, THE INTERFACES BETWEEN THESE
MODULES SHALL BE DESCRIBED.

2.2 CLASS (C2): CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation. The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1 SECURITY POLICY

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing
of those objects by named individuals, or defined groups OF
INDIVIDUALS, or by both. THE DISCRETIONARY ACCESS CONTROL
MECHANISM SHALL, EITHER BY EXPLICIT USER ACTION OR BY DEFAULT,
PROVIDE THAT OBJECTS ARE PROTECTED FROM UNAUTHORIZED ACCESS.
THESE ACCESS CONTROLS SHALL BE CAPABLE OF INCLUDING OR EXCLUDING
ACCESS TO THE GRANULARITY OF A SINGLE USER. ACCESS PERMISSION
TO AN OBJECT BY USERS NOT ALREADY POSSESSING ACCESS PERMISSION
SHALL ONLY BE ASSIGNED BY AUTHORIZED USERS.
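
The fragment below is an informal illustration only, not part of the
criteria. Under the assumption of a toy in-memory representation, it
sketches how a TCB-mediated discretionary check against per-object access
control lists could support named-individual grants, defined groups of
individuals, and exclusion at the granularity of a single user; the names
ACL, GROUPS, dac_permits, and assign_access are hypothetical.

    # Informal sketch only; not part of the criteria. All names are hypothetical.
    from dataclasses import dataclass, field

    # Defined groups of individuals known to the system (hypothetical data).
    GROUPS = {"payroll": {"alice", "bob"}, "audit": {"carol"}}

    @dataclass
    class ACL:
        owner: str                                       # authorized to assign access
        user_grants: set = field(default_factory=set)    # named-individual grants
        user_denials: set = field(default_factory=set)   # single-user exclusions
        group_grants: set = field(default_factory=set)   # grants to defined groups

    def dac_permits(user: str, acl: ACL) -> bool:
        """Discretionary check: named-individual or group grant, unless excluded."""
        if user in acl.user_denials:                     # exclusion at single-user granularity
            return False
        if user == acl.owner or user in acl.user_grants:
            return True
        return any(user in GROUPS.get(g, set()) for g in acl.group_grants)

    def assign_access(requester: str, acl: ACL, user: str) -> None:
        """Only a user already authorized for the object may extend access to it."""
        if requester != acl.owner:
            raise PermissionError("access may only be assigned by authorized users")
        acl.user_grants.add(user)

In this sketch a denial entry always overrides a grant, which is one way,
though not the only way, to realize per-user exclusion.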

2.2.1.2 Object Reuse

WHEN A STORAGE OBJECT IS INITIALLY ASSIGNED, ALLOCATED, OR
REALLOCATED TO A SUBJECT FROM THE TCB’S POOL OF UNUSED STORAGE
OBJECTS, THE TCB SHALL ASSURE THAT THE OBJECT CONTAINS NO DATA
FOR WHICH THE SUBJECT IS NOT AUTHORIZED.
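
As an informal aside, and not as part of the criteria, the sketch below
shows one common way to satisfy an object reuse requirement for a pool of
fixed-size storage blocks: scrubbing each block before it is handed to a
new subject. The pool class and block size are hypothetical.

    # Informal sketch only; FreePool and BLOCK_SIZE are hypothetical names.
    BLOCK_SIZE = 4096

    class FreePool:
        """Pool of unused storage objects managed by the TCB."""

        def __init__(self, count: int):
            self._free = [bytearray(BLOCK_SIZE) for _ in range(count)]

        def allocate(self) -> bytearray:
            # Scrub residual contents so the receiving subject sees no data
            # left behind by the object's previous owner.
            block = self._free.pop()
            block[:] = bytes(BLOCK_SIZE)
            return block

        def release(self, block: bytearray) -> None:
            self._free.append(block)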

2.2.2 ACCOUNTABILITY

2.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall use a protected
mechanism (e.g., passwords) to authenticate the user’s identity.
The TCB shall protect authentication data so that it cannot be
accessed by any unauthorized user. THE TCB SHALL BE ABLE TO
ENFORCE INDIVIDUAL ACCOUNTABILITY BY PROVIDING THE CAPABILITY TO
UNIQUELY IDENTIFY EACH INDIVIDUAL ADP SYSTEM USER. THE TCB
SHALL ALSO PROVIDE THE CAPABILITY OF ASSOCIATING THIS IDENTITY
WITH ALL AUDITABLE ACTIONS TAKEN BY THAT INDIVIDUAL.
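
The sketch below is illustrative only and not part of the criteria. It
shows one conventional way to keep authentication data in a protected form:
storing a salted, iterated hash of each password rather than the password
itself, so that the stored data is of no use to an unauthorized reader.
The AUTH_DB table and the helper names are hypothetical.

    # Informal sketch only; AUTH_DB and the function names are hypothetical.
    import hashlib
    import hmac
    import os

    AUTH_DB = {}   # user id -> (salt, derived key); held inside the TCB

    def enroll(user: str, password: str) -> None:
        """Record authentication data for a uniquely identified user."""
        salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        AUTH_DB[user] = (salt, key)

    def authenticate(user: str, password: str) -> bool:
        """Establish the individual identity later bound to auditable actions."""
        if user not in AUTH_DB:
            return False
        salt, key = AUTH_DB[user]
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, key)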

2.2.2.2 Audit

THE TCB SHALL BE ABLE TO CREATE, MAINTAIN, AND PROTECT FROM
MODIFICATION OR UNAUTHORIZED ACCESS OR DESTRUCTION AN AUDIT
TRAIL OF ACCESSES TO THE OBJECTS IT PROTECTS. THE AUDIT DATA
SHALL BE PROTECTED BY THE TCB SO THAT READ ACCESS TO IT IS
LIMITED TO THOSE WHO ARE AUTHORIZED FOR AUDIT DATA. THE TCB
SHALL BE ABLE TO RECORD THE FOLLOWING TYPES OF EVENTS: USE OF
IDENTIFICATION AND AUTHENTICATION MECHANISMS, INTRODUCTION OF
OBJECTS INTO A USER’S ADDRESS SPACE (E.G., FILE OPEN, PROGRAM
INITIATION), DELETION OF OBJECTS, AND ACTIONS TAKEN BY
COMPUTER OPERATORS AND SYSTEM ADMINISTRATORS AND/OR SYSTEM
SECURITY OFFICERS. FOR EACH RECORDED EVENT, THE AUDIT RECORD
SHALL IDENTIFY: DATE AND TIME OF THE EVENT, USER, TYPE OF
EVENT, AND SUCCESS OR FAILURE OF THE EVENT. FOR
IDENTIFICATION/AUTHENTICATION EVENTS THE ORIGIN OF REQUEST
(E.G., TERMINAL ID) SHALL BE INCLUDED IN THE AUDIT RECORD. FOR
EVENTS THAT INTRODUCE AN OBJECT INTO A USER’S ADDRESS SPACE AND
FOR OBJECT DELETION EVENTS THE AUDIT RECORD SHALL INCLUDE THE
NAME OF THE OBJECT. THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE
TO SELECTIVELY AUDIT THE ACTIONS OF ANY ONE OR MORE USERS BASED
ON INDIVIDUAL IDENTITY.
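
The data structure below is an informal illustration, not part of the
criteria; it simply collects the record contents enumerated above and shows
selective retrieval by individual identity. The class names are hypothetical.

    # Informal sketch only; AuditRecord and AuditLog are hypothetical names.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass(frozen=True)
    class AuditRecord:
        timestamp: datetime                # date and time of the event
        user: str                          # individual identity of the acting user
        event_type: str                    # e.g., "login", "file_open", "object_delete"
        success: bool                      # success or failure of the event
        origin: Optional[str] = None       # terminal ID, for identification/authentication events
        object_name: Optional[str] = None  # for object introduction and deletion events

    class AuditLog:
        """Audit trail; the TCB restricts read access to authorized users."""

        def __init__(self):
            self._records = []

        def record(self, rec: AuditRecord) -> None:
            self._records.append(rec)

        def select_by_user(self, *users: str):
            """Selectively audit the actions of one or more named users."""
            return [r for r in self._records if r.user in users]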

2.2.3 ASSURANCE

2.2.3.1 Operational Assurance

2.2.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL ISOLATE THE RESOURCES TO BE PROTECTED SO THAT THEY
ARE SUBJECT TO THE ACCESS CONTROL AND AUDITING
REQUIREMENTS.

2.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

2.2.3.2 Life-Cycle Assurance

2.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
Testing shall be done to assure that there are no obvious
ways for an unauthorized user to bypass or otherwise
defeat the security protection mechanisms of the TCB.
TESTING SHALL ALSO INCLUDE A SEARCH FOR OBVIOUS FLAWS THAT
WOULD ALLOW VIOLATION OF RESOURCE ISOLATION, OR THAT WOULD
PERMIT UNAUTHORIZED ACCESS TO THE AUDIT OR AUTHENTICATION
DATA. (See the Security Testing guidelines.)

2.2.4 DOCUMENTATION

2.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

2.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. THE PROCEDURES FOR
EXAMINING AND MAINTAINING THE AUDIT FILES AS WELL AS THE
DETAILED AUDIT RECORD STRUCTURE FOR EACH TYPE OF AUDIT EVENT
SHALL BE GIVEN.

2.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

2.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described.

3.0 DIVISION B: MANDATORY PROTECTION

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

3.1 CLASS (B1): LABELED SECURITY PROTECTION

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed. The following are minimal
requirements for systems assigned a class (B1) rating:

3.1.1 SECURITY POLICY

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups
of individuals, or by both. The discretionary access control
mechanism shall, either by explicit user action or by default,
provide that objects are protected from unauthorized access.
These access controls shall be capable of including or excluding
access to the granularity of a single user. Access permission
to an object by users not already possessing access permission
shall only be assigned by authorized users.

3.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.1.1.3 Labels

SENSITIVITY LABELS ASSOCIATED WITH EACH SUBJECT AND STORAGE
OBJECT UNDER ITS CONTROL (E.G., PROCESS, FILE, SEGMENT, DEVICE)
SHALL BE MAINTAINED BY THE TCB. THESE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. IN ORDER TO
IMPORT NON-LABELED DATA, THE TCB SHALL REQUEST AND RECEIVE FROM
AN AUTHORIZED USER THE SECURITY LEVEL OF THE DATA, AND ALL SUCH
ACTIONS SHALL BE AUDITABLE BY THE TCB.

3.1.1.3.1 Label Integrity

SENSITIVITY LABELS SHALL ACCURATELY REPRESENT SECURITY
LEVELS OF THE SPECIFIC SUBJECTS OR OBJECTS WITH WHICH THEY
ARE ASSOCIATED. WHEN EXPORTED BY THE TCB, SENSITIVITY
LABELS SHALL ACCURATELY AND UNAMBIGUOUSLY REPRESENT THE
INTERNAL LABELS AND SHALL BE ASSOCIATED WITH THE
INFORMATION BEING EXPORTED.

3.1.1.3.2 Exportation of Labeled Information

THE TCB SHALL DESIGNATE EACH COMMUNICATION CHANNEL AND
I/O DEVICE AS EITHER SINGLE-LEVEL OR MULTILEVEL. ANY
CHANGE IN THIS DESIGNATION SHALL BE DONE MANUALLY AND
SHALL BE AUDITABLE BY THE TCB. THE TCB SHALL MAINTAIN
AND BE ABLE TO AUDIT ANY CHANGE IN THE CURRENT SECURITY
LEVEL ASSOCIATED WITH A SINGLE-LEVEL COMMUNICATION
CHANNEL OR I/O DEVICE.

3.1.1.3.2.1 Exportation to Multilevel Devices

WHEN THE TCB EXPORTS AN OBJECT TO A MULTILEVEL I/O
DEVICE, THE SENSITIVITY LABEL ASSOCIATED WITH THAT
OBJECT SHALL ALSO BE EXPORTED AND SHALL RESIDE ON
THE SAME PHYSICAL MEDIUM AS THE EXPORTED
INFORMATION AND SHALL BE IN THE SAME FORM
(I.E., MACHINE-READABLE OR HUMAN-READABLE FORM).
WHEN THE TCB EXPORTS OR IMPORTS AN OBJECT OVER A
MULTILEVEL COMMUNICATION CHANNEL, THE PROTOCOL
USED ON THAT CHANNEL SHALL PROVIDE FOR THE
UNAMBIGUOUS PAIRING BETWEEN THE SENSITIVITY LABELS
AND THE ASSOCIATED INFORMATION THAT IS SENT OR
RECEIVED.
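
By way of informal illustration only, and not as part of the criteria, one
way a protocol can pair each sensitivity label unambiguously with the
information it covers is to length-prefix both fields in every message
exchanged over the multilevel channel. The wire format shown is hypothetical.

    # Informal sketch only; the framing format is hypothetical.
    import struct

    def frame(label: bytes, payload: bytes) -> bytes:
        """Length-prefix label and data so the pairing is unambiguous."""
        return (struct.pack("!I", len(label)) + label +
                struct.pack("!I", len(payload)) + payload)

    def unframe(message: bytes):
        """Recover the (label, data) pair from a framed message."""
        (label_len,) = struct.unpack_from("!I", message, 0)
        label = message[4:4 + label_len]
        (payload_len,) = struct.unpack_from("!I", message, 4 + label_len)
        payload = message[8 + label_len:8 + label_len + payload_len]
        return label, payload

    msg = frame(b"SECRET/NATO", b"report body")
    assert unframe(msg) == (b"SECRET/NATO", b"report body")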

3.1.1.3.2.2 Exportation to Single-Level Devices

SINGLE-LEVEL I/O DEVICES AND SINGLE-LEVEL
COMMUNICATION CHANNELS ARE NOT REQUIRED TO
MAINTAIN THE SENSITIVITY LABELS OF THE INFORMATION
THEY PROCESS. HOWEVER, THE TCB SHALL INCLUDE A
MECHANISM BY WHICH THE TCB AND AN AUTHORIZED USER
RELIABLY COMMUNICATE TO DESIGNATE THE SINGLE
SECURITY LEVEL OF INFORMATION IMPORTED OR EXPORTED
VIA SINGLE-LEVEL COMMUNICATION CHANNELS OR I/O
DEVICES.

3.1.1.3.2.3 Labeling Human-Readable Output

THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE TO
SPECIFY THE PRINTABLE LABEL NAMES ASSOCIATED WITH
EXPORTED SENSITIVITY LABELS. THE TCB SHALL MARK
THE BEGINNING AND END OF ALL HUMAN-READABLE, PAGED,
HARDCOPY OUTPUT (E.G., LINE PRINTER OUTPUT) WITH
HUMAN-READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. THE TCB
SHALL, BY DEFAULT, MARK THE TOP AND BOTTOM OF EACH
PAGE OF HUMAN-READABLE, PAGED, HARDCOPY OUTPUT
(E.G., LINE PRINTER OUTPUT) WITH HUMAN-READABLE
SENSITIVITY LABELS THAT PROPERLY* REPRESENT THE
OVERALL SENSITIVITY OF THE OUTPUT OR THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE INFORMATION ON THE
PAGE. THE TCB SHALL, BY DEFAULT AND IN AN
APPROPRIATE MANNER, MARK OTHER FORMS OF HUMAN-
READABLE OUTPUT (E.G., MAPS, GRAPHICS) WITH HUMAN-
READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. ANY
OVERRIDE OF THESE MARKING DEFAULTS SHALL BE
AUDITABLE BY THE TCB.

_____________________________________________________________
* THE HIERARCHICAL CLASSIFICATION COMPONENT IN HUMAN-READABLE
SENSITIVITY LABELS SHALL BE EQUAL TO THE GREATEST
HIERARCHICAL CLASSIFICATION OF ANY OF THE INFORMATION IN THE
OUTPUT THAT THE LABELS REFER TO; THE NON-HIERARCHICAL
CATEGORY COMPONENT SHALL INCLUDE ALL OF THE NON-HIERARCHICAL
CATEGORIES OF THE INFORMATION IN THE OUTPUT THE LABELS REFER
TO, BUT NO OTHER NON-HIERARCHICAL CATEGORIES.
_____________________________________________________________
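
As an informal worked example, not part of the criteria, the composition
rule in the footnote above can be written out directly: the banner
classification is the greatest classification of any information in the
output, and the banner categories are exactly the union of the categories
present. The ordering of levels and the sample page labels are hypothetical.

    # Informal sketch only; the level ordering and page labels are hypothetical.
    LEVEL = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def output_label(page_labels):
        """Greatest hierarchical classification; union of non-hierarchical categories."""
        classification = max((c for c, _ in page_labels), key=LEVEL.get)
        categories = set().union(*(cats for _, cats in page_labels))
        return classification, categories

    # Pages marked (SECRET, {NATO}) and (CONFIDENTIAL, {CRYPTO}) yield an
    # overall banner of (SECRET, {NATO, CRYPTO}).
    print(output_label([("SECRET", {"NATO"}), ("CONFIDENTIAL", {"CRYPTO"})]))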

3.1.1.4 Mandatory Access Control

THE TCB SHALL ENFORCE A MANDATORY ACCESS CONTROL POLICY OVER
ALL SUBJECTS AND STORAGE OBJECTS UNDER ITS CONTROL (E.G.,
PROCESSES, FILES, SEGMENTS, DEVICES). THESE SUBJECTS AND
OBJECTS SHALL BE ASSIGNED SENSITIVITY LABELS THAT ARE A
COMBINATION OF HIERARCHICAL CLASSIFICATION LEVELS AND
NON-HIERARCHICAL CATEGORIES, AND THE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. THE TCB
SHALL BE ABLE TO SUPPORT TWO OR MORE SUCH SECURITY LEVELS.
(SEE THE MANDATORY ACCESS CONTROL GUIDELINES.) THE FOLLOWING
REQUIREMENTS SHALL HOLD FOR ALL ACCESSES BETWEEN SUBJECTS AND
OBJECTS CONTROLLED BY THE TCB: A SUBJECT CAN READ AN OBJECT
ONLY IF THE HIERARCHICAL CLASSIFICATION IN THE SUBJECT’S
SECURITY LEVEL IS GREATER THAN OR EQUAL TO THE HIERARCHICAL
CLASSIFICATION IN THE OBJECT’S SECURITY LEVEL AND THE NON-
HIERARCHICAL CATEGORIES IN THE SUBJECT’S SECURITY LEVEL INCLUDE
ALL THE NON-HIERARCHICAL CATEGORIES IN THE OBJECT’S SECURITY
LEVEL. A SUBJECT CAN WRITE AN OBJECT ONLY IF THE HIERARCHICAL
CLASSIFICATION IN THE SUBJECT’S SECURITY LEVEL IS LESS THAN OR
EQUAL TO THE HIERARCHICAL CLASSIFICATION IN THE OBJECT’S
SECURITY LEVEL AND ALL THE NON-HIERARCHICAL CATEGORIES IN THE
SUBJECT’S SECURITY LEVEL ARE INCLUDED IN THE NON-HIERARCHICAL
CATEGORIES IN THE OBJECT’S SECURITY LEVEL.
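
The two predicates below are an informal restatement of the read and write
rules just given, not part of the criteria. A security level is modeled as a
hierarchical classification together with a set of non-hierarchical
categories; the level ordering and example labels are hypothetical.

    # Informal sketch only; the level ordering and example labels are hypothetical.
    from dataclasses import dataclass

    LEVEL = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    @dataclass(frozen=True)
    class SecurityLevel:
        classification: str
        categories: frozenset

    def may_read(subject: SecurityLevel, obj: SecurityLevel) -> bool:
        """Read only if the subject's level dominates the object's level."""
        return (LEVEL[subject.classification] >= LEVEL[obj.classification]
                and subject.categories >= obj.categories)

    def may_write(subject: SecurityLevel, obj: SecurityLevel) -> bool:
        """Write only if the object's level dominates the subject's level."""
        return (LEVEL[subject.classification] <= LEVEL[obj.classification]
                and subject.categories <= obj.categories)

    # A SECRET/{NATO} subject may read a CONFIDENTIAL/{NATO} object but may
    # not write to it, since writing "down" is denied.
    s = SecurityLevel("SECRET", frozenset({"NATO"}))
    o = SecurityLevel("CONFIDENTIAL", frozenset({"NATO"}))
    assert may_read(s, o) and not may_write(s, o)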

3.1.2 ACCOUNTABILITY

3.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall MAINTAIN AUTHENTICATION
DATA THAT INCLUDES INFORMATION FOR VERIFYING THE IDENTITY OF
INDIVIDUAL USERS (E.G., PASSWORDS) AS WELL AS INFORMATION FOR
DETERMINING THE CLEARANCE AND AUTHORIZATIONS OF INDIVIDUAL
USERS. THIS DATA SHALL BE USED BY THE TCB TO AUTHENTICATE the
user’s identity AND TO DETERMINE THE SECURITY LEVEL AND
AUTHORIZATIONS OF SUBJECTS THAT MAY BE CREATED TO ACT ON BEHALF
OF THE INDIVIDUAL USER. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. THE TCB SHALL ALSO BE ABLE TO AUDIT ANY OVERRIDE OF
HUMAN-READABLE OUTPUT MARKINGS. FOR each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address space
and for object deletion events the audit record shall include
the name of the object AND THE OBJECT’S SECURITY LEVEL. The
ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
AND/OR OBJECT SECURITY LEVEL.

3.1.3 ASSURANCE

3.1.3.1 Operational Assurance

3.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL MAINTAIN PROCESS ISOLATION THROUGH THE PROVISION OF
DISTINCT ADDRESS SPACES UNDER ITS CONTROL. The TCB shall
isolate the resources to be protected so that they are
subject to the access control and auditing requirements.

3.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

3.1.3.2 Life-Cycle Assurance

3.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
A TEAM OF INDIVIDUALS WHO THOROUGHLY UNDERSTAND THE
SPECIFIC IMPLEMENTATION OF THE TCB SHALL SUBJECT ITS
DESIGN DOCUMENTATION, SOURCE CODE, AND OBJECT CODE TO
THOROUGH ANALYSIS AND TESTING. THEIR OBJECTIVES SHALL BE:
TO UNCOVER ALL DESIGN AND IMPLEMENTATION FLAWS THAT WOULD
PERMIT A SUBJECT EXTERNAL TO THE TCB TO READ, CHANGE, OR
DELETE DATA NORMALLY DENIED UNDER THE MANDATORY OR
DISCRETIONARY SECURITY POLICY ENFORCED BY THE TCB; AS WELL
AS TO ASSURE THAT NO SUBJECT (WITHOUT AUTHORIZATION TO DO
SO) IS ABLE TO CAUSE THE TCB TO ENTER A STATE SUCH THAT
IT IS UNABLE TO RESPOND TO COMMUNICATIONS INITIATED BY
OTHER USERS. ALL DISCOVERED FLAWS SHALL BE REMOVED OR
NEUTRALIZED AND THE TCB RETESTED TO DEMONSTRATE THAT THEY
HAVE BEEN ELIMINATED AND THAT NEW FLAWS HAVE NOT BEEN
INTRODUCED. (SEE THE SECURITY TESTING GUIDELINES.)

3.1.3.2.2 Design Specification and Verification

AN INFORMAL OR FORMAL MODEL OF THE SECURITY POLICY
SUPPORTED BY THE TCB SHALL BE MAINTAINED THAT IS SHOWN TO
BE CONSISTENT WITH ITS AXIOMS.

3.1.4 DOCUMENTATION

3.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. THE MANUAL SHALL DESCRIBE THE OPERATOR AND
ADMINISTRATOR FUNCTIONS RELATED TO SECURITY, TO INCLUDE CHANGING
THE SECURITY CHARACTERISTICS OF A USER. IT SHALL PROVIDE
GUIDELINES ON THE CONSISTENT AND EFFECTIVE USE OF THE PROTECTION
FEATURES OF THE SYSTEM, HOW THEY INTERACT, HOW TO SECURELY
GENERATE A NEW TCB, AND FACILITY PROCEDURES, WARNINGS, AND
PRIVILEGES THAT NEED TO BE CONTROLLED IN ORDER TO OPERATE THE
FACILITY IN A SECURE MANNER.

3.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

3.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described. AN INFORMAL OR FORMAL DESCRIPTION
OF THE SECURITY POLICY MODEL ENFORCED BY THE TCB SHALL BE
AVAILABLE AND AN EXPLANATION PROVIDED TO SHOW THAT IT IS
SUFFICIENT TO ENFORCE THE SECURITY POLICY. THE SPECIFIC TCB
PROTECTION MECHANISMS SHALL BE IDENTIFIED AND AN EXPLANATION
GIVEN TO SHOW THAT THEY SATISFY THE MODEL.

3.2 CLASS (B2): STRUCTURED PROTECTION

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration. The
following are minimal requirements for systems assigned a class (B2) rating:

3.2.1 SECURITY POLICY

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined
groups of individuals, or by both. The discretionary access
control mechanism shall, either by explicit user action or by
default, provide that objects are protected from unauthorized
access. These access controls shall be capable of including
or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing
access permission shall only be assigned by authorized users.

3.2.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.
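
As an illustration only and not part of the criteria text, the following
minimal Python sketch shows one way an allocator could satisfy the object
reuse requirement: the hypothetical Allocator purges a storage object's prior
contents before handing it to a new subject.

    # Illustrative sketch of the object-reuse principle (not part of the
    # criteria). A hypothetical TCB allocator clears a storage object before
    # reallocation so a new subject cannot read data left by a previous owner.

    class StorageObject:
        def __init__(self, size):
            self.data = bytearray(size)

    class Allocator:
        def __init__(self):
            self.free_pool = []

        def release(self, obj):
            self.free_pool.append(obj)

        def allocate(self, size):
            if self.free_pool:
                obj = self.free_pool.pop()
                obj.data[:] = b"\x00" * len(obj.data)  # purge residual data
                return obj
            return StorageObject(size)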

3.2.1.3 Labels

Sensitivity labels associated with each ADP SYSTEM RESOURCE
(E.G., SUBJECT, STORAGE OBJECT) THAT IS DIRECTLY OR INDIRECTLY
ACCESSIBLE BY SUBJECTS EXTERNAL TO THE TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.2.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.2.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.
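
As an illustration only and not part of the criteria text, the sketch below
shows one hypothetical way a protocol on a multilevel channel could pair a
sensitivity label unambiguously with the exported information, by framing
each record as a length-prefixed label followed by length-prefixed data.

    # Illustrative sketch (not part of the criteria): unambiguous pairing of
    # a sensitivity label with exported data on a hypothetical multilevel
    # channel, framed as (label length, label, data length, data).

    import struct

    def frame(label: bytes, data: bytes) -> bytes:
        return (struct.pack("!I", len(label)) + label
                + struct.pack("!I", len(data)) + data)

    def unframe(buf: bytes):
        (llen,) = struct.unpack_from("!I", buf, 0)
        label = buf[4:4 + llen]
        (dlen,) = struct.unpack_from("!I", buf, 4 + llen)
        data = buf[8 + llen:8 + llen + dlen]
        return label, data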

3.2.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the
information they process. However, the TCB shall
include a mechanism by which the TCB and an
authorized user reliably communicate to designate
the single security level of information imported
or exported via single-level communication
channels or I/O devices.

3.2.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.
_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________
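
As an illustration only and not part of the criteria text, the footnoted
marking rule can be expressed as a small computation: the banner label's
hierarchical component is the greatest classification among the labeled
information in the output, and its category component is the union of those
categories and nothing more. The function below is a hypothetical sketch of
that rule.

    # Illustrative sketch of the footnoted marking rule (not part of the
    # criteria): the output label's classification is the greatest
    # classification of any labeled information in the output, and its
    # category set is the union of those categories, with no extras.

    def output_label(labels):
        """labels: iterable of (classification_level:int, categories:set)."""
        level = max(lvl for lvl, _ in labels)
        categories = set()
        for _, cats in labels:
            categories |= cats
        return level, categories

    # Example: pages at (2, {"NATO"}) and (3, {"CRYPTO"}) yield
    # (3, {"NATO", "CRYPTO"}).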

3.2.1.3.3 Subject Sensitivity Labels

THE TCB SHALL IMMEDIATELY NOTIFY A TERMINAL USER OF EACH
CHANGE IN THE SECURITY LEVEL ASSOCIATED WITH THAT USER
DURING AN INTERACTIVE SESSION. A TERMINAL USER SHALL BE
ABLE TO QUERY THE TCB AS DESIRED FOR A DISPLAY OF THE
SUBJECT’S COMPLETE SENSITIVITY LABEL.

3.2.1.3.4 Device Labels

THE TCB SHALL SUPPORT THE ASSIGNMENT OF MINIMUM AND
MAXIMUM SECURITY LEVELS TO ALL ATTACHED PHYSICAL DEVICES.
THESE SECURITY LEVELS SHALL BE USED BY THE TCB TO ENFORCE
CONSTRAINTS IMPOSED BY THE PHYSICAL ENVIRONMENTS IN WHICH
THE DEVICES ARE LOCATED.

3.2.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all RESOURCES (I.E., SUBJECTS, STORAGE OBJECTS, AND I/O DEVICES)
THAT ARE DIRECTLY OR INDIRECTLY ACCESSIBLE BY SUBJECTS EXTERNAL
TO THE TCB. These subjects and objects shall be assigned
sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the
labels shall be used as the basis for mandatory access control
decisions. The TCB shall be able to support two or more such
security levels. (See the Mandatory Access Control guidelines.)
The following requirements shall hold for all accesses between
ALL SUBJECTS EXTERNAL TO THE TCB AND ALL OBJECTS DIRECTLY OR
INDIRECTLY ACCESSIBLE BY THESE SUBJECTS: A subject can read an
object only if the hierarchical classification in the subject’s
security level is greater than or equal to the hierarchical
classification in the object’s security level and the non-
hierarchical categories in the subject’s security level include
all the non-hierarchical categories in the object’s security
level. A subject can write an object only if the hierarchical
classification in the subject’s security level is less than or
equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.
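
As an illustration only and not part of the criteria text, the read and write
rules just stated can be captured in a short dominance check, where a security
level is modeled as a hierarchical classification paired with a set of
non-hierarchical categories. The names below are hypothetical.

    # Illustrative sketch (not part of the criteria) of the stated mandatory
    # access control rules: reads require the subject's level to dominate the
    # object's; writes require the object's level to dominate the subject's.

    def dominates(level_a, level_b):
        class_a, cats_a = level_a
        class_b, cats_b = level_b
        return class_a >= class_b and cats_a >= cats_b

    def can_read(subject_level, object_level):
        return dominates(subject_level, object_level)

    def can_write(subject_level, object_level):
        return dominates(object_level, subject_level)

    secret_nato = (2, {"NATO"})
    confidential = (1, set())
    assert can_read(secret_nato, confidential)
    assert not can_write(secret_nato, confidential)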

3.2.2 ACCOUNTABILITY

3.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.
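
As an illustration only and not part of the criteria text, the sketch below
models hypothetical authentication data that pairs a password verifier with
the user's clearance and categories; a successful authentication yields a
session bound to a maximum security level, as the requirement describes.

    # Illustrative sketch (not part of the criteria): hypothetical
    # authentication data used both to verify identity and to bound the
    # security level of subjects created on the user's behalf.

    import hashlib

    AUTH_DATA = {
        # user: (salted password hash, clearance level, categories)
        "alice": (hashlib.sha256(b"salt" + b"correct horse").hexdigest(),
                  2, {"NATO"}),
    }

    def authenticate(user, password):
        entry = AUTH_DATA.get(user)
        if entry is None:
            return None
        digest, clearance, categories = entry
        if hashlib.sha256(b"salt" + password.encode()).hexdigest() != digest:
            return None
        # Subjects created for this session may not exceed the user's clearance.
        return {"user": user, "max_level": (clearance, categories)}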

3.2.2.1.1 Trusted Path

THE TCB SHALL SUPPORT A TRUSTED COMMUNICATION PATH
BETWEEN ITSELF AND USER FOR INITIAL LOGIN AND
AUTHENTICATION. COMMUNICATIONS VIA THIS PATH SHALL BE
INITIATED EXCLUSIVELY BY A USER.

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. THE TCB SHALL BE ABLE TO AUDIT
THE IDENTIFIED EVENTS THAT MAY BE USED IN THE EXPLOITATION OF
COVERT STORAGE CHANNELS.
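
As an illustration only and not part of the criteria text, the sketch below
models an audit record carrying the fields enumerated above, together with a
selection helper for auditing by individual identity and/or object security
level. Field and function names are hypothetical.

    # Illustrative sketch (not part of the criteria) of an audit record and a
    # selection function over individual identity and object security level.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional, Set, Tuple

    @dataclass
    class AuditRecord:
        timestamp: datetime
        user: str
        event_type: str          # e.g., "login", "file_open", "object_delete"
        success: bool
        origin: Optional[str] = None          # terminal ID for I&A events
        object_name: Optional[str] = None     # for object introduction/deletion
        object_level: Optional[Tuple[int, frozenset]] = None

    def select(records, users: Optional[Set[str]] = None, object_level=None):
        for r in records:
            if users is not None and r.user not in users:
                continue
            if object_level is not None and r.object_level != object_level:
                continue
            yield r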

3.2.3 ASSURANCE

3.2.3.1 Operational Assurance

3.2.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
THE TCB SHALL MAINTAIN PROCESS ISOLATION THROUGH THE
PROVISION OF DISTINCT ADDRESS SPACES UNDER ITS CONTROL.
THE TCB SHALL BE INTERNALLY STRUCTURED INTO WELL-DEFINED
LARGELY INDEPENDENT MODULES. IT SHALL MAKE EFFECTIVE USE
OF AVAILABLE HARDWARE TO SEPARATE THOSE ELEMENTS THAT ARE
PROTECTION-CRITICAL FROM THOSE THAT ARE NOT. THE TCB
MODULES SHALL BE DESIGNED SUCH THAT THE PRINCIPLE OF LEAST
PRIVILEGE IS ENFORCED. FEATURES IN HARDWARE, SUCH AS
SEGMENTATION, SHALL BE USED TO SUPPORT LOGICALLY DISTINCT
STORAGE OBJECTS WITH SEPARATE ATTRIBUTES (NAMELY:
READABLE, WRITEABLE). THE USER INTERFACE TO THE TCB
SHALL BE COMPLETELY DEFINED AND ALL ELEMENTS OF THE TCB
IDENTIFIED.

3.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.2.3.1.3 Covert Channel Analysis

THE SYSTEM DEVELOPER SHALL CONDUCT A THOROUGH SEARCH FOR
COVERT STORAGE CHANNELS AND MAKE A DETERMINATION (EITHER
BY ACTUAL MEASUREMENT OR BY ENGINEERING ESTIMATION) OF
THE MAXIMUM BANDWIDTH OF EACH IDENTIFIED CHANNEL. (SEE
THE COVERT CHANNELS GUIDELINE SECTION.)
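
As an illustration only and not part of the criteria text, one common
engineering estimate of a covert storage channel's maximum bandwidth divides
the bits signalled per exploitation of the shared attribute by the minimum
time one exploitation takes; the sketch below assumes that simple model.

    # Illustrative engineering estimate (not part of the criteria) of a covert
    # storage channel's maximum bandwidth under a simple assumed model.

    def estimated_bandwidth_bps(bits_per_exploit: float,
                                seconds_per_exploit: float) -> float:
        return bits_per_exploit / seconds_per_exploit

    # Example: toggling a 1-bit shared attribute every 10 ms suggests roughly
    # 100 bits per second as an upper bound for that channel.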

3.2.3.1.4 Trusted Facility Management

THE TCB SHALL SUPPORT SEPARATE OPERATOR AND ADMINISTRATOR
FUNCTIONS.

3.2.3.2 Life-Cycle Assurance

3.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall be:
to uncover all design and implementation flaws that would
permit a subject external to the TCB to read, change, or
delete data normally denied under the mandatory or
discretionary security policy enforced by the TCB; as well
as to assure that no subject (without authorization to do
so) is able to cause the TCB to enter a state such that it
is unable to respond to communications initiated by other
users. THE TCB SHALL BE FOUND RELATIVELY RESISTANT TO
PENETRATION. All discovered flaws shall be CORRECTED and
the TCB retested to demonstrate that they have been
eliminated and that new flaws have not been introduced.
TESTING SHALL DEMONSTRATE THAT THE TCB IMPLEMENTATION IS
CONSISTENT WITH THE DESCRIPTIVE TOP-LEVEL SPECIFICATION.
(See the Security Testing Guidelines.)

3.2.3.2.2 Design Specification and Verification

A FORMAL model of the security policy supported by the
TCB shall be maintained that is PROVEN consistent with
its axioms. A DESCRIPTIVE TOP-LEVEL SPECIFICATION (DTLS)
OF THE TCB SHALL BE MAINTAINED THAT COMPLETELY AND
ACCURATELY DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR
MESSAGES, AND EFFECTS. IT SHALL BE SHOWN TO BE AN
ACCURATE DESCRIPTION OF THE TCB INTERFACE.

3.2.3.2.3 Configuration Management

DURING DEVELOPMENT AND MAINTENANCE OF THE TCB, A
CONFIGURATION MANAGEMENT SYSTEM SHALL BE IN PLACE THAT
MAINTAINS CONTROL OF CHANGES TO THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION, OTHER DESIGN DATA, IMPLEMENTATION
DOCUMENTATION, SOURCE CODE, THE RUNNING VERSION OF THE
OBJECT CODE, AND TEST FIXTURES AND DOCUMENTATION. THE
CONFIGURATION MANAGEMENT SYSTEM SHALL ASSURE A CONSISTENT
MAPPING AMONG ALL DOCUMENTATION AND CODE ASSOCIATED WITH
THE CURRENT VERSION OF THE TCB. TOOLS SHALL BE PROVIDED
FOR GENERATION OF A NEW VERSION OF THE TCB FROM SOURCE
CODE. ALSO AVAILABLE SHALL BE TOOLS FOR COMPARING A
NEWLY GENERATED VERSION WITH THE PREVIOUS TCB VERSION IN
ORDER TO ASCERTAIN THAT ONLY THE INTENDED CHANGES HAVE
BEEN MADE IN THE CODE THAT WILL ACTUALLY BE USED AS THE
NEW VERSION OF THE TCB.
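
As an illustration only and not part of the criteria text, the comparison
tooling described above can be approximated by hashing every artifact in the
previous and newly generated TCB build trees and reporting the differences, so
reviewers can confirm that only the intended changes are present. The paths
and digest algorithm below are hypothetical choices.

    # Illustrative sketch (not part of the criteria): compare a newly
    # generated TCB build tree against the previous version by file digests.

    import hashlib
    import os

    def tree_digests(root):
        digests = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                rel = os.path.relpath(path, root)
                with open(path, "rb") as fh:
                    digests[rel] = hashlib.sha256(fh.read()).hexdigest()
        return digests

    def compare_versions(old_root, new_root):
        old, new = tree_digests(old_root), tree_digests(new_root)
        changed = sorted(p for p in old.keys() & new.keys() if old[p] != new[p])
        added = sorted(new.keys() - old.keys())
        removed = sorted(old.keys() - new.keys())
        return changed, added, removed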

3.2.4 DOCUMENTATION

3.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. THE TCB MODULES THAT CONTAIN
THE REFERENCE VALIDATION MECHANISM SHALL BE IDENTIFIED. THE
PROCEDURES FOR SECURE GENERATION OF A NEW TCB FROM SOURCE AFTER
MODIFICATION OF ANY MODULES IN THE TCB SHALL BE DESCRIBED.

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. IT SHALL INCLUDE RESULTS OF
TESTING THE EFFECTIVENESS OF THE METHODS USED TO REDUCE COVERT
CHANNEL BANDWIDTHS.

3.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. THE
interfaces between THE TCB modules shall be described. A
FORMAL description of the security policy model enforced by the
TCB shall be available and PROVEN that it is sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION (DTLS) SHALL BE SHOWN TO BE AN ACCURATE
DESCRIPTION OF THE TCB INTERFACE. DOCUMENTATION SHALL DESCRIBE
HOW THE TCB IMPLEMENTS THE REFERENCE MONITOR CONCEPT AND GIVE
AN EXPLANATION WHY IT IS TAMPERPROOF, CANNOT BE BYPASSED, AND
IS CORRECTLY IMPLEMENTED. DOCUMENTATION SHALL DESCRIBE HOW THE
TCB IS STRUCTURED TO FACILITATE TESTING AND TO ENFORCE LEAST
PRIVILEGE. THIS DOCUMENTATION SHALL ALSO PRESENT THE RESULTS
OF THE COVERT CHANNEL ANALYSIS AND THE TRADEOFFS INVOLVED IN
RESTRICTING THE CHANNELS. ALL AUDITABLE EVENTS THAT MAY BE
USED IN THE EXPLOITATION OF KNOWN COVERT STORAGE CHANNELS SHALL
BE IDENTIFIED. THE BANDWIDTHS OF KNOWN COVERT STORAGE CHANNELS,
THE USE OF WHICH IS NOT DETECTABLE BY THE AUDITING MECHANISMS,
SHALL BE PROVIDED. (SEE THE COVERT CHANNEL GUIDELINE SECTION.)

3.3 CLASS (B3): SECURITY DOMAINS

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration. The following are minimal requirements for systems assigned a
class (B3) rating:

3.3.1 SECURITY POLICY

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (E.G., ACCESS CONTROL LISTS) shall
allow users to specify and control sharing of those OBJECTS.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls shall
be capable of SPECIFYING, FOR EACH NAMED OBJECT, A LIST OF NAMED
INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS WITH THEIR
RESPECTIVE MODES OF ACCESS TO THAT OBJECT. FURTHERMORE, FOR
EACH SUCH NAMED OBJECT, IT SHALL BE POSSIBLE TO SPECIFY A LIST
OF NAMED INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS
FOR WHICH NO ACCESS TO THE OBJECT IS TO BE GIVEN. Access
permission to an object by users not already possessing access
permission shall only be assigned by authorized users.
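
As an illustration only and not part of the criteria text, the sketch below
shows a hypothetical access check over an ACL that lists allowed individuals
and groups with their modes of access and also lists principals explicitly
excluded from any access, with exclusion taking precedence.

    # Illustrative sketch (not part of the criteria) of a B3-style ACL check:
    # explicit exclusions for individuals or groups take precedence; otherwise
    # access requires a matching allow entry with the requested mode.

    def check_access(acl, user, user_groups, mode):
        """acl: {"deny": set of principals, "allow": {principal: set of modes}}."""
        principals = {user} | set(user_groups)
        if principals & acl["deny"]:
            return False
        return any(mode in acl["allow"].get(p, set()) for p in principals)

    acl = {"deny": {"contractors"},
           "allow": {"alice": {"read", "write"}, "auditors": {"read"}}}
    assert check_access(acl, "alice", [], "write")
    assert not check_access(acl, "bob", ["contractors", "auditors"], "read")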

3.3.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.3.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.3.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.3.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.3.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

3.3.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________

3.3.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

3.3.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

3.3.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.

3.3.2 ACCOUNTABILITY

3.3.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.3.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and USERS for USE WHEN A POSITIVE TCB-TO-
USER CONNECTION IS REQUIRED (E.G., LOGIN, CHANGE SUBJECT
SECURITY LEVEL). Communications via this TRUSTED path
shall be ACTIVATED exclusively by a user OR THE TCB AND
SHALL BE LOGICALLY ISOLATED AND UNMISTAKABLY
DISTINGUISHABLE FROM OTHER PATHS.

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object and the object’s security level.
The ADP system administrator shall be able to selectively audit
the actions of any one or more users based on individual
identity and/or object security level. The TCB shall be able to
audit the identified events that may be used in the exploitation
of covert storage channels. THE TCB SHALL CONTAIN A MECHANISM
THAT IS ABLE TO MONITOR THE OCCURRENCE OR ACCUMULATION OF
SECURITY AUDITABLE EVENTS THAT MAY INDICATE AN IMMINENT
VIOLATION OF SECURITY POLICY. THIS MECHANISM SHALL BE ABLE TO
IMMEDIATELY NOTIFY THE SECURITY ADMINISTRATOR WHEN THRESHOLDS
ARE EXCEEDED.
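
As an illustration only and not part of the criteria text, the sketch below
accumulates counts of selected auditable events and invokes a hypothetical
notification hook for the security administrator as soon as a configured
threshold is exceeded.

    # Illustrative sketch (not part of the criteria): monitor the accumulation
    # of selected auditable events and notify when a threshold is exceeded.

    class ThresholdMonitor:
        def __init__(self, thresholds, notify):
            self.thresholds = thresholds      # e.g., {"failed_login": 5}
            self.counts = {}
            self.notify = notify              # callable(event_type, count)

        def record(self, event_type):
            self.counts[event_type] = self.counts.get(event_type, 0) + 1
            limit = self.thresholds.get(event_type)
            if limit is not None and self.counts[event_type] > limit:
                self.notify(event_type, self.counts[event_type])

    monitor = ThresholdMonitor({"failed_login": 5},
                               lambda e, n: print("ALERT:", e, n))
    for _ in range(6):
        monitor.record("failed_login")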

3.3.3 ASSURANCE

3.3.3.1 Operational Assurance

3.3.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB shall
be completely defined and all elements of the TCB
identified. THE TCB SHALL BE DESIGNED AND STRUCTURED TO
USE A COMPLETE, CONCEPTUALLY SIMPLE PROTECTION MECHANISM
WITH PRECISELY DEFINED SEMANTICS. THIS MECHANISM SHALL
PLAY A CENTRAL ROLE IN ENFORCING THE INTERNAL STRUCTURING
OF THE TCB AND THE SYSTEM. THE TCB SHALL INCORPORATE
SIGNIFICANT USE OF LAYERING, ABSTRACTION AND DATA HIDING.
SIGNIFICANT SYSTEM ENGINEERING SHALL BE DIRECTED TOWARD
MINIMIZING THE COMPLEXITY OF THE TCB AND EXCLUDING FROM
THE TCB MODULES THAT ARE NOT PROTECTION-CRITICAL.

3.3.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.3.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.)

3.3.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. THE FUNCTIONS PERFORMED IN THE ROLE OF A
SECURITY ADMINISTRATOR SHALL BE IDENTIFIED. THE ADP
SYSTEM ADMINISTRATIVE PERSONNEL SHALL ONLY BE ABLE TO
PERFORM SECURITY ADMINISTRATOR FUNCTIONS AFTER TAKING A
DISTINCT AUDITABLE ACTION TO ASSUME THE SECURITY
ADMINISTRATOR ROLE ON THE ADP SYSTEM. NON-SECURITY
FUNCTIONS THAT CAN BE PERFORMED IN THE SECURITY
ADMINISTRATION ROLE SHALL BE LIMITED STRICTLY TO THOSE
ESSENTIAL TO PERFORMING THE SECURITY ROLE EFFECTIVELY.

3.3.3.1.5 Trusted Recovery

PROCEDURES AND/OR MECHANISMS SHALL BE PROVIDED TO ASSURE
THAT, AFTER AN ADP SYSTEM FAILURE OR OTHER DISCONTINUITY,
RECOVERY WITHOUT A PROTECTION COMPROMISE IS OBTAINED.

3.3.3.2 Life-Cycle Assurance

3.3.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be FOUND RESISTANT TO penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the descriptive
top-level specification. (See the Security Testing
Guidelines.) NO DESIGN FLAWS AND NO MORE THAN A FEW
CORRECTABLE IMPLEMENTATION FLAWS MAY BE FOUND DURING
TESTING AND THERE SHALL BE REASONABLE CONFIDENCE THAT
FEW REMAIN.

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. It shall be shown to be an
accurate description of the TCB interface. A CONVINCING
ARGUMENT SHALL BE GIVEN THAT THE DTLS IS CONSISTENT WITH
THE MODEL.

3.3.3.2.3 Configuration Management

During development and maintenance of the TCB, a
configuration management system shall be in place that
maintains control of changes to the descriptive top-level
specification, other design data, implementation
documentation, source code, the running version of the
object code, and test fixtures and documentation. The
configuration management system shall assure a consistent
mapping among all documentation and code associated with
the current version of the TCB. Tools shall be provided
for generation of a new version of the TCB from source
code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in
order to ascertain that only the intended changes have
been made in the code that will actually be used as the
new version of the TCB.

3.3.4 DOCUMENTATION

3.3.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.3.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. IT
SHALL INCLUDE THE PROCEDURES TO ENSURE THAT THE SYSTEM IS
INITIALLY STARTED IN A SECURE MANNER. PROCEDURES SHALL ALSO BE
INCLUDED TO RESUME SECURE SYSTEM OPERATION AFTER ANY LAPSE IN
SYSTEM OPERATION.

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths.

3.3.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and proven that it is sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. THE TCB IMPLEMENTATION (I.E., IN
HARDWARE, FIRMWARE, AND SOFTWARE) SHALL BE INFORMALLY SHOWN TO
BE CONSISTENT WITH THE DTLS. THE ELEMENTS OF THE DTLS SHALL BE
SHOWN, USING INFORMAL TECHNIQUES, TO CORRESPOND TO THE ELEMENTS
OF THE TCB. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)

4.0 DIVISION A: VERIFIED PROTECTION

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

4.1 CLASS (A1): VERIFIED DESIGN

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design.
Independent of the particular specification language or verification system
used, there are five important criteria for class (A1) design verification:

* A formal model of the security policy must be clearly
identified and documented, including a mathematical proof
that the model is consistent with its axioms and is
sufficient to support the security policy.

* An FTLS must be produced that includes abstract definitions
of the functions the TCB performs and of the hardware and/or
firmware mechanisms that are used to support separate
execution domains.

* The FTLS of the TCB must be shown to be consistent with the
model by formal techniques where possible (i.e., where
verification tools exist) and informal ones otherwise.

* The TCB implementation (i.e., in hardware, firmware, and
software) must be informally shown to be consistent with the
FTLS. The elements of the FTLS must be shown, using
informal techniques, to correspond to the elements of the
TCB. The FTLS must express the unified protection mechanism
required to satisfy the security policy, and it is the
elements of this protection mechanism that are mapped to the
elements of the TCB.

* Formal analysis techniques must be used to identify and
analyze covert channels. Informal techniques may be used to
identify covert timing channels. The continued existence of
identified covert channels in the system must be justified.

In keeping with the extensive design and development analysis of the TCB
required of systems in class (A1), more stringent configuration management is
required and procedures are established for securely distributing the system
to sites. A system security administrator is supported.

The following are minimal requirements for systems assigned a class (A1)
rating:

4.1.1 SECURITY POLICY

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls
shall be capable of specifying, for each named object, a list
of named individuals and a list of groups of named individuals
with their respective modes of access to that object.
Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of
named individuals for which no access to the object is to be
given. Access permission to an object by users not already
possessing access permission shall only be assigned by
authorized users.

4.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

4.1.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

4.1.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

4.1.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

4.1.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

4.1.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

____________________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________________

4.1.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

4.1.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

4.1.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.

4.1.2 ACCOUNTABILITY

4.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

4.1.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and users for use when a positive TCB-to-
user connection is required (e.g., login, change subject
security level). Communications via this trusted path
shall be activated exclusively by a user or the TCB and
shall be logically isolated and unmistakably
distinguishable from other paths.

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. The TCB shall be able to audit
the identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able to
immediately notify the security administrator when thresholds
are exceeded.

4.1.3 ASSURANCE

4.1.3.1 Operational Assurance

4.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB
shall be completely defined and all elements of the TCB
identified. The TCB shall be designed and structured to
use a complete, conceptually simple protection mechanism
with precisely defined semantics. This mechanism shall
play a central role in enforcing the internal structuring
of the TCB and the system. The TCB shall incorporate
significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward
minimizing the complexity of the TCB and excluding from
the TCB modules that are not protection-critical.

4.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

4.1.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.) FORMAL METHODS SHALL
BE USED IN THE ANALYSIS.

4.1.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. The functions performed in the role of a
security administrator shall be identified. The ADP
system administrative personnel shall only be able to
perform security administrator functions after taking a
distinct auditable action to assume the security
administrator role on the ADP system. Non-security
functions that can be performed in the security
administration role shall be limited strictly to those
essential to performing the security role effectively.

4.1.3.1.5 Trusted Recovery

Procedures and/or mechanisms shall be provided to assure
that, after an ADP system failure or other discontinuity,
recovery without a protection compromise is obtained.

4.1.3.2 Life-Cycle Assurance

4.1.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be found resistant to penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the FORMAL top-
level specification. (See the Security Testing
Guidelines.) No design flaws and no more than a few
correctable implementation flaws may be found during
testing and there shall be reasonable confidence that few
remain. MANUAL OR OTHER MAPPING OF THE FTLS TO THE
SOURCE CODE MAY FORM A BASIS FOR PENETRATION TESTING.

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. A FORMAL TOP-LEVEL SPECIFICATION
(FTLS) OF THE TCB SHALL BE MAINTAINED THAT ACCURATELY
DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR MESSAGES,
AND EFFECTS. THE DTLS AND FTLS SHALL INCLUDE THOSE
COMPONENTS OF THE TCB THAT ARE IMPLEMENTED AS HARDWARE
AND/OR FIRMWARE IF THEIR PROPERTIES ARE VISIBLE AT THE
TCB INTERFACE. THE FTLS shall be shown to be an accurate
description of the TCB interface. A convincing argument
shall be given that the DTLS is consistent with the model
AND A COMBINATION OF FORMAL AND INFORMAL TECHNIQUES SHALL
BE USED TO SHOW THAT THE FTLS IS CONSISTENT WITH THE
MODEL. THIS VERIFICATION EVIDENCE SHALL BE CONSISTENT
WITH THAT PROVIDED WITHIN THE STATE-OF-THE-ART OF THE
PARTICULAR COMPUTER SECURITY CENTER-ENDORSED FORMAL
SPECIFICATION AND VERIFICATION SYSTEM USED. MANUAL OR
OTHER MAPPING OF THE FTLS TO THE TCB SOURCE CODE SHALL BE
PERFORMED TO PROVIDE EVIDENCE OF CORRECT IMPLEMENTATION.

4.1.3.2.3 Configuration Management

During THE ENTIRE LIFE-CYCLE, I.E., DURING THE DESIGN,
DEVELOPMENT, and maintenance of the TCB, a configuration
management system shall be in place FOR ALL SECURITY-
RELEVANT HARDWARE, FIRMWARE, AND SOFTWARE that maintains
control of changes to THE FORMAL MODEL, the descriptive
AND FORMAL top-level SPECIFICATIONS, other design data,
implementation documentation, source code, the running
version of the object code, and test fixtures and
documentation. The configuration management system shall
assure a consistent mapping among all documentation and
code associated with the current version of the TCB.
Tools shall be provided for generation of a new version
of the TCB from source code. Also available shall be
tools, MAINTAINED UNDER STRICT CONFIGURATION CONTROL, for
comparing a newly generated version with the previous TCB
version in order to ascertain that only the intended
changes have been made in the code that will actually be
used as the new version of the TCB. A COMBINATION OF
TECHNICAL, PHYSICAL, AND PROCEDURAL SAFEGUARDS SHALL BE
USED TO PROTECT FROM UNAUTHORIZED MODIFICATION OR
DESTRUCTION THE MASTER COPY OR COPIES OF ALL MATERIAL
USED TO GENERATE THE TCB.

4.1.3.2.4 Trusted Distribution

A TRUSTED ADP SYSTEM CONTROL AND DISTRIBUTION FACILITY
SHALL BE PROVIDED FOR MAINTAINING THE INTEGRITY OF THE
MAPPING BETWEEN THE MASTER DATA DESCRIBING THE CURRENT
VERSION OF THE TCB AND THE ON-SITE MASTER COPY OF THE
CODE FOR THE CURRENT VERSION. PROCEDURES (E.G., SITE
SECURITY ACCEPTANCE TESTING) SHALL EXIST FOR ASSURING
THAT THE TCB SOFTWARE, FIRMWARE, AND HARDWARE UPDATES
DISTRIBUTED TO A CUSTOMER ARE EXACTLY AS SPECIFIED BY
THE MASTER COPIES.
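
As an illustration only and not part of the criteria text, a site security
acceptance check of the kind mentioned above could, under the assumption that
a digest of the master copy is published to the site, verify that a
distributed update matches that digest; the sketch below uses a hypothetical
SHA-256 digest for this purpose.

    # Illustrative sketch (not part of the criteria): verify a distributed TCB
    # update against a digest assumed to be published for the master copy.

    import hashlib

    def verify_update(update_path: str, master_digest: str) -> bool:
        with open(update_path, "rb") as fh:
            return hashlib.sha256(fh.read()).hexdigest() == master_digest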

4.1.4 DOCUMENTATION

4.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

4.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. It
shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in
system operation.

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths. THE RESULTS OF THE MAPPING BETWEEN THE
FORMAL TOP-LEVEL SPECIFICATION AND THE TCB SOURCE CODE SHALL BE
GIVEN.

4.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and shall be proven sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. The TCB implementation (i.e., in
hardware, firmware, and software) shall be informally shown to
be consistent with the FORMAL TOP-LEVEL SPECIFICATION (FTLS).
The elements of the FTLS shall be shown, using informal
techniques, to correspond to the elements of the TCB.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)
HARDWARE, FIRMWARE, AND SOFTWARE MECHANISMS NOT DEALT WITH IN
THE FTLS BUT STRICTLY INTERNAL TO THE TCB (E.G., MAPPING
REGISTERS, DIRECT MEMORY ACCESS I/O) SHALL BE CLEARLY DESCRIBED.

4.2 BEYOND CLASS (A1)

Most of the security enhancements envisioned for systems that will provide
features and assurance in addition to that already provided by class (A1)
systems are beyond current technology. The discussion below is intended to
guide future work and is derived from research and development activities
already underway in both the public and private sectors. As more and better
analysis techniques are developed, the requirements for these systems will
become more explicit. In the future, use of formal verification will be
extended to the source level and covert timing channels will be more fully
addressed. At this level the design environment will become important and
testing will be aided by analysis of the formal top-level specification.
Consideration will be given to the correctness of the tools used in TCB
development (e.g., compilers, assemblers, loaders) and to the correct
functioning of the hardware/firmware on which the TCB will run. Areas to be
addressed by systems beyond class (A1) include:

* System Architecture

A demonstration (formal or otherwise) must be given showing
that requirements of self-protection and completeness for
reference monitors have been implemented in the TCB.

* Security Testing

Although beyond the current state-of-the-art, it is
envisioned that some test-case generation will be done
automatically from the formal top-level specification or
formal lower-level specifications.

* Formal Specification and Verification

The TCB must be verified down to the source code level,
using formal verification methods where feasible. Formal
verification of the source code of the security-relevant
portions of an operating system has proven to be a difficult
task. Two important considerations are the choice of a
high-level language whose semantics can be fully and
formally expressed, and a careful mapping, through
successive stages, of the abstract formal design to a
formalization of the implementation in low-level
specifications. Experience has shown that only when the
lowest level specifications closely correspond to the actual
code can code proofs be successfully accomplished.

* Trusted Design Environment

The TCB must be designed in a trusted facility with only
trusted (cleared) personnel.

PART II:

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS

The criteria are divided within each class into groups of requirements. These
groupings were developed to assure that three basic control objectives for
computer security are satisfied and not overlooked. These control objectives
deal with:

* Security Policy
* Accountability
* Assurance

This section provides a discussion of these general control objectives and
their implication in terms of designing trusted systems.

5.1 A Need for Consensus

A major goal of the DoD Computer Security Center is to encourage the computer
industry to develop trusted computer systems and products, making them widely
available in the commercial marketplace. Achievement of this goal will
require recognition and articulation by both the public and private sectors of
a need and demand for such products.

As described in the introduction to this document, efforts to define the
problems and develop solutions associated with processing nationally sensitive
information, as well as other sensitive data such as financial, medical, and
personnel information used by the National Security Establishment, have been
underway for a number of years. The criteria, as described in Part I,
represent the culmination of these efforts and describe basic requirements for
building trusted computer systems. To date, however, these systems have been
viewed by many as satisfying only National Security needs. As long as this
perception continues, the consensus needed to motivate manufacture of trusted
systems will be lacking.

The purpose of this section is to describe, in some detail, the fundamental
control objectives that lay the foundations for requirements delineated in the
criteria. The goal is to explain the foundations so that those outside the
National Security Establishment can assess their universality and, by
extension, the universal applicability of the criteria requirements to
processing all types of sensitive applications whether they be for National
Security or the private sector.

5.2 Definition and Usefulness

The term “control objective” refers to a statement of intent with respect to
control over some aspect of an organization’s resources, or processes, or
both. In terms of a computer system, control objectives provide a framework
for developing a strategy for fulfilling a set of security requirements for
any given system. Developed in response to generic vulnerabilities, such as
the need to manage and handle sensitive data in order to prevent compromise,
or the need to provide accountability in order to detect fraud, control
objectives have been identified as a useful method of expressing security
goals.[3]

Examples of control objectives include the three basic design requirements for
implementing the reference monitor concept discussed in Section 6. They are:

* The reference validation mechanism must be tamperproof.

* The reference validation mechanism must always be invoked.

* The reference validation mechanism must be small enough to be
subjected to analysis and tests, the completeness of which can
be assured.[1]

5.3 Criteria Control Objectives

The three basic control objectives of the criteria are concerned with security
policy, accountability, and assurance. The remainder of this section provides
a discussion of these basic requirements.

5.3.1 Security Policy

In the most general sense, computer security is concerned with
controlling the way in which a computer can be used, i.e.,
controlling how information processed by it can be accessed and
manipulated. However, on closer examination, computer security
can refer to a number of areas. Symptomatic of this, FIPS PUB 39,
Glossary For Computer Systems Security, does not have a unique
definition for computer security.[16] Instead, there are eleven
separate definitions for security which include: ADP systems
security, administrative security, data security, etc. A common
thread running through these definitions is the word “protection.”
Further declarations of protection requirements can be found in
DoD Directive 5200.28 which describes an acceptable level of
protection for classified data to be one that will “assure that
systems which process, store, or use classified data and produce
classified information will, with reasonable dependability,
prevent: a. Deliberate or inadvertent access to classified
material by unauthorized persons, and b. Unauthorized
manipulation of the computer and its associated peripheral
devices.”[8]

In summary, protection requirements must be defined in terms of
the perceived threats, risks, and goals of an organization. This
is often stated in terms of a security policy. It has been
pointed out in the literature that it is external laws, rules,
regulations, etc. that establish what access to information is to
be permitted, independent of the use of a computer. In particular,
a given system can only be said to be secure with respect to its
enforcement of some specific policy.[30] Thus, the control
objective for security policy is:

SECURITY POLICY CONTROL OBJECTIVE

A STATEMENT OF INTENT WITH REGARD TO CONTROL OVER ACCESS TO AND
DISSEMINATION OF INFORMATION, TO BE KNOWN AS THE SECURITY POLICY,
MUST BE PRECISELY DEFINED AND IMPLEMENTED FOR EACH SYSTEM THAT IS
USED TO PROCESS SENSITIVE INFORMATION. THE SECURITY POLICY MUST
ACCURATELY REFLECT THE LAWS, REGULATIONS, AND GENERAL POLICIES
FROM WHICH IT IS DERIVED.

5.3.1.1 Mandatory Security Policy

Where a security policy is developed that is to be applied
to control of classified or other specifically designated
sensitive information, the policy must include detailed
rules on how to handle that information throughout its
life-cycle. These rules are a function of the various
sensitivity designations that the information can assume
and the various forms of access supported by the system.
Mandatory security refers to the enforcement of a set of
access control rules that constrains a subject’s access to
information on the basis of a comparison of that
individual’s clearance/authorization to the information,
the classification/sensitivity designation of the
information, and the form of access being mediated.
Mandatory policies either require or can be satisfied by
systems that can enforce a partial ordering of
designations, namely, the designations must form what is
mathematically known as a “lattice.”[5]
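
The lattice of designations can be pictured with a short sketch: a label
consists of a hierarchical classification and a set of non-hierarchical
categories, and one label dominates another when its classification is at
least as high and its category set is a superset. The sketch below is an
illustration only; the level names and the set representation are
assumptions of the example, not a prescribed encoding.

    # Illustrative sketch of the "lattice" of sensitivity designations.
    from dataclasses import dataclass

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    @dataclass(frozen=True)
    class Label:
        level: str                  # hierarchical classification
        categories: frozenset[str]  # non-hierarchical categories

    def dominates(a: Label, b: Label) -> bool:
        """True when label a dominates label b in the lattice."""
        return LEVELS[a.level] >= LEVELS[b.level] and a.categories >= b.categories

    # Example: a SECRET clearance with category {"NATO"} dominates
    # CONFIDENTIAL data with no categories, but not CONFIDENTIAL data
    # carrying a category the clearance lacks.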

A clear implication of the above is that the system must
assure that the designations associated with sensitive data
cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate authorization to
access sensitive information. Also implied is the
requirement that the system control the flow of information
so that data cannot be stored with lower sensitivity
designations unless its “downgrading” has been authorized.
The control objective is:

MANDATORY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SPECIFICALLY CATEGORIZED
SENSITIVE INFORMATION MUST INCLUDE PROVISIONS FOR THE
ENFORCEMENT OF MANDATORY ACCESS CONTROL RULES. THAT IS,
THEY MUST INCLUDE A SET OF RULES FOR CONTROLLING ACCESS
BASED DIRECTLY ON A COMPARISON OF THE INDIVIDUAL’S
CLEARANCE OR AUTHORIZATION FOR THE INFORMATION AND THE
CLASSIFICATION OR SENSITIVITY DESIGNATION OF THE
INFORMATION BEING SOUGHT, AND INDIRECTLY ON CONSIDERATIONS
OF PHYSICAL AND OTHER ENVIRONMENTAL FACTORS OF CONTROL.
THE MANDATORY ACCESS CONTROL RULES MUST ACCURATELY REFLECT
THE LAWS, REGULATIONS, AND GENERAL POLICIES FROM WHICH
THEY ARE DERIVED.

5.3.1.2 Discretionary Security Policy

Discretionary security is the principal type of access
control available in computer systems today. The basis of
this kind of security is that an individual user, or
program operating on his behalf, is allowed to specify
explicitly the types of access other users may have to
information under his control. Discretionary security
differs from mandatory security in that it implements an
access control policy on the basis of an individual’s
need-to-know as opposed to mandatory controls which are
driven by the classification or sensitivity designation of
the information.

Discretionary controls are not a replacement for mandatory
controls. In an environment in which information is
classified (as in the DoD) discretionary security provides
for a finer granularity of control within the overall
constraints of the mandatory policy. Access to classified
information requires effective implementation of both types
of controls as a precondition to granting that access. In
general, no person may have access to classified
information unless: (a) that person has been determined to
be trustworthy, i.e., granted a personnel security
clearance — MANDATORY, and (b) access is necessary for the
performance of official duties, i.e., determined to have a
need-to-know — DISCRETIONARY. In other words,
discretionary controls give individuals discretion to
decide on which of the permissible accesses will actually
be allowed to which users, consistent with overriding
mandatory policy restrictions. The control objective is:

DISCRETIONARY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SENSITIVE INFORMATION MUST
INCLUDE PROVISIONS FOR THE ENFORCEMENT OF DISCRETIONARY
ACCESS CONTROL RULES. THAT IS, THEY MUST INCLUDE A
CONSISTENT SET OF RULES FOR CONTROLLING AND LIMITING ACCESS
BASED ON IDENTIFIED INDIVIDUALS WHO HAVE BEEN DETERMINED TO
HAVE A NEED-TO-KNOW FOR THE INFORMATION.
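
The two preconditions described above can be combined in a short sketch that
reuses the Label and dominates() definitions from the illustration in
Section 5.3.1.1. The set standing in for need-to-know determinations is an
assumption of the example; actual discretionary mechanisms (access control
lists, capability lists, and the like) are considerably richer.

    # Illustrative sketch: access requires both the mandatory check
    # (clearance dominates the object's designation) and the discretionary
    # check (the individual has been determined to have a need-to-know).
    # Reuses Label and dominates() from the Section 5.3.1.1 sketch.
    def may_access(user_id: str, clearance: Label,
                   object_label: Label, need_to_know: set[str]) -> bool:
        mandatory_ok = dominates(clearance, object_label)   # MANDATORY
        discretionary_ok = user_id in need_to_know          # DISCRETIONARY
        return mandatory_ok and discretionary_ok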

5.3.1.3 Marking

To implement a set of mechanisms that will put into effect
a mandatory security policy, it is necessary that the
system mark information with appropriate classification or
sensitivity labels and maintain these markings as the
information moves through the system. Once information is
unalterably and accurately marked, comparisons required by
the mandatory access control rules can be accurately and
consistently made. An additional benefit of having the
system maintain the classification or sensitivity label
internally is the ability to automatically generate
properly “labeled” output. The labels, if accurately and
integrally maintained by the system, remain accurate when
output from the system. The control objective is:

MARKING CONTROL OBJECTIVE

SYSTEMS THAT ARE DESIGNED TO ENFORCE A MANDATORY SECURITY
POLICY MUST STORE AND PRESERVE THE INTEGRITY OF
CLASSIFICATION OR OTHER SENSITIVITY LABELS FOR ALL
INFORMATION. LABELS EXPORTED FROM THE SYSTEM MUST BE
ACCURATE REPRESENTATIONS OF THE CORRESPONDING INTERNAL
SENSITIVITY LABELS BEING EXPORTED.

5.3.2 Accountability

The second basic control objective addresses one of the
fundamental principles of security, i.e., individual
accountability. Individual accountability is the key to securing
and controlling any system that processes information on behalf
of individuals or groups of individuals. A number of requirements
must be met in order to satisfy this objective.

The first requirement is for individual user identification.
Second, there is a need for authentication of the identification.
Identification is functionally dependent on authentication.
Without authentication, user identification has no credibility.
Without a credible identity, neither mandatory nor discretionary
security policies can be properly invoked because there is no
assurance that proper authorizations can be made.

The third requirement is for dependable audit capabilities. That
is, a trusted computer system must provide authorized personnel
with the ability to audit any action that can potentially cause
access to, generation of, or release of classified or
sensitive information. The audit data will be selectively
acquired based on the auditing needs of a particular installation
and/or application. However, there must be sufficient granularity
in the audit data to support tracing the auditable events to a
specific individual who has taken the actions or on whose behalf
the actions were taken. The control objective is:

ACCOUNTABILITY CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST ASSURE INDIVIDUAL ACCOUNTABILITY
WHENEVER EITHER A MANDATORY OR DISCRETIONARY SECURITY POLICY IS
INVOKED. FURTHERMORE, TO ASSURE ACCOUNTABILITY THE CAPABILITY
MUST EXIST FOR AN AUTHORIZED AND COMPETENT AGENT TO ACCESS AND
EVALUATE ACCOUNTABILITY INFORMATION BY A SECURE MEANS, WITHIN A
REASONABLE AMOUNT OF TIME, AND WITHOUT UNDUE DIFFICULTY.
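
A short sketch of an audit record carrying the granularity called for above,
i.e., enough detail to trace an auditable event to the specific individual
who took the action or on whose behalf it was taken, follows. The field
names and example values are assumptions of the illustration, not a
prescribed record format.

    # Illustrative sketch of an audit record traceable to an individual.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditRecord:
        timestamp: datetime   # when the event occurred
        user_id: str          # authenticated identity taking the action
        on_behalf_of: str     # principal on whose behalf it was taken
        event: str            # e.g., "login", "open", "create", "destroy"
        object_name: str      # object or file involved
        object_label: str     # its classification or sensitivity designation
        outcome: str          # "success" or "failure"

    record = AuditRecord(datetime.now(timezone.utc), "jdoe", "jdoe",
                         "open", "payroll-file", "SENSITIVE", "success")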

5.3.3 Assurance

The third basic control objective is concerned with guaranteeing
or providing confidence that the security policy has been
implemented correctly and that the protection-relevant elements of
the system do, indeed, accurately mediate and enforce the intent
of that policy. By extension, assurance must include a guarantee
that the trusted portion of the system works only as intended. To
accomplish these objectives, two types of assurance are needed.
They are life-cycle assurance and operational assurance.

Life-cycle assurance refers to steps taken by an organization to
ensure that the system is designed, developed, and maintained
using formalized and rigorous controls and standards.[17]
Computer systems that process and store sensitive or classified
information depend on the hardware and software to protect that
information. It follows that the hardware and software themselves
must be protected against unauthorized changes that could cause
protection mechanisms to malfunction or be bypassed completely.
For this reason trusted computer systems must be carefully
evaluated and tested during the design and development phases and
reevaluated whenever changes are made that could affect the
integrity of the protection mechanisms. Only in this way can
confidence be provided that the hardware and software
interpretation of the security policy is maintained accurately
and without distortion.

While life-cycle assurance is concerned with procedures for
managing system design, development, and maintenance, operational
assurance focuses on features and system architecture used to
ensure that the security policy is uncircumventably enforced
during system operation. That is, the security policy must be
integrated into the hardware and software protection features of
the system. Examples of steps taken to provide this kind of
confidence include: methods for testing the operational hardware
and software for correct operation, isolation of protection-
critical code, and the use of hardware and software to provide
distinct domains. The control objective is:

ASSURANCE CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST BE DESIGNED TO GUARANTEE CORRECT AND
ACCURATE INTERPRETATION OF THE SECURITY POLICY AND MUST NOT
DISTORT THE INTENT OF THAT POLICY. ASSURANCE MUST BE PROVIDED
THAT CORRECT IMPLEMENTATION AND OPERATION OF THE POLICY EXISTS
THROUGHOUT THE SYSTEM’S LIFE-CYCLE.

6.0 RATIONALE BEHIND THE EVALUATION CLASSES

6.1 The Reference Monitor Concept

In October of 1972, the Computer Security Technology Planning Study, conducted
by James P. Anderson & Co., produced a report for the Electronic Systems
Division (ESD) of the United States Air Force.[1] In that report, the concept
of “a reference monitor which enforces the authorized access relationships
between subjects and objects of a system” was introduced. The reference
monitor concept was found to be an essential element of any system that would
provide multilevel secure computing facilities and controls.

The Anderson report went on to define the reference validation mechanism as
“an implementation of the reference monitor concept . . . that validates
each reference to data or programs by any user (program) against a list of
authorized types of reference for that user.” It then listed the three design
requirements that must be met by a reference validation mechanism:

a. The reference validation mechanism must be tamper proof.

b. The reference validation mechanism must always be invoked.

c. The reference validation mechanism must be small enough to be
subject to analysis and tests, the completeness of which can
be assured.”[1]

Extensive peer review and continuing research and development activities have
sustained the validity of the Anderson Committee’s findings. Early examples
of the reference validation mechanism were known as security kernels. The
Anderson Report described the security kernel as “that combination of hardware
and software which implements the reference monitor concept.”[1] In this vein,
it will be noted that the security kernel must support the three reference
monitor requirements listed above.
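
The three requirements can be pictured with a short sketch in which every
reference is funnelled through a single, small check routine, so that
mediation is always invoked and the deciding code stays compact enough to be
analyzed. The sketch is an illustration only; the authorization table stands
in for whatever policy an actual reference validation mechanism would
enforce.

    # Illustrative sketch of a reference validation mechanism: all access
    # decisions pass through one small check.
    class ReferenceMonitor:
        def __init__(self, authorized: dict[tuple[str, str], set[str]]):
            # (subject, object) -> set of permitted access modes
            self._authorized = authorized

        def check(self, subject: str, obj: str, mode: str) -> bool:
            return mode in self._authorized.get((subject, obj), set())

    monitor = ReferenceMonitor({("process-1", "file-a"): {"read"}})
    assert monitor.check("process-1", "file-a", "read")
    assert not monitor.check("process-1", "file-a", "write")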

6.2 A Formal Security Policy Model

Following the publication of the Anderson report, considerable research was
initiated into formal models of security policy requirements and of the
mechanisms that would implement and enforce those policy models as a security
kernel. Prominent among these efforts was the ESD-sponsored development of
the Bell and LaPadula model, an abstract formal treatment of DoD security
policy.[2] Using mathematics and set theory, the model precisely defines the
notion of secure state, fundamental modes of access, and the rules for
granting subjects specific modes of access to objects. Finally, a theorem is
proven to demonstrate that the rules are security-preserving operations, so
that the application of any sequence of the rules to a system that is in a
secure state will result in the system entering a new state that is also
secure. This theorem is known as the Basic Security Theorem.

The Bell and LaPadula model defines a relationship between clearances of
subjects and classifications of system objects, now referenced as the
“dominance relation.” From this definition, accesses permitted between
subjects and objects are explicitly defined for the fundamental modes of
access, including read-only access, read/write access, and write-only access.
The model defines the Simple Security Condition to control granting a subject
read access to a specific object, and the *-Property (read “Star Property”) to
control granting a subject write access to a specific object. Both the Simple
Security Condition and the *-Property include mandatory security provisions
based on the dominance relation between the clearance of the subject and the
classification of the object. The Discretionary Security Property is also
defined, and requires that a specific subject be authorized for the particular
mode of access required for the state transition. In its treatment of
subjects (processes acting on behalf of a user), the model distinguishes
between trusted subjects (i.e., not constrained within the model by the
*-Property) and untrusted subjects (those that are constrained by the
*-Property).
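
The mandatory portions of the Simple Security Condition and the *-Property
can be sketched as follows, reusing the Label and dominates() definitions
from the illustration in Section 5.3.1.1. This is a deliberately simplified
picture: the Discretionary Security Property, trusted subjects, and the
model's state-transition rules are omitted.

    # Illustrative sketch of the mandatory access conditions for an
    # untrusted subject. Reuses Label and dominates() from the
    # Section 5.3.1.1 sketch.
    def simple_security_ok(subject: Label, obj: Label) -> bool:
        # "No read up": the subject's clearance must dominate the object.
        return dominates(subject, obj)

    def star_property_ok(subject: Label, obj: Label) -> bool:
        # "No write down": the object must dominate the subject's level.
        return dominates(obj, subject)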

From the Bell and LaPadula model there evolved a model of the method of proof
required to formally demonstrate that all arbitrary sequences of state
transitions are security-preserving. It was also shown that the *-Property
is sufficient to prevent the compromise of information by Trojan Horse
attacks.

6.3 The Trusted Computing Base

In order to encourage the widespread commercial availability of trusted
computer systems, these evaluation criteria have been designed to address
those systems in which a security kernel is specifically implemented as well
as those in which a security kernel has not been implemented. The latter case
includes those systems in which objective (c) is not fully supported because
of the size or complexity of the reference validation mechanism. For
convenience, these evaluation criteria use the term Trusted Computing Base to
refer to the reference validation mechanism, be it a security kernel,
front-end security filter, or the entire trusted computer system.

The heart of a trusted computer system is the Trusted Computing Base (TCB)
which contains all of the elements of the system responsible for supporting
the security policy and supporting the isolation of objects (code and data) on
which the protection is based. The bounds of the TCB equate to the “security
perimeter” referenced in some computer security literature. In the interest
of understandable and maintainable protection, a TCB should be as simple as
possible consistent with the functions it has to perform. Thus, the TCB
includes hardware, firmware, and software critical to protection and must be
designed and implemented such that system elements excluded from it need not
be trusted to maintain protection. Identification of the interface and
elements of the TCB along with their correct functionality therefore forms the
basis for evaluation.

For general-purpose systems, the TCB will include key elements of the
operating system and may include all of the operating system. For embedded
systems, the security policy may deal with objects in a way that is meaningful
at the application level rather than at the operating system level. Thus, the
protection policy may be enforced in the application software rather than in
the underlying operating system. The TCB will necessarily include all those
portions of the operating system and application software essential to the
support of the policy. Note that, as the amount of code in the TCB increases,
it becomes harder to be confident that the TCB enforces the reference monitor
requirements under all circumstances.

6.4 Assurance

The third reference monitor design objective is currently interpreted as
meaning that the TCB “must be of sufficiently simple organization and
complexity to be subjected to analysis and tests, the completeness of which
can be assured.”

Clearly, as the perceived degree of risk increases (e.g., the range of
sensitivity of the system’s protected data, along with the range of clearances
held by the system’s user population) for a particular system’s operational
application and environment, so also must the assurances be increased to
substantiate the degree of trust that will be placed in the system. The
hierarchy of requirements that are presented for the evaluation classes in the
trusted computer system evaluation criteria reflect the need for these
assurances.

As discussed in Section 5.3, the evaluation criteria uniformly require a
statement of the security policy that is enforced by each trusted computer
system. In addition, it is required that a convincing argument be presented
that explains why the TCB satisfies the first two design requirements for a
reference monitor. It is not expected that this argument will be entirely
formal. This argument is required for each candidate system in order to
satisfy the assurance control objective.

The systems to which security enforcement mechanisms have been added, rather
than built-in as fundamental design objectives, are not readily amenable to
extensive analysis since they lack the requisite conceptual simplicity of a
security kernel. This is because their TCB extends to cover much of the
entire system. Hence, their degree of trustworthiness can best be ascertained
only by obtaining test results. Since no test procedure for something as
complex as a computer system can be truly exhaustive, there is always the
possibility that a subsequent penetration attempt could succeed. It is for
this reason that such systems must fall into the lower evaluation classes.

On the other hand, those systems that are designed and engineered to support
the TCB concepts are more amenable to analysis and structured testing. Formal
methods can be used to analyze the correctness of their reference validation
mechanisms in enforcing the system’s security policy. Other methods,
including less-formal arguments, can be used in order to substantiate claims
for the completeness of their access mediation and their degree of
tamper-resistance. More confidence can be placed in the results of this
analysis and in the thoroughness of the structured testing than can be placed
in the results for less methodically structured systems. For these reasons,
it appears reasonable to conclude that these systems could be used in
higher-risk environments. Successful implementations of such systems would be
placed in the higher evaluation classes.

6.5 The Classes

It is highly desirable that there be only a small number of overall evaluation
classes. Three major divisions have been identified in the evaluation
criteria with a fourth division reserved for those systems that have been
evaluated and found to offer unacceptable security protection. Within each
major evaluation division, it was found that “intermediate” classes of trusted
system design and development could meaningfully be defined. These
intermediate classes have been designated in the criteria because they
identify systems that:

* are viewed to offer significantly better protection and assurance
than would systems that satisfy the basic requirements for their
evaluation class; and

* could, there is reason to believe, eventually be evolved such that
they would satisfy the requirements for the next higher evaluation
class.

Except within division A it is not anticipated that additional “intermediate”
evaluation classes satisfying the two characteristics described above will be
identified.

Distinctions in terms of system architecture, security policy enforcement, and
evidence of credibility between evaluation classes have been defined such that
the “jump” between evaluation classes would require a considerable investment
of effort on the part of implementors. Correspondingly, there are expected to
be significant differentials of risk to which systems from the higher
evaluation classes will be exposed.

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security requirements and Section 5
presents the control objectives for Trusted Computer Systems. They are
general requirements, useful and necessary, for the development of all secure
systems. However, when designing systems that will be used to process
classified or other sensitive information, functional requirements for meeting
the Control Objectives become more specific. There is a large body of policy
laid down in the form of Regulations, Directives, Presidential Executive
Orders, and OMB Circulars that form the basis of the procedures for the
handling and processing of Federal information in general and classified
information specifically. This section presents pertinent excerpts from these
policy statements and discusses their relationship to the Control Objectives.

7.1 Established Federal Policies

A significant number of computer security policies and associated requirements
have been promulgated by Federal government elements. The interested reader
is referred to reference [32] which analyzes the need for trusted systems in
the civilian agencies of the Federal government, as well as in state and local
governments and in the private sector. This reference also details a number
of relevant Federal statutes, policies and requirements not treated further
below.

Security guidance for Federal automated information systems is provided by the
Office of Management and Budget. Two specifically applicable Circulars have
been issued. OMB Circular No. A-71, Transmittal Memorandum No. 1, “Security
of Federal Automated Information Systems,”[26] directs each executive agency
to establish and maintain a computer security program. It makes the head of
each executive branch, department and agency responsible “for assuring an
adequate level of security for all agency data whether processed in-house or
commercially. This includes responsibility for the establishment of physical,
administrative and technical safeguards required to adequately protect
personal, proprietary or other sensitive data not subject to national security
regulations, as well as national security data.”[26, para. 4 p. 2]

OMB Circular No. A-123, “Internal Control Systems,”[27] issued to help
eliminate fraud, waste, and abuse in government programs, requires: (a) agency
heads to issue internal control directives and assign responsibility, (b)
managers to review programs for vulnerability, and (c) managers to perform
periodic reviews to evaluate strengths and update controls. Soon after
promulgation of OMB Circular A-123, the relationship of its internal control
requirements to building secure computer systems was recognized.[4] While not
stipulating computer controls specifically, the definition of Internal
Controls in A-123 makes it clear that computer systems are to be included:

“Internal Controls – The plan of organization and all of the methods and
measures adopted within an agency to safeguard its resources, assure the
accuracy and reliability of its information, assure adherence to
applicable laws, regulations and policies, and promote operational
economy and efficiency.”[27, sec. 4.C]

The matter of classified national security information processed by ADP
systems was one of the first areas given serious and extensive concern in
computer security. The computer security policy documents promulgated as a
result contain generally more specific and structured requirements than most,
keyed in turn to an authoritative basis that itself provides a rather clearly
articulated and structured information security policy. This basis, Executive
Order 12356, “National Security Information,” sets forth requirements for the
classification, declassification and safeguarding of “national security
information” per se.[14]

7.2 DoD Policies

Within the Department of Defense, these broad requirements are implemented and
further specified primarily through two vehicles: 1) DoD Regulation 5200.1-R
[7], which applies to all components of the DoD as such, and 2) DoD 5220.22-M,
“Industrial Security Manual for Safeguarding Classified Information” [11],
which applies to contractors included within the Defense Industrial Security
Program. Note that the latter transcends DoD as such, since it applies not
only to any contractors handling classified information for any DoD component,
but also to the contractors of eighteen other Federal organizations for whom
the Secretary of Defense is authorized to act in rendering industrial security
services.*

____________________________________________________________
* i.e., NASA, Commerce Department, GSA, State Department,
Small Business Administration, National Science Foundation,
Treasury Department, Transportation Department, Interior
Department, Agriculture Department, Health and Human
Services Department, Labor Department, Environmental
Protection Agency, Justice Department, U.S. Arms Control and
Disarmament Agency, Federal Emergency Management Agency,
Federal Reserve System, and U.S. General Accounting Office.
____________________________________________________________

For ADP systems, these information security requirements are further amplified
and specified in: 1) DoD Directive 5200.28 [8] and DoD Manual 5200.28-M [9],
for DoD components; and 2) Section XIII of DoD 5220.22-M [11] for contractors.
DoD Directive 5200.28, “Security Requirements for Automatic Data Processing
(ADP) Systems,” stipulates: “Classified material contained in an ADP system
shall be safeguarded by the continuous employment of protective features in
the system’s hardware and software design and configuration . . . .”[8,
sec. IV] Furthermore, it is required that ADP systems that “process, store,
or use classified data and produce classified information will, with
reasonable dependability, prevent:

a. Deliberate or inadvertent access to classified material by
unauthorized persons, and

b. Unauthorized manipulation of the computer and its associated
peripheral devices.”[8, sec. I B.3]

Requirements equivalent to these appear within DoD 5200.28-M [9] and in DoD
5220.22-M [11].

From requirements imposed by these regulations, directives and circulars, the
three components of the Security Policy Control Objective, i.e., Mandatory and
Discretionary Security and Marking, as well as the Accountability and
Assurance Control Objectives, can be functionally defined for DoD
applications. The following discussion provides further specificity in Policy
for these Control Objectives.

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking is: “Systems that are designed
to enforce a mandatory security policy must store and preserve the
integrity of classification or other sensitivity labels for all
information. Labels exported from the system must be accurate
representations of the corresponding internal sensitivity labels
being exported.”

DoD 5220.22-M, “Industrial Security Manual for Safeguarding
Classified Information,” explains in paragraph 11 the reasons for
marking information:

“Designation by physical marking, notation or other means
serves to inform and to warn the holder about the
classification designation of the information which requires
protection in the interest of national security. The degree
of protection against unauthorized disclosure which will be
required for a particular level of classification is directly
commensurate with the marking designation which is assigned
to the material.”[11]

Marking requirements are given in a number of policy statements.

Executive Order 12356 (Sections 1.5.a and 1.5.a.1) requires that
classification markings “shall be shown on the face of all
classified documents, or clearly associated with other forms of
classified information in a manner appropriate to the medium
involved.”[14]

DoD Regulation 5200.1-R (Section 1-500) requires that: “. . .
information or material that requires protection against
unauthorized disclosure in the interest of national security shall
be classified in one of three designations, namely: ‘Top Secret,’
‘Secret’ or ‘Confidential.’”[7] (By extension, for use in computer
processing, the unofficial designation “Unclassified” is used to
indicate information that does not fall under one of the other
three designations of classified information.)

DoD Regulation 5200.1-R (Section 4-304b) requires that: “ADP
systems and word processing systems employing such media shall
provide for internal classification marking to assure that
classified information contained therein that is reproduced or
generated, will bear applicable classification and associated
markings.” (This regulation provides for the exemption of certain
existing systems where “internal classification and applicable
associated markings cannot be implemented without extensive system
modifications.”[7] However, it is clear that future DoD ADP
systems must be able to provide applicable and accurate labels for
classified and other sensitive information.)

DoD Manual 5200.28-M (Section IV, 4-305d) requires the following:
“Security Labels – All classified material accessible by or within
the ADP system shall be identified as to its security
classification and access or dissemination limitations, and all
output of the ADP system shall be appropriately marked.”[9]

7.3.2 Mandatory Security

The control objective for mandatory security is: “Security
policies defined for systems that are used to process classified
or other specifically categorized sensitive information must
include provisions for the enforcement of mandatory access control
rules. That is, they must include a set of rules for controlling
access based directly on a comparison of the individual’s
clearance or authorization for the information and the
classification or sensitivity designation of the information being
sought, and indirectly on considerations of physical and other
environmental factors of control. The mandatory access control
rules must accurately reflect the laws, regulations, and general
policies from which they are derived.”

There are a number of policy statements that are related to
mandatory security.

Executive Order 12356 (Section 4.1.a) states that “a person is
eligible for access to classified information provided that a
determination of trustworthiness has been made by agency heads or
designated officials and provided that such access is essential
to the accomplishment of lawful and authorized Government
purposes.”[14]

DoD Regulation 5200.1-R (Chapter I, Section 3) defines a Special
Access Program as “any program imposing ‘need-to-know’ or access
controls beyond those normally provided for access to
Confidential, Secret, or Top Secret information. Such a program
includes, but is not limited to, special clearance, adjudication,
or investigative requirements, special designation of officials
authorized to determine ‘need-to-know’, or special lists of persons
determined to have a ‘need-to-know.’”[7, para. 1-328] This
passage distinguishes between a ‘discretionary’ determination of
need-to-know and formal need-to-know which is implemented through
Special Access Programs. DoD Regulation 5200.1-R, paragraph 7-100
describes general requirements for trustworthiness (clearance) and
need-to-know, and states that the individual with possession,
knowledge or control of classified information has final
responsibility for determining if conditions for access have been
met. This regulation further stipulates that “no one has a right
to have access to classified information solely by virtue of rank
or position.”[7, para. 7-100]

DoD Manual 5200.28-M (Section II 2-100) states that, “Personnel
who develop, test (debug), maintain, or use programs which are
classified or which will be used to access or develop classified
material shall have a personnel security clearance and an access
authorization (need-to-know), as appropriate for the highest
classified and most restrictive category of classified material
which they will access under system constraints.”[9]

DoD Manual 5220.22-M (Paragraph 3.a) defines access as “the
ability and opportunity to obtain knowledge of classified
information. An individual, in fact, may have access to
classified information by being in a place where such information
is kept, if the security measures which are in force do not
prevent him from gaining knowledge of the classified
information.”[11]

The above mentioned Executive Order, Manual, Directives and
Regulations clearly imply that a trusted computer system must
assure that the classification labels associated with sensitive
data cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate clearance to access
classified information. Also implied is the requirement that a
trusted computer system must control the flow of information so
that data from a higher classification cannot be placed in a
storage object of lower classification unless its “downgrading”
has been authorized.

7.3.3 Discretionary Security

The term discretionary security refers to a computer system’s
ability to control information on an individual basis. It stems
from the fact that even though an individual has all the formal
clearances for access to specific classified information, each
individual’s access to information must be based on a demonstrated
need-to-know. Because of this, it must be made clear that this
requirement is not discretionary in a “take it or leave it” sense.
The directives and regulations are explicit in stating that the
need-to-know test must be satisfied before access can be granted
to the classified information. The control objective for
discretionary security is: “Security policies defined for systems
that are used to process classified or other sensitive information
must include provisions for the enforcement of discretionary
access control rules. That is, they must include a consistent set
of rules for controlling and limiting access based on identified
individuals who have been determined to have a need-to-know for the
information.”

DoD Regulation 5200.1-R (Paragraph 7-100), in addition to the excerpts
already provided that touch on need-to-know, stresses the
need-to-know principle when it states “no person may have access to
classified information unless . . . access is necessary for the
performance of official duties.”[7]

Also, DoD Manual 5220.22-M (Section III 20.a) states that “an
individual shall be permitted to have access to classified
information only . . . when the contractor determines that access
is necessary in the performance of tasks or services essential to
the fulfillment of a contract or program, i.e., the individual has
a need-to-know.”[11]

7.4 Criteria Control Objective for Accountability

The control objective for accountability is: “Systems that are used to
process or handle classified or other sensitive information must assure
individual accountability whenever either a mandatory or discretionary
security policy is invoked. Furthermore, to assure accountability the
capability must exist for an authorized and competent agent to access and
evaluate accountability information by a secure means, within a reasonable
amount of time, and without undue difficulty.”

This control objective is supported by the following citations:

DoD Directive 5200.28 (VI.A.1) states: “Each user’s identity shall be
positively established, and his access to the system, and his activity in
the system (including material accessed and actions taken) controlled and
open to scrutiny.”[8]

DoD Manual 5200.28-M (Section V 5-100) states: “An audit log or file
(manual, machine, or a combination of both) shall be maintained as a
history of the use of the ADP System to permit a regular security review
of system activity. (e.g., The log should record security related
transactions, including each access to a classified file and the nature
of the access, e.g., logins, production of accountable classified
outputs, and creation of new classified files. Each classified file
successfully accessed [regardless of the number of individual references]
during each ‘job’ or ‘interactive session’ should also be recorded in the
audit log. Much of the material in this log may also be required to
assure that the system preserves information entrusted to it.)”[9]

DoD Manual 5200.28-M (Section IV 4-305f) states: “Where needed to assure
control of access and individual accountability, each user or specific
group of users shall be identified to the ADP System by appropriate
administrative or hardware/software measures. Such identification
measures must be in sufficient detail to enable the ADP System to provide
the user only that material which he is authorized.”[9]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(4) Maintenance of documentation on operating systems (O/S)
and all modifications thereto, and its retention for a
sufficient period of time to enable tracing of security-
related defects to their point of origin or inclusion in the
system.

. . . . . . . . . . . . . . . . .

(6) Establishment of procedures to discover, recover,
handle, and dispose of classified material improperly
disclosed through system malfunction or personnel action.

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.”[9]

DoD Manual 5220.22-M (Section XIII 111) states: “Audit Trails

a. The general security requirement for any ADP system audit
trail is that it provide a documented history of the use of
the system. An approved audit trail will permit review of
classified system activity and will provide a detailed
activity record to facilitate reconstruction of events to
determine the magnitude of compromise (if any) should a
security malfunction occur. To fulfill this basic
requirement, audit trail systems, manual, automated or a
combination of both must document significant events
occurring in the following areas of concern: (i) preparation
of input data and dissemination of output data (i.e.,
reportable interactivity between users and system support
personnel), (ii) activity involved within an ADP environment
(e.g., ADP support personnel modification of security and
related controls), and (iii) internal machine activity.

b. The audit trail for an ADP system approved to process
classified information must be based on the above three
areas and may be stylized to the particular system. All
systems approved for classified processing should contain
most if not all of the audit trail records listed below. The
contractor’s SPP documentation must identify and describe
those applicable:

1. Personnel access;

2. Unauthorized and surreptitious entry into the
central computer facility or remote terminal areas;

3. Start/stop time of classified processing indicating
pertinent systems security initiation and termination events
(e.g., upgrading/downgrading actions pursuant to paragraph
107);

4. All functions initiated by ADP system console
operators;

5. Disconnects of remote terminals and peripheral
devices (paragraph 107c);

6. Log-on and log-off user activity;

7. Unauthorized attempts to access files or programs,
as well as all open, close, create, and file destroy
actions;

8. Program aborts and anomalies including
identification information (i.e., user/program name, time
and location of incident, etc.);

9. System hardware additions, deletions and maintenance
actions;

10. Generations and modifications affecting the
security features of the system software.

c. The ADP system security supervisor or designee shall
review the audit trail logs at least weekly to assure that
all pertinent activity is properly recorded and that
appropriate action has been taken to correct any anomaly.
The majority of ADP systems in use today can develop audit
trail systems in accord with the above; however, special
systems such as weapons, communications, communications
security, and tactical data exchange and display systems,
may not be able to comply with all aspects of the above and
may require individualized consideration by the cognizant
security office.

d. Audit trail records shall be retained for a period of one
inspection cycle.”[11]

7.5 Criteria Control Objective for Assurance

The control objective for assurance is: “Systems that are used to process
or handle classified or other sensitive information must be designed to
guarantee correct and accurate interpretation of the security policy and
must not distort the intent of that policy. Assurance must be provided
that correct implementation and operation of the policy exists throughout
the system’s life-cycle.”

A basis for this objective can be found in the following sections of DoD
Directive 5200.28:

DoD Directive 5200.28 (IV.B.1) stipulates: “Generally, security of an ADP
system is most effective and economical if the system is designed
originally to provide it. Each Department of Defense Component
undertaking design of an ADP system which is expected to process, store,
use, or produce classified material shall: From the beginning of the
design process, consider the security policies, concepts, and measures
prescribed in this Directive.”[8]

DoD Directive 5200.28 (IV.C.5.a) states: “Provision may be made to permit
adjustment of ADP system area controls to the level of protection
required for the classification category and type(s) of material actually
being handled by the system, provided change procedures are developed and
implemented which will prevent both the unauthorized access to classified
material handled by the system and the unauthorized manipulation of the
system and its components. Particular attention shall be given to the
continuous protection of automated system security measures, techniques
and procedures when the personnel security clearance level of users
having access to the system changes.”[8]

DoD Directive 5200.28 (VI.A.2) states: “Environmental Control. The ADP
System shall be externally protected to minimize the likelihood of
unauthorized access to system entry points, access to classified
information in the system, or damage to the system.”[8]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(5) Supervision, monitoring, and testing, as appropriate, of
changes in an approved ADP System which could affect the
security features of the system, so that a secure system is
maintained.

. . . . . . . . . . . . . . . . .

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.

(8) Conduct of competent system ST&E, timely review of
system ST&E reports, and correction of deficiencies needed
to support conditional or final approval or disapproval of
an ADP System for the processing of classified information.

(9) Establishment, where appropriate, of a central ST&E
coordination point for the maintenance of records of
selected techniques, procedures, standards, and tests used
in the testing and evaluation of security features of ADP
Systems which may be suitable for validation and use by
other Department of Defense Components.”[9]

DoD Manual 5220.22-M (Section XIII 103a) requires: “the initial approval,
in writing, of the cognizant security office prior to processing any
classified information in an ADP system. This section requires
reapproval by the cognizant security office for major system
modifications made subsequent to initial approval. Reapprovals will be
required because of (i) major changes in personnel access requirements,
(ii) relocation or structural modification of the central computer
facility, (iii) additions, deletions or changes to main frame, storage or
input/output devices, (iv) system software changes impacting security
protection features, (v) any change in clearance, declassification, audit
trail or hardware/software maintenance procedures, and (vi) other system
changes as determined by the cognizant security office.”[11]

A major component of assurance, life-cycle assurance, is concerned with
testing ADP systems both in the development phase as well as during
operation. DoD Directive 5215.1 (Section F.2.C.(2)) requires
“evaluations of selected industry and government-developed trusted
computer systems against these criteria.”[10]

8.0 A GUIDELINE ON COVERT CHANNELS

A covert channel is any communication channel that can be exploited by a
process to transfer information in a manner that violates the system’s
security policy. There are two types of covert channels: storage channels and
timing channels. Covert storage channels include all vehicles that would
allow the direct or indirect writing of a storage location by one process and
the direct or indirect reading of it by another. Covert timing channels
include all vehicles that would allow one process to signal information to
another process by modulating its own use of system resources in such a way
that the change in response time observed by the second process would provide
information.

From a security perspective, covert channels with low bandwidths represent a
lower threat than those with high bandwidths. However, for many types of
covert channels, techniques used to reduce the bandwidth below a certain rate
(which depends on the specific channel mechanism and the system architecture)
also have the effect of degrading the performance provided to legitimate
system users. Hence, a trade-off between system performance and covert
channel bandwidth must be made. Because of the threat of compromise that
would be present in any multilevel computer system containing classified or
sensitive information, such systems should not contain covert channels with
high bandwidths. This guideline is intended to provide system developers with
an idea of just how high a “high” covert channel bandwidth is.

A covert channel bandwidth that exceeds a rate of one hundred (100) bits per
second is considered “high” because 100 bits per second is the approximate
rate at which many computer terminals are run. It does not seem appropriate
to call a computer system “secure” if information can be compromised at a rate
equal to the normal output rate of some commonly used device.

In any multilevel computer system there are a number of relatively
low-bandwidth covert channels whose existence is deeply ingrained in the
system design. Faced with the large potential cost of reducing the bandwidths
of such covert channels, it is felt that those with maximum bandwidths of less
than one (1) bit per second are acceptable in most application environments.
Though maintaining acceptable performance in some systems may make it
impractical to eliminate all covert channels with bandwidths of 1 or more bits
per second, it is possible to audit their use without adversely affecting
system performance. This audit capability provides the system administration
with a means of detecting — and procedurally correcting — significant
compromise. Therefore, a Trusted Computing Base should provide, wherever
possible, the capability to audit the use of covert channel mechanisms with
bandwidths that may exceed a rate of one (1) bit in ten (10) seconds.
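
Purely as an illustration (and not part of the criteria), the thresholds
discussed above can be summarized in a short Python sketch; the names and the
example estimate below are hypothetical:

    HIGH_BPS = 100.0        # "high" bandwidth: comparable to a terminal's output rate
    ACCEPTABLE_BPS = 1.0    # bandwidths below this are acceptable in most environments
    AUDIT_BPS = 0.1         # one (1) bit in ten (10) seconds: use should be auditable

    def classify_covert_channel(bits_per_second):
        """Return a rough characterization of an estimated channel bandwidth."""
        if bits_per_second >= HIGH_BPS:
            return "high: should not be present in a multilevel system"
        if bits_per_second >= ACCEPTABLE_BPS:
            return "significant: reduce the bandwidth or audit its use"
        if bits_per_second >= AUDIT_BPS:
            return "low: acceptable, but use should be auditable"
        return "negligible"

    # Example: a storage channel signaling one bit per resource probe, with one
    # probe possible every two seconds (an engineering estimate).
    estimate = 1.0 / 2.0
    print(estimate, classify_covert_channel(estimate))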

The covert channel problem has been addressed by a number of authors. The
interested reader is referred to references [5], [6], [19], [21], [22], [23],
and [29].

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement includes a capability to support an
unspecified number of hierarchical classifications and an unspecified number
of non-hierarchical categories at each hierarchical level. To encourage
consistency and portability in the design and development of National
Security Establishment trusted computer systems, it is desirable for all such
systems to be able to support a minimum number of levels and categories. The
following suggestions are provided for this purpose:

* The number of hierarchical classifications should be greater than or
equal to eight (8).

* The number of non-hierarchical categories should be greater than or
equal to twenty-nine (29).
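
A minimal sketch, assuming a simple bit-mask encoding, shows a label
representation sized to these suggested minimums; Python is used only for
exposition and the names are hypothetical:

    NUM_CLASSIFICATIONS = 8    # hierarchical levels 0..7 (three bits)
    NUM_CATEGORIES = 29        # non-hierarchical categories 0..28 (29-bit mask)

    def make_label(classification, categories):
        """Pack a sensitivity label as (level, category bit mask)."""
        if not 0 <= classification < NUM_CLASSIFICATIONS:
            raise ValueError("classification out of range")
        mask = 0
        for c in categories:
            if not 0 <= c < NUM_CATEGORIES:
                raise ValueError("category out of range")
            mask |= 1 << c
        return (classification, mask)

    # Example: hierarchical level 3 with non-hierarchical categories 0 and 17.
    print(make_label(3, [0, 17]))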

10.0 A GUIDELINE ON SECURITY TESTING

These guidelines are provided to give an indication of the extent and
sophistication of testing undertaken by the DoD Computer Security Center
during the Formal Product Evaluation process. Organizations wishing to use
“Department of Defense Trusted Computer System Evaluation Criteria” for
performing their own evaluations may find this section useful for planning
purposes.

As in Part I, highlighting is used to indicate changes in the guidelines from
the next lower division.

10.1 Testing for Division C

10.1.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor's degrees in Computer Science or the
equivalent. Team members shall be able to follow test plans
prepared by the system developer and suggest additions, shall
be familiar with the “flaw hypothesis” or equivalent security
testing methodology, and shall have assembly level programming
experience. Before testing begins, the team members shall have
functional knowledge of, and shall have completed the system
developer’s internals course for, the system being evaluated.

10.1.2 Testing

The team shall have “hands-on” involvement in an independent run
of the tests used by the system developer. The team shall
independently design and implement at least five system-specific
tests in an attempt to circumvent the security mechanisms of the
system. The elapsed time devoted to testing shall be at least
one month and need not exceed three months. There shall be no
fewer than twenty hands-on hours spent carrying out system
developer-defined tests and test team-defined tests.

10.2 Testing for Division B

10.2.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor's degrees in Computer Science or the
equivalent and at least one individual with a master’s degree in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall have previously
completed a security test on another system.

10.2.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least fifteen system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least two months and need not exceed four months.
There shall be no fewer than thirty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

10.3 Testing for Division A

10.3.1 Personnel

The security testing team shall consist of at least one
individual with a bachelor’s degree in Computer Science or the
equivalent and at least two individuals with master’s degrees in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall be familiar enough
with the system hardware to understand the maintenance diagnostic
programs and supporting hardware documentation. At least two
team members shall have previously completed a security test on
another system. At least one team member shall have
demonstrated system level programming competence on the system
under test to a level of complexity equivalent to adding a device
driver to the system.

10.3.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least twenty-five system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least three months and need not exceed six months.
There shall be no fewer than fifty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

APPENDIX A

Commercial Product Evaluation Process

“Department of Defense Trusted Computer System Evaluation Criteria” forms the
basis upon which the Computer Security Center will carry out the commercial
computer security evaluation process. This process is focused on commercially
produced and supported general-purpose operating system products that meet the
needs of government departments and agencies. The formal evaluation is aimed
at “off-the-shelf” commercially supported products and is completely divorced
from any consideration of overall system performance, potential applications,
or particular processing environments. The evaluation provides a key input to
a computer system security approval/accreditation. However, it does not
constitute a complete computer system security evaluation. A complete study
(e.g., as in reference [18]) must consider additional factors dealing with the
system in its unique environment, such as its proposed security mode of
operation, specific users, applications, data sensitivity, physical and
personnel security, administrative and procedural security, TEMPEST, and
communications security.

The product evaluation process carried out by the Computer Security Center has
three distinct elements:

* Preliminary Product Evaluation – An informal dialogue between a vendor
and the Center in which technical information is exchanged to create a
common understanding of the vendor’s product, the criteria, and the
rating that may be expected to result from a formal product evaluation.

* Formal Product Evaluation – A formal evaluation, by the Center, of a
product that is available to the DoD, and that results in that product
and its assigned rating being placed on the Evaluated Products List.

* Evaluated Products List – A list of products that have been subjected
to formal product evaluation and their assigned ratings.

PRELIMINARY PRODUCT EVALUATION

Since it is generally very difficult to add effective security measures late
in a product’s life cycle, the Center is interested in working with system
vendors in the early stages of product design. A preliminary product
evaluation allows the Center to consult with computer vendors on computer
security issues found in products that have not yet been formally announced.

A preliminary evaluation is typically initiated by computer system vendors who
are planning new computer products that feature security or major
security-related upgrades to existing products. After an initial meeting
between the vendor and the Center, appropriate non-disclosure agreements are
executed that require the Center to maintain the confidentiality of any
proprietary information disclosed to it. Technical exchange meetings follow
in which the vendor provides details about the proposed product (particularly
its internal designs and goals) and the Center provides expert feedback to the
vendor on potential computer security strengths and weaknesses of the vendor’s
design choices, as well as relevant interpretation of the criteria. The
preliminary evaluation is typically terminated when the product is completed
and ready for field release by the vendor. Upon termination, the Center
prepares a wrap-up report for the vendor and for internal distribution within
the Center. Those reports containing proprietary information are not
available to the public.

During preliminary evaluation, the vendor is under no obligation to actually
complete or market the potential product. The Center is, likewise, not
committed to conduct a formal product evaluation. A preliminary evaluation
may be terminated by either the Center or the vendor when one notifies the
other, in writing, that it is no longer advantageous to continue the
evaluation.

FORMAL PRODUCT EVALUATION

The formal product evaluation provides a key input to certification of a
computer system for use in National Security Establishment applications and is
the sole basis for a product being placed on the Evaluated Products List.

A formal product evaluation begins with a request by a vendor for the Center
to evaluate a product for which the product itself and accompanying
documentation needed to meet the requirements defined by this publication are
complete. Non-disclosure agreements are executed and a formal product
evaluation team is formed by the Center. An initial meeting is then held with
the vendor to work out the schedule for the formal evaluation. Since testing
of the implemented product forms an important part of the evaluation process,
access by the evaluation team to a working version of the system is negotiated
with the vendor. Additional support required from the vendor includes
complete design documentation, source code, and access to vendor personnel who
can answer detailed questions about specific portions of the product. The
evaluation team tests the product against each requirement, making any
necessary interpretations of the criteria with respect to the product being
evaluated.

The evaluation team writes a two-part final report on their findings about the
system. The first part is publicly available (containing no proprietary
information) and contains the overall class rating assigned to the system and
the details of the evaluation team’s findings when comparing the product
against the evaluation criteria. The second part of the evaluation report
contains vulnerability analyses and other detailed information supporting the
rating decision. Since this part may contain proprietary or other sensitive
information it will be distributed only within the U.S. Government on a
strict need-to-know and non-disclosure basis, and to the vendor. No portion
of the evaluation results will be withheld from the vendor.

APPENDIX B

Summary of Evaluation Criteria Divisions

The divisions of systems recognized under the trusted computer system
evaluation criteria are as follows. Each division represents a major
improvement in the overall confidence one can place in the system to protect
classified and other sensitive information.

Division (D): Minimal Protection

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

Division (C): Discretionary Protection

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

Division (B): Mandatory Protection

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

Division (A): Verified Protection

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

APPENDIX C

Summary of Evaluation Criteria Classes

The classes of systems recognized under the trusted computer system evaluation
criteria are as follows. They are presented in the order of increasing
desirability from a computer security point of view.

Class (D): Minimal Protection

This class is reserved for those systems that have been evaluated but that
fail to meet the requirements for a higher evaluation class.

Class (C1): Discretionary Security Protection

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.

Class (C2): Controlled Access Protection

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.

Class (B1): Labeled Security Protection

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed.

Class (B2): Structured Protection

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration.

Class (B3): Security Domains

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration.

Class (A1): Verified Design

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design. In
keeping with the extensive design and development analysis of the TCB required
of systems in class (A1), more stringent configuration management is required
and procedures are established for securely distributing the system to sites.
A system security administrator is supported.

APPENDIX D

Requirement Directory

This appendix lists requirements defined in “Department of Defense Trusted
Computer System Evaluation Criteria” alphabetically rather than by class. It
is provided to assist in following the evolution of a requirement through the
classes. For each requirement, three types of criteria may be present. Each
will be preceded by the word: NEW, CHANGE, or ADD to indicate the following:

NEW: Any criteria appearing in a lower class are superseded
by the criteria that follow.

CHANGE: The criteria that follow have appeared in a lower class
but are changed for this class. Highlighting is used
to indicate the specific changes to previously stated
criteria.

ADD: The criteria that follow have not been required for any
lower class, and are added in this class to the
previously stated criteria for this requirement.

Abbreviations are used as follows:

NR: (No Requirement) This requirement is not included in
this class.

NAR: (No Additional Requirements) This requirement does not
change from the previous class.

The reader is referred to Part I of this document when placing new criteria
for a requirement into the complete context for that class.

Figure 1 provides a pictorial summary of the evolution of requirements through
the classes.
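
One way a reader might apply these markers mechanically is sketched below; the
fragment is hypothetical, uses Python only for exposition, and abbreviates the
actual criteria text:

    CLASS_ORDER = ["C1", "C2", "B1", "B2", "B3", "A1"]

    def entries_for(directory, requirement, target_class):
        """Collect the (class, marker, text) entries that contribute to the
        given requirement at the target class.  NR and NAR entries carry no
        text and are skipped; a NEW entry supersedes everything below it."""
        collected = []
        for cls in CLASS_ORDER:
            for marker, text in directory[requirement].get(cls, []):
                if marker == "NEW":
                    collected = [(cls, marker, text)]   # supersedes lower classes
                elif marker in ("CHANGE", "ADD"):
                    collected.append((cls, marker, text))
            if cls == target_class:
                break
        return collected

    # Hypothetical, abbreviated fragment of the directory for Audit.
    directory = {
        "Audit": {
            "C1": [("NR", "")],
            "C2": [("NEW", "The TCB shall be able to create ... an audit trail ...")],
            "B1": [("CHANGE", "... include the object's security level ..."),
                   ("ADD", "... audit any override of human-readable output markings.")],
        }
    }

    for entry in entries_for(directory, "Audit", "B1"):
        print(entry)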

Audit

C1: NR.

C2: NEW: The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit trail of
accesses to the objects it protects. The audit data shall be
protected by the TCB so that read access to it is limited to those
who are authorized for audit data. The TCB shall be able to record
the following types of events: use of identification and
authentication mechanisms, introduction of objects into a user’s
address space (e.g., file open, program initiation), deletion of
objects, and actions taken by computer operators and system
administrators and/or system security officers. For each recorded
event, the audit record shall identify: date and time of the event,
user, type of event, and success or failure of the event. For
identification/authentication events the origin of request (e.g.,
terminal ID) shall be included in the audit record. For events that
introduce an object into a user’s address space and for object
deletion events the audit record shall include the name of the object.
The ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity.

B1: CHANGE: For events that introduce an object into a user’s address
space and for object deletion events the audit record shall include
the name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the actions
of any one or more users based on individual identity and/or object
security level.

ADD: The TCB shall also be able to audit any override of
human-readable output markings.

B2: ADD: The TCB shall be able to audit the identified events that may be
used in the exploitation of covert storage channels.

B3: ADD: The TCB shall contain a mechanism that is able to monitor the
occurrence or accumulation of security auditable events that may
indicate an imminent violation of security policy. This mechanism
shall be able to immediately notify the security administrator when
thresholds are exceeded.

A1: NAR.
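
For illustration, a record structure carrying the fields listed at class (C2),
plus the object security level added at (B1), might look as follows; the field
names and the selection function are hypothetical, not prescribed:

    import datetime
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AuditRecord:
        timestamp: datetime.datetime        # date and time of the event
        user: str                           # individual identity
        event_type: str                     # e.g. "login", "file_open", "object_delete"
        success: bool                       # success or failure of the event
        origin: Optional[str] = None        # e.g. terminal ID, for I&A events
        object_name: Optional[str] = None   # for object introduction/deletion events
        object_level: Optional[str] = None  # object security level (class B1 and above)

    def select_by_user(records, users):
        """Selective audit of the actions of one or more named users."""
        return [r for r in records if r.user in users]

    # Example record for a failed login attempt from terminal TTY07.
    rec = AuditRecord(datetime.datetime.now(), "jones", "login", False, origin="TTY07")
    print(select_by_user([rec], {"jones"}))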

Configuration Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: During development and maintenance of the TCB, a configuration
management system shall be in place that maintains control of changes
to the descriptive top-level specification, other design data,
implementation documentation, source code, the running version of the
object code, and test fixtures and documentation. The configuration
management system shall assure a consistent mapping among all
documentation and code associated with the current version of the TCB.
Tools shall be provided for generation of a new version of the TCB
from source code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in order to
ascertain that only the intended changes have been made in the code
that will actually be used as the new version of the TCB.

B3: NAR.

A1: CHANGE: During the entire life-cycle, i.e., during the design,
development, and maintenance of the TCB, a configuration management
system shall be in place for all security-relevant hardware, firmware,
and software that maintains control of changes to the formal model,
the descriptive and formal top-level specifications, other design
data, implementation documentation, source code, the running version
of the object code, and test fixtures and documentation. Also
available shall be tools, maintained under strict configuration
control, for comparing a newly generated version with the previous
TCB version in order to ascertain that only the intended changes have
been made in the code that will actually be used as the new version
of the TCB.

ADD: A combination of technical, physical, and procedural safeguards
shall be used to protect from unauthorized modification or
destruction the master copy or copies of all material used to
generate the TCB.
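
The comparison tool contemplated here might, for example, hash each file of
the newly generated and the previous version and report the differences; the
sketch below is illustrative only, and the paths and hash algorithm are
hypothetical:

    import hashlib
    import os

    def digest_tree(root):
        """Map each file's path (relative to root) to a digest of its contents."""
        digests = {}
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digests[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
        return digests

    def compare_versions(old_root, new_root):
        """Report files added, removed, or modified between two generated versions."""
        old, new = digest_tree(old_root), digest_tree(new_root)
        added = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        modified = sorted(p for p in set(old) & set(new) if old[p] != new[p])
        return added, removed, modified

    # Example (hypothetical paths):
    # print(compare_versions("/build/tcb-previous", "/build/tcb-new"))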

Covert Channel Analysis

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The system developer shall conduct a thorough search for covert
storage channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth of
each identified channel. (See the Covert Channels Guideline section.)

B3: CHANGE: The system developer shall conduct a thorough search for
covert channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth
of each identified channel.

A1: ADD: Formal methods shall be used in the analysis.

Design Documentation

C1: NEW: Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation of how
this philosophy is translated into the TCB. If the TCB is composed
of distinct modules, the interfaces between these modules shall be
described.

C2: NAR.

B1: ADD: An informal or formal description of the security policy model
enforced by the TCB shall be available and an explanation provided to
show that it is sufficient to enforce the security policy. The
specific TCB protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.

B2: CHANGE: The interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the TCB
shall be available and proven that it is sufficient to enforce the
security policy.

ADD: The descriptive top-level specification (DTLS) shall be shown to
be an accurate description of the TCB interface. Documentation shall
describe how the TCB implements the reference monitor concept and
give an explanation why it is tamperproof, cannot be bypassed, and is
correctly implemented. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the exploitation
of known covert storage channels shall be identified. The bandwidths
of known covert storage channels, the use of which is not detectable
by the auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

B3: ADD: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the DTLS.
The elements of the DTLS shall be shown, using informal techniques,
to correspond to the elements of the TCB.

A1: CHANGE: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the formal
top-level specification (FTLS). The elements of the FTLS shall be
shown, using informal techniques, to correspond to the elements of
the TCB.

ADD: Hardware, firmware, and software mechanisms not dealt with in
the FTLS but strictly internal to the TCB (e.g., mapping registers,
direct memory access I/O) shall be clearly described.

Design Specification and Verification

C1: NR.

C2: NR.

B1: NEW: An informal or formal model of the security policy supported by
the TCB shall be maintained that is shown to be consistent with its
axioms.

B2: CHANGE: A formal model of the security policy supported by the TCB
shall be maintained that is proven consistent with its axioms.

ADD: A descriptive top-level specification (DTLS) of the TCB shall be
maintained that completely and accurately describes the TCB in terms
of exceptions, error messages, and effects. It shall be shown to be
an accurate description of the TCB interface.

B3: ADD: A convincing argument shall be given that the DTLS is consistent
with the model.

A1: CHANGE: The FTLS shall be shown to be an accurate description of the
TCB interface. A convincing argument shall be given that the DTLS is
consistent with the model and a combination of formal and informal
techniques shall be used to show that the FTLS is consistent with the
model.

ADD: A formal top-level specification (FTLS) of the TCB shall be
maintained that accurately describes the TCB in terms of exceptions,
error messages, and effects. The DTLS and FTLS shall include those
components of the TCB that are implemented as hardware and/or
firmware if their properties are visible at the TCB interface. This
verification evidence shall be consistent with that provided within
the state-of-the-art of the particular Computer Security Center-
endorsed formal specification and verification system used. Manual
or other mapping of the FTLS to the TCB source code shall be
performed to provide evidence of correct implementation.

Device Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support the assignment of minimum and maximum
security levels to all attached physical devices. These security
levels shall be used by the TCB to enforce constraints imposed by
the physical environments in which the devices are located.

B3: NAR.

A1: NAR.
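
For illustration, a check that data sent to a device lies within the device's
assigned minimum and maximum security levels might be written as follows; the
level encoding and names are hypothetical:

    def dominates(level_a, level_b):
        """level_a dominates level_b: a higher-or-equal classification and a
        superset of the non-hierarchical categories."""
        class_a, cats_a = level_a
        class_b, cats_b = level_b
        return class_a >= class_b and cats_a >= cats_b

    def device_accepts(device_min, device_max, data_level):
        """True if data_level lies within the device's [minimum, maximum] range."""
        return dominates(data_level, device_min) and dominates(device_max, data_level)

    # Example: a printer in a Secret area, accepting Unclassified through Secret.
    UNCLASS, SECRET, TOPSECRET = 0, 2, 3
    printer_min = (UNCLASS, frozenset())
    printer_max = (SECRET, frozenset({"ALPHA"}))
    print(device_accepts(printer_min, printer_max, (SECRET, frozenset())))     # True
    print(device_accepts(printer_min, printer_max, (TOPSECRET, frozenset())))  # False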

Discretionary Access Control

C1: NEW: The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing of
those objects by named individuals or defined groups or both.

C2: CHANGE: The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups of
individuals, or by both.

ADD: The discretionary access control mechanism shall, either by explicit
user action or by default, provide that objects are protected from
unauthorized access. These access controls shall be capable of
including or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing access
permission shall only be assigned by authorized users.

B1: NAR.

B2: NAR.

B3: CHANGE: The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects. These
access controls shall be capable of specifying, for each named
object, a list of named individuals and a list of groups of named
individuals with their respective modes of access to that object.

ADD: Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of named
individuals for which no access to the object is to be given.

A1: NAR.
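
For illustration, a discretionary access check combining per-object lists of
named individuals and groups with the class (B3) exclusion lists might be
sketched as follows; the data layout is hypothetical:

    def dac_check(acl, user, user_groups, mode):
        """True if 'user' (a member of 'user_groups') may access the object in
        the requested mode ('read', 'write', ...)."""
        # Explicit denial takes precedence (the B3 exclusion lists).
        if user in acl.get("denied_users", set()):
            return False
        if user_groups & acl.get("denied_groups", set()):
            return False
        # Otherwise the user, or one of the user's groups, must hold the mode.
        if mode in acl.get("users", {}).get(user, set()):
            return True
        return any(mode in acl.get("groups", {}).get(g, set()) for g in user_groups)

    # Example access control list for a single object.
    acl = {
        "users":  {"smith": {"read", "write"}},
        "groups": {"project-x": {"read"}},
        "denied_users": {"jones"},
        "denied_groups": set(),
    }
    print(dac_check(acl, "smith", {"project-x"}, "write"))   # True
    print(dac_check(acl, "jones", {"project-x"}, "read"))    # False: explicitly denied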

Exportation of Labeled Information

C1: NR.

C2: NR.

B1: NEW: The TCB shall designate each communication channel and I/O
device as either single-level or multilevel. Any change in this
designation shall be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit any change in the
current security level associated with a single-level communication
channel or I/O device.

B2: NAR.

B3: NAR.

A1: NAR.

Exportation to Multilevel Devices

C1: NR.

C2: NR.

B1: NEW: When the TCB exports an object to a multilevel I/O device, the
sensitivity label associated with that object shall also be exported
and shall reside on the same physical medium as the exported
information and shall be in the same form (i.e., machine-readable or
human-readable form). When the TCB exports or imports an object over
a multilevel communication channel, the protocol used on that channel
shall provide for the unambiguous pairing between the sensitivity
labels and the associated information that is sent or received.

B2: NAR.

B3: NAR.

A1: NAR.

Exportation to Single-Level Devices

C1: NR.

C2: NR.

B1: NEW: Single-level I/O devices and single-level communication channels
are not required to maintain the sensitivity labels of the
information they process. However, the TCB shall include a mechanism
by which the TCB and an authorized user reliably communicate to
designate the single security level of information imported or
exported via single-level communication channels or I/O devices.

B2: NAR.

B3: NAR.

A1: NAR.

Identification and Authentication

C1: NEW: The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected to
mediate. Furthermore, the TCB shall use a protected mechanism (e.g.,
passwords) to authenticate the user’s identity. The TCB shall
protect authentication data so that it cannot be accessed by any
unauthorized user.

C2: ADD: The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual ADP
system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by that
individual.

B1: CHANGE: Furthermore, the TCB shall maintain authentication data that
includes information for verifying the identity of individual users
(e.g., passwords) as well as information for determining the
clearance and authorizations of individual users. This data shall be
used by the TCB to authenticate the user’s identity and to determine
the security level and authorizations of subjects that may be created
to act on behalf of the individual user.

B2: NAR.

B3: NAR.

A1: NAR.
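
For illustration, protected authentication data holding a verifier for each
user together with the clearance information added at class (B1), and its use
when creating subjects on the user's behalf, might be sketched as follows; the
hashing scheme and record layout are hypothetical, not prescribed:

    import hashlib, hmac, os

    def make_entry(password, clearance_level):
        """Create protected authentication data: a salted verifier plus the
        user's clearance."""
        salt = os.urandom(16)
        verifier = hashlib.sha256(salt + password.encode()).digest()
        return {"salt": salt, "verifier": verifier, "clearance": clearance_level}

    def authenticate(entry, password):
        """Check a claimed identity against the stored verifier."""
        candidate = hashlib.sha256(entry["salt"] + password.encode()).digest()
        return hmac.compare_digest(candidate, entry["verifier"])

    def create_subject(entry, requested_level):
        """Create a subject acting for the user only at a level the user's
        clearance permits."""
        if requested_level > entry["clearance"]:
            raise PermissionError("requested level exceeds user clearance")
        return {"level": requested_level}

    entry = make_entry("example-password", clearance_level=2)
    print(authenticate(entry, "wrong password"))     # False
    print(create_subject(entry, requested_level=1))  # {'level': 1}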

Label Integrity

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels shall accurately represent security levels of
the specific subjects or objects with which they are associated. When
exported by the TCB, sensitivity labels shall accurately and
unambiguously represent the internal labels and shall be associated
with the information being exported.

B2: NAR.

B3: NAR.

A1: NAR.

Labeling Human-Readable Output

C1: NR.

C2: NR.

B1: NEW: The ADP system administrator shall be able to specify the
printable label names associated with exported sensitivity labels.
The TCB shall mark the beginning and end of all human-readable,
paged, hardcopy output (e.g., line printer output) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. The TCB shall, by default, mark the top and bottom of
each page of human-readable, paged, hardcopy output (e.g., line
printer output) with human-readable sensitivity labels that
properly* represent the overall sensitivity of the output or that
properly* represent the sensitivity of the information on the page.
The TCB shall, by default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps, graphics) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. Any override of these marking defaults shall be
auditable by the TCB.

B2: NAR.

B3: NAR.

A1: NAR.

____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________
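
The footnote rule can be illustrated with a short sketch that takes the
greatest hierarchical classification and exactly the union of the
non-hierarchical categories of the information on a page; the encoding is
hypothetical:

    def page_label(item_labels):
        """item_labels: iterable of (classification, set_of_categories) for the
        information on the page; returns the page's overall label."""
        classification = 0
        categories = set()
        for cls, cats in item_labels:
            classification = max(classification, cls)
            categories |= cats
        return classification, frozenset(categories)

    # Example: a page containing one Confidential item (category ALPHA) and one
    # Secret item (no categories) is marked Secret with category ALPHA.
    CONFIDENTIAL, SECRET = 1, 2
    print(page_label([(CONFIDENTIAL, {"ALPHA"}), (SECRET, set())]))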

Labels

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels associated with each subject and storage
object under its control (e.g., process, file, segment, device) shall
be maintained by the TCB. These labels shall be used as the basis
for mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an authorized
user the security level of the data, and all such actions shall be
auditable by the TCB.

B2: CHANGE: Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained by
the TCB.

B3: NAR.

A1: NAR.

Mandatory Access Control

C1: NR.

C2: NR.

B1: NEW: The TCB shall enforce a mandatory access control policy over all
subjects and storage objects under its control (e.g., processes,
files, segments, devices). These subjects and objects shall be
assigned sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the labels
shall be used as the basis for mandatory access control decisions.
The TCB shall be able to support two or more such security levels.
(See the Mandatory Access Control guidelines.) The following
requirements shall hold for all accesses between subjects and objects
controlled by the TCB: A subject can read an object only if the
hierarchical classification in the subject’s security level is
greater than or equal to the hierarchical classification in the
object’s security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical categories
in the object’s security level. A subject can write an object only
if the hierarchical classification in the subject’s security level is
less than or equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

B2: CHANGE: The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O devices) that
are directly or indirectly accessible by subjects external to the TCB.
The following requirements shall hold for all accesses between all
subjects external to the TCB and all objects directly or indirectly
accessible by these subjects:

B3: NAR.

A1: NAR.
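
For illustration, the read and write rules stated above can be expressed over
labels consisting of a hierarchical classification and a set of
non-hierarchical categories; the encoding and names are hypothetical, not
prescribed:

    def dominates(a, b):
        """a dominates b: a's classification is greater than or equal to b's
        and a's categories include all of b's."""
        return a[0] >= b[0] and a[1] >= b[1]

    def may_read(subject_level, object_level):
        # A subject can read an object only if the subject's security level
        # dominates the object's security level.
        return dominates(subject_level, object_level)

    def may_write(subject_level, object_level):
        # A subject can write an object only if the object's security level
        # dominates the subject's security level.
        return dominates(object_level, subject_level)

    SECRET = (2, frozenset({"ALPHA"}))
    CONFIDENTIAL = (1, frozenset())
    print(may_read(SECRET, CONFIDENTIAL))   # True: reading down is permitted
    print(may_write(SECRET, CONFIDENTIAL))  # False: writing down is not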

Object Reuse

C1: NR.

C2: NEW: When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data for
which the subject is not authorized.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.
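
For illustration, clearing a storage object drawn from a pool of unused
objects before it is reallocated might be sketched as follows; the pool
structure is hypothetical:

    class StoragePool:
        def __init__(self, object_size, count):
            self.object_size = object_size
            self.free = [bytearray(object_size) for _ in range(count)]

        def allocate(self):
            obj = self.free.pop()
            obj[:] = bytes(self.object_size)   # overwrite any residual contents
            return obj

        def release(self, obj):
            self.free.append(obj)              # cleared again on the next allocation

    pool = StoragePool(object_size=16, count=4)
    block = pool.allocate()
    block[:5] = b"hello"
    pool.release(block)
    print(pool.allocate()[:5])                 # all zero bytes: no residual data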

Security Features User’s Guide

C1: NEW: A single summary, chapter, or manual in user documentation shall
describe the protection mechanisms provided by the TCB, guidelines on
their use, and how they interact with one another.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Security Testing

C1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. Testing shall
be done to assure that there are no obvious ways for an unauthorized
user to bypass or otherwise defeat the security protection mechanisms
of the TCB. (See the Security Testing guidelines.)

C2: ADD: Testing shall also include a search for obvious flaws that would
allow violation of resource isolation, or that would permit
unauthorized access to the audit or authentication data.

B1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. A team of
individuals who thoroughly understand the specific implementation of
the TCB shall subject its design documentation, source code, and
object code to thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that would permit
a subject external to the TCB to read, change, or delete data
normally denied under the mandatory or discretionary security policy
enforced by the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter a state
such that it is unable to respond to communications initiated by
other users. All discovered flaws shall be removed or neutralized
and the TCB retested to demonstrate that they have been eliminated
and that new flaws have not been introduced. (See the Security
Testing Guidelines.)

B2: CHANGE: All discovered flaws shall be corrected and the TCB retested
to demonstrate that they have been eliminated and that new flaws have
not been introduced.

ADD: The TCB shall be found relatively resistant to penetration.
Testing shall demonstrate that the TCB implementation is consistent
with the descriptive top-level specification.

B3: CHANGE: The TCB shall be found resistant to penetration.

ADD: No design flaws and no more than a few correctable
implementation flaws may be found during testing and there shall be
reasonable confidence that few remain.

A1: CHANGE: Testing shall demonstrate that the TCB implementation is
consistent with the formal top-level specification.

ADD: Manual or other mapping of the FTLS to the source code may form
a basis for penetration testing.

Subject Sensitivity Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall immediately notify a terminal user of each change
in the security level associated with that user during an interactive
session. A terminal user shall be able to query the TCB as desired
for a display of the subject’s complete sensitivity label.

B3: NAR.

A1: NAR.

System Architecture

C1: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). Resources controlled
by the TCB may be a defined subset of the subjects and objects in
the ADP system.

C2: ADD: The TCB shall isolate the resources to be protected so that they
are subject to the access control and auditing requirements.

B1: ADD: The TCB shall maintain process isolation through the provision
of distinct address spaces under its control.

B2: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). The TCB shall maintain
process isolation through the provision of distinct address spaces
under its control. The TCB shall be internally structured into well-
defined largely independent modules. It shall make effective use of
available hardware to separate those elements that are protection-
critical from those that are not. The TCB modules shall be designed
such that the principle of least privilege is enforced. Features in
hardware, such as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely: readable,
writeable). The user interface to the TCB shall be completely
defined and all elements of the TCB identified.

B3: ADD: The TCB shall be designed and structured to use a complete,
conceptually simple protection mechanism with precisely defined
semantics. This mechanism shall play a central role in enforcing the
internal structuring of the TCB and the system. The TCB shall
incorporate significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward minimizing
the complexity of the TCB and excluding from the TCB modules that are
not protection-critical.

A1: NAR.

System Integrity

C1: NEW: Hardware and/or software features shall be provided that can be
used to periodically validate the correct operation of the on-site
hardware and firmware elements of the TCB.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Test Documentation

C1: NEW: The system developer shall provide to the evaluators a document
that describes the test plan and results of the security mechanisms’
functional testing.

C2: NAR.

B1: NAR.

B2: ADD: It shall include results of testing the effectiveness of the
methods used to reduce covert channel bandwidths.

B3: NAR.

A1: ADD: The results of the mapping between the formal top-level
specification and the TCB source code shall be given.

Trusted Distribution

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NR.

A1: NEW: A trusted ADP system control and distribution facility shall be
provided for maintaining the integrity of the mapping between the
master data describing the current version of the TCB and the on-site
master copy of the code for the current version. Procedures (e.g.,
site security acceptance testing) shall exist for assuring that the
TCB software, firmware, and hardware updates distributed to a
customer are exactly as specified by the master copies.

Trusted Facility Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support separate operator and administrator
functions.

B3: ADD: The functions performed in the role of a security administrator
shall be identified. The ADP system administrative personnel shall
only be able to perform security administrator functions after taking
a distinct auditable action to assume the security administrator role
on the ADP system. Non-security functions that can be performed in
the security administration role shall be limited strictly to those
essential to performing the security role effectively.

A1: NAR.

Trusted Facility Manual

C1: NEW: A manual addressed to the ADP system administrator shall present
cautions about functions and privileges that should be controlled
when running a secure facility.

C2: ADD: The procedures for examining and maintaining the audit files as
well as the detailed audit record structure for each type of audit
event shall be given.

B1: ADD: The manual shall describe the operator and administrator
functions related to security, to include changing the
characteristics of a user. It shall provide guidelines on the
consistent and effective use of the protection features of the
system, how they interact, how to securely generate a new TCB, and
facility procedures, warnings, and privileges that need to be
controlled in order to operate the facility in a secure manner.

B2: ADD: The TCB modules that contain the reference validation mechanism
shall be identified. The procedures for secure generation of a new
TCB from source after modification of any modules in the TCB shall
be described.

B3: ADD: It shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in system
operation.

A1: NAR.

Trusted Path

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support a trusted communication path between
itself and user for initial login and authentication. Communications
via this path shall be initiated exclusively by a user.

B3: CHANGE: The TCB shall support a trusted communication path between
itself and users for use when a positive TCB-to-user connection is
required (e.g., login, change subject security level).
Communications via this trusted path shall be activated exclusively
by a user or the TCB and shall be logically isolated and unmistakably
distinguishable from other paths.

A1: NAR.

Trusted Recovery

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NEW: Procedures and/or mechanisms shall be provided to assure that,
after an ADP system failure or other discontinuity, recovery without a
protection compromise is obtained.

A1: NAR.

(this page is reserved for Figure 1)

GLOSSARY

Access – A specific type of interaction between a subject and an object
that results in the flow of information from one to the other.

Approval/Accreditation – The official authorization that is
granted to an ADP system to process sensitive information in
its operational environment, based upon comprehensive
security evaluation of the system’s hardware, firmware, and
software security design, configuration, and implementation
and of the other system procedural, administrative,
physical, TEMPEST, personnel, and communications security
controls.

Audit Trail – A set of records that collectively provide
documentary evidence of processing used to aid in tracing
from original transactions forward to related records and
reports, and/or backwards from records and reports to their
component source transactions.

Authenticate – To establish the validity of a claimed identity.

Automatic Data Processing (ADP) System – An assembly of computer
hardware, firmware, and software configured for the purpose
of classifying, sorting, calculating, computing,
summarizing, transmitting and receiving, storing, and
retrieving data with a minimum of human intervention.

Bandwidth – A characteristic of a communication channel that is
the amount of information that can be passed through it in a
given amount of time, usually expressed in bits per second.

Bell-LaPadula Model – A formal state transition model of computer
security policy that describes a set of access control
rules. In this formal model, the entities in a computer
system are divided into abstract sets of subjects and
objects. The notion of a secure state is defined and it is
proven that each state transition preserves security by
moving from secure state to secure state; thus, inductively
proving that the system is secure. A system state is
defined to be “secure” if the only permitted access modes of
subjects to objects are in accordance with a specific
security policy. In order to determine whether or not a
specific access mode is allowed, the clearance of a subject
is compared to the classification of the object and a
determination is made as to whether the subject is
authorized for the specific access mode. The
clearance/classification scheme is expressed in terms of a
lattice. See also: Lattice, Simple Security Property, *-
Property.

Certification – The technical evaluation of a system’s security
features, made as part of and in support of the
approval/accreditation process, that establishes the extent
to which a particular computer system’s design and
implementation meet a set of specified security
requirements.

Channel – An information transfer path within a system. May also
refer to the mechanism by which the path is effected.

Covert Channel – A communication channel that allows a process to
transfer information in a manner that violates the system’s
security policy. See also: Covert Storage Channel, Covert
Timing Channel.

Covert Storage Channel – A covert channel that involves the
direct or indirect writing of a storage location by one
process and the direct or indirect reading of the storage
location by another process. Covert storage channels
typically involve a finite resource (e.g., sectors on a
disk) that is shared by two subjects at different security
levels.

Covert Timing Channel – A covert channel in which one process
signals information to another by modulating its own use of
system resources (e.g., CPU time) in such a way that this
manipulation affects the real response time observed by the
second process.

Data – Information with a specific physical representation.

Data Integrity – The state that exists when computerized data is
the same as that in the source documents and has not been
exposed to accidental or malicious alteration or
destruction.

Descriptive Top-Level Specification (DTLS) – A top-level
specification that is written in a natural language (e.g.,
English), an informal program design notation, or a
combination of the two.

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject.

Domain – The set of objects that a subject has the ability to
access.

Dominate – Security level S1 is said to dominate security level
S2 if the hierarchical classification of S1 is greater than
or equal to that of S2 and the non-hierarchical categories
of S1 include all those of S2 as a subset.

Exploitable Channel – Any channel that is usable or detectable
by subjects external to the Trusted Computing Base.

Flaw Hypothesis Methodology – A system analysis and penetration
technique where specifications and documentation for the
system are analyzed and then flaws in the system are
hypothesized. The list of hypothesized flaws is then
prioritized on the basis of the estimated probability that a
flaw actually exists and, assuming a flaw does exist, on the
ease of exploiting it and on the extent of control or
compromise it would provide. The prioritized list is used
to direct the actual testing of the system.

Flaw – An error of commission, omission, or oversight in a system
that allows protection mechanisms to be bypassed.

Formal Proof – A complete and convincing mathematical argument,
presenting the full logical justification for each proof
step, for the truth of a theorem or set of theorems. The
formal verification process uses formal proofs to show the
truth of certain properties of formal specification and to show
that computer programs satisfy their specifications.

Formal Security Policy Model – A mathematically precise statement
of a security policy. To be adequately precise, such a
model must represent the initial state of a system, the way
in which the system progresses from one state to another,
and a definition of a “secure” state of the system. To be
acceptable as a basis for a TCB, the model must be supported
by a formal proof that if the initial state of the system
satisfies the definition of a “secure” state and if all
assumptions required by the model hold, then all future
states of the system will be secure. Some formal modeling
techniques include: state transition models, temporal logic
models, denotational semantics models, algebraic
specification models. An example is the model described by
Bell and LaPadula in reference [2]. See also: Bell-
LaPadula Model, Security Policy Model.

Formal Top-Level Specification (FTLS) – A Top-Level Specification
that is written in a formal mathematical language to allow
theorems showing the correspondence of the system
specification to its formal requirements to be hypothesized
and formally proven.

Formal Verification – The process of using formal proofs to
demonstrate the consistency (design verification) between a
formal specification of a system and a formal security
policy model or (implementation verification) between the
formal specification and its program implementation.

Functional Testing – The portion of security testing in which the
advertised features of a system are tested for correct
operation.

General-Purpose System – A computer system that is designed to
aid in solving a wide variety of problems.

Lattice – A partially ordered set for which every pair of
elements has a greatest lower bound and a least upper bound.

Least Privilege – This principle requires that each subject in a
system be granted the most restrictive set of privileges (or
lowest clearance) needed for the performance of authorized
tasks. The application of this principle limits the damage
that can result from accident, error, or unauthorized use.

Mandatory Access Control – A means of restricting access to
objects based on the sensitivity (as represented by a label)
of the information contained in the objects and the formal
authorization (i.e., clearance) of subjects to access
information of such sensitivity.

Multilevel Device – A device that is used in a manner that
permits it to simultaneously process data of two or more
security levels without risk of compromise. To accomplish
this, sensitivity labels are normally stored on the same
physical medium and in the same form (i.e., machine-readable
or human-readable) as the data being processed.

Multilevel Secure – A class of system containing information with
different sensitivities that simultaneously permits access
by users with different security clearances and needs-to-
know, but prevents users from obtaining access to
information for which they lack authorization.

Object – A passive entity that contains or receives information.
Access to an object potentially implies access to the
information it contains. Examples of objects are: records,
blocks, pages, segments, files, directories, directory
trees, and programs, as well as bits, bytes, words, fields,
processors, video displays, keyboards, clocks, printers,
network nodes, etc.

Object Reuse – The reassignment to some subject of a medium
(e.g., page frame, disk sector, magnetic tape) that
contained one or more objects. To be securely reassigned,
such media must contain no residual data from the previously
contained object(s).

Output – Information that has been exported by a TCB.

Password – A private character string that is used to
authenticate an identity.

Penetration Testing – The portion of security testing in which
the penetrators attempt to circumvent the security features
of a system. The penetrators may be assumed to use all
system design and implementation documentation, which may
include listings of system source code, manuals, and circuit
diagrams. The penetrators work under no constraints other
than those that would be applied to ordinary users.

Process – A program in execution. It is completely characterized
by a single current execution point (represented by the
machine state) and address space.

Protection-Critical Portions of the TCB – Those portions of the
TCB whose normal function is to deal with the control of
access between subjects and objects.

Protection Philosophy – An informal description of the overall
design of a system that delineates each of the protection
mechanisms employed. A combination (appropriate to the
evaluation class) of formal and informal techniques is used
to show that the mechanisms are adequate to enforce the
security policy.

Read – A fundamental operation that results only in the flow of
information from an object to a subject.

Read Access – Permission to read information.

Reference Monitor Concept – An access control concept that refers
to an abstract machine that mediates all accesses to objects
by subjects.

Resource – Anything used or consumed while performing a function.
The categories of resources are: time, information, objects
(information containers), or processors (the ability to use
information). Specific examples are: CPU time; terminal
connect time; amount of directly-addressable memory; disk
space; number of I/O requests per minute, etc.

Security Kernel – The hardware, firmware, and software elements
of a Trusted Computing Base that implement the reference
monitor concept. It must mediate all accesses, be protected
from modification, and be verifiable as correct.

Security Level – The combination of a hierarchical classification
and a set of non-hierarchical categories that represents the
sensitivity of information.

Security Policy – The set of laws, rules, and practices that
regulate how an organization manages, protects, and
distributes sensitive information.

Security Policy Model – An informal presentation of a formal
security policy model.

Security Testing – A process used to determine that the security
features of a system are implemented as designed and that
they are adequate for a proposed application environment.
This process includes hands-on functional testing,
penetration testing, and verification. See also: Functional
Testing, Penetration Testing, Verification.

Sensitive Information – Information that, as determined by a
competent authority, must be protected because its
unauthorized disclosure, alteration, loss, or destruction
will at least cause perceivable damage to someone or
something.

Sensitivity Label – A piece of information that represents the
security level of an object and that describes the
sensitivity (e.g., classification) of the data in the
object. Sensitivity labels are used by the TCB as the basis
for mandatory access control decisions.

Simple Security Property – A Bell-LaPadula security model rule
allowing a subject read access to an object only if the
security level of the subject dominates the security level
of the object.

Single-Level Device – A device that is used to process data of a
single security level at any one time. Since the device
need not be trusted to separate data of different security
levels, sensitivity labels do not have to be stored with the
data being processed.

*-Property (Star Property) – A Bell-LaPadula security model rule
allowing a subject write access to an object only if the
security level of the subject is dominated by the security
level of the object. Also known as the Confinement
Property.
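
Together with the Simple Security Property, the rule reduces to two
checks over the dominance relation. The sketch below represents a
security level as a (classification, categories) pair; the function
names are invented for illustration and are not taken from the model.

    # Sketch: Bell-LaPadula access checks (illustrative only).
    def dominates(level_a, level_b) -> bool:
        """(class, categories): class at least as high, categories a superset."""
        class_a, cats_a = level_a
        class_b, cats_b = level_b
        return class_a >= class_b and set(cats_a) >= set(cats_b)

    def can_read(subject_level, object_level) -> bool:
        # Simple Security Property: read only if the subject dominates the object.
        return dominates(subject_level, object_level)

    def can_write(subject_level, object_level) -> bool:
        # *-Property: write only if the object dominates the subject.
        return dominates(object_level, subject_level)

For example, can_write((1, {"crypto"}), (2, {"crypto"})) is true (a
lower-level subject may write up), while can_read for the same pair
is false.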

Storage Object – An object that supports both read and write
accesses.

Subject – An active entity, generally in the form of a person,
process, or device that causes information to flow among
objects or changes the system state. Technically, a
process/domain pair.

Subject Security Level – A subject’s security level is equal to
the security level of the objects to which it has both read
and write access. A subject’s security level must always be
dominated by the clearance of the user the subject is
associated with.

TEMPEST – The study and control of spurious electronic signals
emitted from ADP equipment.

Top-Level Specification (TLS) – A non-procedural description of
system behavior at the most abstract level. Typically a
functional specification that omits all implementation
details.

Trap Door – A hidden software or hardware mechanism that permits
system protection mechanisms to be circumvented. It is
activated in some non-apparent manner (e.g., special
“random” key sequence at a terminal).

Trojan Horse – A computer program with an apparently or actually
useful function that contains additional (hidden) functions
that surreptitiously exploit the legitimate authorizations
of the invoking process to the detriment of security. For
example, making a “blind copy” of a sensitive file for the
creator of the Trojan Horse.

Trusted Computer System – A system that employs sufficient
hardware and software integrity measures to allow its use
for processing simultaneously a range of sensitive or
classified information.

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system — including hardware,
firmware, and software — the combination of which is
responsible for enforcing a security policy. It creates a
basic protection environment and provides additional user
services required for a trusted computer system. The
ability of a trusted computing base to correctly enforce a
security policy depends solely on the mechanisms within the
TCB and on the correct input by system administrative
personnel of parameters (e.g., a user’s clearance) related
to the security policy.

Trusted Path – A mechanism by which a person at a terminal can
communicate directly with the Trusted Computing Base. This
mechanism can only be activated by the person or the Trusted
Computing Base and cannot be imitated by untrusted software.

Trusted Software – The software portion of a Trusted Computing
Base.

User – Any person who interacts directly with a computer system.

Verification – The process of comparing two levels of system
specification for proper correspondence (e.g., security
policy model with top-level specification, TLS with source
code, or source code with object code). This process may or
may not be automated.

Write – A fundamental operation that results only in the flow of
information from a subject to an object.

Write Access – Permission to write an object.

REFERENCES

1. Anderson, J. P. Computer Security Technology Planning
Study, ESD-TR-73-51, vol. I, ESD/AFSC, Hanscom AFB,
Bedford, Mass., October 1972 (NTIS AD-758 206).

2. Bell, D. E. and LaPadula, L. J. Secure Computer Systems:
Unified Exposition and Multics Interpretation, MTR-2997
Rev. 1, MITRE Corp., Bedford, Mass., March 1976.

3. Brand, S. L. “An Approach to Identification and Audit of
Vulnerabilities and Control in Application Systems,” in
Audit and Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

4. Brand, S. L. “Data Processing and A-123,” in Proceedings of
the Computer Performance Evaluation User’s Group 18th
Meeting, C. B. Wilson, ed., NBS Special Publication
#500-95, October 1982.

5. Denning, D. E. “A Lattice Model of Secure Information
Flow,” in Communications of the ACM, vol. 19, no. 5
(May 1976), pp. 236-243.

6. Denning, D. E. Secure Information Flow in Computer Systems,
Ph.D. dissertation, Purdue Univ., West Lafayette, Ind.,
May 1975.

7. DoD 5200.1-R, Information Security Program Regulation,
August 1982.

8. DoD Directive 5200.28, Security Requirements for Automatic
Data Processing (ADP) Systems, revised April 1978.

9. DoD 5200.28-M, ADP Security Manual — Techniques and
Procedures for Implementing, Deactivating, Testing, and
Evaluating Secure Resource-Sharing ADP Systems, revised
June 1979.

10. DoD Directive 5215.1, Computer Security Evaluation Center,
25 October 1982.

11. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, January 1983.

12. DoD 5220.22-R, Industrial Security Regulation, January 1983.

13. DoD Directive 5400.11, Department of Defense Privacy
Program, 9 June 1982.

14. Executive Order 12356, National Security Information,
6 April 1982.

15. Faurer, L. D. “Keeping the Secrets Secret,” in Government
Data Systems, November – December 1981, pp. 14-17.

16. Federal Information Processing Standards Publication (FIPS
PUB) 39, Glossary for Computer Systems Security,
15 February 1976.

17. Federal Information Processing Standards Publication (FIPS
PUB) 73, Guidelines for Security of Computer
Applications, 30 June 1980.

18. Federal Information Processing Standards Publication (FIPS
PUB) 102, Guideline for Computer Security Certification
and Accreditation.

19. Lampson, B. W. “A Note on the Confinement Problem,” in
Communications of the ACM, vol. 16, no. 10 (October
1973), pp. 613-615.

20. Lee, T. M. P., et al. “Processors, Operating Systems and
Nearby Peripherals: A Consensus Report,” in Audit and
Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

21. Lipner, S. B. A Comment on the Confinement Problem, MITRE
Corp., Bedford, Mass.

22. Millen, J. K. “An Example of a Formal Flow Violation,” in
Proceedings of the IEEE Computer Society 2nd
International Computer Software and Applications
Conference, November 1978, pp. 204-208.

23. Millen, J. K. “Security Kernel Validation in Practice,” in
Communications of the ACM, vol. 19, no. 5 (May 1976),
pp. 243-250.

24. Nibaldi, G. H. Proposed Technical Evaluation Criteria for
Trusted Computer Systems, MITRE Corp., Bedford, Mass.,
M79-225, AD-A108-832, 25 October 1979.

25. Nibaldi, G. H. Specification of A Trusted Computing Base,
(TCB), MITRE Corp., Bedford, Mass., M79-228, AD-A108-
831, 30 November 1979.

26. OMB Circular A-71, Transmittal Memorandum No. 1, Security of
Federal Automated Information Systems, 27 July 1978.

27. OMB Circular A-123, Internal Control Systems, 5 November
1981.

28. Ruthberg, Z. and McKenzie, R., eds. Audit and Evaluation of
Computer Security, in NBS Special Publication #500-19,
October 1977.

29. Schaefer, M., Linde, R. R., et al. “Program Confinement in
KVM/370,” in Proceedings of the ACM National
Conference, October 1977, Seattle.

30. Schell, R. R. “Security Kernels: A Methodical Design of
System Security,” in Technical Papers, USE Inc. Spring
Conference, 5-9 March 1979, pp. 245-250.

31. Trotter, E. T. and Tasker, P. S. Industry Trusted Computer
Systems Evaluation Process, MITRE Corp., Bedford,
Mass., MTR-3931, 1 May 1980.

32. Turn, R. Trusted Computer Systems: Needs and Incentives for
Use in Government and Private Sector, (AD # A103399),
Rand Corporation (R-28811-DR&E), June 1981.

33. Walker, S. T. “The Advent of Trusted Computer Operating
Systems,” in National Computer Conference Proceedings,
May 1980, pp. 655-665.

34. Ware, W. H., ed., Security Controls for Computer Systems:
Report of Defense Science Board Task Force on Computer
Security, AD # A076617/0, Rand Corporation, Santa
Monica, Calif., February 1970, reissued October 1979.

DoD STANDARD 5200.28: SUMMARY OF THE DIFFERENCES
BETWEEN IT AND CSC-STD-001-83

Note: Text which has been added or changed is indented and preceded by a > sign.
Text which has been deleted is enclosed in slashes (/). “Computer Security
Center” was changed to “National Computer Security Center” throughout the
document.

The FOREWORD Section was rewritten and signed by Mr. Don Latham on
26 Dec 85. The ACKNOWLEDGEMENTS Section was updated.

The PREFACE was changed as follows:

PREFACE

The trusted computer system evaluation criteria defined in this
document classify systems into four broad hierarchical divisions
of enhanced security protection. The criteria provide a basis
for the evaluation of effectiveness of security controls built
into automatic data processing system products. The criteria
were developed with three objectives in mind: (a) to provide
users with a yardstick with which to assess the degree of trust
that can be placed in computer systems for the secure processing
of classified or other sensitive information; (b) to provide
guidance to manufacturers as to what to build into their new,
widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide
a basis for specifying security requirements in acquisition
specifications. Two types of requirements are delineated for
secure processing: (a) specific security feature requirements and
(b) assurance requirements. Some of the latter requirements
enable evaluation personnel to determine if the required features
are present and functioning as intended.

>The scope of these criteria is to be applied to
>the set of components comprising a trusted system, and is
>not necessarily to be applied to each system component
>individually. Hence, some components of a system may be
>completely untrusted, while others may be individually
>evaluated to a lower or higher evaluation class than the
>trusted product considered as a whole system. In trusted
>products at the high end of the range, the strength of the
>reference monitor is such that most of the system
>components can be completely untrusted.

Though the criteria are

>intended to be

application-independent, /it is recognized that/ the
specific security feature requirements may have to be
interpreted when applying the criteria to specific

>systems with their own functional requirements,
>applications or special environments (e.g., communications
>processors, process control computers, and embedded systems
>in general).

The underlying assurance requirements can be
applied across the entire spectrum of ADP system or
application processing environments without special
interpretation.

The SCOPE Section was changed as follows:

Scope

The trusted computer system evaluation criteria defined in this
document apply

>primarily

to /both/ trusted, commercially available
automatic data processing (ADP) systems.

>They are also applicable, as amplified below, to the
>evaluation of existing systems and to the specification of
>security requirements for ADP systems acquisition.

Included are two distinct sets of requirements: 1) specific security
feature requirements; and 2) assurance requirements. The specific
feature requirements encompass the capabilities typically found
in information processing systems employing general-purpose
operating systems that are distinct from the applications programs
being supported.

>However, specific security feature requirements
>may also apply to specific systems with their own functional
>requirements, applications or special environments (e.g.,
>communications processors, process control computers, and embedded
>systems in general).

The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments
from dedicated controllers to full range multilevel secure resource
sharing systems.

Changed the Purpose Section as follows:

Purpose

As outlined in the Preface, the criteria have been developed to
serve a number of intended purposes:

To provide

>a standard

to manufacturers as to what security features to build
into their new and planned, … trust requirements

>(with particular emphasis on preventing the
>disclosure of data)

for sensitive applications.

To provide

>DoD components

with a metric with which to evaluate
the degree of trust that can be placed in …

To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the

>second

purpose for development of the criteria, i.e., providing

>DoD components

with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be
performed on a computer product from a perspective that
excludes the application environment; or, (b) it can be
done to assess whether appropriate security measures …

The latter type of evaluation, i.e., those done for the purpose
of assessing a system’s security attributes with respect to a
specific operational mission, is known as a certification
evaluation. It must be understood that the completion of a
formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific
application environment. On the contrary, the evaluation report
only provides a trusted computer system’s evaluation rating along
with supporting data describing the product system’s strengths
and weaknesses from a computer security point of view. The
system security certification and the formal
approval/accreditation procedure, done in accordance with the
applicable policies of the issuing agencies, must still be
followed before a system can be approved for use in processing or
handling classified information.[8;9]

>Designated Approving Authorities (DAAs) remain ultimately
>responsible for specifying security of systems they
>accredit.

The trusted computer system evaluation criteria will be used
directly and indirectly in the certification process. Along with
applicable policy, it will be used directly as

>technical guidance

for evaluation of the total system and for specifying system
security and certification requirements for new acquisitions. Where
a system being evaluated for certification employs a product that
has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation.
Technical data will be furnished to designers, evaluators and the
Designated Approving Authorities to support their needs for
making decisions.

2.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 2.2.1.1 as follows:

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.
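
As an illustration only (nothing below is prescribed by the criteria;
the class and field names are invented), one common shape for such an
enforcement mechanism is a per-object access control list that names
individuals and groups, can exclude a single user, and restricts who
may hand out further access:

    # Sketch: discretionary access control via a per-object ACL (illustrative).
    class ACL:
        def __init__(self, owner: str):
            self.owner = owner
            self.user_modes = {}   # user  -> set of modes, e.g. {"read", "write"}
            self.group_modes = {}  # group -> set of modes
            self.denied = set()    # users excluded regardless of group membership

        def grant(self, grantor: str, user: str, modes: set) -> None:
            # Simplification: only the object's owner may assign access,
            # one way of limiting propagation of access rights.
            if grantor != self.owner:
                raise PermissionError("only authorized users may assign access")
            self.user_modes.setdefault(user, set()).update(modes)

        def check(self, user: str, groups: set, mode: str) -> bool:
            # Inclusion or exclusion to the granularity of a single user.
            if user in self.denied:
                return False
            if mode in self.user_modes.get(user, set()):
                return True
            return any(mode in self.group_modes.get(g, set()) for g in groups)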

Completely Reworded Section 2.2.1.2 as follows:

2.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Reworded Section 2.2.2.2 as follows:

2.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the
actions of any one or more users based on individual
identity.
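
For the reader, a minimal sketch of an audit record carrying the
fields required above, together with selective auditing by individual
identity; the record layout and names are illustrative, not
prescribed by the criteria.

    # Sketch: audit trail records and selective audit (illustrative only).
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class AuditRecord:
        timestamp: datetime                # date and time of the event
        user: str                          # individual on whose behalf it occurred
        event_type: str                    # e.g. "login", "file_open", "object_delete"
        success: bool                      # success or failure of the event
        origin: Optional[str] = None       # e.g. terminal ID, for I&A events
        object_name: Optional[str] = None  # for object introduction/deletion events

    def select_by_user(trail: List[AuditRecord], users: set) -> List[AuditRecord]:
        """Selectively audit the actions of one or more users by identity."""
        return [r for r in trail if r.user in users]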

Changed Section 2.2.4.3 as follows:

2.2.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.1.1.1 as follows:

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.1.1.2 as follows:

3.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.1.1.3.2 as follows:

3.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/ communication
channel or I/O device.

Appended a sentence to Section 3.1.1.4 as follows:

3.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed one sentence in Section 3.1.2.1 as follows:

3.1.2.1. Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded Section 3.1.2.2 as follows:

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

> and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level.

‘Unbolded’ the first sentence of Section 3.1.3.2.1.

Reworded Section 3.1.3.2.2 as follows:

3.1.3.2.2 Design Specification and Verification

An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

Changed sentence as follows:

3.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.2.1.1 as follows:

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.2.1.2 as follows:

3.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.2.1.3 as follows:

3.2.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.2.1.3.2 as follows:

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.2.1.4 as follows:

3.2.1.4 Mandatory Access Control

… Identification and authentication data shall be
used by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.2.2.1 as follows:

3.2.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded section 3.2.2.2 as follows:

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels.

Changed Section 3.2.3.2.2 as follows:

3.2.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its
axioms. A descriptive top-level specification
(DTLS) of the TCB shall be maintained that
completely and accurately describes the TCB in
terms of exceptions, error messages, and effects.
It shall be shown to be an accurate description of
the TCB interface.

Changed Section 3.2.4.3 as follows:

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant”:

3.2.4.4 Design Documentation

Documentation shall be available that provides a
description of the manufacturer’s philosophy of
protection and an explanation of how this philosophy is
translated into the TCB. The interfaces between the
TCB modules shall be described. A formal description
of the security policy model enforced by the TCB shall
be available and proven that it is sufficient to
enforce the security policy. The specific TCB
protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.
The descriptive top-level specification (DTLS) shall be
shown to be an accurate description of the TCB
interface. Documentation shall describe how the TCB
implements the reference monitor concept and give an
explanation why it is

>tamper resistant,

cannot be bypassed, and is correctly implemented.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting
the channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage
channels, the use of which is not detectable by the
auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

Changed Section 3.3.1.1 as follows:

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to limit
>propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 3.3.1.2 as follows:

3.3.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.3.1.3 as follows:

3.3.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.3.1.3.2 as follows:

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.3.1.4 as follows:

3.3.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.3.2.1 as follows:

3.3.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 3.3.2.2 as follows:

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and if the occurrence or accumulation
>of these security relevant events continues,
>the system shall take the least disruptive
>action to terminate the event.

Changed the first sentence of Section 3.3.3.2.2 as follows:

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of
>the ADP system

that is proven consistent with its axioms. …

Changed Section 3.3.4.3 as follows:

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant” in Section 3.3.4.4.

Changed Section 4.1.1.1 as follows:

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 4.1.1.2 as follows:

4.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 4.1.1.3 as follows:

4.1.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object,

>ROM)

that is directly or indirectly accessible by subjects
external to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 4.1.1.3.2 as follows:

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level

>or levels

associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 4.1.1.4 as follows:

4.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 4.1.2.1 as follows:

4.1.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 4.1.2.2 as follows:

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to
>terminate the event.

‘Unbolded’ the words “covert channels” in Section 4.1.3.1.3.

Changed the first sentence of Section 4.1.3.2.2 as follows:

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms. …

Changed Section 4.1.4.3 as follows:

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested, and

results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths. The results of the mapping between the
formal top-level specification and the TCB source code
shall be given.

Replaced “tamperproof” with “tamper resistant” in Section 4.1.4.4.

Changed the last paragraph of Section 5.1 as follows:

5.1 A Need for Consensus

A major goal of …

As described …

>The purpose of this section is to describe in detail the
>fundamental control objectives. These objectives lay the
>foundation for the requirements outlined in the criteria.

The goal is to explain the foundations so that those outside
the National Security Establishment can assess their
universality and, by extension, the universal applicability
of the criteria requirements to processing all types of
sensitive applications whether they be for National Security
or the private sector.

Changed the second paragraph of Section 6.2 as follows:

6.2 A Formal Policy Model

Following the publication of …

>A subject can act on behalf of a user or another
>subject. The subject is created as a surrogate
>for the cleared user and is assigned a formal
>security level based on their classification.
>The state transitions and invariants of the formal
>policy model define the invariant relationships
>that must hold between the clearance of the user,
>the formal security level of any process that can
>act on the user’s behalf, and the formal security
>level of the devices and other objects to which any
>process can obtain specific modes of access.

The Bell and LaPadula model,

>for example,

defines a relationship between

>formal security levels of subjects and objects,

now referenced as the “dominance relation.” From this definition …
… Both the Simple Security Condition and the *-Property
include mandatory security provisions based on the dominance
relation between the

>formal security levels of subjects and objects.

The Discretionary Security Property …
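
Stated compactly for the reader (a paraphrase offered here, not text
from the model itself), with a formal security level written as a
pair of a classification and a category set:

    \[
    (c_1, S_1)\ \text{dominates}\ (c_2, S_2)
        \iff c_1 \ge c_2 \ \text{and}\ S_1 \supseteq S_2
    \]

Under this relation the Simple Security Condition permits a subject s
read access to an object o only if the level of s dominates the level
of o, the *-Property permits write access only if the level of o
dominates the level of s, and the level of any subject acting on a
user's behalf must itself be dominated by that user's clearance.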

Added a sentence to the end of Section 7.0:

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security
requirements and Section 5 presents the control objectives
for Trusted Computer Systems. They are general
requirements, useful and necessary, for the development of
all secure systems. However, when designing systems that
will be used to process classified or other sensitive
information, functional requirements for meeting the Control
Objectives become more specific. There is a large body of
policy laid down in the form of Regulations, Directives,
Presidential Executive Orders, and OMB Circulars that form
the basis of the procedures for the handling and processing
of Federal information in general and classified information
specifically. This section presents pertinent excerpts from
these policy statements and discusses their relationship to
the Control Objectives.

>These excerpts are examples to illustrate the relationship
>of the policies to criteria and may not be complete.

Inserted the following

>as the next to last paragraph

of Section 7.2:

>DoD Directive 5200.28 provides the security requirements for
>ADP systems. For some types of information, such as
>Sensitive Compartmented Information (SCI), DoD Directive
>5200.28 states that other minimum security requirements also
>apply. These minima are found in DCID 1/16 (new reference
>number 5) which is implemented in DIAM 50-4 (new reference
>number 6) for DoD and DoD contractor ADP systems.

From requirements imposed by …

Changed Footnote #1 referenced by Section 7.2 as follows:

Replaced “Health and Human Services Department” with “U.S.
Information Agency.”

Changed (updated) the quote from DoD 5220.22-M, Section 7.3.1, as
follows:

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking …

DoD 5220.22-M, “Industrial Security …

>”a. General. Classification designation by physical
>marking, notation or other means serves to warn and to
>inform the holder what degree of protection against
>unauthorized disclosure is required for that
>information or material.” (14)

Changed the

>last paragraph

of Section 7.5 as follows:

A major component of assurance, life-cycle assurance,

>as described in DoD Directive 7920.1,

is concerned with testing ADP systems both in the
development phase as well as during operation.

>(17)

DoD Directive 5215.1 …

Changed Section 9.0 as follows:

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement …

* The number of hierarchical classifications should be
greater than or equal to

>sixteen (16).

* The number of non-hierarchical categories should be
greater than or equal to

>sixty-four (64).

Completely reworded the third paragraph of Formal Product
Evaluation, in Appendix A, as follows:

Formal Product Evaluation

The formal product evaluation provides …

A formal product evaluation begins with …

>The evaluation team writes a final report on their findings about
>the system. The report is publicly available (containing no
>proprietary or sensitive information) and contains the overall
>class rating assigned to the system and the details of the
>evaluation team’s findings when comparing the product against the
>evaluation criteria. Detailed information concerning
>vulnerabilities found by the evaluation team is furnished to the
>system developers and designers as each is found so that the
>vendor has a chance to eliminate as many of them as possible
>prior to the completion of the Formal Product Evaluation.
>Vulnerability analyses and other proprietary or sensitive
>information are controlled within the Center through the
>Vulnerability Reporting Program and are distributed only within
>the U.S. Government on a strict need-to-know and non-disclosure
>basis, and to the vendor.

Changed two paragraphs in Audit (Appendix D) as follows:

C2: NEW: The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction an
audit trail of accesses to the objects it protects. The
audit data shall be protected by the TCB so that read access
to it is limited to those who are authorized for audit data.
The TCB shall be able to record the following types of
events: use of identification and authentication mechanisms,
introduction of objects into a user’s address space (e.g.,
file open, program initiation), deletion of objects, actions
taken by computer operators and system administrators and/or
system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the actions
of any one or more users based on individual identity.

B3: ADD: …when thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to terminate
>the event.

Changed one paragraph in Design Documentation (Appendix D):

B2: ADD: Change “tamperproof” to “tamper resistant.”

Changed two paragraphs in Design Specification and Verification:

B1: NEW: An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

B2: CHANGE: A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms.

Changed two paragraphs in Discretionary Access Control as follows:

C2: CHANGE: The enforcement mechanism (e.g., self/group/public
controls, access control lists) shall allow users to specify
and control sharing of those objects by named individuals,
or defined groups of individuals, or by both,

>and shall provide controls to limit propagation of access rights.

B3: CHANGE: The enforcement mechanism (e.g., access control
lists) shall allow users to specify and control sharing of
those objects,

>and shall provide controls to limit propagation of access rights.

These access controls shall be capable of specifying, for each
named object, a list of named individuals and a list of groups of
named individuals with their respective modes of access to that object.

Changed 1 paragraph in Exportation of Labeled Information:

B1: NEW: The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any change
in this designation shall be done manually and shall be
auditable by the TCB. The TCB shall maintain and be able to
audit any change in the /current/ security level

>or levels

associated with a /single-level/ communication channel or
I/O device.

Changed 1 paragraph in Identification and Authentication:

B1: CHANGE: … This data shall be used by the TCB to authenticate
the user’s identity and

>to ensure that

the security level and authorizations of subjects external to
the TCB that may be created to act on behalf of the individual
user

>are dominated by the clearance and authorization
>of that user.

Changed 1 paragraph in Labels:

B2: CHANGE: … (e.g., subject, storage object, ROM) …

Changed 1 paragraph in Mandatory Access Control:

B1: NEW: … Identification and authentication data shall be used

>by the TCB to authenticate the user’s identity and to ensure
>that the security level and authorization of subjects external
>to the TCB that may be created to act on behalf of the
>individual user are dominated by the clearance and authoriza-
>tion of that user.

Rewrote 1 paragraph in Object Reuse:

C2: NEW:
>All authorizations to the information contained
>within a storage object shall be revoked prior to initial
>assignment, allocation or reallocation to a subject from the
>TCB’s pool of unused storage objects. No information,
>including encrypted representations of information, produced
>by a prior subject’s actions is to be available to any
>subject that obtains access to an object that has been
>released back to the system.

Changed 1 paragraph in Test Documentation:

C1: NEW: The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

GLOSSARY

Changed Discretionary Access Control:

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject

>(unless restrained by mandatory access control).

Added:

Front-End Security Filter – A process that is invoked to process
data according to a specified security policy prior to
releasing the data outside the processing environment or
upon receiving data from an external source.

Granularity – The relative fineness or coarseness by which a
mechanism can be adjusted. The phrase “the granularity of
a single user” means the access control mechanism can be
adjusted to include or exclude any single user.

Read-Only Memory (ROM) – A storage area in which the contents
can be read but not altered during normal computer
processing.

Security Relevant Event – Any event that attempts to change the
security state of the system, (e.g., change discretionary
access controls, change the security level of the subject,
change user password, etc.). Also, any event that attempts
to violate the security policy of the system, (e.g., too
many attempts to login, attempts to violate the mandatory
access control limits of a device, attempts to downgrade a
file, etc.).

Changed the name of the term:

Simple Security /Property/

>Condition

– A Bell-LaPadula security model rule allowing a subject
read access to an object only if the security level of the
subject dominates the security level of the object.

Changed definition:

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system – including hardware,
firmware, and software – the combination of which is
responsible for enforcing a security policy.

>A TCB consists of one or more components that together enforce
>a unified security policy over a product or system.

The ability of a TCB to correctly enforce a security
policy depends solely on the mechanisms within the TCB and
on the correct input by system administrative personnel of
parameters (e.g., a user’s clearance) related to the
security policy.

REFERENCES

Added: (References were renumbered as necessary)

5. DCID 1/16, Security of Foreign Intelligence in Automated
Data Processing Systems and Networks (U), 4 January 1983.

6. DIAM 50-4, Security of Compartmented Computer Operations (U),
24 June 1980.

9. DoD Directive 5000.29, Management of Computer Resources in
Major Defense Systems, 26 April 1976.

17. DoD Directive 7920.1, Life Cycle Management of Automated
Information Systems (AIS), 17 October 1978.

Corrected dates on the following References:

14. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, March 1984.

15. DoD 5220.22-R, Industrial Security Regulation, February
1984.


Department of Defense Trusted Computer System Evaluation Criteria (The Orange Book) 15 August 1983

CSC-STD-001-83
Library No. S225,711

DEPARTMENT OF DEFENSE

TRUSTED COMPUTER SYSTEM EVALUATION CRITERIA

15 August 1983

CSC-STD-001-83

FOREWORD

This publication, “Department of Defense Trusted Computer System Evaluation
Criteria,” is being issued by the DoD Computer Security Center under the
authority of and in accordance with DoD Directive 5215.1, “Computer Security
Evaluation Center.” The criteria defined in this document constitute a uniform
set of basic requirements and evaluation classes for assessing the
effectiveness of security controls built into Automatic Data Processing (ADP)
systems. These criteria are intended for use in the evaluation and selection
of ADP systems being considered for the processing and/or storage and
retrieval of sensitive or classified information by the Department of Defense.
Point of contact concerning this publication is the Office of Standards and
Products, Attention: Chief, Computer Security Standards.

____________________________ 15 August 1983
Melville H. Klein
Director
DoD Computer Security Center

ACKNOWLEDGMENTS

Special recognition is extended to Sheila L. Brand, DoD Computer Security
Center (DoDCSC), who integrated theory, policy, and practice into and directed
the production of this document.

Acknowledgment is also given for the contributions of: Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, Col. Roger R. Schell,
Marvin Schaefer, DoDCSC, and Theodore M. P. Lee, Sperry UNIVAC, who as
original architects formulated and articulated the technical issues and
solutions presented in this document; Jeff Makey and Warren F. Shadle,
DoDCSC, who assisted in the preparation of this document; James P. Anderson,
James P. Anderson & Co., Steven B. Lipner, Digital Equipment Corp., Clark
Weissman, System Development Corp., LTC Lawrence A. Noble, formerly U.S. Air
Force, Stephen T. Walker, formerly DoD, Eugene V. Epperly, DoD, and James E.
Studer, formerly Dept. of the Army, who gave generously of their time and
expertise in the review and critique of this document; and finally, thanks are
given to the computer industry and others interested in trusted computing for
their enthusiastic advice and assistance throughout this effort.

TABLE OF CONTENTS

FOREWORD
ACKNOWLEDGMENTS
PREFACE
INTRODUCTION

PART I: THE CRITERIA
Section
1.0  DIVISION D: MINIMAL PROTECTION
2.0  DIVISION C: DISCRETIONARY PROTECTION
     2.1  Class (C1): Discretionary Security Protection
     2.2  Class (C2): Controlled Access Protection
3.0  DIVISION B: MANDATORY PROTECTION
     3.1  Class (B1): Labeled Security Protection
     3.2  Class (B2): Structured Protection
     3.3  Class (B3): Security Domains
4.0  DIVISION A: VERIFIED PROTECTION
     4.1  Class (A1): Verified Design
     4.2  Beyond Class (A1)

PART II: RATIONALE AND GUIDELINES

5.0  CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS
     5.1  A Need for Consensus
     5.2  Definition and Usefulness
     5.3  Criteria Control Objective
6.0  RATIONALE BEHIND THE EVALUATION CLASSES
     6.1  The Reference Monitor Concept
     6.2  A Formal Security Policy Model
     6.3  The Trusted Computing Base
     6.4  Assurance
     6.5  The Classes
7.0  THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA
     7.1  Established Federal Policies
     7.2  DoD Policies
     7.3  Criteria Control Objective for Security Policy
     7.4  Criteria Control Objective for Accountability
     7.5  Criteria Control Objective for Assurance
8.0  A GUIDELINE ON COVERT CHANNELS
9.0  A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES
10.0 A GUIDELINE ON SECURITY TESTING
     10.1 Testing for Division C
     10.2 Testing for Division B
     10.3 Testing for Division A

APPENDIX A: Commercial Product Evaluation Process
APPENDIX B: Summary of Evaluation Criteria Divisions
APPENDIX C: Summary of Evaluation Criteria Classes
APPENDIX D: Requirement Directory

GLOSSARY

REFERENCES

PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection. They provide a basis for the evaluation of effectiveness of
security controls built into automatic data processing system products. The
criteria were developed with three objectives in mind: (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other sensitive
information; (b) to provide guidance to manufacturers as to what to build into
their new, widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide a basis for
specifying security requirements in acquisition specifications. Two types of
requirements are delineated for secure processing: (a) specific security
feature requirements and (b) assurance requirements. Some of the latter
requirements enable evaluation personnel to determine if the required features
are present and functioning as intended. Though the criteria are
application-independent, it is recognized that the specific security feature
requirements may have to be interpreted when applying the criteria to specific
applications or other special processing environments. The underlying
assurance requirements can be applied across the entire spectrum of ADP system
or application processing environments without special interpretation.

INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the Defense
Science Board to address computer security safeguards that would protect
classified information in remote-access, resource-sharing computer systems.
The Task Force report, “Security Controls for Computer Systems,” published in
February 1970, made a number of policy and technical recommendations on
actions to be taken to reduce the threat of compromise of classified
information processed on remote-access computer systems.[34] Department of
Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published
in 1972 and 1973, respectively, responded to one of these recommendations by
establishing uniform DoD policy, security requirements, administrative
controls, and technical measures to protect classified information processed
by DoD computer systems.[8;9] Research and development work undertaken by the
Air Force, Advanced Research Projects Agency, and other defense agencies in
the early and mid 70’s developed and demonstrated solution approaches for the
technical problems associated with controlling the flow of information in
resource and information sharing computer systems.[1] The DoD Computer
Security Initiative was started in 1977 under the auspices of the Under
Secretary of Defense for Research and Engineering to focus DoD efforts
addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to define
problems and solutions for building, evaluating, and auditing secure computer
systems.[17] As part of this work NBS held two invitational workshops on the
subject of audit and evaluation of computer security.[20;28] The first was
held in March 1977, and the second in November of 1978. One of the products
of the second workshop was a definitive paper on the problems related to
providing criteria for the evaluation of technical computer security
effectiveness.[20] As an outgrowth of recommendations from this report, and in
support of the DoD Computer Security Initiative, the MITRE Corporation began
work on a set of computer security evaluation criteria that could be used to
assess the degree of trust one could place in a computer system to protect
classified data.[24;25;31] The preliminary concepts for computer security
evaluation were defined and expanded upon at invitational workshops and
symposia whose participants represented computer security expertise drawn from
industry and academia in addition to the government. Their work has since
been subjected to much peer review and constructive technical criticism from
the DoD, industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15] A major goal of the Center as given in its DoD Charter is to
encourage the widespread availability of trusted computer systems for use by
those who process classified or other sensitive information.[10] The criteria
presented in this document have evolved from the earlier NBS and MITRE
evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document apply
to both trusted general-purpose and trusted embedded (e.g., those dedicated to
a specific application) automatic data processing (ADP) systems. Included are
two distinct sets of requirements: 1) specific security feature requirements;
and 2) assurance requirements. The specific feature requirements encompass
the capabilities typically found in information processing systems employing
general-purpose operating systems that are distinct from the applications
programs being supported. The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments from
dedicated controllers to full range multilevel secure resource sharing
systems.

Purpose

As outlined in the Preface, the criteria have been developed for a number of
reasons:

* To provide users with a metric with which to evaluate the
degree of trust that can be placed in computer systems for
the secure processing of classified and other sensitive
information.

* To provide guidance to manufacturers as to what security
features to build into their new and planned, commercial
products in order to provide widely available systems that
satisfy trust requirements for sensitive applications.

* To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the first purpose for development of the criteria, i.e.,
providing users with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be performed on a computer
product from a perspective that excludes the application environment; or, (b)
it can be done to assess whether appropriate security measures have been taken
to permit the system to be used operationally in a specific environment. The
former type of evaluation is done by the Computer Security Center through the
Commercial Product Evaluation Process. That process is described in Appendix
A.

The latter type of evaluation, i.e., those done for the purpose of assessing a
system’s security attributes with respect to a specific operational mission,
is known as a certification evaluation. It must be understood that the
completion of a formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific application
environment. On the contrary, the evaluation report only provides a trusted
computer system’s evaluation rating along with supporting data describing the
product system’s strengths and weaknesses from a computer security point of
view. The system security certification and the formal approval/accreditation
procedure, done in accordance with the applicable policies of the issuing
agencies, must still be followed before a system can be approved for use in
processing or handling classified information.[8;9]

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process. Along with applicable policy, it
will be used directly as the basis for evaluation of the total system and for
specifying system security and certification requirements for new
acquisitions. Where a system being evaluated for certification employs a
product that has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation. Technical data
will be furnished to designers, evaluators and the Designated Approving
Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system “secure.”
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to read,
write, create, or delete information. Six fundamental requirements are
derived from this basic statement of objective: four deal with what needs to
be provided to control access to information; and two deal with how one can
obtain credible assurances that this is accomplished in a trusted computer
system.

POLICY

Requirement 1 – SECURITY POLICY – There must be an explicit and well-defined
security policy enforced by the system. Given identified subjects and
objects, there must be a set of rules that are used by the system to determine
whether a given subject can be permitted to gain access to a specific object.
Computer systems of interest must enforce a mandatory security policy that can
effectively implement access rules for handling sensitive (e.g., classified)
information.[7] These rules include requirements such as: No person lacking
proper personnel security clearance shall obtain access to classified
information. In addition, discretionary security controls are required to
ensure that only selected users or groups of users may obtain access to data
(e.g., based on a need-to-know).

Requirement 2 – MARKING – Access control labels must be associated with
objects. In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible to
mark every object with a label that reliably identifies the object’s
sensitivity level (e.g., classification), and/or the modes of access accorded
those subjects who may potentially access the object.

ACCOUNTABILITY

Requirement 3 – IDENTIFICATION – Individual subjects must be identified. Each
access to information must be mediated based on who is accessing the
information and what classes of information they are authorized to deal with.
This identification and authorization information must be securely maintained
by the computer system and be associated with every active element that
performs some security-relevant action in the system.

Requirement 4 – ACCOUNTABILITY – Audit information must be selectively kept
and protected so that actions affecting security can be traced to the
responsible party. A trusted system must be able to record the occurrences of
security-relevant events in an audit log. The capability to select the audit
events to be recorded is necessary to minimize the expense of auditing and to
allow efficient analysis. Audit data must be protected from modification and
unauthorized destruction to permit detection and after-the-fact investigations
of security violations.

ASSURANCE

Requirement 5 – ASSURANCE – The computer system must contain hardware/software
mechanisms that can be independently evaluated to provide sufficient assurance
that the system enforces requirements 1 through 4 above. In order to assure
that the four requirements of Security Policy, Marking, Identification, and
Accountability are enforced by a computer system, there must be some
identified and unified collection of hardware and software controls that
perform those functions. These mechanisms are typically embedded in the
operating system and are designed to carry out the assigned tasks in a secure
manner. The basis for trusting such system mechanisms in their operational
setting must be clearly documented such that it is possible to independently
examine the evidence to evaluate their sufficiency.

Requirement 6 – CONTINUOUS PROTECTION – The trusted mechanisms that enforce
these basic requirements must be continuously protected against tampering
and/or unauthorized changes. No computer system can be considered truly
secure if the basic hardware and software mechanisms that enforce the security
policy are themselves subject to unauthorized modification or subversion. The
continuous protection requirement has direct implications throughout the
computer system’s life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class. The interested
reader is referred to Section 5 of this document, “Control Objectives for
Trusted Computer Systems,” for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.

Structure of the Document

The remainder of this document is divided into two parts, four appendices, and
a glossary. Part I (Sections 1 through 4) presents the detailed criteria
derived from the fundamental requirements described above and relevant to the
rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to: mandatory access control rules
implementation, the covert channel problem, and security testing. It is
divided into six sections. Section 5 discusses the use of control objectives
in general and presents the three basic control objectives of the criteria.
Section 6 provides the theoretical basis behind the criteria. Section 7 gives
excerpts from pertinent regulations, directives, OMB Circulars, and Executive
Orders which provide the basis for many trust requirements for processing
nationally sensitive and classified information with computer systems.
Section 8 provides guidance to system developers on expectations in dealing
with the covert channel problem. Section 9 provides guidelines dealing with
mandatory security. Section 10 provides guidelines for security testing.
There are four appendices, including a description of the Trusted Computer
System Commercial Products Evaluation Process (Appendix A), summaries of the
evaluation divisions (Appendix B) and classes (Appendix C), and finally a
directory of requirements ordered alphabetically. In addition, there is a
glossary.

Structure of the Criteria

The criteria are divided into four divisions: D, C, B, and A ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security. Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information. Within divisions C and B there are a
number of subdivisions known as classes. The classes are also ordered in a
hierarchical manner with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess. Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system. The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB). Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure. Increased assurance that the required features are
operative, correct, and tamperproof under all circumstances is gained through
progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed. The first three
represent features necessary to satisfy the broad control objectives of
Security Policy, Accountability, and Assurance that are discussed in Part II,
Section 5. The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
first read Part II, before continuing on with Part I.

PART I: THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not contained
in a lower class or changes and additions to already defined criteria. Where
there is no highlighting, requirements have been carried over from lower
classes without addition or modification.

1.0 DIVISION D: MINIMAL PROTECTION

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

2.0 DIVISION C: DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1 CLASS (C1): DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity. The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1 SECURITY POLICY

2.1.1.1 Discretionary Access Control

THE TCB SHALL DEFINE AND CONTROL ACCESS BETWEEN NAMED USERS AND
NAMED OBJECTS (E.G., FILES AND PROGRAMS) IN THE ADP SYSTEM. THE
ENFORCEMENT MECHANISM (E.G., SELF/GROUP/PUBLIC CONTROLS, ACCESS
CONTROL LISTS) SHALL ALLOW USERS TO SPECIFY AND CONTROL SHARING
OF THOSE OBJECTS BY NAMED INDIVIDUALS OR DEFINED GROUPS OR BOTH.
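
As a non-normative illustration of the discretionary access control
mechanism described above, the following Python sketch shows an access
control list that grants access modes to named individuals and to defined
groups. All names used here (ACL, dac_permits, "alice", "project-x") are
hypothetical and are not taken from the criteria.

from dataclasses import dataclass, field

@dataclass
class ACL:
    # Maps a user name or group name to the set of modes it is granted.
    user_modes: dict = field(default_factory=dict)   # e.g. {"alice": {"read", "write"}}
    group_modes: dict = field(default_factory=dict)  # e.g. {"project-x": {"read"}}

def dac_permits(acl: ACL, user: str, groups: set, mode: str) -> bool:
    """True if the named user, or any defined group the user belongs to,
    has been granted the requested access mode."""
    if mode in acl.user_modes.get(user, set()):
        return True
    return any(mode in acl.group_modes.get(g, set()) for g in groups)

# Example: alice may read and write; members of project-x may only read.
acl = ACL(user_modes={"alice": {"read", "write"}},
          group_modes={"project-x": {"read"}})
assert dac_permits(acl, "alice", {"project-x"}, "write")
assert dac_permits(acl, "bob", {"project-x"}, "read")
assert not dac_permits(acl, "bob", {"project-x"}, "write")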

2.1.2 ACCOUNTABILITY

2.1.2.1 Identification and Authentication

THE TCB SHALL REQUIRE USERS TO IDENTIFY THEMSELVES TO IT BEFORE
BEGINNING TO PERFORM ANY OTHER ACTIONS THAT THE TCB IS EXPECTED
TO MEDIATE. FURTHERMORE, THE TCB SHALL USE A PROTECTED
MECHANISM (E.G., PASSWORDS) TO AUTHENTICATE THE USER’S IDENTITY.
THE TCB SHALL PROTECT AUTHENTICATION DATA SO THAT IT CANNOT BE
ACCESSED BY ANY UNAUTHORIZED USER.

2.1.3 ASSURANCE

2.1.3.1 Operational Assurance

2.1.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
RESOURCES CONTROLLED BY THE TCB MAY BE A DEFINED SUBSET
OF THE SUBJECTS AND OBJECTS IN THE ADP SYSTEM.

2.1.3.1.2 System Integrity

HARDWARE AND/OR SOFTWARE FEATURES SHALL BE PROVIDED THAT
CAN BE USED TO PERIODICALLY VALIDATE THE CORRECT OPERATION
OF THE ON-SITE HARDWARE AND FIRMWARE ELEMENTS OF THE TCB.

2.1.3.2 Life-Cycle Assurance

2.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
TESTING SHALL BE DONE TO ASSURE THAT THERE ARE NO OBVIOUS
WAYS FOR AN UNAUTHORIZED USER TO BYPASS OR OTHERWISE
DEFEAT THE SECURITY PROTECTION MECHANISMS OF THE TCB.
(SEE THE SECURITY TESTING GUIDELINES.)

2.1.4 DOCUMENTATION

2.1.4.1 Security Features User’s Guide

A SINGLE SUMMARY, CHAPTER, OR MANUAL IN USER DOCUMENTATION
SHALL DESCRIBE THE PROTECTION MECHANISMS PROVIDED BY THE TCB,
GUIDELINES ON THEIR USE, AND HOW THEY INTERACT WITH ONE ANOTHER.

2.1.4.2 Trusted Facility Manual

A MANUAL ADDRESSED TO THE ADP SYSTEM ADMINISTRATOR SHALL
PRESENT CAUTIONS ABOUT FUNCTIONS AND PRIVILEGES THAT SHOULD BE
CONTROLLED WHEN RUNNING A SECURE FACILITY.

2.1.4.3 Test Documentation

THE SYSTEM DEVELOPER SHALL PROVIDE TO THE EVALUATORS A DOCUMENT
THAT DESCRIBES THE TEST PLAN AND RESULTS OF THE SECURITY
MECHANISMS’ FUNCTIONAL TESTING.

2.1.4.4 Design Documentation

DOCUMENTATION SHALL BE AVAILABLE THAT PROVIDES A DESCRIPTION OF
THE MANUFACTURER’S PHILOSOPHY OF PROTECTION AND AN EXPLANATION
OF HOW THIS PHILOSOPHY IS TRANSLATED INTO THE TCB. IF THE TCB
IS COMPOSED OF DISTINCT MODULES, THE INTERFACES BETWEEN THESE
MODULES SHALL BE DESCRIBED.

2.2 CLASS (C2): CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation. The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1 SECURITY POLICY

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing
of those objects by named individuals, or defined groups OF
INDIVIDUALS, or by both. THE DISCRETIONARY ACCESS CONTROL
MECHANISM SHALL, EITHER BY EXPLICIT USER ACTION OR BY DEFAULT,
PROVIDE THAT OBJECTS ARE PROTECTED FROM UNAUTHORIZED ACCESS.
THESE ACCESS CONTROLS SHALL BE CAPABLE OF INCLUDING OR EXCLUDING
ACCESS TO THE GRANULARITY OF A SINGLE USER. ACCESS PERMISSION
TO AN OBJECT BY USERS NOT ALREADY POSSESSING ACCESS PERMISSION
SHALL ONLY BE ASSIGNED BY AUTHORIZED USERS.

2.2.1.2 Object Reuse

WHEN A STORAGE OBJECT IS INITIALLY ASSIGNED, ALLOCATED, OR
REALLOCATED TO A SUBJECT FROM THE TCB’S POOL OF UNUSED STORAGE
OBJECTS, THE TCB SHALL ASSURE THAT THE OBJECT CONTAINS NO DATA
FOR WHICH THE SUBJECT IS NOT AUTHORIZED.
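
As a non-normative illustration of the object reuse requirement above, the
following Python sketch revokes prior authorizations and scrubs residual
data before a storage object is handed out from the unused pool. The class
and method names are hypothetical.

class StorageObject:
    def __init__(self, size: int):
        self.data = bytearray(size)
        self.acl = {}            # authorizations held on this object, if any

class StoragePool:
    def __init__(self, count: int, size: int):
        self.unused = [StorageObject(size) for _ in range(count)]

    def allocate(self, subject: str) -> StorageObject:
        obj = self.unused.pop()
        obj.acl.clear()                        # revoke all prior authorizations
        obj.data[:] = b"\x00" * len(obj.data)  # no data from a prior subject remains
        obj.acl[subject] = {"read", "write"}
        return obj

    def release(self, obj: StorageObject):
        # Scrubbing could equally be done here, on release; the requirement is
        # only that no prior information is visible on the next allocation.
        self.unused.append(obj)

pool = StoragePool(count=4, size=16)
block = pool.allocate("alice")
assert bytes(block.data) == b"\x00" * 16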

2.2.2 ACCOUNTABILITY

2.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall use a protected
mechanism (e.g., passwords) to authenticate the user’s identity.
The TCB shall protect authentication data so that it cannot be
accessed by any unauthorized user. THE TCB SHALL BE ABLE TO
ENFORCE INDIVIDUAL ACCOUNTABILITY BY PROVIDING THE CAPABILITY TO
UNIQUELY IDENTIFY EACH INDIVIDUAL ADP SYSTEM USER. THE TCB
SHALL ALSO PROVIDE THE CAPABILITY OF ASSOCIATING THIS IDENTITY
WITH ALL AUDITABLE ACTIONS TAKEN BY THAT INDIVIDUAL.

2.2.2.2 Audit

THE TCB SHALL BE ABLE TO CREATE, MAINTAIN, AND PROTECT FROM
MODIFICATION OR UNAUTHORIZED ACCESS OR DESTRUCTION AN AUDIT
TRAIL OF ACCESSES TO THE OBJECTS IT PROTECTS. THE AUDIT DATA
SHALL BE PROTECTED BY THE TCB SO THAT READ ACCESS TO IT IS
LIMITED TO THOSE WHO ARE AUTHORIZED FOR AUDIT DATA. THE TCB
SHALL BE ABLE TO RECORD THE FOLLOWING TYPES OF EVENTS: USE OF
IDENTIFICATION AND AUTHENTICATION MECHANISMS, INTRODUCTION OF
OBJECTS INTO A USER’S ADDRESS SPACE (E.G., FILE OPEN, PROGRAM
INITIATION), DELETION OF OBJECTS, AND ACTIONS TAKEN BY
COMPUTER OPERATORS AND SYSTEM ADMINISTRATORS AND/OR SYSTEM
SECURITY OFFICERS. FOR EACH RECORDED EVENT, THE AUDIT RECORD
SHALL IDENTIFY: DATE AND TIME OF THE EVENT, USER, TYPE OF
EVENT, AND SUCCESS OR FAILURE OF THE EVENT. FOR
IDENTIFICATION/AUTHENTICATION EVENTS THE ORIGIN OF REQUEST
(E.G., TERMINAL ID) SHALL BE INCLUDED IN THE AUDIT RECORD. FOR
EVENTS THAT INTRODUCE AN OBJECT INTO A USER’S ADDRESS SPACE AND
FOR OBJECT DELETION EVENTS THE AUDIT RECORD SHALL INCLUDE THE
NAME OF THE OBJECT. THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE
TO SELECTIVELY AUDIT THE ACTIONS OF ANY ONE OR MORE USERS BASED
ON INDIVIDUAL IDENTITY.
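
As a non-normative illustration, the Python sketch below models an audit
record carrying the fields required above (date and time, user, type of
event, success or failure, plus the origin of request for
identification/authentication events and the object name for object
introduction and deletion events), together with selective audit by
individual identity. All field and function names are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AuditRecord:
    timestamp: datetime
    user: str
    event_type: str            # e.g. "login", "file_open", "object_delete"
    success: bool
    origin: Optional[str] = None       # e.g. terminal ID, for I&A events
    object_name: Optional[str] = None  # for object introduction or deletion

def record_login(user: str, terminal: str, success: bool) -> AuditRecord:
    return AuditRecord(datetime.now(timezone.utc), user, "login", success,
                       origin=terminal)

def select_by_user(trail, user):
    """Selective audit of the actions of one user, based on identity."""
    return [r for r in trail if r.user == user]

trail = [record_login("alice", "tty03", True),
         record_login("bob", "tty07", False)]
assert len(select_by_user(trail, "alice")) == 1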

2.2.3 ASSURANCE

2.2.3.1 Operational Assurance

2.2.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL ISOLATE THE RESOURCES TO BE PROTECTED SO THAT THEY
ARE SUBJECT TO THE ACCESS CONTROL AND AUDITING
REQUIREMENTS.

2.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

2.2.3.2 Life-Cycle Assurance

2.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
Testing shall be done to assure that there are no obvious
ways for an unauthorized user to bypass or otherwise
defeat the security protection mechanisms of the TCB.
TESTING SHALL ALSO INCLUDE A SEARCH FOR OBVIOUS FLAWS THAT
WOULD ALLOW VIOLATION OF RESOURCE ISOLATION, OR THAT WOULD
PERMIT UNAUTHORIZED ACCESS TO THE AUDIT OR AUTHENTICATION
DATA. (See the Security Testing guidelines.)

2.2.4 DOCUMENTATION

2.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

2.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. THE PROCEDURES FOR
EXAMINING AND MAINTAINING THE AUDIT FILES AS WELL AS THE
DETAILED AUDIT RECORD STRUCTURE FOR EACH TYPE OF AUDIT EVENT
SHALL BE GIVEN.

2.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

2.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described.

3.0 DIVISION B: MANDATORY PROTECTION

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

3.1 CLASS (B1): LABELED SECURITY PROTECTION

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed. The following are minimal
requirements for systems assigned a class (B1) rating:

3.1.1 SECURITY POLICY

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups
of individuals, or by both. The discretionary access control
mechanism shall, either by explicit user action or by default,
provide that objects are protected from unauthorized access.
These access controls shall be capable of including or excluding
access to the granularity of a single user. Access permission
to an object by users not already possessing access permission
shall only be assigned by authorized users.

3.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.1.1.3 Labels

SENSITIVITY LABELS ASSOCIATED WITH EACH SUBJECT AND STORAGE
OBJECT UNDER ITS CONTROL (E.G., PROCESS, FILE, SEGMENT, DEVICE)
SHALL BE MAINTAINED BY THE TCB. THESE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. IN ORDER TO
IMPORT NON-LABELED DATA, THE TCB SHALL REQUEST AND RECEIVE FROM
AN AUTHORIZED USER THE SECURITY LEVEL OF THE DATA, AND ALL SUCH
ACTIONS SHALL BE AUDITABLE BY THE TCB.

3.1.1.3.1 Label Integrity

SENSITIVITY LABELS SHALL ACCURATELY REPRESENT SECURITY
LEVELS OF THE SPECIFIC SUBJECTS OR OBJECTS WITH WHICH THEY
ARE ASSOCIATED. WHEN EXPORTED BY THE TCB, SENSITIVITY
LABELS SHALL ACCURATELY AND UNAMBIGUOUSLY REPRESENT THE
INTERNAL LABELS AND SHALL BE ASSOCIATED WITH THE
INFORMATION BEING EXPORTED.

3.1.1.3.2 Exportation of Labeled Information

THE TCB SHALL DESIGNATE EACH COMMUNICATION CHANNEL AND
I/O DEVICE AS EITHER SINGLE-LEVEL OR MULTILEVEL. ANY
CHANGE IN THIS DESIGNATION SHALL BE DONE MANUALLY AND
SHALL BE AUDITABLE BY THE TCB. THE TCB SHALL MAINTAIN
AND BE ABLE TO AUDIT ANY CHANGE IN THE CURRENT SECURITY
LEVEL ASSOCIATED WITH A SINGLE-LEVEL COMMUNICATION
CHANNEL OR I/O DEVICE.

3.1.1.3.2.1 Exportation to Multilevel Devices

WHEN THE TCB EXPORTS AN OBJECT TO A MULTILEVEL I/O
DEVICE, THE SENSITIVITY LABEL ASSOCIATED WITH THAT
OBJECT SHALL ALSO BE EXPORTED AND SHALL RESIDE ON
THE SAME PHYSICAL MEDIUM AS THE EXPORTED
INFORMATION AND SHALL BE IN THE SAME FORM
(I.E., MACHINE-READABLE OR HUMAN-READABLE FORM).
WHEN THE TCB EXPORTS OR IMPORTS AN OBJECT OVER A
MULTILEVEL COMMUNICATION CHANNEL, THE PROTOCOL
USED ON THAT CHANNEL SHALL PROVIDE FOR THE
UNAMBIGUOUS PAIRING BETWEEN THE SENSITIVITY LABELS
AND THE ASSOCIATED INFORMATION THAT IS SENT OR
RECEIVED.

3.1.1.3.2.2 Exportation to Single-Level Devices

SINGLE-LEVEL I/O DEVICES AND SINGLE-LEVEL
COMMUNICATION CHANNELS ARE NOT REQUIRED TO
MAINTAIN THE SENSITIVITY LABELS OF THE INFORMATION
THEY PROCESS. HOWEVER, THE TCB SHALL INCLUDE A
MECHANISM BY WHICH THE TCB AND AN AUTHORIZED USER
RELIABLY COMMUNICATE TO DESIGNATE THE SINGLE
SECURITY LEVEL OF INFORMATION IMPORTED OR EXPORTED
VIA SINGLE-LEVEL COMMUNICATION CHANNELS OR I/O
DEVICES.

3.1.1.3.2.3 Labeling Human-Readable Output

THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE TO
SPECIFY THE PRINTABLE LABEL NAMES ASSOCIATED WITH
EXPORTED SENSITIVITY LABELS. THE TCB SHALL MARK
THE BEGINNING AND END OF ALL HUMAN-READABLE, PAGED,
HARDCOPY OUTPUT (E.G., LINE PRINTER OUTPUT) WITH
HUMAN-READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. THE TCB
SHALL, BY DEFAULT, MARK THE TOP AND BOTTOM OF EACH
PAGE OF HUMAN-READABLE, PAGED, HARDCOPY OUTPUT
(E.G., LINE PRINTER OUTPUT) WITH HUMAN-READABLE
SENSITIVITY LABELS THAT PROPERLY* REPRESENT THE
OVERALL SENSITIVITY OF THE OUTPUT OR THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE INFORMATION ON THE
PAGE. THE TCB SHALL, BY DEFAULT AND IN AN
APPROPRIATE MANNER, MARK OTHER FORMS OF HUMAN-
READABLE OUTPUT (E.G., MAPS, GRAPHICS) WITH HUMAN-
READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. ANY
OVERRIDE OF THESE MARKING DEFAULTS SHALL BE
AUDITABLE BY THE TCB.

_____________________________________________________________
* THE HIERARCHICAL CLASSIFICATION COMPONENT IN HUMAN-READABLE
SENSITIVITY LABELS SHALL BE EQUAL TO THE GREATEST
HIERARCHICAL CLASSIFICATION OF ANY OF THE INFORMATION IN THE
OUTPUT THAT THE LABELS REFER TO; THE NON-HIERARCHICAL
CATEGORY COMPONENT SHALL INCLUDE ALL OF THE NON-HIERARCHICAL
CATEGORIES OF THE INFORMATION IN THE OUTPUT THE LABELS REFER
TO, BUT NO OTHER NON-HIERARCHICAL CATEGORIES.
_____________________________________________________________
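
As a non-normative illustration of the marking rule in the footnote above,
the Python sketch below computes an output label whose hierarchical
component is the greatest classification of the information present and
whose category component is the union of the categories present, and no
others. The classification ordering used here is hypothetical.

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def output_banner(labels):
    """labels: iterable of (classification, set_of_categories) pairs
    describing the information in the output."""
    top = max(labels, key=lambda l: LEVELS[l[0]])[0]
    categories = set().union(*(cats for _, cats in labels))
    return (top, categories)

page = [("CONFIDENTIAL", {"NATO"}), ("SECRET", {"CRYPTO"})]
assert output_banner(page) == ("SECRET", {"NATO", "CRYPTO"})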

3.1.1.4 Mandatory Access Control

THE TCB SHALL ENFORCE A MANDATORY ACCESS CONTROL POLICY OVER
ALL SUBJECTS AND STORAGE OBJECTS UNDER ITS CONTROL (E.G.,
PROCESSES, FILES, SEGMENTS, DEVICES). THESE SUBJECTS AND
OBJECTS SHALL BE ASSIGNED SENSITIVITY LABELS THAT ARE A
COMBINATION OF HIERARCHICAL CLASSIFICATION LEVELS AND
NON-HIERARCHICAL CATEGORIES, AND THE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. THE TCB
SHALL BE ABLE TO SUPPORT TWO OR MORE SUCH SECURITY LEVELS.
(SEE THE MANDATORY ACCESS CONTROL GUIDELINES.) THE FOLLOWING
REQUIREMENTS SHALL HOLD FOR ALL ACCESSES BETWEEN SUBJECTS AND
OBJECTS CONTROLLED BY THE TCB: A SUBJECT CAN READ AN OBJECT
ONLY IF THE HIERARCHICAL CLASSIFICATION IN THE SUBJECT’S
SECURITY LEVEL IS GREATER THAN OR EQUAL TO THE HIERARCHICAL
CLASSIFICATION IN THE OBJECT’S SECURITY LEVEL AND THE NON-
HIERARCHICAL CATEGORIES IN THE SUBJECT’S SECURITY LEVEL INCLUDE
ALL THE NON-HIERARCHICAL CATEGORIES IN THE OBJECT’S SECURITY
LEVEL. A SUBJECT CAN WRITE AN OBJECT ONLY IF THE HIERARCHICAL
CLASSIFICATION IN THE SUBJECT’S SECURITY LEVEL IS LESS THAN OR
EQUAL TO THE HIERARCHICAL CLASSIFICATION IN THE OBJECT’S
SECURITY LEVEL AND ALL THE NON-HIERARCHICAL CATEGORIES IN THE
SUBJECT’S SECURITY LEVEL ARE INCLUDED IN THE NON-HIERARCHICAL
CATEGORIES IN THE OBJECT’S SECURITY LEVEL.
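
As a non-normative illustration of the mandatory access rules above, the
Python sketch below represents a sensitivity label as a hierarchical
classification plus a set of non-hierarchical categories and applies the
read and write conditions stated in the requirement. The concrete level
names are hypothetical.

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(a, b):
    """True if label a = (classification, categories) dominates label b:
    greater or equal classification and a superset of categories."""
    return LEVELS[a[0]] >= LEVELS[b[0]] and a[1] >= b[1]

def may_read(subject_label, object_label):
    # Read permitted only if the subject's level dominates the object's.
    return dominates(subject_label, object_label)

def may_write(subject_label, object_label):
    # Write permitted only if the object's level dominates the subject's.
    return dominates(object_label, subject_label)

subject = ("SECRET", {"NATO"})
assert may_read(subject, ("CONFIDENTIAL", {"NATO"}))
assert not may_read(subject, ("SECRET", {"NATO", "CRYPTO"}))
assert may_write(subject, ("TOP SECRET", {"NATO", "CRYPTO"}))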

3.1.2 ACCOUNTABILITY

3.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall MAINTAIN AUTHENTICATION
DATA THAT INCLUDES INFORMATION FOR VERIFYING THE IDENTITY OF
INDIVIDUAL USERS (E.G., PASSWORDS) AS WELL AS INFORMATION FOR
DETERMINING THE CLEARANCE AND AUTHORIZATIONS OF INDIVIDUAL
USERS. THIS DATA SHALL BE USED BY THE TCB TO AUTHENTICATE the
user’s identity AND TO DETERMINE THE SECURITY LEVEL AND
AUTHORIZATIONS OF SUBJECTS THAT MAY BE CREATED TO ACT ON BEHALF
OF THE INDIVIDUAL USER. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.
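
As a non-normative illustration of the authentication data described above,
the Python sketch below keeps, for each user, a password verifier and a
clearance, and determines the security level of a subject created to act on
behalf of that user by requiring the recorded clearance to dominate the
requested level. The names and the choice of hash are hypothetical.

import hashlib

LEVELS = {"UNCLASSIFIED": 0, "SECRET": 2, "TOP SECRET": 3}

AUTH_DATA = {
    # user: (SHA-256 of password, (clearance, authorized categories))
    "alice": (hashlib.sha256(b"correct horse").hexdigest(), ("SECRET", {"NATO"})),
}

def authenticate(user, password: bytes) -> bool:
    digest, _ = AUTH_DATA[user]
    return hashlib.sha256(password).hexdigest() == digest

def create_subject(user, requested_level):
    clearance = AUTH_DATA[user][1]
    ok = (LEVELS[clearance[0]] >= LEVELS[requested_level[0]]
          and clearance[1] >= requested_level[1])
    if not ok:
        raise PermissionError("requested level not dominated by user clearance")
    return {"user": user, "level": requested_level}

assert authenticate("alice", b"correct horse")
subject = create_subject("alice", ("UNCLASSIFIED", set()))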

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. THE TCB SHALL ALSO BE ABLE TO AUDIT ANY OVERRIDE OF
HUMAN-READABLE OUTPUT MARKINGS. FOR each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address space
and for object deletion events the audit record shall include
the name of the object AND THE OBJECT’S SECURITY LEVEL. The
ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
AND/OR OBJECT SECURITY LEVEL.

3.1.3 ASSURANCE

3.1.3.1 Operational Assurance

3.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL MAINTAIN PROCESS ISOLATION THROUGH THE PROVISION OF
DISTINCT ADDRESS SPACES UNDER ITS CONTROL. The TCB shall
isolate the resources to be protected so that they are
subject to the access control and auditing requirements.

3.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

3.1.3.2 Life-Cycle Assurance

3.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
A TEAM OF INDIVIDUALS WHO THOROUGHLY UNDERSTAND THE
SPECIFIC IMPLEMENTATION OF THE TCB SHALL SUBJECT ITS
DESIGN DOCUMENTATION, SOURCE CODE, AND OBJECT CODE TO
THOROUGH ANALYSIS AND TESTING. THEIR OBJECTIVES SHALL BE:
TO UNCOVER ALL DESIGN AND IMPLEMENTATION FLAWS THAT WOULD
PERMIT A SUBJECT EXTERNAL TO THE TCB TO READ, CHANGE, OR
DELETE DATA NORMALLY DENIED UNDER THE MANDATORY OR
DISCRETIONARY SECURITY POLICY ENFORCED BY THE TCB; AS WELL
AS TO ASSURE THAT NO SUBJECT (WITHOUT AUTHORIZATION TO DO
SO) IS ABLE TO CAUSE THE TCB TO ENTER A STATE SUCH THAT
IT IS UNABLE TO RESPOND TO COMMUNICATIONS INITIATED BY
OTHER USERS. ALL DISCOVERED FLAWS SHALL BE REMOVED OR
NEUTRALIZED AND THE TCB RETESTED TO DEMONSTRATE THAT THEY
HAVE BEEN ELIMINATED AND THAT NEW FLAWS HAVE NOT BEEN
INTRODUCED. (SEE THE SECURITY TESTING GUIDELINES.)

3.1.3.2.2 Design Specification and Verification

AN INFORMAL OR FORMAL MODEL OF THE SECURITY POLICY
SUPPORTED BY THE TCB SHALL BE MAINTAINED THAT IS SHOWN TO
BE CONSISTENT WITH ITS AXIOMS.

3.1.4 DOCUMENTATION

3.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. THE MANUAL SHALL DESCRIBE THE OPERATOR AND
ADMINISTRATOR FUNCTIONS RELATED TO SECURITY, TO INCLUDE CHANGING
THE SECURITY CHARACTERISTICS OF A USER. IT SHALL PROVIDE
GUIDELINES ON THE CONSISTENT AND EFFECTIVE USE OF THE PROTECTION
FEATURES OF THE SYSTEM, HOW THEY INTERACT, HOW TO SECURELY
GENERATE A NEW TCB, AND FACILITY PROCEDURES, WARNINGS, AND
PRIVILEGES THAT NEED TO BE CONTROLLED IN ORDER TO OPERATE THE
FACILITY IN A SECURE MANNER.

3.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

3.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described. AN INFORMAL OR FORMAL DESCRIPTION
OF THE SECURITY POLICY MODEL ENFORCED BY THE TCB SHALL BE
AVAILABLE AND AN EXPLANATION PROVIDED TO SHOW THAT IT IS
SUFFICIENT TO ENFORCE THE SECURITY POLICY. THE SPECIFIC TCB
PROTECTION MECHANISMS SHALL BE IDENTIFIED AND AN EXPLANATION
GIVEN TO SHOW THAT THEY SATISFY THE MODEL.

3.2 CLASS (B2): STRUCTURED PROTECTION

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration. The
following are minimal requirements for systems assigned a class (B2) rating:

3.2.1 SECURITY POLICY

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined
groups of individuals, or by both. The discretionary access
control mechanism shall, either by explicit user action or by
default, provide that objects are protected from unauthorized
access. These access controls shall be capable of including
or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing
access permission shall only be assigned by authorized users.

3.2.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.2.1.3 Labels

Sensitivity labels associated with each ADP SYSTEM RESOURCE
(E.G., SUBJECT, STORAGE OBJECT) THAT IS DIRECTLY OR INDIRECTLY
ACCESSIBLE BY SUBJECTS EXTERNAL TO THE TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.2.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.2.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.2.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the
information they process. However, the TCB shall
include a mechanism by which the TCB and an
authorized user reliably communicate to designate
the single security level of information imported
or exported via single-level communication
channels or I/O devices.

3.2.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.
_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________

3.2.1.3.3 Subject Sensitivity Labels

THE TCB SHALL IMMEDIATELY NOTIFY A TERMINAL USER OF EACH
CHANGE IN THE SECURITY LEVEL ASSOCIATED WITH THAT USER
DURING AN INTERACTIVE SESSION. A TERMINAL USER SHALL BE
ABLE TO QUERY THE TCB AS DESIRED FOR A DISPLAY OF THE
SUBJECT’S COMPLETE SENSITIVITY LABEL.

3.2.1.3.4 Device Labels

THE TCB SHALL SUPPORT THE ASSIGNMENT OF MINIMUM AND
MAXIMUM SECURITY LEVELS TO ALL ATTACHED PHYSICAL DEVICES.
THESE SECURITY LEVELS SHALL BE USED BY THE TCB TO ENFORCE
CONSTRAINTS IMPOSED BY THE PHYSICAL ENVIRONMENTS IN WHICH
THE DEVICES ARE LOCATED.
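
As a non-normative illustration of device labels, the Python sketch below
assigns each attached device a minimum and maximum security level and
accepts a labeled object only if its level falls within that range. The
device names and levels are hypothetical.

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

DEVICES = {
    # device: (minimum level, maximum level), reflecting its physical environment
    "printer-lobby": ("UNCLASSIFIED", "UNCLASSIFIED"),
    "printer-vault": ("CONFIDENTIAL", "TOP SECRET"),
}

def device_accepts(device: str, object_level: str) -> bool:
    lo, hi = DEVICES[device]
    return LEVELS[lo] <= LEVELS[object_level] <= LEVELS[hi]

assert device_accepts("printer-vault", "SECRET")
assert not device_accepts("printer-lobby", "SECRET")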

3.2.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all RESOURCES (I.E., SUBJECTS, STORAGE OBJECTS, AND I/O DEVICES)
THAT ARE DIRECTLY OR INDIRECTLY ACCESSIBLE BY SUBJECTS EXTERNAL
TO THE TCB. These subjects and objects shall be assigned
sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the
labels shall be used as the basis for mandatory access control
decisions. The TCB shall be able to support two or more such
security levels. (See the Mandatory Access Control guidelines.)
The following requirements shall hold for all accesses between
ALL SUBJECTS EXTERNAL TO THE TCB AND ALL OBJECTS DIRECTLY OR
INDIRECTLY ACCESSIBLE BY THESE SUBJECTS: A subject can read an
object only if the hierarchical classification in the subject’s
security level is greater than or equal to the hierarchical
classification in the object’s security level and the non-
hierarchical categories in the subject’s security level include
all the non-hierarchical categories in the object’s security
level. A subject can write an object only if the hierarchical
classification in the subject’s security level is less than or
equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

3.2.2 ACCOUNTABILITY

3.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.2.2.1.1 Trusted Path

THE TCB SHALL SUPPORT A TRUSTED COMMUNICATION PATH
BETWEEN ITSELF AND THE USER FOR INITIAL LOGIN AND
AUTHENTICATION. COMMUNICATIONS VIA THIS PATH SHALL BE
INITIATED EXCLUSIVELY BY A USER.

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. THE TCB SHALL BE ABLE TO AUDIT
THE IDENTIFIED EVENTS THAT MAY BE USED IN THE EXPLOITATION OF
COVERT STORAGE CHANNELS.

3.2.3 ASSURANCE

3.2.3.1 Operational Assurance

3.2.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
THE TCB SHALL MAINTAIN PROCESS ISOLATION THROUGH THE
PROVISION OF DISTINCT ADDRESS SPACES UNDER ITS CONTROL.
THE TCB SHALL BE INTERNALLY STRUCTURED INTO WELL-DEFINED
LARGELY INDEPENDENT MODULES. IT SHALL MAKE EFFECTIVE USE
OF AVAILABLE HARDWARE TO SEPARATE THOSE ELEMENTS THAT ARE
PROTECTION-CRITICAL FROM THOSE THAT ARE NOT. THE TCB
MODULES SHALL BE DESIGNED SUCH THAT THE PRINCIPLE OF LEAST
PRIVILEGE IS ENFORCED. FEATURES IN HARDWARE, SUCH AS
SEGMENTATION, SHALL BE USED TO SUPPORT LOGICALLY DISTINCT
STORAGE OBJECTS WITH SEPARATE ATTRIBUTES (NAMELY:
READABLE, WRITEABLE). THE USER INTERFACE TO THE TCB
SHALL BE COMPLETELY DEFINED AND ALL ELEMENTS OF THE TCB
IDENTIFIED.

3.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.2.3.1.3 Covert Channel Analysis

THE SYSTEM DEVELOPER SHALL CONDUCT A THOROUGH SEARCH FOR
COVERT STORAGE CHANNELS AND MAKE A DETERMINATION (EITHER
BY ACTUAL MEASUREMENT OR BY ENGINEERING ESTIMATION) OF
THE MAXIMUM BANDWIDTH OF EACH IDENTIFIED CHANNEL. (SEE
THE COVERT CHANNELS GUIDELINE SECTION.)
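
A worked example of an engineering estimate (the timing figures are
invented for illustration): if signalling one bit over a storage
channel requires the sender to modulate a shared attribute and the
receiver to sense it, the maximum bandwidth is roughly one bit per
modulation-plus-sensing cycle.

    /* Illustrative bandwidth estimate for a covert storage channel:
     * one bit per cycle, where a cycle is the time to set the shared
     * attribute plus the time to observe it.  Figures are made up.    */
    #include <stdio.h>

    int main(void)
    {
        double t_set   = 0.020;   /* seconds for sender to modulate   */
        double t_sense = 0.030;   /* seconds for receiver to observe  */
        double bits_per_cycle = 1.0;

        double bandwidth = bits_per_cycle / (t_set + t_sense);
        printf("estimated maximum bandwidth: %.1f bits/second\n",
               bandwidth);       /* 20.0 bits/second for these figures */
        return 0;
    }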

3.2.3.1.4 Trusted Facility Management

THE TCB SHALL SUPPORT SEPARATE OPERATOR AND ADMINISTRATOR
FUNCTIONS.

3.2.3.2 Life-Cycle Assurance

3.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall be:
to uncover all design and implementation flaws that would
permit a subject external to the TCB to read, change, or
delete data normally denied under the mandatory or
discretionary security policy enforced by the TCB; as well
as to assure that no subject (without authorization to do
so) is able to cause the TCB to enter a state such that it
is unable to respond to communications initiated by other
users. THE TCB SHALL BE FOUND RELATIVELY RESISTANT TO
PENETRATION. All discovered flaws shall be CORRECTED and
the TCB retested to demonstrate that they have been
eliminated and that new flaws have not been introduced.
TESTING SHALL DEMONSTRATE THAT THE TCB IMPLEMENTATION IS
CONSISTENT WITH THE DESCRIPTIVE TOP-LEVEL SPECIFICATION.
(See the Security Testing Guidelines.)

3.2.3.2.2 Design Specification and Verification

A FORMAL model of the security policy supported by the
TCB shall be maintained that is PROVEN consistent with
its axioms. A DESCRIPTIVE TOP-LEVEL SPECIFICATION (DTLS)
OF THE TCB SHALL BE MAINTAINED THAT COMPLETELY AND
ACCURATELY DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR
MESSAGES, AND EFFECTS. IT SHALL BE SHOWN TO BE AN
ACCURATE DESCRIPTION OF THE TCB INTERFACE.

3.2.3.2.3 Configuration Management

DURING DEVELOPMENT AND MAINTENANCE OF THE TCB, A
CONFIGURATION MANAGEMENT SYSTEM SHALL BE IN PLACE THAT
MAINTAINS CONTROL OF CHANGES TO THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION, OTHER DESIGN DATA, IMPLEMENTATION
DOCUMENTATION, SOURCE CODE, THE RUNNING VERSION OF THE
OBJECT CODE, AND TEST FIXTURES AND DOCUMENTATION. THE
CONFIGURATION MANAGEMENT SYSTEM SHALL ASSURE A CONSISTENT
MAPPING AMONG ALL DOCUMENTATION AND CODE ASSOCIATED WITH
THE CURRENT VERSION OF THE TCB. TOOLS SHALL BE PROVIDED
FOR GENERATION OF A NEW VERSION OF THE TCB FROM SOURCE
CODE. ALSO AVAILABLE SHALL BE TOOLS FOR COMPARING A
NEWLY GENERATED VERSION WITH THE PREVIOUS TCB VERSION IN
ORDER TO ASCERTAIN THAT ONLY THE INTENDED CHANGES HAVE
BEEN MADE IN THE CODE THAT WILL ACTUALLY BE USED AS THE
NEW VERSION OF THE TCB.
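
The comparison tools called for here can be quite simple in concept.
The sketch below is hypothetical and far cruder than a real
configuration management tool, which would work module-by-module from
the configuration records; it checksums two builds of a module and
reports whether the newly generated version differs from the previous
one.

    /* Compare a module from the previous TCB version with the newly
     * generated one so that reviewers can confirm only the intended
     * changes appear.  Illustrative sketch only.                      */
    #include <stdio.h>

    static unsigned long checksum(const char *path)
    {
        FILE *f = fopen(path, "rb");
        if (!f)
            return 0;                 /* missing file treated as empty */
        unsigned long sum = 0;
        int c;
        while ((c = fgetc(f)) != EOF)
            sum = sum * 131 + (unsigned char)c;   /* simple rolling hash */
        fclose(f);
        return sum;
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s old-module new-module\n", argv[0]);
            return 2;
        }
        unsigned long old_sum = checksum(argv[1]);
        unsigned long new_sum = checksum(argv[2]);
        printf("%s: %s\n", argv[2],
               old_sum == new_sum ? "unchanged"
                                  : "DIFFERS from previous version");
        return old_sum == new_sum ? 0 : 1;
    }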

3.2.4 DOCUMENTATION

3.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. THE TCB MODULES THAT CONTAIN
THE REFERENCE VALIDATION MECHANISM SHALL BE IDENTIFIED. THE
PROCEDURES FOR SECURE GENERATION OF A NEW TCB FROM SOURCE AFTER
MODIFICATION OF ANY MODULES IN THE TCB SHALL BE DESCRIBED.

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. IT SHALL INCLUDE RESULTS OF
TESTING THE EFFECTIVENESS OF THE METHODS USED TO REDUCE COVERT
CHANNEL BANDWIDTHS.

3.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. THE
interfaces between THE TCB modules shall be described. A
FORMAL description of the security policy model enforced by the
TCB shall be available and PROVEN sufficient to enforce the
security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION (DTLS) SHALL BE SHOWN TO BE AN ACCURATE
DESCRIPTION OF THE TCB INTERFACE. DOCUMENTATION SHALL DESCRIBE
HOW THE TCB IMPLEMENTS THE REFERENCE MONITOR CONCEPT AND GIVE
AN EXPLANATION WHY IT IS TAMPERPROOF, CANNOT BE BYPASSED, AND
IS CORRECTLY IMPLEMENTED. DOCUMENTATION SHALL DESCRIBE HOW THE
TCB IS STRUCTURED TO FACILITATE TESTING AND TO ENFORCE LEAST
PRIVILEGE. THIS DOCUMENTATION SHALL ALSO PRESENT THE RESULTS
OF THE COVERT CHANNEL ANALYSIS AND THE TRADEOFFS INVOLVED IN
RESTRICTING THE CHANNELS. ALL AUDITABLE EVENTS THAT MAY BE
USED IN THE EXPLOITATION OF KNOWN COVERT STORAGE CHANNELS SHALL
BE IDENTIFIED. THE BANDWIDTHS OF KNOWN COVERT STORAGE CHANNELS,
THE USE OF WHICH IS NOT DETECTABLE BY THE AUDITING MECHANISMS,
SHALL BE PROVIDED. (SEE THE COVERT CHANNEL GUIDELINE SECTION.)

3.3 CLASS (B3): SECURITY DOMAINS

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration. The following are minimal requirements for systems assigned a
class (B3) rating:

3.3.1 SECURITY POLICY

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (E.G., ACCESS CONTROL LISTS) shall
allow users to specify and control sharing of those OBJECTS.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls shall
be capable of SPECIFYING, FOR EACH NAMED OBJECT, A LIST OF NAMED
INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS WITH THEIR
RESPECTIVE MODES OF ACCESS TO THAT OBJECT. FURTHERMORE, FOR
EACH SUCH NAMED OBJECT, IT SHALL BE POSSIBLE TO SPECIFY A LIST
OF NAMED INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS
FOR WHICH NO ACCESS TO THE OBJECT IS TO BE GIVEN. Access
permission to an object by users not already possessing access
permission shall only be assigned by authorized users.
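
The access control list behavior required above can be illustrated
with the following C sketch (all names are hypothetical; the group
membership test is a stub). Explicit no-access entries are checked
first, so an exclusion always overrides a grant, and a user matching
no entry gets no access by default.

    /* Per-object ACL with named individuals, groups of named
     * individuals, their modes of access, and explicit no-access
     * entries.  Illustrative only.                                    */
    #include <stdio.h>
    #include <string.h>

    #define MODE_READ  1
    #define MODE_WRITE 2

    struct acl_entry {
        const char *name;      /* user or group name        */
        int         is_group;
        int         modes;     /* permitted modes, 0 = none */
        int         deny;      /* 1 = explicit no-access    */
    };

    /* Stub group membership check for the example. */
    static int member_of(const char *user, const char *group)
    {
        return strcmp(user, "bob") == 0 && strcmp(group, "payroll") == 0;
    }

    static int entry_matches(const struct acl_entry *e, const char *user)
    {
        return e->is_group ? member_of(user, e->name)
                           : strcmp(user, e->name) == 0;
    }

    static int dac_allows(const struct acl_entry *acl, int n,
                          const char *user, int mode)
    {
        for (int i = 0; i < n; i++)              /* exclusions first */
            if (acl[i].deny && entry_matches(&acl[i], user))
                return 0;
        for (int i = 0; i < n; i++)
            if (!acl[i].deny && entry_matches(&acl[i], user) &&
                (acl[i].modes & mode))
                return 1;
        return 0;                                /* protected by default */
    }

    int main(void)
    {
        struct acl_entry acl[] = {
            { "carol",   0, 0,                      1 }, /* no access   */
            { "payroll", 1, MODE_READ,              0 }, /* group: read */
            { "alice",   0, MODE_READ | MODE_WRITE, 0 },
        };
        printf("bob read:   %d\n", dac_allows(acl, 3, "bob",   MODE_READ));
        printf("bob write:  %d\n", dac_allows(acl, 3, "bob",   MODE_WRITE));
        printf("carol read: %d\n", dac_allows(acl, 3, "carol", MODE_READ));
        return 0;
    }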

3.3.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.
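
A minimal sketch of this discipline (illustrative only): the
allocator scrubs a storage object before handing it to a new subject,
so nothing written by the previous holder can be read back.

    /* Clear residue from a pool object before it is reallocated.      */
    #include <stdio.h>
    #include <string.h>

    #define OBJ_SIZE 256

    static unsigned char pool[4][OBJ_SIZE];  /* pool of unused storage objects */

    static unsigned char *allocate_object(int i)
    {
        memset(pool[i], 0, OBJ_SIZE);        /* scrub before reuse */
        return pool[i];
    }

    int main(void)
    {
        unsigned char *obj = allocate_object(0);
        strcpy((char *)obj, "previous subject's data");
        obj = allocate_object(0);            /* reallocated to a new subject */
        printf("first byte after reallocation: %d\n", obj[0]);  /* 0 */
        return 0;
    }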

3.3.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.3.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.3.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.3.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

3.3.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________
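
The footnoted rule can be written out directly. The sketch below
(hypothetical identifiers, categories as a bit mask) computes the
banner or page label for a piece of output: the greatest hierarchical
classification of anything in the output, together with exactly the
union of the categories present.

    /* Compute a human-readable output label per the footnote above.   */
    #include <stdio.h>

    struct level {
        int          classification;  /* hierarchical component            */
        unsigned int categories;      /* non-hierarchical component (mask) */
    };

    static struct level output_label(const struct level *items, int n)
    {
        struct level out = { 0, 0 };
        for (int i = 0; i < n; i++) {
            if (items[i].classification > out.classification)
                out.classification = items[i].classification;
            out.categories |= items[i].categories;  /* union, and no more */
        }
        return out;
    }

    int main(void)
    {
        struct level page[] = { { 1, 0x1 }, { 3, 0x4 }, { 2, 0x1 } };
        struct level banner = output_label(page, 3);
        printf("banner label: classification %d, categories 0x%x\n",
               banner.classification, banner.categories);  /* 3, 0x5 */
        return 0;
    }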

3.3.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

3.3.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.
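
Illustration only (ignoring categories for brevity): a device label
can be represented as a minimum and maximum level, and an operation
at a given level is permitted only if it falls within that range.

    /* Range check against a device's minimum and maximum levels.      */
    #include <stdio.h>

    struct device_label { int min_level, max_level; };

    static int level_permitted(const struct device_label *d, int level)
    {
        return level >= d->min_level && level <= d->max_level;
    }

    int main(void)
    {
        struct device_label printer = { 0, 2 };  /* sited in an area approved to 2 */
        printf("level 1 output: %d\n", level_permitted(&printer, 1));  /* 1 */
        printf("level 3 output: %d\n", level_permitted(&printer, 3));  /* 0 */
        return 0;
    }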

3.3.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.
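
The two access rules just stated amount to a dominance check between
security levels. The C sketch below (illustrative only; categories as
a bit mask) writes them out: read is permitted when the subject's
level dominates the object's, write when the object's level dominates
the subject's.

    /* Dominance-based read/write checks for mandatory access control. */
    #include <stdio.h>

    struct sec_level {
        int          classification;
        unsigned int categories;
    };

    /* a dominates b: a's classification is >= b's and a's categories
     * include all of b's categories.                                  */
    static int dominates(struct sec_level a, struct sec_level b)
    {
        return a.classification >= b.classification &&
               (b.categories & ~a.categories) == 0;
    }

    static int may_read(struct sec_level subj, struct sec_level obj)
    {
        return dominates(subj, obj);
    }

    static int may_write(struct sec_level subj, struct sec_level obj)
    {
        return dominates(obj, subj);
    }

    int main(void)
    {
        struct sec_level subj = { 2, 0x3 };  /* level 2, categories {0,1} */
        struct sec_level obj  = { 1, 0x1 };  /* level 1, category {0}     */
        printf("read:  %d\n", may_read(subj, obj));   /* 1: subject dominates */
        printf("write: %d\n", may_write(subj, obj));  /* 0: would write down  */
        return 0;
    }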

3.3.2 ACCOUNTABILITY

3.3.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.3.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and USERS for USE WHEN A POSITIVE TCB-TO-
USER CONNECTION IS REQUIRED (E.G., LOGIN, CHANGE SUBJECT
SECURITY LEVEL). Communications via this TRUSTED path
shall be ACTIVATED exclusively by a user OR THE TCB AND
SHALL BE LOGICALLY ISOLATED AND UNMISTAKABLY
DISTINGUISHABLE FROM OTHER PATHS.

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object and the object’s security level.
The ADP system administrator shall be able to selectively audit
the actions of any one or more users based on individual
identity and/or object security level. The TCB shall be able to
audit the identified events that may be used in the exploitation
of covert storage channels. THE TCB SHALL CONTAIN A MECHANISM
THAT IS ABLE TO MONITOR THE OCCURRENCE OR ACCUMULATION OF
SECURITY AUDITABLE EVENTS THAT MAY INDICATE AN IMMINENT
VIOLATION OF SECURITY POLICY. THIS MECHANISM SHALL BE ABLE TO
IMMEDIATELY NOTIFY THE SECURITY ADMINISTRATOR WHEN THRESHOLDS
ARE EXCEEDED.
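
The real-time alerting mechanism added at this class can be
illustrated by a simple counter with a threshold (all names and the
notification itself are placeholders):

    /* Accumulate security-auditable events and notify the security
     * administrator as soon as a configured threshold is exceeded.    */
    #include <stdio.h>

    struct threshold_monitor {
        int count;
        int threshold;
    };

    static void notify_security_administrator(const char *msg)
    {
        /* Placeholder: a real TCB would raise the alarm on the
         * security administrator's trusted console.                   */
        printf("ALERT: %s\n", msg);
    }

    static void record_event(struct threshold_monitor *m, const char *event)
    {
        printf("audit: %s\n", event);
        if (++m->count > m->threshold)
            notify_security_administrator("failed-login threshold exceeded");
    }

    int main(void)
    {
        struct threshold_monitor failed_logins = { 0, 3 };
        for (int i = 0; i < 5; i++)
            record_event(&failed_logins, "failed login for user alice");
        return 0;
    }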

3.3.3 ASSURANCE

3.3.3.1 Operational Assurance

3.3.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB shall
be completely defined and all elements of the TCB
identified. THE TCB SHALL BE DESIGNED AND STRUCTURED TO
USE A COMPLETE, CONCEPTUALLY SIMPLE PROTECTION MECHANISM
WITH PRECISELY DEFINED SEMANTICS. THIS MECHANISM SHALL
PLAY A CENTRAL ROLE IN ENFORCING THE INTERNAL STRUCTURING
OF THE TCB AND THE SYSTEM. THE TCB SHALL INCORPORATE
SIGNIFICANT USE OF LAYERING, ABSTRACTION AND DATA HIDING.
SIGNIFICANT SYSTEM ENGINEERING SHALL BE DIRECTED TOWARD
MINIMIZING THE COMPLEXITY OF THE TCB AND EXCLUDING FROM
THE TCB MODULES THAT ARE NOT PROTECTION-CRITICAL.

3.3.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.3.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.)

3.3.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. THE FUNCTIONS PERFORMED IN THE ROLE OF A
SECURITY ADMINISTRATOR SHALL BE IDENTIFIED. THE ADP
SYSTEM ADMINISTRATIVE PERSONNEL SHALL ONLY BE ABLE TO
PERFORM SECURITY ADMINISTRATOR FUNCTIONS AFTER TAKING A
DISTINCT AUDITABLE ACTION TO ASSUME THE SECURITY
ADMINISTRATOR ROLE ON THE ADP SYSTEM. NON-SECURITY
FUNCTIONS THAT CAN BE PERFORMED IN THE SECURITY
ADMINISTRATION ROLE SHALL BE LIMITED STRICTLY TO THOSE
ESSENTIAL TO PERFORMING THE SECURITY ROLE EFFECTIVELY.

3.3.3.1.5 Trusted Recovery

PROCEDURES AND/OR MECHANISMS SHALL BE PROVIDED TO ASSURE
THAT, AFTER AN ADP SYSTEM FAILURE OR OTHER DISCONTINUITY,
RECOVERY WITHOUT A PROTECTION COMPROMISE IS OBTAINED.

3.3.3.2 Life-Cycle Assurance

3.3.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be FOUND RESISTANT TO penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the descriptive
top-level specification. (See the Security Testing
Guidelines.) NO DESIGN FLAWS AND NO MORE THAN A FEW
CORRECTABLE IMPLEMENTATION FLAWS MAY BE FOUND DURING
TESTING AND THERE SHALL BE REASONABLE CONFIDENCE THAT
FEW REMAIN.

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. It shall be shown to be an
accurate description of the TCB interface. A CONVINCING
ARGUMENT SHALL BE GIVEN THAT THE DTLS IS CONSISTENT WITH
THE MODEL.

3.3.3.2.3 Configuration Management

During development and maintenance of the TCB, a
configuration management system shall be in place that
maintains control of changes to the descriptive top-level
specification, other design data, implementation
documentation, source code, the running version of the
object code, and test fixtures and documentation. The
configuration management system shall assure a consistent
mapping among all documentation and code associated with
the current version of the TCB. Tools shall be provided
for generation of a new version of the TCB from source
code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in
order to ascertain that only the intended changes have
been made in the code that will actually be used as the
new version of the TCB.

3.3.4 DOCUMENTATION

3.3.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.3.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. IT
SHALL INCLUDE THE PROCEDURES TO ENSURE THAT THE SYSTEM IS
INITIALLY STARTED IN A SECURE MANNER. PROCEDURES SHALL ALSO BE
INCLUDED TO RESUME SECURE SYSTEM OPERATION AFTER ANY LAPSE IN
SYSTEM OPERATION.

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths.

3.3.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and proven sufficient to enforce the
security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. THE TCB IMPLEMENTATION (I.E., IN
HARDWARE, FIRMWARE, AND SOFTWARE) SHALL BE INFORMALLY SHOWN TO
BE CONSISTENT WITH THE DTLS. THE ELEMENTS OF THE DTLS SHALL BE
SHOWN, USING INFORMAL TECHNIQUES, TO CORRESPOND TO THE ELEMENTS
OF THE TCB. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)

4.0 DIVISION A: VERIFIED PROTECTION

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

4.1 CLASS (A1): VERIFIED DESIGN

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design.
Independent of the particular specification language or verification system
used, there are five important criteria for class (A1) design verification:

* A formal model of the security policy must be clearly
identified and documented, including a mathematical proof
that the model is consistent with its axioms and is
sufficient to support the security policy.

* An FTLS must be produced that includes abstract definitions
of the functions the TCB performs and of the hardware and/or
firmware mechanisms that are used to support separate
execution domains.

* The FTLS of the TCB must be shown to be consistent with the
model by formal techniques where possible (i.e., where
verification tools exist) and informal ones otherwise.

* The TCB implementation (i.e., in hardware, firmware, and
software) must be informally shown to be consistent with the
FTLS. The elements of the FTLS must be shown, using
informal techniques, to correspond to the elements of the
TCB. The FTLS must express the unified protection mechanism
required to satisfy the security policy, and it is the
elements of this protection mechanism that are mapped to the
elements of the TCB.

* Formal analysis techniques must be used to identify and
analyze covert channels. Informal techniques may be used to
identify covert timing channels. The continued existence of
identified covert channels in the system must be justified.

In keeping with the extensive design and development analysis of the TCB
required of systems in class (A1), more stringent configuration management is
required and procedures are established for securely distributing the system
to sites. A system security administrator is supported.

The following are minimal requirements for systems assigned a class (A1)
rating:

4.1.1 SECURITY POLICY

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls
shall be capable of specifying, for each named object, a list
of named individuals and a list of groups of named individuals
with their respective modes of access to that object.
Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of
named individuals for which no access to the object is to be
given. Access permission to an object by users not already
possessing access permission shall only be assigned by
authorized users.

4.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

4.1.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

4.1.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

4.1.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

4.1.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

4.1.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

____________________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________________

4.1.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

4.1.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

4.1.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.

4.1.2 ACCOUNTABILITY

4.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

4.1.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and users for use when a positive TCB-to-
user connection is required (e.g., login, change subject
security level). Communications via this trusted path
shall be activated exclusively by a user or the TCB and
shall be logically isolated and unmistakably
distinguishable from other paths.

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. The TCB shall be able to audit
the identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able to
immediately notify the security administrator when thresholds
are exceeded.

4.1.3 ASSURANCE

4.1.3.1 Operational Assurance

4.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB
shall be completely defined and all elements of the TCB
identified. The TCB shall be designed and structured to
use a complete, conceptually simple protection mechanism
with precisely defined semantics. This mechanism shall
play a central role in enforcing the internal structuring
of the TCB and the system. The TCB shall incorporate
significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward
minimizing the complexity of the TCB and excluding from
the TCB modules that are not protection-critical.

4.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

4.1.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.) FORMAL METHODS SHALL
BE USED IN THE ANALYSIS.

4.1.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. The functions performed in the role of a
security administrator shall be identified. The ADP
system administrative personnel shall only be able to
perform security administrator functions after taking a
distinct auditable action to assume the security
administrator role on the ADP system. Non-security
functions that can be performed in the security
administration role shall be limited strictly to those
essential to performing the security role effectively.

4.1.3.1.5 Trusted Recovery

Procedures and/or mechanisms shall be provided to assure
that, after an ADP system failure or other discontinuity,
recovery without a protection compromise is obtained.

4.1.3.2 Life-Cycle Assurance

4.1.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be found resistant to penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the FORMAL top-
level specification. (See the Security Testing
Guidelines.) No design flaws and no more than a few
correctable implementation flaws may be found during
testing and there shall be reasonable confidence that few
remain. MANUAL OR OTHER MAPPING OF THE FTLS TO THE
SOURCE CODE MAY FORM A BASIS FOR PENETRATION TESTING.

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. A FORMAL TOP-LEVEL SPECIFICATION
(FTLS) OF THE TCB SHALL BE MAINTAINED THAT ACCURATELY
DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR MESSAGES,
AND EFFECTS. THE DTLS AND FTLS SHALL INCLUDE THOSE
COMPONENTS OF THE TCB THAT ARE IMPLEMENTED AS HARDWARE
AND/OR FIRMWARE IF THEIR PROPERTIES ARE VISIBLE AT THE
TCB INTERFACE. THE FTLS shall be shown to be an accurate
description of the TCB interface. A convincing argument
shall be given that the DTLS is consistent with the model
AND A COMBINATION OF FORMAL AND INFORMAL TECHNIQUES SHALL
BE USED TO SHOW THAT THE FTLS IS CONSISTENT WITH THE
MODEL. THIS VERIFICATION EVIDENCE SHALL BE CONSISTENT
WITH THAT PROVIDED WITHIN THE STATE-OF-THE-ART OF THE
PARTICULAR COMPUTER SECURITY CENTER-ENDORSED FORMAL
SPECIFICATION AND VERIFICATION SYSTEM USED. MANUAL OR
OTHER MAPPING OF THE FTLS TO THE TCB SOURCE CODE SHALL BE
PERFORMED TO PROVIDE EVIDENCE OF CORRECT IMPLEMENTATION.

4.1.3.2.3 Configuration Management

During THE ENTIRE LIFE-CYCLE, I.E., DURING THE DESIGN,
DEVELOPMENT, and maintenance of the TCB, a configuration
management system shall be in place FOR ALL SECURITY-
RELEVANT HARDWARE, FIRMWARE, AND SOFTWARE that maintains
control of changes to THE FORMAL MODEL, the descriptive
AND FORMAL top-level SPECIFICATIONS, other design data,
implementation documentation, source code, the running
version of the object code, and test fixtures and
documentation. The configuration management system shall
assure a consistent mapping among all documentation and
code associated with the current version of the TCB.
Tools shall be provided for generation of a new version
of the TCB from source code. Also available shall be
tools, MAINTAINED UNDER STRICT CONFIGURATION CONTROL, for
comparing a newly generated version with the previous TCB
version in order to ascertain that only the intended
changes have been made in the code that will actually be
used as the new version of the TCB. A COMBINATION OF
TECHNICAL, PHYSICAL, AND PROCEDURAL SAFEGUARDS SHALL BE
USED TO PROTECT FROM UNAUTHORIZED MODIFICATION OR
DESTRUCTION THE MASTER COPY OR COPIES OF ALL MATERIAL
USED TO GENERATE THE TCB.

4.1.3.2.4 Trusted Distribution

A TRUSTED ADP SYSTEM CONTROL AND DISTRIBUTION FACILITY
SHALL BE PROVIDED FOR MAINTAINING THE INTEGRITY OF THE
MAPPING BETWEEN THE MASTER DATA DESCRIBING THE CURRENT
VERSION OF THE TCB AND THE ON-SITE MASTER COPY OF THE
CODE FOR THE CURRENT VERSION. PROCEDURES (E.G., SITE
SECURITY ACCEPTANCE TESTING) SHALL EXIST FOR ASSURING
THAT THE TCB SOFTWARE, FIRMWARE, AND HARDWARE UPDATES
DISTRIBUTED TO A CUSTOMER ARE EXACTLY AS SPECIFIED BY
THE MASTER COPIES.

4.1.4 DOCUMENTATION

4.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

4.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. It
shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in
system operation.

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths. THE RESULTS OF THE MAPPING BETWEEN THE
FORMAL TOP-LEVEL SPECIFICATION AND THE TCB SOURCE CODE SHALL BE
GIVEN.

4.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and proven sufficient to enforce the
security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. The TCB implementation (i.e., in
hardware, firmware, and software) shall be informally shown to
be consistent with the FORMAL TOP-LEVEL SPECIFICATION (FTLS).
The elements of the FTLS shall be shown, using informal
techniques, to correspond to the elements of the TCB.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)
HARDWARE, FIRMWARE, AND SOFTWARE MECHANISMS NOT DEALT WITH IN
THE FTLS BUT STRICTLY INTERNAL TO THE TCB (E.G., MAPPING
REGISTERS, DIRECT MEMORY ACCESS I/O) SHALL BE CLEARLY DESCRIBED.

4.2 BEYOND CLASS (A1)

Most of the security enhancements envisioned for systems that will provide
features and assurance in addition to that already provided by class (A1)
systems are beyond current technology. The discussion below is intended to
guide future work and is derived from research and development activities
already underway in both the public and private sectors. As more and better
analysis techniques are developed, the requirements for these systems will
become more explicit. In the future, use of formal verification will be
extended to the source level and covert timing channels will be more fully
addressed. At this level the design environment will become important and
testing will be aided by analysis of the formal top-level specification.
Consideration will be given to the correctness of the tools used in TCB
development (e.g., compilers, assemblers, loaders) and to the correct
functioning of the hardware/firmware on which the TCB will run. Areas to be
addressed by systems beyond class (A1) include:

* System Architecture

A demonstration (formal or otherwise) must be given showing
that requirements of self-protection and completeness for
reference monitors have been implemented in the TCB.

* Security Testing

Although beyond the current state-of-the-art, it is
envisioned that some test-case generation will be done
automatically from the formal top-level specification or
formal lower-level specifications.

* Formal Specification and Verification

The TCB must be verified down to the source code level,
using formal verification methods where feasible. Formal
verification of the source code of the security-relevant
portions of an operating system has proven to be a difficult
task. Two important considerations are the choice of a
high-level language whose semantics can be fully and
formally expressed, and a careful mapping, through
successive stages, of the abstract formal design to a
formalization of the implementation in low-level
specifications. Experience has shown that only when the
lowest level specifications closely correspond to the actual
code can code proofs be successfully accomplished.

* Trusted Design Environment

The TCB must be designed in a trusted facility with only
trusted (cleared) personnel.

PART II:

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS

The criteria are divided within each class into groups of requirements. These
groupings were developed to assure that three basic control objectives for
computer security are satisfied and not overlooked. These control objectives
deal with:

* Security Policy
* Accountability
* Assurance

This section provides a discussion of these general control objectives and
their implication in terms of designing trusted systems.

5.1 A Need for Consensus

A major goal of the DoD Computer Security Center is to encourage the Computer
Industry to develop trusted computer systems and products, making them widely
available in the commercial market place. Achievement of this goal will
require recognition and articulation by both the public and private sectors of
a need and demand for such products.

As described in the introduction to this document, efforts to define the
problems and develop solutions associated with processing nationally sensitive
information, as well as other sensitive data such as financial, medical, and
personnel information used by the National Security Establishment, have been
underway for a number of years. The criteria, as described in Part I,
represent the culmination of these efforts and describe basic requirements for
building trusted computer systems. To date, however, these systems have been
viewed by many as only satisfying National Security needs. As long as this
perception continues the consensus needed to motivate manufacture of trusted
systems will be lacking.

The purpose of this section is to describe, in some detail, the fundamental
control objectives that lay the foundations for requirements delineated in the
criteria. The goal is to explain the foundations so that those outside the
National Security Establishment can assess their universality and, by
extension, the universal applicability of the criteria requirements to
processing all types of sensitive applications whether they be for National
Security or the private sector.

5.2 Definition and Usefulness

The term “control objective” refers to a statement of intent with respect to
control over some aspect of an organization’s resources, or processes, or
both. In terms of a computer system, control objectives provide a framework
for developing a strategy for fulfilling a set of security requirements for
any given system. Developed in response to generic vulnerabilities, such as
the need to manage and handle sensitive data in order to prevent compromise,
or the need to provide accountability in order to detect fraud, control
objectives have been identified as a useful method of expressing security
goals.[3]

Examples of control objectives include the three basic design requirements for
implementing the reference monitor concept discussed in Section 6. They are:

* The reference validation mechanism must be tamperproof.

* The reference validation mechanism must always be invoked.

* The reference validation mechanism must be small enough to be
subjected to analysis and tests, the completeness of which can
be assured.[1]

5.3 Criteria Control Objectives

The three basic control objectives of the criteria are concerned with security
policy, accountability, and assurance. The remainder of this section provides
a discussion of these basic requirements.

5.3.1 Security Policy

In the most general sense, computer security is concerned with
controlling the way in which a computer can be used, i.e.,
controlling how information processed by it can be accessed and
manipulated. However, at closer examination, computer security
can refer to a number of areas. Symptomatic of this, FIPS PUB 39,
Glossary For Computer Systems Security, does not have a unique
definition for computer security.[16] Instead there are eleven
separate definitions for security which include: ADP systems
security, administrative security, data security, etc. A common
thread running through these definitions is the word “protection.”
Further declarations of protection requirements can be found in
DoD Directive 5200.28 which describes an acceptable level of
protection for classified data to be one that will “assure that
systems which process, store, or use classified data and produce
classified information will, with reasonable dependability,
prevent: a. Deliberate or inadvertent access to classified
material by unauthorized persons, and b. Unauthorized
manipulation of the computer and its associated peripheral
devices.”[8]

In summary, protection requirements must be defined in terms of
the perceived threats, risks, and goals of an organization. This
is often stated in terms of a security policy. It has been
pointed out in the literature that it is external laws, rules,
regulations, etc. that establish what access to information is to
be permitted, independent of the use of a computer. In particular,
a given system can only be said to be secure with respect to its
enforcement of some specific policy.[30] Thus, the control
objective for security policy is:

SECURITY POLICY CONTROL OBJECTIVE

A STATEMENT OF INTENT WITH REGARD TO CONTROL OVER ACCESS TO AND
DISSEMINATION OF INFORMATION, TO BE KNOWN AS THE SECURITY POLICY,
MUST BE PRECISELY DEFINED AND IMPLEMENTED FOR EACH SYSTEM THAT IS
USED TO PROCESS SENSITIVE INFORMATION. THE SECURITY POLICY MUST
ACCURATELY REFLECT THE LAWS, REGULATIONS, AND GENERAL POLICIES
FROM WHICH IT IS DERIVED.

5.3.1.1 Mandatory Security Policy

Where a security policy is developed that is to be applied
to control of classified or other specifically designated
sensitive information, the policy must include detailed
rules on how to handle that information throughout its
life-cycle. These rules are a function of the various
sensitivity designations that the information can assume
and the various forms of access supported by the system.
Mandatory security refers to the enforcement of a set of
access control rules that constrains a subject’s access to
information on the basis of a comparison of that
individual’s clearance/authorization to the information,
the classification/sensitivity designation of the
information, and the form of access being mediated.
Mandatory policies either require or can be satisfied by
systems that can enforce a partial ordering of the
designations; more precisely, the designations must form
what is mathematically known as a “lattice,” a partial
order in which every pair of designations has a least upper
bound and a greatest lower bound.[5]

A clear implication of the above is that the system must
assure that the designations associated with sensitive data
cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate authorization to
access sensitive information. Also implied is the
requirement that the system control the flow of information
so that data of a higher sensitivity designation cannot be
stored under a lower sensitivity designation unless its
“downgrading” has been authorized.
The control objective is:

MANDATORY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SPECIFICALLY CATEGORIZED
SENSITIVE INFORMATION MUST INCLUDE PROVISIONS FOR THE
ENFORCEMENT OF MANDATORY ACCESS CONTROL RULES. THAT IS,
THEY MUST INCLUDE A SET OF RULES FOR CONTROLLING ACCESS
BASED DIRECTLY ON A COMPARISON OF THE INDIVIDUAL’S
CLEARANCE OR AUTHORIZATION FOR THE INFORMATION AND THE
CLASSIFICATION OR SENSITIVITY DESIGNATION OF THE
INFORMATION BEING SOUGHT, AND INDIRECTLY ON CONSIDERATIONS
OF PHYSICAL AND OTHER ENVIRONMENTAL FACTORS OF CONTROL.
THE MANDATORY ACCESS CONTROL RULES MUST ACCURATELY REFLECT
THE LAWS, REGULATIONS, AND GENERAL POLICIES FROM WHICH
THEY ARE DERIVED.
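
By way of illustration only, the following Python sketch (all names
invented for this example) shows how designations composed of a
hierarchical level and a set of non-hierarchical categories form the
lattice referred to above: “dominates” is the partial order, some pairs
of designations are incomparable, and every pair has a least upper
bound.

    # Illustrative sketch only; not part of the criteria.
    # A designation pairs a hierarchical level with a set of
    # non-hierarchical categories; "dominates" is the partial order.
    from dataclasses import dataclass
    from typing import FrozenSet

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    @dataclass(frozen=True)
    class Designation:
        level: str
        categories: FrozenSet[str] = frozenset()

        def dominates(self, other: "Designation") -> bool:
            """True if this designation is at least as high as the other
            and includes every one of the other's categories."""
            return (LEVELS[self.level] >= LEVELS[other.level]
                    and self.categories >= other.categories)

    # SECRET {NATO} and SECRET {CRYPTO} are incomparable: neither dominates.
    a = Designation("SECRET", frozenset({"NATO"}))
    b = Designation("SECRET", frozenset({"CRYPTO"}))
    assert not a.dominates(b) and not b.dominates(a)

    # Their least upper bound (the lattice "join") is SECRET {NATO, CRYPTO}.
    join = Designation("SECRET", a.categories | b.categories)
    assert join.dominates(a) and join.dominates(b)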

5.3.1.2 Discretionary Security Policy

Discretionary security is the principal type of access
control available in computer systems today. The basis of
this kind of security is that an individual user, or
program operating on his behalf, is allowed to specify
explicitly the types of access other users may have to
information under his control. Discretionary security
differs from mandatory security in that it implements an
access control policy on the basis of an individual’s
need-to-know, as opposed to mandatory controls, which are
driven by the classification or sensitivity designation of
the information.

Discretionary controls are not a replacement for mandatory
controls. In an environment in which information is
classified (as in the DoD) discretionary security provides
for a finer granularity of control within the overall
constraints of the mandatory policy. Access to classified
information requires effective implementation of both types
of controls as a precondition to granting that access. In
general, no person may have access to classified
information unless: (a) that person has been determined to
be trustworthy, i.e., granted a personnel security
clearance — MANDATORY, and (b) access is necessary for the
performance of official duties, i.e., determined to have a
need-to-know — DISCRETIONARY. In other words,
discretionary controls give individuals discretion to
decide on which of the permissible accesses will actually
be allowed to which users, consistent with overriding
mandatory policy restrictions. The control objective is:

DISCRETIONARY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SENSITIVE INFORMATION MUST
INCLUDE PROVISIONS FOR THE ENFORCEMENT OF DISCRETIONARY
ACCESS CONTROL RULES. THAT IS, THEY MUST INCLUDE A
CONSISTENT SET OF RULES FOR CONTROLLING AND LIMITING ACCESS
BASED ON IDENTIFIED INDIVIDUALS WHO HAVE BEEN DETERMINED TO
HAVE A NEED-TO-KNOW FOR THE INFORMATION.
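
To illustrate how the mandatory and discretionary checks combine as
preconditions to access, the hypothetical Python sketch below (names
invented for this example) grants an access mode only when the user’s
clearance dominates the object’s classification and the object’s access
list records a need-to-know grant for that user.

    # Hypothetical sketch: access requires BOTH controls to pass.
    # Mandatory: the user's clearance must dominate the object's classification.
    # Discretionary: the object's access list must grant the mode (need-to-know).
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(clearance, classification):
        c_level, c_cats = clearance
        o_level, o_cats = classification
        return LEVELS[c_level] >= LEVELS[o_level] and c_cats >= o_cats

    def access_permitted(user, obj, mode):
        mandatory_ok = dominates(user["clearance"], obj["classification"])
        discretionary_ok = mode in obj["acl"].get(user["id"], set())
        return mandatory_ok and discretionary_ok

    analyst = {"id": "analyst1", "clearance": ("SECRET", frozenset())}
    report = {"classification": ("CONFIDENTIAL", frozenset()),
              "acl": {"analyst1": {"read"}}}      # need-to-know grant: read only

    assert access_permitted(analyst, report, "read")       # cleared and need-to-know
    assert not access_permitted(analyst, report, "write")  # no discretionary grant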

5.3.1.3 Marking

To implement a set of mechanisms that will put into effect
a mandatory security policy, it is necessary that the
system mark information with appropriate classification or
sensitivity labels and maintain these markings as the
information moves through the system. Once information is
unalterably and accurately marked, comparisons required by
the mandatory access control rules can be accurately and
consistently made. An additional benefit of having the
system maintain the classification or sensitivity label
internally is the ability to automatically generate
properly “labeled” output. Provided the system maintains
the labels accurately and with integrity, they remain
accurate when output from the system. The control objective is:

MARKING CONTROL OBJECTIVE

SYSTEMS THAT ARE DESIGNED TO ENFORCE A MANDATORY SECURITY
POLICY MUST STORE AND PRESERVE THE INTEGRITY OF
CLASSIFICATION OR OTHER SENSITIVITY LABELS FOR ALL
INFORMATION. LABELS EXPORTED FROM THE SYSTEM MUST BE
ACCURATE REPRESENTATIONS OF THE CORRESPONDING INTERNAL
SENSITIVITY LABELS BEING EXPORTED.
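
The benefit of internally maintained labels can be sketched as follows
(hypothetical Python, names invented): output markings are generated
from the label stored with the object, so exported markings cannot
drift from the internal designation.

    # Hypothetical sketch: exported markings are derived from the internal label.
    def banner(label):
        level, categories = label
        return level if not categories else level + "//" + "/".join(sorted(categories))

    def export(obj):
        """Stamp output with markings generated from the internally held label."""
        line = banner(obj["label"])
        return line + "\n" + obj["contents"] + "\n" + line

    document = {"label": ("SECRET", {"NOFORN"}), "contents": "report text"}
    print(export(document))
    # SECRET//NOFORN
    # report text
    # SECRET//NOFORN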

5.3.2 Accountability

The second basic control objective addresses one of the
fundamental principles of security, i.e., individual
accountability. Individual accountability is the key to securing
and controlling any system that processes information on behalf
of individuals or groups of individuals. A number of requirements
must be met in order to satisfy this objective.

The first requirement is for individual user identification.
Second, there is a need for authentication of the identification.
Identification is functionally dependent on authentication.
Without authentication, user identification has no credibility.
Without a credible identity, neither mandatory nor discretionary
security policies can be properly invoked because there is no
assurance that proper authorizations can be made.

The third requirement is for dependable audit capabilities. That
is, a trusted computer system must provide authorized personnel
with the ability to audit any action that can potentially cause
access to, generation of, or release of classified or
sensitive information. The audit data will be selectively
acquired based on the auditing needs of a particular installation
and/or application. However, there must be sufficient granularity
in the audit data to support tracing the auditable events to a
specific individual who has taken the actions or on whose behalf
the actions were taken. The control objective is:

ACCOUNTABILITY CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST ASSURE INDIVIDUAL ACCOUNTABILITY
WHENEVER EITHER A MANDATORY OR DISCRETIONARY SECURITY POLICY IS
INVOKED. FURTHERMORE, TO ASSURE ACCOUNTABILITY THE CAPABILITY
MUST EXIST FOR AN AUTHORIZED AND COMPETENT AGENT TO ACCESS AND
EVALUATE ACCOUNTABILITY INFORMATION BY A SECURE MEANS, WITHIN A
REASONABLE AMOUNT OF TIME, AND WITHOUT UNDUE DIFFICULTY.
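
A minimal sketch of the chain described above (Python, names invented)
follows: an identification is credible only after authentication, and
every mediated action is recorded in an audit log entry naming the
individual on whose behalf it was taken.

    # Minimal sketch: identification, authentication, and audit (names invented).
    import hashlib, time

    CREDENTIALS = {"jones": hashlib.sha256(b"correct horse").hexdigest()}
    AUDIT_LOG = []   # in a real system, a protected append-only store

    def authenticate(user_id, password):
        """Authentication is what gives a claimed identification its credibility."""
        return CREDENTIALS.get(user_id) == hashlib.sha256(password.encode()).hexdigest()

    def mediated_action(user_id, password, action, target):
        ok = authenticate(user_id, password)
        # Each auditable event is traceable to a specific individual.
        AUDIT_LOG.append({"time": time.time(), "user": user_id, "action": action,
                          "target": target, "outcome": "granted" if ok else "denied"})
        return ok

    mediated_action("jones", "correct horse", "read", "file-17")
    mediated_action("jones", "wrong guess", "read", "file-17")
    print(AUDIT_LOG)   # two records, each tied to the individual "jones"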

5.3.3 Assurance

The third basic control objective is concerned with guaranteeing
or providing confidence that the security policy has been
implemented correctly and that the protection-relevant elements of
the system do, indeed, accurately mediate and enforce the intent
of that policy. By extension, assurance must include a guarantee
that the trusted portion of the system works only as intended. To
accomplish these objectives, two types of assurance are needed.
They are life-cycle assurance and operational assurance.

Life-cycle assurance refers to steps taken by an organization to
ensure that the system is designed, developed, and maintained
using formalized and rigorous controls and standards.[17]
Computer systems that process and store sensitive or classified
information depend on the hardware and software to protect that
information. It follows that the hardware and software themselves
must be protected against unauthorized changes that could cause
protection mechanisms to malfunction or be bypassed completely.
For this reason trusted computer systems must be carefully
evaluated and tested during the design and development phases and
reevaluated whenever changes are made that could affect the
integrity of the protection mechanisms. Only in this way can
confidence be provided that the hardware and software
interpretation of the security policy is maintained accurately
and without distortion.

While life-cycle assurance is concerned with procedures for
managing system design, development, and maintenance, operational
assurance focuses on features and system architecture used to
ensure that the security policy is uncircumventably enforced
during system operation. That is, the security policy must be
integrated into the hardware and software protection features of
the system. Examples of steps taken to provide this kind of
confidence include: methods for testing the operational hardware
and software for correct operation, isolation of protection-
critical code, and the use of hardware and software to provide
distinct domains. The control objective is:

ASSURANCE CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST BE DESIGNED TO GUARANTEE CORRECT AND
ACCURATE INTERPRETATION OF THE SECURITY POLICY AND MUST NOT
DISTORT THE INTENT OF THAT POLICY. ASSURANCE MUST BE PROVIDED
THAT CORRECT IMPLEMENTATION AND OPERATION OF THE POLICY EXISTS
THROUGHOUT THE SYSTEM’S LIFE-CYCLE.

6.0 RATIONALE BEHIND THE EVALUATION CLASSES

6.1 The Reference Monitor Concept

In October of 1972, the Computer Security Technology Planning Study, conducted
by James P. Anderson & Co., produced a report for the Electronic Systems
Division (ESD) of the United States Air Force.[1] In that report, the concept
of “a reference monitor which enforces the authorized access relationships
between subjects and objects of a system” was introduced. The reference
monitor concept was found to be an essential element of any system that would
provide multilevel secure computing facilities and controls.

The Anderson report went on to define the reference validation mechanism as
“an implementation of the reference monitor concept . . . that validates
each reference to data or programs by any user (program) against a list of
authorized types of reference for that user.” It then listed the three design
requirements that must be met by a reference validation mechanism:

a. The reference validation mechanism must be tamperproof.

b. The reference validation mechanism must always be invoked.

c. The reference validation mechanism must be small enough to be
subjected to analysis and tests, the completeness of which can
be assured.[1]

Extensive peer review and continuing research and development activities have
sustained the validity of the Anderson Committee’s findings. Early examples
of the reference validation mechanism were known as security kernels. The
Anderson Report described the security kernel as “that combination of hardware
and software which implements the reference monitor concept.”[1] It follows
that the security kernel must satisfy the three reference monitor
requirements listed above.

6.2 A Formal Security Policy Model

Following the publication of the Anderson report, considerable research was
initiated into formal models of security policy requirements and of the
mechanisms that would implement and enforce those policy models as a security
kernel. Prominent among these efforts was the ESD-sponsored development of
the Bell and LaPadula model, an abstract formal treatment of DoD security
policy.[2] Using mathematics and set theory, the model precisely defines the
notion of secure state, fundamental modes of access, and the rules for
granting subjects specific modes of access to objects. Finally, a theorem is
proven to demonstrate that the rules are security-preserving operations, so
that the application of any sequence of the rules to a system that is in a
secure state will result in the system entering a new state that is also
secure. This theorem is known as the Basic Security Theorem.

The Bell and LaPadula model defines a relationship between clearances of
subjects and classifications of system objects, now referenced as the
“dominance relation.” From this definition, accesses permitted between
subjects and objects are explicitly defined for the fundamental modes of
access, including read-only access, read/write access, and write-only access.
The model defines the Simple Security Condition to control granting a subject
read access to a specific object, and the *-Property (read “Star Property”) to
control granting a subject write access to a specific object. Both the Simple
Security Condition and the *-Property include mandatory security provisions
based on the dominance relation between the clearance of the subject and the
classification of the object. The Discretionary Security Property is also
defined, and requires that a specific subject be authorized for the particular
mode of access required for the state transition. In its treatment of
subjects (processes acting on behalf of a user), the model distinguishes
between trusted subjects (i.e., not constrained within the model by the
*-Property) and untrusted subjects (those that are constrained by the
*-Property).
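
As an informal illustration of these two provisions (Python, names
invented; this is not the model’s formal statement), the sketch below
checks the Simple Security Condition for read access and the *-Property
for write access, with trusted subjects exempt from the latter.

    # Informal sketch of the two mandatory provisions (names invented).
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        """The dominance relation between two (level, categories) labels."""
        return LEVELS[a[0]] >= LEVELS[b[0]] and a[1] >= b[1]

    def simple_security(subject_clearance, object_class):
        """Read access only if the subject's clearance dominates the object's
        classification ("no read up")."""
        return dominates(subject_clearance, object_class)

    def star_property(subject_clearance, object_class, trusted=False):
        """Write access for an untrusted subject only if the object's
        classification dominates the subject's clearance ("no write down")."""
        return trusted or dominates(object_class, subject_clearance)

    secret = ("SECRET", frozenset())
    confidential = ("CONFIDENTIAL", frozenset())

    assert simple_security(secret, confidential)              # reading down is permitted
    assert not star_property(secret, confidential)            # writing down is blocked
    assert star_property(secret, confidential, trusted=True)  # trusted subjects exempt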

From the Bell and LaPadula model there evolved a model of the method of proof
required to formally demonstrate that all arbitrary sequences of state
transitions are security-preserving. It was also shown that the *-Property
is sufficient to prevent the compromise of information by Trojan Horse
attacks.

6.3 The Trusted Computing Base

In order to encourage the widespread commercial availability of trusted
computer systems, these evaluation criteria have been designed to address
those systems in which a security kernel is specifically implemented as well
as those in which a security kernel has not been implemented. The latter case
includes those systems in which objective (c) is not fully supported because
of the size or complexity of the reference validation mechanism. For
convenience, these evaluation criteria use the term Trusted Computing Base to
refer to the reference validation mechanism, be it a security kernel,
front-end security filter, or the entire trusted computer system.

The heart of a trusted computer system is the Trusted Computing Base (TCB)
which contains all of the elements of the system responsible for supporting
the security policy and supporting the isolation of objects (code and data) on
which the protection is based. The bounds of the TCB equate to the “security
perimeter” referenced in some computer security literature. In the interest
of understandable and maintainable protection, a TCB should be as simple as
possible consistent with the functions it has to perform. Thus, the TCB
includes hardware, firmware, and software critical to protection and must be
designed and implemented such that system elements excluded from it need not
be trusted to maintain protection. Identification of the interface and
elements of the TCB along with their correct functionality therefore forms the
basis for evaluation.

For general-purpose systems, the TCB will include key elements of the
operating system and may include all of the operating system. For embedded
systems, the security policy may deal with objects in a way that is meaningful
at the application level rather than at the operating system level. Thus, the
protection policy may be enforced in the application software rather than in
the underlying operating system. The TCB will necessarily include all those
portions of the operating system and application software essential to the
support of the policy. Note that, as the amount of code in the TCB increases,
it becomes harder to be confident that the TCB enforces the reference monitor
requirements under all circumstances.

6.4 Assurance

The third reference monitor design objective is currently interpreted as
meaning that the TCB “must be of sufficiently simple organization and
complexity to be subjected to analysis and tests, the completeness of which
can be assured.”

Clearly, as the perceived degree of risk for a particular system’s operational
application and environment increases (e.g., as the range of sensitivity of the
system’s protected data widens, along with the range of clearances held by the
system’s user population), so also must the assurances be increased to
substantiate the degree of trust that will be placed in the system. The
hierarchy of requirements presented for the evaluation classes in the trusted
computer system evaluation criteria reflects the need for these assurances.

As discussed in Section 5.3, the evaluation criteria uniformly require a
statement of the security policy that is enforced by each trusted computer
system. In addition, it is required that a convincing argument be presented
that explains why the TCB satisfies the first two design requirements for a
reference monitor. It is not expected that this argument will be entirely
formal. This argument is required for each candidate system in order to
satisfy the assurance control objective.

The systems to which security enforcement mechanisms have been added, rather
than built in as fundamental design objectives, are not readily amenable to
extensive analysis since they lack the requisite conceptual simplicity of a
security kernel. This is because their TCB extends to cover much of the
entire system. Hence, their degree of trustworthiness can best be ascertained
only by obtaining test results. Since no test procedure for something as
complex as a computer system can be truly exhaustive, there is always the
possibility that a subsequent penetration attempt could succeed. It is for
this reason that such systems must fall into the lower evaluation classes.

On the other hand, those systems that are designed and engineered to support
the TCB concepts are more amenable to analysis and structured testing. Formal
methods can be used to analyze the correctness of their reference validation
mechanisms in enforcing the system’s security policy. Other methods,
including less-formal arguments, can be used in order to substantiate claims
for the completeness of their access mediation and their degree of
tamper-resistance. More confidence can be placed in the results of this
analysis and in the thoroughness of the structured testing than can be placed
in the results for less methodically structured systems. For these reasons,
it appears reasonable to conclude that these systems could be used in
higher-risk environments. Successful implementations of such systems would be
placed in the higher evaluation classes.

6.5 The Classes

It is highly desirable that there be only a small number of overall evaluation
classes. Three major divisions have been identified in the evaluation
criteria with a fourth division reserved for those systems that have been
evaluated and found to offer unacceptable security protection. Within each
major evaluation division, it was found that “intermediate” classes of trusted
system design and development could meaningfully be defined. These
intermediate classes have been designated in the criteria because they
identify systems that:

* are viewed to offer significantly better protection and assurance
than would systems that satisfy the basic requirements for their
evaluation class; and

* are believed capable of eventually being evolved to satisfy
the requirements for the next higher evaluation class.

Except within division A, it is not anticipated that additional “intermediate”
evaluation classes satisfying the two characteristics described above will be
identified.

Distinctions in terms of system architecture, security policy enforcement, and
evidence of credibility between evaluation classes have been defined such that
the “jump” between evaluation classes would require a considerable investment
of effort on the part of implementors. Correspondingly, systems in the higher
evaluation classes are expected to be exposed to environments of significantly
greater risk.

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security requirements and Section 5
presents the control objectives for Trusted Computer Systems. They are
general requirements, useful and necessary, for the development of all secure
systems. However, when designing systems that will be used to process
classified or other sensitive information, functional requirements for meeting
the Control Objectives become more specific. There is a large body of policy
laid down in the form of Regulations, Directives, Presidential Executive
Orders, and OMB Circulars that form the basis of the procedures for the
handling and processing of Federal information in general and classified
information specifically. This section presents pertinent excerpts from these
policy statements and discusses their relationship to the Control Objectives.

7.1 Established Federal Policies

A significant number of computer security policies and associated requirements
have been promulgated by Federal government elements. The interested reader
is referred to reference [32] which analyzes the need for trusted systems in
the civilian agencies of the Federal government, as well as in state and local
governments and in the private sector. This reference also details a number
of relevant Federal statutes, policies and requirements not treated further
below.

Security guidance for Federal automated information systems is provided by the
Office of Management and Budget. Two specifically applicable Circulars have
been issued. OMB Circular No. A-71, Transmittal Memorandum No. 1, “Security
of Federal Automated Information Systems,”[26] directs each executive agency
to establish and maintain a computer security program. It makes the head of
each executive branch department and agency responsible “for assuring an
adequate level of security for all agency data whether processed in-house or
commercially. This includes responsibility for the establishment of physical,
administrative and technical safeguards required to adequately protect
personal, proprietary or other sensitive data not subject to national security
regulations, as well as national security data.”[26, para. 4 p. 2]

OMB Circular No. A-123, “Internal Control Systems,”[27] issued to help
eliminate fraud, waste, and abuse in government programs, requires: (a) agency
heads to issue internal control directives and assign responsibility, (b)
managers to review programs for vulnerability, and (c) managers to perform
periodic reviews to evaluate strengths and update controls. Soon after
promulgation of OMB Circular A-123, the relationship of its internal control
requirements to building secure computer systems was recognized.[4] While not
stipulating computer controls specifically, the definition of Internal
Controls in A-123 makes it clear that computer systems are to be included:

“Internal Controls – The plan of organization and all of the methods and
measures adopted within an agency to safeguard its resources, assure the
accuracy and reliability of its information, assure adherence to
applicable laws, regulations and policies, and promote operational
economy and efficiency.”[27, sec. 4.C]

The matter of classified national security information processed by ADP
systems was one of the first areas given serious and extensive concern in
computer security. The computer security policy documents promulgated as a
result contain generally more specific and structured requirements than most,
keyed in turn to an authoritative basis that itself provides a rather clearly
articulated and structured information security policy. This basis, Executive
Order 12356, “National Security Information,” sets forth requirements for the
classification, declassification and safeguarding of “national security
information” per se.[14]

7.2 DoD Policies

Within the Department of Defense, these broad requirements are implemented and
further specified primarily through two vehicles: 1) DoD Regulation 5200.1-R
[7], which applies to all components of the DoD as such, and 2) DoD 5220.22-M,
“Industrial Security Manual for Safeguarding Classified Information” [11],
which applies to contractors included within the Defense Industrial Security
Program. Note that the latter transcends DoD as such, since it applies not
only to any contractors handling classified information for any DoD component,
but also to the contractors of eighteen other Federal organizations for whom
the Secretary of Defense is authorized to act in rendering industrial security
services.*

____________________________________________________________
* i.e., NASA, Commerce Department, GSA, State Department,
Small Business Administration, National Science Foundation,
Treasury Department, Transportation Department, Interior
Department, Agriculture Department, Health and Human
Services Department, Labor Department, Environmental
Protection Agency, Justice Department, U.S. Arms Control and
Disarmament Agency, Federal Emergency Management Agency,
Federal Reserve System, and U.S. General Accounting Office.
____________________________________________________________

For ADP systems, these information security requirements are further amplified
and specified in: 1) DoD Directive 5200.28 [8] and DoD Manual 5200.28-M [9],
for DoD components; and 2) Section XIII of DoD 5220.22-M [11] for contractors.
DoD Directive 5200.28, “Security Requirements for Automatic Data Processing
(ADP) Systems,” stipulates: “Classified material contained in an ADP system
shall be safeguarded by the continuous employment of protective features in
the system’s hardware and software design and configuration . . . .”[8,
sec. IV] Furthermore, it is required that ADP systems that “process, store,
or use classified data and produce classified information will, with
reasonable dependability, prevent:

a. Deliberate or inadvertent access to classified material by
unauthorized persons, and

b. Unauthorized manipulation of the computer and its associated
peripheral devices.”[8, sec. I B.3]

Requirements equivalent to these appear within DoD 5200.28-M [9] and in DoD
5220.22-M [11].

From requirements imposed by these regulations, directives and circulars, the
three components of the Security Policy Control Objective, i.e., Mandatory and
Discretionary Security and Marking, as well as the Accountability and
Assurance Control Objectives, can be functionally defined for DoD
applications. The following discussion provides further specificity in Policy
for these Control Objectives.

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking is: “Systems that are designed
to enforce a mandatory security policy must store and preserve the
integrity of classification or other sensitivity labels for all
information. Labels exported from the system must be accurate
representations of the corresponding internal sensitivity labels
being exported.”

DoD 5220.22-M, “Industrial Security Manual for Safeguarding
Classified Information,” explains in paragraph 11 the reasons for
marking information:

“Designation by physical marking, notation or other means
serves to inform and to warn the holder about the
classification designation of the information which requires
protection in the interest of national security. The degree
of protection against unauthorized disclosure which will be
required for a particular level of classification is directly
commensurate with the marking designation which is assigned
to the material.”[11]

Marking requirements are given in a number of policy statements.

Executive Order 12356 (Sections 1.5.a and 1.5.a.1) requires that
classification markings “shall be shown on the face of all
classified documents, or clearly associated with other forms of
classified information in a manner appropriate to the medium
involved.”[14]

DoD Regulation 5200.1-R (Section 1-500) requires that: “. . .
information or material that requires protection against
unauthorized disclosure in the interest of national security shall
be classified in one of three designations, namely: ‘Top Secret,’
‘Secret’ or ‘Confidential.’”[7] (By extension, for use in computer
processing, the unofficial designation “Unclassified” is used to
indicate information that does not fall under one of the other
three designations of classified information.)

DoD Regulation 5200.1-R (Section 4-304b) requires that: “ADP
systems and word processing systems employing such media shall
provide for internal classification marking to assure that
classified information contained therein that is reproduced or
generated, will bear applicable classification and associated
markings.” (This regulation provides for the exemption of certain
existing systems where “internal classification and applicable
associated markings cannot be implemented without extensive system
modifications.”[7] However, it is clear that future DoD ADP
systems must be able to provide applicable and accurate labels for
classified and other sensitive information.)

DoD Manual 5200.28-M (Section IV, 4-305d) requires the following:
“Security Labels – All classified material accessible by or within
the ADP system shall be identified as to its security
classification and access or dissemination limitations, and all
output of the ADP system shall be appropriately marked.”[9]

7.3.2 Mandatory Security

The control objective for mandatory security is: “Security
policies defined for systems that are used to process classified
or other specifically categorized sensitive information must
include provisions for the enforcement of mandatory access control
rules. That is, they must include a set of rules for controlling
access based directly on a comparison of the individual’s
clearance or authorization for the information and the
classification or sensitivity designation of the information being
sought, and indirectly on considerations of physical and other
environmental factors of control. The mandatory access control
rules must accurately reflect the laws, regulations, and general
policies from which they are derived.”

There are a number of policy statements that are related to
mandatory security.

Executive Order 12356 (Section 4.1.a) states that “a person is
eligible for access to classified information provided that a
determination of trustworthiness has been made by agency heads or
designated officials and provided that such access is essential
to the accomplishment of lawful and authorized Government
purposes.”[14]

DoD Regulation 5200.1-R (Chapter I, Section 3) defines a Special
Access Program as “any program imposing ‘need-to-know’ or access
controls beyond those normally provided for access to
Confidential, Secret, or Top Secret information. Such a program
includes, but is not limited to, special clearance, adjudication,
or investigative requirements, special designation of officials
authorized to determine ‘need-to-know’, or special lists of persons
determined to have a ‘need-to-know.’”[7, para. 1-328] This
passage distinguishes between a ‘discretionary’ determination of
need-to-know and formal need-to-know which is implemented through
Special Access Programs. DoD Regulation 5200.1-R, paragraph 7-100
describes general requirements for trustworthiness (clearance) and
need-to-know, and states that the individual with possession,
knowledge or control of classified information has final
responsibility for determining if conditions for access have been
met. This regulation further stipulates that “no one has a right
to have access to classified information solely by virtue of rank
or position.”[7, para. 7-100]

DoD Manual 5200.28-M (Section II 2-100) states that, “Personnel
who develop, test (debug), maintain, or use programs which are
classified or which will be used to access or develop classified
material shall have a personnel security clearance and an access
authorization (need-to-know), as appropriate for the highest
classified and most restrictive category of classified material
which they will access under system constraints.”[9]

DoD Manual 5220.22-M (Paragraph 3.a) defines access as “the
ability and opportunity to obtain knowledge of classified
information. An individual, in fact, may have access to
classified information by being in a place where such information
is kept, if the security measures which are in force do not
prevent him from gaining knowledge of the classified
information.”[11]

The above mentioned Executive Order, Manual, Directives and
Regulations clearly imply that a trusted computer system must
assure that the classification labels associated with sensitive
data cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate clearance to access
classified information. Also implied is the requirement that a
trusted computer system must control the flow of information so
that data from a higher classification cannot be placed in a
storage object of lower classification unless its “downgrading”
has been authorized.

7.3.3 Discretionary Security

The term discretionary security refers to a computer system’s
ability to control information on an individual basis. It stems
from the fact that even though an individual has all the formal
clearances for access to specific classified information, each
individual’s access to information must be based on a demonstrated
need-to-know. Because of this, it must be made clear that this
requirement is not discretionary in a “take it or leave it” sense.
The directives and regulations are explicit in stating that the
need-to-know test must be satisfied before access can be granted
to the classified information. The control objective for
discretionary security is: “Security policies defined for systems
that are used to process classified or other sensitive information
must include provisions for the enforcement of discretionary
access control rules. That is, they must include a consistent set
of rules for controlling and limiting access based on identified
individuals who have been determined to have a need-to-know for the
information.”

DoD Regulation 5200.1-R (Paragraph 7-100): In addition to excerpts
already provided that touch on need-to-know, this section of the
regulation stresses the need-to-know principle when it states “no
person may have access to classified information unless . . .
access is necessary for the performance of official duties.”[7]

Also, DoD Manual 5220.22-M (Section III 20.a) states that “an
individual shall be permitted to have access to classified
information only . . . when the contractor determines that access
is necessary in the performance of tasks or services essential to
the fulfillment of a contract or program, i.e., the individual has
a need-to-know.”[11]

7.4 Criteria Control Objective for Accountability

The control objective for accountability is: “Systems that are used to
process or handle classified or other sensitive information must assure
individual accountability whenever either a mandatory or discretionary
security policy is invoked. Furthermore, to assure accountability the
capability must exist for an authorized and competent agent to access and
evaluate accountability information by a secure means, within a reasonable
amount of time, and without undue difficulty.”

This control objective is supported by the following citations:

DoD Directive 5200.28 (VI.A.1) states: “Each user’s identity shall be
positively established, and his access to the system, and his activity in
the system (including material accessed and actions taken) controlled and
open to scrutiny.”[8]

DoD Manual 5200.28-M (Section V 5-100) states: “An audit log or file
(manual, machine, or a combination of both) shall be maintained as a
history of the use of the ADP System to permit a regular security review
of system activity. (e.g., The log should record security related
transactions, including each access to a classified file and the nature
of the access, e.g., logins, production of accountable classified
outputs, and creation of new classified files. Each classified file
successfully accessed [regardless of the number of individual references]
during each ‘job’ or ‘interactive session’ should also be recorded in the
audit log. Much of the material in this log may also be required to
assure that the system preserves information entrusted to it.)”[9]

DoD Manual 5200.28-M (Section IV 4-305f) states: “Where needed to assure
control of access and individual accountability, each user or specific
group of users shall be identified to the ADP System by appropriate
administrative or hardware/software measures. Such identification
measures must be in sufficient detail to enable the ADP System to provide
the user only that material which he is authorized.”[9]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(4) Maintenance of documentation on operating systems (O/S)
and all modifications thereto, and its retention for a
sufficient period of time to enable tracing of security-
related defects to their point of origin or inclusion in the
system.

. . . . . . . . . . . . . . . . .

(6) Establishment of procedures to discover, recover,
handle, and dispose of classified material improperly
disclosed through system malfunction or personnel action.

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.”[9]

DoD Manual 5220.22-M (Section XIII 111) states: “Audit Trails

a. The general security requirement for any ADP system audit
trail is that it provide a documented history of the use of
the system. An approved audit trail will permit review of
classified system activity and will provide a detailed
activity record to facilitate reconstruction of events to
determine the magnitude of compromise (if any) should a
security malfunction occur. To fulfill this basic
requirement, audit trail systems, manual, automated or a
combination of both must document significant events
occurring in the following areas of concern: (i) preparation
of input data and dissemination of output data (i.e.,
reportable interactivity between users and system support
personnel), (ii) activity involved within an ADP environment
(e.g., ADP support personnel modification of security and
related controls), and (iii) internal machine activity.

b. The audit trail for an ADP system approved to process
classified information must be based on the above three
areas and may be stylized to the particular system. All
systems approved for classified processing should contain
most if not all of the audit trail records listed below. The
contractor’s SPP documentation must identify and describe
those applicable:

1. Personnel access;

2. Unauthorized and surreptitious entry into the
central computer facility or remote terminal areas;

3. Start/stop time of classified processing indicating
pertinent systems security initiation and termination events
(e.g., upgrading/downgrading actions pursuant to paragraph
107);

4. All functions initiated by ADP system console
operators;

5. Disconnects of remote terminals and peripheral
devices (paragraph 107c);

6. Log-on and log-off user activity;

7. Unauthorized attempts to access files or programs,
as well as all open, close, create, and file destroy
actions;

8. Program aborts and anomalies including
identification information (i.e., user/program name, time
and location of incident, etc.);

9. System hardware additions, deletions and maintenance
actions;

10. Generations and modifications affecting the
security features of the system software.

c. The ADP system security supervisor or designee shall
review the audit trail logs at least weekly to assure that
all pertinent activity is properly recorded and that
appropriate action has been taken to correct any anomaly.
The majority of ADP systems in use today can develop audit
trail systems in accord with the above; however, special
systems such as weapons, communications, communications
security, and tactical data exchange and display systems,
may not be able to comply with all aspects of the above and
may require individualized consideration by the cognizant
security office.

d. Audit trail records shall be retained for a period of one
inspection cycle.”[11]

7.5 Criteria Control Objective for Assurance

The control objective for assurance is: “Systems that are used to process
or handle classified or other sensitive information must be designed to
guarantee correct and accurate interpretation of the security policy and
must not distort the intent of that policy. Assurance must be provided
that correct implementation and operation of the policy exists throughout
the system’s life-cycle.”

A basis for this objective can be found in the following sections of DoD
Directive 5200.28:

DoD Directive 5200.28 (IV.B.1) stipulates: “Generally, security of an ADP
system is most effective and economical if the system is designed
originally to provide it. Each Department of Defense Component
undertaking design of an ADP system which is expected to process, store,
use, or produce classified material shall: From the beginning of the
design process, consider the security policies, concepts, and measures
prescribed in this Directive.”[8]

DoD Directive 5200.28 (IV.C.5.a) states: “Provision may be made to permit
adjustment of ADP system area controls to the level of protection
required for the classification category and type(s) of material actually
being handled by the system, provided change procedures are developed and
implemented which will prevent both the unauthorized access to classified
material handled by the system and the unauthorized manipulation of the
system and its components. Particular attention shall be given to the
continuous protection of automated system security measures, techniques
and procedures when the personnel security clearance level of users
having access to the system changes.”[8]

DoD Directive 5200.28 (VI.A.2) states: “Environmental Control. The ADP
System shall be externally protected to minimize the likelihood of
unauthorized access to system entry points, access to classified
information in the system, or damage to the system.”[8]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(5) Supervision, monitoring, and testing, as appropriate, of
changes in an approved ADP System which could affect the
security features of the system, so that a secure system is
maintained.

. . . . . . . . . . . . . . . . .

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.

(8) Conduct of competent system ST&E, timely review of
system ST&E reports, and correction of deficiencies needed
to support conditional or final approval or disapproval of
an ADP System for the processing of classified information.

(9) Establishment, where appropriate, of a central ST&E
coordination point for the maintenance of records of
selected techniques, procedures, standards, and tests used
in the testing and evaluation of security features of ADP
Systems which may be suitable for validation and use by
other Department of Defense Components.”[9]

DoD Manual 5220.22-M (Section XIII 103a) requires: “the initial approval,
in writing, of the cognizant security office prior to processing any
classified information in an ADP system. This section requires
reapproval by the cognizant security office for major system
modifications made subsequent to initial approval. Reapprovals will be
required because of (i) major changes in personnel access requirements,
(ii) relocation or structural modification of the central computer
facility, (iii) additions, deletions or changes to main frame, storage or
input/output devices, (iv) system software changes impacting security
protection features, (v) any change in clearance, declassification, audit
trail or hardware/software maintenance procedures, and (vi) other system
changes as determined by the cognizant security office.”[11]

A major component of assurance, life-cycle assurance, is concerned with
testing ADP systems both in the development phase and during
operation. DoD Directive 5215.1 (Section F.2.C.(2)) requires
“evaluations of selected industry and government-developed trusted
computer systems against these criteria.”[10]

8.0 A GUIDELINE ON COVERT CHANNELS

A covert channel is any communication channel that can be exploited by a
process to transfer information in a manner that violates the system’s
security policy. There are two types of covert channels: storage channels and
timing channels. Covert storage channels include all vehicles that would
allow the direct or indirect writing of a storage location by one process and
the direct or indirect reading of it by another. Covert timing channels
include all vehicles that would allow one process to signal information to
another process by modulating its own use of system resources in such a way
that the change in response time observed by the second process would provide
information.

From a security perspective, covert channels with low bandwidths represent a
lower threat than those with high bandwidths. However, for many types of
covert channels, techniques used to reduce the bandwidth below a certain rate
(which depends on the specific channel mechanism and the system architecture)
also have the effect of degrading the performance provided to legitimate
system users. Hence, a trade-off between system performance and covert
channel bandwidth must be made. Because of the threat of compromise that
would be present in any multilevel computer system containing classified or
sensitive information, such systems should not contain covert channels with
high bandwidths. This guideline is intended to provide system developers with
an idea of just how high a “high” covert channel bandwidth is.

A covert channel bandwidth that exceeds a rate of one hundred (100) bits per
second is considered “high” because 100 bits per second is the approximate
rate at which many computer terminals are run. It does not seem appropriate
to call a computer system “secure” if information can be compromised at a rate
equal to the normal output rate of some commonly used device.

In any multilevel computer system there are a number of relatively
low-bandwidth covert channels whose existence is deeply ingrained in the
system design. Faced with the large potential cost of reducing the bandwidths
of such covert channels, it is felt that those with maximum bandwidths of less
than one (1) bit per second are acceptable in most application environments.
Though maintaining acceptable performance in some systems may make it
impractical to eliminate all covert channels with bandwidths of 1 or more bits
per second, it is possible to audit their use without adversely affecting
system performance. This audit capability provides the system administration
with a means of detecting — and procedurally correcting — significant
compromise. Therefore, a Trusted Computing Base should provide, wherever
possible, the capability to audit the use of covert channel mechanisms with
bandwidths that may exceed a rate of one (1) bit in ten (10) seconds.
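
The arithmetic behind these thresholds can be made concrete with a
short sketch (illustrative assumptions only): a channel that transfers
b bits per exploitation, exercised n times per second, has a bandwidth
of b * n bits per second, which can then be compared against the 100,
1, and 0.1 bit-per-second figures given above.

    # Illustrative sketch of the guideline's thresholds (assumed scenarios only).
    HIGH = 100.0        # bits/second: considered "high" by this guideline
    ACCEPTABLE = 1.0    # bits/second: below this, acceptable in most environments
    AUDIT_FLOOR = 0.1   # bits/second: one bit in ten seconds; auditable above this

    def bandwidth(bits_per_exploitation, exploitations_per_second):
        return bits_per_exploitation * exploitations_per_second

    scenarios = {
        "status bit toggled 500 times/second": bandwidth(1, 500),   # 500.00 b/s
        "status bit toggled every 2 seconds": bandwidth(1, 0.5),    #   0.50 b/s
        "status bit toggled every 20 seconds": bandwidth(1, 0.05),  #   0.05 b/s
    }
    for name, bw in scenarios.items():
        print("%-38s %7.2f b/s  high=%s  acceptable=%s  audit=%s"
              % (name, bw, bw > HIGH, bw < ACCEPTABLE, bw > AUDIT_FLOOR))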

The covert channel problem has been addressed by a number of authors. The
interested reader is referred to references [5], [6], [19], [21], [22], [23],
and [29].

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement includes a capability to support an
unspecified number of hierarchical classifications and an unspecified number
of non-hierarchical categories at each hierarchical level. To encourage
consistency and portability in the design and development of National
Security Establishment trusted computer systems, it is desirable for all such
systems to be able to support a minimum number of levels and categories. The
following suggestions are provided for this purpose:

* The number of hierarchical classifications should be greater than or
equal to eight (8).

* The number of non-hierarchical categories should be greater than or
equal to twenty-nine (29).
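
One possible encoding that meets both suggestions, offered only as an
illustration (the criteria do not prescribe a representation), packs a
3-bit hierarchical level (eight levels) and 29 category bits into a
single 32-bit word, with dominance reduced to an integer comparison and
a bitmask test:

    # Illustrative encoding only (not prescribed by the criteria):
    # 3 bits of hierarchical level plus 29 category bits in one 32-bit word.
    NUM_LEVELS = 8         # suggested minimum number of hierarchical classifications
    NUM_CATEGORIES = 29    # suggested minimum number of non-hierarchical categories
    CATEGORY_MASK = (1 << NUM_CATEGORIES) - 1

    def pack(level, categories):
        assert 0 <= level < NUM_LEVELS
        assert all(0 <= c < NUM_CATEGORIES for c in categories)
        mask = 0
        for c in categories:
            mask |= 1 << c
        return (level << NUM_CATEGORIES) | mask

    def dominates(a, b):
        """a dominates b: a's level is at least b's and a's categories include b's."""
        level_a, cats_a = a >> NUM_CATEGORIES, a & CATEGORY_MASK
        level_b, cats_b = b >> NUM_CATEGORIES, b & CATEGORY_MASK
        return level_a >= level_b and (cats_a & cats_b) == cats_b

    secret_nato = pack(2, {5})   # level 2 with category bit 5 set
    secret = pack(2, set())
    assert dominates(secret_nato, secret) and not dominates(secret, secret_nato)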

10.0 A GUIDELINE ON SECURITY TESTING

These guidelines are provided to give an indication of the extent and
sophistication of testing undertaken by the DoD Computer Security Center
during the Formal Product Evaluation process. Organizations wishing to use
“Department of Defense Trusted Computer System Evaluation Criteria” for
performing their own evaluations may find this section useful for planning
purposes.

As in Part I, highlighting is used to indicate changes in the guidelines from
the next lower division.

10.1 Testing for Division C

10.1.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent. Team members shall be able to follow test plans
prepared by the system developer and suggest additions, shall
be familiar with the “flaw hypothesis” or equivalent security
testing methodology, and shall have assembly level programming
experience. Before testing begins, the team members shall have
functional knowledge of, and shall have completed the system
developer’s internals course for, the system being evaluated.

10.1.2 Testing

The team shall have “hands-on” involvement in an independent run
of the tests used by the system developer. The team shall
independently design and implement at least five system-specific
tests in an attempt to circumvent the security mechanisms of the
system. The elapsed time devoted to testing shall be at least
one month and need not exceed three months. There shall be no
fewer than twenty hands-on hours spent carrying out system
developer-defined tests and test team-defined tests.

10.2 Testing for Division B

10.2.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent and at least one individual with a master’s degree in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall have previously
completed a security test on another system.

10.2.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least fifteen system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least two months and need not exceed four months.
There shall be no fewer than thirty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

10.3 Testing for Division A

10.3.1 Personnel

The security testing team shall consist of at least one
individual with a bachelor’s degree in Computer Science or the
equivalent and at least two individuals with master’s degrees in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall be familiar enough
with the system hardware to understand the maintenance diagnostic
programs and supporting hardware documentation. At least two
team members shall have previously completed a security test on
another system. At least one team member shall have
demonstrated system level programming competence on the system
under test to a level of complexity equivalent to adding a device
driver to the system.

10.3.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least twenty-five system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least three months and need not exceed six months.
There shall be no fewer than fifty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

APPENDIX A

Commercial Product Evaluation Process

“Department of Defense Trusted Computer System Evaluation Criteria” forms the
basis upon which the Computer Security Center will carry out the commercial
computer security evaluation process. This process is focused on commercially
produced and supported general-purpose operating system products that meet the
needs of government departments and agencies. The formal evaluation is aimed
at “off-the-shelf” commercially supported products and is completely divorced
from any consideration of overall system performance, potential applications,
or particular processing environments. The evaluation provides a key input to
a computer system security approval/accreditation. However, it does not
constitute a complete computer system security evaluation. A complete study
(e.g., as in reference [18]) must consider additional factors dealing with the
system in its unique environment, such as its proposed security mode of
operation, specific users, applications, data sensitivity, physical and
personnel security, administrative and procedural security, TEMPEST, and
communications security.

The product evaluation process carried out by the Computer Security Center has
three distinct elements:

* Preliminary Product Evaluation – An informal dialogue between a vendor
and the Center in which technical information is exchanged to create a
common understanding of the vendor’s product, the criteria, and the
rating that may be expected to result from a formal product evaluation.

* Formal Product Evaluation – A formal evaluation, by the Center, of a
product that is available to the DoD, and that results in that product
and its assigned rating being placed on the Evaluated Products List.

* Evaluated Products List – A list of products that have been subjected
to formal product evaluation and their assigned ratings.

PRELIMINARY PRODUCT EVALUATION

Since it is generally very difficult to add effective security measures late
in a product’s life cycle, the Center is interested in working with system
vendors in the early stages of product design. A preliminary product
evaluation allows the Center to consult with computer vendors on computer
security issues found in products that have not yet been formally announced.

A preliminary evaluation is typically initiated by computer system vendors who
are planning new computer products that feature security or major
security-related upgrades to existing products. After an initial meeting
between the vendor and the Center, appropriate non-disclosure agreements are
executed that require the Center to maintain the confidentiality of any
proprietary information disclosed to it. Technical exchange meetings follow
in which the vendor provides details about the proposed product (particularly
its internal designs and goals) and the Center provides expert feedback to the
vendor on potential computer security strengths and weaknesses of the vendor’s
design choices, as well as relevant interpretation of the criteria. The
preliminary evaluation is typically terminated when the product is completed
and ready for field release by the vendor. Upon termination, the Center
prepares a wrap-up report for the vendor and for internal distribution within
the Center. Those reports containing proprietary information are not
available to the public.

During preliminary evaluation, the vendor is under no obligation to actually
complete or market the potential product. The Center is, likewise, not
committed to conduct a formal product evaluation. A preliminary evaluation
may be terminated by either the Center or the vendor when one notifies the
other, in writing, that it is no longer advantageous to continue the
evaluation.

FORMAL PRODUCT EVALUATION

The formal product evaluation provides a key input to certification of a
computer system for use in National Security Establishment applications and is
the sole basis for a product being placed on the Evaluated Products List.

A formal product evaluation begins with a request by a vendor for the Center
to evaluate a product for which the product itself and accompanying
documentation needed to meet the requirements defined by this publication are
complete. Non-disclosure agreements are executed and a formal product
evaluation team is formed by the Center. An initial meeting is then held with
the vendor to work out the schedule for the formal evaluation. Since testing
of the implemented product forms an important part of the evaluation process,
access by the evaluation team to a working version of the system is negotiated
with the vendor. Additional support required from the vendor includes
complete design documentation, source code, and access to vendor personnel who
can answer detailed questions about specific portions of the product. The
evaluation team tests the product against each requirement, making any
necessary interpretations of the criteria with respect to the product being
evaluated.

The evaluation team writes a two-part final report on its findings about the
system. The first part is publicly available (containing no proprietary
information) and contains the overall class rating assigned to the system and
the details of the evaluation team’s findings when comparing the product
against the evaluation criteria. The second part of the evaluation report
contains vulnerability analyses and other detailed information supporting the
rating decision. Since this part may contain proprietary or other sensitive
information, it will be distributed only within the U.S. Government on a
strict need-to-know and non-disclosure basis, and to the vendor. No portion
of the evaluation results will be withheld from the vendor.

APPENDIX B

Summary of Evaluation Criteria Divisions

The divisions of systems recognized under the trusted computer system
evaluation criteria are as follows. Each division represents a major
improvement in the overall confidence one can place in the system to protect
classified and other sensitive information.

Division (D): Minimal Protection

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

Division (C): Discretionary Protection

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

Division (B): Mandatory Protection

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

Division (A): Verified Protection

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

APPENDIX C

Summary of Evaluation Criteria Classes

The classes of systems recognized under the trusted computer system evaluation
criteria are as follows. They are presented in the order of increasing
desirability from a computer security point of view.

Class (D): Minimal Protection

This class is reserved for those systems that have been evaluated but that
fail to meet the requirements for a higher evaluation class.

Class (C1): Discretionary Security Protection

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.

Class (C2): Controlled Access Protection

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.

Class (B1): Labeled Security Protection

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed.

Class (B2): Structured Protection

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration.

Class (B3): Security Domains

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration.

Class (A1): Verified Design

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design. In
keeping with the extensive design and development analysis of the TCB required
of systems in class (A1), more stringent configuration management is required
and procedures are established for securely distributing the system to sites.
A system security administrator is supported.

APPENDIX D

Requirement Directory

This appendix lists requirements defined in “Department of Defense Trusted
Computer System Evaluation Criteria” alphabetically rather than by class. It
is provided to assist in following the evolution of a requirement through the
classes. For each requirement, three types of criteria may be present. Each
will be preceded by the word: NEW, CHANGE, or ADD to indicate the following:

NEW: Any criteria appearing in a lower class are superseded
by the criteria that follow.

CHANGE: The criteria that follow have appeared in a lower class
but are changed for this class. Highlighting is used
to indicate the specific changes to previously stated
criteria.

ADD: The criteria that follow have not been required for any
lower class, and are added in this class to the
previously stated criteria for this requirement.

Abbreviations are used as follows:

NR: (No Requirement) This requirement is not included in
this class.

NAR: (No Additional Requirements) This requirement does not
change from the previous class.

The reader is referred to Part I of this document when placing new criteria
for a requirement into the complete context for that class.

Figure 1 provides a pictorial summary of the evolution of requirements through
the classes.

Audit

C1: NR.

C2: NEW: The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit trail of
accesses to the objects it protects. The audit data shall be
protected by the TCB so that read access to it is limited to those
who are authorized for audit data. The TCB shall be able to record
the following types of events: use of identification and
authentication mechanisms, introduction of objects into a user’s
address space (e.g., file open, program initiation), deletion of
objects, and actions taken by computer operators and system
administrators and/or system security officers. For each recorded
event, the audit record shall identify: date and time of the event,
user, type of event, and success or failure of the event. For
identification/authentication events the origin of request (e.g.,
terminal ID) shall be included in the audit record. For events that
introduce an object into a user’s address space and for object
deletion events the audit record shall include the name of the object.
The ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity.

B1: CHANGE: For events that introduce an object into a user’s address
space and for object deletion events the audit record shall include
the name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the actions
of any one or more users based on individual identity and/or object
security level.

ADD: The TCB shall also be able to audit any override of
human-readable output markings.

B2: ADD: The TCB shall be able to audit the identified events that may be
used in the exploitation of covert storage channels.

B3: ADD: The TCB shall contain a mechanism that is able to monitor the
occurrence or accumulation of security auditable events that may
indicate an imminent violation of security policy. This mechanism
shall be able to immediately notify the security administrator when
thresholds are exceeded.

A1: NAR.
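
The C2 and B1 audit criteria above enumerate the fields an audit record
must carry and call for selective auditing by user identity or object
security level. The following sketch is illustrative only and is not part
of the criteria; the field names and the selection function are
assumptions chosen for clarity.

     from dataclasses import dataclass
     from datetime import datetime
     from typing import List, Optional, Set

     @dataclass
     class AuditRecord:
         # Fields required for every recorded event at class C2.
         timestamp: datetime             # date and time of the event
         user: str                       # individual user identity
         event_type: str                 # e.g., "login", "file_open", "object_delete"
         success: bool                   # success or failure of the event
         origin: Optional[str] = None    # e.g., terminal ID, for I&A events
         object_name: Optional[str] = None   # for object introduction/deletion
         object_level: Optional[str] = None  # object security level (added at B1)

     def select_records(trail: List[AuditRecord],
                        users: Optional[Set[str]] = None,
                        levels: Optional[Set[str]] = None) -> List[AuditRecord]:
         # Selective audit: filter the protected audit trail by one or more
         # user identities and/or object security levels (B1).
         return [r for r in trail
                 if (users is None or r.user in users)
                 and (levels is None or r.object_level in levels)]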

Configuration Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: During development and maintenance of the TCB, a configuration
management system shall be in place that maintains control of changes
to the descriptive top-level specification, other design data,
implementation documentation, source code, the running version of the
object code, and test fixtures and documentation. The configuration
management system shall assure a consistent mapping among all
documentation and code associated with the current version of the TCB.
Tools shall be provided for generation of a new version of the TCB
from source code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in order to
ascertain that only the intended changes have been made in the code
that will actually be used as the new version of the TCB.

B3: NAR.

A1: CHANGE: During the entire life-cycle, i.e., during the design,
development, and maintenance of the TCB, a configuration management
system shall be in place for all security-relevant hardware, firmware,
and software that maintains control of changes to the formal model,
the descriptive and formal top-level specifications, other design
data, implementation documentation, source code, the running version
of the object code, and test fixtures and documentation. Also
available shall be tools, maintained under strict configuration
control, for comparing a newly generated version with the previous
TCB version in order to ascertain that only the intended changes have
been made in the code that will actually be used as the new version
of the TCB.

ADD: A combination of technical, physical, and procedural safeguards
shall be used to protect from unauthorized modification or
destruction the master copy or copies of all material used to
generate the TCB.
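
One way to satisfy the requirement above for tools that compare a newly
generated TCB version with its predecessor is to record a cryptographic
digest of every file in each build and report any differences. The sketch
below is a minimal illustration of that idea, not a prescribed tool; the
directory layout and choice of hash are assumptions.

     import hashlib
     from pathlib import Path

     def build_manifest(build_dir: str) -> dict:
         # Map each file in a generated TCB build to a SHA-256 digest.
         manifest = {}
         for path in sorted(Path(build_dir).rglob("*")):
             if path.is_file():
                 rel = str(path.relative_to(build_dir))
                 manifest[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
         return manifest

     def compare_builds(previous: dict, current: dict) -> dict:
         # Report files added, removed, or changed so reviewers can confirm
         # that only the intended changes appear in the new TCB version.
         return {
             "added":   sorted(set(current) - set(previous)),
             "removed": sorted(set(previous) - set(current)),
             "changed": sorted(f for f in set(previous) & set(current)
                               if previous[f] != current[f]),
         }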

Covert Channel Analysis

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The system developer shall conduct a thorough search for covert
storage channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth of
each identified channel. (See the Covert Channels Guideline section.)

B3: CHANGE: The system developer shall conduct a thorough search for
covert channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth
of each identified channel.

A1: ADD: Formal methods shall be used in the analysis.
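
The bandwidth determination called for at B2 and above may be made by
engineering estimation. A common first-order estimate treats each
observable state change of the shared resource as one symbol and divides
the information carried per symbol by the time needed to modulate and
sense it. The arithmetic below is a sketch under that assumption, not a
mandated method.

     import math

     def estimated_bandwidth(distinct_states: int,
                             seconds_per_transition: float) -> float:
         # Bits conveyed per observed state change of the shared resource,
         # divided by the time required to set and sense one change.
         return math.log2(distinct_states) / seconds_per_transition

     # Example: a shared two-state flag that can be toggled and sensed once
     # every 0.1 second yields an estimate of 10 bits per second.
     print(estimated_bandwidth(2, 0.1))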

Design Documentation

C1: NEW: Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation of how
this philosophy is translated into the TCB. If the TCB is composed
of distinct modules, the interfaces between these modules shall be
described.

C2: NAR.

B1: ADD: An informal or formal description of the security policy model
enforced by the TCB shall be available and an explanation provided to
show that it is sufficient to enforce the security policy. The
specific TCB protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.

B2: CHANGE: The interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the TCB
shall be available and proven sufficient to enforce the security
policy.

ADD: The descriptive top-level specification (DTLS) shall be shown to
be an accurate description of the TCB interface. Documentation shall
describe how the TCB implements the reference monitor concept and
give an explanation why it is tamperproof, cannot be bypassed, and is
correctly implemented. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the exploitation
of known covert storage channels shall be identified. The bandwidths
of known covert storage channels, the use of which is not detectable
by the auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

B3: ADD: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the DTLS.
The elements of the DTLS shall be shown, using informal techniques,
to correspond to the elements of the TCB.

A1: CHANGE: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the formal
top-level specification (FTLS). The elements of the FTLS shall be
shown, using informal techniques, to correspond to the elements of
the TCB.

ADD: Hardware, firmware, and software mechanisms not dealt with in
the FTLS but strictly internal to the TCB (e.g., mapping registers,
direct memory access I/O) shall be clearly described.

Design Specification and Verification

C1: NR.

C2: NR.

B1: NEW: An informal or formal model of the security policy supported by
the TCB shall be maintained that is shown to be consistent with its
axioms.

B2: CHANGE: A formal model of the security policy supported by the TCB
shall be maintained that is proven consistent with its axioms.

ADD: A descriptive top-level specification (DTLS) of the TCB shall be
maintained that completely and accurately describes the TCB in terms
of exceptions, error messages, and effects. It shall be shown to be
an accurate description of the TCB interface.

B3: ADD: A convincing argument shall be given that the DTLS is consistent
with the model.

A1: CHANGE: The FTLS shall be shown to be an accurate description of the
TCB interface. A convincing argument shall be given that the DTLS is
consistent with the model and a combination of formal and informal
techniques shall be used to show that the FTLS is consistent with the
model.

ADD: A formal top-level specification (FTLS) of the TCB shall be
maintained that accurately describes the TCB in terms of exceptions,
error messages, and effects. The DTLS and FTLS shall include those
components of the TCB that are implemented as hardware and/or
firmware if their properties are visible at the TCB interface. This
verification evidence shall be consistent with that provided within
the state-of-the-art of the particular Computer Security Center-
endorsed formal specification and verification system used. Manual
or other mapping of the FTLS to the TCB source code shall be
performed to provide evidence of correct implementation.

Device Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support the assignment of minimum and maximum
security levels to all attached physical devices. These security
levels shall be used by the TCB to enforce constraints imposed by
the physical environments in which the devices are located.

B3: NAR.

A1: NAR.

Discretionary Access Control

C1: NEW: The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing of
those objects by named individuals or defined groups or both.

C2: CHANGE: The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups of
individuals, or by both.

ADD: The discretionary access control mechanism shall, either by explicit
user action or by default, provide that objects are protected from
unauthorized access. These access controls shall be capable of
including or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing access
permission shall only be assigned by authorized users.

B1: NAR.

B2: NAR.

B3: CHANGE: The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects. These
access controls shall be capable of specifying, for each named
object, a list of named individuals and a list of groups of named
individuals with their respective modes of access to that object.

ADD: Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of named
individuals for which no access to the object is to be given.

A1: NAR.
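
At class (B3) the discretionary mechanism must be able to list, for each
named object, the individuals and groups granted each mode of access and
also those for which access is explicitly denied. A minimal sketch of such
an access control list check follows; giving denials precedence over
grants is an assumption made for illustration.

     from dataclasses import dataclass, field
     from typing import Dict, Set

     @dataclass
     class ACL:
         # Per-object access control list: users/groups allowed per access
         # mode, plus explicit deny lists for users and groups (B3).
         allowed_users: Dict[str, Set[str]] = field(default_factory=dict)
         allowed_groups: Dict[str, Set[str]] = field(default_factory=dict)
         denied_users: Set[str] = field(default_factory=set)
         denied_groups: Set[str] = field(default_factory=set)

     def dac_permits(acl: ACL, user: str, groups: Set[str], mode: str) -> bool:
         # An explicit denial of the user, or of any group the user belongs
         # to, overrides any grant; otherwise a matching grant is required.
         if user in acl.denied_users or groups & acl.denied_groups:
             return False
         return (user in acl.allowed_users.get(mode, set())
                 or bool(groups & acl.allowed_groups.get(mode, set())))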

Exportation of Labeled Information

C1: NR.

C2: NR.

B1: NEW: The TCB shall designate each communication channel and I/O
device as either single-level or multilevel. Any change in this
designation shall be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit any change in the
current security level associated with a single-level communication
channel or I/O device.

B2: NAR.

B3: NAR.

A1: NAR.

Exportation to Multilevel Devices

C1: NR.

C2: NR.

B1: NEW: When the TCB exports an object to a multilevel I/O device, the
sensitivity label associated with that object shall also be exported
and shall reside on the same physical medium as the exported
information and shall be in the same form (i.e., machine-readable or
human-readable form). When the TCB exports or imports an object over
a multilevel communication channel, the protocol used on that channel
shall provide for the unambiguous pairing between the sensitivity
labels and the associated information that is sent or received.

B2: NAR.

B3: NAR.

A1: NAR.
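
The requirement above that exports over a multilevel channel provide an
unambiguous pairing between sensitivity labels and data can be met with a
simple framing protocol. The sketch below length-prefixes a
machine-readable label and the object data so that neither field can run
into the other; the framing format is an assumption chosen for
illustration.

     import struct

     def frame(label: bytes, data: bytes) -> bytes:
         # Each field is preceded by its 4-byte length so the label and the
         # associated information stay unambiguously paired on the channel.
         return (struct.pack("!I", len(label)) + label
                 + struct.pack("!I", len(data)) + data)

     def unframe(message: bytes):
         label_len = struct.unpack_from("!I", message, 0)[0]
         label = message[4:4 + label_len]
         data_len = struct.unpack_from("!I", message, 4 + label_len)[0]
         data = message[8 + label_len:8 + label_len + data_len]
         return label, data

     # Example round trip.
     print(unframe(frame(b"SECRET//NATO", b"payload bytes")))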

Exportation to Single-Level Devices

C1: NR.

C2: NR.

B1: NEW: Single-level I/O devices and single-level communication channels
are not required to maintain the sensitivity labels of the
information they process. However, the TCB shall include a mechanism
by which the TCB and an authorized user reliably communicate to
designate the single security level of information imported or
exported via single-level communication channels or I/O devices.

B2: NAR.

B3: NAR.

A1: NAR.

Identification and Authentication

C1: NEW: The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected to
mediate. Furthermore, the TCB shall use a protected mechanism (e.g.,
passwords) to authenticate the user’s identity. The TCB shall
protect authentication data so that it cannot be accessed by any
unauthorized user.

C2: ADD: The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual ADP
system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by that
individual.

B1: CHANGE: Furthermore, the TCB shall maintain authentication data that
includes information for verifying the identity of individual users
(e.g., passwords) as well as information for determining the
clearance and authorizations of individual users. This data shall be
used by the TCB to authenticate the user’s identity and to determine
the security level and authorizations of subjects that may be created
to act on behalf of the individual user.

B2: NAR.

B3: NAR.

A1: NAR.
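
The criteria above require a protected authentication mechanism (e.g.,
passwords) at C1, unique individual identities at C2, and, at B1,
authentication data that also records each user's clearance and
authorizations. The sketch below shows one conventional way to hold such
data, storing only salted one-way hashes of passwords; the record layout
is an assumption, not part of the criteria.

     import hashlib, hmac, os
     from dataclasses import dataclass

     @dataclass
     class UserAuthRecord:
         user_id: str                          # unique individual identity (C2)
         salt: bytes
         password_hash: bytes
         clearance: str                        # e.g., "SECRET" (B1)
         categories: frozenset = frozenset()   # non-hierarchical authorizations (B1)

     def make_record(user_id, password, clearance, categories=frozenset()):
         salt = os.urandom(16)
         digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
         return UserAuthRecord(user_id, salt, digest, clearance, frozenset(categories))

     def authenticate(record: UserAuthRecord, password: str) -> bool:
         # Recompute the salted hash and compare in constant time, so the
         # stored authentication data never reveals the password itself.
         candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                         record.salt, 100_000)
         return hmac.compare_digest(candidate, record.password_hash)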

Label Integrity

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels shall accurately represent security levels of
the specific subjects or objects with which they are associated. When
exported by the TCB, sensitivity labels shall accurately and
unambiguously represent the internal labels and shall be associated
with the information being exported.

B2: NAR.

B3: NAR.

A1: NAR.

Labeling Human-Readable Output

C1: NR.

C2: NR.

B1: NEW: The ADP system administrator shall be able to specify the
printable label names associated with exported sensitivity labels.
The TCB shall mark the beginning and end of all human-readable,
paged, hardcopy output (e.g., line printer output) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. The TCB shall, by default, mark the top and bottom of
each page of human-readable, paged, hardcopy output (e.g., line
printer output) with human-readable sensitivity labels that
properly* represent the overall sensitivity of the output or that
properly* represent the sensitivity of the information on the page.
The TCB shall, by default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps, graphics) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. Any override of these marking defaults shall be
auditable by the TCB.

B2: NAR.

B3: NAR.

A1: NAR.

____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________
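
Stated as a computation, the footnoted rule says the marking on a page
carries the highest hierarchical classification of any information on that
page together with the union, and only the union, of that information's
non-hierarchical categories. A brief sketch follows; the ordering of
classification names is an assumption made for the example.

     # Illustrative ordering of hierarchical classifications, lowest to highest.
     ORDER = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

     def page_label(item_labels):
         # item_labels: (classification, category set) pairs for every piece
         # of information appearing in the output.
         items = list(item_labels)
         classification = max((c for c, _ in items), key=ORDER.index)
         categories = set().union(*(cats for _, cats in items))
         return classification, categories

     # Example: output containing SECRET {NATO} and CONFIDENTIAL {CRYPTO}
     # information is marked SECRET with the category set {CRYPTO, NATO}.
     print(page_label([("SECRET", {"NATO"}), ("CONFIDENTIAL", {"CRYPTO"})]))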

Labels

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels associated with each subject and storage
object under its control (e.g., process, file, segment, device) shall
be maintained by the TCB. These labels shall be used as the basis
for mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an authorized
user the security level of the data, and all such actions shall be
auditable by the TCB.

B2: CHANGE: Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained by
the TCB.

B3: NAR.

A1: NAR.

Mandatory Access Control

C1: NR.

C2: NR.

B1: NEW: The TCB shall enforce a mandatory access control policy over all
subjects and storage objects under its control (e.g., processes,
files, segments, devices). These subjects and objects shall be
assigned sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the labels
shall be used as the basis for mandatory access control decisions.
The TCB shall be able to support two or more such security levels.
(See the Mandatory Access Control guidelines.) The following
requirements shall hold for all accesses between subjects and objects
controlled by the TCB: A subject can read an object only if the
hierarchical classification in the subject’s security level is
greater than or equal to the hierarchical classification in the
object’s security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical categories
in the object’s security level. A subject can write an object only
if the hierarchical classification in the subject’s security level is
less than or equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

B2: CHANGE: The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O devices) that
are directly or indirectly accessible by subjects external to the TCB.
The following requirements shall hold for all accesses between all
subjects external to the TCB and all objects directly or indirectly
accessible by these subjects:

B3: NAR.

A1: NAR.
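
The B1 read and write conditions above are the simple security property
and the *-property stated in terms of security-level dominance: one level
dominates another if its hierarchical classification is at least as high
and its category set is a superset. The sketch below restates the two
checks; the classification ordering is again an illustrative assumption.

     ORDER = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

     def dominates(level_a, level_b):
         # Levels are (classification, categories) pairs: A dominates B if
         # A's classification is at least B's and A's categories include B's.
         (class_a, cats_a), (class_b, cats_b) = level_a, level_b
         return ORDER.index(class_a) >= ORDER.index(class_b) and cats_a >= cats_b

     def may_read(subject_level, object_level):
         # Simple security property: read only if the subject dominates
         # the object.
         return dominates(subject_level, object_level)

     def may_write(subject_level, object_level):
         # *-property: write only if the object dominates the subject.
         return dominates(object_level, subject_level)

     # Example: a SECRET {NATO} subject may read a CONFIDENTIAL {NATO}
     # object but may not write information down to it.
     s, o = ("SECRET", {"NATO"}), ("CONFIDENTIAL", {"NATO"})
     print(may_read(s, o), may_write(s, o))   # True False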

Object Reuse

C1: NR.

C2: NEW: When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data for
which the subject is not authorized.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.
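
The object reuse requirement is commonly met by scrubbing a storage object
before it is handed to a new subject. The fragment below sketches that
discipline for a pool of fixed-size buffers; the pool itself is a
hypothetical construct used only for illustration.

     class BufferPool:
         # A pool of fixed-size storage objects that are cleared on
         # allocation so a newly assigned buffer carries no residual data.
         def __init__(self, count: int, size: int):
             self._free = [bytearray(size) for _ in range(count)]

         def allocate(self) -> bytearray:
             buf = self._free.pop()
             buf[:] = bytes(len(buf))   # overwrite residual contents with zeros
             return buf

         def release(self, buf: bytearray) -> None:
             self._free.append(buf)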

Security Features User’s Guide

C1: NEW: A single summary, chapter, or manual in user documentation shall
describe the protection mechanisms provided by the TCB, guidelines on
their use, and how they interact with one another.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Security Testing

C1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. Testing shall
be done to assure that there are no obvious ways for an unauthorized
user to bypass or otherwise defeat the security protection mechanisms
of the TCB. (See the Security Testing guidelines.)

C2: ADD: Testing shall also include a search for obvious flaws that would
allow violation of resource isolation, or that would permit
unauthorized access to the audit or authentication data.

B1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. A team of
individuals who thoroughly understand the specific implementation of
the TCB shall subject its design documentation, source code, and
object code to thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that would permit
a subject external to the TCB to read, change, or delete data
normally denied under the mandatory or discretionary security policy
enforced by the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter a state
such that it is unable to respond to communications initiated by
other users. All discovered flaws shall be removed or neutralized
and the TCB retested to demonstrate that they have been eliminated
and that new flaws have not been introduced. (See the Security
Testing Guidelines.)

B2: CHANGE: All discovered flaws shall be corrected and the TCB retested
to demonstrate that they have been eliminated and that new flaws have
not been introduced.

ADD: The TCB shall be found relatively resistant to penetration.
Testing shall demonstrate that the TCB implementation is consistent
with the descriptive top-level specification.

B3: CHANGE: The TCB shall be found resistant to penetration.

ADD: No design flaws and no more than a few correctable
implementation flaws may be found during testing and there shall be
reasonable confidence that few remain.

A1: CHANGE: Testing shall demonstrate that the TCB implementation is
consistent with the formal top-level specification.

ADD: Manual or other mapping of the FTLS to the source code may form
a basis for penetration testing.

Subject Sensitivity Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall immediately notify a terminal user of each change
in the security level associated with that user during an interactive
session. A terminal user shall be able to query the TCB as desired
for a display of the subject’s complete sensitivity label.

B3: NAR.

A1: NAR.

System Architecture

C1: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). Resources controlled
by the TCB may be a defined subset of the subjects and objects in
the ADP system.

C2: ADD: The TCB shall isolate the resources to be protected so that they
are subject to the access control and auditing requirements.

B1: ADD: The TCB shall maintain process isolation through the provision
of distinct address spaces under its control.

B2: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). The TCB shall maintain
process isolation through the provision of distinct address spaces
under its control. The TCB shall be internally structured into well-
defined largely independent modules. It shall make effective use of
available hardware to separate those elements that are protection-
critical from those that are not. The TCB modules shall be designed
such that the principle of least privilege is enforced. Features in
hardware, such as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely: readable,
writeable). The user interface to the TCB shall be completely
defined and all elements of the TCB identified.

B3: ADD: The TCB shall be designed and structured to use a complete,
conceptually simple protection mechanism with precisely defined
semantics. This mechanism shall play a central role in enforcing the
internal structuring of the TCB and the system. The TCB shall
incorporate significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward minimizing
the complexity of the TCB and excluding from the TCB modules that are
not protection-critical.

A1: NAR.

System Integrity

C1: NEW: Hardware and/or software features shall be provided that can be
used to periodically validate the correct operation of the on-site
hardware and firmware elements of the TCB.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Test Documentation

C1: NEW: The system developer shall provide to the evaluators a document
that describes the test plan and results of the security mechanisms’
functional testing.

C2: NAR.

B1: NAR.

B2: ADD: It shall include results of testing the effectiveness of the
methods used to reduce covert channel bandwidths.

B3: NAR.

A1: ADD: The results of the mapping between the formal top-level
specification and the TCB source code shall be given.

Trusted Distribution

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NR.

A1: NEW: A trusted ADP system control and distribution facility shall be
provided for maintaining the integrity of the mapping between the
master data describing the current version of the TCB and the on-site
master copy of the code for the current version. Procedures (e.g.,
site security acceptance testing) shall exist for assuring that the
TCB software, firmware, and hardware updates distributed to a
customer are exactly as specified by the master copies.

Trusted Facility Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support separate operator and administrator
functions.

B3: ADD: The functions performed in the role of a security administrator
shall be identified. The ADP system administrative personnel shall
only be able to perform security administrator functions after taking
a distinct auditable action to assume the security administrator role
on the ADP system. Non-security functions that can be performed in
the security administration role shall be limited strictly to those
essential to performing the security role effectively.

A1: NAR.

Trusted Facility Manual

C1: NEW: A manual addressed to the ADP system administrator shall present
cautions about functions and privileges that should be controlled
when running a secure facility.

C2: ADD: The procedures for examining and maintaining the audit files as
well as the detailed audit record structure for each type of audit
event shall be given.

B1: ADD: The manual shall describe the operator and administrator
functions related to security, to include changing the
characteristics of a user. It shall provide guidelines on the
consistent and effective use of the protection features of the
system, how they interact, how to securely generate a new TCB, and
facility procedures, warnings, and privileges that need to be
controlled in order to operate the facility in a secure manner.

B2: ADD: The TCB modules that contain the reference validation mechanism
shall be identified. The procedures for secure generation of a new
TCB from source after modification of any modules in the TCB shall
be described.

B3: ADD: It shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in system
operation.

A1: NAR.

Trusted Path

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support a trusted communication path between
itself and the user for initial login and authentication. Communications
via this path shall be initiated exclusively by a user.

B3: CHANGE: The TCB shall support a trusted communication path between
itself and users for use when a positive TCB-to-user connection is
required (e.g., login, change subject security level).
Communications via this trusted path shall be activated exclusively
by a user or the TCB and shall be logically isolated and unmistakably
distinguishable from other paths.

A1: NAR.

Trusted Recovery

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NEW: Procedures and/or mechanisms shall be provided to assure that,
after an ADP system failure or other discontinuity, recovery without a
protection compromise is obtained.

A1: NAR.

(this page is reserved for Figure 1)

GLOSSARY

Access – A specific type of interaction between a subject and an object
that results in the flow of information from one to the other.

Approval/Accreditation – The official authorization that is
granted to an ADP system to process sensitive information in
its operational environment, based upon comprehensive
security evaluation of the system’s hardware, firmware, and
software security design, configuration, and implementation
and of the other system procedural, administrative,
physical, TEMPEST, personnel, and communications security
controls.

Audit Trail – A set of records that collectively provide
documentary evidence of processing used to aid in tracing
from original transactions forward to related records and
reports, and/or backwards from records and reports to their
component source transactions.

Authenticate – To establish the validity of a claimed identity.

Automatic Data Processing (ADP) System – An assembly of computer
hardware, firmware, and software configured for the purpose
of classifying, sorting, calculating, computing,
summarizing, transmitting and receiving, storing, and
retrieving data with a minimum of human intervention.

Bandwidth – A characteristic of a communication channel that is
the amount of information that can be passed through it in a
given amount of time, usually expressed in bits per second.

Bell-LaPadula Model – A formal state transition model of computer
security policy that describes a set of access control
rules. In this formal model, the entities in a computer
system are divided into abstract sets of subjects and
objects. The notion of a secure state is defined and it is
proven that each state transition preserves security by
moving from secure state to secure state; thus, inductively
proving that the system is secure. A system state is
defined to be “secure” if the only permitted access modes of
subjects to objects are in accordance with a specific
security policy. In order to determine whether or not a
specific access mode is allowed, the clearance of a subject
is compared to the classification of the object and a
determination is made as to whether the subject is
authorized for the specific access mode. The
clearance/classification scheme is expressed in terms of a
lattice. See also: Lattice, Simple Security Property, *-
Property.

Certification – The technical evaluation of a system’s security
features, made as part of and in support of the
approval/accreditation process, that establishes the extent
to which a particular computer system’s design and
implementation meet a set of specified security
requirements.

Channel – An information transfer path within a system. May also
refer to the mechanism by which the path is effected.

Covert Channel – A communication channel that allows a process to
transfer information in a manner that violates the system’s
security policy. See also: Covert Storage Channel, Covert
Timing Channel.

Covert Storage Channel – A covert channel that involves the
direct or indirect writing of a storage location by one
process and the direct or indirect reading of the storage
location by another process. Covert storage channels
typically involve a finite resource (e.g., sectors on a
disk) that is shared by two subjects at different security
levels.

Covert Timing Channel – A covert channel in which one process
signals information to another by modulating its own use of
system resources (e.g., CPU time) in such a way that this
manipulation affects the real response time observed by the
second process.

Data – Information with a specific physical representation.

Data Integrity – The state that exists when computerized data is
the same as that in the source documents and has not been
exposed to accidental or malicious alteration or
destruction.

Descriptive Top-Level Specification (DTLS) – A top-level
specification that is written in a natural language (e.g.,
English), an informal program design notation, or a
combination of the two.

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject.

Domain – The set of objects that a subject has the ability to
access.

Dominate – Security level S1 is said to dominate security level
S2 if the hierarchical classification of S1 is greater than
or equal to that of S2 and the non-hierarchical categories
of S1 include all those of S2 as a subset.

Exploitable Channel – Any channel that is useable or detectable
by subjects external to the Trusted Computing Base.

Flaw Hypothesis Methodology – A system analysis and penetration
technique where specifications and documentation for the
system are analyzed and then flaws in the system are
hypothesized. The list of hypothesized flaws is then
prioritized on the basis of the estimated probability that a
flaw actually exists and, assuming a flaw does exist, on the
ease of exploiting it and on the extent of control or
compromise it would provide. The prioritized list is used
to direct the actual testing of the system.
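
Expressed as a procedure, the methodology ranks hypothesized flaws by the
estimated probability that each exists, the ease of exploiting it, and the
extent of compromise it would provide. The sketch below uses the product
of those three estimates as the score; both the record layout and the
scoring rule are assumptions made for illustration.

     from dataclasses import dataclass

     @dataclass
     class FlawHypothesis:
         description: str
         probability: float   # estimated probability the flaw exists (0..1)
         ease: float          # estimated ease of exploitation (0..1)
         extent: float        # estimated extent of control or compromise (0..1)

     def prioritize(hypotheses):
         # Order the list so that testing effort is directed first at the
         # hypotheses most likely to yield a confirmed, significant flaw.
         return sorted(hypotheses,
                       key=lambda h: h.probability * h.ease * h.extent,
                       reverse=True)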

Flaw – An error of commission, omission, or oversight in a system
that allows protection mechanisms to be bypassed.

Formal Proof – A complete and convincing mathematical argument,
presenting the full logical justification for each proof
step, for the truth of a theorem or set of theorems. The
formal verification process uses formal proofs to show the
truth of certain properties of formal specification and for
showing that computer programs satisfy their specifications.

Formal Security Policy Model – A mathematically precise statement
of a security policy. To be adequately precise, such a
model must represent the initial state of a system, the way
in which the system progresses from one state to another,
and a definition of a “secure” state of the system. To be
acceptable as a basis for a TCB, the model must be supported
by a formal proof that if the initial state of the system
satisfies the definition of a “secure” state and if all
assumptions required by the model hold, then all future
states of the system will be secure. Some formal modeling
techniques include: state transition models, temporal logic
models, denotational semantics models, algebraic
specification models. An example is the model described by
Bell and LaPadula in reference [2]. See also: Bell-
LaPadula Model, Security Policy Model.

Formal Top-Level Specification (FTLS) – A Top-Level Specification
that is written in a formal mathematical language to allow
theorems showing the correspondence of the system
specification to its formal requirements to be hypothesized
and formally proven.

Formal Verification – The process of using formal proofs to
demonstrate the consistency (design verification) between a
formal specification of a system and a formal security
policy model or (implementation verification) between the
formal specification and its program implementation.

Functional Testing – The portion of security testing in which the
advertised features of a system are tested for correct
operation.

General-Purpose System – A computer system that is designed to
aid in solving a wide variety of problems.

Lattice – A partially ordered set for which every pair of
elements has a greatest lower bound and a least upper bound.

Least Privilege – This principle requires that each subject in a
system be granted the most restrictive set of privileges (or
lowest clearance) needed for the performance of authorized
tasks. The application of this principle limits the damage
that can result from accident, error, or unauthorized use.

Mandatory Access Control – A means of restricting access to
objects based on the sensitivity (as represented by a label)
of the information contained in the objects and the formal
authorization (i.e., clearance) of subjects to access
information of such sensitivity.

Multilevel Device – A device that is used in a manner that
permits it to simultaneously process data of two or more
security levels without risk of compromise. To accomplish
this, sensitivity labels are normally stored on the same
physical medium and in the same form (i.e., machine-readable
or human-readable) as the data being processed.

Multilevel Secure – A class of system containing information with
different sensitivities that simultaneously permits access
by users with different security clearances and needs-to-
know, but prevents users from obtaining access to
information for which they lack authorization.

Object – A passive entity that contains or receives information.
Access to an object potentially implies access to the
information it contains. Examples of objects are: records,
blocks, pages, segments, files, directories, directory
trees, and programs, as well as bits, bytes, words, fields,
processors, video displays, keyboards, clocks, printers,
network nodes, etc.

Object Reuse – The reassignment to some subject of a medium
(e.g., page frame, disk sector, magnetic tape) that
contained one or more objects. To be securely reassigned,
such media must contain no residual data from the previously
contained object(s).

Output – Information that has been exported by a TCB.

Password – A private character string that is used to
authenticate an identity.

Penetration Testing – The portion of security testing in which
the penetrators attempt to circumvent the security features
of a system. The penetrators may be assumed to use all
system design and implementation documentation, which may
include listings of system source code, manuals, and circuit
diagrams. The penetrators work under no constraints other
than those that would be applied to ordinary users.

Process – A program in execution. It is completely characterized
by a single current execution point (represented by the
machine state) and address space.

Protection-Critical Portions of the TCB – Those portions of the
TCB whose normal function is to deal with the control of
access between subjects and objects.

Protection Philosophy – An informal description of the overall
design of a system that delineates each of the protection
mechanisms employed. A combination (appropriate to the
evaluation class) of formal and informal techniques is used
to show that the mechanisms are adequate to enforce the
security policy.

Read – A fundamental operation that results only in the flow of
information from an object to a subject.

Read Access – Permission to read information.

Reference Monitor Concept – An access control concept that refers
to an abstract machine that mediates all accesses to objects
by subjects.

Resource – Anything used or consumed while performing a function.
The categories of resources are: time, information, objects
(information containers), or processors (the ability to use
information). Specific examples are: CPU time; terminal
connect time; amount of directly-addressable memory; disk
space; number of I/O requests per minute, etc.

Security Kernel – The hardware, firmware, and software elements
of a Trusted Computing Base that implement the reference
monitor concept. It must mediate all accesses, be protected
from modification, and be verifiable as correct.

Security Level – The combination of a hierarchical classification
and a set of non-hierarchical categories that represents the
sensitivity of information.

Security Policy – The set of laws, rules, and practices that
regulate how an organization manages, protects, and
distributes sensitive information.

Security Policy Model – An informal presentation of a formal
security policy model.

Security Testing – A process used to determine that the security
features of a system are implemented as designed and that
they are adequate for a proposed application environment.
This process includes hands-on functional testing,
penetration testing, and verification. See also: Functional
Testing, Penetration Testing, Verification.

Sensitive Information – Information that, as determined by a
competent authority, must be protected because its
unauthorized disclosure, alteration, loss, or destruction
will at least cause perceivable damage to someone or
something.

Sensitivity Label – A piece of information that represents the
security level of an object and that describes the
sensitivity (e.g., classification) of the data in the
object. Sensitivity labels are used by the TCB as the basis
for mandatory access control decisions.

Simple Security Property – A Bell-LaPadula security model rule
allowing a subject read access to an object only if the
security level of the subject dominates the security level
of the object.

Single-Level Device – A device that is used to process data of a
single security level at any one time. Since the device
need not be trusted to separate data of different security
levels, sensitivity labels do not have to be stored with the
data being processed.

*-Property (Star Property) – A Bell-LaPadula security model rule
allowing a subject write access to an object only if the
security level of the subject is dominated by the security
level of the object. Also known as the Confinement
Property.

Storage Object – An object that supports both read and write
accesses.

Subject – An active entity, generally in the form of a person,
process, or device that causes information to flow among
objects or changes the system state. Technically, a
process/domain pair.

Subject Security Level – A subject’s security level is equal to
the security level of the objects to which it has both read
and write access. A subject’s security level must always be
dominated by the clearance of the user the subject is
associated with.

TEMPEST – The study and control of spurious electronic signals
emitted from ADP equipment.

Top-Level Specification (TLS) – A non-procedural description of
system behavior at the most abstract level. Typically a
functional specification that omits all implementation
details.

Trap Door – A hidden software or hardware mechanism that permits
system protection mechanisms to be circumvented. It is
activated in some non-apparent manner (e.g., special
“random” key sequence at a terminal).

Trojan Horse – A computer program with an apparently or actually
useful function that contains additional (hidden) functions
that surreptitiously exploit the legitimate authorizations
of the invoking process to the detriment of security. For
example, making a “blind copy” of a sensitive file for the
creator of the Trojan Horse.

Trusted Computer System – A system that employs sufficient
hardware and software integrity measures to allow its use
for processing simultaneously a range of sensitive or
classified information.

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system — including hardware,
firmware, and software — the combination of which is
responsible for enforcing a security policy. It creates a
basic protection environment and provides additional user
services required for a trusted computer system. The
ability of a trusted computing base to correctly enforce a
security policy depends solely on the mechanisms within the
TCB and on the correct input by system administrative
personnel of parameters (e.g., a user’s clearance) related
to the security policy.

Trusted Path – A mechanism by which a person at a terminal can
communicate directly with the Trusted Computing Base. This
mechanism can only be activated by the person or the Trusted
Computing Base and cannot be imitated by untrusted software.

Trusted Software – The software portion of a Trusted Computing
Base.

User – Any person who interacts directly with a computer system.

Verification – The process of comparing two levels of system
specification for proper correspondence (e.g., security
policy model with top-level specification, TLS with source
code, or source code with object code). This process may or
may not be automated.

Write – A fundamental operation that results only in the flow of
information from a subject to an object.

Write Access – Permission to write an object.

REFERENCES

1. Anderson, J. P. Computer Security Technology Planning
Study, ESD-TR-73-51, vol. I, ESD/AFSC, Hanscom AFB,
Bedford, Mass., October 1972 (NTIS AD-758 206).

2. Bell, D. E. and LaPadula, L. J. Secure Computer Systems:
Unified Exposition and Multics Interpretation, MTR-2997
Rev. 1, MITRE Corp., Bedford, Mass., March 1976.

3. Brand, S. L. “An Approach to Identification and Audit of
Vulnerabilities and Control in Application Systems,” in
Audit and Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

4. Brand, S. L. “Data Processing and A-123,” in Proceedings of
the Computer Performance Evaluation User’s Group 18th
Meeting, C. B. Wilson, ed., NBS Special Publication
#500-95, October 1982.

5. Denning, D. E. “A Lattice Model of Secure Information
Flow,” in Communications of the ACM, vol. 19, no. 5
(May 1976), pp. 236-243.

6. Denning, D. E. Secure Information Flow in Computer Systems,
Ph.D. dissertation, Purdue Univ., West Lafayette, Ind.,
May 1975.

7. DoD 5200.1-R, Information Security Program Regulation,
August 1982.

8. DoD Directive 5200.28, Security Requirements for Automatic
Data Processing (ADP) Systems, revised April 1978.

9. DoD 5200.28-M, ADP Security Manual — Techniques and
Procedures for Implementing, Deactivating, Testing, and
Evaluating Secure Resource-Sharing ADP Systems, revised
June 1979.

10. DoD Directive 5215.1, Computer Security Evaluation Center,
25 October 1982.

11. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, January 1983.

12. DoD 5220.22-R, Industrial Security Regulation, January 1983.

13. DoD Directive 5400.11, Department of Defense Privacy
Program, 9 June 1982.

14. Executive Order 12356, National Security Information,
6 April 1982.

15. Faurer, L. D. “Keeping the Secrets Secret,” in Government
Data Systems, November – December 1981, pp. 14-17.

16. Federal Information Processing Standards Publication (FIPS
PUB) 39, Glossary for Computer Systems Security,
15 February 1976.

17. Federal Information Processing Standards Publication (FIPS
PUB) 73, Guidelines for Security of Computer
Applications, 30 June 1980.

18. Federal Information Processing Standards Publication (FIPS
PUB) 102, Guideline for Computer Security Certification
and Accreditation.

19. Lampson, B. W. “A Note on the Confinement Problem,” in
Communications of the ACM, vol. 16, no. 10 (October
1973), pp. 613-615.

20. Lee, T. M. P., et al. “Processors, Operating Systems and
Nearby Peripherals: A Consensus Report,” in Audit and
Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

21. Lipner, S. B. A Comment on the Confinement Problem, MITRE
Corp., Bedford, Mass.

22. Millen, J. K. “An Example of a Formal Flow Violation,” in
Proceedings of the IEEE Computer Society 2nd
International Computer Software and Applications
Conference, November 1978, pp. 204-208.

23. Millen, J. K. “Security Kernel Validation in Practice,” in
Communications of the ACM, vol. 19, no. 5 (May 1976),
pp. 243-250.

24. Nibaldi, G. H. Proposed Technical Evaluation Criteria for
Trusted Computer Systems, MITRE Corp., Bedford, Mass.,
M79-225, AD-A108-832, 25 October 1979.

25. Nibaldi, G. H. Specification of A Trusted Computing Base,
(TCB), MITRE Corp., Bedford, Mass., M79-228, AD-A108-
831, 30 November 1979.

26. OMB Circular A-71, Transmittal Memorandum No. 1, Security of
Federal Automated Information Systems, 27 July 1978.

27. OMB Circular A-123, Internal Control Systems, 5 November
1981.

28. Ruthberg, Z. and McKenzie, R., eds. Audit and Evaluation of
Computer Security, in NBS Special Publication #500-19,
October 1977.

29. Schaefer, M., Linde, R. R., et al. “Program Confinement in
KVM/370,” in Proceedings of the ACM National
Conference, October 1977, Seattle.

30. Schell, R. R. “Security Kernels: A Methodical Design of
System Security,” in Technical Papers, USE Inc. Spring
Conference, 5-9 March 1979, pp. 245-250.

31. Trotter, E. T. and Tasker, P. S. Industry Trusted Computer
Systems Evaluation Process, MITRE Corp., Bedford,
Mass., MTR-3931, 1 May 1980.

32. Turn, R. Trusted Computer Systems: Needs and Incentives for
Use in Government and Private Sector (AD # A103399),
Rand Corporation (R-28811-DR&E), June 1981.

33. Walker, S. T. “The Advent of Trusted Computer Operating
Systems,” in National Computer Conference Proceedings,
May 1980, pp. 655-665.

34. Ware, W. H., ed., Security Controls for Computer Systems:
Report of Defense Science Board Task Force on Computer
Security, AD # A076617/0, Rand Corporation, Santa
Monica, Calif., February 1970, reissued October 1979.

DoD STANDARD 5200.28: SUMMARY OF THE DIFFERENCES
BETWEEN IT AND CSC-STD-001-83

Note: Text which has been added or changed is indented and preceded by > sign.
Text which has been deleted is enclosed in slashes (/). “Computer Security
Center” was changed to “National Computer Security Center” throughout the
document.

The FOREWORD Section was rewritten and signed by Mr. Don Latham on
26 Dec 85. The ACKNOWLEDGEMENTS Section was updated.

The PREFACE was changed as follows:

PREFACE

The trusted computer system evaluation criteria defined in this
document classify systems into four broad hierarchical divisions
of enhanced security protection. The criteria provide a basis
for the evaluation of effectiveness of security controls built
into automatic data processing system products. The criteria
were developed with three objectives in mind: (a) to provide
users with a yardstick with which to assess the degree of trust
that can be placed in computer systems for the secure processing
of classified or other sensitive information; (b) to provide
guidance to manufacturers as to what to build into their new,
widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide
a basis for specifying security requirements in acquisition
specifications. Two types of requirements are delineated for
secure processing: (a) specific security feature requirements and
(b) assurance requirements. Some of the latter requirements
enable evaluation personnel to determine if the required features
are present and functioning as intended.

>The scope of these criteria is to be applied to
>the set of components comprising a trusted system, and is
>not necessarily to be applied to each system component
>individually. Hence, some components of a system may be
>completely untrusted, while others may be individually
>evaluated to a lower or higher evaluation class than the
>trusted product considered as a whole system. In trusted
>products at the high end of the range, the strength of the
>reference monitor is such that most of the system
>components can be completely untrusted.

Though the criteria are

>intended to be

application-independent, /it is recognized that/ the
specific security feature requirements may have to be
interpreted when applying the criteria to specific

>systems with their own functional requirements,
>applications or special environments (e.g., communications
>processors, process control computers, and embedded systems
>in general).

The underlying assurance requirements can be
applied across the entire spectrum of ADP system or
application processing environments without special
interpretation.

The SCOPE Section was changed as follows:

Scope

The trusted computer system evaluation criteria defined in this
document apply

>primarily

to /both/ trusted, commercially available
automatic data processing (ADP) systems.

>They are also applicable, as amplified below, to the
>evaluation of existing systems and to the specification of
>security requirements for ADP systems acquisition.

Included are two distinct sets of requirements: l) specific security
feature requirements; and 2) assurance requirements. The specific
feature requirements encompass the capabilities typically found
in information processing systems employing general-purpose
operating systems that are distinct from the applications programs
being supported.

>However, specific security feature requirements
>may also apply to specific systems with their own functional
>requirements, applications or special environments (e.g.,
>communications processors, process control computers, and embedded
>systems in general).

The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments
from dedicated controllers to full range multilevel secure resource
sharing systems.

Changed the Purpose Section as follows:

Purpose

As outlined in the Preface, the criteria have been developed to
serve a number of intended purposes:

To provide

>a standard

to manufacturers as to what security features to build
into their new and planned, … trust requirements

>(with particular emphasis on preventing the
>disclosure of data)

for sensitive applications.

To provide

>DoD components

with a metric with which to evaluate
the degree of trust that can be placed in …

To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the

>second

purpose for development of the criteria, i.e., providing

>DoD components

with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be
performed on a computer product from a perspective that
excludes the application environment; or, (b) it can be
done to assess whether appropriate security measures …

The latter type of evaluation, i.e., those done for the purpose
of assessing a system’s security attributes with respect to a
specific operational mission, is known as a certification
evaluation. It must be understood that the completion of a
formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific
application environment. On the contrary, the evaluation report
only provides a trusted computer system’s evaluation rating along
with supporting data describing the product system’s strengths
and weaknesses from a computer security point of view. The
system security certification and the formal
approval/accreditation procedure, done in accordance with the
applicable policies of the issuing agencies, must still be
followed before a system can be approved for use in processing or
handling classified information. [8;9]

>Designated Approving Authorities (DAAs) remain ultimately
>responsible for specifying security of systems they
>accredit.

The trusted computer system evaluation criteria will be used
directly and indirectly in the certification process. Along with
applicable policy, it will be used directly as

>technical guidance

for evaluation of the total system and for specifying system
security and certification requirements for new acquisitions. Where
a system being evaluated for certification employs a product that
has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation.
Technical data will be furnished to designers, evaluators and the
Designated Approving Authorities to support their needs for
making decisions.

2.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 2.2.1.1 as follows:

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.
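
For illustration only, and not part of the standard, the following
minimal Python sketch shows one way a discretionary access control
mechanism of the kind described above could work: access control
list entries at the granularity of a single user or a defined
group, protection by default, and a grant flag that limits
propagation of access rights. All names are hypothetical.

    # Hypothetical sketch of a discretionary access control (DAC) check.
    class ACLEntry:
        def __init__(self, subject, modes, may_grant=False):
            self.subject = subject        # a named user or a defined group
            self.modes = set(modes)       # e.g. {"read", "write"}
            self.may_grant = may_grant    # limits propagation of access rights

    class ProtectedObject:
        def __init__(self, name):
            self.name = name
            self.acl = []                 # empty by default: no access

        def check_access(self, user, groups, mode):
            # Allow only if an entry names the user, or one of the user's
            # groups, with the requested mode.
            for entry in self.acl:
                if entry.subject == user or entry.subject in groups:
                    if mode in entry.modes:
                        return True
            return False

        def grant(self, grantor, grantor_groups, subject, modes):
            # Only a subject already holding grant authority may extend
            # access, so access rights do not propagate freely.
            for entry in self.acl:
                if ((entry.subject == grantor or
                     entry.subject in grantor_groups) and entry.may_grant):
                    self.acl.append(ACLEntry(subject, modes))
                    return True
            return False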

Completely Reworded Section 2.2.1.2 as follows:

2.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.
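
For illustration only, a minimal sketch (assuming a hypothetical
fixed-size pool of storage objects) of revoking prior
authorizations and overwriting residual contents before an object
is reassigned to a new subject:

    # Hypothetical sketch of object reuse handling in a storage pool.
    class StorageObject:
        def __init__(self, size):
            self.data = bytearray(size)
            self.authorized_subjects = set()

    class StoragePool:
        def __init__(self, count, size):
            self.free = [StorageObject(size) for _ in range(count)]

        def allocate(self, subject):
            if not self.free:
                raise MemoryError("pool exhausted")
            obj = self.free.pop()
            obj.authorized_subjects.clear()     # revoke prior authorizations
            obj.data[:] = bytes(len(obj.data))  # overwrite residual contents
            obj.authorized_subjects.add(subject)
            return obj

        def release(self, obj):
            # Scrubbing at allocation time guarantees it happens before
            # the object reaches any new subject.
            self.free.append(obj)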

Reworded Section 2.2.2.2 as follows:

2.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the
actions of any one or more users based on individual
identity.
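
For illustration only, a minimal sketch of an audit record carrying
the fields called for above (date and time, user, type of event,
success or failure, origin of request for identification and
authentication events, and object name where applicable), plus
selective audit by individual identity. The field names are
hypothetical.

    # Hypothetical sketch of audit trail records and selective audit.
    import time

    def write_audit_record(log, user, event_type, success,
                           origin=None, object_name=None):
        record = {
            "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
            "user": user,
            "event": event_type,      # e.g. "login", "file_open", "delete"
            "success": success,
            "origin": origin,         # e.g. terminal ID, for I&A events
            "object": object_name,    # for object introduction or deletion
        }
        log.append(record)            # in practice, protected TCB storage

    def select_by_user(log, users):
        # Selective audit of one or more users by individual identity.
        return [r for r in log if r["user"] in users]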

Changed Section 2.2.4.3 as follows:

2.2.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.1.1.1 as follows:

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.1.1.2 as follows:

3.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.1.1.3.2 as follows:

3.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/ communication
channel or I/O device.
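
Purely as an illustration, a sketch of a device table that records
each communication channel or I/O device as single-level or
multilevel and audits any manual change to its designation or to
the level or levels associated with it; all names are hypothetical.

    # Hypothetical sketch of auditable channel/device designations.
    audit_log = []

    class Device:
        def __init__(self, name, designation, levels):
            assert designation in ("single-level", "multilevel")
            self.name = name
            self.designation = designation
            self.levels = set(levels)

        def change_designation(self, operator, new_designation):
            # The change is a manual, audited action.
            audit_log.append((operator, self.name, "designation",
                              self.designation, new_designation))
            self.designation = new_designation

        def change_levels(self, operator, new_levels):
            audit_log.append((operator, self.name, "levels",
                              sorted(self.levels), sorted(new_levels)))
            self.levels = set(new_levels)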

Appended a sentence to Section 3.1.1.4 as follows:

3.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed one sentence in Section 3.1.2.1 as follows:

3.1.2.1. Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded Section 3.1.2.2 as follows:

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

> and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level.

‘Unbolded’ the first sentence of Section 3.1.3.2.1.

Reworded Section 3.1.3.2.2 as follows:

3.1.3.2.2 Design Specification and Verification

An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

Changed sentence as follows:

3.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.2.1.1 as follows:

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.2.1.2 as follows:

3.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.2.1.3 as follows:

3.2.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.
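
For illustration only, a minimal sketch of the labeled-import step
described above: non-labeled data receives its security level from
an authorized user, and the action is recorded for audit. Names
are hypothetical.

    # Hypothetical sketch of importing non-labeled data.
    audit_log = []

    def import_unlabeled(data, supplied_level, requesting_user,
                         authorized_users):
        # The level must come from an authorized user; every attempt,
        # successful or not, is auditable.
        if requesting_user not in authorized_users:
            audit_log.append(("import_denied", requesting_user))
            raise PermissionError("user may not assign security levels")
        audit_log.append(("import", requesting_user, supplied_level))
        return {"label": supplied_level, "data": data}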

Changed Section 3.2.1.3.2 as follows:

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.2.1.4 as follows:

3.2.1.4 Mandatory Access Control

… Identification and authentication data shall be
used by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.2.2.1 as follows:

3.2.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded section 3.2.2.2 as follows:

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels.

Changed Section 3.2.3.2.2 as follows:

3.2.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its
axioms. A descriptive top-level specification
(DTLS) of the TCB shall be maintained that
completely and accurately describes the TCB in
terms of exceptions, error messages, and effects.
It shall be shown to be an accurate description of
the TCB interface.

Changed Section 3.2.4.3 as follows:

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant”:

3.2.4.4 Design Documentation

Documentation shall be available that provides a
description of the manufacturer’s philosophy of
protection and an explanation of how this philosophy is
translated into the TCB. The interfaces between the
TCB modules shall be described. A formal description
of the security policy model enforced by the TCB shall
be available and proven that it is sufficient to
enforce the security policy. The specific TCB
protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.
The descriptive top-level specification (DTLS) shall be
shown to be an accurate description of the TCB
interface. Documentation shall describe how the TCB
implements the reference monitor concept and give an
explanation why it is

>tamper resistant,

cannot be bypassed, and is correctly implemented.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting
the channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage
channels, the use of which is not detectable by the
auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

Changed Section 3.3.1.1 as follows:

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to limit
>propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 3.3.1.2 as follows:

3.3.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.3.1.3 as follows:

3.3.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.3.1.3.2 as follows:

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.3.1.4 as follows:

3.3.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.3.2.1 as follows:

3.3.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 3.3.2.2 as follows:

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and if the occurrence or accumulation
>of these security relevant events continues,
>the system shall take the least disruptive
>action to terminate the event.
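
As an illustrative sketch only, a monitor that counts security
auditable events per user, immediately notifies the security
administrator when a threshold is exceeded, and, if the events
continue, takes a least disruptive action such as terminating only
the offending session; the notification and termination hooks are
hypothetical.

    # Hypothetical sketch of threshold monitoring of auditable events.
    from collections import defaultdict

    class ThresholdMonitor:
        def __init__(self, threshold, notify, terminate_session):
            self.threshold = threshold
            self.counts = defaultdict(int)
            self.notify = notify                        # alert the admin
            self.terminate_session = terminate_session  # least disruptive

        def record_event(self, user, event):
            self.counts[user] += 1
            if self.counts[user] == self.threshold:
                self.notify(user, event)    # immediate notification
            elif self.counts[user] > self.threshold:
                # Events continue after notification: terminate only the
                # offending session rather than halting the whole system.
                self.terminate_session(user)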

Changed the first sentence of Section 3.3.3.2.2 as follows:

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of
>the ADP system

that is proven consistent with its axioms. …

Changed Section 3.3.4.3 as follows:

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant” in Section 3.3.4.4.

Changed Section 4.1.1.1 as follows:

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 4.1.1.2 as follows:

4.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 4.1.1.3 as follows:

4.1.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object,

>ROM)

that is directly or indirectly accessible by subjects
external to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 4.1.1.3.2 as follows:

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level

>or levels

associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 4.1.1.4 as follows:

4.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 4.1.2.1 as follows:

4.1.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 4.1.2.2 as follows:

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to
>terminate the event.

‘Unbolded’ the words “covert channels” in Section 4.1.3.1.3.

Changed the first sentence of Section 4.1.3.2.2 as follows:

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms. …

Changed Section 4.1.4.3 as follows:

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested, and

results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths. The results of the mapping between the
formal top-level specification and the TCB source code
shall be given.

Replaced “tamperproof” with “tamper resistant” in Section 4.1.4.4.

Changed the last paragraph of Section 5.1 as follows:

5.1 A Need for Consensus

A major goal of …

As described …

>The Purpose of this section is to describe in detail the
>fundamental control objectives. These objectives lay the
>foundation for the requirements outlined in the criteria.

The goal is to explain the foundations so that those outside
the National Security Establishment can assess their
universality and, by extension, the universal applicability
of the criteria requirements to processing all types of
sensitive applications whether they be for National Security
or the private sector.

Changed the second paragraph of Section 6.2 as follows:

6.2 A Formal Policy Model

Following the publication of …

>A subject can act on behalf of a user or another
>subject. The subject is created as a surrogate
>for the cleared user and is assigned a formal
>security level based on their classification.
>The state transitions and invariants of the formal
>policy model define the invariant relationships
>that must hold between the clearance of the user,
>the formal security level of any process that can
>act on the user’s behalf, and the formal security
>level of the devices and other objects to which any
>process can obtain specific modes of access.

The Bell and LaPadula model,

>for example,

defines a relationship between

>formal security levels of subjects and objects,

now referenced as the “dominance relation.” From this definition …
… Both the Simple Security Condition and the *-Property
include mandatory security provisions based on the dominance
relation between the

>formal security levels of subjects and objects.

The Discretionary Security Property …
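
For illustration only, a minimal sketch of the dominance relation
over formal security levels (a hierarchical classification plus a
set of non-hierarchical categories) and the mandatory checks
implied by the Simple Security Condition and the *-Property; the
representation chosen here is an assumption, not the model's
formal statement.

    # Hypothetical sketch of the dominance relation and the two
    # mandatory Bell-LaPadula checks.
    def dominates(level_a, level_b):
        # level_a dominates level_b if its classification is at least as
        # high and its category set is a superset.
        class_a, cats_a = level_a
        class_b, cats_b = level_b
        return class_a >= class_b and cats_a >= cats_b

    def simple_security_condition(subject_level, object_level):
        # Read access only if the subject's level dominates the object's.
        return dominates(subject_level, object_level)

    def star_property(subject_level, object_level):
        # Write access only if the object's level dominates the
        # subject's, preventing "write-downs."
        return dominates(object_level, subject_level)

    SECRET, CONFIDENTIAL = 3, 2
    assert simple_security_condition((SECRET, {"NATO"}),
                                     (CONFIDENTIAL, set()))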

Added a sentence to the end of Section 7.0:

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security
requirements and Section 5 presents the control objectives
for Trusted Computer Systems. They are general
requirements, useful and necessary, for the development of
all secure systems. However, when designing systems that
will be used to process classified or other sensitive
information, functional requirements for meeting the Control
Objectives become more specific. There is a large body of
policy laid down in the form of Regulations, Directives,
Presidential Executive Orders, and OMB Circulars that form
the basis of the procedures for the handling and processing
of Federal information in general and classified information
specifically. This section presents pertinent excerpts from
these policy statements and discusses their relationship to
the Control Objectives.

>These excerpts are examples to illustrate the relationship
>of the policies to criteria and may not be complete.

Inserted the following

>as the next to last paragraph

of Section 7.2:

>DoD Directive 5200.28 provides the security requirements for
>ADP systems. For some types of information, such as
>Sensitive Compartmented Information (SCI), DoD Directive
>5200.28 states that other minimum security requirements also
>apply. These minima are found in DCID 1/16 (new reference
>number 5) which is implemented in DIAM 50-4 (new reference
>number 6) for DoD and DoD contractor ADP systems.

From requirements imposed by …

Changed Footnote #1 referenced by Section 7.2 as follows:

Replaced “Health and Human Services Department” with “U.S.
Information Agency.”

Changed (updated) the quote from DoD 5220.22-M, Section 7.3.1, as
follows:

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking …

DoD 5220.22-M, “Industrial Security …

>”a. General. Classification designation by physical
>marking, notation or other means serves to warn and to
>inform the holder what degree of protection against
>unauthorized disclosure is required for that
>information or material.” (14)

Changed the

>last paragraph

of Section 7.5 as follows:

A major component of assurance, life-cycle assurance,

>as described in DoD Directive 7920.1,

is concerned with testing ADP systems both in the
development phase as well as during operation.

>(17)

DoD Directive 5215.1 …

Changed Section 9.0 as follows:

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement …

* The number of hierarchical classifications should be
greater than or equal to

>sixteen (16).

* The number of non-hierarchical categories should be
greater than or equal to

>sixty-four (64).
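
Purely as an illustration of these minima, a compact label
representation supporting sixteen hierarchical classifications and
sixty-four non-hierarchical categories might pack the label into a
small integer and a 64-bit category mask; the encoding is an
assumption.

    # Hypothetical sketch: 4-bit classification (16 values) plus a
    # 64-bit category bitmask (64 categories).
    def make_label(classification, categories):
        assert 0 <= classification < 16
        mask = 0
        for c in categories:
            assert 0 <= c < 64
            mask |= 1 << c
        return (classification, mask)

    def dominates(label_a, label_b):
        class_a, mask_a = label_a
        class_b, mask_b = label_b
        # Every category bit set in label_b must also be set in label_a.
        return class_a >= class_b and (mask_b & ~mask_a) == 0

    print(dominates(make_label(3, [0, 5]), make_label(2, [5])))  # True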

Completely reworded the third paragraph of Formal Product
Evaluation, in Appendix A, as follows:

Formal Product Evaluation

The formal product evaluation provides …

A formal product evaluation begins with …

>The evaluation team writes a final report on their findings about
>the system. The report is publicly available (containing no
>proprietary or sensitive information) and contains the overall
>class rating assigned to the system and the details of the
>evaluation team’s findings when comparing the product against the
>evaluation criteria. Detailed information concerning
>vulnerabilities found by the evaluation team is furnished to the
>system developers and designers as each is found so that the
>vendor has a chance to eliminate as many of them as possible
>prior to the completion of the Formal Product Evaluation.
>Vulnerability analyses and other proprietary or sensitive
>information are controlled within the Center through the
>Vulnerability Reporting Program and are distributed only within
>the U.S. Government on a strict need-to-know and non-disclosure
>basis, and to the vendor.

Changed two paragraphs in Audit (Appendix D) as follows:

C2: NEW: The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction an
audit trail of accesses to the objects it protects. The
audit data shall be protected by the TCB so that read access
to it is limited to those who are authorized for audit data.
The TCB shall be able to record the following types of
events: use of identification and authentication mechanisms,
introduction of objects into a user’s address space (e.g.,
file open, program initiation), deletion of objects, actions
taken by computer operators and system administrators and/or
system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the actions
of any one or more users based on individual identity.

B3: ADD: …when thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to terminate
>the event.

Changed one paragraph in Design Documentation (Appendix D):

B2: ADD: Change “tamperproof” to “tamper resistant.”

Changed two paragraphs in Design Specification and Verification:

B1: NEW: An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

B2: CHANGE: A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms.

Changed two paragraphs in Discretionary Access Control as follows:

C2: CHANGE: The enforcement mechanism (e.g., self/group/public
controls, access control lists) shall allow users to specify
and control sharing of those objects by named individuals,
or defined groups of individuals, or by both,

>and shall provide controls to limit propagation of access rights.

B3: CHANGE: The enforcement mechanism (e.g., access control
lists) shall allow users to specify and control sharing of
those objects,

>and shall provide controls to limit propagation of access rights.

These access controls shall be capable of specifying, for each
named object, a list of named individuals and a list of groups of
named individuals with their respective modes of access to that object.

Changed 1 paragraph in Exportation of Labeled Information:

B1: NEW: The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any change
in this designation shall be done manually and shall be
auditable by the TCB. The TCB shall maintain and be able to
audit any change in the /current/ security level

>or levels

associated with a /single-level/ communication channel or
I/O device.

Changed 1 paragraph in Identification and Authorization:

B1: CHANGE: … This data shall be used by the TCB to authenticate
the user’s identity and

>to ensure that

the security level and authorizations of subjects external to
the TCB that may be created to act on behalf of the individual
user

>are dominated by the clearance and authorization
>of that user.

Changed 1 paragraph in Labels:

B2: CHANGE: … (e.g., subject, storage object, ROM) …

Changed 1 paragraph in Mandatory Access Control:

B1: NEW: … Identification and authentication data shall be used

>by the TCB to authenticate the user’s identity and to ensure
>that the security level and authorization of subjects external
>to the TCB that may be created to act on behalf of the
>individual user are dominated by the clearance and authoriza-
>tion of that user.

Rewrote 1 paragraph in Object Reuse:

C2: NEW:
>All authorizations to the information contained
>within a storage object shall be revoked prior to initial
>assignment, allocation or reallocation to a subject from the
>TCB’s pool of unused storage objects. No information,
>including encrypted representations of information, produced
>by a prior subject’s actions is to be available to any
>subject that obtains access to an object that has been
>released back to the system.

Changed 1 paragraph in Test Documentation:

C1: NEW: The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

GLOSSARY

Changed Discretionary Access Control:

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject

>(unless restrained by mandatory access control).

Added:

Front-End Security Filter – A process that is invoked to process
data according to a specified security policy prior to
releasing the data outside the processing environment or
upon receiving data from an external source.

Granularity – The relative fineness or coarseness by which a
mechanism can be adjusted. The phrase “the granularity of
a single user” means the access control mechanism can be
adjusted to include or exclude any single user.

Read-Only Memory (ROM) – A storage area in which the contents
can be read but not altered during normal computer
processing.

Security Relevant Event – Any event that attempts to change the
security state of the system, (e.g., change discretionary
access controls, change the security level of the subject,
change user password, etc.). Also, any event that attempts
to violate the security policy of the system, (e.g., too
many attempts to login, attempts to violate the mandatory
access control limits of a device, attempts to downgrade a
file, etc.).

Changed the name of the term:

Simple Security /Property/

>Condition

– A Bell-LaPadula security model rule allowing a subject
read access to an object only if the security level of the
subject dominates the security level of the object.

Changed definition:

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system -- including hardware,
firmware, and software -- the combination of which is
responsible for enforcing a security policy.

>A TCB consists of one or more components that together enforce
>a unified security policy over a product or system.

The ability of a TCB to correctly enforce a security
policy depends solely on the mechanisms within the TCB and
on the correct input by system administrative personnel of
parameters (e.g., a user’s clearance) related to the
security policy.

REFERENCES

Added: (References were renumbered as necessary)

5. DCID 1/16, Security of Foreign Intelligence in Automated
Data Processing Systems and Networks (U), 4 January 1983.

6. DIAM 50-4, Security of Compartmented Computer Operations (U),
24 June 1980.

9. DoD Directive 5000.29, Management of Computer Resources in
Major Defense Systems, 26 April 1976.

17. DoD Directive 7920.1, Life Cycle Management of Automated
Information Systems (AIS), 17 October 1978.

Corrected dates on the following References:

14. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, March 1984.

15. DoD 5220.22-R, Industrial Security Regulation, February
1984.


The NIST Management Guide to the Protection of Information Resources

Management Guide to the Protection of Information
Resources

National Institute of Standards and Technology
The National Institute of Standards and Technology (NIST) is
responsible for developing standards, providing technical
assistance, and conducting research for computers and related
systems. These activities provide technical support to
government and industry in the effective, safe, and
economical use of computers. With the passage of the Computer
Security Act of 1987 (P.L. 100-235), NIST’s activities also
include the development of standards and guidelines needed to
assure the cost-effective security and privacy of sensitive
information in Federal computer systems. This guide represents
one activity towards the protection and management of sensitive
information resources.

Acknowledgments
This guide was written by Cheryl Helsing of Deloitte, Haskins &
Sells in conjunction with Marianne Swanson and Mary Anne Todd,
National Institute of Standards and Technology.

Executive Summary
Today computers are integral to all aspects of operations within
an organization. As Federal agencies are becoming critically
dependent upon computer information systems to carry out their
missions, the agency executives (policy makers) are recognizing
that computers and computer-related problems must be understood
and managed, the same as any other resource. They are beginning
to understand the importance of setting policies, goals, and
standards for protection of data, information, and computer
resources, and are committing resources for information security
programs. They are also learning that primary responsibility for
data security must rest with the managers of the functional areas
supported by the data.

All managers who use any type of automated information resource
system must become familiar with their agency’s policies and
procedures for protecting the information which is processed and
stored within them. Adequately secure systems deter, prevent, or
detect unauthorized disclosure, modification, or use of
information. Agency information requires protection from
intruders, as well as from employees with authorized computer
access privileges who attempt to perform unauthorized actions.
Protection is achieved not only by technical, physical and
personnel safeguards, but also by clearly articulating and
implementing agency policy regarding authorized system use to
information users and processing personnel at all levels. This
guide is one of three brochures, each designed for a specific
audience. The “Executive Guide to the Protection of
Information Resources” and the “Computer User’s Guide to the
Protection of Information Resources” complete the series.

Table of Contents

Executive Summary
Introduction
    Purpose of Guide
    The Risks
    Responsibilities
Information Systems Development
    Control Decisions
    Security Principles
    Access Decisions
    Systems Development Process
Computer Facility Management
    Physical Security
    Data Security
    Monitoring and Review
Personnel Management
    Personnel Security
    Training
For Additional Information

Introduction

Purpose of this Guide
This guide introduces information systems security concerns and
outlines the issues that must be addressed by all agency managers
in meeting their responsibilities to protect information systems
within their organizations. It describes essential components of
an effective information resource protection process that applies
to a stand-alone personal computer or to a large data processing
facility.

The Risks
Effort is required by every Federal agency to safeguard
information resources and to reduce risks to a prudent level.
The spread of computing power to individual employees via
personal computers, local-area networks, and distributed
processing has drastically changed the way we manage and control
information resources. Internal controls and control points that
were present in the past when we were dealing with manual or
batch processes have not been established in many of today’s
automated systems. Reliance upon inadequately controlled computer
systems can have serious consequences, including:

Inability or impairment of the agency’s ability to perform its
mission

Inability to provide needed services to the public

Waste, loss, misuse, or misappropriation of funds

Loss of credibility or embarrassment to an agency

To avoid these consequences, a broad set of information security
issues must be effectively and comprehensively addressed.

Responsibilities
All functional managers have a responsibility to implement the
policies and goals established by executive management for
protection of automated information resources (data, processes,
facilities, equipment, personnel, and information). Managers in
all areas of an organization are clearly accountable for the
protection of any of these resources assigned to them to enable
them to perform their duties. They are responsible for
developing, administering, monitoring, and enforcing internal
controls, including security controls, within their assigned
areas of authority. Each manager’s specific responsibilities will
vary, depending on the role that manager has with regard to
computer systems.

Portions of this document provide more detailed information on
the respective security responsibilities of managers of computer
resources, managers responsible for information systems
applications and the personnel security issues involved.
However, all agency management must strive to:

Achieve Cost-Effective Security
The dollars spent for security measures to control or contain
losses should never be more than the projected dollar loss if
something adverse happened to the information resource.
Cost-effective security results when reduction in risk through
implementation of safeguards is balanced with costs. The greater
the value of information processed, or the more severe the
consequences if something happens to it, the greater the need
for control measures to protect it.
The person who can best determine the value or importance of
data is the functional manager who is responsible for the data.
For example, the manager responsible for the agency’s budget
program is the one who should establish requirements for the
protection of the automated data which supports the program. This
manager knows better than anyone else in the organization what
the impact will be if the data is inaccurate or unavailable.
Additionally, this manager usually is the supervisor of most of
the users of the data.

It is important that these trade-offs of cost versus risk
reduction be explicitly considered, and that management
understand the degree of risk remaining after selected controls
are implemented.
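
As a simple worked illustration (all figures are hypothetical), the
trade-off can be made explicit by comparing the annual cost of a
safeguard with the reduction in expected annual loss it buys:

    # Hypothetical figures for weighing safeguard cost against risk.
    expected_loss_without = 0.10 * 500_000  # 10% chance/year of $500,000 loss
    expected_loss_with    = 0.02 * 500_000  # safeguard cuts the chance to 2%
    safeguard_annual_cost = 15_000

    risk_reduction = expected_loss_without - expected_loss_with  # $40,000
    worthwhile = safeguard_annual_cost < risk_reduction          # True here
    print(risk_reduction, worthwhile)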

Assure Operational Continuity
With ever-increasing demands for timely information and greater
volumes of information being processed, the threat of information
system disruption is a very serious one. In some cases,
interruptions of only a few hours are unacceptable. The impact
due to inability to process data should be assessed, and actions
should be taken to assure availability of those systems
considered essential to agency operation. Functional management
must identify critical computer applications and develop
contingency plans so that the probability of loss of data
processing and telecommunications support is minimized.

Maintain Integrity
Integrity of information means you can trust the data and the
processes that manipulate it. Not only does this mean that errors
and omissions are minimized, but also that the information system
is protected from deliberate actions to wrongfully change the
data. Information can be said to have integrity when it
corresponds to the expectations and assumptions of the users.

Assure Confidentiality
Confidentiality of sensitive data is often, but not always, a
requirement of agency systems. Privacy requirements for personal
information are dictated by statute, while confidentiality of
other agency information is determined by the nature of that
information, e.g., information submitted by bidders in
procurement actions. The impact of wrongful disclosure must be
considered in understanding confidentiality requirements.

Comply with Applicable Laws and Regulations
As risks and vulnerabilities associated with information systems
become better understood, the body of law and regulations
compelling positive action to protect information resources
grows. OMB Circular No. A-130, “Management of Federal
Information Resources” and Public Law 100-235, “Computer Security
Act of 1987” are two documents whose requirements provide a
baseline for an information resource security program.

Information Systems Development
This section describes the protective measures that should be
included as part of the design and development of information
processing application systems. The functional manager who is
responsible for, and will use, the information contained in the
system must ensure that security measures have been included and
are adequate. This includes applications designed for personal
computers as well as large mainframes.

Control Decisions
The official responsible for the agency function served by the
automated information system has a critical role in making
decisions regarding security and control. In the past, risk was
often unconsciously accepted when such individuals assumed the
computer facility operators were taking care of security. In
fact, there are decisions to be made and security elements to be
provided that cannot be delegated to the operator of the system.
In many cases, the user or manager alone develops and operates
the application.

The cost of control must be balanced with system efficiency and
usability issues. Risk must be evaluated and cost-effective
controls selected to provide a prudent level of control while
maximizing productivity. Controls are often closely connected
with the system function, and cannot be effectively designed
without significant understanding of the process being automated.

Security Principles
There are some common security attributes that should be present
in any system that processes valuable personal or sensitive
information. System designs should include mechanisms to enforce
the following security attributes.

Identification and Authentication of Users
Each user of a computer system should have a unique
identification on the system, such as an account number or other
user identification code. There must also be a means of verifying
that the individual claiming that identity (e.g., by typing in
that identifying code at a terminal) is really the authorized
individual and not an imposter. The most common means of
authentication is by a secret password, known only to the
authorized user.
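
As a rough illustration of this kind of check, the sketch below (in
Python) verifies a claimed identity against a stored secret. The salted
one-way hashing, the sample user table, and the function names are
illustrative assumptions, not requirements stated in this guide.

    import hashlib
    import hmac
    import os

    def make_record(password: str) -> tuple:
        """Store a random salt and a one-way hash, never the password itself."""
        salt = os.urandom(16)
        return salt, hashlib.sha256(salt + password.encode()).digest()

    # Hypothetical password store: user identification code -> (salt, hash).
    USERS = {"jdoe": make_record("correct horse battery staple")}

    def authenticate(user_id: str, password: str) -> bool:
        """Verify that the claimed identity is backed by the secret password."""
        record = USERS.get(user_id)
        if record is None:
            return False                          # unknown identification code
        salt, stored = record
        candidate = hashlib.sha256(salt + password.encode()).digest()
        return hmac.compare_digest(candidate, stored)   # constant-time compare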

Authorization Capability Enforcing the Principle of Least
Possible Privilege
Beyond ensuring that only authorized individuals can access the
system, it is also necessary to limit each user's access to
information and transaction capabilities. Each person should be
limited to only the information and transaction authority that is
required by their job responsibilities. This concept, known as
the principle of least possible privilege, is a long-standing
control practice. There should be a way to easily assign each
user just the specific access authorities needed.
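
A minimal sketch of how such narrowly assigned authorities might be
represented and checked follows; the authorization table, user names,
and object names are hypothetical.

    # Hypothetical authorization table: user ID -> set of (object, transaction)
    # pairs. Each user is granted only what the job requires.
    AUTHORIZATIONS = {
        "clerk01":   {("payroll_file", "read")},
        "manager02": {("payroll_file", "read"), ("payroll_file", "update")},
    }

    def is_authorized(user_id: str, obj: str, transaction: str) -> bool:
        """Allow an action only if this specific authority was granted."""
        return (obj, transaction) in AUTHORIZATIONS.get(user_id, set())

    # Example: a clerk may view payroll data but may not change it.
    assert is_authorized("clerk01", "payroll_file", "read")
    assert not is_authorized("clerk01", "payroll_file", "update")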

Individual Accountability
From both a control and legal point of view, it is necessary to
maintain records of the activities performed by each computer
user. The requirements for automated audit trails should be
developed when a system is designed. The information to be
recorded depends on what is significant about each particular
system. To be able to hold individuals accountable for their
actions, there must be a positive means of uniquely identifying
each computer user and a routinely maintained record of each
user’s activities.

Audit Mechanisms
Audit mechanisms detect unusual events and bring them to the
attention of management. This commonly occurs by violation
reporting or by an immediate warning to the computer system
operator. The type of alarm generated depends on the seriousness
of the event.

A common technique to detect access attempts by unauthorized
individuals is to count attempts. The security monitoring
functions of the system can automatically keep track of
unsuccessful attempts to gain access and generate an alarm if the
attempts reach an unacceptable number.
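
One way such counting might be implemented is sketched below; the
threshold of three attempts and the alarm routine are assumptions chosen
for illustration and would in practice be set by local policy.

    FAILED_ATTEMPT_THRESHOLD = 3      # assumed limit; set by local policy
    failed_attempts = {}              # user or terminal ID -> consecutive failures

    def record_access_attempt(source_id: str, success: bool) -> None:
        """Count consecutive unsuccessful attempts and alert the operator."""
        if success:
            failed_attempts[source_id] = 0
            return
        failed_attempts[source_id] = failed_attempts.get(source_id, 0) + 1
        if failed_attempts[source_id] >= FAILED_ATTEMPT_THRESHOLD:
            alert_operator(source_id, failed_attempts[source_id])

    def alert_operator(source_id: str, count: int) -> None:
        # Stand-in for a violation report or console alarm to the operator.
        print(f"ALARM: {count} consecutive failed access attempts from {source_id}")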

Performance Assurance
A basic design consideration for any information system should
be the ability to verify that the system is functioning as
intended. Systems that are developed without such design
considerations are often very difficult to independently audit or
review, leading to the possibility of unintended results or
inaccurate processing.

Recoverability
Because Federal agencies can potentially be heavily dependent on
a computer system, an important design consideration is the
ability to easily recover from troublesome events, whether minor
problems or major disruptions of the system. From a design point
of view, systems should be designed to easily recover from minor
problems, and to be either transportable to another backup
computer system or replaced by manual processes in case of major
disruption or loss of computer facility.

Access Decisions
Once the automated system is ready to use, decisions must be
made regarding access to the system and the information it
contains. For example, many individuals require the ability to
access and view data, but not the ability to change or delete
data. Even when computer systems have been designed to provide
the ability to narrowly designate access authorities, a
knowledgeable and responsible official must actually make those
access decisions. The care that is taken in this process is a
major determining factor of the level of security and control
present in the system. If sensitive data is transmitted over
unprotected lines, it can be intercepted through passive
eavesdropping. Where warranted, encrypting the files will make
the data unintelligible, and port protection devices will protect
the files from unauthorized access.
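
As one hedged illustration, the sketch below encrypts a file before
transmission using the third-party Python "cryptography" package; the
package choice, the file path, and the key handling shown are
assumptions, and any approved encryption mechanism could serve instead.

    # Minimal sketch, assuming the third-party "cryptography" package.
    from cryptography.fernet import Fernet

    def encrypt_file(path: str, key: bytes) -> bytes:
        """Return an unintelligible ciphertext of the file for transmission."""
        with open(path, "rb") as f:
            plaintext = f.read()
        return Fernet(key).encrypt(plaintext)

    def decrypt_file(ciphertext: bytes, key: bytes) -> bytes:
        """Recover the original file contents at the receiving end."""
        return Fernet(key).decrypt(ciphertext)

    key = Fernet.generate_key()   # the key itself must be protected and shared securely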

Systems Development Process
All information systems software should be developed in a
controlled and systematic manner according to agency standards.
The quality and efficiency of the data processed, and the
possible reconfiguration of the system can all be affected by an
inadequate development process. The risk of security exposures
and vulnerabilities is greatly reduced when the systems
development process is itself controlled.

Computer Facility Management
Functional managers play a critical role in assuring that agency
information resources are appropriately safeguarded. This section
describes the protective measures that should be incorporated
into the ongoing management of information resource processing
facilities. As defined in OMB Circular No. A-130, “Management of
Federal Information Resources,” the term “information technology
facility” means an organizationally defined set of personnel,
hardware, software, and physical facilities, a primary function
of which is the operation of information technology. This
section, therefore, applies to any manager who houses a personal
computer, mainframe or any other form of office system or
automated equipment.

Physical Security
Information cannot be appropriately protected unless the
facilities that house the equipment are properly protected from
physical threats and hazards. The major areas of concern are
described below.

Environmental Conditions
For many types of computer equipment, strict environmental
conditions must be maintained. Manufacturer’s specifications
should be observed for temperature, humidity, and electrical
power requirements.

Control of Media
The media upon which information is stored should be carefully
controlled. Transportable media such as tapes and cartridges
should be kept in secure locations, and accurate records kept of
the location and disposition of each. In addition, media from an
external source should be subject to a check-in process to ensure
it is from an authorized source.

Control of Physical Hazards
Each area should be surveyed for potential physical hazards.
Fire and water are two of the most damaging forces with regard to
computer systems. Opportunities for loss should be minimized by
effective fire detection and suppression mechanisms and by
planning that reduces the danger of leaks or flooding. Other physical
controls include reducing the visibility of the equipment and
strictly limiting access to the area or equipment.

Contingency Planning
Although risks can be minimized, they cannot be eliminated. When
reliance upon a computer facility or application is substantial,
some type of contingency plan should be devised to allow critical
systems to be recovered following a major disaster, such as a
fire. There are a number of alternative approaches that should be
evaluated to most cost-effectively meet the agency’s need for
continuity of service.

Configuration Management
Risk can be introduced through unofficial and unauthorized
hardware or software. Another key component of information
resource management is ensuring only authorized hardware and
software are being utilized. There are several control issues to
be addressed.

Maintaining Accurate Records
Records of hardware/software inventories, configurations, and
locations should be maintained and kept up-to-date.

Complying with Terms of Software Licenses
Especially with microcomputer software, illegal copying and
other uses in conflict with licensing agreements are concerns.
The use of software subject to licensing agreements must be
monitored to ensure it is used according to the terms of the
agreement.

Protecting Against Malicious Software and Hardware
The recent occurrences of destructive computer “viruses” point
to the need to ensure that agencies do not allow unauthorized
software to be introduced to their computer environments.
Unauthorized hardware can also contain hidden vulnerabilities.
Management should adopt a strong policy against unauthorized
hardware/software, inform personnel about the risks and
consequences of unauthorized additions to computer systems, and
develop a monitoring process to detect violations of the policy.

Data Security
Management must ensure that appropriate security mechanisms are
in place that allow responsible officials to designate access to
data according to individual computer users’ specific needs.
Security mechanisms should be sufficient to implement individual
authentication of system users, allow authorization to specific
information and transaction authorities, maintain audit trails as
specified by the responsible official, and encrypt sensitive
files if required by user management.

Monitoring and Review
A final aspect of information resource protection to be
considered is the need for ongoing management monitoring and
review. To be effective, a security program must be a continuous
effort. Ideally, ongoing processes should be adapted to include
information protection checkpoints and reviews. Information
resource protection should be a key consideration in all major
computer system initiatives.

Earlier, the need for system audit trails was discussed. Those
audit trails are useful only if management regularly reviews
exception items or unusual activities. Irregularities should be
researched and action taken when merited. Similarly, all
information-related losses and incidents should be investigated.

A positive benefit of an effective monitoring process is an
increased understanding of the degree of information-related risk
in agency operations. Without an ongoing feedback process,
management may unknowingly accept too much risk. Prudent
decisions about trade-offs between efficiency and control can
only be made with a clear understanding of the degree of inherent
risk. Every manager should ask questions and periodically review
operations to judge whether changes in the environment have
introduced new risk, and to ensure that controls are working
effectively.

Personnel Management
Managers must be aware that information security is more a
people issue than a technical issue. Personnel are a vital link
in the protection of information resources, as information is
gathered by people, entered into information resource systems by
people, and ultimately used by people. Security issues should be
addressed with regard to:
o    People who use computer systems and store information in the
     course of their normal job responsibilities
o    People who design, program, test, and implement critical or
     sensitive systems
o    People who operate computer facilities that process critical or
     sensitive data

Personnel Security
From the point of hire, individuals who will have routine access
to sensitive information resources should be subject to special
security procedures. More extensive background or reference
checks may be appropriate for such positions, and security
responsibilities should be explicitly covered in employee
orientations. Position descriptions and performance evaluations
should also explicitly reference unusual responsibilities
affecting the security of information resources.

Individuals in sensitive positions should be subject to job
rotation, and work flow should be designed in such a way as to
provide as much separation of sensitive functions as possible.
Upon decision to terminate or notice of resignation, expedited
termination or rotation to less sensitive duties for the
remainder of employment is a reasonable precaution.

Any Federal computer user who deliberately performs or attempts
to perform unauthorized activity should be subject to
disciplinary action, and such disciplinary action must be
uniformly applied throughout the agency. Any criminal activity
under Federal or state computer crime laws must be reported to
law enforcement authorities.

Training
Most information resource security problems involve people.
Problems can usually be identified in their earliest stages by
people who are attuned to the importance of information
protection issues. A strong training program will yield large
benefits in prevention and early detection of problems and
losses. To be most effective, training should be tailored to the
particular audience being addressed, e.g., executives and policy
makers; program and functional managers; IRM security and audit;
ADP management and operations; end users.

Most employees want to do the right thing, if agency
expectations are clearly communicated. Internal policies can be
enforced only if staff have been made aware of their individual
responsibilities. All personnel who access agency computer
systems should be aware of their responsibilities under agency
policy, as well as obligations under the law. Disciplinary
actions and legal penalties should be communicated.

For Additional Information

National Institute of Standards and Technology
Computer Security Program Office, A-216 Technology
Gaithersburg, MD 20899
(301) 975-5200

For further information on the management of information
resources, NIST publishes Federal Information Processing
Standards Publications (FIPS PUBS). These publications deal with
many aspects of computer security, including password usage, data
encryption, ADP risk management and contingency planning, and
computer system security certification and accreditation. A list
of current publications is available from:
Standards Processing Coordinator (ADP)
National Computer Systems Laboratory
National Institute of Standards and Technology
Technology Building, B-64
Gaithersburg, MD 20899
Phone: (301) 975-2817

The Venice Blue Book: Computer Security Subsystems (September 1988)

NCSC-TG-009 - Computer Security Subsystems
Library No. S230,512 
Version 1 
FOREWORD
This publication is issued by the National Computer Security Center (NCSC) as part of its program to promulgate technical computer security guidelines. This interpretation extends the Department of Defense Trusted Computer System Evaluation Criteria (DOD 5200.28-STD) to computer security subsystems. 
This document will be used for a period of at least one year after date of signature. During this period the NCSC will gain experience using the Computer Security Subsystem Interpretation in several subsystem evaluations. After this trial period, necessary changes to the document will be made and a revised version issued. 
Anyone wishing more information, or wishing to provide comments on the usefulness or correctness of the Computer Security Subsystem Interpretation may contact: Chief, Technical Guidelines Division, National Computer Security Center, Fort George G. Meade, MD 20755-6000, ATTN: C11. 
PATRICK R GALLAGHER, JR. 16 September 1988 
Director National Computer Security Center 
ACKNOWLEDGEMENT
Acknowledgment is extended to the members of the working group who produced this Interpretation. Members were: Michael W. Hale, National Computer Security Center (Chair); James P. Anderson; Terry Mayfield, Institute For Defense Analyses; Alfred W. Arsenault, NCSC; William Geer, NCSC; John C. Inglis, NCSC; Dennis Steinauer, National Bureau of Standards; Mario Tinto, NCSC; Grant Wagner, NCSC; and Chris Wilcox, NCSC. 
Acknowledgement is further extended to those individuals who conducted thorough reviews and provided constructive comments on this document. Reviewers included: Steve Lipner, Earl Boebert, Virgil Gligor, Debbie Downs, Len Brown, Doug Hardie, Steve Covington, Jill Sole and Bob Morris. 
1. INTRODUCTION
This document provides interpretations of the Department of Defense Trusted Computer System Evaluation Criteria (DoD 5200.28-STD or TCSEC) for computer security subsystems. A computer security subsystem (subsystem) is defined, herein, as hardware, firmware and/or software which can be added to a computer system to enhance the security of the overall system. A subsystem's primary utility is to increase the security of a computer system. The computer system that the subsystem is to protect is referred to as the protected system in this Interpretation. 
When incorporated into a system environment, evaluated computer security subsystems may be very effective in reducing or eliminating certain types of vulnerabilities whenever entire evaluated systems are unavailable or impractical. 
1.1 PURPOSE
This Interpretation has been prepared for the following purposes: 
1. to establish a standard for manufacturers as to what security features and assurance levels to build into their new and planned computer security subsystem products to provide widely available products that satisfy trust requirements for sensitive applications; 
2. to provide a metric to evaluate the degree of trust that can be placed in a subsystem for protecting classified and sensitive information; 
3. to lend consistency to evaluations of these products by explicitly stating the implications that are in the TCSEC; and 
4. to provide the security requirements for subsystems in acquisition specifications. 
1.2 BACKGROUND
The Department of Defense Trusted Computer System Evaluation Criteria (DoD 5200.28-STD or TCSEC) was developed to establish uniform DoD policy and security requirements for "trusted, commercially available, automatic data processing (ADP) systems." Evaluation criteria defined in the TCSEC provides a standard to manufacturers as to what security features to build into their commercial products to satisfy trust requirements for sensitive applications, and serves as a metric with which to evaluate the degree of trust that can be placed in a computer system for the secure processing of classified or other sensitive information. 
The TCSEC specifies a variety of features that a computer system must provide to constitute a complete security system. The security requirements specified in the TCSEC depend on and complement one another to provide the basis for effective implementation of a security policy in a trusted computer system. The effectiveness of any one security feature present within a system is, therefore, dependent to some degree on the presence and effectiveness of other security features found within the same system. Because it was intended to be used only for systems which incorporated all the security features of a particular evaluation class, the TCSEC does not, in all cases, completely specify these interdependencies among security features. 
In addition to the class of trusted system products, there exists a recognized need for a class of computer security products which may not individually meet all of the security features and assurances of the TCSEC. Instead, these products may implement some subset of the features enumerated in the TCSEC and can potentially improve the security posture in existing systems. These products are collectively known as computer security subsystems. 
Evaluation of computer security subsystems against a subset of the requirements given in the TCSEC has proven an extremely difficult task because of the implied dependencies among the various features discussed in the TCSEC. As a consequence, interpretations of these interdependencies and the relative merits of specific subsystem implementations have been highly subjective and given to considerable variation. 
This document provides interpretations of the TCSEC for computer security subsystems in an effort to lend consistency to evaluations of these products by explicitly stating the implications in the TCSEC. 
Evaluations can be divided into two types: (1) a product evaluation can be performed on a subsystem from a perspective that excludes the application environment, or (2) a certification evaluation can be done to assess whether appropriate security measures have been taken to permit an entire system to be used operationally in a specific environment. The product evaluation type is done by the National Computer Security Center (NCSC) through the Trusted Product Evaluation Process using this interpretation for subsystems. The certification type of evaluation is done in support of a formal accreditation for a system to operate in a specific environment using the TCSEC. 
1.3 SCOPE
This document interprets the security feature, assurance and documentation requirements of the TCSEC for subsystem evaluations. In this interpretation, the functional requirements of the TCSEC are divided into four general categories: 
1. Discretionary Access Control (DAC) 
2. Object Reuse (OR). 
3. Identification and Authentication (I&A) 
4. Audit (AUD) 
These categories form the basis for classifying products to be evaluated as computer security subsystems. 
The document, in addition to this introductory section, is organized into three major sections and a glossary. Section 2 contains the feature requirements for each of the above four categories on which subsystem evaluations are based. The requirements in this section are listed in increments, with only new or changed requirements being added for each subsequent class of the same feature. All requirements that are quoted from the TCSEC are in bold print for easy identification and are clarified, in the context of subsystems, by interpretation paragraphs. 
Section 3 contains the assurance requirements for all subsystems. The assurances that are relevant to each category are listed here in the same format as the requirements in Section 2. Section 4 contains the requirements and interpretations for subsystem documentation, again, in the same format as Section 2. 
The TCSEC-related feature and assurance requirements described herein are intended for the evaluation of computer security subsystems designed to protect sensitive information. This Interpretation, like the TCSEC, assumes that physical, administrative, and procedural protection measures adequate to protect the information being handled are already in place. 
This Interpretation can be used to support a certification evaluation. In fact, it would be helpful whenever subsystems are a part of the overall system being certified. 
1.4 EVALUATION OF SUBSYSTEMS
1.4.1 Basis for Evaluation
Subsystems are evaluated for the specific security-relevant functions they perform. This Interpretation interprets the relevant TCSEC requirements for each function evaluated. So the function(s) for which subsystems are evaluated will be identified within their ratings. Each function has its own set of ratings as identified in Table 1.1. Subsystems that are evaluated for more than one function will receive a separate rating for each function evaluated. 
TABLE 1.1. Possible Subsystem Ratings 
SUBSYSTEM FUNCTION                   POSSIBLE RATINGS                      

Discretionary Access Control         DAC/D, DAC/D1, DAC/D2, DAC/D3         

Object Reuse                         OR/D, OR/D2                           

Identification & Authentication      I&A/D, I&A/D1, I&A/D2                 

Audit                                AUD/D, AUD/D2, AUD/D3                 

Although the requirements for subsystems are derived from the TCSEC, the ratings for subsystems will not directly reflect the TCSEC class they are derived from. Since subsystems, by their very nature, do not meet all of the requirements for a class C1 or higher computer system, it is most appropriate to associate subsystem ratings with the D division of the TCSEC. This Interpretation defines the D1, D2 and D3 classes within the D division for subsystems. The D1 class is assigned to subsystems that meet the interpretations for requirements drawn from the C1 TCSEC class. Likewise, the D2 class consists of requirements and interpretations that are drawn from the C2 TCSEC class. The D3 subsystem class is reserved for DAC subsystems and audit subsystems that meet the B3 functionality requirements for those functions. 
In addition to meeting the functionality requirements and interpretations, subsystems must also meet the assurance and documentation requirements in sections 3 and 4 of this document. The D1 and D2 classes have requirements and interpretations for assurances and documentation as well as functionality. 
The D3 class contains additional requirements and interpretations only for functionality, not for assurances or documentation. So, subsystems with this rating will adhere to the D2 assurance and documentation requirements and interpretations. 
Like the classes within the TCSEC, the D1, D2 and D3 classes are ordered hierarchically. Subsystems being evaluated for the D1 class must meet the requirements and interpretations for the D1 class. Subsystems being evaluated for the D2 class must meet the requirements and interpretations for the D1 class plus the additional requirements and interpretations for the D2 class. Subsystems being evaluated for the D3 class must meet the additional requirements and interpretations associated with the functionality at D3. 
Although the subsystem requirements and interpretations are derived directly from the TCSEC, subsystems are not considered to be complete computer security solutions. There is no general algorithm to derive a system rating from an arbitrary collection of computer security subsystems. Any collection of individually evaluated subsystems must be evaluated as a whole to determine the rating of the resulting system. The ratings of the individual subsystems in a complete system are not a factor in the rating of that system. 
1.4.2 Integration Requirements
Because all of the TCSEC requirements for a given rating class were intended to be implemented in a complete computer security system, many of the security features are dependent upon each other for support within the system. This poses a certain degree of difficulty with extracting only the relevant requirements from the TCSEC for a given feature. Further, this poses a fundamental problem for subsystems because there is an explicit dependency between security features that restricts the "independent" incorporation of subsystems into the system's environment. The problem has been handled in this Interpretation by discussing the integration requirements for each type of subsystem. The requirements for integration are discussed for each type of subsystem in a sub-section entitled, "Role Within Complete Security System." Furthermore, explicit requirements for integration are stated in the interpretations at appropriate points. The developer must show, and the evaluation shall validate, that the subsystem can be integrated into a system to fulfill its designated role. 
Almost all computer security subsystems will rely on other security-relevant functions in the environment where they are implemented. Audit subsystems, for example, depend on an identification and authentication function to provide the unique user identities that are necessary for individual accountability. Also, it is important to realize that some of these functions may be dependent on each other in a cyclic fashion (e.g., I&A depends on DAC and DAC depends on I&A). In these cases, the cyclic dependencies should be removed either by complete integration of the functions or by modularizing the functions in a way that allows linear dependencies. This latter method is termed "sandwiching" and it requires the splitting of one function and surrounding the other dependent function with the two functions resulting from the split. For example, in the case of DAC and I&A cyclic dependencies, one might split I&A into two parts so that there is a system I&A, a DAC subsystem, and a DAC module containing its own I&A functionality. 
With the exception of object reuse, all functions implemented by subsystems will be dependent on other functions as shown in Table 1.2. The functions upon which any subsystem is dependent will be referred to as that subsystem's required supporting functions. These required supporting functions must be present in the subsystem's environment for the effective integration of the subsystem. 
TABLE 1.2. Required Supporting Functions 
SUBSYSTEM FUNCTION                   REQUIRED SUPPORTING FUNCTIONS         

Discretionary Access Control         I&A, Audit (1)                        

Object Reuse                         None                                  

Identification & Authentication      Audit (1), DAC (2)                    

Audit                                I&A, DAC (2)                          

(1) The audit supporting functions are required at D2. 
(2) Audit and/or authentication data must be protected through domain isolation or DAC. 

Subsystems that are not self-sufficient in providing required supporting functions must, at a minimum, provide an interface to their required supporting functions. The evaluation team will perform tests to show whether the interface to the required supporting functions is reliable and works properly. The robustness of the required supporting functions on the other side of the interface will not be tested, as the scope of the subsystem evaluation is bounded by the interface. 
A more integrated solution is for subsystems to be self-sufficient in providing all of their required supporting functions. Such subsystems will be evaluated and assigned a separate rating for each function they provide. Unlike the previous solution, where only an interface is provided, each required supporting function is performed by the subsystem and must be a part of the subsystem evaluation. 
1.4.3 WARNING
An overall system rating, such as that provided by the TCSEC, cannot be inferred from the application of one or more separately-rated subsystems. Mechanisms, interfaces, and the extent of required supporting functions for each subsystem may differ substantially and may introduce significant vulnerabilities that are not present in systems where security features are designed with full knowledge of interfaces and host system support. Therefore, incorporation of an evaluated subsystem into any system environment does not automatically confer any rating to the resulting system. 
2. FEATURE REQUIREMENTS
2.1 DISCRETIONARY ACCESS CONTROL (DAC) SUBSYSTEMS
2.1.1 Global Description of Subsystem Features
2.1.1.1 Purpose
This subsystem provides user-specified, controlled sharing of resources. 
This control is established from security policies which define, given identified subjects and objects, the set of rules that are used by the system to determine whether a given subject is authorized to gain access to a specific object. 
DAC features include the means for restricting access to objects; the means for instantiating authorizations for objects; and the mechanisms for distribution, review, and revocation of access privileges, especially during object creation and deletion. 
2.1.1.2 Role Within Complete Security System
The requirement is to give individual users the ability to restrict access to objects created or controlled by them. Thus, given identified subjects and objects, DAC includes the set of rules (group-oriented and/or individually-oriented) used by the subsystem to ensure that only specified users or groups of users may obtain access to data (e.g., based on a need-to-know). 
A DAC subsystem controls access to resources. As such, it shall be integrable with the operating system of the protected system and shall mediate all accesses to the protected resources. To fully protect itself and the resources it controls, the DAC subsystem must be interfaced to the protected system in such a way that it is tamperproof and always invoked. 
DAC subsystems use the identifiers of both subjects and DAC-controlled objects as a basis for access control decisions. Thus, they must be supplied with the identifiers in a reliable manner. The DAC subsystem may supply subject identification for itself or it may rely on an I&A mechanism in the protected system or in another subsystem. It is also essential that DAC subsystems be implemented in an environment where the objects it protects are well defined and uniquely identified. 
At the DAC/D2 class, the DAC subsystem must interface with an auditing mechanism. This auditing mechanism can be included within the DAC subsystem, or it may reside elsewhere in the subsystem's environment. 
2.1.2 Evaluation of DAC Subsystems
Subsystems which are designed to implement discretionary access controls to assist a host in controlling the sharing of a collection of objects must comply with all of the TCSEC requirements as outlined below for features, assurances and documentation. Compliance with these requirements will assure that the subsystem can enforce a specifically defined group-oriented and/or individually-oriented discretionary access control policy. 
As a part of the evaluation, the subsystem vendor shall set up the subsystem in a typical functional configuration for security testing. This will show that the subsystem interfaces correctly with the protected system to meet all of the feature requirements in this section and all of the assurance and documentation requirements in Sections 3 and 4. It will also show that the subsystem can be integrated into a larger system environment. 
The interpretations for applying the feature requirements to DAC subsystems are explained in the subsequent interpretations sections. The application of the assurances requirements and documentation requirements is explained in Sections 3 and 4, respectively. 
2.1.3 Feature Requirements For DAC Subsystems
2.1.3.1 DAC/D1
TCSEC Quote: 
"Cl: New: The TCB shall define and control access between named users and named objects (e.g., files and programs) in the ADP system. The enforcement mechanism (e.g., self/group/public controls, access control lists) shall allow users to special and control sharing of those objects by named indinduals or defined groups or both." 
Interpretation: 
In the TCSEC quote, "TCB" is interpreted to mean "DAC subsystem". 
2.1.3.1.1 Identified users and objects
DAC subsystems must use some mechanism to determine whether users are authorized for each access attempted. At DAC/D1, this mechanism must control access by groups of users. The mechanisms that can meet this requirement include, but are not limited to: access control lists, capabilities, descriptors, user profiles, and protection bits. The DAC mechanism uses the identification of subjects and objects to perform access control decisions. This implies that the DAC subsystem must interface with or provide some I&A mechanism. The evaluation shall show that user identities are available to DAC. 
2.1.3.1.2 User-specified object sharing
The DAC subsystem must provide the capability for users to specify how other users or groups may access the objects they control. This requires that the user have a means to specify the set of authorizations (e.g., access control list) of all users or groups permitted to access an object and/or the set of all objects accessible to a user or group (e.g., capabilities). 
2.1.3.1.3 Mediation
The checking of the specified authorizations of a user prior to granting access to an object is the essential function of DAC which must be provided. Mediation either allows or disallows the access. 
2.1.3.2 DAC/D2
TCSEC Quote: 
"C2: Change: The enforcement mechanism (e.g. self/group/public controls, access control lists) shall allow users to specify and control sharing of those objects by named individuals, or defined groups of individuals, or by both, and shan provide controls to limit propagation of access rights." 
"C2: Add: The discretionary access control mechamsm shan, either by explicit user action or by default, provide that objects are protected from unauthorized access. These access controls s~ll be capable of including or excluding access to the granularity of a single wer. Access permission to an object by users not already possessing access pernlission shan only be assigned by authorized users." 
Interpretation: 
The following interpretations, in addition to the interpretations for the DAC/D1 Class, shall be satisfied at the DAC/D2 Class. 
2.1.3.2.1 DAC/D2
The DAC/D2 class requires individual access controls; therefore, the granularity of user identification must enable the capability to discern an individual user. That is, access control based upon group identity alone is insufficient. To comply with the requirement, the DAC subsystem must either provide unique user identities through its own I&A mechanism or interface with an I&A mechanism that provides unique user identities. The DAC subsystem must be able to interface to an auditing mechanism that records data about access mediation events. The evaluation shall show that audit data is created and is available to the auditing mechanism. 
2.1.3.2.2 Authorized user-specified object sharing
The ability to propagate access rights to objects must be limited to authorized users. This additional feature is incorporated to limit access rights propagation. This distribution of privileges encompasses granting, reviewing, and revoking of access. The ability to grant the right to grant propagation of access will itself be limited to authorized users. 
2.1.3.2.3 Default protection
The DAC mechanism must deny all users access to objects when no explicit action has been taken by the authorized user to allow access. 
2.1.3.3 DAC/D3
· TCSEC Quote: 
"B3: Change: The enforcement mechanism (e.g., access control lists) shall allow users to specify and control sharing of those objects, and shall provide controls to limit propagation of access rights. These access controls shall be capable of specifying, for each named object, a list of named individuals and a list of groups of named individuals with their respective modes of access to that object." 
"Add: Furtherrnore, for each such named object, it shall be possible to specify a list of named individuals and a list of groups of named individuals for which no access to the object is to be given." 
· Interpretation: 
The following interpretation, in addition to the interpretations and requirements for the DAC/D2 class, shall be satisfied for the DAC/D3 class. 
2.1.3.3.1 Access control lists for each object
The DAC subsystem shall allow users to specify the list of individuals or groups of individuals who can access each object. The list shall additionally specify the mode(s) of access allowed to each user or group. This implies that access control lists associated with each object are the only acceptable mechanism to satisfy the DAC/D3 requirement. 
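A minimal sketch of an access control list of this form follows; the object name, user and group names, and access modes are hypothetical, and the default-deny behavior reflects the default protection requirement at DAC/D2 above.

    # Hypothetical per-object access control list with both grant and deny
    # entries; object, user, group, and mode names are illustrative only.
    ACL = {
        "project_plan.doc": {
            "allow": {"jdoe": {"read", "write"}, "analysts": {"read"}},
            "deny":  {"contractors"},     # named group excluded from all access
        },
    }

    def dac_check(user: str, groups: set, obj: str, mode: str) -> bool:
        """Mediate an access request against the object's ACL."""
        entry = ACL.get(obj)
        if entry is None:
            return False                  # no entry means no access (default deny)
        subjects = {user} | groups
        if subjects & entry["deny"]:
            return False                  # exclusion overrides any grant
        return any(mode in entry["allow"].get(s, set()) for s in subjects)

    # Example: an analyst may read the document but may not write it.
    # dac_check("asmith", {"analysts"}, "project_plan.doc", "read")   -> True
    # dac_check("asmith", {"analysts"}, "project_plan.doc", "write")  -> False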
2.1.4 Assurance Requirements for DAC Subsystems
DAC subsystems must comply with all of the assurance requirements for their given class as indicated below. The interpretations for these assurance requirements are contained in Section 3. 
Subsystems at the DAC/D1 class must comply with: 
· System Architecture (D1) 
· System Integrity (D1) 
· Security Testing (D1) 
Subsystems at the DAC/D2 and DAC/D3 classes must comply with: 
· System Architecture (D2) 
· System Integrity (D2) 
· Security Testing (D2) 
2.1.5 Documentation Requirements for DAC Subsystems
DAC subsystems must meet the documentation requirements listed below for their target rating class. The interpretations for these documentation requirements are contained in Section 4. 
Subsystems at the DAC/D1 class must comply with: 
· Security Features User's Guide (D1) 
· Trusted Facility Manual (D1) 
· Test Documentation (D1) 
· Design Documentation (D1) 
Subsystems at the DAC/D2 and DAC/D3 classes must comply with: 
· Security Features User's Guide (D2) 
· Trusted Facility Manual (D2) 
· Test Documentation (D2) 
· Design Documentation (D2) 
2.2 OBJECT REUSE SUBSYSTEMS
2.2.1 Global Description of Subsystem Features
2.2.1.1 Purpose
Object reuse subsystems clear storage objects to prevent subjects from scavenging data from storage objects which have been previously used. 
2.2.1.2 Role Within the Complete Security System
Object reuse can be used to prevent information scavenging by erasing information residue contained in previously used storage objects that have been released by the storage management system. Object reuse subsystems are most effective in environments where some security policy is implemented on the system. 
To prevent scavenging of information from previously used storage objects, object reuse subsystems must be fully integrable with the operating system of the protected system. The object reuse subsystem must perform its function for all reusable storage objects on the protected system (i.e., main memory, disk storage, tape storage, I/O buffers, etc.). 
Object reuse subsystems must be interfaced with the protected system in such a way that they are tamperproof and always invoked. 
2.2.2 Evaluation of Object Reuse Subsystems
Subsystems which implement object reuse must comply with all of the TCSEC requirements as outlined below for features, assurances, and documentation. Compliance with these requirements will show that the subsystem can enforce object reuse adequately to receive an OR/D2 rating for object reuse. 
As a part of the evaluation, the subsystem vendor shall set up the subsystem in a typical functional configuration for security testing. This will show that the subsystem interfaces correctly with the protected system to meet all of the feature requirements in this section and all of the assurance and documentation requirements in Sections 3 and 4. It will also show that the subsystem can be integrated into a larger system environment. 
The interpretations for applying the feature requirements of object reuse subsystems are explained in the subsequent interpretations section. The application of the assurance and documentation requirements listed below is explained in Sections 3 and 4, respectively. 
2.2.3 Feature Requirements for Object Reuse Subsystems
2.2.3.1 OR/D2
TCSEC Quote: 
"C2: New: all authorizations to the information contained within a storage object shall be revoked prior to initial assignment, allocation or reallocation to a subject from the TCB's pool of unused storage objects. No information, including encrypted representations of information, produced by a prior subject's actions is to be available to any subject that obtains access to an object that has been released back to the system." 
Interpretation: 
In the TCSEC quote, "TCB" is interpreted to mean "protected system". Otherwise, this requirement applies as stated. The object reuse subsystem shall perform its function for all storage objects on the protected system that are accessible to users. 
Rationale/Discussion: 
Object reuse subsystems must assure that no previously used storage objects (e.g., message buffers, page frames, disk sectors, magnetic tape, memory registers, etc.) can be used to scavenge residual information. Information remaining in previously used storage objects can be destroyed by overwriting it with meaningless or unintelligible bit patterns. An alternative way of approaching the problem is to deny read access to previously used storage objects until the user who has just acquired them has overwritten them with his own data. 
Object reuse subsystems do not equate to systems used to eliminate magnetic remanence. 
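The sketch below illustrates the overwriting approach on a simple in-memory buffer; it is an analogy only, since a real object reuse subsystem operates on system-managed storage objects such as page frames, disk sectors, or I/O buffers.

    # Illustrative only: clearing a reusable storage object (here, a byte
    # buffer) before it is assigned to a new subject.
    def clear_storage_object(buffer: bytearray) -> None:
        """Overwrite residual information with a meaningless bit pattern."""
        buffer[:] = b"\x00" * len(buffer)

    buf = bytearray(b"previous subject's data")
    clear_storage_object(buf)             # prior contents are overwritten
    assert not any(buf)                   # every byte has been zeroed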
2.2.4 Assurance Requirements for Object Reuse Subsystems
Object reuse subsystems must comply with all of the assurance requirements shown below for the D2 class. The interpretations for these assurance requirements for Object Reuse subsystems are contained in Section 3. 
· System Architecture (D2) 
· System Integrity (D2) 
· Security Testing (D2) 
2.2.5 Documentation Requirements for Object Reuse Subsystems
Object reuse subsystems must meet the documentation requirements shown below for the D2 class. The interpretations for these documentation requirements are contained in Section 4. 
· Security Features User's Guide (D2) 
· Trusted Facility Manual (D2) 
· Test Documentation (D2) 
· Design Documentation (D2) 
2.3 IDENTIFICATION & AUTHENTICATION (I&A) SUBSYSTEMS 
2.3.1 Global Description of Subsystem Features
2.3.1.1 Purpose
This subsystem provides the authenticated identification of a user seeking to gain access to any resources under the control of the protected system. 
2.3.1.2 Role Within Complete Security System
The I&A subsystem provides an authenticated user identification needed to provide accountability for and control access to the protected system. The granularity of user identification is determined by the requirements in this interpretation. The granularity increases from group identification at I&A/D1 to individual identification at I&A/D2. 
The requirement is to be able to accurately authenticate the claimed identity of a user. The I&A subsystem must determine whether a user is authorized to use the protected system. For all authorized users, the I&A subsystem communicates the identity of the user to the protected system. This identity can then be used by the protected system or other subsystems to provide accountability for use of the system and access controls to protected objects on the system. To be effective and to protect the authentication data it uses, the I&A subsystem must be tamperproof and always invoked. 
At I&A/D2, it is important that all uses of the I&A subsystem be recorded in an audit trail. The auditing of these actions may be performed entirely by the auditing mechanism on the I&A subsystem, or through an interface with an auditing mechanism in the protected system or another subsystem. 
2.3.2 Evaluation of I&A Subsystems
Subsystems which are designed to implement I&A must comply with all of the TCSEC requirements outlined below for features, assurances, and documentation. Compliance with these requirements will assure that the subsystem can enforce, either wholly or in part, a specific I&A policy. As a part of the evaluation, the subsystem vendor shall set up the subsystem in a typical functional configuration for security testing. This will show that the subsystem interfaces correctly with the protected system to meet all of the feature requirements in this section and all of the assurance and documentation requirements in Sections 3 and 4. It will also show that the subsystem can be integrated into a larger system environment. 
The interpretations for applying the feature requirements to I&A subsystems are explained in the subsequent interpretations sections. The application of the assurance requirements and documentation requirements listed in the next section is explained in Sections 3 and 4, respectively. 
2.3.3 Feature Requirements for I&A Subsystems
2.3.3.1 I&A/D1
TCSEC Quote: 
"Cl: New: The TCB shall require users to identify themselves to it before beginning to perform any other actions that the TCB is expected to mediate. Furthermore, the - TCB shall use a protected mechanism (e.g., passwords) to authenticate the user's identity. The TCB shall protect authentication data so that it cannot be accessed by any unauthorized user." 
Interpretation: 
The I&A subsystem shall require users to identify themselves to it before beginning to perform any other actions that the system is expected to mediate. Furthermore, the I&A subsystem shall use a protected mechanism (e.g., passwords) to authenticate the user's identity. The I&A subsystem shall protect authentication data so that it cannot be accessed by any unauthorized user. 
The I&A subsystem shall, at a minimum, identify and authenticate system users. At I&A/D1, users need not be individually identified. 
Rationale/Discussion: 
Identification and Authentication must be based on at least a two-step process, which is derived from a combination of something the user possesses (e.g., smart card, magnetic stripe card), some physical attribute about the user (e.g., fingerprint, voiceprint), or something the user knows (e.g., password, passphrase). The claimed identification of a user must be authenticated by an explicit action of the user. It is not acceptable for one step to be used as both identification and authentication. The claimed identity can be public. The measure used for authentication must be resistant to forging, guessing, and fabricating. 
The I&A subsystem must interface to the protected system in such a way that it can reliably pass authenticated user identities to the protected system. The evaluation shall show that authenticated user identities can be passed to the protected system. 
2.3.3.2 I&A/D2
TCSEC Quote: 
"C2: Add: The TCB shan be able to enforce individual accountability by providing the capability to uniqueb identify each individual ADP system user. The TCB shall also ; provide the capabmty of associa~ ~is identity ~nth an auditable actiol~ taken by ; that indindual." 
Interpretation: 
The following interpretations, in addition to those interpretations for I&A/D1, shall be satisfied at the I&A/D2 Class. 
In the TCSEC quote, "TCB" is interpreted to mean "I&A subsystem." The I&A subsystem shall pass to the protected system a unique identifier for each individual. 
The I&A subsystem shall be able to uniquely identify each individual user. This includes the ability to identify individual members within an authorized user group and the ability to identify specific system users such as operators, system administrators, etc. 
The I&A subsystem shall provide for the audit logging of security-relevant I&A events. For I&A, the origin of the request (e.g., terminal ID, etc.), the date and time of the event, user ID (to the extent recorded), type of event, and the success or failure of the event shall be recorded. The I&A subsystem may meet this requirement either through its own auditing mechanism or by providing an interface for passing the necessary data to another auditing mechanism. 
Rationale/Discussion: 
The intent of this requirement is for the I&A subsystem to supply a unique identity for each user to the protected system. The subsystem supplies a unique user identity which may or may not be used by an auditing mechanism. This auditing support is required to maintain consistency with the C2 level of trust as defined by the TCSEC. 
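A sketch of an I&A audit record carrying the fields listed in the audit logging interpretation above follows; the field names and record layout are assumptions made for illustration only.

    from datetime import datetime, timezone

    def ia_audit_record(origin: str, user_id: str, event: str,
                        success: bool) -> dict:
        """Build an I&A audit entry carrying the fields required at I&A/D2.
        Field names and the dictionary layout are illustrative assumptions."""
        return {
            "origin": origin,          # e.g., terminal ID
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user_id,           # to the extent recorded
            "event": event,            # type of event, e.g., "login"
            "success": success,
        }

    # Example: a failed login attempt from (hypothetical) terminal "TTY07".
    record = ia_audit_record("TTY07", "jdoe", "login", success=False)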
2.3.4 Assurance Requirements for I&A Subsystems
I&A subsystems must comply with all of the assurance requirements listed below for their given class. The interpretations for these assurance requirements to I&A subsystems are contained in Section 3. 
Subsystems at the I&A/D1 class shall comply with: 
· System Architecture (D1) 
· System Integrity (D1) 
· Security Testing (D1) 
Subsystems at the I&A/D2 class shall comply with: 
· System Architecture (D2) 
· System Integrity (D2) 
· Security Testing (D2) 
2.3.5 Documentation Requirements for I&A Subsystems
I&A subsystems must meet the documentation requirements listed below for their target rating class. The interpretations for these documentation requirements are contained in Section 4. 
Subsystems at the I&A/D1 class shall comply with: 
· Security Features User's Guide (D1) 
· Trusted Facility Manual (D1) 
· Test Documentation (D1) 
· Design Documentation (D1) 
Subsystems at the I&A/D2 class shall comply with: 
· Security Features User's Guide (D2) 
· Trusted Facility Manual (D2) 
· Test Documentation (D2) 
· Design Documentation (D2) 
2.4 AUDIT SUBSYSTEMS
2.4.1 Global Description of Subsystem Features
2.4.1.1 Purpose
Accountability is partly achieved through auditing. That is, data from security-relevant events is captured and passed to the audit mechanism to be recorded for use in detecting possible security breaches and providing a trace to the party responsible. 
2.4.1.2 Role Within Complete Security System
The requirement is to be able to record security-relevant events in a manner that will allow detection and/or after-the-fact investigations to trace security violations to the responsible party. 
An auditing subsystem must be capable of recording all security-relevant actions that take place throughout the computer system. To accomplish this goal, it must integrate itself into the mechanisms that mediate access and perform user identification and authentication, and capture data about the events they control. Additionally, an audit subsystem must be interfaced with the protected system in such a way that it is tamperproof and always invoked. 
The auditing subsystem must be provided all of the necessary data associated with actions as specified in Section 2.4.3. The necessary data includes the unique identity of the user that is responsible for each action. This implies that an auditing subsystem must be augmented by an identification and authentication mechanism either within the subsystem itself or elsewhere on the system. 
2.4.2 Evaluation of Auditing Subsystems
Subsystems which are designed to implement audit data collection and control functions for a host must comply with all of the TCSEC requirements as outlined below for features, assurances and documentation. Compliance with these requirements will assure that the subsystem, through its integration, can detect or generate the relevant audit data or can record all relevant audit data passed to it by the host or other subsystems. 
As a part of the evaluation, the subsystem vendor shall set up the subsystem in a typical functional configuration for security testing. This will show that the subsystem interfaces correctly with the protected system to meet all of the feature requirements in this section and all of the assurance and documentation requirements in Sections 3 and 4. It will also show that the subsystem can be integrated into a larger system environment. 
The interpretations for applying the feature requirements to auditing subsystems are explained in the subsequent interpretations sections. The application of the assurance requirements and documentation requirements is explained in Sections 3 and 4, respectively. 
2.4.3 Feature Requirements For Auditing Subsystems
2.4.3.1 AUD/D2
TCSEC Quote: 
"C2: New: The TCB shan be able to create, maintain, and protect from modification or unauthorized access or destruction an audit trail of accesses to the objects it protects. The audit data shan be protected by the TCB so that read access to it is limited to those who are authorized for audit data. The TCB shall be able to record the following types of events: use of identification and authentication mechanisms introduction of objects into a user's address space (e.g., file open, program ~. initiation), deletion of objects, actions taken by computer operators and system administrators and/or system security officers, and other security relevant events. For each recorded event, the audit record shall identify: date and time of the event, ~ user, type of event, and success or failure of the event. For identincation/authentication events the origin of request (e.g., terminal ID) shan be - included in the audit record. For events that introduce an object into a user's address space and for object deletion events the audit record shall include the name of the object. rne ADP system administrator shall be able to selectively audit the actions of any one or more users based on individual identity." 
Interpretations: 
The following subsections provide interpretations of the TCSEC requirements which shall be satisfied by auditing subsystems at AUD/D2. 
2.4.3.1.1 Creation and management of audit trail
The auditing subsystem shall create and manage the audit trail of security-relevant events in the system. If the other portions of the system are unable to capture data about such events, the auditing subsystem shall contain the necessary interfaces into the system to perform this function. Alternatively, the auditing subsystem might simply accept and store data about events if the other portions of the system are capable of creating such data and passing them on. 
Rationale/Discussion: 
To meet this requirement, it is sufficient that the audit subsystem provides a set of calls which permit the system to supply the needed data as parameters that the audit subsystem puts into a data structure and routes to audit storage (or transmits securely to an audit logger). 
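Such a set of calls might look something like the following sketch; the function signature, record layout, and audit storage path are assumptions, and the storage itself would still have to be protected as described in the next subsections.

    import json
    import time
    from typing import Optional

    AUDIT_LOG = "/var/audit/subsystem.log"   # assumed protected storage location

    def audit(event_type: str, user_id: str, obj: Optional[str],
              success: bool, origin: Optional[str] = None) -> None:
        """Accept event data supplied by the protected system as parameters,
        place it in a fixed record structure, and route it to audit storage."""
        record = {
            "time": time.time(),
            "event": event_type,            # e.g., "object_open", "login"
            "user": user_id,                # unique identity from the I&A mechanism
            "object": obj,                  # object name, where applicable
            "success": success,
            "origin": origin,               # e.g., terminal ID for I&A events
        }
        with open(AUDIT_LOG, "a") as log:   # the log itself must be access-controlled
            log.write(json.dumps(record) + "\n")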
2.4.3.1.2 Protection of audit data
It shall be demonstrated that the audit data is protected from unauthorized modification. This protection will be provided either by the subsystem itself or by its integration with the protected system. 
Rationale/Discussion: 
The auditing subsystem might store the audit data in a dedicated data storage area that cannot be accessed by any subject on the system except the auditing subsystem itself and the system security officer (or system administrator) through the auditing subsystem. Or, if the protected system has adequate access control facilities, the audit data might be stored on the protected system, using its access control mechanisms for protection. 
2.4.3.1.3 Access control to audit
The audit mechanism, auditing parameters, and the audit data storage media shall be protected to ensure access is allowed only to authorized individuals. Individuals who are authorized to access the audit data shall be able to gain access only through the auditing subsystem. 
Rationale/Discussion: 
This interpretation assumes that discretionary access controls or physical controls will be in place to keep unauthorized individuals from gaining access to the audit data. 
2.4.3.1.4 Specific types of events
Data about all security relevant events must be recorded. The other portions of the system shall be able to pass data concerning these events to the auditing subsystem, or the auditing subsystem shall have the necessary code integrated into the other portions of the system to pass the data to the collection point. 
2.4.3.1.5 Specific information per event
All of the specific information enumerated in the TCSEC quote shall be captured for each recorded event. Of particular concern is the recording of the user identity with each recorded event. 
Rationale/Discussion: 
This implies that the audit subsystem must be able to acquire user identities from an I&A mechanism, which may be provided on the audit subsystem itself, on the protected system, or in a separate I&A subsystem. Whichever is the case, the evaluation shall show that the audit subsystem has a working interface to an I&A mechanism. 
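Purely for illustration, and assuming a hypothetical session table exposed by the I&A mechanism, such an interface might reduce to a lookup of the identity bound to a session:

    # Hypothetical binding maintained by (or queried from) the I&A mechanism.
    IA_SESSIONS = {"tty07": "jsmith"}

    def user_for_session(session_id):
        """Return the authenticated identity the I&A mechanism bound to a session."""
        return IA_SESSIONS.get(session_id, "UNKNOWN")

    # The audit subsystem stamps each record with the identity obtained here,
    # rather than trusting an identity claimed by the untrusted caller.
    print(user_for_session("tty07"))     # prints: jsmith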
2.4.3.1.6 Ability to selectively audit individuals
The auditing subsystem shall have the ability to perform selection of audit data based on individual users. 
Rationale/Discussion: 
This requirement can be satisfied by pre-selection of the information to be recorded in the audit log (selective logging) and/or by post-selection of information to be extracted from the audit log (selective reduction). The reduction of the audit log must be able to show all of the security-relevant actions performed by any specified individual. The intent of selective logging is to reduce the volume of audit data to be recorded by only recording audit data for those specific individuals that the system security officer (or system administrator) specifies. The intent of selective reduction is to reduce the large volume of audit data into a collection of intelligible information which can be more efficiently used by the system administrator. 
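As a sketch only, and assuming the hypothetical one-record-per-line format used in the earlier example, selective reduction for a single individual could be as simple as the following; a production tool would add date ranges, event-type filters, and report formatting.

    import json

    def actions_by_user(audit_log_path, user):
        """Selective reduction: return every recorded event attributed to one individual."""
        selected = []
        with open(audit_log_path) as log:
            for line in log:
                record = json.loads(line)
                if record.get("user") == user:
                    selected.append(record)
        return selected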
2.4.3.2 AUD/D3
· TCSEC Quote: 
"B3: Add: The TCB shal~ contain a mechanism that is able to monitor the occurrence or accumulation of security auditable events that may indicate an imminent violation of security policy. This mechanism shall be able to immediately notify the security administrator when thresholds are exceeded and, if the occurrence or accumulation of these securib relevant events continues, the system shall take the least disruptive action to terminate the event." 
· Interpretation: The following interpretation, in addition to the interpretation and requirement for AUD/D2, shall be satisfied for the AUD/D3 class. 
2.4.3.2.1 Real-time alarms
The auditing subsystem shall provide the capability for the security administrator to set thresholds for certain auditable events. Furthermore, when the thresholds are exceeded, the audit subsystem shall immediately notify the security administrator of an imminent security violation. 
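The following Python sketch is only one conceivable shape for such a mechanism; the threshold values, event names, and notification hook are hypothetical, and a real subsystem would deliver the alert over whatever channel the security administrator actually monitors.

    from collections import defaultdict

    THRESHOLDS = {"login_failure": 5}       # thresholds set by the security administrator
    counts = defaultdict(int)

    def notify_security_administrator(message):
        print("SECURITY ALERT:", message)   # stand-in for a real alert channel

    def record_event(user, event_type):
        """Count security-relevant events and raise an alarm when a threshold is reached."""
        counts[(user, event_type)] += 1
        limit = THRESHOLDS.get(event_type)
        if limit is not None and counts[(user, event_type)] >= limit:
            notify_security_administrator(
                "%s: %d %s events -- possible imminent security violation"
                % (user, counts[(user, event_type)], event_type))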
2.4.4 Assurance Requirements for Auditing Subsystems
Audit subsystems, whether being evaluated at AUD/D2 or AUD/D3, must comply with the assurance requirements listed below for the D2 class. The interpretations for these assurance requirements are contained in Section 3. 
· System Architecture (D2) 
· System Integrity (D2) 
· Security Testing (D2) 
2.4.5 Documentation Requirements for Auditing Subsystems
Audit subsystems, whether being evaluated at AUD/D2 or AUD/D3, must meet the documentation requirements listed below for the D2 class. The interpretations for these documentation requirements are contained in Section 4. 
· Security Features User's Guide (D2) 
· Trusted Facility Manual (D2) 
· Test Documentation (D2) 
· Design Documentation (D2) 
3. ASSURANCE REQUIREMENTS
Rated subsystems must provide correct and accurate operations. Assurance must be provided that correct implementation and operation of the subsystem's function exist throughout the subsystem's life cycle. The objective in applying these assurance requirements is to develop confidence that the subsystem has been implemented correctly and that it is protected from tampering and circumvention. 
The requirement is that the subsystem must contain hardware/software mechanisms that can be independently evaluated through a combination of inspection and testing to provide sufficient assurance that the subsystem features enforce or support the functions for which the subsystem is intended. To receive a rating, a subsystem must meet the assurance requirements at the same level of trust as it has met the requirements for functionality. The assurances must be applied to the different types of subsystems as described in the previous sections. 
3.1 SUBSYSTEM ARCHITECTURE
Subsystem architecture evaluation is designed to provide operational assurances with regard to the design and implementation of the protection mechanisms of the subsystem and its interfaces to the host/host TCB. 
3.1.1 Arch:D1
TCSEC Quote: 
"Cl: New: The TCB shall maintain a domain for its own execution that protects it from external interference or tampering (e.g., by modification of its code or data structures). Resources controned by the TCB may be a defined subset of the subjects and objects in the ADP system." 
Interpretation: 
This requirement applies to all subsystems evaluated at all classes, regardless of the function(s) they perform. There are two specific elements of this requirement: Execution Domain Protection and Defined Subsets. 
3.1.1.1 Execution Domain Protection
Protection of the subsystem's mechanism and data from external interference or tampering must be provided. The code and data of the subsystem may be protected through physical protection (e.g., by the subsystem's dedicated hardware base) or by logical isolation (e.g., using the protected system's domain mechanism). 
Rationale and Discussion: 
The subsystem may be contained entirely on its own hardware base which must protect the operational elements of the mechanisms. Alternatively, all or a portion of the subsystem may be implemented on the hardware of the host, in which case the host system's architecture must protect this portion from external interference or tampering. 
3.1.1.2 Defined Subsets
I&A subsystems, when used for the system's I&A, define the subset of subjects under the control of the system's TCB. DAC subsystems may protect a subset of the total collection of objects on the protected system. 
3.1.2 Arch:D2
TCSEC Quotes: 
"C2: Add: The TCB shall isolate the resources to be protected so that they are subject to the access control and auditing requirements." 
Interpretation: 
In the TCSEC quote, "TCB" is interpreted to mean "subsystem". 
This requirement applies to all subsystems evaluated at the D2 class or the D3 class. The following interpretations explain how this requirement applies to specific functions performed by subsystems. 
· Interpretation for DAC Subsystems: 
All named objects which are in the defined subset of protected objects shall be isolated such that the DAC subsystem mediates all access to those objects. 
· Interpretation for Auditing Subsystems: 
The system's architecture shall ensure that the auditing mechanism cannot be bypassed by any subjects accessing those objects under the system's control. 
· Interpretation for Object Reuse Subsystems 
The notion of subsetting objects is not applicable to object reuse subsystems. Object reuse subsystems shall perform their function for all storage objects on the protected system that are accessible to users. 
· Interpretation for I&A Subsystems: 
This requirement applies to I&A subsystems. Authentication data shall be protected from unauthorized access. Access to the authentication data shall also be recorded in the audit trail. 
3.2 SUBSYSTEM INTEGRITY
Subsystem integrity evaluation is designed to provide operational assurances with regard to the correct operation of the protection mechanisms of the subsystem and its interfaces to the protected system. 
3.2.1 Integrity:D1
TCSEC Quote: 
"C1: New: Hardware and/or software features shall be provided that can be used to periodically validate the correct operation of the on-site hardware and firmware elements of the TCB." 
Interpretation: 
In the TCSEC quote, "TCB" is interpreted to mean "subsystem". 
This requirement applies to all subsystems evaluated at any class, regardless of the functions they perform. 
Rationale/Discussion: 
The capability must exist to validate the correct operation of all hardware and firmware elements of the system regardless of whether they reside within the subsystem, the protected system, or other interfacing subsystems. If the hardware and/or firmware elements of the protected system or other interfacing subsystems play an integral role in the protection and/or correct operation of the subsystem, then they must comply with this requirement as though they were part of the subsystem. 
3.2.2 Integrity:D2
There are no additional requirements for System Integrity at D2. 
3.3 SECURITY TESTING
Testing, as part of the evaluation, is designed to provide life cycle assurances with regard to the integrity of the subsystem. Further, testing provides additional assurances regarding the correct operation of the protection mechanisms of the subsystem and the subsystem's interfaces to the protected system. These mechanisms and their interfaces to the protected system are termed the subsystem's Security-Relevant Portion (SRP). 
3.3.1 Test:D1
TCSEC Quote: 
"Cl: New: The securib mechanisms of the ADP system shan be tested and found to work as claimed in the system documentation. Testing shan be done to assure that there are no ob~ious ways for an unauthorized wer to bypass or otherwise defeat the security protection mechanisms of the TCB. (See the Security Testing Guidelines.) " 
Interpretation 
This requirement applies to all subsystems evaluated at any class, regardless of the function(s) they perform. In the TCSEC quote, "TCB" is interpreted to mean "subsystem". 
The subsystem's SRP shall be tested and found to work as claimed in the subsystem's documentation. The addition of a subsystem to a protected system shall not introduce obvious flaws into the resulting system. 
Test results shall show that there are no obvious ways for an unauthorized user to bypass or otherwise defeat the subsystem's SRP. 
Rationale/Discussion: 
Security testing is a very important part of subsystem evaluations. It is essential that the subsystem be demonstrated to operate securely. 
3.3.2 Test:D2
TCSEC Quote: 
"C2: Add: Testing shan also include a search for obvious flaws that would anow nolation of resource isolation, or that would permit unauthorized access to the audit or authentication data." 
Interpretation: 
This requirement applies to the testing of the SRP of any subsystem evaluated at the D2 class or the D3 class. 
Rationale/Discussion 
The requirement as written in the TCSEC quote is directly applicable. This requirement is to ensure that subsystems at D2 cannot be circumvented or tampered with. 
4. DOCUMENTATION REQUIREMENTS
Documentation shall produce evidence that the subsystem can and does provide specified security features. The evaluation will focus on the completeness of this evidence through inspection of documentation structure and content and through a mapping of the documentation to the subsystem's implementation and its operation. 
4.1 SECURITY FEATURES USER'S GUIDE
4.1.1 SFUG:D1
TCSEC Quote: 
"Cl: New: A single summaIy, chapter, or manual in user documentation shall describe the protection mechanisms provided by the TCB, guidelines on their use, and how they interact with one another." 
Interpretation: 
All subsystems shall meet this requirement in that they shall describe the protection mechanisms provided by the subsystem. 
Rationale/Discussion: 
It is recognized that some subsystems may be partially or completely transparent to the general user. In such cases, this requirement can be met by documenting the functions the subsystem performs so users will be aware of what the subsystem does. Other subsystems which have a very limited user interface may not need to be accompanied by more than a pocket-size card available to every user. In short, the documentation required to meet this requirement need not be elaborate, but must be clear and comprehensive. 
4.1.2 SFUG:D2
Interpretation: 
There are no additional requirements at the D2 class. 
4.2 TRUSTED FACILITY MANUAL
4.2.1 TFM:D1
TCSEC Quote: 
"C1: New: A manual addressed to the ADP system administrator shall present cautions about functions and privileges that should be controlled when running a secure facility." 
Interpretation: 
This requirement applies to all subsystems in that the manual shall present cautions about functions and privileges provided by the subsystem. Further, this manual shall present specific and precise direction for effectively integrating the subsystem into the overall system. 
4.2.2 TFM:D2
TCSEC Quote: 
"C2: Add: The procedures for examining and maintaMing the audit files as well as the detailed audit record structure for each type of audit event shall be given." 
Interpretation: 
This requirement applies directly to all auditing subsystems and to other subsystems that maintain their own audit data concerning events that happen under their control. For subsystems that create audit data and pass it to an external auditing collection and maintenance facility, the audit record structure shall be documented; however, the procedures for examination and maintenance of audit files may be left to the external auditing facility. 
4.3 TEST DOCUMENTATION
4.3.1 TD:D1
TCSEC Quote: 
"Cl: New: The system developer shall provide to the evaluators a document that describes the test plan, test procedures that show how the securib mechanisms were tested, and results of the security mechanisms' functional testing." 
Interpretation: 
The document shall explain the exact configuration used for security testing. All mechanisms supplying the required supporting functions shall be identified. All interfaces between the subsystem being tested, the protected system, and other subsystems shall be described. 
4.3.2 TD:D2
Interpretation 
There are no additional requirements at the D2 class. 
4.4 DESIGN DOCUMENTATION
4.4.1 DD:D1
TCSEC Quote: 
"Cl: New: Documentation shall be available that provides a description of the manufacturer's philosophy of protection and an explanation of how this philosophy is translated into the TCB. If the TCB is composed of distinct modules, the interfaces between these modules shall be described. " 
Interpretation: 
This requirement applies directly to all subsystems. Specifically, the design documentation shall state what types of threats the subsystem is designed to protect against (e.g., casual browsing, determined attacks, accidents). This documentation shall show how the protection philosophy is translated into the subsystem's SRP. Design documentation shall also specify how the subsystem is to interact with the protected system and other subsystems to provide a complete computer security system. If the SRP is modularized, the interfaces between these modules shall be described. 
4.4.2 DD:D2
There are no additional requirements for Design Documentation at the D2 class. 
5. GLOSSARY
Accreditation - The official authorization that is granted to an ADP system to process sensitive information in its operational environment, based upon comprehensive security evaluation of the system's hardware, firmware, and software security design, configuration and implementation, and of the other system procedural, administrative, physical, TEMPEST, personnel, and communications controls. 
Audit - The procedure of capturing, storing, maintaining, and managing data concerning security-relevant events that occur on a computer system. The data recorded are intended for use in detecting security violations and tracing those violations to the responsible individual. 
Audit trail - A set of records that collectively provide documentary evidence of processing used to aid in tracing from original transactions forward to related records and reports, and/or backwards from records and reports to their component source transactions. 
Authenticate - To establish the validity of a claimed identity. 
Authorization - Permission which establishes right to access information. 
Certification evaluation - The technical evaluation of a system's security features, made as part of and in support of the approval/accreditation process, that establishes the extent to which a particular computer system's design and implementation meet a set of specified security requirements. 
Computer security subsystem - Hardware, firmware and/or software which are added to a computer system to enhance the security of the overall system. 
Group user - A user of a computer system whose system identification is the name of a defined group of users on that system. 
Individual user - A user of a computer system whose system identification is unique, in that no other user on that system has that same identification. 
Named object - An object which is directly manipulable at the TCB interface. The object must have meaning to more than one process. 
Product evaluation - The technical evaluation of a product's security features to determine the level of trust that can be placed in that product as defined by the NCSC evaluation criteria for that type of product (e.g., operating system, database management system, computer network, computer security subsystem). Product evaluations do not consider the application of the product in the evaluation. 
Protected system - The system being protected. In the context of computer security subsystems, a stand-alone computer system or a computer network to which a subsystem is attached to provide some computer security function. 
Security Relevant Portion (SRP) - The protection-critical mechanism of the subsystem, the subsystem's interface(s) to the protected system, and interfaces to the mechanisms providing required supporting functions. For most cases, the SRP encompasses the entire subsystem. 
Subsystem - See "computer security subsystem." 
System - The combination of the protected system and the computer security subsystem. 
*U.S. GOVERNMENT PRINTING OFFICE: 1989-225-703

The Tan Book: A Guide to Understanding Audit in Trusted Systems

 

                                              NCSC-TG-001 
                                         Library No. S-228,470 

                          FOREWORD 

This publication, "A Guide to Understanding Audit in Trusted 
Systems," is being issued by the National Computer Security 
Center (NCSC) under the authority of and in accordance with 
Department of Defense (DoD) Directive 5215.1.  The guidelines 
described in this document provide a set of good practices 
related to the use of auditing in automatic data processing 
systems employed for processing classified and other sensitive 
information. Recommendations for revision to this guideline are 
encouraged and will be reviewed biannually by the National 
Computer Security Center through a formal review process.  
Address all proposals for revision through appropriate channels 
to:  

       National Computer Security Center 
       9800 Savage Road 
       Fort George G. Meade, MD  20755-6000  

       Attention: Chief, Computer Security Technical Guidelines 

_________________________________ 
Patrick R. Gallagher, Jr.                     28 July 1987 
Director 
National Computer Security Center  

                          ACKNOWLEDGEMENTS 

Special recognition is extended to James N. Menendez, National 
Computer Security Center (NCSC), as project manager of the 
preparation and production of this document. 

Acknowledgement is also given to the NCSC Product Evaluations 
Team who provided the technical guidance that helped form this 
document and to those members of the computer security community 
who contributed their time and expertise by actively
participating in the review of this document. 

                          CONTENTS 

FOREWORD 

ACKNOWLEDGEMENTS 

CONTENTS 

PREFACE 

1. INTRODUCTION 

    1.1 HISTORY OF THE NATIONAL COMPUTER SECURITY CENTER 
    1.2 GOAL OF THE NATIONAL COMPUTER SECURITY CENTER 

2. PURPOSE 

3. SCOPE 

4. CONTROL OBJECTIVES 

5. OVERVIEW OF AUDITING PRINCIPLES 

    5.1 PURPOSE OF THE AUDIT MECHANISM 
    5.2 USERS OF THE AUDIT MECHANISM 
    5.3 ASPECTS OF EFFECTIVE AUDITING 

         5.3.1 Identification/Authentication 
         5.3.2 Administrative 
         5.3.3 System Design 

    5.4 SECURITY OF THE AUDIT 

6. MEETING THE CRITERIA REQUIREMENTS 

    6.1 THE C2 AUDIT REQUIREMENT 

         6.1.1 Auditable Events 
         6.1.2 Auditable Information 
         6.1.3 Audit Basis 

    6.2 THE B1 AUDIT REQUIREMENT 

         6.2.1 Auditable Events 
         6.2.2 Auditable Information 
         6.2.3 Audit Basis 

    6.3 THE B2 AUDIT REQUIREMENT 

         6.3.1 Auditable Events 
         6.3.2 Auditable Information 
         6.3.3 Audit Basis 

    6.4 THE B3 AUDIT REQUIREMENT 

         6.4.1 Auditable Events 
         6.4.2 Auditable Information 
         6.4.3 Audit Basis 

    6.5 THE A1 AUDIT REQUIREMENT 

         6.5.1 Auditable Events 
         6.5.2 Auditable Information 
         6.5.3 Audit Basis 

7. POSSIBLE IMPLEMENTATION METHODS 

    7.1 PRE/POST SELECTION OF AUDITABLE EVENTS 

         7.1.1 Pre-Selection 
         7.1.2 Post-Selection 

    7.2 DATA COMPRESSION 
    7.3 MULTIPLE AUDIT TRAILS 
    7.4 PHYSICAL STORAGE 
    7.5 WRITE-ONCE DEVICE 
    7.6 FORWARDING AUDIT DATA 

8. OTHER TOPICS 

    8.1 AUDIT DATA REDUCTION 
    8.2 AVAILABILITY OF AUDIT DATA 
    8.3 AUDIT DATA RETENTION 
    8.4 TESTING 
    8.5 DOCUMENTATION 
    8.6 UNAVOIDABLE SECURITY RISKS 

         8.6.1 Auditing Administrators/Insider Threat 
         8.6.2 Data Loss 

9. AUDIT SUMMARY 

GLOSSARY 

REFERENCES 

                          PREFACE                

Throughout this guideline there will be recommendations made that
are not included in the Trusted Computer System Evaluation 
Criteria (the Criteria) as requirements.  Any recommendations 
that are not in the Criteria will be prefaced by the word 
"should," whereas all requirements will be prefaced by the word 
"shall."  It is hoped that this will help to avoid any confusion.


1.   INTRODUCTION 

1.1   History of the National Computer Security Center 

The DoD Computer Security Center (DoDCSC) was established in 
January 1981 for the purpose of expanding on the work started by 
the DoD Security Initiative.  Accordingly, the Director, National
Computer Security Center, has the responsibility for establishing
and publishing standards and guidelines for all areas of computer
security.  In 1985, DoDCSC's name was changed to the National 
Computer Security Center to reflect its responsibility for 
computer security throughout the federal government. 

1.2   Goal of the National Computer Security Center 

The main goal of the National Computer Security Center is to 
encourage the widespread availability of trusted computer 
systems.  In support of that goal a metric was created, the DoD 
Trusted Computer System Evaluation Criteria (the Criteria), 
against which computer systems could be evaluated for security.  
The Criteria was originally published on 15 August 1983 as CSC- 
STD-001-83.  In December 1985 the DoD adopted it, with a few 
changes, as a DoD Standard, DoD 5200.28-STD.  DoD Directive 
5200.28, "Security Requirements for Automatic Data Processing 
(ADP) Systems" has been written to, among other things, require 
the Department of Defense Trusted Computer System Evaluation 
Criteria to be used throughout the DoD.  The Criteria is the 
standard used for evaluating the effectiveness of security 
controls built into ADP systems.  The Criteria is divided into 
four divisions: D, C, B, and A, ordered in a hierarchical manner 
with the highest division (A) being reserved for systems 
providing the best available level of assurance.  Within 
divisions C and B there are a number of subdivisions known as 
classes, which are also ordered in a hierarchical manner to 
represent different levels of security in these classes.   

2.   PURPOSE 

For Criteria classes C2 through A1 the Criteria requires that a 
user's actions be open to scrutiny by means of an audit.  The 
audit process of a secure system is the process of recording, 
examining, and reviewing any or all security-relevant activities 
on the system.  This guideline is intended to discuss issues 
involved in implementing and evaluating an audit mechanism.  The 
purpose of this document is twofold.  It provides guidance to 
manufacturers on how to design and incorporate an effective audit
mechanism into their system, and it provides guidance to 
implementors on how to make effective use of the audit 
capabilities provided by trusted systems.  This document contains
suggestions as to what information should be recorded on the 
audit trail, how the audit should be conducted, and what 
protective measures should be accorded to the audit resources. 

Any examples in this document are not to be construed as the only
implementations that will satisfy the Criteria requirement.  The 
examples are merely suggestions of appropriate implementations.  
The recommendations in this document are also not to be construed
as supplementary requirements to the Criteria. The Criteria is 
the only metric against which systems are to be evaluated.   

This guideline is part of an on-going program to provide helpful 
guidance on Criteria issues and the features they address. 

3.   SCOPE 

An important security feature of Criteria classes C2 through A1 
is the ability of the ADP system to audit any or all of the 
activities on the system.  This guideline will discuss auditing 
and the features of audit facilities as they apply to computer 
systems and products that are being built with the intention of 
meeting the requirements of the Criteria. 

4.  CONTROL OBJECTIVES

The Trusted Computer System Evaluation Criteria gives the 
following as the Accountability Control Objective: 

    "Systems that are used to process or handle classified or 
     other sensitive information must assure individual          
     accountability whenever either a mandatory or               
     discretionary security policy is invoked.  Furthermore, to  
     assure accountability the capability must exist for an 
     authorized and competent agent to access and evaluate       
     accountability information by a secure means, within a      
     reasonable amount of time and without undue difficulty."(1) 

The Accountability Control Objective as it relates to auditing 
leads to the following control objective for auditing: 

    "A trusted computer system must provide authorized personnel 
     with the ability to audit any action that can potentially  
     cause access to, generation of, or effect the release 
     of classified or sensitive information.  The audit 
     data will be selectively acquired based on the auditing 
     needs of a particular installation and/or application.      
     However, there must be sufficient granularity in the audit  
     data to support tracing the auditable events to a specific  
     individual (or process) who has taken the actions or on     
     whose behalf the actions were taken."(1)   

5.   OVERVIEW OF AUDITING PRINCIPLES 

Audit trails are used to detect and deter penetration of a
computer system and to reveal usage that identifies misuse.  At
the discretion of the auditor, audit trails may be limited to
specific events or may encompass all of the activities on a
system.  Although not required by the TCSEC, it should be
possible for the target of the audit mechanism to be either a
subject or an object.  That is to say, the audit mechanism should
be capable of monitoring every time John accessed the system as
well as every time the nuclear reactor file was accessed; and
likewise every time John accessed the nuclear reactor file. 

5.1   Purpose of the Audit Mechanism 

The audit mechanism of a computer system has five important
security goals.  First, the audit mechanism must "allow the
review of patterns of access to individual objects, access
histories of specific processes and individuals, and the use of
the various protection mechanisms supported by the system and
their effectiveness."(2)  Second, the audit mechanism must allow
discovery of both users' and outsiders' repeated attempts to
bypass the protection mechanisms.  Third, the audit mechanism
must allow discovery of any use of privileges that may occur when
a user assumes a functionality with privileges greater than his
or her own, i.e., programmer to administrator.  In this case
there may be no bypass of security controls but nevertheless a
violation is made possible.  Fourth, the audit mechanism must act
as a deterrent against perpetrators' habitual attempts to bypass
the system protection mechanisms.  However, to act as a
deterrent, the perpetrator must be aware of the audit mechanism's
existence and its active use to detect any attempts to bypass
system protection mechanisms.  The fifth goal of the audit
mechanism is to supply "an additional form of user assurance that
attempts to bypass the protection mechanisms are recorded and
discovered."(2)  Even if the attempt to bypass the protection
mechanism is successful, the audit trail will still provide
assurance by its ability to aid in assessing the damage done by
the violation, thus improving the system's ability to control the
damage. 

5.2.  Users of the Audit Mechanism 

"The users of the audit mechanism can be divided into two groups. 
The first group consists of the auditor, who is an individual
with administrative duties, who selects the events to be audited
on the system, sets up the audit flags which enable the recording
of those events, and analyzes the trail of audit events."(2)  In
some systems the duties of the auditor may be encompassed in the
duties of the system security administrator.  Also, at the lower
classes, the auditor role may be performed by the system
administrator.  This document will refer to the person
responsible for auditing as the system security administrator,
although it is understood that the auditing guidelines may apply
to system administrators and/or system security administrators
and/or a separate auditor in some ADP systems.   

"The second group of users of the audit mechanism consists of the
system users themselves; this group includes the administrators,
the operators, the system programmers, and all other users.  They
are considered users of the audit mechanism not only because
they, and their programs, generate audit events,"(2) but because
they must understand that the audit mechanism exists and what
impact it has on them.  This is important because otherwise the
user deterrence and user assurance goals of the audit mechanism
cannot be achieved.    

5.3  Aspects of Effective Auditing 

5.3.1.  Identification/Authentication 

 Logging in on a system normally requires that a user enter the 
specified form of identification (e.g., login ID, magnetic strip) 
and a password (or some other mechanism) for authentication. 
Whether this information is valid or invalid, the execution of
the login procedure is an auditable event and the identification
entered may be considered to be auditable information.  It is
recommended that authentication information, such as passwords,
not be forwarded to the audit trail.  In the event that the
identification entered is not recognized as being valid, the
system should also omit this information from the audit trail. 
The reason for this is that a user may have entered a password
when the system expected a login ID.  If the information had been
written to the audit trail, it would compromise the password and
the security of the user. 

There are, however, environments where the risk involved in 
recording invalid identification information is reduced.  In
systems that support formatted terminals, the likelihood of
password entry in the identification field is markedly reduced,
hence the recording of identification information would pose no
major threat.  The benefit of recording the identification
information is that break-in attempts would be easier to detect
and identifying the perpetrator would also be assisted.  The 
information gathered here may be necessary for any legal 
prosecution that may follow a security  violation.    

5.3.2  Administrative 

All systems rated at class C2 or higher shall have audit 
capabilities and personnel designated as responsible for the
audit procedures.  For the C2 and B1 classes, the duties of the
system operators could encompass all functions including those of
the auditor.  Starting at the B2 class, there is a requirement
for the TCB to support separate operator and administrator
functions.  In addition, at the B3 class and above, there is a
requirement to identify the system security administrator
functions.  When one assumes the system security administrator
role on the system, it shall be after taking distinct auditable
action, e.g., login procedure.  When one with the privilege of
assuming the role is on the system, the act of assuming that role
shall also be an auditable event. 

5.3.3   System Design 

The system design should include a mechanism to invoke the audit 
function at the request of the system security administrator.  A 
mechanism should also be included to determine if the event is to
be selected for inclusion as an audit trail entry.  If
pre-selection of events is not implemented, then all auditable
events should be forwarded to the audit trail.  The Criteria
requirement for the administrator to be able to select events
based on user identity and/or object security classification must
still be able to be satisfied.  This requirement can be met by
allowing post-selection of events through the use of queries. 
Whatever reduction tool is used to analyze the audit trail shall
be provided by the vendor.  

5.4   Security of the Audit 

Audit trail software, as well as the audit trail itself, should
be protected by the Trusted Computing Base and should be subject
to strict access controls.  The security requirements of the
audit mechanism are the following: 

(1)  The event recording mechanism shall be part of the TCB and  
     shall be protected from unauthorized modification or        
     circumvention. 

(2)  The audit trail itself shall be protected by the TCB from   
     unauthorized access (i.e., only the audit personnel may     
     access the audit trail).  The audit trail shall also be     
     protected from unauthorized modification.  

(3)  The audit-event enabling/disabling mechanism shall be part  
     of the TCB and shall remain inaccessible to the unauthorized 
     users.(2)  

At a minimum, the data on the audit trail should be considered to
be sensitive, and the audit trail itself shall be considered to
be as sensitive as the most sensitive data contained in the
system. 

When the medium containing the audit trail is physically removed 
from the ADP system, the medium should be accorded the physical 
protection required for the highest sensitivity level of data 
contained in the system. 
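On a conventional system, the flavor of these controls can be seen in how an audit trail file might be created. The sketch below is illustrative only (the requirements above concern TCB protection, not any particular operating system call), and the path passed to it is hypothetical.

    import os

    def open_audit_trail(path):
        """Create the audit trail readable and writable only by the owning audit
        account (mode 0600) and open it append-only, so records already written
        cannot be overwritten through this descriptor."""
        fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o600)
        return os.fdopen(fd, "a")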

6.   MEETING THE CRITERIA REQUIREMENTS 

This section of the guideline will discuss the audit requirements
in the Criteria and will present a number of additional 
recommendations.  There are four levels of audit requirements. 
The first level is at the C2 Criteria class and the requirements 
continue evolving through the B3 Criteria class.   At each of
these levels, the guideline will list some of the events which
should be auditable, what information should be on the audit
trail, and on what basis events may be selected to be audited. 
All of the requirements will be prefaced by the word "shall," and
any additional recommendations will be prefaced by the word
"should." 

6.1   The C2 Audit Requirement 

6.1.1   Auditable Events 

The following events shall be subject to audit at the C2 class:  

   * Use of identification and authentication mechanisms 

   * Introduction of objects into a user's address space  

   * Deletion of objects from a user's address space 

   * Actions taken by computer operators and system              
     administrators and/or system security administrators    

   * All security-relevant events (as defined in Section 5 of    
     this guideline) 

   * Production of printed output 

6.1.2   Auditable Information 

The following information shall be recorded on the audit trail at
the C2 class:  

   * Date and time of the event 

   * The unique identifier on whose behalf the subject generating 
     the event was operating 

   * Type of event 

   * Success or failure of the event 

   * Origin of the request (e.g., terminal ID) for               
     identification/authentication events 

   * Name of object introduced, accessed, or deleted from a      
    user's address space 

   * Description of modifications made by the system             
     administrator to the user/system security databases   

6.1.3   Audit Basis 

At the C2 level, the ADP System Administrator shall be able to
audit based on individual identity. 

The ADP System Administrator should also be able to audit based
on object identity. 
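By way of illustration only, the information listed in Section 6.1.2 might be carried in a record structure such as the following Python sketch; the field names are hypothetical and the Criteria does not mandate any particular layout.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class C2AuditRecord:
        date_time: str                      # date and time of the event
        user_id: str                        # unique identifier of the responsible user
        event_type: str                     # e.g., "login", "file_open", "object_delete"
        success: bool                       # success or failure of the event
        origin: Optional[str] = None        # terminal ID, for I&A events
        object_name: Optional[str] = None   # for object introduction/deletion events
        admin_change: Optional[str] = None  # description of user/security database changes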

6.2   The B1 Audit Requirement 

6.2.1   Auditable Events 

The Criteria specifically adds the following to the list of
events that shall be auditable at the B1 class: 

   * Any override of human readable output markings (including   
     overwrite of sensitivity label markings and the turning off 
     of labelling capabilities) on paged, hard-copy output       
   devices 

   * Change of designation (single-level to/from multi-level) of 
     any communication channel or I/O device 

   * Change of sensitivity level(s) associated with a            
   single-level communication channel or I/O device 

   * Change of range designation of any multi-level communication 
     channel or I/O device  

6.2.2   Auditable Information 

The Criteria specifically adds the following to the list of 
information that shall be recorded on the audit trail at the B1  
class: 

   * Security level of the object 

The following information should also be recorded on the audit
trail at the B1 class: 

   * Subject sensitivity level  

6.2.3   Audit Basis 

In addition to previous selection criteria, at the B1 level the 
Criteria specifically requires that the ADP System Administrator 
shall be able to audit based on individual identity and/or object
security level. 

6.3   The B2 Audit Requirement 

6.3.1   Auditable Events 

The Criteria specifically adds the following to the list of
events that shall be auditable at the B2 class: 

   * Events that may exercise covert storage channels  

6.3.2   Auditable Information 

No new requirements have been added at the B2 class. 

6.3.3   Audit Basis 

In addition to previous selection criteria, at the B2 level the 
Criteria specifically requires that "the TCB shall be able to
audit the identified events that may be used in the exploitation
of covert storage channels."  The Trusted Computing Base shall
audit covert storage channels that exceed ten bits per second.(1) 

The Trusted Computing Base should also provide the capability to 
audit the use of covert storage mechanisms with bandwidths that
may exceed a rate of one bit in ten seconds.  

6.4   The B3 Audit Requirement 

6.4.1   Auditable Events 

The Criteria specifically adds the following to the list of
events that shall be auditable at the B3 class: 

   * Events that may indicate an imminent violation of the 
     system's security policy (e.g., exercise covert timing      
     channels) 

6.4.2   Auditable Information 

No new requirements have been added at the B3 class. 

6.4.3   Audit Basis 

In addition to previous selection criteria, at the B3 level the  
Criteria specifically requires that "the TCB shall contain a 
mechanism that is able to monitor the occurrence or accumulation
of security auditable events that may indicate an imminent
violation of security policy.  This mechanism shall be able to
immediately notify the system security administrator when
thresholds are exceeded and, if the occurrence or accumulation of
these security-relevant events continues, the system shall take
the least disruptive action to terminate the event."(1)     

Events that would indicate an imminent security violation would 
include events that utilize covert timing channels that may
exceed a rate of ten bits per second and any repeated
unsuccessful login attempts.   

Being able to immediately notify the system security
administrator when thresholds are exceeded means that the
mechanism shall be able to recognize, report, and respond to a
violation of the security policy more rapidly than required at
lower levels of the Criteria, which usually only requires the
System Security Administrator to review an audit trail at some
time after the event.  Notification of the violation "should be
at the same priority as any other TCB message to an operator."(5) 

"If the occurrence or accumulation of these security-relevant
events continues, the system shall take the least disruptive
action to terminate the event."(1)  These actions may include
locking the terminal of the user who is causing the event or
terminating the suspect's process(es).  In general, the least
disruptive action is application dependent and there is no
requirement to demonstrate that the action is the least
disruptive of all possible actions.  Any action which terminates
the event is acceptable, but halting the system should be the
last resort.   

6.5   The A1 Audit Requirement 

6.5.1   Auditable Events 

No new requirements have been added at the A1 class. 

6.5.2   Auditable Information 

No new requirements have been added at the A1 class. 

6.5.3   Audit Basis 

No new requirements have been added at the A1 class. 

7.   POSSIBLE IMPLEMENTATION METHODS 

The techniques for implementing the audit requirements will vary 
from system to system depending upon the characteristics of the 
software, firmware, and hardware involved and any optional
features that are to be available.  Technologically advanced
techniques that are available should be used to the best
advantage in the system design to provide the requisite security
as well as cost-effectiveness and performance.  

7.1   Pre/Post Selection of Auditable Events 

There is a requirement at classes C2 and above that all security-
relevant events be auditable.  However, these events may or may
not always be recorded on the audit trail.  Options that may be 
exercised in selecting which events should be audited include a
pre-selection feature and a post-selection feature.  A system may
choose to implement both options, a pre-selection option only, or
a post-selection option only.  

If a system developer chooses not to implement a general pre/post
selection option, there is still a requirement to allow the 
administrator to selectively audit the actions of specified users
for all Criteria classes.  Starting at the B1 class, the 
administrator shall also be able to audit based on object
security level. 

There should be options to allow selection by either individuals
or groups of users.  For example, the administrator may select
events related to a specified individual or select events related
to individuals included in a specified group.  Also, the
administrator may specify that events related to the audit file
be selected or, at classes B1 and above, that accesses to objects
with a given sensitivity level, such as Top Secret, be selected. 

7.1.1   Pre-Selection 

For each auditable event the TCB should contain a mechanism to 
indicate if the event is to be recorded on the audit trail.  The 
system security administrator or designee shall be the only
person authorized to select the events to be recorded. 
Pre-selection may be by user(s) identity, and at the B1 class and
above, pre-selection may also be possible by object security
level.  Although the system security administrator shall be
authorized to select which events are to be recorded, the system
security administrator should not be able to exclude himself from
being audited. 

Although it would not be recommended, the system security  
administrator may have the capability to select that no events be
recorded regardless of the Criteria requirements.  The intention 
here is to provide flexibility.  The purpose of designing audit 
features into a system is not to impose the Criteria on users
that may not want it, but merely to provide the capability to
implement the requirements. 

A disadvantage of pre-selection is that it is very hard to
predict what events may be of security-relevant interest at a
future date.  There is always the possibility that events not
pre-selected could one day become security-relevant, and the
potential loss from not auditing these events would be impossible
to determine. 

The advantage of pre-selection could possibly be better
performance as a result of not auditing all the events on the
system. 
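A minimal sketch of one possible pre-selection mechanism follows; the flag table, its keys, and the administrator account name are assumptions rather than a required design.

    # Audit flags set only by the system security administrator (or designee).
    AUDIT_FLAGS = {
        ("jsmith", "file_open"): True,
        ("*", "login_failure"): True,      # audit this event type for every user
    }

    def pre_selected(user, event_type):
        """Decide, before recording, whether this event goes to the audit trail."""
        if user == "security_admin":
            return True                    # the administrator cannot exclude himself
        flag = AUDIT_FLAGS.get((user, event_type))
        if flag is None:
            flag = AUDIT_FLAGS.get(("*", event_type), False)
        return flag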

7.1.2   Post-Selection 

If the post-selection option to select only specified events from
an existing audit trail is implemented, again, only authorized 
personnel shall be able to make this selection.  Inclusion of
this option requires that the system should have trusted
facilities (as described in section 9.1) to accept
query/retrieval requests, to expand any compressed data, and to
output the requested data. 

The main advantage of post-selection is that information that may
prove useful in the future is already recorded on an audit trail
and may be queried at any time. 

The disadvantage involved in post-selection could possibly be 
degraded performance due to the writing and storing of what could
possibly be a very large audit trail. 

7.2   Data Compression 

"Since a system that selects all events to be audited may
generate a large amount of data, it may be necessary to encode
the data to conserve space and minimize the processor time
required" to record the audit records.(3)  If the audit trail is
encoded, a complementary mechanism must be included to decode the
data when required.  The decoding of the audit trail may be done
as a preprocess before the audit records are accessed by the
database or as a postprocess after a relevant record has been 
found.  Such decoding is necessary to present the data in an 
understandable form both at the administrator's terminal and on
batch reports.  The cost of compressing the audit trail would be
the time required for the compression and expansion processes. 
The benefit of compressing data is the savings in storage and the
savings in time to write the records to the audit trail.  
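As an illustration of the trade-off, the sketch below compresses each record before it is written and expands it again when queried; zlib is simply one convenient encoding, not a recommendation, and the record format follows the earlier hypothetical examples.

    import json
    import zlib

    def encode_record(record):
        """Compress an audit record before writing it to the audit trail."""
        return zlib.compress(json.dumps(record).encode("utf-8"))

    def decode_record(blob):
        """Expand a stored record so it can be presented in readable form."""
        return json.loads(zlib.decompress(blob).decode("utf-8"))

    packed = encode_record({"user": "jsmith", "event": "file_open", "success": True})
    print(decode_record(packed))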

7.3   Multiple Audit Trails 

All events included on the audit trail may be written as part of
the same audit trail, but some systems may prefer to have several
distinct audit trails, e.g., one would be for "user" events, one
for "operator" events, and one for "system security
administrator" events.  This would result in several smaller
trails for subsequent analysis.  In some cases, however, it may
be necessary to combine the information from the trails when
questionable events occur in order to obtain a composite of the
sequence of events as they occurred.  In cases where there are
multiple audit trails, it is preferred that there be some
accurate, or at least synchronized, time stamps across the
multiple logs.    

Although the preference for several distinct audit trails may be 
present, it is important to note that it is often more useful
that the TCB be able to present all audit data as one
comprehensive audit trail. 
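Combining several trails into one time-ordered view might look like the sketch below, which assumes that each record carries a comparable time stamp and that each individual trail is already in time order.

    import heapq

    def merge_trails(*trails):
        """Merge several audit trails into one sequence ordered by time stamp."""
        return list(heapq.merge(*trails, key=lambda record: record["time"]))

    user_trail     = [{"time": "10:01:00", "event": "login",  "user": "jsmith"}]
    operator_trail = [{"time": "10:00:30", "event": "backup", "user": "operator"}]
    print(merge_trails(user_trail, operator_trail))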

7.4   Physical Storage 

A factor to consider in the selection of the medium to be used
for the audit trail would be the expected usage of the system. 
The I/O volume for a system with few users executing few
applications would be quite different from that of a large system
with a multitude of users performing a variety of applications. 
In any case, however, the system should notify the system
operator or administrator when the audit trail medium is
approaching its storage capacity.  Adequate advance notification
to the operator is especially necessary if human intervention is
required.   

If the audit trail storage medium is saturated before it is 
replaced, the operating system shall detect this and take some 
appropriate action such as: 

1.  Notifying the operator that the medium is "full" and action  
    is necessary.  The system should then stop and require       
    rebooting.  Although a valid option, this action creates a   
    severe threat of denial-of-service attacks. 

2.  Storing the current audit records on a temporary medium with 
    the intention of later migration to the normal operational   
    medium, thus allowing auditing to continue.  This temporary  
    storage medium should be afforded the same protection as the 
    regular audit storage medium in order to prevent any attempts 
    to tamper with it. 

3.  Delaying input of new actions and/or slowing down current    
    operations to prevent any action that requires use of the    
    audit mechanism. 

4.  Stopping until the administrative personnel make more space  
    available for writing audit records.    

5.  Stopping auditing entirely as a result of a decision by the  
    system security administrator. 

Any action that is taken in response to storage overflow shall be 
audited.  There is, however, a case in which the action taken may
not be audited that deserves mention.  It is possible to have the
system security administrator's decisions embedded in the system 
logic.  Such pre-programmed choices, embedded in the system
logic, may be triggered automatically and this action may not be
audited. 

Still another consideration is the speed at which the medium 
operates.  It should be able to accommodate the "worst case" 
condition such as when there are a large number of users on the 
system and all auditable events are to be recorded.  This worst
case rate should be estimated during the system design phase and
(when possible) suitable hardware should be selected for this
purpose. 

Regardless of how the system handles audit trail overflow, there 
must be a way to archive all of the audit data.  
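The kind of capacity check involved might resemble the sketch below; the volume path, warning threshold, and response are assumptions, and a real system would take (and audit) one of the overflow actions listed above.

    import shutil

    AUDIT_VOLUME = "/"          # substitute the dedicated audit storage volume
    WARN_FRACTION = 0.90        # notify well before the medium is actually full

    def check_audit_storage():
        """Warn the operator/administrator when the audit medium nears capacity."""
        usage = shutil.disk_usage(AUDIT_VOLUME)
        if usage.used / usage.total >= WARN_FRACTION:
            print("WARNING: audit trail medium approaching capacity")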

7.5   Write-Once Device 

For the lower Criteria classes (e.g., C2, B1) the audit trail may
be the major tool used in detecting security compromises. 
Implicit in this is that the audit resources should provide the
maximum protection possible.  One technique that may be employed
to protect the audit trail is to record it on a mechanism
designed to be a write-only device.  Other choices would be to
set the designated device to write-once mode by disabling the 
read mechanism.  This method could prevent an attacker from
erasing or modifying the data already written on the audit trail
because the attacker will not be able to go back and read or find
the data that he or she wishes to modify.   

If a hardware device is available that permits only the writing
of data on a medium, modification of data already recorded would
be quite difficult.  Spurious messages could be written, but to
locate and modify an already recorded message would be difficult. 
Use of a write-once device does not prevent a penetrator from
modifying audit resources in memory, including any buffers, in
the current audit trail. 

If a write-once device is used to record the audit trail, the
medium can later be switched to a compatible read device to allow 
authorized personnel to analyze the information on the audit
trail in order to detect any attempts to penetrate the system. 
If a penetrator modified the audit software to prevent writing
records on the audit trail, the absence of data during an
extended period of time would indicate a possible security
compromise.  The disadvantage of using a write-once device is
that it necessitates a delay before the audit trail is available
for analysis by the administrator.  This may be offset by
allowing the system security administrator to review the audit
trail in real-time by getting copies of all audit records on
their way to the device. 

7.6   Forwarding Audit Data 

If the facilities are available, another method of protecting the
audit trail would be to forward it to a dedicated processor.  The
audit trail should then be more readily available for analysis by
the system security administrator.  
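A rough sketch of forwarding is given below; the collector host and port are hypothetical, the one-record-per-line format follows the earlier examples, and in practice the connection itself would need protection (for example, a dedicated link to the audit processor).

    import json
    import socket

    AUDIT_HOST, AUDIT_PORT = "audit-collector.example", 6514   # hypothetical collector

    def forward_record(record):
        """Send one audit record to the dedicated audit processor."""
        data = (json.dumps(record) + "\n").encode("utf-8")
        with socket.create_connection((AUDIT_HOST, AUDIT_PORT), timeout=5) as conn:
            conn.sendall(data)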

8.  OTHER TOPICS 

8.1   Audit Data Reduction 

Depending upon the amount of activity on a system and the audit 
selection process used, the audit trail size may vary.  It is a
safe assumption though, that the audit trail would grow to sizes
that would necessitate some form of audit data reduction.  The
data reduction tool would most likely be a batch program that
would interface to the system security administrator.  This batch
run could be a combination of database query language and a
report generator with the input being a standardized audit file. 

Although they are not necessarily part of the TCB, the audit 
reduction tools should be maintained under the same configuration
control system as the remainder of the system. 
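For illustration, a very small batch-style reduction might summarize the trail rather than list it; the sketch below again assumes the hypothetical one-record-per-line format used in earlier examples.

    import json
    from collections import Counter

    def summarize(audit_log_path):
        """Report event counts per (user, event type, outcome)."""
        tally = Counter()
        with open(audit_log_path) as log:
            for line in log:
                record = json.loads(line)
                tally[(record["user"], record["event"], record["success"])] += 1
        for (user, event, success), count in sorted(tally.items()):
            outcome = "success" if success else "failure"
            print("%-12s %-20s %-8s %6d" % (user, event, outcome, count))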

8.2  Availability of Audit Data 

In standard data processing, audit information is recorded as it 
occurs.  Although most information is not required to be
immediately available for real-time analysis, the system security
administrator should have the capability to retrieve audit
information within minutes of its recording.  The delay between
recording audit information and making it available for analysis
should be minimal, in the range of several minutes.   

For events which do require immediate attention, at the B3 class
and above, an alert shall be sent out to the system security 
administrator.  In systems that store the audit trail in a
buffer, the system security administrator should have the
capability to cause the buffer to be written out.  Regarding
real-time alarms, where they are sent is system dependent.   
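
The following sketch shows a buffered audit trail that the system
security administrator can force out on demand, with an immediate
alert path for selected events; the alert set and flush threshold
are illustrative assumptions, not requirements.

     import sys

     ALERT_EVENTS = {"audit_disabled", "repeated_login_failure"}  # assumed set

     class BufferedAuditTrail:
         def __init__(self, path, flush_every=100):
             self._fh = open(path, "a")
             self._buffer = []
             self._flush_every = flush_every

         def record(self, event_name, line):
             self._buffer.append(line)
             if event_name in ALERT_EVENTS:
                 # Events needing immediate attention bypass the buffer delay.
                 print("SECURITY ALERT:", line, file=sys.stderr)
                 self.flush()
             elif len(self._buffer) >= self._flush_every:
                 self.flush()

         def flush(self):
             # The administrator can invoke this to write the buffer out.
             if self._buffer:
                 self._fh.write("\n".join(self._buffer) + "\n")
                 self._buffer.clear()
             self._fh.flush()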

8.3  Audit Data Retention 

The exact period of time required for retaining the audit trail  
is site dependent and should be documented in the site's
operating procedures manual.  When trying to arrive at the
optimum time for audit trail retention, any time restrictions on
the storage medium should be considered.  The storage medium used
must be able to reliably retain the audit data for the amount of
time required by the site.     

The audit trail should be reviewed at least once a week.  It is
quite possible that once a week is too long to wait; depending on
the amount of audit data the system is expected to generate, this
interval should be adjusted accordingly.  The recommended time
between audit trail reviews should be documented in the Trusted
Facility Manual.
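
For illustration, a small housekeeping script of the following
kind could identify archived audit trails that have passed a
site-defined retention period; the retention value and archive
location are placeholders to be taken from the site's operating
procedures manual.

     import os
     import time

     RETENTION_DAYS = 365                  # site dependent; placeholder value
     ARCHIVE_DIR = "/var/audit/archive"    # hypothetical archive location

     def expired_archives(now=None):
         cutoff = (now or time.time()) - RETENTION_DAYS * 86400
         for name in sorted(os.listdir(ARCHIVE_DIR)):
             path = os.path.join(ARCHIVE_DIR, name)
             if os.path.getmtime(path) < cutoff:
                 yield path

     if __name__ == "__main__":
         for path in expired_archives():
             # Disposal itself remains a site decision and should be logged.
             print("eligible for disposal:", path)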

8.4  Testing 

The audit resources, along with all other resources protected by
the TCB, have increasing assurance requirements at each higher
Criteria class.  For the lower classes, an audit trail would be a
major factor in detecting penetration attempts.  Unfortunately,
at these lower classes, the audit resources are more susceptible
to penetration and corruption.  "The TCB must provide some
assurance that the data will still be there when the
administrator tries to use it."(3)  The testing requirement
recognizes the vulnerability of the audit trail, and starting
with the C2 class, shall include a search for obvious flaws that
would corrupt or destroy the audit trail.  The existence of such
flaws indicates that the system can be penetrated and the audit
trail corrupted or destroyed.  Testing should also be
performed to uncover any ways of circumventing the audit
mechanisms.  The "flaws found in testing may be neutralized in 
any of a number of ways.  One way available to the system
designer is to audit all uses of the mechanism in which the flaw
is found and to log such events."(3)  An attempt should be made
to remove the flaw.   

At class B2 and above, all detected flaws must be corrected or a
lower rating will be given.  If the audit trail produced during
testing appears valid, analysis of its data can verify whether it
accurately reflects the events that should be included on the
audit trail.  Even though
system assurances may increase at the higher classes, the audit
trail is still an effective tool during the testing phase as well
as operationally in detecting actual or potential security
compromises. 
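
One very small example of the kind of check such testing might
include is shown below.  It simply verifies that an ordinary,
non-administrative process is denied write access to the audit
trail; the trail's location is assumed for the example.

     import os

     AUDIT_TRAIL = "/var/audit/current"    # hypothetical audit trail location

     def audit_trail_write_protected():
         # Run as an ordinary user.  An obvious flaw of the kind the
         # testing requirement targets would be a writable audit trail.
         try:
             fd = os.open(AUDIT_TRAIL, os.O_WRONLY)
         except PermissionError:
             return True                   # expected: write access denied
         except FileNotFoundError:
             raise SystemExit("audit trail not found; test inconclusive")
         os.close(fd)
         return False                      # flaw: the trail could be modified

     if __name__ == "__main__":
         print("protected" if audit_trail_write_protected() else "FLAW FOUND")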

8.5  Documentation  

Starting at the C2 class, documentation concerning the audit 
requirements shall be contained in the Trusted Facility Manual.  
The Trusted Facility Manual shall explain the procedures to
record, examine, and maintain audit files.  It shall detail the
audit record structure for each type of audit event, and should
include what each field is and what the size of the field is. 

The Trusted Facility Manual shall also include a complete
description of the audit mechanism interface, how it should be
used, its default settings, cautions about the trade-offs
involved in using various configurations and capabilities, and
how to set up and run the system such that the audit data is 
afforded appropriate protection. 

If audit events can be pre- or post-selected, the manual should
also describe the tools and mechanisms available and how they are
to be used. 

8.6  Unavoidable Security Risks 

Certain risks are inherent in the audit process simply because
there is no way to prevent some events from ever occurring.
Because unpredictable factors, e.g., human action and natural
events, are involved in auditing, the audit mechanism may never
be one hundred percent reliable.  Preventive measures may be
taken to minimize the likelihood of these factors adversely
affecting the security provided by the audit mechanism, but no
audit mechanism will ever be risk free.

8.6.1   Auditing Administrators/Insider Threat 

Even with auditing mechanisms in place to detect and deter
security violations, the threat of the perpetrator actually being
the system security administrator or someone involved with the
system security design will always be present.  It is quite
possible that the system security administrator of a secure
system could stop the auditing of activities while entering the
system and corrupting files for personal benefit.  These
authorized personnel, who may also have access to identification
and authentication information, could also choose to enter the
system disguised as another user in order to commit crimes under
a false identity.  

Management should be aware of this risk and should be certain to 
exercise discretion when selecting the system security 
administrator.  The person who is to be selected for a trusted 
position, such as the system security administrator, should be 
subject to a background check before being granted the privileges
that could one day be used against the employer.   

The system security administrator could also be watched to ensure
that there are no unexplained variances in normal duties.  Any 
deviation from the norm of operations may indicate that a
violation of security has occurred or is about to occur. 


An additional security measure to control this insider threat is
to ensure that the system administrator and the person
responsible for the audit are two different people.  "The
separation of the auditor's functions, databases, and access
privileges from those of the system administrator is an important
application of the separation of privilege and least privilege 
principles.  Should such a separation not be performed, and
should the administrator be allowed to undertake auditor
functions or vice-versa, the entire security function would
become the responsibility of a single, unaccountable
individual."(2) 

Another alternative may be to employ separate auditor roles. 
Such a situation may give one person the authority to turn off
the audit mechanism, while another person may have the authority
to turn it back on.  In this case no individual would be able to
turn off the audit mechanism, compromise the system, and then
turn it back on. 
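
The division of authority described above can be made concrete in
software.  The sketch below is illustrative only, with the role
names assumed, and shows an audit switch that no single role can
both turn off and turn back on.

     DISABLE_ROLE = "auditor"          # assumed: the only role that may disable auditing
     ENABLE_ROLE = "administrator"     # assumed: the only role that may re-enable it

     class AuditSwitch:
         def __init__(self):
             self.enabled = True

         def disable(self, role):
             if role != DISABLE_ROLE:
                 raise PermissionError("only the auditor may disable auditing")
             self.enabled = False

         def enable(self, role):
             if role != ENABLE_ROLE:
                 raise PermissionError("only the administrator may enable auditing")
             self.enabled = True

With the roles split this way, turning auditing off, acting, and
turning it back on requires the cooperation of two people.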

8.6.2   Data Loss 

Although the audit software and hardware are reliable security  
mechanisms, they are not infallible.  They, like the rest of the 
system, are dependent upon constant supplies of power and are  
readily subject to interruption due to mechanical or power
failures.  Their failure can cause the loss or destruction of
valuable audit data.  The system security administrator should be
aware of this risk and should establish some procedure that would
ensure that the audit trail is preserved somewhere.  The system
security administrator should duplicate the audit trail on a
removable medium at certain points in time to minimize the data
loss in the event of a system failure.  The Trusted Facility
Manual should describe the possibilities and nature of loss
exposure and how the data may be recovered if a catastrophe does
occur.

If a mechanical or power failure occurs, the system security 
administrator should ensure that audit mechanisms still function 
properly after system recovery.  For example, any auditing
mechanism options pre-selected before the system malfunction must
still be the ones in operation after the system recovery.   
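
A sketch of the periodic duplication suggested above follows; the
current trail location and the removable medium's mount point are
assumptions made for the example.

     import os
     import shutil
     import time

     AUDIT_TRAIL = "/var/audit/current"          # hypothetical current trail
     REMOVABLE_MEDIUM = "/mnt/removable/audit"   # hypothetical mount point

     def duplicate_audit_trail():
         os.makedirs(REMOVABLE_MEDIUM, exist_ok=True)
         stamp = time.strftime("%Y%m%d-%H%M%S")
         dest = os.path.join(REMOVABLE_MEDIUM, "audit-%s.copy" % stamp)
         shutil.copy2(AUDIT_TRAIL, dest)   # copy2 preserves the timestamps
         return dest

     if __name__ == "__main__":
         print("audit trail duplicated to", duplicate_audit_trail())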


9.  AUDIT SUMMARY 

For classes C2 and above, it is required that the TCB "be able to
create, maintain, and protect from modification or unauthorized 
access or destruction an audit trail of accesses to the objects
it protects."(1)  The audit trail plays a key role in performing
damage assessment in the case of a corrupted system.   

The audit trail shall keep track of all security-relevant events 
such as the use of identification and authentication mechanisms, 
introduction of objects into a user's address space, deletion of 
objects from the system, system administrator actions, and any
other events that attempt to violate the security policy of the
system.  The option should exist either to audit all activities
or to have the system security administrator select the events to
be audited.  If it is decided that all activities should be
audited, there are overhead factors to be considered.
The storage space needed for a total audit would generally
require more operator maintenance to prevent any loss of data and
to provide adequate protection.  A requirement exists that
authorized personnel shall be able to read all events recorded on
the audit trail.  Analysis of the total audit trail would be both
a difficult and time-consuming task for the administrator.  Thus,
a selection option is required which may be either a
pre-selection or post-selection option.   

The audit trail information should be sufficient to reconstruct a
complete sequence of security-relevant events and processes for a
system.  To do this, the audit trail shall contain the following 
information:  date and time of the event, user, type of event, 
success or failure of the event, the origin of the request, the
name of the object introduced into the user's address space,
accessed, or deleted from the storage system, and at the B1 class
and above, the sensitivity determination of the object. 
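
Expressed as a data structure, the information listed above might
look like the following sketch; the field names and types are
illustrative only, not a prescribed record layout.

     from dataclasses import dataclass
     from typing import Optional

     @dataclass
     class AuditRecord:
         date_time: str                      # date and time of the event
         user: str                           # user associated with the event
         event_type: str                     # e.g., login, object deletion
         success: bool                       # success or failure of the event
         origin: str                         # origin of the request (terminal ID)
         object_name: Optional[str] = None   # object introduced, accessed, or deleted
         sensitivity: Optional[str] = None   # object sensitivity label, B1 and above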

It should be remembered that the audit trail shall be included in
the Trusted Computing Base and shall be accorded the same
protection as the TCB.  The audit trail shall be subject to
strict access controls. 

An effective audit trail is necessary in order to detect and 
evaluate hostile attacks on a system.    


GLOSSARY

Administrator - Any one of a group of personnel assigned to 
supervise all or a portion of an ADP system.   

Archive - To file or store records off-line. 

Audit - To conduct the independent review and examination of 
system records and activities. 

Auditor - An authorized individual with administrative duties,
whose duties include selecting the events to be audited on the
system, setting up the audit flags which enable the recording of
those events, and analyzing the trail of audit events.(2) 

Audit Mechanism - The device used to collect, review, and/or
examine system activities. 

Audit Trail - A set of records that collectively provide
documentary evidence of processing used to aid in tracing from
original transactions forward to related records and reports,
and/or backwards from records and reports to their component
source transactions.(1) 

Auditable Event - Any event that can be selected for inclusion in
the audit trail.  These events should include, in addition to 
security-relevant events, events taken to recover the system
after failure and any events that might prove to be
security-relevant at a later time.  

Authenticated User - A user who has accessed an ADP system with a
valid identifier and authentication combination.  

Automatic Data Processing (ADP) System - An assembly of computer 
hardware, firmware, and software configured for the purpose of 
classifying, sorting, calculating, computing, summarizing, 
transmitting and receiving, storing, and retrieving data with a 
minimum of human intervention.(1) 

Category - A grouping of classified or unclassified sensitive 
information, to which an additional restrictive label is applied 
(e.g., proprietary, compartmented information) to signify that 
personnel are granted access to the information only if they have
formal approval or other appropriate authorization.(4)  

Covert Channel - A communication channel that allows a process to 
transfer information in a manner that violates the system's
security policy.(1) 


Covert Storage Channel - A covert channel that involves the
direct or indirect writing of a storage location by one process
and the direct or indirect reading of the storage location by
another process.  Covert storage channels typically involve a
finite resource (e.g., sectors on a disk) that is shared by two
subjects at different security levels.(1) 

Covert Timing Channel - A covert channel in which one process 
signals information to another by modulating its own use of
system resources (e.g., CPU time) in such a way that this
manipulation affects the real response time observed by the
second process.(1) 

Flaw - An error of commission, omission or oversight in a system 
that allows protection mechanisms to be bypassed.(1) 

Object - A passive entity that contains or receives information. 
Access to an object potentially implies access to the information
it contains.  Examples of objects are:  records, blocks, pages, 
segments, files, directories, directory trees and programs, as
well as bits, bytes, words, fields, processors, video displays, 
keyboards, clocks, printers, network nodes, etc.(1) 

Post-Selection - Selection, by authorized personnel, of specified
events that had been recorded on the audit trail. 

Pre-Selection - Selection, by authorized personnel, of the
auditable events that are to be recorded on the audit trail. 

Security Level - The combination of a hierarchical classification
and a set of non-hierarchical categories that represents the 
sensitivity of information.(1) 

Security Policy - The set of laws, rules, and practices that 
regulate how an organization manages, protects, and distributes 
sensitive information.(1) 

Security-Relevant Event - Any event that attempts to change the
security state of the system (e.g., change discretionary access
controls, change the security level of the subject, change user
password, etc.).  Also, any event that attempts to violate the
security policy of the system (e.g., too many attempts to login,
attempts to violate the mandatory access control limits of a
device, attempts to downgrade a file, etc.).(1)

Sensitive Information - Information that, as determined by a 
competent authority, must be protected because its unauthorized 
disclosure, alteration, loss, or destruction will at least cause 
perceivable damage to someone or something.(1) 


Subject - An active entity, generally in the form of a person,  
process, or device that causes information to flow among objects
or changes the system state.  Technically, a process/domain
pair.(1) 

Subject Sensitivity Level - The sensitivity level of the objects
to which the subject has both read and write access.  A subject's
sensitivity level must always be less than or equal to the
clearance of the user the subject is associated with.(4) 

System Security Administrator - The person responsible for the 
security of an Automated Information System and having the
authority to enforce the security safeguards on all others who
have access to the Automated Information System.(4)  

Trusted Computing Base (TCB) - The totality of protection
mechanisms within a computer system -- including hardware,
firmware, and software -- the combination of which is responsible
for enforcing a security policy.  A TCB consists of one or more
components that together enforce a unified security policy over a
product or system.  The ability of a TCB to correctly enforce a
security policy depends solely on the mechanisms within the TCB
and on the correct input by system administrative personnel of
parameters (e.g., a user's clearance) related to the security
policy.(1) 

User - Any person who interacts directly with a computer
system.(1) 


REFERENCES 

1.    National Computer Security Center, DoD Trusted Computer    
      System Evaluation Criteria, DoD, DoD 5200.28-STD, 1985. 

2.    Gligor, Virgil D., "Guidelines for Trusted Facility        
      Management and Audit," University of Maryland, 1985. 

3.    Brown, Leonard R., "Guidelines for Audit Log Mechanisms in 
      Secure Computer Systems," Technical Report                 
      TR-0086A(2770-29)-1, The Aerospace Corporation, 1986. 

4.    Subcommittee on Automated Information System Security,     
      Working Group #3, "Dictionary of Computer Security         
      Terminology," 23 November 1986. 

5.    National Computer Security Center, Criterion               
      Interpretation, Report No. C1-C1-02-87, 1987. 


DoD 5200.28-STD: Department of Defense Trusted Computer System Evaluation Criteria (December 26, 1985)

      
                                                               DoD 5200.28-STD
                                                                    Supersedes
                                                 CSC-STD-001-83, dtd 15 Aug 83
                                                          Library No. S225,711

                        DEPARTMENT OF DEFENSE STANDARD

                                 DEPARTMENT OF

                                    DEFENSE

                               TRUSTED COMPUTER

                               SYSTEM EVALUATION

                                   CRITERIA

                                 DECEMBER 1985

                                                             December 26, 1985 

                                   FOREWORD

This publication, DoD 5200.28-STD, "Department of Defense Trusted Computer
System Evaluation Criteria," is issued under the authority of and in accordance
with DoD Directive 5200.28, "Security Requirements for Automatic Data
Processing (ADP) Systems," and in furtherance of responsibilities assigned by
DoD Directive 5215.1, "Computer Security Evaluation Center."  Its purpose is to
provide technical hardware/firmware/software security criteria and associated
technical evaluation methodologies in support of the overall ADP system
security policy, evaluation and approval/accreditation responsibilities
promulgated by DoD Directive 5200.28.

The provisions of this document apply to the Office of the Secretary of Defense
(OSD), the Military Departments, the Organization of the Joint Chiefs of Staff,
the Unified and Specified Commands, the Defense Agencies and activities
administratively supported by OSD (hereafter called "DoD Components").

This publication is effective immediately and is mandatory for use by all DoD
Components in carrying out ADP system technical security evaluation activities
applicable to the processing and storage of classified and other sensitive DoD
information and applications as set forth herein.

Recommendations for revisions to this publication are encouraged and will be
reviewed biannually by the National Computer Security Center through a formal
review process.  Address all proposals for revision through appropriate
channels to:  National Computer Security Center, Attention:  Chief, Computer
Security Standards.

DoD Components may obtain copies of this publication through their own
publications channels.  Other federal agencies and the public may obtain copies
from:  Office of Standards and Products, National Computer Security Center,
Fort Meade, MD  20755-6000, Attention:  Chief, Computer Security Standards.

_________________________________

Donald C. Latham
Assistant Secretary of Defense
(Command, Control, Communications, and Intelligence)

                               ACKNOWLEDGEMENTS

Special recognition is extended to Sheila L. Brand, National Computer Security
Center (NCSC), who integrated theory, policy, and practice into and directed
the production of this document.

Acknowledgment is also given for the contributions of: Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, NCSC, Roger R. Schell,
former Deputy Director of NCSC, Marvin Schaefer, NCSC, and Theodore M. P. Lee,
Sperry Corp., who as original architects formulated and articulated the
technical issues and solutions presented in this document; Jeff Makey, formerly
NCSC, Warren F. Shadle, NCSC, and Carole S. Jordan, NCSC, who assisted in the
preparation of this document; James P. Anderson, James P. Anderson & Co.,
Steven B. Lipner, Digital Equipment Corp., Clark Weissman, System Development
Corp., LTC Lawrence A. Noble, formerly U.S. Air Force, Stephen T. Walker,
formerly DoD, Eugene V. Epperly, DoD, and James E. Studer, formerly Dept. of
the Army, who gave generously of their time and expertise in the review and
critique of this document; and finally, thanks are given to the computer
industry and others interested in trusted computing for their enthusiastic
advice and assistance throughout this effort.

                                   CONTENTS

          FOREWORD. . . . . . . . . . . . . . . . . . . . . . . . . . . .i

          ACKNOWLEDGMENTS . . . . . . . . . . . . . . . . . . . . . . . ii

          PREFACE . . . . . . . . . . . . . . . . . . . . . . . . . . . .v

          INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . . . . .1

                             PART I:  THE CRITERIA

          1.0  DIVISION D:  MINIMAL PROTECTION. . . . . . . . . . . . . .9

          2.0  DIVISION C:  DISCRETIONARY PROTECTION. . . . . . . . . . 11
               2.1   Class (C1):  Discretionary Security Protection . . 12
               2.2   Class (C2):  Controlled Access Protection. . . . . 15

          3.0  DIVISION B:  MANDATORY PROTECTION. . . . . . . . . . . . 19
               3.1   Class (B1):  Labeled Security Protection . . . . . 20
               3.2   Class (B2):  Structured Protection . . . . . . . . 26
               3.3   Class (B3):  Security Domains. . . . . . . . . . . 33

          4.0  DIVISION A:  VERIFIED PROTECTION . . . . . . . . . . . . 41
               4.1   Class (A1):  Verified Design . . . . . . . . . . . 42
               4.2   Beyond Class (A1). . . . . . . . . . . . . . . . . 51

                      PART II:  RATIONALE AND GUIDELINES

          5.0  CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS. . . . . 55
               5.1   A Need for Consensus . . . . . . . . . . . . . . . 56
               5.2   Definition and Usefulness. . . . . . . . . . . . . 56
               5.3   Criteria Control Objective . . . . . . . . . . . . 56

          6.0  RATIONALE BEHIND THE EVALUATION CLASSES. . . . . . . . . 63
               6.1   The Reference Monitor Concept. . . . . . . . . . . 64
               6.2   A Formal Security Policy Model . . . . . . . . . . 64
               6.3   The Trusted Computing Base . . . . . . . . . . . . 65
               6.4   Assurance. . . . . . . . . . . . . . . . . . . . . 65
               6.5   The Classes. . . . . . . . . . . . . . . . . . . . 66

          7.0  THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA . . . . 69
               7.1   Established Federal Policies . . . . . . . . . . . 70
               7.2   DoD Policies . . . . . . . . . . . . . . . . . . . 70
               7.3   Criteria Control Objective For Security Policy . . 71
               7.4   Criteria Control Objective for Accountability. . . 74
               7.5   Criteria Control Objective for Assurance . . . . . 76

          8.0  A GUIDELINE ON COVERT CHANNELS . . . . . . . . . . . . . 79

          9.0  A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL
               FEATURES . . . . . . . . . . . . . . . . . . . . . . . . 81

          10.0  A GUIDELINE ON SECURITY TESTING . . . . . . . . . . . . 83
                10.1 Testing for Division C . . . . . . . . . . . . . . 84
                10.2 Testing for Division B . . . . . . . . . . . . . . 84
                10.3 Testing for Division A . . . . . . . . . . . . . . 85

          APPENDIX A:  Commercial Product Evaluation Process. . . . . . 87

          APPENDIX B:  Summary of Evaluation Criteria Divisions . . . . 89

          APPENDIX C:  Summary of Evaluation Criteria Classes. . . . . . 91

          APPENDIX D:  Requirement Directory. . . . . . . . . . . . . . 93

          GLOSSARY. . . . . . . . . . . . . . . . . . . . . . . . . . .109

          REFERENCES. . . . . . . . . . . . . . . . . . . . . . . . . .115

                                    PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection.  They provide a basis for the evaluation of effectiveness of
security controls built into automatic data processing system products.  The
criteria were developed with three objectives in mind: (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other sensitive
information; (b) to provide guidance to manufacturers as to what to build into
their new, widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide a basis for
specifying security requirements in acquisition specifications.  Two types of
requirements are delineated for secure processing: (a) specific security
feature requirements and (b) assurance requirements.  Some of the latter
requirements enable evaluation personnel to determine if the required features
are present and functioning as intended.  The scope of these criteria is to be
applied to the set of components comprising a trusted system, and is not
necessarily to be applied to each system component individually.  Hence, some
components of a system may be completely untrusted, while others may be
individually evaluated to a lower or higher evaluation class than the trusted
product considered as a whole system.  In trusted products at the high end of
the range, the strength of the reference monitor is such that most of the
components can be completely untrusted.  Though the criteria are intended to be
application-independent, the specific security feature requirements may have to
be interpreted when applying the criteria to specific systems with their own
functional requirements, applications or special environments (e.g.,
communications processors, process control computers, and embedded systems in
general).  The underlying assurance requirements can be applied across the
entire spectrum of ADP system or application processing environments without
special interpretation.

                                 INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the Defense
Science Board to address computer security safeguards that would protect
classified information in remote-access, resource-sharing computer systems.
The Task Force report, "Security Controls for Computer Systems," published in
February 1970, made a number of policy and technical recommendations on
actions to be taken to reduce the threat of compromise of classified
information processed on remote-access computer systems.[34]  Department of
Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published
in 1972 and 1973 respectively, responded to one of these recommendations by
establishing uniform DoD policy, security requirements, administrative
controls, and technical measures to protect classified information processed
by DoD computer systems.[8;9]  Research and development work undertaken by the
Air Force, Advanced Research Projects Agency, and other defense agencies in
the early and mid 70's developed and demonstrated solution approaches for the
technical problems associated with controlling the flow of information in
resource and information sharing computer systems.[1]  The DoD Computer
Security Initiative was started in 1977 under the auspices of the Under
Secretary of Defense for Research and Engineering to focus DoD efforts
addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to define
problems and solutions for building, evaluating, and auditing secure computer
systems.[17]  As part of this work NBS held two invitational workshops on the
subject of audit and evaluation of computer security.[20;28]  The first was
held in March 1977, and the second in November of 1978.  One of the products
of the second workshop was a definitive paper on the problems related to
providing criteria for the evaluation of technical computer security
effectiveness.[20]  As an outgrowth of recommendations from this report, and in
support of the DoD Computer Security Initiative, the MITRE Corporation began
work on a set of computer security evaluation criteria that could be used to
assess the degree of trust one could place in a computer system to protect
classified data.[24;25;31]  The preliminary concepts for computer security
evaluation were defined and expanded upon at invitational workshops and
symposia whose participants represented computer security expertise drawn from
industry and academia in addition to the government.  Their work has since
been subjected to much peer review and constructive technical criticism from
the DoD, industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15]  A major goal of the Center as given in its DoD Charter is to
encourage the widespread availability of trusted computer systems for use by
those who process classified or other sensitive information.[10]  The criteria
presented in this document have evolved from the earlier NBS and MITRE
evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document apply
primarily to trusted commercially available automatic data processing (ADP)
systems.  They are also applicable, as amplified below, to the evaluation of
existing systems and to the specification of security requirements for ADP
systems acquisition.  Included are two distinct sets of requirements: 1)
specific security feature requirements; and 2) assurance requirements.  The
specific feature requirements encompass the capabilities typically found in
information processing systems employing general-purpose operating systems that
are distinct from the applications programs being supported.  However, specific
security feature requirements may also apply to specific systems with their own
functional requirements, applications or special environments (e.g.,
communications processors, process control computers, and embedded systems in
general).  The assurance requirements, on the other hand, apply to systems that
cover the full range of computing environments from dedicated controllers to
full range multilevel secure resource sharing systems.

Purpose

As outlined in the Preface, the criteria have been developed to serve a number
of intended purposes:

           * To provide a standard to manufacturers as to what security
           features to build into their new and planned, commercial 
           products in order to provide widely available systems that
           satisfy trust requirements (with particular emphasis on preventing
           the disclosure of data) for sensitive applications.

           * To provide DoD components with a metric with which to evaluate
           the degree of trust that can be placed in computer systems for
           the secure processing of classified and other sensitive
           information.

           * To provide a basis for specifying security requirements in
           acquisition specifications.

With respect to the second purpose for development of the criteria, i.e.,
providing DoD components with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be performed on a computer
product from a perspective that excludes the application environment; or, (b)
it can be done to assess whether appropriate security measures have been taken
to permit the system to be used operationally in a specific environment.  The
former type of evaluation is done by the Computer Security Center through the
Commercial Product Evaluation Process.  That process is described in Appendix
A.

The latter type of evaluation, i.e., those done for the purpose of assessing a
system's security attributes with respect to a specific operational mission,
is known as a certification evaluation.  It must be understood that the
completion of a formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific application
environment.  On the contrary, the evaluation report only provides a trusted
computer system's evaluation rating along with supporting data describing the
product system's strengths and weaknesses from a computer security point of
view.  The system security certification and the formal approval/accreditation
procedure, done in accordance with the applicable policies of the issuing
agencies, must still be followed before a system can be approved for use in
processing or handling classified information.[8;9]  Designated Approving
Authorities (DAAs) remain ultimately responsible for specifying security of
systems they accredit.

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process.  Along with applicable policy, it
will be used directly as technical guidance for evaluation of the total system
and for specifying system security and certification requirements for new
acquisitions.  Where a system being evaluated for certification employs a
product that has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation.  Technical data
will be furnished to designers, evaluators and the Designated Approving
Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system "secure."
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to read,
write, create, or delete information.  Six fundamental requirements are
derived from this basic statement of objective: four deal with what needs to
be provided to control access to information; and two deal with how one can
obtain credible assurances that this is accomplished in a trusted computer
system.

                                    Policy

          Requirement 1 - SECURITY POLICY - There must be an explicit and
well-defined security policy enforced by the system.  Given identified subjects
and objects, there must be a set of rules that are used by the system to
determine whether a given subject can be permitted to gain access to a specific
object.  Computer systems of interest must enforce a mandatory security policy
that can effectively implement access rules for handling sensitive (e.g.,
classified) information.[7]  These rules include requirements such as: No
person lacking proper personnel security clearance shall obtain access to
classified
information.  In addition, discretionary security controls are required to
ensure that only selected users or groups of users may obtain access to data
(e.g., based on a need-to-know). 

          Requirement 2 - MARKING - Access control labels must be associated
with objects.  In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible to
mark every object with a label that reliably identifies the object's
sensitivity level (e.g., classification), and/or the modes of access accorded
those subjects who may potentially access the object.

                                Accountability

          Requirement 3 - IDENTIFICATION - Individual subjects must be
identified.  Each access to information must be mediated based on who is
accessing the information and what classes of information they are authorized
to deal with.  This identification and authorization information must be
securely maintained by the computer system and be associated with every active
element that performs some security-relevant action in the system.

          Requirement 4 - ACCOUNTABILITY - Audit information must be
selectively kept and protected so that actions affecting security can be traced
to the responsible party.  A trusted system must be able to record the
occurrences of security-relevant events in an audit log.  The capability to
select the audit events to be recorded is necessary to minimize the expense of
auditing and to allow efficient analysis.  Audit data must be protected from
modification and unauthorized destruction to permit detection and
after-the-fact investigations of security violations.

                                   Assurance

          Requirement 5 - ASSURANCE - The computer system must contain
hardware/software mechanisms that can be independently evaluated to provide
sufficient assurance that the system enforces requirements 1 through 4 above. 
In order to assure that the four requirements of Security Policy, Marking,
Identification, and Accountability are enforced by a computer system, there
must be some identified and unified collection of hardware and software
controls that perform those functions.  These mechanisms are typically embedded
in the operating system and are designed to carry out the assigned tasks in a
secure manner.  The basis for trusting such system mechanisms in their
operational setting must be clearly documented such that it is possible to
independently examine the evidence to evaluate their sufficiency.

          Requirement 6 - CONTINUOUS PROTECTION - The trusted mechanisms that
enforce these basic requirements must be continuously protected against
tampering and/or unauthorized changes.  No computer system can be considered
truly secure if the basic hardware and software mechanisms that enforce the
security policy are themselves subject to unauthorized modification or
subversion.  The continuous protection requirement has direct implications
throughout the computer system's life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class.  The interested
reader is referred to Section 5 of this document, "Control Objectives for
Trusted Computer Systems," for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.

Structure of the Document

The remainder of this document is divided into two parts, four appendices, and
a glossary.  Part I (Sections 1 through 4) presents the detailed criteria
derived from the fundamental requirements described above and relevant to the
rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to: mandatory access control rules
implementation, the covert channel problem, and security testing.  It is
divided into six sections.  Section 5 discusses the use of control objectives
in general and presents the three basic control objectives of the criteria.
Section 6 provides the theoretical basis behind the criteria.  Section 7 gives
excerpts from pertinent regulations, directives, OMB Circulars, and Executive
Orders which provide the basis for many trust requirements for processing
nationally sensitive and classified information with computer systems.
Section 8 provides guidance to system developers on expectations in dealing
with the covert channel problem.  Section 9 provides guidelines dealing with
mandatory security.  Section 10 provides guidelines for security testing.
There are four appendices, including a description of the Trusted Computer
System Commercial Products Evaluation Process (Appendix A), summaries of the
evaluation divisions (Appendix B) and classes (Appendix C), and finally a
directory of requirements ordered alphabetically.  In addition, there is a
glossary.

Structure of the Criteria

The criteria are divided into four divisions: D, C, B, and A ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security.  Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information.  Within divisions C and B there are a
number of subdivisions known as classes.  The classes are also ordered in a
hierarchical manner with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess.  Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system.  The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB).  Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure.  Increased assurance that the required features are
operative, correct, and tamperproof under all circumstances is gained through
progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed.  The first three
represent features necessary to satisfy the broad control objectives of
Security Policy, Accountability, and Assurance that are discussed in Part II,
Section 5.  The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
first read Part II, before continuing on with Part I.

                             PART I:  THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not contained
in a lower class or changes and additions to already defined criteria.  Where
there is no highlighting, requirements have been carried over from lower
classes without addition or modification.

                     1.0  DIVISION D:  MINIMAL PROTECTION

This division contains only one class.  It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

                   2.0 DIVISION C:  DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1  CLASS (C1):  DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data.  It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data.  The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.  The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1  Security Policy

     2.1.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.  The
               enforcement mechanism (e.g., self/group/public controls, access
               control lists) shall allow users to specify and control sharing
               of those objects by named individuals or defined groups or both.

2.1.2  Accountability

     2.1.2.1   Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall use a protected 
               mechanism (e.g., passwords) to authenticate the user's identity.
               The TCB shall protect authentication data so that it cannot be
               accessed by any unauthorized user.
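
As an illustration only (the Criteria require a protected
mechanism but do not prescribe one), authentication data can be
kept unreadable even if disclosed by storing only a salted one-way
hash of each password, as in the following sketch.

     import hashlib
     import hmac
     import os

     def make_auth_entry(password):
         # Store only a salted one-way hash, never the password itself.
         salt = os.urandom(16)
         digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
         return salt, digest

     def authenticate(password, salt, digest):
         candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
         return hmac.compare_digest(candidate, digest)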

2.1.3  Assurance

     2.1.3.1   Operational Assurance

          2.1.3.1.1  System Architecture

                      The TCB shall maintain a domain for its own execution
                      that protects it from external interference or tampering
                      (e.g., by modification of its code or data structures).
                     Resources controlled by the TCB may be a defined subset
                     of the subjects and objects in the ADP system.

          2.1.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct operation
                     of the on-site hardware and firmware elements of the TCB.

     2.1.3.2   Life-Cycle Assurance

          2.1.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation.
                     Testing shall be done to assure that there are no obvious
                     ways for an unauthorized user to bypass or otherwise
                     defeat the security protection mechanisms of the TCB.
                     (See the Security Testing Guidelines.)

2.1.4  Documentation

     2.1.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB,
               guidelines on their use, and how they interact with one another.

     2.1.4.2   Trusted Facility Manual

               A manual addressed to the ADP System Administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.

     2.1.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
                that describes the test plan, test procedures that show how
                the security mechanisms were tested, and results of the
               security mechanisms' functional testing.

     2.1.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  If the TCB
               is composed of distinct modules, the interfaces between these
               modules shall be described.

2.2  CLASS (C2):  CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.  The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1  Security Policy

     2.2.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.  The
               enforcement mechanism (e.g., self/group/public controls, access
               control lists) shall allow users to specify and control sharing
               of those objects by named individuals, or defined groups of 
               individuals, or by both, and shall provide controls to limit
               propagation of access rights.  The discretionary access control
               mechanism shall, either by explicit user action or by default,
               provide that objects are protected from unauthorized access.
               These access controls shall be capable of including or excluding
               access to the granularity of a single user.  Access permission
               to an object by users not already possessing access permission
               shall only be assigned by authorized users.
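
Purely as an illustration of the kind of enforcement mechanism
contemplated (the Criteria do not mandate this design), an access
control list check supporting an owner, named users, defined
groups, and single-user exclusion might look like the following
sketch; all names are assumed.

     from dataclasses import dataclass, field

     @dataclass
     class Acl:
         owner: str
         denied_users: set = field(default_factory=set)   # single-user exclusion
         allowed_users: set = field(default_factory=set)
         allowed_groups: set = field(default_factory=set)

     def may_access(acl, user, groups):
         # Explicit denial is checked first, so access can be excluded
         # down to the granularity of a single user.
         if user in acl.denied_users:
             return False
         return (user == acl.owner
                 or user in acl.allowed_users
                 or bool(groups & acl.allowed_groups))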

     2.2.1.2   Object Reuse

               All authorizations to the information contained within a
               storage object shall be revoked prior to initial assignment,
               allocation or reallocation to a subject from the TCB's pool
               of unused storage objects.  No information, including encrypted
               representations of information, produced by a prior subject's
               actions is to be available to any subject that obtains access
               to an object that has been released back to the system.
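
Illustratively (the Criteria state the requirement, not the
mechanism), a storage pool can satisfy object reuse by overwriting
an object before it is returned to the pool of unused objects, as
in this toy sketch.

     class StoragePool:
         """Toy pool of fixed-size storage objects (byte buffers)."""

         def __init__(self, count, size):
             self._free = [bytearray(size) for _ in range(count)]

         def release(self, buf):
             # Revoke the prior subject's information: overwrite every
             # byte before the object becomes available for reallocation.
             buf[:] = bytes(len(buf))
             self._free.append(buf)

         def allocate(self):
             return self._free.pop()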

2.2.2  Accountability

     2.2.2.1   Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall use a protected 
               mechanism (e.g., passwords) to authenticate the user's identity.
               The TCB shall protect authentication data so that it cannot be
               accessed by any unauthorized user.  The TCB shall be able to
               enforce individual accountability by providing the capability to
               uniquely identify each individual ADP system user.  The TCB
               shall also provide the capability of associating this identity
               with all auditable actions taken by that individual.

     2.2.2.2   Audit

               The TCB shall be able to create, maintain, and protect from
               modification or unauthorized access or destruction an audit
               trail of accesses to the objects it protects.  The audit data
               shall be protected by the TCB so that read access to it is
               limited to those who are authorized for audit data.  The TCB
               shall be able to record the following types of events:  use of
                identification and authentication mechanisms, introduction of
                objects into a user's address space (e.g., file open, program
               initiation), deletion of objects, and actions taken by 
               computer operators and system administrators and/or system
               security officers, and other security relevant events.  For
               each recorded event, the audit record shall identify:  date and
               time of the event, user, type of event, and success or failure
               of the event.  For identification/authentication events the
               origin of request (e.g., terminal ID) shall be included in the
               audit record.  For events that introduce an object into a user's
               address space and  for object deletion events the audit record
               shall include the name of the object.  The ADP system
               administrator shall be able to selectively audit the actions of
               any one or more users based on individual identity.

2.2.3  Assurance

     2.2.3.1   Operational Assurance

          2.2.3.1.1  System Architecture

                     The TCB shall maintain a domain for its own execution
                     that protects it from external interference or tampering
                     (e.g., by modification of its code or data structures). 
                     Resources controlled by the TCB may be a defined subset
                     of the subjects and objects in the ADP system.  The TCB
                     shall isolate the resources to be protected so that they
                     are subject to the access control and auditing
                     requirements.

          2.2.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct operation
                     of the on-site hardware and firmware elements of the TCB.

     2.2.3.2   Life-Cycle Assurance

          2.2.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation.
                     Testing shall be done to assure that there are no obvious
                     ways for an unauthorized user to bypass or otherwise
                     defeat the security protection mechanisms of the TCB.
                     Testing shall also include a search for obvious flaws that
                     would allow violation of resource isolation, or that would
                     permit unauthorized access to the audit or authentication
                     data.  (See the Security Testing guidelines.)

2.2.4  Documentation

     2.2.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB, 
               guidelines on their use, and how they interact with one another.

     2.2.4.2   Trusted Facility Manual

               A manual addressed to the ADP system administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.  The procedures for
               examining and maintaining the audit files as well as the 
               detailed audit record structure for each type of audit event
               shall be given.

     2.2.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
               that describes the test plan, test procedures that show how the
               security mechanisms were tested, and results of the security 
               mechanisms' functional testing.

     2.2.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  If the TCB
               is composed of distinct modules, the interfaces between these
               modules shall be described.

                    3.0  DIVISION B:  MANDATORY PROTECTION

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division.  Systems in this division must carry the
sensitivity labels with major data structures in the system.  The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB.  Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

3.1  CLASS (B1):  LABELED SECURITY PROTECTION

Class (B1) systems require all the features required for class (C2).  In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information.  Any
flaws identified by testing must be removed.  The following are minimal
requirements for systems assigned a class (B1) rating:

3.1.1  Security Policy

     3.1.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.  
               The enforcement mechanism (e.g., self/group/public controls, 
               access control lists) shall allow users to specify and control
               sharing of those objects by named individuals, or defined groups
               of individuals, or by both, and shall provide controls to limit
               propagation of access rights.  The discretionary access control
               mechanism shall, either by explicit user action or by default,
               provide that objects are protected from unauthorized access. 
               These access controls shall be capable of including or excluding
               access to the granularity of a single user.  Access permission
               to an object by users not already possessing access permission
               shall only be assigned by authorized users.

     3.1.1.2   Object Reuse

               All authorizations to the information contained within a
               storage object shall be revoked prior to initial assignment,
               allocation or reallocation to a subject from the TCB's pool
               of unused storage objects.  No information, including encrypted
               representations of information, produced by a prior subject's
               actions is to be available to any subject that obtains access
               to an object that has been released back to the system.
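
As an illustrative aside (not part of the criteria text), the object reuse
requirement can be read as a discipline on the TCB's storage allocator:
revoke all prior authorizations and purge residual contents before a storage
object is handed to a new subject.  The Python sketch below uses invented
names and a deliberately simplified pool; it is not drawn from any evaluated
system.

     # Minimal sketch of an object-reuse discipline for a TCB storage pool.
     class StoragePool:
         def __init__(self, block_size, block_count):
             self.block_size = block_size
             self.free = [bytearray(block_size) for _ in range(block_count)]
             self.acl = {}                        # object id -> authorized subjects

         def allocate(self, subject):
             block = self.free.pop()
             self.acl[id(block)] = {subject}      # revoke prior authorizations
             block[:] = bytes(self.block_size)    # purge residual information
             return block

         def release(self, block):
             block[:] = bytes(self.block_size)    # scrub on release as well
             self.acl.pop(id(block), None)
             self.free.append(block)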

     3.1.1.3   Labels

               Sensitivity labels associated with each subject and storage
               object under its control (e.g., process, file, segment, device)
               shall be maintained by the TCB.  These labels shall be used as
               the basis for mandatory access control decisions.  In order to
               import non-labeled data, the TCB shall request and receive from
               an authorized user the security level of the data, and all such
               actions shall be auditable by the TCB.

          3.1.1.3.1  Label Integrity

                     Sensitivity labels shall accurately represent security
                     levels of the specific subjects or objects with which they
                     are associated.  When exported by the TCB, sensitivity
                     labels shall accurately and unambiguously represent the
                     internal labels and shall be associated with the
                     information being exported.

          3.1.1.3.2  Exportation of Labeled Information

                     The TCB shall designate each communication channel and
                      I/O device as either single-level or multilevel.  Any
                     change in this designation shall be done manually and
                     shall be auditable by the TCB.  The TCB shall maintain
                     and be able to audit any change in the security level
                     or levels associated with a communication channel or 
                     I/O device.

               3.1.1.3.2.1  Exportation to Multilevel Devices

                          When the TCB exports an object to a multilevel I/O
                          device, the sensitivity label associated with that
                          object shall also be exported and shall reside on
                          the same physical medium as the exported
                          information and shall be in the same form
                          (i.e., machine-readable or human-readable form).
                          When the TCB exports or imports an object over a 
                          multilevel communication channel, the protocol
                          used on that channel shall provide for the
                          unambiguous pairing between the sensitivity labels
                          and the associated information that is sent or
                          received.
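
As an illustrative aside (not part of the criteria text), the "unambiguous
pairing" called for above can be met by framing every protocol unit so that
the sensitivity label and the data travel together and cannot be separated in
transit.  The framing format below is an assumption chosen for brevity, not a
prescribed protocol.

     # Sketch of label/data pairing on a multilevel channel: each frame
     # carries both the sensitivity label and the payload it labels.
     import json

     def frame(label, payload_bytes):
         return json.dumps({"label": label, "data": payload_bytes.hex()}).encode()

     def unframe(frame_bytes):
         msg = json.loads(frame_bytes)
         return msg["label"], bytes.fromhex(msg["data"])

     # Example: round-trip a SECRET payload with its label intact.
     label, data = unframe(frame("SECRET", b"report"))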

               3.1.1.3.2.2  Exportation to Single-Level Devices

                          Single-level I/O devices and single-level
                          communication channels are not required to
                          maintain the sensitivity labels of the information
                          they process.  However, the TCB shall include a
                           mechanism by which the TCB and an authorized user
                          reliably communicate to designate the single
                          security level of information imported or exported
                          via single-level communication channels or I/O
                          devices.

               3.1.1.3.2.3  Labeling Human-Readable Output

                          The ADP system administrator shall be able to
                          specify the printable label names associated with
                          exported sensitivity labels.  The TCB shall mark
                          the beginning and end of all human-readable, paged,
                          hardcopy output (e.g., line printer output) with
                          human-readable sensitivity labels that properly*
                          represent the sensitivity of the output.  The TCB
                           shall, by default, mark the top and bottom of each
                          page of human-readable, paged, hardcopy output
                          (e.g., line printer output) with human-readable
                          sensitivity labels that properly* represent the
                          overall sensitivity of the output or that properly*
                          represent the sensitivity of the information on the
                          page.  The TCB shall, by default and in an
                          appropriate manner, mark other forms of human-
                          readable output (e.g., maps, graphics) with human-
                          readable sensitivity labels that properly*
                           represent the sensitivity of the output.  Any
                          override of these marking defaults shall be
                          auditable by the TCB.
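
As an illustrative aside (not part of the criteria text), the footnoted rule
for human-readable markings amounts to taking the greatest hierarchical
classification of any information in the output together with the union of
its non-hierarchical categories.  The classification ordering below is an
assumption for illustration only.

     # Sketch of the footnoted page-marking rule: greatest classification,
     # union of categories.
     LEVEL_ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

     def page_label(item_labels):
         """item_labels: non-empty list of (classification, category set) pairs."""
         top = max(item_labels, key=lambda lab: LEVEL_ORDER[lab[0]])[0]
         categories = set().union(*(cats for _, cats in item_labels))
         return top, categories

     # A page holding SECRET {NATO} and CONFIDENTIAL {CRYPTO} information
     # would be banner-marked SECRET {CRYPTO, NATO}.
     print(page_label([("SECRET", {"NATO"}), ("CONFIDENTIAL", {"CRYPTO"})]))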

     3.1.1.4  Mandatory Access Control

               The TCB shall enforce a mandatory access control policy over
               all subjects and storage objects under its control (e.g.,
               processes, files, segments, devices).  These subjects and
               objects shall be assigned sensitivity labels that are a
               combination of hierarchical classification levels and
               non-hierarchical categories, and the labels shall be used as
               the basis for mandatory access control decisions.  The TCB
               shall be able to support two or more such security levels.
               (See the Mandatory Access Control Guidelines.)  The following
               requirements shall hold for all accesses between subjects and
               objects controlled by the TCB:  a subject can read an object
               only if the hierarchical classification in the subject's
               security level is greater than or equal to the hierarchical
               classification in the object's security level and the non-
               hierarchical categories in the subject's security level include
               all the non-hierarchical categories in the object's security
               level.  A subject can write an object only if the hierarchical
               classification in the subject's security level is less than or
               equal to the hierarchical classification in the object's 
               security level and all the non-hierarchical categories in the
               subject's security level are included in the non-hierarchical
               categories in the object's security level.  Identification 
               and authentication data shall be used by the TCB to authenti-
               cate the user's identity and to ensure that the security level
               and authorization of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated
               by the clearance and authorization of that user.
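
As an illustrative aside (not part of the criteria text), the read and write
rules above are dominance checks over security levels modeled as a
hierarchical classification plus a set of non-hierarchical categories.  The
sketch below represents a level as such a pair; the numeric ordering of
classifications is an assumption for illustration.

     # Sketch of the B1 mandatory access rules as dominance checks.
     def dominates(level_a, level_b):
         """level_a dominates level_b: classification >= and category superset."""
         class_a, cats_a = level_a
         class_b, cats_b = level_b
         return class_a >= class_b and cats_a >= cats_b

     def may_read(subject_level, object_level):
         return dominates(subject_level, object_level)    # "read down"

     def may_write(subject_level, object_level):
         return dominates(object_level, subject_level)    # "write up"

     # Example: a subject at (2, {"A"}) may read (1, {"A"}) but not write it.
     subj, obj = (2, {"A"}), (1, {"A"})
     assert may_read(subj, obj) and not may_write(subj, obj)

The set comparison in dominates() is exactly the "include all the
non-hierarchical categories" condition stated in the requirement.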

3.1.2  Accountability

     3.1.2.1  Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall maintain authentication
               data that includes information for verifying the identity of
               individual users (e.g., passwords) as well as information for
                determining the clearance and authorizations of individual
_____________________________
* The hierarchical classification component in human-readable sensitivity
labels shall be equal to the greatest hierarchical classification of any of the
information in the output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical categories of the
information in the output the labels refer to, but no other non-hierarchical
categories.

               users.  This data shall be used by the TCB to authenticate the
               user's identity and to ensure that the security level and
               authorizations of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated
               by the clearance and authorization of that user.  The TCB shall
               protect authentication data so that it cannot be accessed by any
               unauthorized user.  The TCB shall be able to enforce individual
               accountability by providing the capability to uniquely identify
               each individual ADP system user.  The TCB shall also provide the
               capability of associating this identity with all auditable
               actions taken by that individual.

     3.1.2.2   Audit

               The TCB shall be able to create, maintain, and protect from
               modification or unauthorized access or destruction an audit
               trail of accesses to the objects it protects.  The audit data
               shall be protected by the TCB so that read access to it is
               limited to those who are authorized for audit data.  The TCB
               shall be able to record the following types of events: use of
               identification and authentication mechanisms, introduction of
               objects into a user's address space (e.g., file open, program
               initiation), deletion of objects, and actions taken by computer
               operators and system administrators and/or system security
               officers and other security relevant events.  The TCB shall also
               be able to audit any override of human-readable output markings.
               For each recorded event, the audit record shall identify: date
               and time of the event, user, type of event, and success or
               failure of the event.  For identification/authentication events
               the origin of request (e.g., terminal ID) shall be included in
               the audit record.  For events that introduce an object into a
               user's address space and for object deletion events the audit
               record shall include the name of the object and the object's
               security level.  The ADP system administrator shall be able to
               selectively audit the actions of any one or more users based on
               individual identity and/or object security level.
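
As an illustrative aside (not part of the criteria text), the audit
requirement names a fixed set of fields per record plus selective review by
user identity and object security level.  The record layout and helper below
are invented for illustration.

     # Sketch of an audit record with the fields named above, plus
     # selective retrieval by user and/or object security level.
     from dataclasses import dataclass
     from datetime import datetime
     from typing import Optional

     @dataclass(frozen=True)
     class AuditRecord:
         timestamp: datetime
         user: str
         event_type: str                     # e.g. "login", "file_open", "object_delete"
         success: bool
         origin: Optional[str] = None        # terminal ID for I&A events
         object_name: Optional[str] = None   # for object introduction/deletion
         object_level: Optional[str] = None  # security level of that object

     def select(records, users=None, object_level=None):
         return [r for r in records
                 if (users is None or r.user in users)
                 and (object_level is None or r.object_level == object_level)]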

3.1.3  Assurance

     3.1.3.1   Operational Assurance

          3.1.3.1.1  System Architecture

                     The TCB shall maintain a domain for its own execution
                     that protects it from external interference or tampering
                     (e.g., by modification of its code or data structures). 
                     Resources controlled by the TCB may be a defined subset
                     of the subjects and objects in the ADP system.  The TCB
                     shall maintain process isolation through the provision of
                     distinct address spaces under its control.  The TCB shall
                     isolate the resources to be protected so that they are
                     subject to the access control and auditing requirements.

          3.1.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct operation
                     of the on-site hardware and firmware elements of the TCB.

     3.1.3.2   Life-Cycle Assurance

          3.1.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation.
                     A team of individuals who thoroughly understand the
                     specific implementation of the TCB shall subject its
                     design documentation, source code, and object code to
                     thorough analysis and testing.  Their objectives shall be:
                     to uncover all design and implementation flaws that would
                     permit a subject external to the TCB to read, change, or
                     delete data normally denied under the mandatory or
                     discretionary security policy enforced by the TCB; as well
                     as to assure that no subject (without authorization to do
                     so) is able to cause the TCB to enter a state such that
                     it is unable to respond to communications initiated by
                     other users.  All discovered flaws shall be removed or
                     neutralized and the TCB retested to demonstrate that they
                     have been eliminated and that new flaws have not been
                     introduced.  (See the Security Testing Guidelines.)

          3.1.3.2.2  Design Specification and Verification

                     An informal or formal model of the security policy
                     supported by the TCB shall be maintained over the life
                     cycle of the ADP system and demonstrated to be consistent
                     with its axioms.

3.1.4  Documentation

     3.1.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB,
               guidelines on their use, and how they interact with one another.

     3.1.4.2   Trusted Facility Manual

               A manual addressed to the ADP system administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.  The procedures for
               examining and maintaining the audit files as well as the
               detailed audit record structure for each type of audit event
               shall be given.  The manual shall describe the operator and
               administrator functions related to security, to include changing
               the security characteristics of a user.  It shall provide
               guidelines on the consistent and effective use of the protection
               features of the system, how they interact, how to securely
               generate a new TCB, and facility procedures, warnings, and
               privileges that need to be controlled in order to operate the
               facility in a secure manner.

     3.1.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
               that describes the test plan, test procedures that show how the
               security mechanisms were tested, and results of the security 
               mechanisms' functional testing.

     3.1.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  If the TCB
               is composed of distinct modules, the interfaces between these
               modules shall be described.  An informal or formal description
               of the security policy model enforced by the TCB shall be
               available and an explanation provided to show that it is
               sufficient to enforce the security policy.  The specific TCB
               protection mechanisms shall be identified and an explanation
               given to show that they satisfy the model.

3.2  CLASS (B2):  STRUCTURED PROTECTION

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system.  In addition, covert channels are
addressed.  The TCB must be carefully structured into protection-critical and
non-protection-critical elements.  The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review.  Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed.  The system is relatively resistant to penetration.  The
following are minimal requirements for systems assigned a class (B2) rating:

3.2.1  Security Policy

     3.2.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.
               The enforcement mechanism (e.g., self/group/public controls,
               access control lists) shall allow users to specify and control
               sharing of those objects by named individuals, or defined
               groups of individuals, or by both, and shall provide controls 
               to limit propagation of access rights.  The discretionary access
               control mechanism shall, either by explicit user action or by
               default, provide that objects are protected from unauthorized
               access.  These access controls shall be capable of including
               or excluding access to the granularity of a single user.
               Access permission to an object by users not already possessing
               access permission shall only be assigned by authorized users.

     3.2.1.2   Object Reuse

               All authorizations to the information contained within a
               storage object shall be revoked prior to initial assignment,
               allocation or reallocation to a subject from the TCB's pool of 
               unused storage objects.  No information, including encrypted
               representations of information, produced by a prior subject's
               actions is to be available to any subject that obtains access
               to an object that has been released back to the system.

     3.2.1.3   Labels

               Sensitivity labels associated with each ADP system resource
               (e.g., subject, storage object, ROM) that is directly or
               indirectly accessible by subjects external to the TCB shall be
               maintained by the TCB.  These labels shall be used as the basis
               for mandatory access control decisions.  In order to import 
               non-labeled data, the TCB shall request and receive from an
               authorized user the security level of the data, and all such
               actions shall be auditable by the TCB.

          3.2.1.3.1  Label Integrity

                     Sensitivity labels shall accurately represent security
                     levels of the specific subjects or objects with which
                     they are associated.  When exported by the TCB,
                     sensitivity labels shall accurately and unambiguously
                     represent the internal labels and shall be associated
                     with the information being exported.

          3.2.1.3.2  Exportation of Labeled Information

                     The TCB shall designate each communication channel and
                     I/O device as either single-level or multilevel.  Any
                     change in this designation shall be done manually and
                     shall be auditable by the TCB.  The TCB shall maintain
                     and be able to audit any change in the security level
                     or levels associated with a communication channel or 
                     I/O device.

               3.2.1.3.2.1  Exportation to Multilevel Devices

                          When the TCB exports an object to a multilevel I/O
                          device, the sensitivity label associated with that
                          object shall also be exported and shall reside on
                          the same physical medium as the exported 
                          information and shall be in the same form (i.e., 
                          machine-readable or human-readable form).  When
                          the TCB exports or imports an object over a
                          multilevel communication channel, the protocol
                          used on that channel shall provide for the
                          unambiguous pairing between the sensitivity labels
                          and the associated information that is sent or
                          received.

               3.2.1.3.2.2  Exportation to Single-Level Devices

                          Single-level I/O devices and single-level
                          communication channels are not required to 
                          maintain the sensitivity labels of the
                          information they process.  However, the TCB shall
                          include a mechanism by which the TCB and an
                          authorized user reliably communicate to designate
                          the single security level of information imported
                          or exported via single-level communication 
                          channels or I/O devices.

               3.2.1.3.2.3  Labeling Human-Readable Output

                          The ADP system administrator shall be able to
                          specify the printable label names associated with
                          exported sensitivity labels.  The TCB shall mark
                          the beginning and end of all human-readable, paged,
                          hardcopy output (e.g., line printer output) with
                          human-readable sensitivity labels that properly*
                          represent the sensitivity of the output.  The TCB
                          shall, by default, mark the top and bottom of each
                          page of human-readable, paged, hardcopy output
                          (e.g., line printer output) with human-readable
                          sensitivity labels that properly* represent the
                          overall sensitivity of the output or that 
                          properly* represent the sensitivity of the
                          information on the page.  The TCB shall, by
                          default and in an appropriate manner, mark other
                          forms of human-readable output (e.g., maps,
                          graphics) with human-readable sensitivity labels
                          that properly* represent the sensitivity of the
                          output.  Any override of these marking defaults
                          shall be auditable by the TCB.

          3.2.1.3.3  Subject Sensitivity Labels

                     The TCB shall immediately notify a terminal user of each
                     change in the security level associated with that user
                     during an interactive session.  A terminal user shall be
                     able to query the TCB as desired for a display of the
                     subject's complete sensitivity label.

          3.2.1.3.4  Device Labels

                     The TCB shall support the assignment of minimum and
                     maximum security levels to all attached physical devices.
                     These security levels shall be used by the TCB to enforce
                     constraints imposed by the physical environments in which
                     the devices are located.
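
As an illustrative aside (not part of the criteria text), attaching minimum
and maximum security levels to a device lets the TCB refuse data whose level
falls outside the range its physical environment permits.  The level
representation below matches the dominance sketch under 3.1.1.4 and is an
assumption, not part of the requirement.

     # Sketch of a device-label range check.
     def dominates(a, b):
         return a[0] >= b[0] and a[1] >= b[1]

     def device_accepts(data_level, device_min, device_max):
         return dominates(data_level, device_min) and dominates(device_max, data_level)

     # Example: a device limited to levels 0..2 with category {"A"} rejects level 3.
     print(device_accepts((3, set()), (0, set()), (2, {"A"})))   # -> False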

     3.2.1.4   Mandatory Access Control

               The TCB shall enforce a mandatory access control policy over
                all resources (i.e., subjects, storage objects, and I/O devices)
                that are directly or indirectly accessible by subjects external
               to the TCB.  These subjects and objects shall be assigned
               sensitivity labels that are a combination of hierarchical
               classification levels and non-hierarchical categories, and the
               labels shall be used as the basis for mandatory access control
               decisions.  The TCB shall be able to support two or more such
               security levels.  (See the Mandatory Access Control guidelines.)
               The following requirements shall hold for all accesses between
                all subjects external to the TCB and all objects directly or
               indirectly accessible by these subjects:  A subject can read an
               object only if the hierarchical classification in the subject's
               security level is greater than or equal to the hierarchical
               classification in the object's security level and the non-
               hierarchical categories in the subject's security level include
               all the non-hierarchical categories in the object's security
               level.  A subject can write an object only if the hierarchical
               classification in the subject's security level is less than or
               equal to the hierarchical classification in the object's
               security level and all the non-hierarchical categories in the
               subject's security level are included in the non-hierarchical
               categories in the object's security level.  Identification and
               authentication data shall be used by the TCB to authenticate
               the user's identity and to ensure that the security level and
               authorization of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated
               by the clearance and authorization of that user.

3.2.2  Accountability

     3.2.2.1   Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall maintain authentication
               data that includes information for verifying the identity of
               individual users (e.g., passwords) as well as information for
               determining the clearance and authorizations of individual
               users.  This data shall be used by the TCB to authenticate the
               user's identity and to ensure that the security level and
               authorizations of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated by
               the clearance and authorization of that user.  The TCB shall
               protect authentication data so that it cannot be accessed by any
               unauthorized user.  The TCB shall be able to enforce individual
               accountability by providing the capability to uniquely identify
               each individual ADP system user.  The TCB shall also provide the
               capability of associating this identity with all auditable
               actions taken by that individual.

          3.2.2.1.1  Trusted Path

                     The TCB shall support a trusted communication path
                      between itself and the user for initial login and
                     authentication.  Communications via this path shall be
                     initiated exclusively by a user.

     3.2.2.2   Audit

               The TCB shall be able to create, maintain, and protect from
               modification or unauthorized access or destruction an audit
               trail of accesses to the objects it protects.  The audit data
               shall be protected by the TCB so that read access to it is
               limited to those who are authorized for audit data.  The TCB
               shall be able to record the following types of events: use of
               identification and authentication mechanisms, introduction of
               objects into a user's address space (e.g., file open, program
               initiation), deletion of objects, and actions taken by computer
               operators and system administrators and/or system security
               officers, and other security relevant events.  The TCB shall
               also be able to audit any override of human-readable output
               markings.  For each recorded event, the audit record shall
               identify:  date and time of the event, user, type of event, and
               success or failure of the event.  For identification/
               authentication events the origin of request (e.g., terminal ID)
               shall be included in the audit record.  For events that
               introduce an object into a user's address space and for object
               deletion events the audit record shall include the name of the
               object and the object's security level.  The ADP system
               administrator shall be able to selectively audit the actions of
               any one or more users based on individual identity and/or object
               security level.  The TCB shall be able to audit the identified
               events that may be used in the exploitation of covert storage
               channels.

3.2.3  Assurance

     3.2.3.1   Operational Assurance

          3.2.3.1.1  System Architecture

                     The TCB shall maintain a domain for its own execution
                     that protects it from external interference or tampering
                     (e.g., by modification of its code or data structures).
                     The TCB shall maintain process isolation through the
                     provision of distinct address spaces under its control.
                     The TCB shall be internally structured into well-defined
                     largely independent modules.  It shall make effective use
                     of available hardware to separate those elements that are
                     protection-critical from those that are not.  The TCB
                     modules shall be designed such that the principle of least
                     privilege is enforced.  Features in hardware, such as
                     segmentation, shall be used to support logically distinct
                     storage objects with separate attributes (namely:
                     readable, writeable).  The user interface to the TCB
                     shall be completely defined and all elements of the TCB
                     identified.

          3.2.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct 
                     operation of the on-site hardware and firmware elements 
                     of the TCB.

          3.2.3.1.3  Covert Channel Analysis

                     The system developer shall conduct a thorough search for
                     covert storage channels and make a determination (either
                     by actual measurement or by engineering estimation) of
                     the maximum bandwidth of each identified channel.  (See
                      the Covert Channels Guideline section.)
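
As an illustrative aside (not part of the criteria text), an engineering
estimate of a covert storage channel's maximum bandwidth can be as simple as
the number of bits signaled per exploitation divided by the time one
exploitation takes; the figures below are placeholders, not measurements.

     # Back-of-the-envelope covert channel bandwidth estimate.
     def channel_bandwidth(bits_per_use, seconds_per_use):
         return bits_per_use / seconds_per_use          # bits per second

     # e.g. toggling a one-bit resource-exhaustion flag once every 10 ms
     print(channel_bandwidth(1, 0.010), "bits/second")  # -> 100.0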

          3.2.3.1.4  Trusted Facility Management

                     The TCB shall support separate operator and administrator
                     functions.

     3.2.3.2   Life-Cycle Assurance

          3.2.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation. 
                     A team of individuals who thoroughly understand the
                     specific implementation of the TCB shall subject its
                     design documentation, source code, and object code to
                     thorough analysis and testing.  Their objectives shall be:
                     to uncover all design and implementation flaws that would
                     permit a subject external to the TCB to read, change, or
                     delete data normally denied under the mandatory or
                     discretionary security policy enforced by the TCB; as well
                     as to assure that no subject (without authorization to do
                     so) is able to cause the TCB to enter a state such that it
                     is unable to respond to communications initiated by other
                     users.  The TCB shall be found relatively resistant to
                     penetration.  All discovered flaws shall be corrected and
                     the TCB retested to demonstrate that they have been
                     eliminated and that new flaws have not been introduced.
                     Testing shall demonstrate that the TCB implementation is
                     consistent with the descriptive top-level specification.
                     (See the Security Testing Guidelines.)

          3.2.3.2.2  Design Specification and Verification

                     A formal model of the security policy supported by the
                     TCB shall be maintained over the life cycle of the ADP
                     system that is proven consistent with its axioms.  A
                     descriptive top-level specification (DTLS) of the TCB
                     shall be maintained that completely and accurately
                     describes the TCB in terms of exceptions, error messages,
                     and effects.  It shall be shown to be an accurate
                     description of the TCB interface.

          3.2.3.2.3  Configuration Management

                     During development and maintenance of the TCB, a
                     configuration management system shall be in place that
                     maintains control of changes to the descriptive top-level
                     specification, other design data, implementation
                      documentation, source code, the running version of the
                     object code, and test fixtures and documentation.  The
                     configuration management system shall assure a consistent
                     mapping among all documentation and code associated with 
                     the current version of the TCB.  Tools shall be provided
                     for generation of a new version of the TCB from source
                     code.  Also available shall be tools for comparing a
                     newly generated version with the previous TCB version in
                     order to ascertain that only the intended changes have
                     been made in the code that will actually be used as the
                     new version of the TCB.
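
As an illustrative aside (not part of the criteria text), the comparison tool
described above can be approximated by hashing every file in the old and new
build trees and reporting additions, removals, and changes.  The directory
layout and hash choice below are assumptions for illustration.

     # Sketch of a tool for comparing a newly generated TCB version with
     # the previous one.
     import hashlib
     from pathlib import Path

     def tree_digest(root):
         digests = {}
         for path in sorted(Path(root).rglob("*")):
             if path.is_file():
                 rel = str(path.relative_to(root))
                 digests[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
         return digests

     def compare_versions(old_root, new_root):
         old, new = tree_digest(old_root), tree_digest(new_root)
         added   = sorted(set(new) - set(old))
         removed = sorted(set(old) - set(new))
         changed = sorted(f for f in set(old) & set(new) if old[f] != new[f])
         return added, removed, changed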

3.2.4  Documentation

     3.2.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB,
               guidelines on their use, and how they interact with one another.

     3.2.4.2   Trusted Facility Manual

               A manual addressed to the ADP system administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.  The procedures for
               examining and maintaining the audit files as well as the
               detailed audit record structure for each type of audit event
               shall be given.  The manual shall describe the operator and
               administrator functions related to security, to include 
               changing the security characteristics of a user.  It shall
               provide guidelines on the consistent and effective use of the
               protection features of the system, how they interact, how to
               securely generate a new TCB, and facility procedures, warnings,
               and privileges that need to be controlled in order to operate
               the facility in a secure manner.  The TCB modules that contain
               the reference validation mechanism shall be identified.  The
               procedures for secure generation of a new TCB from source after
               modification of any modules in the TCB shall be described.

     3.2.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
               that describes the test plan, test procedures that show how the
               security mechanisms were tested, and results of the security
               mechanisms' functional testing.  It shall include results of
               testing the effectiveness of the methods used to reduce covert
               channel bandwidths.

     3.2.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  The
               interfaces between the TCB modules shall be described.  A
               formal description of the security policy model enforced by the
               TCB shall be available and proven that it is sufficient to
               enforce the security policy.  The specific TCB protection 
               mechanisms shall be identified and an explanation given to show
               that they satisfy the model.  The descriptive top-level
               specification (DTLS) shall be shown to be an accurate
               description of the TCB interface.  Documentation shall describe
               how the TCB implements the reference monitor concept and give
               an explanation why it is tamper resistant, cannot be bypassed,
               and is correctly implemented.  Documentation shall describe how
               the TCB is structured to facilitate testing and to enforce least
               privilege.  This documentation shall also present the results
               of the covert channel analysis and the tradeoffs involved in
               restricting the channels.  All auditable events that may be
               used in the exploitation of known covert storage channels shall
                be identified.  The bandwidths of known covert storage channels,
               the use of which is not detectable by the auditing mechanisms,
               shall be provided.  (See the Covert Channel Guideline section.)

3.3  CLASS (B3):  SECURITY DOMAINS

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests.  To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity.  A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required.  The system is highly resistant to
penetration.  The following are minimal requirements for systems assigned a
class (B3) rating:

3.3.1  Security Policy

     3.3.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.
               The enforcement mechanism (e.g., access control lists) shall
               allow users to specify and control sharing of those objects,
               and shall provide controls to limit propagation of access
               rights.  The discretionary access control mechanism shall,
               either by explicit user action or by default, provide that
               objects are protected from unauthorized access.  These access
               controls shall be capable of specifying, for each named object,
               a list of named individuals and a list of groups of named
               individuals with their respective modes of access to that
               object.  Furthermore, for each such named object, it shall be
               possible to specify a list of named individuals and a list of
               groups of named individuals for which no access to the object is
               to be given.  Access permission to an object by users not
               already possessing access permission shall only be assigned by
               authorized users.
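
As an illustrative aside (not part of the criteria text), the class (B3)
discretionary requirement calls for per-object lists of named users and
groups with their access modes, plus explicit lists for which no access is
given.  The sketch below is one possible structure; giving denial precedence
over group grants is an assumption here, not something the text above
prescribes.

     # Sketch of a B3-style access control list with allow and deny lists.
     class ObjectACL:
         def __init__(self):
             self.user_modes = {}       # user  -> set of modes, e.g. {"read"}
             self.group_modes = {}      # group -> set of modes
             self.denied_users = set()
             self.denied_groups = set()

         def check(self, user, groups, mode):
             if user in self.denied_users or self.denied_groups & set(groups):
                 return False                               # explicit exclusion wins
             if mode in self.user_modes.get(user, set()):
                 return True
             return any(mode in self.group_modes.get(g, set()) for g in groups)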

     3.3.1.2   Object Reuse

               All authorizations to the information contained within a
               storage object shall be revoked prior to initial assignment,
               allocation or reallocation to a subject from the TCB's pool
               of unused storage objects.  No information, including 
               encrypted representations of information, produced by a prior
                subject's actions is to be available to any subject that obtains
               access to an object that has been released back to the system.

     3.3.1.3   Labels

               Sensitivity labels associated with each ADP system resource
               (e.g., subject, storage object, ROM) that is directly or
               indirectly accessible by subjects external to the TCB shall be
               maintained by the TCB.  These labels shall be used as the basis
               for mandatory access control decisions.  In order to import 
               non-labeled data, the TCB shall request and receive from an
               authorized user the security level of the data, and all such
               actions shall be auditable by the TCB.

          3.3.1.3.1  Label Integrity

                     Sensitivity labels shall accurately represent security
                     levels of the specific subjects or objects with which
                     they are associated.  When exported by the TCB,
                     sensitivity labels shall accurately and unambiguously
                     represent the internal labels and shall be associated
                     with the information being exported.

          3.3.1.3.2  Exportation of Labeled Information

                     The TCB shall designate each communication channel and
                     I/O device as either single-level or multilevel.  Any
                     change in this designation shall be done manually and
                     shall be auditable by the TCB.  The TCB shall maintain
                     and be able to audit any change in the security level
                     or levels associated with a communication channel or
                     I/O device.

               3.3.1.3.2.1  Exportation to Multilevel Devices

                            When the TCB exports an object to a multilevel I/O
                            device, the sensitivity label associated with that
                            object shall also be exported and shall reside on
                            the same physical medium as the exported 
                            information and shall be in the same form (i.e., 
                            machine-readable or human-readable form).  When
                            the TCB exports or imports an object over a
                            multilevel communication channel, the protocol 
                            used on that channel shall provide for the
                            unambiguous pairing between the sensitivity labels
                            and the associated information that is sent or
                            received.

               3.3.1.3.2.2  Exportation to Single-Level Devices

                            Single-level I/O devices and single-level
                            communication channels are not required to 
                            maintain the sensitivity labels of the information
                            they process.  However, the TCB shall include a 
                            mechanism by which the TCB and an authorized user
                            reliably communicate to designate the single
                            security level of information imported or exported
                            via single-level communication channels or I/O
                            devices.

               3.3.1.3.2.3  Labeling Human-Readable Output

                            The ADP system administrator shall be able to
                            specify the printable label names associated with
                            exported sensitivity labels.  The TCB shall mark
                            the beginning and end of all human-readable, paged,
                            hardcopy output (e.g., line printer output) with
                            human-readable sensitivity labels that properly*
                            represent the sensitivity of the output.  The TCB
                            shall, by default, mark the top and bottom of each
                            page of human-readable, paged, hardcopy output
                            (e.g., line printer output) with human-readable
                            sensitivity labels that properly* represent the
                            overall sensitivity of the output or that
                            properly* represent the sensitivity of the 
                            information on the page.  The TCB shall, by
                            default and in an appropriate manner, mark other
                            forms of human-readable output (e.g., maps,
                            graphics) with human-readable sensitivity labels
                            that properly* represent the sensitivity of the 
                            output.  Any override of these marking defaults
                            shall be auditable by the TCB.

          3.3.1.3.3  Subject Sensitivity Labels

                     The TCB shall immediately notify a terminal user of each
                     change in the security level associated with that user
                     during an interactive session.  A terminal user shall be
                     able to query the TCB as desired for a display of the
                     subject's complete sensitivity label.

          3.3.1.3.4  Device Labels

                     The TCB shall support the assignment of minimum and
                     maximum security levels to all attached physical devices.
                     These security levels shall be used by the TCB to enforce
                     constraints imposed by the physical environments in which
                     the devices are located.

     3.3.1.4   Mandatory Access Control

               The TCB shall enforce a mandatory access control policy over
               all resources (i.e., subjects, storage objects, and I/O 
               devices) that are directly or indirectly accessible by subjects
               external to the TCB.  These subjects and objects shall be
               assigned sensitivity labels that are a combination of
               hierarchical classification levels and non-hierarchical
               categories, and the labels shall be used as the basis for 
               mandatory access control decisions.  The TCB shall be able to
               support two or more such security levels.  (See the Mandatory
______________________________
* The hierarchical classification component in human-readable sensitivity
labels shall be equal to the greatest hierarchical classification of any of the
information in the output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical categories of the
information in the output the labels refer to, but no other non-hierarchical
categories.

               Access Control guidelines.)  The following requirements shall
               hold for all accesses between all subjects external to the TCB
               and all objects directly or indirectly accessible by these 
               subjects: A subject can read an object only if the hierarchical
               classification in the subject's security level is greater than
               or equal to the hierarchical classification in the object's
               security level and the non-hierarchical categories in the
               subject's security level include all the non-hierarchical
               categories in the object's security level.  A subject can write
               an object only if the hierarchical classification in the 
               subject's security level is less than or equal to the
               hierarchical classification in the object's security level and
               all the non-hierarchical categories in the subject's security 
                level are included in the non-hierarchical categories in the
               object's security level.  Identification and authentication 
               data shall be used by the TCB to authenticate the user's
               identity and to ensure that the security level and authori-
               zation of subjects external to the TCB that may be created 
               to act on behalf of the individual user are dominated by the
               clearance and authorization of that user.

3.3.2  Accountability

     3.3.2.1   Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall maintain authentication
               data that includes information for verifying the identity of
               individual users (e.g., passwords) as well as information for
               determining the clearance and authorizations of individual
               users.  This data shall be used by the TCB to authenticate the
               user's identity and to ensure that the security level and 
                authorizations of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated
               by the clearance and authorization of that user.  The TCB shall
               protect authentication data so that it cannot be accessed by any
               unauthorized user.  The TCB shall be able to enforce individual
               accountability by providing the capability to uniquely identify
               each individual ADP system user.  The TCB shall also provide the
               capability of associating this identity with all auditable
               actions taken by that individual.

          3.3.2.1.1  Trusted Path

                     The TCB shall support a trusted communication path
                     between itself and users for use when a positive TCB-to-
                     user connection is required (e.g., login, change subject
                     security level).  Communications via this trusted path
                     shall be activated exclusively by a user of the TCB and
                     shall be logically isolated and unmistakably
                     distinguishable from other paths.

     3.3.2.2   Audit

               The TCB shall be able to create, maintain, and protect from
               modification or unauthorized access or destruction an audit 
               trail of accesses to the objects it protects.  The audit data
               shall be protected by the TCB so that read access to it is
               limited to those who are authorized for audit data.  The TCB
               shall be able to record the following types of events: use of
               identification and authentication mechanisms, introduction of
               objects into a user's address space (e.g., file open, program
               initiation), deletion of objects, and actions taken by computer
               operators and system administrators and/or system security
               officers and other security relevant events.  The TCB shall also
               be able to audit any override of human-readable output markings.
               For each recorded event, the audit record shall identify:  date
               and time of the event, user, type of event, and success or
               failure of the event.  For identification/authentication events
               the origin of request (e.g., terminal ID) shall be included in
               the audit record.  For events that introduce an object into a
               user's address space and for object deletion events the audit
               record shall include the name of the object and the object's
               security level.  The ADP system administrator shall be able to
               selectively audit the actions of any one or more users based on
               individual identity and/or object security level.  The TCB shall
               be able to audit the identified events that may be used in the
               exploitation of covert storage channels.  The TCB shall contain
               a mechanism that is able to monitor the occurrence or
               accumulation of security auditable events that may indicate an
               imminent violation of security policy.  This mechanism shall be
               able to immediately notify the security administrator when
               thresholds are exceeded, and if the occurrence or accumulation
               of these security relevant events continues, the system shall
               take the least disruptive action to terminate the event.
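
As an illustrative aside (not part of the criteria text), the monitoring
mechanism described above can be sketched as a per-user sliding-window
counter over security-relevant audit events, with a notification hook that
fires when a configured threshold is exceeded.  The window, threshold, and
follow-on action are all assumptions for illustration.

     # Sketch of a threshold monitor over the audit stream.
     import collections, time

     class ThresholdMonitor:
         def __init__(self, threshold, window_seconds, notify):
             self.threshold = threshold
             self.window = window_seconds
             self.notify = notify                          # callable(user, count)
             self.events = collections.defaultdict(collections.deque)

         def record(self, user, now=None):
             now = time.time() if now is None else now
             q = self.events[user]
             q.append(now)
             while q and now - q[0] > self.window:         # drop stale events
                 q.popleft()
             if len(q) >= self.threshold:
                 self.notify(user, len(q))                 # e.g. alert, then throttle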

3.3.3  Assurance

     3.3.3.1   Operational Assurance

          3.3.3.1.1  System Architecture

                     The TCB shall maintain a domain for its own execution
                     that protects it from external interference or tampering
                     (e.g., by modification of its code or data structures).
                     The TCB shall maintain process isolation through the
                     provision of distinct address spaces under its control.
                     The TCB shall be internally structured into well-defined
                     largely independent modules.  It shall make effective use
                     of available hardware to separate those elements that are
                     protection-critical from those that are not.  The TCB
                     modules shall be designed such that the principle of 
                     least privilege is enforced.  Features in hardware, such
                     as segmentation, shall be used to support logically
                     distinct storage objects with separate attributes (namely:
                     readable, writeable).  The user interface to the TCB shall
                     be completely defined and all elements of the TCB
                     identified.  The TCB shall be designed and structured to
                     use a complete, conceptually simple protection mechanism
                     with precisely defined semantics.  This mechanism shall
                     play a central role in enforcing the internal structuring
                     of the TCB and the system.  The TCB shall incorporate
                     significant use of layering, abstraction and data hiding.
                     Significant system engineering shall be directed toward
                     minimizing the complexity of the TCB and excluding from 
                     the TCB modules that are not protection-critical.

          3.3.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct 
                     operation of the on-site hardware and firmware elements 
                     of the TCB.

          3.3.3.1.3  Covert Channel Analysis

                     The system developer shall conduct a thorough search for
                     covert channels and make a determination (either by 
                     actual measurement or by engineering estimation) of the
                     maximum bandwidth of each identified channel.  (See the
                     Covert Channels Guideline section.)
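
          As an informative illustration only (not part of the criteria), the
          following Python sketch shows an engineering estimate of covert
          storage channel bandwidth: the number of bits signalled per use of
          the channel divided by the minimum time needed to set and sense the
          shared attribute.  The numbers shown are hypothetical, not measured.

          def covert_channel_bandwidth(bits_per_use: float,
                                       seconds_per_use: float) -> float:
              """Estimated maximum bandwidth in bits per second."""
              return bits_per_use / seconds_per_use

          # Example: 1 bit per use with a 10 millisecond set-and-sense cycle
          # yields an estimated maximum bandwidth of 100 bits per second.
          print(covert_channel_bandwidth(1.0, 0.010))   # 100.0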

          3.3.3.1.4  Trusted Facility Management

                     The TCB shall support separate operator and administrator
                     functions.  The functions performed in the role of a
                     security administrator shall be identified.  The ADP
                     system administrative personnel shall only be able to
                     perform security administrator functions after taking a
                     distinct auditable action to assume the security
                     administrator role on the ADP system.  Non-security
                     functions that can be performed in the security
                     administration role shall be limited strictly to those
                     essential to performing the security role effectively.
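
          As an informative illustration only (not part of the criteria), the
          following Python sketch shows a session in which administrative
          personnel must take a distinct, auditable action before security
          administrator functions become available, and in which non-security
          functions permitted while in that role are strictly limited.  The
          function names and the audit_log callable are hypothetical.

          SECURITY_FUNCTIONS = {"set_user_clearance", "review_audit_trail"}
          ESSENTIAL_NON_SECURITY = {"logout"}   # only non-security action allowed

          class Session:
              def __init__(self, user: str, audit_log):
                  self.user = user
                  self.audit_log = audit_log
                  self.security_admin_role = False

              def assume_security_admin_role(self) -> None:
                  # Assuming the role is itself a distinct, auditable action.
                  self.audit_log(f"{self.user} assumed the security "
                                 f"administrator role")
                  self.security_admin_role = True

              def perform(self, function: str) -> None:
                  if function in SECURITY_FUNCTIONS \
                          and not self.security_admin_role:
                      raise PermissionError(
                          f"{function} requires the security administrator role")
                  if self.security_admin_role \
                          and function not in SECURITY_FUNCTIONS \
                          and function not in ESSENTIAL_NON_SECURITY:
                      raise PermissionError(
                          f"{function} is not essential to the security role")
                  self.audit_log(f"{self.user} performed {function}")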

          3.3.3.1.5  Trusted Recovery

                     Procedures and/or mechanisms shall be provided to assure
                     that, after an ADP system failure or other discontinuity,
                     recovery without a protection compromise is obtained.

     3.3.3.2   Life-Cycle Assurance

          3.3.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation.
                     A team of individuals who thoroughly understand the
                     specific implementation of the TCB shall subject its
                     design documentation, source code, and object code to
                     thorough analysis and testing.  Their objectives shall 
                     be: to uncover all design and implementation flaws that
                     would permit a subject external to the TCB to read,
                     change, or delete data normally denied under the 
                     mandatory or discretionary security policy enforced by
                     the TCB; as well as to assure that no subject (without
                     authorization to do so) is able to cause the TCB to enter
                     a state such that it is unable to respond to 
                     communications initiated by other users.  The TCB shall
                     be found resistant to penetration.  All discovered flaws
                     shall be corrected and the TCB retested to demonstrate 
                     that they have been eliminated and that new flaws have
                     not been introduced.  Testing shall demonstrate that the
                     TCB implementation is consistent with the descriptive
                     top-level specification.  (See the Security Testing 
                     Guidelines.)  No design flaws and no more than a few
                     correctable implementation flaws may be found during 
                     testing and there shall be reasonable confidence that
                     few remain.

          3.3.3.2.2  Design Specification and Verification

                     A formal model of the security policy supported by the
                     TCB shall be maintained over the life cycle of the ADP
                     system that is proven consistent with its axioms.  A
                     descriptive top-level specification (DTLS) of the TCB
                     shall be maintained that completely and accurately
                     describes the TCB in terms of exceptions, error messages,
                     and effects.  It shall be shown to be an accurate
                     description of the TCB interface.  A convincing argument
                     shall be given that the DTLS is consistent with the model.

          3.3.3.2.3  Configuration Management

                     During development and maintenance of the TCB, a
                     configuration management system shall be in place that 
                     maintains control of changes to the descriptive top-level
                     specification, other design data, implementation
                     documentation, source code, the running version of the
                     object code, and test fixtures and documentation.  The
                     configuration management system shall assure a consistent
                     mapping among all documentation and code associated with
                     the current version of the TCB.  Tools shall be provided
                     for generation of a new version of the TCB from source 
                     code.  Also available shall be tools for comparing a
                     newly generated version with the previous TCB version in
                     order to ascertain that only the intended changes have 
                     been made in the code that will actually be used as the
                     new version of the TCB.
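
          As an informative illustration only (not part of the criteria), the
          following Python sketch shows one way a comparison tool of the kind
          described above might work: every file in the previous and newly
          generated TCB trees is hashed, and the differences are reported so
          that reviewers can confirm only the intended changes appear in the
          new version.  The directory layout and names are hypothetical.

          import hashlib
          from pathlib import Path

          def tree_digest(root: str) -> dict[str, str]:
              """Map each file path (relative to root) to its SHA-256 digest."""
              base = Path(root)
              return {str(p.relative_to(base)):
                          hashlib.sha256(p.read_bytes()).hexdigest()
                      for p in sorted(base.rglob("*")) if p.is_file()}

          def compare_versions(previous_root: str, new_root: str):
              old, new = tree_digest(previous_root), tree_digest(new_root)
              return {
                  "added":   sorted(set(new) - set(old)),
                  "removed": sorted(set(old) - set(new)),
                  "changed": sorted(p for p in set(old) & set(new)
                                    if old[p] != new[p]),
              }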

3.3.4  Documentation

     3.3.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB,
               guidelines on their use, and how they interact with one another.

     3.3.4.2   Trusted Facility Manual

               A manual addressed to the ADP system administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.  The procedures for
               examining and maintaining the audit files as well as the
               detailed audit record structure for each type of audit event
               shall be given.  The manual shall describe the operator and
               administrator functions related to security, to include 
               changing the security characteristics of a user.  It shall
               provide guidelines on the consistent and effective use of the
               protection features of the system, how they interact, how to
               securely generate a new TCB, and facility procedures, warnings,
               and privileges that need to be controlled in order to operate
               the facility in a secure manner.  The TCB modules that contain
               the reference validation mechanism shall be identified.  The
               procedures for secure generation of a new TCB from source after
               modification of any modules in the TCB shall be described.  It
               shall include the procedures to ensure that the system is
               initially started in a secure manner.  Procedures shall also be
               included to resume secure system operation after any lapse in
               system operation.

     3.3.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
               that describes the test plan, test procedures that show how the
               security mechanisms were tested, and results of the security 
               mechanisms' functional testing.  It shall include results of
               testing the effectiveness of the methods used to reduce covert
               channel bandwidths.

     3.3.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  The
               interfaces between the TCB modules shall be described.  A
               formal description of the security policy model enforced by the
               TCB shall be available and proven that it is sufficient to
               enforce the security policy.  The specific TCB protection 
               mechanisms shall be identified and an explanation given to show
               that they satisfy the model.  The descriptive top-level
               specification (DTLS) shall be shown to be an accurate 
               description of the TCB interface.  Documentation shall describe
               how the TCB implements the reference monitor concept and give 
               an explanation why it is tamper resistant, cannot be bypassed,
               and is correctly implemented.  The TCB implementation (i.e., in
               hardware, firmware, and software) shall be informally shown to
               be consistent with the DTLS.  The elements of the DTLS shall be
               shown, using informal techniques, to correspond to the elements
               of the TCB.  Documentation shall describe how the TCB is
               structured to facilitate testing and to enforce least privilege.
               This documentation shall also present the results of the covert
               channel analysis and the tradeoffs involved in restricting the
               channels.  All auditable events that may be used in the 
               exploitation of known covert storage channels shall be 
               identified.  The bandwidths of known covert storage channels,
               the use of which is not detectable by the auditing mechanisms,
               shall be provided.  (See the Covert Channel Guideline section.)

                     4.0  DIVISION A:  VERIFIED PROTECTION

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system.  Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

4.1  CLASS (A1):  VERIFIED DESIGN

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented.  This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design.
Independent of the particular specification language or verification system
used, there are five important criteria for class (A1) design verification:

          * A formal model of the security policy must be clearly
          identified and documented, including a mathematical proof
          that the model is consistent with its axioms and is
          sufficient to support the security policy.

          * An FTLS must be produced that includes abstract definitions
          of the functions the TCB performs and of the hardware and/or
          firmware mechanisms that are used to support separate
          execution domains.

          * The FTLS of the TCB must be shown to be consistent with the
          model by formal techniques where possible (i.e., where
          verification tools exist) and informal ones otherwise.

          * The TCB implementation (i.e., in hardware, firmware, and
          software) must be informally shown to be consistent with the
          FTLS.  The elements of the FTLS must be shown, using
          informal techniques, to correspond to the elements of the
          TCB.  The FTLS must express the unified protection mechanism
          required to satisfy the security policy, and it is the
          elements of this protection mechanism that are mapped to the
          elements of the TCB.

          * Formal analysis techniques must be used to identify and
          analyze covert channels.  Informal techniques may be used to
          identify covert timing channels.  The continued existence of
          identified covert channels in the system must be justified.

In keeping with the extensive design and development analysis of the TCB
required of systems in class (A1), more stringent configuration management is
required and procedures are established for securely distributing the system
to sites.  A system security administrator is supported.

The following are minimal requirements for systems assigned a class (A1)
rating:

4.1.1  Security Policy

     4.1.1.1   Discretionary Access Control

               The TCB shall define and control access between named users and
               named objects (e.g., files and programs) in the ADP system.  
               The enforcement mechanism (e.g., access control lists) shall 
               allow users to specify and control sharing of those objects,
               and shall provide controls to limit propagation of access
               rights.  The discretionary access control mechanism shall,
               either by explicit user action or by default, provide that
               objects are protected from unauthorized access.  These access
               controls shall be capable of specifying, for each named object,
               a list of named individuals and a list of groups of named
               individuals with their respective modes of access to that
               object.  Furthermore, for each such named object, it shall be
               possible to specify a list of named individuals and a list of
               groups of named individuals for which no access to the object is
               to be given.  Access permission to an object by users not
               already possessing access permission shall only be assigned by
               authorized users.
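
          As an informative illustration only (not part of the criteria), the
          following Python sketch models an access control list of the kind
          described above: per-object lists of named individuals and groups
          with their modes of access, plus lists of individuals and groups
          explicitly denied access.  The rule that an explicit denial
          overrides any grant is an assumption made for this example.

          from dataclasses import dataclass, field

          @dataclass
          class ObjectACL:
              user_modes: dict[str, set[str]] = field(default_factory=dict)
              group_modes: dict[str, set[str]] = field(default_factory=dict)
              denied_users: set[str] = field(default_factory=set)
              denied_groups: set[str] = field(default_factory=set)

              def permits(self, user: str, groups: set[str], mode: str) -> bool:
                  # Explicit denial (of the user or any of the user's groups)
                  # takes precedence over every grant.
                  if user in self.denied_users or groups & self.denied_groups:
                      return False
                  if mode in self.user_modes.get(user, set()):
                      return True
                  return any(mode in self.group_modes.get(g, set())
                             for g in groups)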

     4.1.1.2   Object Reuse

               All authorizations to the information contained within a
               storage object shall be revoked prior to initial assignment,
               allocation or reallocation to a subject from the TCB's pool
               of unused storage objects.  No information, including encrypted
               representations of information, produced by a prior subject's
               actions is to be available to any subject that obtains access
               to an object that has been released back to the system.
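
          As an informative illustration only (not part of the criteria), the
          following Python sketch shows object reuse handling for a pool of
          fixed-size storage objects: before an object is reassigned, prior
          authorizations are dropped and its contents are overwritten so that
          no residual information reaches the new subject.  All names here
          are hypothetical.

          class StoragePool:
              def __init__(self, object_size: int):
                  self.object_size = object_size
                  self.free_objects: list[bytearray] = []
                  # Track which subjects are authorized to each live object.
                  self.authorizations: dict[int, set[str]] = {}

              def release(self, obj: bytearray) -> None:
                  self.free_objects.append(obj)

              def allocate(self, subject: str) -> bytearray:
                  obj = (self.free_objects.pop() if self.free_objects
                         else bytearray(self.object_size))
                  # Revoke all prior authorizations and purge residual data
                  # before the object becomes visible to the new subject.
                  self.authorizations[id(obj)] = {subject}
                  obj[:] = bytes(self.object_size)
                  return obj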

     4.1.1.3   Labels

               Sensitivity labels associated with each ADP system resource
               (e.g., subject, storage object, ROM) that is directly or
               indirectly accessible by subjects external to the TCB shall be
               maintained by the TCB.  These labels shall be used as the basis
               for mandatory access control decisions.  In order to import 
               non-labeled data, the TCB shall request and receive from an 
               authorized user the security level of the data, and all such
               actions shall be auditable by the TCB.

          4.1.1.3.1  Label Integrity

                     Sensitivity labels shall accurately represent security
                     levels of the specific subjects or objects with which 
                     they are associated.  When exported by the TCB,
                     sensitivity labels shall accurately and unambiguously
                     represent the internal labels and shall be associated 
                     with the information being exported.

          4.1.1.3.2  Exportation of Labeled Information

                     The TCB shall designate each communication channel and
                     I/O device as either single-level or multilevel.  Any 
                     change in this designation shall be done manually and
                     shall be auditable by the TCB.  The TCB shall maintain
                     and be able to audit any change in the security level
                     or levels associated with a communication channel or 
                     I/O device.

               4.1.1.3.2.1  Exportation to Multilevel Devices

                            When the TCB exports an object to a multilevel I/O
                            device, the sensitivity label associated with that
                            object shall also be exported and shall reside on
                            the same physical medium as the exported 
                            information and shall be in the same form (i.e., 
                            machine-readable or human-readable form).  When
                            the TCB exports or imports an object over a
                            multilevel communication channel, the protocol 
                            used on that channel shall provide for the
                            unambiguous pairing between the sensitivity labels
                            and the associated information that is sent or 
                            received.

               4.1.1.3.2.2  Exportation to Single-Level Devices

                            Single-level I/O devices and single-level
                            communication channels are not required to 
                            maintain the sensitivity labels of the information
                            they process.  However, the TCB shall include a 
                            mechanism by which the TCB and an authorized user
                            reliably communicate to designate the single
                            security level of information imported or exported
                            via single-level communication channels or I/O 
                            devices.

               4.1.1.3.2.3  Labeling Human-Readable Output

                            The ADP system administrator shall be able to
                            specify the printable label names associated with
                            exported sensitivity labels.  The TCB shall mark
                            the beginning and end of all human-readable, paged,
                            hardcopy output (e.g., line printer output) with 
                            human-readable sensitivity labels that properly*
                            represent the sensitivity of the output.  The TCB
                            shall, by default, mark the top and bottom of each
                            page of human-readable, paged, hardcopy output
                            (e.g., line printer output) with human-readable 
                            sensitivity labels that properly* represent the
                            overall sensitivity of the output or that 
                            properly* represent the sensitivity of the 
                            information on the page.  The TCB shall, by
                            default and in an appropriate manner, mark other
                            forms of human-readable output (e.g., maps,
                            graphics) with human-readable sensitivity labels
                            that properly* represent the sensitivity of the 
                            output.  Any override of these marking defaults
                            shall be auditable by the TCB.
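
          As an informative illustration only (not part of the criteria), the
          following Python sketch marks paged hardcopy output in the manner of
          the asterisked rule footnoted below: the banner classification is
          the greatest classification of the information on the page and the
          banner categories are the union of that information's categories.
          The classification ordering and category names are hypothetical.

          LEVEL_ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1,
                         "SECRET": 2, "TOP SECRET": 3}

          def page_banner(item_labels):
              """item_labels: (classification, set of categories) pairs for
              the information printed on one page."""
              labels = list(item_labels)
              classification = max((c for c, _ in labels), key=LEVEL_ORDER.get)
              categories = sorted(set().union(*(cats for _, cats in labels)))
              return (classification if not categories
                      else f"{classification} {'/'.join(categories)}")

          def mark_page(lines, item_labels):
              banner = page_banner(item_labels)
              return [banner, *lines, banner]   # banner at top and bottom

          # Example: a page holding SECRET {NATO} and CONFIDENTIAL {CRYPTO}
          # items is bracketed top and bottom with "SECRET CRYPTO/NATO".
          print(mark_page(["report body ..."],
                          [("SECRET", {"NATO"}), ("CONFIDENTIAL", {"CRYPTO"})]))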

          4.1.1.3.3  Subject Sensitivity Labels

                     The TCB shall immediately notify a terminal user of each
                     change in the security level associated with that user 
                     during an interactive session.  A terminal user shall be
                     able to query the TCB as desired for a display of the
                     subject's complete sensitivity label.

          4.1.1.3.4  Device Labels

                     The TCB shall support the assignment of minimum and
                     maximum security levels to all attached physical devices.
                     These security levels shall be used by the TCB to enforce
                     constraints imposed by the physical environments in which
                     the devices are located.

     4.1.1.4   Mandatory Access Control

               The TCB shall enforce a mandatory access control policy over
               all resources (i.e., subjects, storage objects, and I/O 
               devices) that are directly or indirectly accessible by subjects
               external to the TCB.  These subjects and objects shall be
               assigned sensitivity labels that are a combination of
               hierarchical classification levels and non-hierarchical
               categories, and the labels shall be used as the basis for 
               mandatory access control decisions.  The TCB shall be able to
               support two or more such security levels.  (See the Mandatory
               Access Control guidelines.) The following requirements shall
               hold for all accesses between all subjects external to the TCB
               and all objects directly or indirectly accessible by these 
               subjects: A subject can read an object only if the hierarchical
               classification in the subject's security level is greater than
               or equal to the hierarchical classification in the object's
               security level and the non-hierarchical categories in the
               subject's security level include all the non-hierarchical
               categories in the object's security level.  A subject can write
______________________________
* The hierarchical classification component in human-readable sensitivity
labels shall be equal to the greatest hierarchical classification of any of the
information in the output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical categories of the
information in the output the labels refer to, but no other non-hierarchical
categories.

               an object only if the hierarchical classification in the 
               subject's security level is less than or equal to the
               hierarchical classification in the object's security level and
               all the non-hierarchical categories in the subject's security 
                level are included in the non-hierarchical categories in the 
               object's security level.  Identification and authentication 
               data shall be used by the TCB to authenticate the user's
               identity and to ensure that the security level and authoriza-
               tion of subjects external to the TCB that may be created to
               act on behalf of the individual user are dominated by the
               clearance and authorization of that user.
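
          As an informative illustration only (not part of the criteria), the
          following Python sketch restates the read and write checks above.  A
          security level is modelled as a hierarchical classification plus a
          set of non-hierarchical categories; the classification ordering used
          is a hypothetical example.

          LEVEL_ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1,
                         "SECRET": 2, "TOP SECRET": 3}

          def dominates(level_a, level_b):
              """True if level_a's classification is greater than or equal to
              level_b's and level_a's categories include all of level_b's."""
              (class_a, cats_a), (class_b, cats_b) = level_a, level_b
              return (LEVEL_ORDER[class_a] >= LEVEL_ORDER[class_b]
                      and cats_a >= cats_b)

          def may_read(subject_level, object_level):
              # Read only if the subject's level dominates the object's.
              return dominates(subject_level, object_level)

          def may_write(subject_level, object_level):
              # Write only if the object's level dominates the subject's.
              return dominates(object_level, subject_level)

          # Example: a SECRET {NATO} subject may read, but not write to, a
          # CONFIDENTIAL (no category) object.
          subject, obj = ("SECRET", {"NATO"}), ("CONFIDENTIAL", set())
          print(may_read(subject, obj), may_write(subject, obj))   # True False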

4.1.2  Accountability

     4.1.2.1   Identification and Authentication

               The TCB shall require users to identify themselves to it before
               beginning to perform any other actions that the TCB is expected
               to mediate.  Furthermore, the TCB shall maintain authentication
               data that includes information for verifying the identity of
               individual users (e.g., passwords) as well as information for
               determining the clearance and authorizations of individual
               users.  This data shall be used by the TCB to authenticate the
               user's identity and to ensure that the security level and 
               authorizations of subjects external to the TCB that may be
               created to act on behalf of the individual user are dominated by
               the clearance and authorization of that user.  The TCB shall
               protect authentication data so that it cannot be accessed by any
               unauthorized user.  The TCB shall be able to enforce individual
               accountability by providing the capability to uniquely identify
               each individual ADP system user.  The TCB shall also provide the
               capability of associating this identity with all auditable
               actions taken by that individual.

          4.1.2.1.1  Trusted Path

                     The TCB shall support a trusted communication path
                     between itself and users for use when a positive TCB-to-
                     user connection is required (e.g., login, change subject
                     security level).  Communications via this trusted path
                     shall be activated exclusively by a user or the TCB and
                     shall be logically isolated and unmistakably 
                     distinguishable from other paths.

     4.1.2.2   Audit

               The TCB shall be able to create, maintain, and protect from
               modification or unauthorized access or destruction an audit 
               trail of accesses to the objects it protects.  The audit data
               shall be protected by the TCB so that read access to it is
               limited to those who are authorized for audit data.  The TCB
               shall be able to record the following types of events: use of
               identification and authentication mechanisms, introduction of
               objects into a user's address space (e.g., file open, program
               initiation), deletion of objects, and actions taken by computer
               operators and system administrators and/or system security
               officers, and other security relevant events.  The TCB shall
               also be able to audit any override of human-readable output
               markings.  For each recorded event, the audit record shall
               identify: date and time of the event, user, type of event, and
               success or failure of the event.  For identification/
               authentication events the origin of request (e.g., terminal ID)
               shall be included in the audit record.  For events that
               introduce an object into a user's address space and for object
               deletion events the audit record shall include the name of the
               object and the object's security level.  The ADP system
               administrator shall be able to selectively audit the actions of
               any one or more users based on individual identity and/or object
               security level.  The TCB shall be able to audit the identified
               events that may be used in the exploitation of covert storage
               channels.  The TCB shall contain a mechanism that is able to
               monitor the occurrence or accumulation of security auditable
               events that may indicate an imminent violation of security
               policy.  This mechanism shall be able to immediately notify the
               security administrator when thresholds are exceeded, and, if
               the occurrence or accumulation of these security relevant
               events continues, the system shall take the least disruptive
               action to terminate the event.

4.1.3  Assurance

     4.1.3.1   Operational Assurance

          4.1.3.1.1  System Architecture

                     The TCB shall maintain a domain for its own execution
                     that protects it from external interference or tampering
                     (e.g., by modification of its code or data structures).
                     The TCB shall maintain process isolation through the
                     provision of distinct address spaces under its control.
                     The TCB shall be internally structured into well-defined
                     largely independent modules.  It shall make effective use
                     of available hardware to separate those elements that are
                     protection-critical from those that are not.  The TCB
                     modules shall be designed such that the principle of 
                     least privilege is enforced.  Features in hardware, such
                     as segmentation, shall be used to support logically 
                     distinct storage objects with separate attributes (namely:
                     readable, writeable).  The user interface to the TCB 
                     shall be completely defined and all elements of the TCB 
                     identified.  The TCB shall be designed and structured to
                     use a complete, conceptually simple protection mechanism
                     with precisely defined semantics.  This mechanism shall 
                     play a central role in enforcing the internal structuring
                     of the TCB and the system.  The TCB shall incorporate
                     significant use of layering, abstraction and data hiding.
                     Significant system engineering shall be directed toward 
                     minimizing the complexity of the TCB and excluding from
                     the TCB modules that are not protection-critical.

          4.1.3.1.2  System Integrity

                     Hardware and/or software features shall be provided that
                     can be used to periodically validate the correct 
                     operation of the on-site hardware and firmware elements
                     of the TCB.

          4.1.3.1.3  Covert Channel Analysis

                     The system developer shall conduct a thorough search for
                     covert channels and make a determination (either by 
                     actual measurement or by engineering estimation) of the
                     maximum bandwidth of each identified channel.  (See the
                     Covert Channels Guideline section.)  Formal methods shall
                     be used in the analysis.

          4.1.3.1.4  Trusted Facility Management

                     The TCB shall support separate operator and administrator
                     functions.  The functions performed in the role of a 
                     security administrator shall be identified.  The ADP
                     system administrative personnel shall only be able to
                     perform security administrator functions after taking a 
                     distinct auditable action to assume the security
                     administrator role on the ADP system.  Non-security
                     functions that can be performed in the security 
                     administration role shall be limited strictly to those
                     essential to performing the security role effectively.

          4.1.3.1.5  Trusted Recovery

                     Procedures and/or mechanisms shall be provided to assure
                     that, after an ADP system failure or other discontinuity,
                     recovery without a protection compromise is obtained.

     4.1.3.2   Life-Cycle Assurance

          4.1.3.2.1  Security Testing

                     The security mechanisms of the ADP system shall be tested
                     and found to work as claimed in the system documentation.
                     A team of individuals who thoroughly understand the
                     specific implementation of the TCB shall subject its
                     design documentation, source code, and object code to
                     thorough analysis and testing.  Their objectives shall 
                     be: to uncover all design and implementation flaws that
                     would permit a subject external to the TCB to read,
                     change, or delete data normally denied under the 
                     mandatory or discretionary security policy enforced by 
                     the TCB; as well as to assure that no subject (without
                     authorization to do so) is able to cause the TCB to enter
                     a state such that it is unable to respond to 
                     communications initiated by other users.  The TCB shall 
                     be found resistant to penetration.  All discovered flaws
                     shall be corrected and the TCB retested to demonstrate 
                     that they have been eliminated and that new flaws have
                     not been introduced.  Testing shall demonstrate that the
                     TCB implementation is consistent with the formal top-
                     level specification.  (See the Security Testing 
                     Guidelines.)  No design flaws and no more than a few
                     correctable implementation flaws may be found during
                     testing and there shall be reasonable confidence that few
                     remain.  Manual or other mapping of the FTLS to the
                     source code may form a basis for penetration testing.

          4.1.3.2.2  Design Specification and Verification

                     A formal model of the security policy supported by the
                     TCB shall be maintained over the life-cycle of the ADP
                     system that is proven consistent with its axioms.  A
                     descriptive top-level specification (DTLS) of the TCB
                     shall be maintained that completely and accurately
                     describes the TCB in terms of exceptions, error messages,
                     and effects. A formal top-level specification (FTLS) of
                     the TCB shall be maintained that accurately describes the
                     TCB in terms of exceptions, error messages, and effects. 
                     The DTLS and FTLS shall include those components of the
                     TCB that are implemented as hardware and/or firmware if
                     their properties are visible at the TCB interface.  The
                     FTLS shall be shown to be an accurate description of the
                     TCB interface.  A convincing argument shall be given that
                     the DTLS is consistent with the model and a combination of
                     formal and informal techniques shall be used to show that
                     the FTLS is consistent with the model.  This verification
                     evidence shall be consistent with that provided within the
                     state-of-the-art of the particular computer security
                     center-endorsed formal specification and verification
                     system used.  Manual or other mapping of the FTLS to the
                     TCB source code shall be performed to provide evidence of
                     correct implementation.
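
          As an informative illustration only (not part of the criteria), one
          widely cited example of a formal security policy model of the kind
          required here is the Bell-LaPadula state-machine model.  In
          simplified textbook form, its simple security condition and
          *-property may be written as:

              \text{ss-property:}\quad (s, o, \mathrm{read}) \in b
                  \;\Longrightarrow\; L(s) \succeq L(o)

              \text{*-property:}\quad (s, o, \mathrm{write}) \in b
                  \;\Longrightarrow\; L(o) \succeq L(s)

          where $b$ is the set of current accesses, $L(\cdot)$ gives the
          security level of a subject or object, and $\succeq$ is the
          dominance relation described under Mandatory Access Control above.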

          4.1.3.2.3  Configuration Management

                     During the entire life-cycle, i.e., during the design,
                     development, and maintenance of the TCB, a configuration
                     management system shall be in place for all security-
                     relevant hardware, firmware, and software that maintains
                     control of changes to the formal model, the descriptive 
                     and formal top-level specifications, other design data, 
                     implementation documentation, source code, the running
                     version of the object code, and test fixtures and
                     documentation.  The configuration management system shall
                     assure a consistent mapping among all documentation and 
                     code associated with the current version of the TCB.
                     Tools shall be provided for generation of a new version
                     of the TCB from source code.  Also available shall be 
                     tools, maintained under strict configuration control, for
                     comparing a newly generated version with the previous TCB
                     version in order to ascertain that only the intended
                     changes have been made in the code that will actually be
                     used as the new version of the TCB.  A combination of
                     technical, physical, and procedural safeguards shall be
                     used to protect from unauthorized modification or
                     destruction the master copy or copies of all material
                     used to generate the TCB.

          4.1.3.2.4  Trusted Distribution

                     A trusted ADP system control and distribution facility
                     shall be provided for maintaining the integrity of the
                     mapping between the master data describing the current
                     version of the TCB and the on-site master copy of the
                     code for the current version.  Procedures (e.g., site
                     security acceptance testing) shall exist for assuring 
                      that the TCB software, firmware, and hardware updates
                     distributed to a customer are exactly as specified by
                     the master copies.

4.1.4  Documentation

     4.1.4.1   Security Features User's Guide

               A single summary, chapter, or manual in user documentation
               shall describe the protection mechanisms provided by the TCB,
               guidelines on their use, and how they interact with one another.

     4.1.4.2   Trusted Facility Manual

               A manual addressed to the ADP system administrator shall
               present cautions about functions and privileges that should be
               controlled when running a secure facility.  The procedures for
               examining and maintaining the audit files as well as the
               detailed audit record structure for each type of audit event
               shall be given.  The manual shall describe the operator and
               administrator functions related to security, to include 
               changing the security characteristics of a user.  It shall
               provide guidelines on the consistent and effective use of the
               protection features of the system, how they interact, how to
               securely generate a new TCB, and facility procedures, warnings,
               and privileges that need to be controlled in order to operate
               the facility in a secure manner.  The TCB modules that contain
               the reference validation mechanism shall be identified.  The 
               procedures for secure generation of a new TCB from source after
               modification of any modules in the TCB shall be described.  It
               shall include the procedures to ensure that the system is 
               initially started in a secure manner.  Procedures shall also be
               included to resume secure system operation after any lapse in 
               system operation.  

     4.1.4.3   Test Documentation

               The system developer shall provide to the evaluators a document
               that describes the test plan, test procedures that show how the
               security mechanisms were tested, and results of the security
               mechanisms' functional testing.  It shall include results of 
               testing the effectiveness of the methods used to reduce covert
               channel bandwidths.  The results of the mapping between the
               formal top-level specification and the TCB source code shall be
               given.

     4.1.4.4   Design Documentation

               Documentation shall be available that provides a description of
               the manufacturer's philosophy of protection and an explanation
               of how this philosophy is translated into the TCB.  The 
               interfaces between the TCB modules shall be described.  A
               formal description of the security policy model enforced by the
               TCB shall be available and proven that it is sufficient to 
               enforce the security policy.  The specific TCB protection 
               mechanisms shall be identified and an explanation given to show
               that they satisfy the model.  The descriptive top-level speci-
               fication (DTLS) shall be shown to be an accurate description of
               the TCB interface.  Documentation shall describe how the TCB
               implements the reference monitor concept and give an explana-
               tion why it is tamper resistant, cannot be bypassed, and
               is correctly implemented.  The TCB implementation (i.e., in
               hardware, firmware, and software) shall be informally shown to
               be consistent with the formal top-level specification (FTLS).
               The elements of the FTLS shall be shown, using informal 
               techniques, to correspond to the elements of the TCB.  
               Documentation shall describe how the TCB is structured to
               facilitate testing and to enforce least privilege.  This
               documentation shall also present the results of the covert 
               channel analysis and the tradeoffs involved in restricting the
               channels.  All auditable events that may be used in the
               exploitation of known covert storage channels shall be 
               identified.  The bandwidths of known covert storage channels,
               the use of which is not detectable by the auditing mechanisms,
               shall be provided.  (See the Covert Channel Guideline section.)
               Hardware, firmware, and software mechanisms not dealt with in
               the FTLS but strictly internal to the TCB (e.g., mapping
               registers, direct memory access I/O) shall be clearly described.

4.2  BEYOND CLASS (A1)

Most of the security enhancements envisioned for systems that will provide
features and assurance in addition to those already provided by class (A1)
systems are beyond current technology.  The discussion below is intended to
guide future work and is derived from research and development activities
already underway in both the public and private sectors.  As more and better
analysis techniques are developed, the requirements for these systems will
become more explicit.  In the future, use of formal verification will be
extended to the source level and covert timing channels will be more fully
addressed.  At this level the design environment will become important and
testing will be aided by analysis of the formal top