The Department of Defense Trusted Computer System Evaluation Criteria (August 15, 1983) (The Orange Book)

CSC-STD-001-83
Library No. S225,711

DEPARTMENT OF DEFENSE

TRUSTED COMPUTER SYSTEM EVALUATION CRITERIA

15 August 1983

CSC-STD-001-83

FOREWORD

This publication, “Department of Defense Trusted Computer System Evaluation
Criteria,” is being issued by the DoD Computer Security Center under the
authority of and in accordance with DoD Directive 5215.1, “Computer Security
Evaluation Center.” The criteria defined in this document constitute a uniform
set of basic requirements and evaluation classes for assessing the
effectiveness of security controls built into Automatic Data Processing (ADP)
systems. These criteria are intended for use in the evaluation and selection
of ADP systems being considered for the processing and/or storage and
retrieval of sensitive or classified information by the Department of Defense.
Point of contact concerning this publication is the Office of Standards and
Products, Attention: Chief, Computer Security Standards.

____________________________ 15 August 1983
Melville H. Klein
Director
DoD Computer Security Center

ACKNOWLEDGMENTS

Special recognition is extended to Sheila L. Brand, DoD Computer Security
Center (DoDCSC), who integrated theory, policy, and practice into and directed
the production of this document.

Acknowledgment is also given for the contributions of: Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, Col. Roger R. Schell,
Marvin Schaefer, DoDCSC, and Theodore M. P. Lee, Sperry UNIVAC, who as
original architects formulated and articulated the technical issues and
solutions presented in this document; Jeff Makey and Warren F. Shadle,
DoDCSC, who assisted in the preparation of this document; James P. Anderson,
James P. Anderson & Co., Steven B. Lipner, Digital Equipment Corp., Clark
Weissman, System Development Corp., LTC Lawrence A. Noble, formerly U.S. Air
Force, Stephen T. Walker, formerly DoD, Eugene V. Epperly, DoD, and James E.
Studer, formerly Dept. of the Army, who gave generously of their time and
expertise in the review and critique of this document; and finally, thanks are
given to the computer industry and others interested in trusted computing for
their enthusiastic advice and assistance throughout this effort.

TABLE OF CONTENTS

FOREWORD
ACKNOWLEDGMENTS
PREFACE
INTRODUCTION

PART I: THE CRITERIA
Section
1.0  DIVISION D: MINIMAL PROTECTION
2.0  DIVISION C: DISCRETIONARY PROTECTION
     2.1  Class (C1): Discretionary Security Protection
     2.2  Class (C2): Controlled Access Protection
3.0  DIVISION B: MANDATORY PROTECTION
     3.1  Class (B1): Labeled Security Protection
     3.2  Class (B2): Structured Protection
     3.3  Class (B3): Security Domains
4.0  DIVISION A: VERIFIED PROTECTION
     4.1  Class (A1): Verified Design
     4.2  Beyond Class (A1)

PART II: RATIONALE AND GUIDELINES

5.0  CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS
     5.1  A Need for Consensus
     5.2  Definition and Usefulness
     5.3  Criteria Control Objective
6.0  RATIONALE BEHIND THE EVALUATION CLASSES
     6.1  The Reference Monitor Concept
     6.2  A Formal Security Policy Model
     6.3  The Trusted Computing Base
     6.4  Assurance
     6.5  The Classes
7.0  THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA
     7.1  Established Federal Policies
     7.2  DoD Policies
     7.3  Criteria Control Objective for Security Policy
     7.4  Criteria Control Objective for Accountability
     7.5  Criteria Control Objective for Assurance
8.0  A GUIDELINE ON COVERT CHANNELS
9.0  A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES
10.0 A GUIDELINE ON SECURITY TESTING
     10.1 Testing for Division C
     10.2 Testing for Division B
     10.3 Testing for Division A
APPENDIX A: Commercial Product Evaluation Process
APPENDIX B: Summary of Evaluation Criteria Divisions
APPENDIX C: Summary of Evaluation Criteria Classes
APPENDIX D: Requirement Directory

GLOSSARY

REFERENCES

PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection. They provide a basis for the evaluation of effectiveness of
security controls built into automatic data processing system products. The
criteria were developed with three objectives in mind: (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other sensitive
information; (b) to provide guidance to manufacturers as to what to build into
their new, widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide a basis for
specifying security requirements in acquisition specifications. Two types of
requirements are delineated for secure processing: (a) specific security
feature requirements and (b) assurance requirements. Some of the latter
requirements enable evaluation personnel to determine if the required features
are present and functioning as intended. Though the criteria are
application-independent, it is recognized that the specific security feature
requirements may have to be interpreted when applying the criteria to specific
applications or other special processing environments. The underlying
assurance requirements can be applied across the entire spectrum of ADP system
or application processing environments without special interpretation.

INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the Defense
Science Board to address computer security safeguards that would protect
classified information in remote-access, resource-sharing computer systems.
The Task Force report, “Security Controls for Computer Systems,” published in
February 1970, made a number of policy and technical recommendations on
actions to be taken to reduce the threat of compromise of classified
information processed on remote-access computer systems.[34] Department of
Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published
in 1972 and 1973 respectively, responded to one of these recommendations by
establishing uniform DoD policy, security requirements, administrative
controls, and technical measures to protect classified information processed
by DoD computer systems.[8;9] Research and development work undertaken by the
Air Force, Advanced Research Projects Agency, and other defense agencies in
the early and mid 70’s developed and demonstrated solution approaches for the
technical problems associated with controlling the flow of information in
resource and information sharing computer systems.[1] The DoD Computer
Security Initiative was started in 1977 under the auspices of the Under
Secretary of Defense for Research and Engineering to focus DoD efforts
addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to define
problems and solutions for building, evaluating, and auditing secure computer
systems.[17] As part of this work NBS held two invitational workshops on the
subject of audit and evaluation of computer security.[20;28] The first was
held in March 1977, and the second in November of 1978. One of the products
of the second workshop was a definitive paper on the problems related to
providing criteria for the evaluation of technical computer security
effectiveness.[20] As an outgrowth of recommendations from this report, and in
support of the DoD Computer Security Initiative, the MITRE Corporation began
work on a set of computer security evaluation criteria that could be used to
assess the degree of trust one could place in a computer system to protect
classified data.[24;25;31] The preliminary concepts for computer security
evaluation were defined and expanded upon at invitational workshops and
symposia whose participants represented computer security expertise drawn from
industry and academia in addition to the government. Their work has since
been subjected to much peer review and constructive technical criticism from
the DoD, industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15] A major goal of the Center as given in its DoD Charter is to
encourage the widespread availability of trusted computer systems for use by
those who process classified or other sensitive information.[10] The criteria
presented in this document have evolved from the earlier NBS and MITRE
evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document apply
to both trusted general-purpose and trusted embedded (e.g., those dedicated to
a specific application) automatic data processing (ADP) systems. Included are
two distinct sets of requirements: 1) specific security feature requirements;
and 2) assurance requirements. The specific feature requirements encompass
the capabilities typically found in information processing systems employing
general-purpose operating systems that are distinct from the applications
programs being supported. The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments from
dedicated controllers to full range multilevel secure resource sharing
systems.

Purpose

As outlined in the Preface, the criteria have been developed for a number of
reasons:

* To provide users with a metric with which to evaluate the
degree of trust that can be placed in computer systems for
the secure processing of classified and other sensitive
information.

* To provide guidance to manufacturers as to what security
features to build into their new and planned, commercial
products in order to provide widely available systems that
satisfy trust requirements for sensitive applications.

* To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the first purpose for development of the criteria, i.e.,
providing users with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be performed on a computer
product from a perspective that excludes the application environment; or, (b)
it can be done to assess whether appropriate security measures have been taken
to permit the system to be used operationally in a specific environment. The
former type of evaluation is done by the Computer Security Center through the
Commercial Product Evaluation Process. That process is described in Appendix
A.

The latter type of evaluation, i.e., one done for the purpose of assessing a
system’s security attributes with respect to a specific operational mission,
is known as a certification evaluation. It must be understood that the
completion of a formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific application
environment. On the contrary, the evaluation report only provides a trusted
computer system’s evaluation rating along with supporting data describing the
product system’s strengths and weaknesses from a computer security point of
view. The system security certification and the formal approval/accreditation
procedure, done in accordance with the applicable policies of the issuing
agencies, must still be followed before a system can be approved for use in
processing or handling classified information.[8;9]

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process. Along with applicable policy, they
will be used directly as the basis for evaluation of the total system and for
specifying system security and certification requirements for new
acquisitions. Where a system being evaluated for certification employs a
product that has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation. Technical data
will be furnished to designers, evaluators and the Designated Approving
Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system “secure.”
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to read,
write, create, or delete information. Six fundamental requirements are
derived from this basic statement of objective: four deal with what needs to
be provided to control access to information; and two deal with how one can
obtain credible assurances that this is accomplished in a trusted computer
system.

POLICY

Requirement 1 – SECURITY POLICY – There must be an explicit and well-defined
security policy enforced by the system. Given identified subjects and
objects, there must be a set of rules that are used by the system to determine
whether a given subject can be permitted to gain access to a specific object.
Computer systems of interest must enforce a mandatory security policy that can
effectively implement access rules for handling sensitive (e.g., classified)
information.[7] These rules include requirements such as: No person lacking
proper personnel security clearance shall obtain access to classified
information. In addition, discretionary security controls are required to
ensure that only selected users or groups of users may obtain access to data
(e.g., based on a need-to-know).

Requirement 2 – MARKING – Access control labels must be associated with
objects. In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible to
mark every object with a label that reliably identifies the object’s
sensitivity level (e.g., classification), and/or the modes of access accorded
those subjects who may potentially access the object.

ACCOUNTABILITY

Requirement 3 – IDENTIFICATION – Individual subjects must be identified. Each
access to information must be mediated based on who is accessing the
information and what classes of information they are authorized to deal with.
This identification and authorization information must be securely maintained
by the computer system and be associated with every active element that
performs some security-relevant action in the system.

Requirement 4 – ACCOUNTABILITY – Audit information must be selectively kept
and protected so that actions affecting security can be traced to the
responsible party. A trusted system must be able to record the occurrences of
security-relevant events in an audit log. The capability to select the audit
events to be recorded is necessary to minimize the expense of auditing and to
allow efficient analysis. Audit data must be protected from modification and
unauthorized destruction to permit detection and after-the-fact investigations
of security violations.

ASSURANCE

Requirement 5 – ASSURANCE – The computer system must contain hardware/software
mechanisms that can be independently evaluated to provide sufficient assurance
that the system enforces requirements 1 through 4 above. In order to assure
that the four requirements of Security Policy, Marking, Identification, and
Accountability are enforced by a computer system, there must be some
identified and unified collection of hardware and software controls that
perform those functions. These mechanisms are typically embedded in the
operating system and are designed to carry out the assigned tasks in a secure
manner. The basis for trusting such system mechanisms in their operational
setting must be clearly documented such that it is possible to independently
examine the evidence to evaluate their sufficiency.

Requirement 6 – CONTINUOUS PROTECTION – The trusted mechanisms that enforce
these basic requirements must be continuously protected against tampering
and/or unauthorized changes. No computer system can be considered truly
secure if the basic hardware and software mechanisms that enforce the security
policy are themselves subject to unauthorized modification or subversion. The
continuous protection requirement has direct implications throughout the
computer system’s life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class. The interested
reader is referred to Section 5 of this document, “Control Objectives for
Trusted Computer Systems,” for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.

Structure of the Document

The remainder of this document is divided into two parts, four appendices, and
a glossary. Part I (Sections 1 through 4) presents the detailed criteria
derived from the fundamental requirements described above and relevant to the
rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to: mandatory access control rules
implementation, the covert channel problem, and security testing. It is
divided into six sections. Section 5 discusses the use of control objectives
in general and presents the three basic control objectives of the criteria.
Section 6 provides the theoretical basis behind the criteria. Section 7 gives
excerpts from pertinent regulations, directives, OMB Circulars, and Executive
Orders which provide the basis for many trust requirements for processing
nationally sensitive and classified information with computer systems.
Section 8 provides guidance to system developers on expectations in dealing
with the covert channel problem. Section 9 provides guidelines dealing with
mandatory security. Section 10 provides guidelines for security testing.
There are four appendices, including a description of the Trusted Computer
System Commercial Products Evaluation Process (Appendix A), summaries of the
evaluation divisions (Appendix B) and classes (Appendix C), and finally a
directory of requirements ordered alphabetically. In addition, there is a
glossary.

Structure of the Criteria

The criteria are divided into four divisions: D, C, B, and A ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security. Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information. Within divisions C and B there are a
number of subdivisions known as classes. The classes are also ordered in a
hierarchical manner with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess. Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system. The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB). Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure. Increased assurance that the required features are
operative, correct, and tamperproof under all circumstances is gained through
progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed. The first three
represent features necessary to satisfy the broad control objectives of
Security Policy, Accountability, and Assurance that are discussed in Part II,
Section 5. The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
first read Part II, before continuing on with Part I.

PART I: THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not contained
in a lower class or changes and additions to already defined criteria. Where
there is no highlighting, requirements have been carried over from lower
classes without addition or modification.

1.0 DIVISION D: MINIMAL PROTECTION

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

2.0 DIVISION C: DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1 CLASS (C1): DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity. The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1 SECURITY POLICY

2.1.1.1 Discretionary Access Control

THE TCB SHALL DEFINE AND CONTROL ACCESS BETWEEN NAMED USERS AND
NAMED OBJECTS (E.G., FILES AND PROGRAMS) IN THE ADP SYSTEM. THE
ENFORCEMENT MECHANISM (E.G., SELF/GROUP/PUBLIC CONTROLS, ACCESS
CONTROL LISTS) SHALL ALLOW USERS TO SPECIFY AND CONTROL SHARING
OF THOSE OBJECTS BY NAMED INDIVIDUALS OR DEFINED GROUPS OR BOTH.
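
The following non-normative sketch illustrates one way an enforcement mechanism of the kind described above (a per-object access control list covering named users and defined groups) might perform its check. The names used here (ACLEntry, dac_permits, "@staff") are hypothetical and are not drawn from the criteria.

    from dataclasses import dataclass, field

    @dataclass
    class ACLEntry:
        principal: str   # a named user ("alice") or a defined group ("@staff")
        modes: set       # access modes granted, e.g., {"read", "write"}

    @dataclass
    class NamedObject:
        name: str
        owner: str
        acl: list = field(default_factory=list)

    def dac_permits(user: str, groups: set, obj: NamedObject, mode: str) -> bool:
        """Grant access if the named user, or any defined group the user
        belongs to, has been given the requested mode on the object."""
        for entry in obj.acl:
            named_user = entry.principal == user
            named_group = entry.principal.startswith("@") and entry.principal in groups
            if (named_user or named_group) and mode in entry.modes:
                return True
        return False

For example, the owner of a file could append ACLEntry("@project", {"read"}) to its acl to share it read-only with a defined group while withholding access from all other users.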

2.1.2 ACCOUNTABILITY

2.1.2.1 Identification and Authentication

THE TCB SHALL REQUIRE USERS TO IDENTIFY THEMSELVES TO IT BEFORE
BEGINNING TO PERFORM ANY OTHER ACTIONS THAT THE TCB IS EXPECTED
TO MEDIATE. FURTHERMORE, THE TCB SHALL USE A PROTECTED
MECHANISM (E.G., PASSWORDS) TO AUTHENTICATE THE USER’S IDENTITY.
THE TCB SHALL PROTECT AUTHENTICATION DATA SO THAT IT CANNOT BE
ACCESSED BY ANY UNAUTHORIZED USER.
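
As a non-normative illustration of a protected password mechanism, the sketch below keeps only a salted, iterated hash of each password, so the authentication data is not disclosed even to a reader of the stored record. The function names (enroll, authenticate) and parameter choices are hypothetical.

    import hashlib
    import hmac
    import os

    def enroll(password: str) -> dict:
        """Create the stored authentication record for a new user."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return {"salt": salt, "digest": digest}

    def authenticate(record: dict, password: str) -> bool:
        """Check a claimed identity against the stored record."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                        record["salt"], 100_000)
        # Constant-time comparison avoids leaking information through timing.
        return hmac.compare_digest(candidate, record["digest"])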

2.1.3 ASSURANCE

2.1.3.1 Operational Assurance

2.1.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
RESOURCES CONTROLLED BY THE TCB MAY BE A DEFINED SUBSET
OF THE SUBJECTS AND OBJECTS IN THE ADP SYSTEM.

2.1.3.1.2 System Integrity

HARDWARE AND/OR SOFTWARE FEATURES SHALL BE PROVIDED THAT
CAN BE USED TO PERIODICALLY VALIDATE THE CORRECT OPERATION
OF THE ON-SITE HARDWARE AND FIRMWARE ELEMENTS OF THE TCB.

2.1.3.2 Life-Cycle Assurance

2.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
TESTING SHALL BE DONE TO ASSURE THAT THERE ARE NO OBVIOUS
WAYS FOR AN UNAUTHORIZED USER TO BYPASS OR OTHERWISE
DEFEAT THE SECURITY PROTECTION MECHANISMS OF THE TCB.
(SEE THE SECURITY TESTING GUIDELINES.)

2.1.4 DOCUMENTATION

2.1.4.1 Security Features User’s Guide

A SINGLE SUMMARY, CHAPTER, OR MANUAL IN USER DOCUMENTATION
SHALL DESCRIBE THE PROTECTION MECHANISMS PROVIDED BY THE TCB,
GUIDELINES ON THEIR USE, AND HOW THEY INTERACT WITH ONE ANOTHER.

2.1.4.2 Trusted Facility Manual

A MANUAL ADDRESSED TO THE ADP SYSTEM ADMINISTRATOR SHALL
PRESENT CAUTIONS ABOUT FUNCTIONS AND PRIVILEGES THAT SHOULD BE
CONTROLLED WHEN RUNNING A SECURE FACILITY.

2.1.4.3 Test Documentation

THE SYSTEM DEVELOPER SHALL PROVIDE TO THE EVALUATORS A DOCUMENT
THAT DESCRIBES THE TEST PLAN AND RESULTS OF THE SECURITY
MECHANISMS’ FUNCTIONAL TESTING.

2.1.4.4 Design Documentation

DOCUMENTATION SHALL BE AVAILABLE THAT PROVIDES A DESCRIPTION OF
THE MANUFACTURER’S PHILOSOPHY OF PROTECTION AND AN EXPLANATION
OF HOW THIS PHILOSOPHY IS TRANSLATED INTO THE TCB. IF THE TCB
IS COMPOSED OF DISTINCT MODULES, THE INTERFACES BETWEEN THESE
MODULES SHALL BE DESCRIBED.

2.2 CLASS (C2): CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation. The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1 SECURITY POLICY

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing
of those objects by named individuals, or defined groups OF
INDIVIDUALS, or by both. THE DISCRETIONARY ACCESS CONTROL
MECHANISM SHALL, EITHER BY EXPLICIT USER ACTION OR BY DEFAULT,
PROVIDE THAT OBJECTS ARE PROTECTED FROM UNAUTHORIZED ACCESS.
THESE ACCESS CONTROLS SHALL BE CAPABLE OF INCLUDING OR EXCLUDING
ACCESS TO THE GRANULARITY OF A SINGLE USER. ACCESS PERMISSION
TO AN OBJECT BY USERS NOT ALREADY POSSESSING ACCESS PERMISSION
SHALL ONLY BE ASSIGNED BY AUTHORIZED USERS.

2.2.1.2 Object Reuse

WHEN A STORAGE OBJECT IS INITIALLY ASSIGNED, ALLOCATED, OR
REALLOCATED TO A SUBJECT FROM THE TCB’S POOL OF UNUSED STORAGE
OBJECTS, THE TCB SHALL ASSURE THAT THE OBJECT CONTAINS NO DATA
FOR WHICH THE SUBJECT IS NOT AUTHORIZED.
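
A non-normative sketch of one way to meet this requirement is shown below: storage is scrubbed before it is handed to a new subject, so no residue from a previous owner remains visible. The pool structure and names are hypothetical.

    class StoragePool:
        """A fixed pool of storage blocks that are cleared on allocation."""

        def __init__(self, num_blocks: int, block_size: int):
            self.block_size = block_size
            self.free = [bytearray(block_size) for _ in range(num_blocks)]

        def allocate(self) -> bytearray:
            block = self.free.pop()
            block[:] = b"\x00" * self.block_size   # scrub residual data
            return block

        def release(self, block: bytearray) -> None:
            self.free.append(block)                # scrubbed again on reuse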

2.2.2 ACCOUNTABILITY

2.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall use a protected
mechanism (e.g., passwords) to authenticate the user’s identity.
The TCB shall protect authentication data so that it cannot be
accessed by any unauthorized user. THE TCB SHALL BE ABLE TO
ENFORCE INDIVIDUAL ACCOUNTABILITY BY PROVIDING THE CAPABILITY TO
UNIQUELY IDENTIFY EACH INDIVIDUAL ADP SYSTEM USER. THE TCB
SHALL ALSO PROVIDE THE CAPABILITY OF ASSOCIATING THIS IDENTITY
WITH ALL AUDITABLE ACTIONS TAKEN BY THAT INDIVIDUAL.

2.2.2.2 Audit

THE TCB SHALL BE ABLE TO CREATE, MAINTAIN, AND PROTECT FROM
MODIFICATION OR UNAUTHORIZED ACCESS OR DESTRUCTION AN AUDIT
TRAIL OF ACCESSES TO THE OBJECTS IT PROTECTS. THE AUDIT DATA
SHALL BE PROTECTED BY THE TCB SO THAT READ ACCESS TO IT IS
LIMITED TO THOSE WHO ARE AUTHORIZED FOR AUDIT DATA. THE TCB
SHALL BE ABLE TO RECORD THE FOLLOWING TYPES OF EVENTS: USE OF
IDENTIFICATION AND AUTHENTICATION MECHANISMS, INTRODUCTION OF
OBJECTS INTO A USER’S ADDRESS SPACE (E.G., FILE OPEN, PROGRAM
INITIATION), DELETION OF OBJECTS, AND ACTIONS TAKEN BY
COMPUTER OPERATORS AND SYSTEM ADMINISTRATORS AND/OR SYSTEM
SECURITY OFFICERS. FOR EACH RECORDED EVENT, THE AUDIT RECORD
SHALL IDENTIFY: DATE AND TIME OF THE EVENT, USER, TYPE OF
EVENT, AND SUCCESS OR FAILURE OF THE EVENT. FOR
IDENTIFICATION/AUTHENTICATION EVENTS THE ORIGIN OF REQUEST
(E.G., TERMINAL ID) SHALL BE INCLUDED IN THE AUDIT RECORD. FOR
EVENTS THAT INTRODUCE AN OBJECT INTO A USER’S ADDRESS SPACE AND
FOR OBJECT DELETION EVENTS THE AUDIT RECORD SHALL INCLUDE THE
NAME OF THE OBJECT. THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE
TO SELECTIVELY AUDIT THE ACTIONS OF ANY ONE OR MORE USERS BASED
ON INDIVIDUAL IDENTITY.
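
The record contents enumerated above can be pictured with the non-normative sketch below; the AuditRecord type, its field names, and select_by_user are hypothetical illustrations rather than a required format.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass(frozen=True)
    class AuditRecord:
        timestamp: datetime                 # date and time of the event
        user: str                           # individual user identity
        event_type: str                     # e.g., "login", "file_open", "file_delete"
        success: bool                       # success or failure of the event
        origin: Optional[str] = None        # e.g., terminal ID, for I&A events
        object_name: Optional[str] = None   # for object introduction or deletion

    def select_by_user(trail, users):
        """Selectively audit the actions of one or more users by identity."""
        return [rec for rec in trail if rec.user in users]

    example = AuditRecord(datetime.now(timezone.utc), "alice", "file_open",
                          success=True, object_name="/project/report.txt")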

2.2.3 ASSURANCE

2.2.3.1 Operational Assurance

2.2.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL ISOLATE THE RESOURCES TO BE PROTECTED SO THAT THEY
ARE SUBJECT TO THE ACCESS CONTROL AND AUDITING
REQUIREMENTS.

2.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

2.2.3.2 Life-Cycle Assurance

2.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
Testing shall be done to assure that there are no obvious
ways for an unauthorized user to bypass or otherwise
defeat the security protection mechanisms of the TCB.
TESTING SHALL ALSO INCLUDE A SEARCH FOR OBVIOUS FLAWS THAT
WOULD ALLOW VIOLATION OF RESOURCE ISOLATION, OR THAT WOULD
PERMIT UNAUTHORIZED ACCESS TO THE AUDIT OR AUTHENTICATION
DATA. (See the Security Testing guidelines.)

2.2.4 DOCUMENTATION

2.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

2.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. THE PROCEDURES FOR
EXAMINING AND MAINTAINING THE AUDIT FILES AS WELL AS THE
DETAILED AUDIT RECORD STRUCTURE FOR EACH TYPE OF AUDIT EVENT
SHALL BE GIVEN.

2.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

2.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described.

3.0 DIVISION B: MANDATORY PROTECTION

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

3.1 CLASS (B1): LABELED SECURITY PROTECTION

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed. The following are minimal
requirements for systems assigned a class (B1) rating:

3.1.1 SECURITY POLICY

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups
of individuals, or by both. The discretionary access control
mechanism shall, either by explicit user action or by default,
provide that objects are protected from unauthorized access.
These access controls shall be capable of including or excluding
access to the granularity of a single user. Access permission
to an object by users not already possessing access permission
shall only be assigned by authorized users.

3.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.1.1.3 Labels

SENSITIVITY LABELS ASSOCIATED WITH EACH SUBJECT AND STORAGE
OBJECT UNDER ITS CONTROL (E.G., PROCESS, FILE, SEGMENT, DEVICE)
SHALL BE MAINTAINED BY THE TCB. THESE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. IN ORDER TO
IMPORT NON-LABELED DATA, THE TCB SHALL REQUEST AND RECEIVE FROM
AN AUTHORIZED USER THE SECURITY LEVEL OF THE DATA, AND ALL SUCH
ACTIONS SHALL BE AUDITABLE BY THE TCB.

3.1.1.3.1 Label Integrity

SENSITIVITY LABELS SHALL ACCURATELY REPRESENT SECURITY
LEVELS OF THE SPECIFIC SUBJECTS OR OBJECTS WITH WHICH THEY
ARE ASSOCIATED. WHEN EXPORTED BY THE TCB, SENSITIVITY
LABELS SHALL ACCURATELY AND UNAMBIGUOUSLY REPRESENT THE
INTERNAL LABELS AND SHALL BE ASSOCIATED WITH THE
INFORMATION BEING EXPORTED.

3.1.1.3.2 Exportation of Labeled Information

THE TCB SHALL DESIGNATE EACH COMMUNICATION CHANNEL AND
I/O DEVICE AS EITHER SINGLE-LEVEL OR MULTILEVEL. ANY
CHANGE IN THIS DESIGNATION SHALL BE DONE MANUALLY AND
SHALL BE AUDITABLE BY THE TCB. THE TCB SHALL MAINTAIN
AND BE ABLE TO AUDIT ANY CHANGE IN THE CURRENT SECURITY
LEVEL ASSOCIATED WITH A SINGLE-LEVEL COMMUNICATION
CHANNEL OR I/O DEVICE.

3.1.1.3.2.1 Exportation to Multilevel Devices

WHEN THE TCB EXPORTS AN OBJECT TO A MULTILEVEL I/O
DEVICE, THE SENSITIVITY LABEL ASSOCIATED WITH THAT
OBJECT SHALL ALSO BE EXPORTED AND SHALL RESIDE ON
THE SAME PHYSICAL MEDIUM AS THE EXPORTED
INFORMATION AND SHALL BE IN THE SAME FORM
(I.E., MACHINE-READABLE OR HUMAN-READABLE FORM).
WHEN THE TCB EXPORTS OR IMPORTS AN OBJECT OVER A
MULTILEVEL COMMUNICATION CHANNEL, THE PROTOCOL
USED ON THAT CHANNEL SHALL PROVIDE FOR THE
UNAMBIGUOUS PAIRING BETWEEN THE SENSITIVITY LABELS
AND THE ASSOCIATED INFORMATION THAT IS SENT OR
RECEIVED.

3.1.1.3.2.2 Exportation to Single-Level Devices

SINGLE-LEVEL I/O DEVICES AND SINGLE-LEVEL
COMMUNICATION CHANNELS ARE NOT REQUIRED TO
MAINTAIN THE SENSITIVITY LABELS OF THE INFORMATION
THEY PROCESS. HOWEVER, THE TCB SHALL INCLUDE A
MECHANISM BY WHICH THE TCB AND AN AUTHORIZED USER
RELIABLY COMMUNICATE TO DESIGNATE THE SINGLE
SECURITY LEVEL OF INFORMATION IMPORTED OR EXPORTED
VIA SINGLE-LEVEL COMMUNICATION CHANNELS OR I/O
DEVICES.

3.1.1.3.2.3 Labeling Human-Readable Output

THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE TO
SPECIFY THE PRINTABLE LABEL NAMES ASSOCIATED WITH
EXPORTED SENSITIVITY LABELS. THE TCB SHALL MARK
THE BEGINNING AND END OF ALL HUMAN-READABLE, PAGED,
HARDCOPY OUTPUT (E.G., LINE PRINTER OUTPUT) WITH
HUMAN-READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. THE TCB
SHALL, BY DEFAULT, MARK THE TOP AND BOTTOM OF EACH
PAGE OF HUMAN-READABLE, PAGED, HARDCOPY OUTPUT
(E.G., LINE PRINTER OUTPUT) WITH HUMAN-READABLE
SENSITIVITY LABELS THAT PROPERLY* REPRESENT THE
OVERALL SENSITIVITY OF THE OUTPUT OR THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE INFORMATION ON THE
PAGE. THE TCB SHALL, BY DEFAULT AND IN AN
APPROPRIATE MANNER, MARK OTHER FORMS OF HUMAN-
READABLE OUTPUT (E.G., MAPS, GRAPHICS) WITH HUMAN-
READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. ANY
OVERRIDE OF THESE MARKING DEFAULTS SHALL BE
AUDITABLE BY THE TCB.

_____________________________________________________________
* THE HIERARCHICAL CLASSIFICATION COMPONENT IN HUMAN-READABLE
SENSITIVITY LABELS SHALL BE EQUAL TO THE GREATEST
HIERARCHICAL CLASSIFICATION OF ANY OF THE INFORMATION IN THE
OUTPUT THAT THE LABELS REFER TO; THE NON-HIERARCHICAL
CATEGORY COMPONENT SHALL INCLUDE ALL OF THE NON-HIERARCHICAL
CATEGORIES OF THE INFORMATION IN THE OUTPUT THE LABELS REFER
TO, BUT NO OTHER NON-HIERARCHICAL CATEGORIES.
_____________________________________________________________
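
To illustrate the footnoted rule, the non-normative sketch below computes a page marking from the labels of the information on the page: the greatest hierarchical classification present, together with exactly the union of the non-hierarchical categories present. The classification ordering and category names are hypothetical examples.

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def page_marking(item_labels):
        """item_labels: iterable of (classification, set_of_categories) pairs."""
        top = max((cls for cls, _ in item_labels), key=LEVELS.get)
        categories = set().union(*(cats for _, cats in item_labels))
        return top, categories

    # A page containing SECRET {NOFORN} and CONFIDENTIAL {CRYPTO} information
    # would be marked SECRET with categories {NOFORN, CRYPTO}.
    print(page_marking([("SECRET", {"NOFORN"}), ("CONFIDENTIAL", {"CRYPTO"})]))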

3.1.1.4 Mandatory Access Control

THE TCB SHALL ENFORCE A MANDATORY ACCESS CONTROL POLICY OVER
ALL SUBJECTS AND STORAGE OBJECTS UNDER ITS CONTROL (E.G.,
PROCESSES, FILES, SEGMENTS, DEVICES). THESE SUBJECTS AND
OBJECTS SHALL BE ASSIGNED SENSITIVITY LABELS THAT ARE A
COMBINATION OF HIERARCHICAL CLASSIFICATION LEVELS AND
NON-HIERARCHICAL CATEGORIES, AND THE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. THE TCB
SHALL BE ABLE TO SUPPORT TWO OR MORE SUCH SECURITY LEVELS.
(SEE THE MANDATORY ACCESS CONTROL GUIDELINES.) THE FOLLOWING
REQUIREMENTS SHALL HOLD FOR ALL ACCESSES BETWEEN SUBJECTS AND
OBJECTS CONTROLLED BY THE TCB: A SUBJECT CAN READ AN OBJECT
ONLY IF THE HIERARCHICAL CLASSIFICATION IN THE SUBJECT’S
SECURITY LEVEL IS GREATER THAN OR EQUAL TO THE HIERARCHICAL
CLASSIFICATION IN THE OBJECT’S SECURITY LEVEL AND THE NON-
HIERARCHICAL CATEGORIES IN THE SUBJECT’S SECURITY LEVEL INCLUDE
ALL THE NON-HIERARCHICAL CATEGORIES IN THE OBJECT’S SECURITY
LEVEL. A SUBJECT CAN WRITE AN OBJECT ONLY IF THE HIERARCHICAL
CLASSIFICATION IN THE SUBJECT’S SECURITY LEVEL IS LESS THAN OR
EQUAL TO THE HIERARCHICAL CLASSIFICATION IN THE OBJECT’S
SECURITY LEVEL AND ALL THE NON-HIERARCHICAL CATEGORIES IN THE
SUBJECT’S SECURITY LEVEL ARE INCLUDED IN THE NON-HIERARCHICAL
CATEGORIES IN THE OBJECT’S SECURITY LEVEL.
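
The two access rules stated above (read only if the subject’s security level dominates the object’s; write only if the object’s level dominates the subject’s) are illustrated by the non-normative sketch below. The SecurityLevel type, the example classification ordering, and the function names are hypothetical.

    from dataclasses import dataclass
    from typing import FrozenSet

    ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    @dataclass(frozen=True)
    class SecurityLevel:
        classification: str            # hierarchical classification
        categories: FrozenSet[str]     # non-hierarchical categories

    def may_read(subject: SecurityLevel, obj: SecurityLevel) -> bool:
        """Subject's classification >= object's, and the subject's categories
        include all of the object's categories."""
        return (ORDER[subject.classification] >= ORDER[obj.classification]
                and subject.categories >= obj.categories)

    def may_write(subject: SecurityLevel, obj: SecurityLevel) -> bool:
        """Subject's classification <= object's, and all of the subject's
        categories are included in the object's categories."""
        return (ORDER[subject.classification] <= ORDER[obj.classification]
                and subject.categories <= obj.categories)

Under these rules, for example, a SECRET subject holding the CRYPTO category may read a CONFIDENTIAL object having no categories, but may not write to it.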

3.1.2 ACCOUNTABILITY

3.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall MAINTAIN AUTHENTICATION
DATA THAT INCLUDES INFORMATION FOR VERIFYING THE IDENTITY OF
INDIVIDUAL USERS (E.G., PASSWORDS) AS WELL AS INFORMATION FOR
DETERMINING THE CLEARANCE AND AUTHORIZATIONS OF INDIVIDUAL
USERS. THIS DATA SHALL BE USED BY THE TCB TO AUTHENTICATE the
user’s identity AND TO DETERMINE THE SECURITY LEVEL AND
AUTHORIZATIONS OF SUBJECTS THAT MAY BE CREATED TO ACT ON BEHALF
OF THE INDIVIDUAL USER. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. THE TCB SHALL ALSO BE ABLE TO AUDIT ANY OVERRIDE OF
HUMAN-READABLE OUTPUT MARKINGS. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address space
and for object deletion events the audit record shall include
the name of the object AND THE OBJECT’S SECURITY LEVEL. The
ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
AND/OR OBJECT SECURITY LEVEL.

3.1.3 ASSURANCE

3.1.3.1 Operational Assurance

3.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL MAINTAIN PROCESS ISOLATION THROUGH THE PROVISION OF
DISTINCT ADDRESS SPACES UNDER ITS CONTROL. The TCB shall
isolate the resources to be protected so that they are
subject to the access control and auditing requirements.

3.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

3.1.3.2 Life-Cycle Assurance

3.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
A TEAM OF INDIVIDUALS WHO THOROUGHLY UNDERSTAND THE
SPECIFIC IMPLEMENTATION OF THE TCB SHALL SUBJECT ITS
DESIGN DOCUMENTATION, SOURCE CODE, AND OBJECT CODE TO
THOROUGH ANALYSIS AND TESTING. THEIR OBJECTIVES SHALL BE:
TO UNCOVER ALL DESIGN AND IMPLEMENTATION FLAWS THAT WOULD
PERMIT A SUBJECT EXTERNAL TO THE TCB TO READ, CHANGE, OR
DELETE DATA NORMALLY DENIED UNDER THE MANDATORY OR
DISCRETIONARY SECURITY POLICY ENFORCED BY THE TCB; AS WELL
AS TO ASSURE THAT NO SUBJECT (WITHOUT AUTHORIZATION TO DO
SO) IS ABLE TO CAUSE THE TCB TO ENTER A STATE SUCH THAT
IT IS UNABLE TO RESPOND TO COMMUNICATIONS INITIATED BY
OTHER USERS. ALL DISCOVERED FLAWS SHALL BE REMOVED OR
NEUTRALIZED AND THE TCB RETESTED TO DEMONSTRATE THAT THEY
HAVE BEEN ELIMINATED AND THAT NEW FLAWS HAVE NOT BEEN
INTRODUCED. (SEE THE SECURITY TESTING GUIDELINES.)

3.1.3.2.2 Design Specification and Verification

AN INFORMAL OR FORMAL MODEL OF THE SECURITY POLICY
SUPPORTED BY THE TCB SHALL BE MAINTAINED THAT IS SHOWN TO
BE CONSISTENT WITH ITS AXIOMS.

3.1.4 DOCUMENTATION

3.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. THE MANUAL SHALL DESCRIBE THE OPERATOR AND
ADMINISTRATOR FUNCTIONS RELATED TO SECURITY, TO INCLUDE CHANGING
THE SECURITY CHARACTERISTICS OF A USER. IT SHALL PROVIDE
GUIDELINES ON THE CONSISTENT AND EFFECTIVE USE OF THE PROTECTION
FEATURES OF THE SYSTEM, HOW THEY INTERACT, HOW TO SECURELY
GENERATE A NEW TCB, AND FACILITY PROCEDURES, WARNINGS, AND
PRIVILEGES THAT NEED TO BE CONTROLLED IN ORDER TO OPERATE THE
FACILITY IN A SECURE MANNER.

3.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

3.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described. AN INFORMAL OR FORMAL DESCRIPTION
OF THE SECURITY POLICY MODEL ENFORCED BY THE TCB SHALL BE
AVAILABLE AND AN EXPLANATION PROVIDED TO SHOW THAT IT IS
SUFFICIENT TO ENFORCE THE SECURITY POLICY. THE SPECIFIC TCB
PROTECTION MECHANISMS SHALL BE IDENTIFIED AND AN EXPLANATION
GIVEN TO SHOW THAT THEY SATISFY THE MODEL.

3.2 CLASS (B2): STRUCTURED PROTECTION

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration. The
following are minimal requirements for systems assigned a class (B2) rating:

3.2.1 SECURITY POLICY

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined
groups of individuals, or by both. The discretionary access
control mechanism shall, either by explicit user action or by
default, provide that objects are protected from unauthorized
access. These access controls shall be capable of including
or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing
access permission shall only be assigned by authorized users.

3.2.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.2.1.3 Labels

Sensitivity labels associated with each ADP SYSTEM RESOURCE
(E.G., SUBJECT, STORAGE OBJECT) THAT IS DIRECTLY OR INDIRECTLY
ACCESSIBLE BY SUBJECTS EXTERNAL TO THE TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.2.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.2.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.2.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the
information they process. However, the TCB shall
include a mechanism by which the TCB and an
authorized user reliably communicate to designate
the single security level of information imported
or exported via single-level communication
channels or I/O devices.

3.2.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.
_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________

3.2.1.3.3 Subject Sensitivity Labels

THE TCB SHALL IMMEDIATELY NOTIFY A TERMINAL USER OF EACH
CHANGE IN THE SECURITY LEVEL ASSOCIATED WITH THAT USER
DURING AN INTERACTIVE SESSION. A TERMINAL USER SHALL BE
ABLE TO QUERY THE TCB AS DESIRED FOR A DISPLAY OF THE
SUBJECT’S COMPLETE SENSITIVITY LABEL.

3.2.1.3.4 Device Labels

THE TCB SHALL SUPPORT THE ASSIGNMENT OF MINIMUM AND
MAXIMUM SECURITY LEVELS TO ALL ATTACHED PHYSICAL DEVICES.
THESE SECURITY LEVELS SHALL BE USED BY THE TCB TO ENFORCE
CONSTRAINTS IMPOSED BY THE PHYSICAL ENVIRONMENTS IN WHICH
THE DEVICES ARE LOCATED.
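
A non-normative sketch of how such a device range might be enforced appears below: information may pass through a device only if its level dominates the device minimum and is dominated by the device maximum. The ordering, the category names, and the function names are hypothetical.

    ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(higher, lower):
        """(classification, categories) pairs; True if higher dominates lower."""
        h_cls, h_cats = higher
        l_cls, l_cats = lower
        return ORDER[h_cls] >= ORDER[l_cls] and set(h_cats) >= set(l_cats)

    def device_accepts(level, device_min, device_max) -> bool:
        """True if the level lies within the device's assigned range."""
        return dominates(level, device_min) and dominates(device_max, level)

    # A printer located in an open area might be limited to CONFIDENTIAL:
    printer_min = ("UNCLASSIFIED", set())
    printer_max = ("CONFIDENTIAL", {"CRYPTO"})
    print(device_accepts(("SECRET", set()), printer_min, printer_max))   # False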

3.2.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all RESOURCES (I.E., SUBJECTS, STORAGE OBJECTS, AND I/O DEVICES)
THAT ARE DIRECTLY OR INDIRECTLY ACCESSIBLE BY SUBJECTS EXTERNAL
TO THE TCB. These subjects and objects shall be assigned
sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the
labels shall be used as the basis for mandatory access control
decisions. The TCB shall be able to support two or more such
security levels. (See the Mandatory Access Control guidelines.)
The following requirements shall hold for all accesses between
ALL SUBJECTS EXTERNAL TO THE TCB AND ALL OBJECTS DIRECTLY OR
INDIRECTLY ACCESSIBLE BY THESE SUBJECTS: A subject can read an
object only if the hierarchical classification in the subject’s
security level is greater than or equal to the hierarchical
classification in the object’s security level and the non-
hierarchical categories in the subject’s security level include
all the non-hierarchical categories in the object’s security
level. A subject can write an object only if the hierarchical
classification in the subject’s security level is less than or
equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

3.2.2 ACCOUNTABILITY

3.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.2.2.1.1 Trusted Path

THE TCB SHALL SUPPORT A TRUSTED COMMUNICATION PATH
BETWEEN ITSELF AND USER FOR INITIAL LOGIN AND
AUTHENTICATION. COMMUNICATIONS VIA THIS PATH SHALL BE
INITIATED EXCLUSIVELY BY A USER.

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. THE TCB SHALL BE ABLE TO AUDIT
THE IDENTIFIED EVENTS THAT MAY BE USED IN THE EXPLOITATION OF
COVERT STORAGE CHANNELS.

3.2.3 ASSURANCE

3.2.3.1 Operational Assurance

3.2.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
THE TCB SHALL MAINTAIN PROCESS ISOLATION THROUGH THE
PROVISION OF DISTINCT ADDRESS SPACES UNDER ITS CONTROL.
THE TCB SHALL BE INTERNALLY STRUCTURED INTO WELL-DEFINED
LARGELY INDEPENDENT MODULES. IT SHALL MAKE EFFECTIVE USE
OF AVAILABLE HARDWARE TO SEPARATE THOSE ELEMENTS THAT ARE
PROTECTION-CRITICAL FROM THOSE THAT ARE NOT. THE TCB
MODULES SHALL BE DESIGNED SUCH THAT THE PRINCIPLE OF LEAST
PRIVILEGE IS ENFORCED. FEATURES IN HARDWARE, SUCH AS
SEGMENTATION, SHALL BE USED TO SUPPORT LOGICALLY DISTINCT
STORAGE OBJECTS WITH SEPARATE ATTRIBUTES (NAMELY:
READABLE, WRITEABLE). THE USER INTERFACE TO THE TCB
SHALL BE COMPLETELY DEFINED AND ALL ELEMENTS OF THE TCB
IDENTIFIED.

3.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.2.3.1.3 Covert Channel Analysis

THE SYSTEM DEVELOPER SHALL CONDUCT A THOROUGH SEARCH FOR
COVERT STORAGE CHANNELS AND MAKE A DETERMINATION (EITHER
BY ACTUAL MEASUREMENT OR BY ENGINEERING ESTIMATION) OF
THE MAXIMUM BANDWIDTH OF EACH IDENTIFIED CHANNEL. (SEE
THE COVERT CHANNELS GUIDELINE SECTION.)
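
As a non-normative illustration of an engineering estimate (the governing guidance is the covert channels guideline in Part II, not this sketch): if a sender can drive a shared storage attribute through N distinguishable states and a receiver can observe it R times per second, the channel can carry at most about R times log2(N) bits per second. The figures below are hypothetical.

    import math

    def estimated_bandwidth_bps(states: int, samples_per_second: float) -> float:
        """Upper-bound estimate of a covert storage channel's bandwidth."""
        return samples_per_second * math.log2(states)

    # e.g., a two-state "resource exhausted / available" attribute that the
    # receiver can sample ten times per second:
    print(estimated_bandwidth_bps(states=2, samples_per_second=10.0))   # 10.0 bits/sec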

3.2.3.1.4 Trusted Facility Management

THE TCB SHALL SUPPORT SEPARATE OPERATOR AND ADMINISTRATOR
FUNCTIONS.

3.2.3.2 Life-Cycle Assurance

3.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall be:
to uncover all design and implementation flaws that would
permit a subject external to the TCB to read, change, or
delete data normally denied under the mandatory or
discretionary security policy enforced by the TCB; as well
as to assure that no subject (without authorization to do
so) is able to cause the TCB to enter a state such that it
is unable to respond to communications initiated by other
users. THE TCB SHALL BE FOUND RELATIVELY RESISTANT TO
PENETRATION. All discovered flaws shall be CORRECTED and
the TCB retested to demonstrate that they have been
eliminated and that new flaws have not been introduced.
TESTING SHALL DEMONSTRATE THAT THE TCB IMPLEMENTATION IS
CONSISTENT WITH THE DESCRIPTIVE TOP-LEVEL SPECIFICATION.
(See the Security Testing Guidelines.)

3.2.3.2.2 Design Specification and Verification

A FORMAL model of the security policy supported by the
TCB shall be maintained that is PROVEN consistent with
its axioms. A DESCRIPTIVE TOP-LEVEL SPECIFICATION (DTLS)
OF THE TCB SHALL BE MAINTAINED THAT COMPLETELY AND
ACCURATELY DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR
MESSAGES, AND EFFECTS. IT SHALL BE SHOWN TO BE AN
ACCURATE DESCRIPTION OF THE TCB INTERFACE.

3.2.3.2.3 Configuration Management

DURING DEVELOPMENT AND MAINTENANCE OF THE TCB, A
CONFIGURATION MANAGEMENT SYSTEM SHALL BE IN PLACE THAT
MAINTAINS CONTROL OF CHANGES TO THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION, OTHER DESIGN DATA, IMPLEMENTATION
DOCUMENTATION, SOURCE CODE, THE RUNNING VERSION OF THE
OBJECT CODE, AND TEST FIXTURES AND DOCUMENTATION. THE
CONFIGURATION MANAGEMENT SYSTEM SHALL ASSURE A CONSISTENT
MAPPING AMONG ALL DOCUMENTATION AND CODE ASSOCIATED WITH
THE CURRENT VERSION OF THE TCB. TOOLS SHALL BE PROVIDED
FOR GENERATION OF A NEW VERSION OF THE TCB FROM SOURCE
CODE. ALSO AVAILABLE SHALL BE TOOLS FOR COMPARING A
NEWLY GENERATED VERSION WITH THE PREVIOUS TCB VERSION IN
ORDER TO ASCERTAIN THAT ONLY THE INTENDED CHANGES HAVE
BEEN MADE IN THE CODE THAT WILL ACTUALLY BE USED AS THE
NEW VERSION OF THE TCB.
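
As an illustration only, and not the tooling required by the criteria, the
Python sketch below shows one way such a comparison tool might work: every
file of a generated TCB is hashed into a manifest, and two manifests are
diffed so that only the intended changes should appear. The directory layout
and digest algorithm are assumptions.

    import hashlib
    import os

    def manifest(root):
        # hash every file under root into {relative path: SHA-256 digest}
        digests = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digests[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
        return digests

    def compare_versions(old_root, new_root):
        old, new = manifest(old_root), manifest(new_root)
        added   = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        changed = sorted(k for k in set(old) & set(new) if old[k] != new[k])
        return added, removed, changed

    # added, removed, changed = compare_versions("tcb-v1/", "tcb-v2/")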

3.2.4 DOCUMENTATION

3.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. THE TCB MODULES THAT CONTAIN
THE REFERENCE VALIDATION MECHANISM SHALL BE IDENTIFIED. THE
PROCEDURES FOR SECURE GENERATION OF A NEW TCB FROM SOURCE AFTER
MODIFICATION OF ANY MODULES IN THE TCB SHALL BE DESCRIBED.

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. IT SHALL INCLUDE RESULTS OF
TESTING THE EFFECTIVENESS OF THE METHODS USED TO REDUCE COVERT
CHANNEL BANDWIDTHS.

3.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. THE
interfaces between THE TCB modules shall be described. A
FORMAL description of the security policy model enforced by the
TCB shall be available and shall be PROVEN sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION (DTLS) SHALL BE SHOWN TO BE AN ACCURATE
DESCRIPTION OF THE TCB INTERFACE. DOCUMENTATION SHALL DESCRIBE
HOW THE TCB IMPLEMENTS THE REFERENCE MONITOR CONCEPT AND GIVE
AN EXPLANATION WHY IT IS TAMPERPROOF, CANNOT BE BYPASSED, AND
IS CORRECTLY IMPLEMENTED. DOCUMENTATION SHALL DESCRIBE HOW THE
TCB IS STRUCTURED TO FACILITATE TESTING AND TO ENFORCE LEAST
PRIVILEGE. THIS DOCUMENTATION SHALL ALSO PRESENT THE RESULTS
OF THE COVERT CHANNEL ANALYSIS AND THE TRADEOFFS INVOLVED IN
RESTRICTING THE CHANNELS. ALL AUDITABLE EVENTS THAT MAY BE
USED IN THE EXPLOITATION OF KNOWN COVERT STORAGE CHANNELS SHALL
BE IDENTIFIED. THE BANDWIDTHS OF KNOWN COVERT STORAGE CHANNELS,
THE USE OF WHICH IS NOT DETECTABLE BY THE AUDITING MECHANISMS,
SHALL BE PROVIDED. (SEE THE COVERT CHANNEL GUIDELINE SECTION.)

3.3 CLASS (B3): SECURITY DOMAINS

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration. The following are minimal requirements for systems assigned a
class (B3) rating:

3.3.1 SECURITY POLICY

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (E.G., ACCESS CONTROL LISTS) shall
allow users to specify and control sharing of those OBJECTS.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls shall
be capable of SPECIFYING, FOR EACH NAMED OBJECT, A LIST OF NAMED
INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS WITH THEIR
RESPECTIVE MODES OF ACCESS TO THAT OBJECT. FURTHERMORE, FOR
EACH SUCH NAMED OBJECT, IT SHALL BE POSSIBLE TO SPECIFY A LIST
OF NAMED INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS
FOR WHICH NO ACCESS TO THE OBJECT IS TO BE GIVEN. Access
permission to an object by users not already possessing access
permission shall only be assigned by authorized users.
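
A minimal sketch of such an access check is given below, assuming (as one
common reading of the requirement) that an explicit no-access entry for an
individual or group overrides any grant. The data layout and names are
illustrative, not drawn from the criteria.

    def dac_check(acl, user, user_groups, mode):
        # acl: {"deny_users": set, "deny_groups": set,
        #       "user_modes": {user: set of modes}, "group_modes": {group: set of modes}}
        if user in acl["deny_users"] or user_groups & acl["deny_groups"]:
            return False                       # explicit no-access entries win
        if mode in acl["user_modes"].get(user, set()):
            return True
        return any(mode in acl["group_modes"].get(g, set()) for g in user_groups)

    payroll_acl = {"deny_users": {"guest"}, "deny_groups": set(),
                   "user_modes": {"alice": {"read", "write"}},
                   "group_modes": {"auditors": {"read"}}}
    print(dac_check(payroll_acl, "bob", {"auditors"}, "read"))    # True
    print(dac_check(payroll_acl, "guest", {"auditors"}, "read"))  # False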

3.3.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.
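
The toy allocator below illustrates the intent of this requirement: storage
is scrubbed before it is handed to a new subject, so no residue from the
previous owner is observable. It is a sketch, not a TCB interface.

    class StoragePool:
        def __init__(self, n_objects, size):
            self.free = [bytearray(size) for _ in range(n_objects)]

        def allocate(self):
            obj = self.free.pop()
            for i in range(len(obj)):   # scrub residual data before reuse
                obj[i] = 0
            return obj

        def release(self, obj):
            self.free.append(obj)       # scrubbed again on the next allocate

    pool = StoragePool(n_objects=4, size=16)
    block = pool.allocate()
    block[:5] = b"top s"                # previous owner writes data
    pool.release(block)
    print(pool.allocate())              # returned scrubbed: sixteen zero bytes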

3.3.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.3.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.3.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.3.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

3.3.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________
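
The footnote above amounts to a simple composition rule for the printed
label: take the highest hierarchical classification present and exactly the
union of the non-hierarchical categories present. The Python sketch below
applies that rule to the contents of one page; the classification names and
their ordering are illustrative assumptions.

    LEVEL_ORDER = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def page_label(items):
        # items: non-empty list of (classification, set of categories) pairs
        top = max((c for c, _ in items), key=LEVEL_ORDER.get)
        categories = set()
        for _, cats in items:
            categories |= cats
        return top, categories

    print(page_label([("SECRET", {"NOFORN"}), ("CONFIDENTIAL", {"CRYPTO"})]))
    # e.g. ('SECRET', {'NOFORN', 'CRYPTO'})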

3.3.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

3.3.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

3.3.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.
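
Stated as code, the two rules above are the familiar dominance checks: read
is allowed only if the subject's level dominates the object's, and write only
if the object's level dominates the subject's. The sketch below is
illustrative; the ordering of classifications is an assumption.

    RANK = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        # a, b: (hierarchical classification, frozenset of non-hierarchical categories)
        return RANK[a[0]] >= RANK[b[0]] and a[1] >= b[1]

    def may_read(subject_level, object_level):
        return dominates(subject_level, object_level)   # "read down"

    def may_write(subject_level, object_level):
        return dominates(object_level, subject_level)   # "write up"

    s = ("SECRET", frozenset({"NATO"}))
    o = ("CONFIDENTIAL", frozenset())
    print(may_read(s, o), may_write(s, o))              # True False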

3.3.2 ACCOUNTABILITY

3.3.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.3.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and USERS for USE WHEN A POSITIVE TCB-TO-
USER CONNECTION IS REQUIRED (E.G., LOGIN, CHANGE SUBJECT
SECURITY LEVEL). Communications via this TRUSTED path
shall be ACTIVATED exclusively by a user OR THE TCB AND
SHALL BE LOGICALLY ISOLATED AND UNMISTAKABLY
DISTINGUISHABLE FROM OTHER PATHS.

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object and the object’s security level.
The ADP system administrator shall be able to selectively audit
the actions of any one or more users based on individual
identity and/or object security level. The TCB shall be able to
audit the identified events that may be used in the exploitation
of covert storage channels. THE TCB SHALL CONTAIN A MECHANISM
THAT IS ABLE TO MONITOR THE OCCURRENCE OR ACCUMULATION OF
SECURITY AUDITABLE EVENTS THAT MAY INDICATE AN IMMINENT
VIOLATION OF SECURITY POLICY. THIS MECHANISM SHALL BE ABLE TO
IMMEDIATELY NOTIFY THE SECURITY ADMINISTRATOR WHEN THRESHOLDS
ARE EXCEEDED.
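
The fragment below sketches, in Python, an audit record carrying the fields
listed above together with a threshold monitor of the kind this class adds.
The threshold value, the events counted, and the notification mechanism are
illustrative assumptions, not requirements of the criteria.

    import datetime

    AUDIT_LOG = []

    def audit(user, event, success, origin=None, obj=None, obj_level=None):
        AUDIT_LOG.append({
            "time": datetime.datetime.now().isoformat(),
            "user": user, "event": event, "success": success,
            "origin": origin, "object": obj, "object_level": obj_level,
        })

    class ThresholdMonitor:
        # counts failed logins per user and alerts when a threshold is crossed
        def __init__(self, threshold=3):
            self.threshold = threshold
            self.failures = {}

        def observe(self, record):
            if record["event"] == "login" and not record["success"]:
                n = self.failures.get(record["user"], 0) + 1
                self.failures[record["user"]] = n
                if n >= self.threshold:
                    # stand-in for immediate notification of the security administrator
                    print(f"ALERT: {record['user']} has {n} failed logins")

    monitor = ThresholdMonitor()
    for _ in range(3):
        audit("mallory", "login", success=False, origin="tty07")
        monitor.observe(AUDIT_LOG[-1])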

3.3.3 ASSURANCE

3.3.3.1 Operational Assurance

3.3.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB shall
be completely defined and all elements of the TCB
identified. THE TCB SHALL BE DESIGNED AND STRUCTURED TO
USE A COMPLETE, CONCEPTUALLY SIMPLE PROTECTION MECHANISM
WITH PRECISELY DEFINED SEMANTICS. THIS MECHANISM SHALL
PLAY A CENTRAL ROLE IN ENFORCING THE INTERNAL STRUCTURING
OF THE TCB AND THE SYSTEM. THE TCB SHALL INCORPORATE
SIGNIFICANT USE OF LAYERING, ABSTRACTION AND DATA HIDING.
SIGNIFICANT SYSTEM ENGINEERING SHALL BE DIRECTED TOWARD
MINIMIZING THE COMPLEXITY OF THE TCB AND EXCLUDING FROM
THE TCB MODULES THAT ARE NOT PROTECTION-CRITICAL.

3.3.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.3.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.)

3.3.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. THE FUNCTIONS PERFORMED IN THE ROLE OF A
SECURITY ADMINISTRATOR SHALL BE IDENTIFIED. THE ADP
SYSTEM ADMINISTRATIVE PERSONNEL SHALL ONLY BE ABLE TO
PERFORM SECURITY ADMINISTRATOR FUNCTIONS AFTER TAKING A
DISTINCT AUDITABLE ACTION TO ASSUME THE SECURITY
ADMINISTRATOR ROLE ON THE ADP SYSTEM. NON-SECURITY
FUNCTIONS THAT CAN BE PERFORMED IN THE SECURITY
ADMINISTRATION ROLE SHALL BE LIMITED STRICTLY TO THOSE
ESSENTIAL TO PERFORMING THE SECURITY ROLE EFFECTIVELY.
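
One way to realize the rule above is to gate every security-administrator
function on a prior, explicitly audited role-assumption step, as in the
Python sketch below; the function names and the audit call are illustrative
assumptions.

    def audit(user, event, success=True):
        print(f"AUDIT {user}: {event} (success={success})")   # stand-in for the audit trail

    class AdminSession:
        def __init__(self, user):
            self.user = user
            self.security_admin = False

        def assume_security_admin_role(self):
            # the distinct, auditable action required before security functions
            audit(self.user, "assume security administrator role")
            self.security_admin = True

        def change_user_clearance(self, target, level):
            if not self.security_admin:
                raise PermissionError("security administrator role not assumed")
            audit(self.user, f"change clearance of {target} to {level}")

    s = AdminSession("operator1")
    s.assume_security_admin_role()
    s.change_user_clearance("jones", "SECRET")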

3.3.3.1.5 Trusted Recovery

PROCEDURES AND/OR MECHANISMS SHALL BE PROVIDED TO ASSURE
THAT, AFTER AN ADP SYSTEM FAILURE OR OTHER DISCONTINUITY,
RECOVERY WITHOUT A PROTECTION COMPROMISE IS OBTAINED.

3.3.3.2 Life-Cycle Assurance

3.3.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be FOUND RESISTANT TO penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the descriptive
top-level specification. (See the Security Testing
Guidelines.) NO DESIGN FLAWS AND NO MORE THAN A FEW
CORRECTABLE IMPLEMENTATION FLAWS MAY BE FOUND DURING
TESTING AND THERE SHALL BE REASONABLE CONFIDENCE THAT
FEW REMAIN.

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. It shall be shown to be an
accurate description of the TCB interface. A CONVINCING
ARGUMENT SHALL BE GIVEN THAT THE DTLS IS CONSISTENT WITH
THE MODEL.

3.3.3.2.3 Configuration Management

During development and maintenance of the TCB, a
configuration management system shall be in place that
maintains control of changes to the descriptive top-level
specification, other design data, implementation
documentation, source code, the running version of the
object code, and test fixtures and documentation. The
configuration management system shall assure a consistent
mapping among all documentation and code associated with
the current version of the TCB. Tools shall be provided
for generation of a new version of the TCB from source
code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in
order to ascertain that only the intended changes have
been made in the code that will actually be used as the
new version of the TCB.

3.3.4 DOCUMENTATION

3.3.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.3.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. IT
SHALL INCLUDE THE PROCEDURES TO ENSURE THAT THE SYSTEM IS
INITIALLY STARTED IN A SECURE MANNER. PROCEDURES SHALL ALSO BE
INCLUDED TO RESUME SECURE SYSTEM OPERATION AFTER ANY LAPSE IN
SYSTEM OPERATION.

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths.

3.3.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and shall be proven sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. THE TCB IMPLEMENTATION (I.E., IN
HARDWARE, FIRMWARE, AND SOFTWARE) SHALL BE INFORMALLY SHOWN TO
BE CONSISTENT WITH THE DTLS. THE ELEMENTS OF THE DTLS SHALL BE
SHOWN, USING INFORMAL TECHNIQUES, TO CORRESPOND TO THE ELEMENTS
OF THE TCB. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)

4.0 DIVISION A: VERIFIED PROTECTION

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

4.1 CLASS (A1): VERIFIED DESIGN

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design.
Independent of the particular specification language or verification system
used, there are five important criteria for class (A1) design verification:

* A formal model of the security policy must be clearly
identified and documented, including a mathematical proof
that the model is consistent with its axioms and is
sufficient to support the security policy.

* An FTLS must be produced that includes abstract definitions
of the functions the TCB performs and of the hardware and/or
firmware mechanisms that are used to support separate
execution domains.

* The FTLS of the TCB must be shown to be consistent with the
model by formal techniques where possible (i.e., where
verification tools exist) and informal ones otherwise.

* The TCB implementation (i.e., in hardware, firmware, and
software) must be informally shown to be consistent with the
FTLS. The elements of the FTLS must be shown, using
informal techniques, to correspond to the elements of the
TCB. The FTLS must express the unified protection mechanism
required to satisfy the security policy, and it is the
elements of this protection mechanism that are mapped to the
elements of the TCB.

* Formal analysis techniques must be used to identify and
analyze covert channels. Informal techniques may be used to
identify covert timing channels. The continued existence of
identified covert channels in the system must be justified.

In keeping with the extensive design and development analysis of the TCB
required of systems in class (A1), more stringent configuration management is
required and procedures are established for securely distributing the system
to sites. A system security administrator is supported.

The following are minimal requirements for systems assigned a class (A1)
rating:

4.1.1 SECURITY POLICY

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls
shall be capable of specifying, for each named object, a list
of named individuals and a list of groups of named individuals
with their respective modes of access to that object.
Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of
named individuals for which no access to the object is to be
given. Access permission to an object by users not already
possessing access permission shall only be assigned by
authorized users.

4.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

4.1.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

4.1.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

4.1.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

4.1.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

4.1.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

____________________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________________

4.1.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

4.1.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

4.1.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.

4.1.2 ACCOUNTABILITY

4.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

4.1.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and users for use when a positive TCB-to-
user connection is required (e.g., login, change subject
security level). Communications via this trusted path
shall be activated exclusively by a user or the TCB and
shall be logically isolated and unmistakably
distinguishable from other paths.

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. The TCB shall be able to audit
the identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able to
immediately notify the security administrator when thresholds
are exceeded.

4.1.3 ASSURANCE

4.1.3.1 Operational Assurance

4.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB
shall be completely defined and all elements of the TCB
identified. The TCB shall be designed and structured to
use a complete, conceptually simple protection mechanism
with precisely defined semantics. This mechanism shall
play a central role in enforcing the internal structuring
of the TCB and the system. The TCB shall incorporate
significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward
minimizing the complexity of the TCB and excluding from
the TCB modules that are not protection-critical.

4.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

4.1.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.) FORMAL METHODS SHALL
BE USED IN THE ANALYSIS.

4.1.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. The functions performed in the role of a
security administrator shall be identified. The ADP
system administrative personnel shall only be able to
perform security administrator functions after taking a
distinct auditable action to assume the security
administrator role on the ADP system. Non-security
functions that can be performed in the security
administration role shall be limited strictly to those
essential to performing the security role effectively.

4.1.3.1.5 Trusted Recovery

Procedures and/or mechanisms shall be provided to assure
that, after an ADP system failure or other discontinuity,
recovery without a protection compromise is obtained.

4.1.3.2 Life-Cycle Assurance

4.1.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be found resistant to penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the FORMAL top-
level specification. (See the Security Testing
Guidelines.) No design flaws and no more than a few
correctable implementation flaws may be found during
testing and there shall be reasonable confidence that few
remain. MANUAL OR OTHER MAPPING OF THE FTLS TO THE
SOURCE CODE MAY FORM A BASIS FOR PENETRATION TESTING.

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. A FORMAL TOP-LEVEL SPECIFICATION
(FTLS) OF THE TCB SHALL BE MAINTAINED THAT ACCURATELY
DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR MESSAGES,
AND EFFECTS. THE DTLS AND FTLS SHALL INCLUDE THOSE
COMPONENTS OF THE TCB THAT ARE IMPLEMENTED AS HARDWARE
AND/OR FIRMWARE IF THEIR PROPERTIES ARE VISIBLE AT THE
TCB INTERFACE. THE FTLS shall be shown to be an accurate
description of the TCB interface. A convincing argument
shall be given that the DTLS is consistent with the model
AND A COMBINATION OF FORMAL AND INFORMAL TECHNIQUES SHALL
BE USED TO SHOW THAT THE FTLS IS CONSISTENT WITH THE
MODEL. THIS VERIFICATION EVIDENCE SHALL BE CONSISTENT
WITH THAT PROVIDED WITHIN THE STATE-OF-THE-ART OF THE
PARTICULAR COMPUTER SECURITY CENTER-ENDORSED FORMAL
SPECIFICATION AND VERIFICATION SYSTEM USED. MANUAL OR
OTHER MAPPING OF THE FTLS TO THE TCB SOURCE CODE SHALL BE
PERFORMED TO PROVIDE EVIDENCE OF CORRECT IMPLEMENTATION.

4.1.3.2.3 Configuration Management

During THE ENTIRE LIFE-CYCLE, I.E., DURING THE DESIGN,
DEVELOPMENT, and maintenance of the TCB, a configuration
management system shall be in place FOR ALL SECURITY-
RELEVANT HARDWARE, FIRMWARE, AND SOFTWARE that maintains
control of changes to THE FORMAL MODEL, the descriptive
AND FORMAL top-level SPECIFICATIONS, other design data,
implementation documentation, source code, the running
version of the object code, and test fixtures and
documentation. The configuration management system shall
assure a consistent mapping among all documentation and
code associated with the current version of the TCB.
Tools shall be provided for generation of a new version
of the TCB from source code. Also available shall be
tools, MAINTAINED UNDER STRICT CONFIGURATION CONTROL, for
comparing a newly generated version with the previous TCB
version in order to ascertain that only the intended
changes have been made in the code that will actually be
used as the new version of the TCB. A COMBINATION OF
TECHNICAL, PHYSICAL, AND PROCEDURAL SAFEGUARDS SHALL BE
USED TO PROTECT FROM UNAUTHORIZED MODIFICATION OR
DESTRUCTION THE MASTER COPY OR COPIES OF ALL MATERIAL
USED TO GENERATE THE TCB.

4.1.3.2.4 Trusted Distribution

A TRUSTED ADP SYSTEM CONTROL AND DISTRIBUTION FACILITY
SHALL BE PROVIDED FOR MAINTAINING THE INTEGRITY OF THE
MAPPING BETWEEN THE MASTER DATA DESCRIBING THE CURRENT
VERSION OF THE TCB AND THE ON-SITE MASTER COPY OF THE
CODE FOR THE CURRENT VERSION. PROCEDURES (E.G., SITE
SECURITY ACCEPTANCE TESTING) SHALL EXIST FOR ASSURING
THAT THE TCB SOFTWARE, FIRMWARE, AND HARDWARE UPDATES
DISTRIBUTED TO A CUSTOMER ARE EXACTLY AS SPECIFIED BY
THE MASTER COPIES.
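
As a purely illustrative sketch of a site acceptance test, a received update
can be compared, file by file, against digests of the master copies supplied
out of band by the control and distribution facility. The file names, digest
algorithm, and placeholder digest values below are assumptions.

    import hashlib

    # digests of the master copies, obtained out of band (placeholder values)
    MASTER_DIGESTS = {
        "tcb/kernel.img": "placeholder-digest-1",
        "tcb/trusted.db": "placeholder-digest-2",
    }

    def accept_update(received):
        # received: mapping of path -> bytes as delivered to the site
        for path, expected in MASTER_DIGESTS.items():
            actual = hashlib.sha256(received[path]).hexdigest()
            if actual != expected:
                raise ValueError(f"{path} does not match the master copy")
        return True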

4.1.4 DOCUMENTATION

4.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

4.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. It
shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in
system operation.

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths. THE RESULTS OF THE MAPPING BETWEEN THE
FORMAL TOP-LEVEL SPECIFICATION AND THE TCB SOURCE CODE SHALL BE
GIVEN.

4.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and shall be proven sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. The TCB implementation (i.e., in
hardware, firmware, and software) shall be informally shown to
be consistent with the FORMAL TOP-LEVEL SPECIFICATION (FTLS).
The elements of the FTLS shall be shown, using informal
techniques, to correspond to the elements of the TCB.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)
HARDWARE, FIRMWARE, AND SOFTWARE MECHANISMS NOT DEALT WITH IN
THE FTLS BUT STRICTLY INTERNAL TO THE TCB (E.G., MAPPING
REGISTERS, DIRECT MEMORY ACCESS I/O) SHALL BE CLEARLY DESCRIBED.

4.2 BEYOND CLASS (A1)

Most of the security enhancements envisioned for systems that will provide
features and assurance in addition to that already provided by class (A1)
systems are beyond current technology. The discussion below is intended to
guide future work and is derived from research and development activities
already underway in both the public and private sectors. As more and better
analysis techniques are developed, the requirements for these systems will
become more explicit. In the future, use of formal verification will be
extended to the source level and covert timing channels will be more fully
addressed. At this level the design environment will become important and
testing will be aided by analysis of the formal top-level specification.
Consideration will be given to the correctness of the tools used in TCB
development (e.g., compilers, assemblers, loaders) and to the correct
functioning of the hardware/firmware on which the TCB will run. Areas to be
addressed by systems beyond class (A1) include:

* System Architecture

A demonstration (formal or otherwise) must be given showing
that requirements of self-protection and completeness for
reference monitors have been implemented in the TCB.

* Security Testing

Although beyond the current state-of-the-art, it is
envisioned that some test-case generation will be done
automatically from the formal top-level specification or
formal lower-level specifications.

* Formal Specification and Verification

The TCB must be verified down to the source code level,
using formal verification methods where feasible. Formal
verification of the source code of the security-relevant
portions of an operating system has proven to be a difficult
task. Two important considerations are the choice of a
high-level language whose semantics can be fully and
formally expressed, and a careful mapping, through
successive stages, of the abstract formal design to a
formalization of the implementation in low-level
specifications. Experience has shown that only when the
lowest level specifications closely correspond to the actual
code can code proofs be successfully accomplished.

* Trusted Design Environment

The TCB must be designed in a trusted facility with only
trusted (cleared) personnel.

PART II: RATIONALE AND GUIDELINES

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS

The criteria are divided within each class into groups of requirements. These
groupings were developed to assure that three basic control objectives for
computer security are satisfied and not overlooked. These control objectives
deal with:

* Security Policy
* Accountability
* Assurance

This section provides a discussion of these general control objectives and
their implication in terms of designing trusted systems.

5.1 A Need for Consensus

A major goal of the DoD Computer Security Center is to encourage the Computer
Industry to develop trusted computer systems and products, making them widely
available in the commercial market place. Achievement of this goal will
require recognition and articulation by both the public and private sectors of
a need and demand for such products.

As described in the introduction to this document, efforts to define the
problems and develop solutions associated with processing nationally sensitive
information, as well as other sensitive data such as financial, medical, and
personnel information used by the National Security Establishment, have been
underway for a number of years. The criteria, as described in Part I,
represent the culmination of these efforts and describe basic requirements for
building trusted computer systems. To date, however, these systems have been
viewed by many as only satisfying National Security needs. As long as this
perception continues the consensus needed to motivate manufacture of trusted
systems will be lacking.

The purpose of this section is to describe, in some detail, the fundamental
control objectives that lay the foundations for requirements delineated in the
criteria. The goal is to explain the foundations so that those outside the
National Security Establishment can assess their universality and, by
extension, the universal applicability of the criteria requirements to
processing all types of sensitive applications whether they be for National
Security or the private sector.

5.2 Definition and Usefulness

The term “control objective” refers to a statement of intent with respect to
control over some aspect of an organization’s resources, or processes, or
both. In terms of a computer system, control objectives provide a framework
for developing a strategy for fulfilling a set of security requirements for
any given system. Developed in response to generic vulnerabilities, such as
the need to manage and handle sensitive data in order to prevent compromise,
or the need to provide accountability in order to detect fraud, control
objectives have been identified as a useful method of expressing security
goals.[3]

Examples of control objectives include the three basic design requirements for
implementing the reference monitor concept discussed in Section 6. They are:

* The reference validation mechanism must be tamperproof.

* The reference validation mechanism must always be invoked.

* The reference validation mechanism must be small enough to be
subjected to analysis and tests, the completeness of which can
be assured.[1]

5.3 Criteria Control Objectives

The three basic control objectives of the criteria are concerned with security
policy, accountability, and assurance. The remainder of this section provides
a discussion of these basic requirements.

5.3.1 Security Policy

In the most general sense, computer security is concerned with
controlling the way in which a computer can be used, i.e.,
controlling how information processed by it can be accessed and
manipulated. However, on closer examination, computer security
can refer to a number of areas. Symptomatic of this, FIPS PUB 39,
Glossary For Computer Systems Security, does not have a unique
definition for computer security.[16] Instead there are eleven
separate definitions for security which include: ADP systems
security, administrative security, data security, etc. A common
thread running through these definitions is the word “protection.”
Further declarations of protection requirements can be found in
DoD Directive 5200.28 which describes an acceptable level of
protection for classified data to be one that will “assure that
systems which process, store, or use classified data and produce
classified information will, with reasonable dependability,
prevent: a. Deliberate or inadvertent access to classified
material by unauthorized persons, and b. Unauthorized
manipulation of the computer and its associated peripheral
devices.”[8]

In summary, protection requirements must be defined in terms of
the perceived threats, risks, and goals of an organization. This
is often stated in terms of a security policy. It has been
pointed out in the literature that it is external laws, rules,
regulations, etc. that establish what access to information is to
be permitted, independent of the use of a computer. In particular,
a given system can only be said to be secure with respect to its
enforcement of some specific policy.[30] Thus, the control
objective for security policy is:

SECURITY POLICY CONTROL OBJECTIVE

A STATEMENT OF INTENT WITH REGARD TO CONTROL OVER ACCESS TO AND
DISSEMINATION OF INFORMATION, TO BE KNOWN AS THE SECURITY POLICY,
MUST BE PRECISELY DEFINED AND IMPLEMENTED FOR EACH SYSTEM THAT IS
USED TO PROCESS SENSITIVE INFORMATION. THE SECURITY POLICY MUST
ACCURATELY REFLECT THE LAWS, REGULATIONS, AND GENERAL POLICIES
FROM WHICH IT IS DERIVED.

5.3.1.1 Mandatory Security Policy

Where a security policy is developed that is to be applied
to control of classified or other specifically designated
sensitive information, the policy must include detailed
rules on how to handle that information throughout its
life-cycle. These rules are a function of the various
sensitivity designations that the information can assume
and the various forms of access supported by the system.
Mandatory security refers to the enforcement of a set of
access control rules that constrains a subject’s access to
information on the basis of a comparison of that
individual’s clearance/authorization to the information,
the classification/sensitivity designation of the
information, and the form of access being mediated.
Mandatory policies either require or can be satisfied by
systems that can enforce a partial ordering of
designations, namely, the designations must form what is
mathematically known as a “lattice.”[5]

A clear implication of the above is that the system must
assure that the designations associated with sensitive data
cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate authorization to
access sensitive information. Also implied is the
requirement that the system control the flow of information
so that data cannot be placed in storage objects bearing
lower sensitivity designations unless its “downgrading” has
been authorized.
The control objective is:

MANDATORY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SPECIFICALLY CATEGORIZED
SENSITIVE INFORMATION MUST INCLUDE PROVISIONS FOR THE
ENFORCEMENT OF MANDATORY ACCESS CONTROL RULES. THAT IS,
THEY MUST INCLUDE A SET OF RULES FOR CONTROLLING ACCESS
BASED DIRECTLY ON A COMPARISON OF THE INDIVIDUAL’S
CLEARANCE OR AUTHORIZATION FOR THE INFORMATION AND THE
CLASSIFICATION OR SENSITIVITY DESIGNATION OF THE
INFORMATION BEING SOUGHT, AND INDIRECTLY ON CONSIDERATIONS
OF PHYSICAL AND OTHER ENVIRONMENTAL FACTORS OF CONTROL.
THE MANDATORY ACCESS CONTROL RULES MUST ACCURATELY REFLECT
THE LAWS, REGULATIONS, AND GENERAL POLICIES FROM WHICH
THEY ARE DERIVED.

5.3.1.2 Discretionary Security Policy

Discretionary security is the principal type of access
control available in computer systems today. The basis of
this kind of security is that an individual user, or
program operating on his behalf, is allowed to specify
explicitly the types of access other users may have to
information under his control. Discretionary security
differs from mandatory security in that it implements an
access control policy on the basis of an individual’s
need-to-know as opposed to mandatory controls which are
driven by the classification or sensitivity designation of
the information.
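
A minimal sketch of this kind of owner-specified control is given below
in Python. It is illustrative only and not part of the criteria; the
user names, file name, and access modes are invented for the example.
It simply records, for each object, which users its owner has chosen to
grant which modes of access, and checks a requested access against
those grants.

    # Illustrative sketch only: a discretionary access control list in
    # which the owner of an object grants access on a need-to-know basis.
    acl = {
        "project-report.txt": {
            "alice": {"read", "write"},  # owner keeps full access
            "bob": {"read"},             # read granted on a need-to-know basis
        }
    }

    def dac_permits(user: str, obj: str, mode: str) -> bool:
        # Discretionary check: has this mode been explicitly granted to this user?
        return mode in acl.get(obj, {}).get(user, set())

    print(dac_permits("bob", "project-report.txt", "read"))   # True
    print(dac_permits("bob", "project-report.txt", "write"))  # False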

Discretionary controls are not a replacement for mandatory
controls. In an environment in which information is
classified (as in the DoD) discretionary security provides
for a finer granularity of control within the overall
constraints of the mandatory policy. Access to classified
information requires effective implementation of both types
of controls as a precondition to granting that access. In
general, no person may have access to classified
information unless: (a) that person has been determined to
be trustworthy, i.e., granted a personnel security
clearance — MANDATORY, and (b) access is necessary for the
performance of official duties, i.e., determined to have a
need-to-know — DISCRETIONARY. In other words,
discretionary controls give individuals discretion to
decide on which of the permissible accesses will actually
be allowed to which users, consistent with overriding
mandatory policy restrictions. The control objective is:

DISCRETIONARY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SENSITIVE INFORMATION MUST
INCLUDE PROVISIONS FOR THE ENFORCEMENT OF DISCRETIONARY
ACCESS CONTROL RULES. THAT IS, THEY MUST INCLUDE A
CONSISTENT SET OF RULES FOR CONTROLLING AND LIMITING ACCESS
BASED ON IDENTIFIED INDIVIDUALS WHO HAVE BEEN DETERMINED TO
HAVE A NEED-TO-KNOW FOR THE INFORMATION.

5.3.1.3 Marking

To implement a set of mechanisms that will put into effect
a mandatory security policy, it is necessary that the
system mark information with appropriate classification or
sensitivity labels and maintain these markings as the
information moves through the system. Once information is
unalterably and accurately marked, comparisons required by
the mandatory access control rules can be accurately and
consistently made. An additional benefit of having the
system maintain the classification or sensitivity label
internally is the ability to automatically generate
properly “labeled” output. The labels, if accurately and
integrally maintained by the system, remain accurate when
output from the system. The control objective is:

MARKING CONTROL OBJECTIVE

SYSTEMS THAT ARE DESIGNED TO ENFORCE A MANDATORY SECURITY
POLICY MUST STORE AND PRESERVE THE INTEGRITY OF
CLASSIFICATION OR OTHER SENSITIVITY LABELS FOR ALL
INFORMATION. LABELS EXPORTED FROM THE SYSTEM MUST BE
ACCURATE REPRESENTATIONS OF THE CORRESPONDING INTERNAL
SENSITIVITY LABELS BEING EXPORTED.

5.3.2 Accountability

The second basic control objective addresses one of the
fundamental principles of security, i.e., individual
accountability. Individual accountability is the key to securing
and controlling any system that processes information on behalf
of individuals or groups of individuals. A number of requirements
must be met in order to satisfy this objective.

The first requirement is for individual user identification.
Second, there is a need for authentication of the identification.
Identification is functionally dependent on authentication.
Without authentication, user identification has no credibility.
Without a credible identity, neither mandatory nor discretionary
security policies can be properly invoked because there is no
assurance that proper authorizations can be made.

The third requirement is for dependable audit capabilities. That
is, a trusted computer system must provide authorized personnel
with the ability to audit any action that can potentially cause
access to, generation of, or release of classified or
sensitive information. The audit data will be selectively
acquired based on the auditing needs of a particular installation
and/or application. However, there must be sufficient granularity
in the audit data to support tracing the auditable events to a
specific individual who has taken the actions or on whose behalf
the actions were taken. The control objective is:

ACCOUNTABILITY CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST ASSURE INDIVIDUAL ACCOUNTABILITY
WHENEVER EITHER A MANDATORY OR DISCRETIONARY SECURITY POLICY IS
INVOKED. FURTHERMORE, TO ASSURE ACCOUNTABILITY THE CAPABILITY
MUST EXIST FOR AN AUTHORIZED AND COMPETENT AGENT TO ACCESS AND
EVALUATE ACCOUNTABILITY INFORMATION BY A SECURE MEANS, WITHIN A
REASONABLE AMOUNT OF TIME, AND WITHOUT UNDUE DIFFICULTY.
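
As an illustration of the granularity called for above, the sketch
below (Python, illustrative only; the field names are assumptions)
builds a single audit record that ties an auditable event both to the
authenticated individual who took the action and to the individual on
whose behalf it was taken. A real trusted system would write such
records to storage protected by the TCB.

    # Illustrative sketch only: one audit record with enough detail to
    # trace an auditable event to a specific individual.
    import json
    import time

    def audit_record(user_id: str, on_behalf_of: str,
                     event: str, obj: str, outcome: str) -> str:
        return json.dumps({
            "timestamp": time.time(),
            "user_id": user_id,            # authenticated individual taking the action
            "on_behalf_of": on_behalf_of,  # individual on whose behalf it was taken
            "event": event,                # e.g., login, file open, downgrade
            "object": obj,
            "outcome": outcome,            # success or failure
        })

    print(audit_record("alice", "alice", "open", "plans.doc", "success"))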

5.3.3 Assurance

The third basic control objective is concerned with guaranteeing
or providing confidence that the security policy has been
implemented correctly and that the protection-relevant elements of
the system do, indeed, accurately mediate and enforce the intent
of that policy. By extension, assurance must include a guarantee
that the trusted portion of the system works only as intended. To
accomplish these objectives, two types of assurance are needed.
They are life-cycle assurance and operational assurance.

Life-cycle assurance refers to steps taken by an organization to
ensure that the system is designed, developed, and maintained
using formalized and rigorous controls and standards.[17]
Computer systems that process and store sensitive or classified
information depend on the hardware and software to protect that
information. It follows that the hardware and software themselves
must be protected against unauthorized changes that could cause
protection mechanisms to malfunction or be bypassed completely.
For this reason trusted computer systems must be carefully
evaluated and tested during the design and development phases and
reevaluated whenever changes are made that could affect the
integrity of the protection mechanisms. Only in this way can
confidence be provided that the hardware and software
interpretation of the security policy is maintained accurately
and without distortion.

While life-cycle assurance is concerned with procedures for
managing system design, development, and maintenance, operational
assurance focuses on features and system architecture used to
ensure that the security policy is uncircumventably enforced
during system operation. That is, the security policy must be
integrated into the hardware and software protection features of
the system. Examples of steps taken to provide this kind of
confidence include: methods for testing the operational hardware
and software for correct operation, isolation of protection-
critical code, and the use of hardware and software to provide
distinct domains. The control objective is:

ASSURANCE CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST BE DESIGNED TO GUARANTEE CORRECT AND
ACCURATE INTERPRETATION OF THE SECURITY POLICY AND MUST NOT
DISTORT THE INTENT OF THAT POLICY. ASSURANCE MUST BE PROVIDED
THAT CORRECT IMPLEMENTATION AND OPERATION OF THE POLICY EXISTS
THROUGHOUT THE SYSTEM’S LIFE-CYCLE.

6.0 RATIONALE BEHIND THE EVALUATION CLASSES

6.1 The Reference Monitor Concept

In October of 1972, the Computer Security Technology Planning Study, conducted
by James P. Anderson & Co., produced a report for the Electronic Systems
Division (ESD) of the United States Air Force.[1] In that report, the concept
of “a reference monitor which enforces the authorized access relationships
between subjects and objects of a system” was introduced. The reference
monitor concept was found to be an essential element of any system that would
provide multilevel secure computing facilities and controls.

The Anderson report went on to define the reference validation mechanism as
“an implementation of the reference monitor concept . . . that validates
each reference to data or programs by any user (program) against a list of
authorized types of reference for that user.” It then listed the three design
requirements that must be met by a reference validation mechanism:

a. The reference validation mechanism must be tamper proof.

b. The reference validation mechanism must always be invoked.

c. The reference validation mechanism must be small enough to be
subject to analysis and tests, the completeness of which can
be assured.[1]

Extensive peer review and continuing research and development activities have
sustained the validity of the Anderson Committee’s findings. Early examples
of the reference validation mechanism were known as security kernels. The
Anderson Report described the security kernel as “that combination of hardware
and software which implements the reference monitor concept.”[1] In this vein,
it will be noted that the security kernel must support the three reference
monitor requirements listed above.
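
The shape of a reference validation mechanism can be illustrated with a
very small sketch. The Python fragment below is not a security kernel
and is not part of the criteria; the subject, object, and
reference-type names are invented. It simply shows a single routine
through which every reference is mediated against a list of authorized
types of reference for each subject.

    # Illustrative sketch only: every reference by a subject to an object
    # is validated against a list of authorized types of reference.
    authorized = {
        # (subject, object) -> permitted reference types
        ("editor-process", "draft.txt"): {"read", "write"},
        ("print-spooler", "draft.txt"): {"read"},
    }

    def reference_monitor(subject: str, obj: str, reference: str) -> None:
        # In a real system this check is always invoked and is tamper proof.
        if reference not in authorized.get((subject, obj), set()):
            raise PermissionError(f"{subject} may not {reference} {obj}")

    reference_monitor("print-spooler", "draft.txt", "read")   # permitted
    # reference_monitor("print-spooler", "draft.txt", "write") raises PermissionError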

6.2 A Formal Security Policy Model

Following the publication of the Anderson report, considerable research was
initiated into formal models of security policy requirements and of the
mechanisms that would implement and enforce those policy models as a security
kernel. Prominent among these efforts was the ESD-sponsored development of
the Bell and LaPadula model, an abstract formal treatment of DoD security
policy.[2] Using mathematics and set theory, the model precisely defines the
notion of secure state, fundamental modes of access, and the rules for
granting subjects specific modes of access to objects. Finally, a theorem is
proven to demonstrate that the rules are security-preserving operations, so
that the application of any sequence of the rules to a system that is in a
secure state will result in the system entering a new state that is also
secure. This theorem is known as the Basic Security Theorem.

The Bell and LaPadula model defines a relationship between clearances of
subjects and classifications of system objects, now referenced as the
“dominance relation.” From this definition, accesses permitted between
subjects and objects are explicitly defined for the fundamental modes of
access, including read-only access, read/write access, and write-only access.
The model defines the Simple Security Condition to control granting a subject
read access to a specific object, and the *-Property (read “Star Property”) to
control granting a subject write access to a specific object. Both the Simple
Security Condition and the *-Property include mandatory security provisions
based on the dominance relation between the clearance of the subject and the
classification of the object. The Discretionary Security Property is also
defined, and requires that a specific subject be authorized for the particular
mode of access required for the state transition. In its treatment of
subjects (processes acting on behalf of a user), the model distinguishes
between trusted subjects (i.e., not constrained within the model by the
*-Property) and untrusted subjects (those that are constrained by the
*-Property).
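
The mandatory portion of these two properties can be sketched briefly.
The Python fragment below is illustrative only: it ignores the
discretionary check, trusted subjects, and the distinction between a
subject’s maximum clearance and current level, and its level numbers
and category names are assumptions. It expresses the Simple Security
Condition (“no read up”) and the *-Property (“no write down”) in terms
of the dominance relation.

    # Illustrative sketch only: labels are (level, categories) pairs.
    def dominates(a, b) -> bool:
        # a dominates b if a's level is >= b's and a's categories include b's.
        (level_a, cats_a), (level_b, cats_b) = a, b
        return level_a >= level_b and cats_a >= cats_b

    def simple_security_permits_read(subject_level, object_label) -> bool:
        # Simple Security Condition: the subject must dominate the object.
        return dominates(subject_level, object_label)

    def star_property_permits_write(subject_level, object_label) -> bool:
        # *-Property (untrusted subjects): the object must dominate the
        # subject, so information cannot be written down to a lower label.
        return dominates(object_label, subject_level)

    secret = (2, frozenset({"NATO"}))
    confidential = (1, frozenset())
    print(simple_security_permits_read(secret, confidential))  # True: read down allowed
    print(star_property_permits_write(secret, confidential))   # False: write down blocked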

From the Bell and LaPadula model there evolved a model of the method of proof
required to formally demonstrate that all arbitrary sequences of state
transitions are security-preserving. It was also shown that the *-Property
is sufficient to prevent the compromise of information by Trojan Horse
attacks.

6.3 The Trusted Computing Base

In order to encourage the widespread commercial availability of trusted
computer systems, these evaluation criteria have been designed to address
those systems in which a security kernel is specifically implemented as well
as those in which a security kernel has not been implemented. The latter case
includes those systems in which objective (c) is not fully supported because
of the size or complexity of the reference validation mechanism. For
convenience, these evaluation criteria use the term Trusted Computing Base to
refer to the reference validation mechanism, be it a security kernel,
front-end security filter, or the entire trusted computer system.

The heart of a trusted computer system is the Trusted Computing Base (TCB)
which contains all of the elements of the system responsible for supporting
the security policy and supporting the isolation of objects (code and data) on
which the protection is based. The bounds of the TCB equate to the “security
perimeter” referenced in some computer security literature. In the interest
of understandable and maintainable protection, a TCB should be as simple as
possible consistent with the functions it has to perform. Thus, the TCB
includes hardware, firmware, and software critical to protection and must be
designed and implemented such that system elements excluded from it need not
be trusted to maintain protection. Identification of the interface and
elements of the TCB along with their correct functionality therefore forms the
basis for evaluation.

For general-purpose systems, the TCB will include key elements of the
operating system and may include all of the operating system. For embedded
systems, the security policy may deal with objects in a way that is meaningful
at the application level rather than at the operating system level. Thus, the
protection policy may be enforced in the application software rather than in
the underlying operating system. The TCB will necessarily include all those
portions of the operating system and application software essential to the
support of the policy. Note that, as the amount of code in the TCB increases,
it becomes harder to be confident that the TCB enforces the reference monitor
requirements under all circumstances.

6.4 Assurance

The third reference monitor design objective is currently interpreted as
meaning that the TCB “must be of sufficiently simple organization and
complexity to be subjected to analysis and tests, the completeness of which
can be assured.”

Clearly, as the perceived degree of risk increases (e.g., the range of
sensitivity of the system’s protected data, along with the range of clearances
held by the system’s user population) for a particular system’s operational
application and environment, so also must the assurances be increased to
substantiate the degree of trust that will be placed in the system. The
hierarchy of requirements that are presented for the evaluation classes in the
trusted computer system evaluation criteria reflects the need for these
assurances.

As discussed in Section 5.3, the evaluation criteria uniformly require a
statement of the security policy that is enforced by each trusted computer
system. In addition, it is required that a convincing argument be presented
that explains why the TCB satisfies the first two design requirements for a
reference monitor. It is not expected that this argument will be entirely
formal. This argument is required for each candidate system in order to
satisfy the assurance control objective.

The systems to which security enforcement mechanisms have been added, rather
than built in as fundamental design objectives, are not readily amenable to
extensive analysis since they lack the requisite conceptual simplicity of a
security kernel. This is because their TCB extends to cover much of the
entire system. Hence, their degree of trustworthiness can best be ascertained
only by obtaining test results. Since no test procedure for something as
complex as a computer system can be truly exhaustive, there is always the
possibility that a subsequent penetration attempt could succeed. It is for
this reason that such systems must fall into the lower evaluation classes.

On the other hand, those systems that are designed and engineered to support
the TCB concepts are more amenable to analysis and structured testing. Formal
methods can be used to analyze the correctness of their reference validation
mechanisms in enforcing the system’s security policy. Other methods,
including less-formal arguments, can be used in order to substantiate claims
for the completeness of their access mediation and their degree of
tamper-resistance. More confidence can be placed in the results of this
analysis and in the thoroughness of the structured testing than can be placed
in the results for less methodically structured systems. For these reasons,
it appears reasonable to conclude that these systems could be used in
higher-risk environments. Successful implementations of such systems would be
placed in the higher evaluation classes.

6.5 The Classes

It is highly desirable that there be only a small number of overall evaluation
classes. Three major divisions have been identified in the evaluation
criteria with a fourth division reserved for those systems that have been
evaluated and found to offer unacceptable security protection. Within each
major evaluation division, it was found that “intermediate” classes of trusted
system design and development could meaningfully be defined. These
intermediate classes have been designated in the criteria because they
identify systems that:

* are viewed to offer significantly better protection and assurance
than would systems that satisfy the basic requirements for their
evaluation class; and

* can reasonably be expected to evolve, eventually, to satisfy the
requirements for the next higher evaluation class.

Except within division A it is not anticipated that additional “intermediate”
evaluation classes satisfying the two characteristics described above will be
identified.

Distinctions in terms of system architecture, security policy enforcement, and
evidence of credibility between evaluation classes have been defined such that
the “jump” between evaluation classes would require a considerable investment
of effort on the part of implementors. Correspondingly, there are expected to
be significant differentials of risk to which systems from the higher
evaluation classes will be exposed.

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security requirements and Section 5
presents the control objectives for Trusted Computer Systems. They are
general requirements, useful and necessary, for the development of all secure
systems. However, when designing systems that will be used to process
classified or other sensitive information, functional requirements for meeting
the Control Objectives become more specific. There is a large body of policy
laid down in the form of Regulations, Directives, Presidential Executive
Orders, and OMB Circulars that form the basis of the procedures for the
handling and processing of Federal information in general and classified
information specifically. This section presents pertinent excerpts from these
policy statements and discusses their relationship to the Control Objectives.

7.1 Established Federal Policies

A significant number of computer security policies and associated requirements
have been promulgated by Federal government elements. The interested reader
is referred to reference [32] which analyzes the need for trusted systems in
the civilian agencies of the Federal government, as well as in state and local
governments and in the private sector. This reference also details a number
of relevant Federal statutes, policies and requirements not treated further
below.

Security guidance for Federal automated information systems is provided by the
Office of Management and Budget. Two specifically applicable Circulars have
been issued. OMB Circular No. A-71, Transmittal Memorandum No. 1, “Security
of Federal Automated Information Systems,”[26] directs each executive agency
to establish and maintain a computer security program. It makes the head of
each executive branch department and agency responsible “for assuring an
adequate level of security for all agency data whether processed in-house or
commercially. This includes responsibility for the establishment of physical,
administrative and technical safeguards required to adequately protect
personal, proprietary or other sensitive data not subject to national security
regulations, as well as national security data.”[26, para. 4 p. 2]

OMB Circular No. A-123, “Internal Control Systems,”[27] issued to help
eliminate fraud, waste, and abuse in government programs, requires: (a) agency
heads to issue internal control directives and assign responsibility, (b)
managers to review programs for vulnerability, and (c) managers to perform
periodic reviews to evaluate strengths and update controls. Soon after
promulgation of OMB Circular A-123, the relationship of its internal control
requirements to building secure computer systems was recognized.[4] While not
stipulating computer controls specifically, the definition of Internal
Controls in A-123 makes it clear that computer systems are to be included:

“Internal Controls – The plan of organization and all of the methods and
measures adopted within an agency to safeguard its resources, assure the
accuracy and reliability of its information, assure adherence to
applicable laws, regulations and policies, and promote operational
economy and efficiency.”[27, sec. 4.C]

The matter of classified national security information processed by ADP
systems was one of the first areas given serious and extensive concern in
computer security. The computer security policy documents promulgated as a
result contain generally more specific and structured requirements than most,
keyed in turn to an authoritative basis that itself provides a rather clearly
articulated and structured information security policy. This basis, Executive
Order 12356, “National Security Information,” sets forth requirements for the
classification, declassification and safeguarding of “national security
information” per se.[14]

7.2 DoD Policies

Within the Department of Defense, these broad requirements are implemented and
further specified primarily through two vehicles: 1) DoD Regulation 5200.1-R
[7], which applies to all components of the DoD as such, and 2) DoD 5220.22-M,
“Industrial Security Manual for Safeguarding Classified Information” [11],
which applies to contractors included within the Defense Industrial Security
Program. Note that the latter transcends DoD as such, since it applies not
only to any contractors handling classified information for any DoD component,
but also to the contractors of eighteen other Federal organizations for whom
the Secretary of Defense is authorized to act in rendering industrial security
services.*

____________________________________________________________
* i.e., NASA, Commerce Department, GSA, State Department,
Small Business Administration, National Science Foundation,
Treasury Department, Transportation Department, Interior
Department, Agriculture Department, Health and Human
Services Department, Labor Department, Environmental
Protection Agency, Justice Department, U.S. Arms Control and
Disarmament Agency, Federal Emergency Management Agency,
Federal Reserve System, and U.S. General Accounting Office.
____________________________________________________________

For ADP systems, these information security requirements are further amplified
and specified in: 1) DoD Directive 5200.28 [8] and DoD Manual 5200.28-M [9],
for DoD components; and 2) Section XIII of DoD 5220.22-M [11] for contractors.
DoD Directive 5200.28, “Security Requirements for Automatic Data Processing
(ADP) Systems,” stipulates: “Classified material contained in an ADP system
shall be safeguarded by the continuous employment of protective features in
the system’s hardware and software design and configuration . . . .”[8,
sec. IV] Furthermore, it is required that ADP systems that “process, store,
or use classified data and produce classified information will, with
reasonable dependability, prevent:

a. Deliberate or inadvertent access to classified material by
unauthorized persons, and

b. Unauthorized manipulation of the computer and its associated
peripheral devices.”[8, sec. I B.3]

Requirements equivalent to these appear within DoD 5200.28-M [9] and in DoD
5220.22-M [11].

From requirements imposed by these regulations, directives and circulars, the
three components of the Security Policy Control Objective, i.e., Mandatory and
Discretionary Security and Marking, as well as the Accountability and
Assurance Control Objectives, can be functionally defined for DoD
applications. The following discussion provides further specificity in Policy
for these Control Objectives.

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking is: “Systems that are designed
to enforce a mandatory security policy must store and preserve the
integrity of classification or other sensitivity labels for all
information. Labels exported from the system must be accurate
representations of the corresponding internal sensitivity labels
being exported.”

DoD 5220.22-M, “Industrial Security Manual for Safeguarding
Classified Information,” explains in paragraph 11 the reasons for
marking information:

“Designation by physical marking, notation or other means
serves to inform and to warn the holder about the
classification designation of the information which requires
protection in the interest of national security. The degree
of protection against unauthorized disclosure which will be
required for a particular level of classification is directly
commensurate with the marking designation which is assigned
to the material.”[11]

Marking requirements are given in a number of policy statements.

Executive Order 12356 (Sections 1.5.a and 1.5.a.1) requires that
classification markings “shall be shown on the face of all
classified documents, or clearly associated with other forms of
classified information in a manner appropriate to the medium
involved.”[14]

DoD Regulation 5200.1-R (Section 1-500) requires that: “. . .
information or material that requires protection against
unauthorized disclosure in the interest of national security shall
be classified in one of three designations, namely: ‘Top Secret,’
‘Secret’ or ‘Confidential.’”[7] (By extension, for use in computer
processing, the unofficial designation “Unclassified” is used to
indicate information that does not fall under one of the other
three designations of classified information.)

DoD Regulation 5200.1-R (Section 4-304b) requires that: “ADP
systems and word processing systems employing such media shall
provide for internal classification marking to assure that
classified information contained therein that is reproduced or
generated, will bear applicable classification and associated
markings.” (This regulation provides for the exemption of certain
existing systems where “internal classification and applicable
associated markings cannot be implemented without extensive system
modifications.”[7] However, it is clear that future DoD ADP
systems must be able to provide applicable and accurate labels for
classified and other sensitive information.)

DoD Manual 5200.28-M (Section IV, 4-305d) requires the following:
“Security Labels – All classified material accessible by or within
the ADP system shall be identified as to its security
classification and access or dissemination limitations, and all
output of the ADP system shall be appropriately marked.”[9]

7.3.2 Mandatory Security

The control objective for mandatory security is: “Security
policies defined for systems that are used to process classified
or other specifically categorized sensitive information must
include provisions for the enforcement of mandatory access control
rules. That is, they must include a set of rules for controlling
access based directly on a comparison of the individual’s
clearance or authorization for the information and the
classification or sensitivity designation of the information being
sought, and indirectly on considerations of physical and other
environmental factors of control. The mandatory access control
rules must accurately reflect the laws, regulations, and general
policies from which they are derived.”

There are a number of policy statements that are related to
mandatory security.

Executive Order 12356 (Section 4.1.a) states that “a person is
eligible for access to classified information provided that a
determination of trustworthiness has been made by agency heads or
designated officials and provided that such access is essential
to the accomplishment of lawful and authorized Government
purposes.”[14]

DoD Regulation 5200.1-R (Chapter I, Section 3) defines a Special
Access Program as “any program imposing ‘need-to-know’ or access
controls beyond those normally provided for access to
Confidential, Secret, or Top Secret information. Such a program
includes, but is not limited to, special clearance, adjudication,
or investigative requirements, special designation of officials
authorized to determine ‘need-to-know’, or special lists of persons
determined to have a ‘need-to-know.’”[7, para. 1-328] This
passage distinguishes between a ‘discretionary’ determination of
need-to-know and formal need-to-know which is implemented through
Special Access Programs. DoD Regulation 5200.1-R, paragraph 7-100
describes general requirements for trustworthiness (clearance) and
need-to-know, and states that the individual with possession,
knowledge or control of classified information has final
responsibility for determining if conditions for access have been
met. This regulation further stipulates that “no one has a right
to have access to classified information solely by virtue of rank
or position.”[7, para. 7-100]

DoD Manual 5200.28-M (Section II 2-100) states that, “Personnel
who develop, test (debug), maintain, or use programs which are
classified or which will be used to access or develop classified
material shall have a personnel security clearance and an access
authorization (need-to-know), as appropriate for the highest
classified and most restrictive category of classified material
which they will access under system constraints.”[9]

DoD Manual 5220.22-M (Paragraph 3.a) defines access as “the
ability and opportunity to obtain knowledge of classified
information. An individual, in fact, may have access to
classified information by being in a place where such information
is kept, if the security measures which are in force do not
prevent him from gaining knowledge of the classified
information.”[11]

The above-mentioned Executive Order, Manual, Directives and
Regulations clearly imply that a trusted computer system must
assure that the classification labels associated with sensitive
data cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate clearance to access
classified information. Also implied is the requirement that a
trusted computer system must control the flow of information so
that data from a higher classification cannot be placed in a
storage object of lower classification unless its “downgrading”
has been authorized.

7.3.3 Discretionary Security

The term discretionary security refers to a computer system’s
ability to control information on an individual basis. It stems
from the fact that even though an individual has all the formal
clearances for access to specific classified information, each
individual’s access to information must be based on a demonstrated
need-to-know. Because of this, it must be made clear that this
requirement is not discretionary in a “take it or leave it” sense.
The directives and regulations are explicit in stating that the
need-to-know test must be satisfied before access can be granted
to the classified information. The control objective for
discretionary security is: “Security policies defined for systems
that are used to process classified or other sensitive information
must include provisions for the enforcement of discretionary
access control rules. That is, they must include a consistent set
of rules for controlling and limiting access based on identified
individuals who have been determined to have a need-to-know for the
information.”

DoD Regulation 5200.1-R (Paragraph 7-100): In addition to excerpts
already provided that touch on need-to-know, this section of the
regulation stresses the need-to-know principle when it states “no
person may have access to classified information unless . . .
access is necessary for the performance of official duties.”[7]

Also, DoD Manual 5220.22-M (Section III 20.a) states that “an
individual shall be permitted to have access to classified
information only . . . when the contractor determines that access
is necessary in the performance of tasks or services essential to
the fulfillment of a contract or program, i.e., the individual has
a need-to-know.”[11]

7.4 Criteria Control Objective for Accountability

The control objective for accountability is: “Systems that are used to
process or handle classified or other sensitive information must assure
individual accountability whenever either a mandatory or discretionary
security policy is invoked. Furthermore, to assure accountability the
capability must exist for an authorized and competent agent to access and
evaluate accountability information by a secure means, within a reasonable
amount of time, and without undue difficulty.”

This control objective is supported by the following citations:

DoD Directive 5200.28 (VI.A.1) states: “Each user’s identity shall be
positively established, and his access to the system, and his activity in
the system (including material accessed and actions taken) controlled and
open to scrutiny.”[8]

DoD Manual 5200.28-M (Section V 5-100) states: “An audit log or file
(manual, machine, or a combination of both) shall be maintained as a
history of the use of the ADP System to permit a regular security review
of system activity. (e.g., The log should record security related
transactions, including each access to a classified file and the nature
of the access, e.g., logins, production of accountable classified
outputs, and creation of new classified files. Each classified file
successfully accessed [regardless of the number of individual references]
during each ‘job’ or ‘interactive session’ should also be recorded in the
audit log. Much of the material in this log may also be required to
assure that the system preserves information entrusted to it.)”[9]

DoD Manual 5200.28-M (Section IV 4-305f) states: “Where needed to assure
control of access and individual accountability, each user or specific
group of users shall be identified to the ADP System by appropriate
administrative or hardware/software measures. Such identification
measures must be in sufficient detail to enable the ADP System to provide
the user only that material which he is authorized.”[9]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(4) Maintenance of documentation on operating systems (O/S)
and all modifications thereto, and its retention for a
sufficient period of time to enable tracing of security-
related defects to their point of origin or inclusion in the
system.

. . . . . . . . . . . . . . . . .

(6) Establishment of procedures to discover, recover,
handle, and dispose of classified material improperly
disclosed through system malfunction or personnel action.

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.”[9]

DoD Manual 5220.22-M (Section XIII 111) states: “Audit Trails

a. The general security requirement for any ADP system audit
trail is that it provide a documented history of the use of
the system. An approved audit trail will permit review of
classified system activity and will provide a detailed
activity record to facilitate reconstruction of events to
determine the magnitude of compromise (if any) should a
security malfunction occur. To fulfill this basic
requirement, audit trail systems, manual, automated or a
combination of both must document significant events
occurring in the following areas of concern: (i) preparation
of input data and dissemination of output data (i.e.,
reportable interactivity between users and system support
personnel), (ii) activity involved within an ADP environment
(e.g., ADP support personnel modification of security and
related controls), and (iii) internal machine activity.

b. The audit trail for an ADP system approved to process
classified information must be based on the above three
areas and may be stylized to the particular system. All
systems approved for classified processing should contain
most if not all of the audit trail records listed below. The
contractor’s SPP documentation must identify and describe
those applicable:

1. Personnel access;

2. Unauthorized and surreptitious entry into the
central computer facility or remote terminal areas;

3. Start/stop time of classified processing indicating
pertinent systems security initiation and termination events
(e.g., upgrading/downgrading actions pursuant to paragraph
107);

4. All functions initiated by ADP system console
operators;

5. Disconnects of remote terminals and peripheral
devices (paragraph 107c);

6. Log-on and log-off user activity;

7. Unauthorized attempts to access files or programs,
as well as all open, close, create, and file destroy
actions;

8. Program aborts and anomalies including
identification information (i.e., user/program name, time
and location of incident, etc.);

9. System hardware additions, deletions and maintenance
actions;

10. Generations and modifications affecting the
security features of the system software.

c. The ADP system security supervisor or designee shall
review the audit trail logs at least weekly to assure that
all pertinent activity is properly recorded and that
appropriate action has been taken to correct any anomaly.
The majority of ADP systems in use today can develop audit
trail systems in accord with the above; however, special
systems such as weapons, communications, communications
security, and tactical data exchange and display systems,
may not be able to comply with all aspects of the above and
may require individualized consideration by the cognizant
security office.

d. Audit trail records shall be retained for a period of one
inspection cycle.”[11]

7.5 Criteria Control Objective for Assurance

The control objective for assurance is: “Systems that are used to process
or handle classified or other sensitive information must be designed to
guarantee correct and accurate interpretation of the security policy and
must not distort the intent of that policy. Assurance must be provided
that correct implementation and operation of the policy exists throughout
the system’s life-cycle.”

A basis for this objective can be found in the following sections of DoD
Directive 5200.28:

DoD Directive 5200.28 (IV.B.1) stipulates: “Generally, security of an ADP
system is most effective and economical if the system is designed
originally to provide it. Each Department of Defense Component
undertaking design of an ADP system which is expected to process, store,
use, or produce classified material shall: From the beginning of the
design process, consider the security policies, concepts, and measures
prescribed in this Directive.”[8]

DoD Directive 5200.28 (IV.C.5.a) states: “Provision may be made to permit
adjustment of ADP system area controls to the level of protection
required for the classification category and type(s) of material actually
being handled by the system, provided change procedures are developed and
implemented which will prevent both the unauthorized access to classified
material handled by the system and the unauthorized manipulation of the
system and its components. Particular attention shall be given to the
continuous protection of automated system security measures, techniques
and procedures when the personnel security clearance level of users
having access to the system changes.”[8]

DoD Directive 5200.28 (VI.A.2) states: “Environmental Control. The ADP
System shall be externally protected to minimize the likelihood of
unauthorized access to system entry points, access to classified
information in the system, or damage to the system.”[8]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(5) Supervision, monitoring, and testing, as appropriate, of
changes in an approved ADP System which could affect the
security features of the system, so that a secure system is
maintained.

. . . . . . . . . . . . . . . . .

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.

(8) Conduct of competent system ST&E, timely review of
system ST&E reports, and correction of deficiencies needed
to support conditional or final approval or disapproval of
an ADP System for the processing of classified information.

(9) Establishment, where appropriate, of a central ST&E
coordination point for the maintenance of records of
selected techniques, procedures, standards, and tests used
in the testing and evaluation of security features of ADP
Systems which may be suitable for validation and use by
other Department of Defense Components.”[9]

DoD Manual 5220.22-M (Section XIII 103a) requires: “the initial approval,
in writing, of the cognizant security office prior to processing any
classified information in an ADP system. This section requires
reapproval by the cognizant security office for major system
modifications made subsequent to initial approval. Reapprovals will be
required because of (i) major changes in personnel access requirements,
(ii) relocation or structural modification of the central computer
facility, (iii) additions, deletions or changes to main frame, storage or
input/output devices, (iv) system software changes impacting security
protection features, (v) any change in clearance, declassification, audit
trail or hardware/software maintenance procedures, and (vi) other system
changes as determined by the cognizant security office.”[11]

A major component of assurance, life-cycle assurance, is concerned with
testing ADP systems both in the development phase as well as during
operation. DoD Directive 5215.1 (Section F.2.C.(2)) requires
“evaluations of selected industry and government-developed trusted
computer systems against these criteria.”[10]

8.0 A GUIDELINE ON COVERT CHANNELS

A covert channel is any communication channel that can be exploited by a
process to transfer information in a manner that violates the system’s
security policy. There are two types of covert channels: storage channels and
timing channels. Covert storage channels include all vehicles that would
allow the direct or indirect writing of a storage location by one process and
the direct or indirect reading of it by another. Covert timing channels
include all vehicles that would allow one process to signal information to
another process by modulating its own use of system resources in such a way
that the change in response time observed by the second process would provide
information.
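
As a deliberately simplified illustration of the storage-channel case,
the following Python sketch (not part of this guideline; all names are
invented for the example) shows how a sender that is forbidden to write
data readable by a receiver can still leak bits by modulating an
attribute of a shared resource that the receiver is permitted to
observe.

    # Illustrative sketch only: a covert storage channel signaled through a
    # shared attribute (standing in for, e.g., a disk-full or lock status).
    shared_attribute = False  # something both processes can influence or observe

    def sender_transmit_bit(bit: int) -> None:
        global shared_attribute
        # The sender never writes data directly; it modulates the attribute.
        shared_attribute = bool(bit)

    def receiver_read_bit() -> int:
        # The receiver infers the bit from the attribute it may observe.
        return int(shared_attribute)

    message, received = [1, 0, 1, 1], []
    for bit in message:
        sender_transmit_bit(bit)
        received.append(receiver_read_bit())
    print(received)  # [1, 0, 1, 1]: leaked without any direct write or read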

From a security perspective, covert channels with low bandwidths represent a
lower threat than those with high bandwidths. However, for many types of
covert channels, techniques used to reduce the bandwidth below a certain rate
(which depends on the specific channel mechanism and the system architecture)
also have the effect of degrading the performance provided to legitimate
system users. Hence, a trade-off between system performance and covert
channel bandwidth must be made. Because of the threat of compromise that
would be present in any multilevel computer system containing classified or
sensitive information, such systems should not contain covert channels with
high bandwidths. This guideline is intended to provide system developers with
an idea of just how high a “high” covert channel bandwidth is.

A covert channel bandwidth that exceeds a rate of one hundred (100) bits per
second is considered “high” because 100 bits per second is the approximate
rate at which many computer terminals are run. It does not seem appropriate
to call a computer system “secure” if information can be compromised at a rate
equal to the normal output rate of some commonly used device.

In any multilevel computer system there are a number of relatively
low-bandwidth covert channels whose existence is deeply ingrained in the
system design. Faced with the large potential cost of reducing the bandwidths
of such covert channels, it is felt that those with maximum bandwidths of less
than one (1) bit per second are acceptable in most application environments.
Though maintaining acceptable performance in some systems may make it
impractical to eliminate all covert channels with bandwidths of 1 or more bits
per second, it is possible to audit their use without adversely affecting
system performance. This audit capability provides the system administration
with a means of detecting — and procedurally correcting — significant
compromise. Therefore, a Trusted Computing Base should provide, wherever
possible, the capability to audit the use of covert channel mechanisms with
bandwidths that may exceed a rate of one (1) bit in ten (10) seconds.
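
The practical significance of these rates can be seen with a small
worked example. The Python fragment below is illustrative only; the
5,000-character document size is an assumption. It computes how long
exfiltration of such a document would take at the three bandwidths
discussed above.

    # Illustrative arithmetic only: time to leak an assumed 5,000-character
    # (40,000-bit) document over covert channels of different bandwidths.
    document_bits = 5000 * 8

    for name, bps in [("high (100 b/s)", 100.0),
                      ("acceptable ceiling (1 b/s)", 1.0),
                      ("audit threshold (0.1 b/s)", 0.1)]:
        seconds = document_bits / bps
        print(f"{name}: {seconds:,.0f} seconds (~{seconds / 3600:.1f} hours)")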

The covert channel problem has been addressed by a number of authors. The
interested reader is referred to references [5], [6], [19], [21], [22], [23],
and [29].

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement includes a capability to support an
unspecified number of hierarchical classifications and an unspecified number
of non-hierarchical categories at each hierarchical level. To encourage
consistency and portability in the design and development of the National
Security Establishment trusted computer systems, it is desirable for all such
systems to be able to support a minimum number of levels and categories. The
following suggestions are provided for this purpose:

* The number of hierarchical classifications should be greater than or
equal to eight (8).

* The number of non-hierarchical categories should be greater than or
equal to twenty-nine (29).
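
One compact encoding that would satisfy these suggested minimums is
sketched below in Python. The packing format and function names are
assumptions and are illustrative only: a 3-bit hierarchical level
accommodates eight classifications, and a 32-bit category mask
accommodates well over twenty-nine non-hierarchical categories.

    # Illustrative sketch only: a label packed as a 3-bit level plus a
    # 32-bit non-hierarchical category mask.
    def pack_label(level: int, category_mask: int) -> int:
        assert 0 <= level < 8              # at least eight hierarchical levels
        assert 0 <= category_mask < 2**32  # room for at least twenty-nine categories
        return (level << 32) | category_mask

    def dominates(a: int, b: int) -> bool:
        level_a, cats_a = a >> 32, a & 0xFFFFFFFF
        level_b, cats_b = b >> 32, b & 0xFFFFFFFF
        # Higher-or-equal level and a superset of category bits.
        return level_a >= level_b and (cats_a & cats_b) == cats_b

    high = pack_label(7, 0b101)  # level 7, categories 0 and 2
    low = pack_label(2, 0b001)   # level 2, category 0
    print(dominates(high, low))  # True
    print(dominates(low, high))  # False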

10.0 A GUIDELINE ON SECURITY TESTING

These guidelines are provided to give an indication of the extent and
sophistication of testing undertaken by the DoD Computer Security Center
during the Formal Product Evaluation process. Organizations wishing to use
“Department of Defense Trusted Computer System Evaluation Criteria” for
performing their own evaluations may find this section useful for planning
purposes.

As in Part I, highlighting is used to indicate changes in the guidelines from
the next lower division.

10.1 Testing for Division C

10.1.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent. Team members shall be able to follow test plans
prepared by the system developer and suggest additions, shall
be familiar with the “flaw hypothesis” or equivalent security
testing methodology, and shall have assembly level programming
experience. Before testing begins, the team members shall have
functional knowledge of, and shall have completed the system
developer’s internals course for, the system being evaluated.

10.1.2 Testing

The team shall have “hands-on” involvement in an independent run
of the tests used by the system developer. The team shall
independently design and implement at least five system-specific
tests in an attempt to circumvent the security mechanisms of the
system. The elapsed time devoted to testing shall be at least
one month and need not exceed three months. There shall be no
fewer than twenty hands-on hours spent carrying out system
developer-defined tests and test team-defined tests.

10.2 Testing for Division B

10.2.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent and at least one individual with a master’s degree in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall have previously
completed a security test on another system.

10.2.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least fifteen system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least two months and need not exceed four months.
There shall be no fewer than thirty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

10.3 Testing for Division A

10.3.1 Personnel

The security testing team shall consist of at least one
individual with a bachelor’s degree in Computer Science or the
equivalent and at least two individuals with master’s degrees in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall be familiar enough
with the system hardware to understand the maintenance diagnostic
programs and supporting hardware documentation. At least two
team members shall have previously completed a security test on
another system. At least one team member shall have
demonstrated system level programming competence on the system
under test to a level of complexity equivalent to adding a device
driver to the system.

10.3.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least twenty-five system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least three months and need not exceed six months.
There shall be no fewer than fifty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

APPENDIX A

Commercial Product Evaluation Process

“Department of Defense Trusted Computer System Evaluation Criteria” forms the
basis upon which the Computer Security Center will carry out the commercial
computer security evaluation process. This process is focused on commercially
produced and supported general-purpose operating system products that meet the
needs of government departments and agencies. The formal evaluation is aimed
at “off-the-shelf” commercially supported products and is completely divorced
from any consideration of overall system performance, potential applications,
or particular processing environments. The evaluation provides a key input to
a computer system security approval/accreditation. However, it does not
constitute a complete computer system security evaluation. A complete study
(e.g., as in reference [18]) must consider additional factors dealing with the
system in its unique environment, such as its proposed security mode of
operation, specific users, applications, data sensitivity, physical and
personnel security, administrative and procedural security, TEMPEST, and
communications security.

The product evaluation process carried out by the Computer Security Center has
three distinct elements:

* Preliminary Product Evaluation – An informal dialogue between a vendor
and the Center in which technical information is exchanged to create a
common understanding of the vendor’s product, the criteria, and the
rating that may be expected to result from a formal product evaluation.

* Formal Product Evaluation – A formal evaluation, by the Center, of a
product that is available to the DoD, and that results in that product
and its assigned rating being placed on the Evaluated Products List.

* Evaluated Products List – A list of products that have been subjected
to formal product evaluation and their assigned ratings.

PRELIMINARY PRODUCT EVALUATION

Since it is generally very difficult to add effective security measures late
in a product’s life cycle, the Center is interested in working with system
vendors in the early stages of product design. A preliminary product
evaluation allows the Center to consult with computer vendors on computer
security issues found in products that have not yet been formally announced.

A preliminary evaluation is typically initiated by computer system vendors who
are planning new computer products that feature security or major
security-related upgrades to existing products. After an initial meeting
between the vendor and the Center, appropriate non-disclosure agreements are
executed that require the Center to maintain the confidentiality of any
proprietary information disclosed to it. Technical exchange meetings follow
in which the vendor provides details about the proposed product (particularly
its internal designs and goals) and the Center provides expert feedback to the
vendor on potential computer security strengths and weaknesses of the vendor’s
design choices, as well as relevant interpretation of the criteria. The
preliminary evaluation is typically terminated when the product is completed
and ready for field release by the vendor. Upon termination, the Center
prepares a wrap-up report for the vendor and for internal distribution within
the Center. Those reports containing proprietary information are not
available to the public.

During preliminary evaluation, the vendor is under no obligation to actually
complete or market the potential product. The Center is, likewise, not
committed to conduct a formal product evaluation. A preliminary evaluation
may be terminated by either the Center or the vendor when one notifies the
other, in writing, that it is no longer advantageous to continue the
evaluation.

FORMAL PRODUCT EVALUATION

The formal product evaluation provides a key input to certification of a
computer system for use in National Security Establishment applications and is
the sole basis for a product being placed on the Evaluated Products List.

A formal product evaluation begins with a request by a vendor for the Center
to evaluate a product for which the product itself and accompanying
documentation needed to meet the requirements defined by this publication are
complete. Non-disclosure agreements are executed and a formal product
evaluation team is formed by the Center. An initial meeting is then held with
the vendor to work out the schedule for the formal evaluation. Since testing
of the implemented product forms an important part of the evaluation process,
access by the evaluation team to a working version of the system is negotiated
with the vendor. Additional support required from the vendor includes
complete design documentation, source code, and access to vendor personnel who
can answer detailed questions about specific portions of the product. The
evaluation team tests the product against each requirement, making any
necessary interpretations of the criteria with respect to the product being
evaluated.

The evaluation team writes a two-part final report on their findings about the
system. The first part is publicly available (containing no proprietary
information) and contains the overall class rating assigned to the system and
the details of the evaluation team’s findings when comparing the product
against the evaluation criteria. The second part of the evaluation report
contains vulnerability analyses and other detailed information supporting the
rating decision. Since this part may contain proprietary or other sensitive
information it will be distributed only within the U.S. Government on a
strict need-to-know and non-disclosure basis, and to the vendor. No portion
of the evaluation results will be withheld from the vendor.

APPENDIX B

Summary of Evaluation Criteria Divisions

The divisions of systems recognized under the trusted computer system
evaluation criteria are as follows. Each division represents a major
improvement in the overall confidence one can place in the system to protect
classified and other sensitive information.

Division (D): Minimal Protection

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

Division (C): Discretionary Protection

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

Division (B): Mandatory Protection

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

Division (A): Verified Protection

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

APPENDIX C

Summary of Evaluation Criteria Classes

The classes of systems recognized under the trusted computer system evaluation
criteria are as follows. They are presented in the order of increasing
desirability from a computer security point of view.

Class (D): Minimal Protection

This class is reserved for those systems that have been evaluated but that
fail to meet the requirements for a higher evaluation class.

Class (C1): Discretionary Security Protection

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.

Class (C2): Controlled Access Protection

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.

Class (B1): Labeled Security Protection

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed.

Class (B2): Structured Protection

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration.

Class (B3): Security Domains

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration.

Class (A1): Verified Design

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design. In
keeping with the extensive design and development analysis of the TCB required
of systems in class (A1), more stringent configuration management is required
and procedures are established for securely distributing the system to sites.
A system security administrator is supported.

APPENDIX D

Requirement Directory

This appendix lists requirements defined in “Department of Defense Trusted
Computer System Evaluation Criteria” alphabetically rather than by class. It
is provided to assist in following the evolution of a requirement through the
classes. For each requirement, three types of criteria may be present. Each
will be preceded by the word: NEW, CHANGE, or ADD to indicate the following:

NEW: Any criteria appearing in a lower class are superseded
by the criteria that follow.

CHANGE: The criteria that follow have appeared in a lower class
but are changed for this class. Highlighting is used
to indicate the specific changes to previously stated
criteria.

ADD: The criteria that follow have not been required for any
lower class, and are added in this class to the
previously stated criteria for this requirement.

Abbreviations are used as follows:

NR: (No Requirement) This requirement is not included in
this class.

NAR: (No Additional Requirements) This requirement does not
change from the previous class.

The reader is referred to Part I of this document when placing new criteria
for a requirement into the complete context for that class.

Figure 1 provides a pictorial summary of the evolution of requirements through
the classes.

Audit

C1: NR.

C2: NEW: The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit trail of
accesses to the objects it protects. The audit data shall be
protected by the TCB so that read access to it is limited to those
who are authorized for audit data. The TCB shall be able to record
the following types of events: use of identification and
authentication mechanisms, introduction of objects into a user’s
address space (e.g., file open, program initiation), deletion of
objects, and actions taken by computer operators and system
administrators and/or system security officers. For each recorded
event, the audit record shall identify: date and time of the event,
user, type of event, and success or failure of the event. For
identification/authentication events the origin of request (e.g.,
terminal ID) shall be included in the audit record. For events that
introduce an object into a user’s address space and for object
deletion events the audit record shall include the name of the object.
The ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity.

B1: CHANGE: For events that introduce an object into a user’s address
space and for object deletion events the audit record shall include
the name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the actions
of any one or more users based on individual identity and/or object
security level.

ADD: The TCB shall also be able to audit any override of
human-readable output markings.

B2: ADD: The TCB shall be able to audit the identified events that may be
used in the exploitation of covert storage channels.

B3: ADD: The TCB shall contain a mechanism that is able to monitor the
occurrence or accumulation of security auditable events that may
indicate an imminent violation of security policy. This mechanism
shall be able to immediately notify the security administrator when
thresholds are exceeded.

A1: NAR.
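
The C2 criterion above enumerates the fields every audit record must carry and
the selective-audit capability required of the ADP system administrator; the
B1 CHANGE adds the object's security level. The following Python sketch models
one illustrative representation of such a record (the field names and types
are assumptions of the sketch, not part of the criteria):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class AuditRecord:
        timestamp: datetime                  # date and time of the event
        user: str                            # individual identity of the user
        event_type: str                      # e.g. "login", "file_open", "object_delete"
        success: bool                        # success or failure of the event
        origin: Optional[str] = None         # e.g. terminal ID, for I&A events
        object_name: Optional[str] = None    # for object introduction/deletion events
        object_level: Optional[str] = None   # added by the B1 CHANGE criterion

    def select_by_user(trail, users):
        # Selective audit of one or more users by individual identity (C2).
        return [record for record in trail if record.user in users]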

Configuration Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: During development and maintenance of the TCB, a configuration
management system shall be in place that maintains control of changes
to the descriptive top-level specification, other design data,
implementation documentation, source code, the running version of the
object code, and test fixtures and documentation. The configuration
management system shall assure a consistent mapping among all
documentation and code associated with the current version of the TCB.
Tools shall be provided for generation of a new version of the TCB
from source code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in order to
ascertain that only the intended changes have been made in the code
that will actually be used as the new version of the TCB.

B3: NAR.

A1: CHANGE: During the entire life-cycle, i.e., during the design,
development, and maintenance of the TCB, a configuration management
system shall be in place for all security-relevant hardware, firmware,
and software that maintains control of changes to the formal model,
the descriptive and formal top-level specifications, other design
data, implementation documentation, source code, the running version
of the object code, and test fixtures and documentation. Also
available shall be tools, maintained under strict configuration
control, for comparing a newly generated version with the previous
TCB version in order to ascertain that only the intended changes have
been made in the code that will actually be used as the new version
of the TCB.

ADD: A combination of technical, physical, and procedural safeguards
shall be used to protect from unauthorized modification or
destruction the master copy or copies of all material used to
generate the TCB.
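
A minimal sketch of the comparison tool described above, assuming a
file-per-module build tree and SHA-256 digests (both assumptions of the
sketch, not requirements of the criteria), for confirming that only the
intended changes appear in a newly generated TCB version:

    import hashlib
    from pathlib import Path

    def module_digests(build_dir):
        # Map each file in a generated TCB build tree to its SHA-256 digest.
        root = Path(build_dir)
        return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in root.rglob("*") if p.is_file()}

    def compare_versions(previous_dir, new_dir):
        # Report modules that changed, appeared, or disappeared between versions.
        old, new = module_digests(previous_dir), module_digests(new_dir)
        changed = sorted(m for m in new if m in old and new[m] != old[m])
        added   = sorted(m for m in new if m not in old)
        removed = sorted(m for m in old if m not in new)
        return changed, added, removed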

Covert Channel Analysis

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The system developer shall conduct a thorough search for covert
storage channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth of
each identified channel. (See the Covert Channels Guideline section.)

B3: CHANGE: The system developer shall conduct a thorough search for
covert channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth
of each identified channel.

A1: ADD: Formal methods shall be used in the analysis.
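
As a rough illustration of the "engineering estimation" option, the bandwidth
of a noiseless covert channel can be bounded by the information conveyed per
signaling cycle divided by the cycle time. The formula and example below are a
simplification for illustration, not the method of the Covert Channels
Guideline:

    import math

    def estimated_bandwidth(distinguishable_states, seconds_per_cycle):
        # Each signaling cycle can convey at most log2(states) bits.
        return math.log2(distinguishable_states) / seconds_per_cycle

    # A binary storage channel (resource present/absent) exercised once every
    # 0.5 seconds carries at most 2 bits per second.
    print(estimated_bandwidth(2, 0.5))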

Design Documentation

C1: NEW: Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation of how
this philosophy is translated into the TCB. If the TCB is composed
of distinct modules, the interfaces between these modules shall be
described.

C2: NAR.

B1: ADD: An informal or formal description of the security policy model
enforced by the TCB shall be available and an explanation provided to
show that it is sufficient to enforce the security policy. The
specific TCB protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.

B2: CHANGE: The interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the TCB
shall be available and proven that it is sufficient to enforce the
security policy.

ADD: The descriptive top-level specification (DTLS) shall be shown to
be an accurate description of the TCB interface. Documentation shall
describe how the TCB implements the reference monitor concept and
give an explanation why it is tamperproof, cannot be bypassed, and is
correctly implemented. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the exploitation
of known covert storage channels shall be identified. The bandwidths
of known covert storage channels, the use of which is not detectable
by the auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

B3: ADD: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the DTLS.
The elements of the DTLS shall be shown, using informal techniques,
to correspond to the elements of the TCB.

A1: CHANGE: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the formal
top-level specification (FTLS). The elements of the FTLS shall be
shown, using informal techniques, to correspond to the elements of
the TCB.

ADD: Hardware, firmware, and software mechanisms not dealt with in
the FTLS but strictly internal to the TCB (e.g., mapping registers,
direct memory access I/O) shall be clearly described.

Design Specification and Verification

C1: NR.

C2: NR.

B1: NEW: An informal or formal model of the security policy supported by
the TCB shall be maintained that is shown to be consistent with its
axioms.

B2: CHANGE: A formal model of the security policy supported by the TCB
shall be maintained that is proven consistent with its axioms.

ADD: A descriptive top-level specification (DTLS) of the TCB shall be
maintained that completely and accurately describes the TCB in terms
of exceptions, error messages, and effects. It shall be shown to be
an accurate description of the TCB interface.

B3: ADD: A convincing argument shall be given that the DTLS is consistent
with the model.

A1: CHANGE: The FTLS shall be shown to be an accurate description of the
TCB interface. A convincing argument shall be given that the DTLS is
consistent with the model and a combination of formal and informal
techniques shall be used to show that the FTLS is consistent with the
model.

ADD: A formal top-level specification (FTLS) of the TCB shall be
maintained that accurately describes the TCB in terms of exceptions,
error messages, and effects. The DTLS and FTLS shall include those
components of the TCB that are implemented as hardware and/or
firmware if their properties are visible at the TCB interface. This
verification evidence shall be consistent with that provided within
the state-of-the-art of the particular Computer Security Center-
endorsed formal specification and verification system used. Manual
or other mapping of the FTLS to the TCB source code shall be
performed to provide evidence of correct implementation.

Device Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support the assignment of minimum and maximum
security levels to all attached physical devices. These security
levels shall be used by the TCB to enforce constraints imposed by
the physical environments in which the devices are located.

B3: NAR.

A1: NAR.

Discretionary Access Control

C1: NEW: The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing of
those objects by named individuals or defined groups or both.

C2: CHANGE: The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups of
individuals, or by both.

ADD: The discretionary access control mechanism shall, either by explicit
user action or by default, provide that objects are protected from
unauthorized access. These access controls shall be capable of
including or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing access
permission shall only be assigned by authorized users.

B1: NAR.

B2: NAR.

B3: CHANGE: The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects. These
access controls shall be capable of specifying, for each named
object, a list of named individuals and a list of groups of named
individuals with their respective modes of access to that object.

ADD: Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of named
individuals for which no access to the object is to be given.

A1: NAR.
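
The sketch below illustrates, under assumed data structures, the B3 form of
the enforcement mechanism: per-object lists of named individuals and groups
with their modes of access, plus the lists of individuals and groups for which
no access is to be given. The criteria do not state a precedence rule; the
sketch adopts the common reading in which exclusion takes precedence.

    from dataclasses import dataclass, field

    @dataclass
    class AccessControlList:
        user_modes: dict = field(default_factory=dict)    # e.g. {"alice": {"read", "write"}}
        group_modes: dict = field(default_factory=dict)   # e.g. {"project-x": {"read"}}
        denied_users: set = field(default_factory=set)    # B3 ADD: explicit no-access lists
        denied_groups: set = field(default_factory=set)

    def check_access(acl, user, groups, mode):
        # Exclusion first; otherwise the mode must be granted to the named
        # individual or to one of the individual's groups.
        if user in acl.denied_users or acl.denied_groups & set(groups):
            return False
        if mode in acl.user_modes.get(user, set()):
            return True
        return any(mode in acl.group_modes.get(g, set()) for g in groups)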

Exportation of Labeled Information

C1: NR.

C2: NR.

B1: NEW: The TCB shall designate each communication channel and I/O
device as either single-level or multilevel. Any change in this
designation shall be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit any change in the
current security level associated with a single-level communication
channel or I/O device.

B2: NAR.

B3: NAR.

A1: NAR.

Exportation to Multilevel Devices

C1: NR.

C2: NR.

B1: NEW: When the TCB exports an object to a multilevel I/O device, the
sensitivity label associated with that object shall also be exported
and shall reside on the same physical medium as the exported
information and shall be in the same form (i.e., machine-readable or
human-readable form). When the TCB exports or imports an object over
a multilevel communication channel, the protocol used on that channel
shall provide for the unambiguous pairing between the sensitivity
labels and the associated information that is sent or received.

B2: NAR.

B3: NAR.

A1: NAR.
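
One illustrative way to satisfy the "unambiguous pairing" condition over a
multilevel communication channel is to frame each exported object together
with its sensitivity label in a single self-describing record; the JSON
framing below is an assumption of the sketch, not a requirement of the
criteria.

    import json

    def export_record(sensitivity_label, data):
        # Keep the label and the information it covers in one record so the
        # receiving side can never separate or misassociate them.
        return json.dumps({"label": sensitivity_label, "data": data}).encode()

    def import_record(raw):
        record = json.loads(raw.decode())
        return record["label"], record["data"]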

Exportation to Single-Level Devices

C1: NR.

C2: NR.

B1: NEW: Single-level I/O devices and single-level communication channels
are not required to maintain the sensitivity labels of the
information they process. However, the TCB shall include a mechanism
by which the TCB and an authorized user reliably communicate to
designate the single security level of information imported or
exported via single-level communication channels or I/O devices.

B2: NAR.

B3: NAR.

A1: NAR.

Identification and Authentication

C1: NEW: The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected to
mediate. Furthermore, the TCB shall use a protected mechanism (e.g.,
passwords) to authenticate the user’s identity. The TCB shall
protect authentication data so that it cannot be accessed by any
unauthorized user.

C2: ADD: The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual ADP
system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by that
individual.

B1: CHANGE: Furthermore, the TCB shall maintain authentication data that
includes information for verifying the identity of individual users
(e.g., passwords) as well as information for determining the
clearance and authorizations of individual users. This data shall be
used by the TCB to authenticate the user’s identity and to determine
the security level and authorizations of subjects that may be created
to act on behalf of the individual user.

B2: NAR.

B3: NAR.

A1: NAR.
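
A minimal sketch of the B1 form of this requirement, assuming a salted
password hash (the criteria name passwords only as an example of a protected
mechanism and say nothing about hashing) and per-user clearance and category
data used to derive the security level of subjects created for the user:

    import hashlib, hmac, os

    _auth_data = {}   # protected authentication data maintained by the TCB

    def enroll(user, password, clearance, categories):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        _auth_data[user] = {"salt": salt, "digest": digest,
                            "clearance": clearance, "categories": frozenset(categories)}

    def authenticate(user, password):
        # Return the user's clearance and categories on success, None on failure.
        entry = _auth_data.get(user)
        if entry is None:
            return None
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), entry["salt"], 100_000)
        if hmac.compare_digest(digest, entry["digest"]):
            return entry["clearance"], entry["categories"]
        return None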

Label Integrity

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels shall accurately represent security levels of
the specific subjects or objects with which they are associated. When
exported by the TCB, sensitivity labels shall accurately and
unambiguously represent the internal labels and shall be associated
with the information being exported.

B2: NAR.

B3: NAR.

A1: NAR.

Labeling Human-Readable Output

C1: NR.

C2: NR.

B1: NEW: The ADP system administrator shall be able to specify the
printable label names associated with exported sensitivity labels.
The TCB shall mark the beginning and end of all human-readable,
paged, hardcopy output (e.g., line printer output) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. The TCB shall, by default, mark the top and bottom of
each page of human-readable, paged, hardcopy output (e.g., line
printer output) with human-readable sensitivity labels that
properly* represent the overall sensitivity of the output or that
properly* represent the sensitivity of the information on the page.
The TCB shall, by default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps, graphics) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. Any override of these marking defaults shall be
auditable by the TCB.

B2: NAR.

B3: NAR.

A1: NAR.

____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________
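
Applied literally, the footnote's rule yields a page label consisting of the
greatest hierarchical classification of the information on the page together
with exactly the union of its non-hierarchical categories. A sketch, with
classifications modeled as integers and categories as strings (both encodings
are illustrative):

    def page_label(labels):
        # labels: iterable of (classification, categories) pairs for the
        # information appearing on one page of hardcopy output.
        classification = max(c for c, _ in labels)
        categories = set()
        for _, cats in labels:
            categories |= set(cats)
        return classification, categories

    # Example: a page mixing (SECRET, {NATO}) and (CONFIDENTIAL, {CRYPTO})
    # information is marked (SECRET, {NATO, CRYPTO}).
    print(page_label([(2, {"NATO"}), (1, {"CRYPTO"})]))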

Labels

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels associated with each subject and storage
object under its control (e.g., process, file, segment, device) shall
be maintained by the TCB. These labels shall be used as the basis
for mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an authorized
user the security level of the data, and all such actions shall be
auditable by the TCB.

B2: CHANGE: Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained by
the TCB.

B3: NAR.

A1: NAR.

Mandatory Access Control

C1: NR.

C2: NR.

B1: NEW: The TCB shall enforce a mandatory access control policy over all
subjects and storage objects under its control (e.g., processes,
files, segments, devices). These subjects and objects shall be
assigned sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the labels
shall be used as the basis for mandatory access control decisions.
The TCB shall be able to support two or more such security levels.
(See the Mandatory Access Control guidelines.) The following
requirements shall hold for all accesses between subjects and objects
controlled by the TCB: A subject can read an object only if the
hierarchical classification in the subject’s security level is
greater than or equal to the hierarchical classification in the
object’s security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical categories
in the object’s security level. A subject can write an object only
if the hierarchical classification in the subject’s security level is
less than or equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

B2: CHANGE: The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O devices) that
are directly or indirectly accessible by subjects external to the TCB.
The following requirements shall hold for all accesses between all
subjects external to the TCB and all objects directly or indirectly
accessible by these subjects:

B3: NAR.

A1: NAR.
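
The B1 read and write rules above are the simple security property and the
*-property of the Bell-LaPadula model (see the Glossary). A direct
transcription into Python, with classifications modeled as integers and
categories as frozensets (an illustrative encoding, not part of the criteria):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SecurityLevel:
        classification: int        # hierarchical classification level
        categories: frozenset      # non-hierarchical categories

    def dominates(a, b):
        # a dominates b: a's classification is at least b's and a's
        # categories include all of b's categories.
        return a.classification >= b.classification and a.categories >= b.categories

    def may_read(subject, obj):
        # Simple security property: read only if the subject dominates the object.
        return dominates(subject, obj)

    def may_write(subject, obj):
        # *-property: write only if the object dominates the subject.
        return dominates(obj, subject)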

Object Reuse

C1: NR.

C2: NEW: When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data for
which the subject is not authorized.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.
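
A sketch of the C2 obligation, modeling storage objects as byte pages drawn
from a pool (the representation is an assumption of the sketch): the TCB
clears residual data before the object reaches its new subject.

    def allocate(pool):
        # Take an unused storage object from the TCB's pool and scrub it so it
        # contains no data from any previous owner before it is handed out.
        page = pool.pop()
        page[:] = bytes(len(page))
        return page

    # Example: a freed page still holding old data is returned zeroed.
    unused = [bytearray(b"old secret data")]
    print(allocate(unused))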

Security Features User’s Guide

C1: NEW: A single summary, chapter, or manual in user documentation shall
describe the protection mechanisms provided by the TCB, guidelines on
their use, and how they interact with one another.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Security Testing

C1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. Testing shall
be done to assure that there are no obvious ways for an unauthorized
user to bypass or otherwise defeat the security protection mechanisms
of the TCB. (See the Security Testing guidelines.)

C2: ADD: Testing shall also include a search for obvious flaws that would
allow violation of resource isolation, or that would permit
unauthorized access to the audit or authentication data.

B1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. A team of
individuals who thoroughly understand the specific implementation of
the TCB shall subject its design documentation, source code, and
object code to thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that would permit
a subject external to the TCB to read, change, or delete data
normally denied under the mandatory or discretionary security policy
enforced by the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter a state
such that it is unable to respond to communications initiated by
other users. All discovered flaws shall be removed or neutralized
and the TCB retested to demonstrate that they have been eliminated
and that new flaws have not been introduced. (See the Security
Testing Guidelines.)

B2: CHANGE: All discovered flaws shall be corrected and the TCB retested
to demonstrate that they have been eliminated and that new flaws have
not been introduced.

ADD: The TCB shall be found relatively resistant to penetration.
Testing shall demonstrate that the TCB implementation is consistent
with the descriptive top-level specification.

B3: CHANGE: The TCB shall be found resistant to penetration.

ADD: No design flaws and no more than a few correctable
implementation flaws may be found during testing and there shall be
reasonable confidence that few remain.

A1: CHANGE: Testing shall demonstrate that the TCB implementation is
consistent with the formal top-level specification.

ADD: Manual or other mapping of the FTLS to the source code may form
a basis for penetration testing.

Subject Sensitivity Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall immediately notify a terminal user of each change
in the security level associated with that user during an interactive
session. A terminal user shall be able to query the TCB as desired
for a display of the subject’s complete sensitivity label.

B3: NAR.

A1: NAR.

System Architecture

C1: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). Resources controlled
by the TCB may be a defined subset of the subjects and objects in
the ADP system.

C2: ADD: The TCB shall isolate the resources to be protected so that they
are subject to the access control and auditing requirements.

B1: ADD: The TCB shall maintain process isolation through the provision
of distinct address spaces under its control.

B2: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). The TCB shall maintain
process isolation through the provision of distinct address spaces
under its control. The TCB shall be internally structured into well-
defined largely independent modules. It shall make effective use of
available hardware to separate those elements that are protection-
critical from those that are not. The TCB modules shall be designed
such that the principle of least privilege is enforced. Features in
hardware, such as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely: readable,
writeable). The user interface to the TCB shall be completely
defined and all elements of the TCB identified.

B3: ADD: The TCB shall be designed and structured to use a complete,
conceptually simple protection mechanism with precisely defined
semantics. This mechanism shall play a central role in enforcing the
internal structuring of the TCB and the system. The TCB shall
incorporate significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward minimizing
the complexity of the TCB and excluding from the TCB modules that are
not protection-critical.

A1: NAR.

System Integrity

C1: NEW: Hardware and/or software features shall be provided that can be
used to periodically validate the correct operation of the on-site
hardware and firmware elements of the TCB.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Test Documentation

C1: NEW: The system developer shall provide to the evaluators a document
that describes the test plan and results of the security mechanisms’
functional testing.

C2: NAR.

B1: NAR.

B2: ADD: It shall include results of testing the effectiveness of the
methods used to reduce covert channel bandwidths.

B3: NAR.

A1: ADD: The results of the mapping between the formal top-level
specification and the TCB source code shall be given.

Trusted Distribution

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NR.

A1: NEW: A trusted ADP system control and distribution facility shall be
provided for maintaining the integrity of the mapping between the
master data describing the current version of the TCB and the on-site
master copy of the code for the current version. Procedures (e.g.,
site security acceptance testing) shall exist for assuring that the
TCB software, firmware, and hardware updates distributed to a
customer are exactly as specified by the master copies.
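
One simple form such a site acceptance procedure could take, assuming the
master data includes a cryptographic digest of each distributed item (the
criteria do not prescribe a particular mechanism):

    import hashlib

    def update_matches_master(update_path, master_digest):
        # Accept a distributed TCB update only if it is bit-for-bit identical
        # to the master copy, checked here via a SHA-256 digest of the file.
        with open(update_path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest() == master_digest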

Trusted Facility Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support separate operator and administrator
functions.

B3: ADD: The functions performed in the role of a security administrator
shall be identified. The ADP system administrative personnel shall
only be able to perform security administrator functions after taking
a distinct auditable action to assume the security administrator role
on the ADP system. Non-security functions that can be performed in
the security administration role shall be limited strictly to those
essential to performing the security role effectively.

A1: NAR.

Trusted Facility Manual

C1: NEW: A manual addressed to the ADP system administrator shall present
cautions about functions and privileges that should be controlled
when running a secure facility.

C2: ADD: The procedures for examining and maintaining the audit files as
well as the detailed audit record structure for each type of audit
event shall be given.

B1: ADD: The manual shall describe the operator and administrator
functions related to security, to include changing the
characteristics of a user. It shall provide guidelines on the
consistent and effective use of the protection features of the
system, how they interact, how to securely generate a new TCB, and
facility procedures, warnings, and privileges that need to be
controlled in order to operate the facility in a secure manner.

B2: ADD: The TCB modules that contain the reference validation mechanism
shall be identified. The procedures for secure generation of a new
TCB from source after modification of any modules in the TCB shall
be described.

B3: ADD: It shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in system
operation.

A1: NAR.

Trusted Path

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support a trusted communication path between
itself and user for initial login and authentication. Communications
via this path shall be initiated exclusively by a user.

B3: CHANGE: The TCB shall support a trusted communication path between
itself and users for use when a positive TCB-to-user connection is
required (e.g., login, change subject security level).
Communications via this trusted path shall be activated exclusively
by a user or the TCB and shall be logically isolated and unmistakably
distinguishable from other paths.

A1: NAR.

Trusted Recovery

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NEW: Procedures and/or mechanisms shall be provided to assure that,
after an ADP system failure or other discontinuity, recovery without a
protection compromise is obtained.

A1: NAR.

(this page is reserved for Figure 1)

GLOSSARY

Access – A specific type of interaction between a subject and an object
that results in the flow of information from one to the other.

Approval/Accreditation – The official authorization that is
granted to an ADP system to process sensitive information in
its operational environment, based upon comprehensive
security evaluation of the system’s hardware, firmware, and
software security design, configuration, and implementation
and of the other system procedural, administrative,
physical, TEMPEST, personnel, and communications security
controls.

Audit Trail – A set of records that collectively provide
documentary evidence of processing used to aid in tracing
from original transactions forward to related records and
reports, and/or backwards from records and reports to their
component source transactions.

Authenticate – To establish the validity of a claimed identity.

Automatic Data Processing (ADP) System – An assembly of computer
hardware, firmware, and software configured for the purpose
of classifying, sorting, calculating, computing,
summarizing, transmitting and receiving, storing, and
retrieving data with a minimum of human intervention.

Bandwidth – A characteristic of a communication channel that is
the amount of information that can be passed through it in a
given amount of time, usually expressed in bits per second.

Bell-LaPadula Model – A formal state transition model of computer
security policy that describes a set of access control
rules. In this formal model, the entities in a computer
system are divided into abstract sets of subjects and
objects. The notion of a secure state is defined and it is
proven that each state transition preserves security by
moving from secure state to secure state; thus, inductively
proving that the system is secure. A system state is
defined to be “secure” if the only permitted access modes of
subjects to objects are in accordance with a specific
security policy. In order to determine whether or not a
specific access mode is allowed, the clearance of a subject
is compared to the classification of the object and a
determination is made as to whether the subject is
authorized for the specific access mode. The
clearance/classification scheme is expressed in terms of a
lattice. See also: Lattice, Simple Security Property, *-
Property.

Certification – The technical evaluation of a system’s security
features, made as part of and in support of the
approval/accreditation process, that establishes the extent
to which a particular computer system’s design and
implementation meet a set of specified security
requirements.

Channel – An information transfer path within a system. May also
refer to the mechanism by which the path is effected.

Covert Channel – A communication channel that allows a process to
transfer information in a manner that violates the system’s
security policy. See also: Covert Storage Channel, Covert
Timing Channel.

Covert Storage Channel – A covert channel that involves the
direct or indirect writing of a storage location by one
process and the direct or indirect reading of the storage
location by another process. Covert storage channels
typically involve a finite resource (e.g., sectors on a
disk) that is shared by two subjects at different security
levels.

Covert Timing Channel – A covert channel in which one process
signals information to another by modulating its own use of
system resources (e.g., CPU time) in such a way that this
manipulation affects the real response time observed by the
second process.

Data – Information with a specific physical representation.

Data Integrity – The state that exists when computerized data is
the same as that in the source documents and has not been
exposed to accidental or malicious alteration or
destruction.

Descriptive Top-Level Specification (DTLS) – A top-level
specification that is written in a natural language (e.g.,
English), an informal program design notation, or a
combination of the two.

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject.

Domain – The set of objects that a subject has the ability to
access.

Dominate – Security level S1 is said to dominate security level
S2 if the hierarchical classification of S1 is greater than
or equal to that of S2 and the non-hierarchical categories
of S1 include all those of S2 as a subset.

Exploitable Channel – Any channel that is useable or detectable
by subjects external to the Trusted Computing Base.

Flaw Hypothesis Methodology – A system analysis and penetration
technique where specifications and documentation for the
system are analyzed and then flaws in the system are
hypothesized. The list of hypothesized flaws is then
prioritized on the basis of the estimated probability that a
flaw actually exists and, assuming a flaw does exist, on the
ease of exploiting it and on the extent of control or
compromise it would provide. The prioritized list is used
to direct the actual testing of the system.

Flaw – An error of commission, omission, or oversight in a system
that allows protection mechanisms to be bypassed.

Formal Proof – A complete and convincing mathematical argument,
presenting the full logical justification for each proof
step, for the truth of a theorem or set of theorems. The
formal verification process uses formal proofs to show the
truth of certain properties of formal specification and for
showing that computer programs satisfy their specifications.

Formal Security Policy Model – A mathematically precise statement
of a security policy. To be adequately precise, such a
model must represent the initial state of a system, the way
in which the system progresses from one state to another,
and a definition of a “secure” state of the system. To be
acceptable as a basis for a TCB, the model must be supported
by a formal proof that if the initial state of the system
satisfies the definition of a “secure” state and if all
assumptions required by the model hold, then all future
states of the system will be secure. Some formal modeling
techniques include: state transition models, temporal logic
models, denotational semantics models, algebraic
specification models. An example is the model described by
Bell and LaPadula in reference [2]. See also: Bell-
LaPadula Model, Security Policy Model.

Formal Top-Level Specification (FTLS) – A Top-Level Specification
that is written in a formal mathematical language to allow
theorems showing the correspondence of the system
specification to its formal requirements to be hypothesized
and formally proven.

Formal Verification – The process of using formal proofs to
demonstrate the consistency (design verification) between a
formal specification of a system and a formal security
policy model or (implementation verification) between the
formal specification and its program implementation.

Functional Testing – The portion of security testing in which the
advertised features of a system are tested for correct
operation.

General-Purpose System – A computer system that is designed to
aid in solving a wide variety of problems.

Lattice – A partially ordered set for which every pair of
elements has a greatest lower bound and a least upper bound.

Least Privilege – This principle requires that each subject in a
system be granted the most restrictive set of privileges (or
lowest clearance) needed for the performance of authorized
tasks. The application of this principle limits the damage
that can result from accident, error, or unauthorized use.

Mandatory Access Control – A means of restricting access to
objects based on the sensitivity (as represented by a label)
of the information contained in the objects and the formal
authorization (i.e., clearance) of subjects to access
information of such sensitivity.

Multilevel Device – A device that is used in a manner that
permits it to simultaneously process data of two or more
security levels without risk of compromise. To accomplish
this, sensitivity labels are normally stored on the same
physical medium and in the same form (i.e., machine-readable
or human-readable) as the data being processed.

Multilevel Secure – A class of system containing information with
different sensitivities that simultaneously permits access
by users with different security clearances and needs-to-
know, but prevents users from obtaining access to
information for which they lack authorization.

Object – A passive entity that contains or receives information.
Access to an object potentially implies access to the
information it contains. Examples of objects are: records,
blocks, pages, segments, files, directories, directory
trees, and programs, as well as bits, bytes, words, fields,
processors, video displays, keyboards, clocks, printers,
network nodes, etc.

Object Reuse – The reassignment to some subject of a medium
(e.g., page frame, disk sector, magnetic tape) that
contained one or more objects. To be securely reassigned,
such media must contain no residual data from the previously
contained object(s).

Output – Information that has been exported by a TCB.

Password – A private character string that is used to
authenticate an identity.

Penetration Testing – The portion of security testing in which
the penetrators attempt to circumvent the security features
of a system. The penetrators may be assumed to use all
system design and implementation documentation, which may
include listings of system source code, manuals, and circuit
diagrams. The penetrators work under no constraints other
than those that would be applied to ordinary users.

Process – A program in execution. It is completely characterized
by a single current execution point (represented by the
machine state) and address space.

Protection-Critical Portions of the TCB – Those portions of the
TCB whose normal function is to deal with the control of
access between subjects and objects.

Protection Philosophy – An informal description of the overall
design of a system that delineates each of the protection
mechanisms employed. A combination (appropriate to the
evaluation class) of formal and informal techniques is used
to show that the mechanisms are adequate to enforce the
security policy.

Read – A fundamental operation that results only in the flow of
information from an object to a subject.

Read Access – Permission to read information.

Reference Monitor Concept – An access control concept that refers
to an abstract machine that mediates all accesses to objects
by subjects.

Resource – Anything used or consumed while performing a function.
The categories of resources are: time, information, objects
(information containers), or processors (the ability to use
information). Specific examples are: CPU time; terminal
connect time; amount of directly-addressable memory; disk
space; number of I/O requests per minute, etc.

Security Kernel – The hardware, firmware, and software elements
of a Trusted Computing Base that implement the reference
monitor concept. It must mediate all accesses, be protected
from modification, and be verifiable as correct.

Security Level – The combination of a hierarchical classification
and a set of non-hierarchical categories that represents the
sensitivity of information.

Security Policy – The set of laws, rules, and practices that
regulate how an organization manages, protects, and
distributes sensitive information.

Security Policy Model – An informal presentation of a formal
security policy model.

Security Testing – A process used to determine that the security
features of a system are implemented as designed and that
they are adequate for a proposed application environment.
This process includes hands-on functional testing,
penetration testing, and verification. See also: Functional
Testing, Penetration Testing, Verification.

Sensitive Information – Information that, as determined by a
competent authority, must be protected because its
unauthorized disclosure, alteration, loss, or destruction
will at least cause perceivable damage to someone or
something.

Sensitivity Label – A piece of information that represents the
security level of an object and that describes the
sensitivity (e.g., classification) of the data in the
object. Sensitivity labels are used by the TCB as the basis
for mandatory access control decisions.

Simple Security Property – A Bell-LaPadula security model rule
allowing a subject read access to an object only if the
security level of the subject dominates the security level
of the object.

Single-Level Device – A device that is used to process data of a
single security level at any one time. Since the device
need not be trusted to separate data of different security
levels, sensitivity labels do not have to be stored with the
data being processed.

*-Property (Star Property) – A Bell-LaPadula security model rule
allowing a subject write access to an object only if the
security level of the subject is dominated by the security
level of the object. Also known as the Confinement
Property.

Storage Object – An object that supports both read and write
accesses.

Subject – An active entity, generally in the form of a person,
process, or device that causes information to flow among
objects or changes the system state. Technically, a
process/domain pair.

Subject Security Level – A subject’s security level is equal to
the security level of the objects to which it has both read
and write access. A subject’s security level must always be
dominated by the clearance of the user the subject is
associated with.

TEMPEST – The study and control of spurious electronic signals
emitted from ADP equipment.

Top-Level Specification (TLS) – A non-procedural description of
system behavior at the most abstract level. Typically a
functional specification that omits all implementation
details.

Trap Door – A hidden software or hardware mechanism that permits
system protection mechanisms to be circumvented. It is
activated in some non-apparent manner (e.g., special
“random” key sequence at a terminal).

Trojan Horse – A computer program with an apparently or actually
useful function that contains additional (hidden) functions
that surreptitiously exploit the legitimate authorizations
of the invoking process to the detriment of security. For
example, making a “blind copy” of a sensitive file for the
creator of the Trojan Horse.

Trusted Computer System – A system that employs sufficient
hardware and software integrity measures to allow its use
for processing simultaneously a range of sensitive or
classified information.

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system — including hardware,
firmware, and software — the combination of which is
responsible for enforcing a security policy. It creates a
basic protection environment and provides additional user
services required for a trusted computer system. The
ability of a trusted computing base to correctly enforce a
security policy depends solely on the mechanisms within the
TCB and on the correct input by system administrative
personnel of parameters (e.g., a user’s clearance) related
to the security policy.

Trusted Path – A mechanism by which a person at a terminal can
communicate directly with the Trusted Computing Base. This
mechanism can only be activated by the person or the Trusted
Computing Base and cannot be imitated by untrusted software.

Trusted Software – The software portion of a Trusted Computing
Base.

User – Any person who interacts directly with a computer system.

Verification – The process of comparing two levels of system
specification for proper correspondence (e.g., security
policy model with top-level specification, TLS with source
code, or source code with object code). This process may or
may not be automated.

Write – A fundamental operation that results only in the flow of
information from a subject to an object.

Write Access – Permission to write an object.

REFERENCES

1. Anderson, J. P. Computer Security Technology Planning
Study, ESD-TR-73-51, vol. I, ESD/AFSC, Hanscom AFB,
Bedford, Mass., October 1972 (NTIS AD-758 206).

2. Bell, D. E. and LaPadula, L. J. Secure Computer Systems:
Unified Exposition and Multics Interpretation, MTR-2997
Rev. 1, MITRE Corp., Bedford, Mass., March 1976.

3. Brand, S. L. “An Approach to Identification and Audit of
Vulnerabilities and Control in Application Systems,” in
Audit and Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

4. Brand, S. L. “Data Processing and A-123,” in Proceedings of
the Computer Performance Evaluation User’s Group 18th
Meeting, C. B. Wilson, ed., NBS Special Publication
#500-95, October 1982.

5. Denning, D. E. “A Lattice Model of Secure Information
Flow,” in Communications of the ACM, vol. 19, no. 5
(May 1976), pp. 236-243.

6. Denning, D. E. Secure Information Flow in Computer Systems,
Ph.D. dissertation, Purdue Univ., West Lafayette, Ind.,
May 1975.

7. DoD 5200.1-R, Information Security Program Regulation,
August 1982.

8. DoD Directive 5200.28, Security Requirements for Automatic
Data Processing (ADP) Systems, revised April 1978.

9. DoD 5200.28-M, ADP Security Manual — Techniques and
Procedures for Implementing, Deactivating, Testing, and
Evaluating Secure Resource-Sharing ADP Systems, revised
June 1979.

10. DoD Directive 5215.1, Computer Security Evaluation Center,
25 October 1982.

11. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, January 1983.

12. DoD 5220.22-R, Industrial Security Regulation, January 1983.

13. DoD Directive 5400.11, Department of Defense Privacy
Program, 9 June 1982.

14. Executive Order 12356, National Security Information,
6 April 1982.

15. Faurer, L. D. “Keeping the Secrets Secret,” in Government
Data Systems, November – December 1981, pp. 14-17.

16. Federal Information Processing Standards Publication (FIPS
PUB) 39, Glossary for Computer Systems Security,
15 February 1976.

17. Federal Information Processing Standards Publication (FIPS
PUB) 73, Guidelines for Security of Computer
Applications, 30 June 1980.

18. Federal Information Processing Standards Publication (FIPS
PUB) 102, Guideline for Computer Security Certification
and Accreditation.

19. Lampson, B. W. “A Note on the Confinement Problem,” in
Communications of the ACM, vol. 16, no. 10 (October
1973), pp. 613-615.

20. Lee, T. M. P., et al. “Processors, Operating Systems and
Nearby Peripherals: A Consensus Report,” in Audit and
Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

21. Lipner, S. B. A Comment on the Confinement Problem, MITRE
Corp., Bedford, Mass.

22. Millen, J. K. “An Example of a Formal Flow Violation,” in
Proceedings of the IEEE Computer Society 2nd
International Computer Software and Applications
Conference, November 1978, pp. 204-208.

23. Millen, J. K. “Security Kernel Validation in Practice,” in
Communications of the ACM, vol. 19, no. 5 (May 1976),
pp. 243-250.

24. Nibaldi, G. H. Proposed Technical Evaluation Criteria for
Trusted Computer Systems, MITRE Corp., Bedford, Mass.,
M79-225, AD-A108-832, 25 October 1979.

25. Nibaldi, G. H. Specification of A Trusted Computing Base,
(TCB), MITRE Corp., Bedford, Mass., M79-228, AD-A108-
831, 30 November 1979.

26. OMB Circular A-71, Transmittal Memorandum No. 1, Security of
Federal Automated Information Systems, 27 July 1978.

27. OMB Circular A-123, Internal Control Systems, 5 November
1981.

28. Ruthberg, Z. and McKenzie, R., eds. Audit and Evaluation of
Computer Security, in NBS Special Publication #500-19,
October 1977.

29. Schaefer, M., Linde, R. R., et al. “Program Confinement in
KVM/370,” in Proceedings of the ACM National
Conference, October 1977, Seattle.

30. Schell, R. R. “Security Kernels: A Methodical Design of
System Security,” in Technical Papers, USE Inc. Spring
Conference, 5-9 March 1979, pp. 245-250.

31. Trotter, E. T. and Tasker, P. S. Industry Trusted Computer
Systems Evaluation Process, MITRE Corp., Bedford,
Mass., MTR-3931, 1 May 1980.

32. Turn, R. Trusted Computer Systems: Needs and Incentives for
Use in Government and Private Sector, (AD # A103399),
Rand Corporation (R-28811-DR&E), June 1981.

33. Walker, S. T. “The Advent of Trusted Computer Operating
Systems,” in National Computer Conference Proceedings,
May 1980, pp. 655-665.

34. Ware, W. H., ed., Security Controls for Computer Systems:
Report of Defense Science Board Task Force on Computer
Security, AD # A076617/0, Rand Corporation, Santa
Monica, Calif., February 1970, reissued October 1979.

DoD STANDARD 5200.28: SUMMARY OF THE DIFFERENCES
BETWEEN IT AND CSC-STD-001-83

Note: Text which has been added or changed is indented and preceded by a > sign.
Text which has been deleted is enclosed in slashes (/). “Computer Security
Center” was changed to “National Computer Security Center” throughout the
document.

The FOREWORD Section was rewritten and signed by Mr. Don Latham on
26 Dec 85. The ACKNOWLEDGEMENTS Section was updated.

The PREFACE was changed as follows:

PREFACE

The trusted computer system evaluation criteria defined in this
document classify systems into four broad hierarchical divisions
of enhanced security protection. The criteria provide a basis
for the evaluation of effectiveness of security controls built
into automatic data processing system products. The criteria
were developed with three objectives in mind: (a) to provide
users with a yardstick with which to assess the degree of trust
that can be placed in computer systems for the secure processing
of classified or other sensitive information; (b) to provide
guidance to manufacturers as to what to build into their new,
widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide
a basis for specifying security requirements in acquisition
specifications. Two types of requirements are delineated for
secure processing: (a) specific security feature requirements and
(b) assurance requirements. Some of the latter requirements
enable evaluation personnel to determine if the required features
are present and functioning as intended.

>The scope of these criteria is to be applied to
>the set of components comprising a trusted system, and is
>not necessarily to be applied to each system component
>individually. Hence, some components of a system may be
>completely untrusted, while others may be individually
>evaluated to a lower or higher evaluation class than the
>trusted product considered as a whole system. In trusted
>products at the high end of the range, the strength of the
>reference monitor is such that most of the system
>components can be completely untrusted.

Though the criteria are

>intended to be

application-independent, /it is recognized that/ the
specific security feature requirements may have to be
interpreted when applying the criteria to specific

>systems with their own functional requirements,
>applications or special environments (e.g., communications
>processors, process control computers, and embedded systems
>in general).

The underlying assurance requirements can be
applied across the entire spectrum of ADP system or
application processing environments without special
interpretation.

The SCOPE Section was changed as follows:

Scope

The trusted computer system evaluation criteria defined in this
document apply

>primarily

to /both/ trusted, commercially available
automatic data processing (ADP) systems.

>They are also applicable, as amplified below, to the
>evaluation of existing systems and to the specification of
>security requirements for ADP systems acquisition.

Included are two distinct sets of requirements: 1) specific security
feature requirements; and 2) assurance requirements. The specific
feature requirements encompass the capabilities typically found
in information processing systems employing general-purpose
operating systems that are distinct from the applications programs
being supported.

>However, specific security feature requirements
>may also apply to specific systems with their own functional
>requirements, applications or special environments (e.g.,
>communications processors, process control computers, and embedded
>systems in general).

The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments
from dedicated controllers to full range multilevel secure resource
sharing systems.

Changed the Purpose Section as follows:

Purpose

As outlined in the Preface, the criteria have been developed to
serve a number of intended purposes:

To provide

>a standard

to manufacturers as to what security features to build
into their new and planned, … trust requirements

>(with particular emphasis on preventing the
>disclosure of data)

for sensitive applications.

To provide

>DoD components

with a metric with which to evaluate
the degree of trust that can be placed in …

To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the

>second

purpose for development of the criteria, i.e., providing

>DoD components

with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be
performed on a computer product from a perspective that
excludes the application environment; or, (b) it can be
done to assess whether appropriate security measures …

The latter type of evaluation, i.e., those done for the purpose
of assessing a system’s security attributes with respect to a
specific operational mission, is known as a certification
evaluation. It must be understood that the completion of a
formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific
application environment. On the contrary, the evaluation report
only provides a trusted computer system’s evaluation rating along
with supporting data describing the product system’s strengths
and weaknesses from a computer security point of view. The
system security certification and the formal
approval/accreditation procedure, done in accordance with the
applicable policies of the issuing agencies, must still be
followed before a system can be approved for use in processing or
handling classified information.[8;9]

>Designated Approving Authorities (DAAs) remain ultimately
>responsible for specifying security of systems they
>accredit.

The trusted computer system evaluation criteria will be used
directly and indirectly in the certification process. Along with
applicable policy, it will be used directly as

>technical guidance

for evaluation of the total system and for specifying system
security and certification requirements for new acquisitions. Where
a system being evaluated for certification employs a product that
has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation.
Technical data will be furnished to designers, evaluators and the
Designated Approving Authorities to support their needs for
making decisions.

Changed Section 2.1.4.3 as follows:

2.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 2.2.1.1 as follows:

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.
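
The discretionary access control requirement above lends itself to a
small illustration. The following Python sketch is illustrative only and
is not part of the criteria; the class and function names are
hypothetical. It shows one way an access control list could include or
exclude access down to the granularity of a single named user and
restrict who may propagate access rights.

    # Illustrative sketch only -- not part of the criteria; names are hypothetical.
    class ACL:
        def __init__(self, owner):
            self.allow = {}              # name -> set of modes, e.g. {"read", "write"}
            self.deny = set()            # named users excluded regardless of group rights
            self.controllers = {owner}   # users authorized to assign (propagate) access

        def grant(self, grantor, name, modes):
            if grantor not in self.controllers:
                raise PermissionError("grantor may not propagate access rights")
            self.allow.setdefault(name, set()).update(modes)

        def exclude(self, grantor, user):
            if grantor not in self.controllers:
                raise PermissionError("grantor may not change access rights")
            self.deny.add(user)

        def permits(self, user, groups, mode):
            if user in self.deny:        # exclusion to the granularity of a single user
                return False
            names = {user} | set(groups)
            return any(mode in self.allow.get(n, set()) for n in names)

    acl = ACL(owner="alice")
    acl.grant("alice", "ops-group", {"read"})
    acl.exclude("alice", "mallory")
    print(acl.permits("bob", ["ops-group"], "read"))      # True
    print(acl.permits("mallory", ["ops-group"], "read"))  # False

A real TCB would enforce such checks below the application interface;
the sketch only maps the stated requirements onto data structures.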

Completely Reworded Section 2.2.1.2 as follows:

2.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.
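
As an illustration of the object reuse requirement, the following Python
sketch (hypothetical names, not part of the criteria) revokes prior
authorizations and overwrites a storage object before it is reassigned
to a new subject from the pool of unused storage objects.

    # Illustrative sketch only -- not part of the criteria; names are hypothetical.
    class StoragePool:
        def __init__(self, count, size):
            self.free = [bytearray(size) for _ in range(count)]
            self.authorizations = {}               # id(obj) -> set of authorized subjects

        def release(self, obj):
            self.free.append(obj)                  # contents scrubbed on reallocation

        def allocate(self, subject):
            obj = self.free.pop()
            self.authorizations[id(obj)] = {subject}   # revoke all prior authorizations
            obj[:] = bytes(len(obj))                   # overwrite residual information
            return obj

    pool = StoragePool(count=2, size=16)
    page = pool.allocate("proc-1")
    page[:5] = b"hello"
    pool.release(page)
    page2 = pool.allocate("proc-2")
    print(bytes(page2[:5]))    # b'\x00\x00\x00\x00\x00' -- no residue from proc-1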

Reworded Section 2.2.2.2 as follows:

2.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the
actions of any one or more users based on individual
identity.
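
The audit requirement enumerates the fields each record must carry. The
Python sketch below is illustrative only; the event names and field
layout are assumptions, not taken from the criteria. It shows one
possible record shape and selective auditing based on individual
identity.

    # Illustrative sketch only -- not part of the criteria; names are hypothetical.
    import datetime

    AUDITED_USERS = {"operator1"}        # administrator-selected identities

    def audit(event_type, user, success, origin=None, object_name=None):
        if AUDITED_USERS and user not in AUDITED_USERS:
            return None                  # selective audit by individual identity
        record = {
            "time": datetime.datetime.now().isoformat(),
            "user": user,
            "event": event_type,         # e.g. "login", "file_open", "object_delete"
            "success": success,
        }
        if event_type == "login":        # identification/authentication events
            record["origin"] = origin    # e.g. terminal ID
        if event_type in ("file_open", "object_delete"):
            record["object"] = object_name
        return record

    print(audit("login", "operator1", success=False, origin="tty03"))
    print(audit("file_open", "operator1", success=True, object_name="/payroll/q3"))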

Changed Section 2.2.4.3 as follows:

2.2.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.1.1.1 as follows:

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.1.1.2 as follows:

3.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.1.1.3.2 as follows:

3.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/ communication
channel or I/O device.
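
One way to picture the exportation requirement is a per-device record of
its designation and security level(s) in which every change is written
to an audit trail. The following Python sketch is illustrative only; the
names are hypothetical and the manual-change procedure is reduced to an
explicit operator argument.

    # Illustrative sketch only -- not part of the criteria; names are hypothetical.
    audit_trail = []

    class Device:
        def __init__(self, name, designation, levels):
            self.name, self.designation, self.levels = name, designation, set(levels)

        def redesignate(self, operator, new_designation):      # manual, auditable
            audit_trail.append((operator, self.name, "designation", new_designation))
            self.designation = new_designation

        def set_levels(self, operator, levels):                # change in level(s) audited
            audit_trail.append((operator, self.name, "levels", tuple(sorted(levels))))
            self.levels = set(levels)

    printer = Device("lp0", "single-level", ["SECRET"])
    printer.set_levels("sso", ["CONFIDENTIAL"])
    printer.redesignate("sso", "multilevel")
    print(audit_trail)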

Appended a sentence to Section 3.1.1.4 as follows:

3.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed one sentence in Section 3.1.2.1 as follows:

3.1.2.1. Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded Section 3.1.2.2 as follows:

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

> and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level.

‘Unbolded’ the first sentence of Section 3.1.3.2.1.

Reworded Section 3.1.3.2.2 as follows:

3.1.3.2.2 Design Specification and Verification

An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

Changed sentence as follows:

3.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.2.1.1 as follows:

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.2.1.2 as follows:

3.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.2.1.3 as follows:

3.2.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.2.1.3.2 as follows:

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.2.1.4 as follows:

3.2.1.4 Mandatory Access Control

… Identification and authentication data shall be
used by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.2.2.1 as follows:

3.2.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded section 3.2.2.2 as follows:

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels.

Changed Section 3.2.3.2.2 as follows:

3.2.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its
axioms. A descriptive top-level specification
(DTLS) of the TCB shall be maintained that
completely and accurately describes the TCB in
terms of exceptions, error messages, and effects.
It shall be shown to be an accurate description of
the TCB interface.

Changed Section 3.2.4.3 as follows:

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant”:

3.2.4.4 Design Documentation

Documentation shall be available that provides a
description of the manufacturer’s philosophy of
protection and an explanation of how this philosophy is
translated into the TCB. The interfaces between the
TCB modules shall be described. A formal description
of the security policy model enforced by the TCB shall
be available and proven that it is sufficient to
enforce the security policy. The specific TCB
protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.
The descriptive top-level specification (DTLS) shall be
shown to be an accurate description of the TCB
interface. Documentation shall describe how the TCB
implements the reference monitor concept and give an
explanation why it is

>tamper resistant,

cannot be bypassed, and is correctly implemented.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting
the channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage
channels, the use of which is not detectable by the
auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

Changed Section 3.3.1.1 as follows:

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to limit
>propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 3.3.1.2 as follows:

3.3.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.3.1.3 as follows:

3.3.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.3.1.3.2 as follows:

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.3.1.4 as follows:

3.3.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.3.2.1 as follows:

3.3.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 3.3.2.2 as follows:

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and if the occurrence or accumulation
>of these security relevant events continues,
>the system shall take the least disruptive
>action to terminate the event.
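
The added B3 audit requirement describes threshold monitoring with
escalation. The Python sketch below is illustrative only; the thresholds
and the choice of "least disruptive action" are assumptions. It counts
security relevant events per user, notifies the security administrator
at a first threshold, and locks the offending account, rather than
halting the system, if the events continue.

    # Illustrative sketch only -- not part of the criteria; names are hypothetical.
    from collections import Counter

    ALERT_THRESHOLD = 3
    TERMINATE_THRESHOLD = 6
    counts = Counter()

    def notify_security_administrator(user, event):
        print(f"ALERT: repeated {event} events by {user}")

    def lock_account(user):                  # a least disruptive terminating action
        print(f"account {user} locked")

    def record(user, event):
        counts[(user, event)] += 1
        n = counts[(user, event)]
        if n == ALERT_THRESHOLD:
            notify_security_administrator(user, event)
        elif n == TERMINATE_THRESHOLD:       # events continued after notification
            lock_account(user)

    for _ in range(7):
        record("mallory", "failed_login")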

Changed the first sentence of Section 3.3.3.2.2 as follows:

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of
>the ADP system

that is proven consistent with its axioms. …

Changed Section 3.3.4.3 as follows:

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant” in Section 3.3.4.4.

Changed Section 4.1.1.1 as follows:

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 4.1.1.2 as follows:

4.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 4.1.1.3 as follows:

4.1.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object,

>ROM)

that is directly or indirectly accessible by subjects
external to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 4.1.1.3.2 as follows:

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level

>or levels

associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 4.1.1.4 as follows:

4.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 4.1.2.1 as follows:

4.1.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 4.1.2.2 as follows:

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to
>terminate the event.

‘Unbolded’ the words “covert channels” in Section 4.1.3.1.3.

Changed the first sentence of Section 4.1.3.2.2 as follows:

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms. …

Changed Section 4.1.4.3 as follows:

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested, and

results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths. The results of the mapping between the
formal top-level specification and the TCB source code
shall be given.

Replaced “tamperproof” with “tamper resistant” in Section 4.1.4.4.

Changed the last paragraph of Section 5.1 as follows:

5.1 A Need for Consensus

A major goal of …

As described …

>The purpose of this section is to describe in detail the
>fundamental control objectives. These objectives lay the
>foundation for the requirements outlined in the criteria.

The goal is to explain the foundations so that those outside
the National Security Establishment can assess their
universality and, by extension, the universal applicability
of the criteria requirements to processing all types of
sensitive applications whether they be for National Security
or the private sector.

Changed the second paragraph of Section 6.2 as follows:

6.2 A Formal Policy Model

Following the publication of …

>A subject can act on behalf of a user or another
>subject. The subject is created as a surrogate
>for the cleared user and is assigned a formal
>security level based on their classification.
>The state transitions and invariants of the formal
>policy model define the invariant relationships
>that must hold between the clearance of the user,
>the formal security level of any process that can
>act on the user’s behalf, and the formal security
>level of the devices and other objects to which any
>process can obtain specific modes of access.

The Bell and LaPadula model,

>for example,

defines a relationship between

>formal security levels of subjects and objects,

now referenced as the “dominance relation.” From this definition …
… Both the Simple Security Condition and the *-Property
include mandatory security provisions based on the dominance
relation between the

>formal security levels of subjects and objects.

The Discretionary Security Property …
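
The dominance relation and the two mandatory rules referred to above can
be stated compactly in code. The following Python sketch is illustrative
only and the classification lattice shown is an assumption; it treats a
formal security level as a classification plus a category set and
expresses the Simple Security Condition and the *-Property as dominance
checks.

    # Illustrative sketch only -- not part of the standard; names are hypothetical.
    CLASSIFICATIONS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        (class_a, cats_a), (class_b, cats_b) = a, b
        return CLASSIFICATIONS[class_a] >= CLASSIFICATIONS[class_b] and cats_a >= cats_b

    def simple_security_condition(subject_level, object_level):   # read access
        return dominates(subject_level, object_level)

    def star_property(subject_level, object_level):               # write access
        return dominates(object_level, subject_level)

    subject = ("SECRET", frozenset({"NATO"}))
    obj     = ("CONFIDENTIAL", frozenset({"NATO"}))
    print(simple_security_condition(subject, obj))   # True  -- read down allowed
    print(star_property(subject, obj))               # False -- write down denied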

Added a sentence to the end of Section 7.0:

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security
requirements and Section 5 presents the control objectives
for Trusted Computer Systems. They are general
requirements, useful and necessary, for the development of
all secure systems. However, when designing systems that
will be used to process classified or other sensitive
information, functional requirements for meeting the Control
Objectives become more specific. There is a large body of
policy laid down in the form of Regulations, Directives,
Presidential Executive Orders, and OMB Circulars that form
the basis of the procedures for the handling and processing
of Federal information in general and classified information
specifically. This section presents pertinent excerpts from
these policy statements and discusses their relationship to
the Control Objectives.

>These excerpts are examples to illustrate the relationship
>of the policies to criteria and may not be complete.

Inserted the following

>as the next to last paragraph

of Section 7.2:

>DoD Directive 5200.28 provides the security requirements for
>ADP systems. For some types of information, such as
>Sensitive Compartmented Information (SCI), DoD Directive
>5200.28 states that other minimum security requirements also
>apply. These minima are found in DCID 1/16 (new reference
>number 5) which is implemented in DIAM 50-4 (new reference
>number 6) for DoD and DoD contractor ADP systems.

From requirements imposed by …

Changed Footnote #1 referenced by Section 7.2 as follows:

Replaced “Health and Human Services Department” with “U.S.
Information Agency.”

Changed (updated) the quote from DoD 5220.22-M, Section 7.3.1, as
follows:

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking …

DoD 5220.22-M, “Industrial Security …

>”a. General. Classification designation by physical
>marking, notation or other means serves to warn and to
>inform the holder what degree of protection against
>unauthorized disclosure is required for that
>information or material.” (14)

Changed the

>last paragraph

of Section 7.5 as follows:

A major component of assurance, life-cycle assurance,

>as described in DoD Directive 7920.1,

is concerned with testing ADP systems both in the
development phase as well as during operation.

>(17)

DoD Directive 5215.1 …

Changed Section 9.0 as follows:

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement …

* The number of hierarchical classifications should be
greater than or equal to

>sixteen (16).

* The number of non-hierarchical categories should be
greater than or equal to

>sixty-four (64).
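
To give a sense of what the guideline's minimums imply for label
storage, the following Python sketch (illustrative only; the packing
scheme is an assumption, not part of the guideline) encodes sixteen
hierarchical classifications in four bits and sixty-four
non-hierarchical categories in a 64-bit mask, with dominance computed
over the packed form.

    # Illustrative sketch only -- not part of the guideline; sizes are the stated minimums.
    def make_label(level, categories):
        assert 0 <= level < 16                   # >= 16 hierarchical classifications
        mask = 0
        for c in categories:
            assert 0 <= c < 64                   # >= 64 non-hierarchical categories
            mask |= 1 << c
        return (level, mask)

    def dominates(a, b):
        return a[0] >= b[0] and (a[1] & b[1]) == b[1]    # category subset via bitmask

    hi = make_label(3, {5, 17})
    lo = make_label(2, {5})
    print(dominates(hi, lo))   # True
    print(dominates(lo, hi))   # False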

Completely reworded the third paragraph of Formal Product
Evaluation, in Appendix A, as follows:

Formal Product Evaluation

The formal product evaluation provides …

A formal product evaluation begins with …

>The evaluation team writes a final report on their findings about
>the system. The report is publicly available (containing no
>proprietary or sensitive information) and contains the overall
>class rating assigned to the system and the details of the
>evaluation team’s findings when comparing the product against the
>evaluation criteria. Detailed information concerning
>vulnerabilities found by the evaluation team is furnished to the
>system developers and designers as each is found so that the
>vendor has a chance to eliminate as many of them as possible
>prior to the completion of the Formal Product Evaluation.
>Vulnerability analyses and other proprietary or sensitive
>information are controlled within the Center through the
>Vulnerability Reporting Program and are distributed only within
>the U.S. Government on a strict need-to-know and non-disclosure
>basis, and to the vendor.

Changed two paragraphs in Audit (Appendix D) as follows:

C2: NEW: The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction an
audit trail of accesses to the objects it protects. The
audit data shall be protected by the TCB so that read access
to it is limited to those who are authorized for audit data.
The TCB shall be able to record the following types of
events: use of identification and authentication mechanisms,
introduction of objects into a user’s address space (e.g.,
file open, program initiation), deletion of objects, actions
taken by computer operators and system administrators and/or
system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the actions
of any one or more users based on individual identity.

B3: ADD: …when thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to terminate
>the event.

Changed one paragraph in Design Documentation (Appendix D):

B2: ADD: Change “tamperproof” to “tamper resistant.”

Changed two paragraphs in Design Specification and Verification:

B1: NEW: An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

B2: CHANGE: A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms.

Changed two paragraphs in Discretionary Access Control as follows:

C2: CHANGE: The enforcement mechanism (e.g., self/group/public
controls, access control lists) shall allow users to specify
and control sharing of those objects by named individuals,
or defined groups of individuals, or by both,

>and shall provide controls to limit propagation of access rights.

B3: CHANGE: The enforcement mechanism (e.g., access control
lists) shall allow users to specify and control sharing of
those objects,

>and shall provide controls to limit propagation of access rights.

These access controls shall be capable of specifying, for each
named object, a list of named individuals and a list of groups of
named individuals with their respective modes of access to that object.

Changed 1 paragraph in Exportation of Labeled Information:

B1: NEW: The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any change
in this designation shall be done manually and shall be
auditable by the TCB. The TCB shall maintain and be able to
audit any change in the /current/ security level

>or levels

associated with a /single-level/ communication channel or
I/O device.

Changed 1 paragraph in Identification and Authentication:

B1: CHANGE: … This data shall be used by the TCB to authenticate
the user’s identity and

>to ensure that

the security level and authorizations of subjects external to
the TCB that may be created to act on behalf of the individual
user

>are dominated by the clearance and authorization
>of that user.

Changed 1 paragraph in Labels:

B2: CHANGE: … (e.g., subject, storage object, ROM) …

Changed 1 paragraph in Mandatory Access Control:

B1: NEW: … Identification and authentication data shall be used

>by the TCB to authenticate the user’s identity and to ensure
>that the security level and authorization of subjects external
>to the TCB that may be created to act on behalf of the
>individual user are dominated by the clearance and authoriza-
>tion of that user.

Rewrote 1 paragraph in Object Reuse:

C2: NEW:
>All authorizations to the information contained
>within a storage object shall be revoked prior to initial
>assignment, allocation or reallocation to a subject from the
>TCB’s pool of unused storage objects. No information,
>including encrypted representations of information, produced
>by a prior subject’s actions is to be available to any
>subject that obtains access to an object that has been
>released back to the system.

Changed 1 paragraph in Test Documentation:

C1: NEW: The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

GLOSSARY

Changed Discretionary Access Control:

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject

>(unless restrained by mandatory access control).

Added:

Front-End Security Filter – A process that is invoked to process
data according to a specified security policy prior to
releasing the data outside the processing environment or
upon receiving data from an external source.

Granularity – The relative fineness or coarseness by which a
mechanism can be adjusted. The phrase “the granularity of
a single user” means the access control mechanism can be
adjusted to include or exclude any single user.

Read-Only Memory (ROM) – A storage area in which the contents
can be read but not altered during normal computer
processing.

Security Relevant Event – Any event that attempts to change the
security state of the system, (e.g., change discretionary
access controls, change the security level of the subject,
change user password, etc.). Also, any event that attempts
to violate the security policy of the system, (e.g., too
many attempts to login, attempts to violate the mandatory
access control limits of a device, attempts to downgrade a
file, etc.).

Changed the name of the term:

Simple Security /Property/

>Condition

– A Bell-LaPadula security model rule allowing a subject
read access to an object only if the security level of the
subject dominates the security level of the object.

Changed definition:

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system – including hardware,
firmware, and software – the combination of which is
responsible for enforcing a security policy.

>A TCB consists of one or more components that together enforce
>a unified security policy over a product or system.

The ability of a TCB to correctly enforce a security
policy depends solely on the mechanisms within the TCB and
on the correct input by system administrative personnel of
parameters (e.g., a user’s clearance) related to the
security policy.

REFERENCES

Added: (References were renumbered as necessary)

5. DCID 1/16, Security of Foreign Intelligence in Automated
Data Processing Systems and Networks (U), 4 January 1983.

6. DIAM 50-4, Security of Compartmented Computer Operations (U),
24 June 1980.

9. DoD Directive 5000.29, Management of Computer Resources in
Major Defense Systems, 26 April 1976.

17. DoD Directive 7920.1, Life Cycle Management of Automated
Information Systems (AIS), 17 October 1978.

Corrected dates on the following References:

14. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, March 1984.

15. DoD 5220.22-R, Industrial Security Regulation, February
1984.

%

Department of Defense Trusted Computer System Evaluation Criteria (The Orange Book) 15 August 1983

orange-boot.txt: No such file or directory
% cat orange.boo
orange.boo: No such file or directory
% cat orange-book.txt
CSC-STD-001-83
Library No. S225,711

DEPARTMENT OF DEFENSE

TRUSTED COMPUTER SYSTEM EVALUATION CRITERIA

15 August 1983

CSC-STD-001-83

FOREWORD

This publication, “Department of Defense Trusted Computer System Evaluation
Criteria,” is being issued by the DoD Computer Security Center under the
authority of and in accordance with DoD Directive 5215.1, “Computer Security
Evaluation Center.” The criteria defined in this document constitute a uniform
set of basic requirements and evaluation classes for assessing the
effectiveness of security controls built into Automatic Data Processing (ADP)
systems. These criteria are intended for use in the evaluation and selection
of ADP systems being considered for the processing and/or storage and
retrieval of sensitive or classified information by the Department of Defense.
Point of contact concerning this publication is the Office of Standards and
Products, Attention: Chief, Computer Security Standards.

____________________________ 15 August 1983
Melville H. Klein
Director
DoD Computer Security Center

ACKNOWLEDGMENTS

Special recognition is extended to Sheila L. Brand, DoD Computer Security
Center (DoDCSC), who integrated theory, policy, and practice into and directed
the production of this document.

Acknowledgment is also given for the contributions of: Grace Hammonds and
Peter S. Tasker, the MITRE Corp., Daniel J. Edwards, Col. Roger R. Schell,
Marvin Schaefer, DoDCSC, and Theodore M. P. Lee, Sperry UNIVAC, who as
original architects formulated and articulated the technical issues and
solutions presented in this document; Jeff Makey and Warren F. Shadle,
DoDCSC, who assisted in the preparation of this document; James P. Anderson,
James P. Anderson & Co., Steven B. Lipner, Digital Equipment Corp., Clark
Weissman, System Development Corp., LTC Lawrence A. Noble, formerly U.S. Air
Force, Stephen T. Walker, formerly DoD, Eugene V. Epperly, DoD, and James E.
Studer, formerly Dept. of the Army, who gave generously of their time and
expertise in the review and critique of this document; and finally, thanks are
given to the computer industry and others interested in trusted computing for
their enthusiastic advice and assistance throughout this effort.

TABLE OF CONTENTS

FOREWORD. . . . . . . . . . . . . . . . . . . . . . . . . . . .i
ACKNOWLEDGMENTS . . . . . . . . . . . . . . . . . . . . . . . ii
PREFACE . . . . . . . . . . . . . . . . . . . . . . . . . . . .v
INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . . . . .1

PART I: THE CRITERIA
Section
1.0 DIVISION D: MINIMAL PROTECTION. . . . . . . . . . . . .9
2.0 DIVISION C: DISCRETIONARY PROTECTION. . . . . . . . . 11
2.1 Class (C1): Discretionary Security Protection . . 12
2.2 Class (C2): Controlled Access Protection. . . . . 15
3.0 DIVISION B: MANDATORY PROTECTION. . . . . . . . . . . 19
3.1 Class (B1): Labeled Security Protection . . . . . 20
3.2 Class (B2): Structured Protection . . . . . . . . 26
3.3 Class (B3): Security Domains. . . . . . . . . . . 33
4.0 DIVISION A: VERIFIED PROTECTION . . . . . . . . . . . 41
4.1 Class (A1): Verified Design . . . . . . . . . . . 42
4.2 Beyond Class (A1). . . . . . . . . . . . . . . . . 51

PART II: RATIONALE AND GUIDELINES

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS. . . . . 55
5.1 A Need for Consensus . . . . . . . . . . . . . . . 56
5.2 Definition and Usefulness. . . . . . . . . . . . . 56
5.3 Criteria Control Objective . . . . . . . . . . . . 56
6.0 RATIONALE BEHIND THE EVALUATION CLASSES. . . . . . . . . 63
6.1 The Reference Monitor Concept. . . . . . . . . . . 64
6.2 A Formal Security Policy Model . . . . . . . . . . 64
6.3 The Trusted Computing Base . . . . . . . . . . . . 65
6.4 Assurance. . . . . . . . . . . . . . . . . . . . . 65
6.5 The Classes. . . . . . . . . . . . . . . . . . . . 66
7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA . . . . 69
7.1 Established Federal Policies . . . . . . . . . . . 70
7.2 DoD Policies . . . . . . . . . . . . . . . . . . . 70
7.3 Criteria Control Objective For Security Policy . . 71
7.4 Criteria Control Objective for Accountability. . . 74
7.5 Criteria Control Objective for Assurance . . . . . 76
8.0 A GUIDELINE ON COVERT CHANNELS . . . . . . . . . . . . . 79
9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL
FEATURES . . . . . . . . . . . . . . . . . . . . . . . . 81
10.0 A GUIDELINE ON SECURITY TESTING . . . . . . . . . . . . 83
10.1 Testing for Division C . . . . . . . . . . . . . . 84
10.2 Testing for Division B . . . . . . . . . . . . . . 84
10.3 Testing for Division A . . . . . . . . . . . . . . 85
APPENDIX A: Commercial Product Evaluation Process. . . . . . 87
APPENDIX B: Summary of Evaluation Criteria Divisions . . . . 89
APPENDIX C: Sumary of Evaluation Criteria Classes. . . . . . 91
APPENDIX D: Requirement Directory. . . . . . . . . . . . . . 93

GLOSSARY. . . . . . . . . . . . . . . . . . . . . . . . . . .109

REFERENCES. . . . . . . . . . . . . . . . . . . . . . . . . .115

PREFACE

The trusted computer system evaluation criteria defined in this document
classify systems into four broad hierarchical divisions of enhanced security
protection. They provide a basis for the evaluation of effectiveness of
security controls built into automatic data processing system products. The
criteria were developed with three objectives in mind: (a) to provide users
with a yardstick with which to assess the degree of trust that can be placed
in computer systems for the secure processing of classified or other sensitive
information; (b) to provide guidance to manufacturers as to what to build into
their new, widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide a basis for
specifying security requirements in acquisition specifications. Two types of
requirements are delineated for secure processing: (a) specific security
feature requirements and (b) assurance requirements. Some of the latter
requirements enable evaluation personnel to determine if the required features
are present and functioning as intended. Though the criteria are
application-independent, it is recognized that the specific security feature
requirements may have to be interpreted when applying the criteria to specific
applications or other special processing environments. The underlying
assurance requirements can be applied across the entire spectrum of ADP system
or application processing environments without special interpretation.

INTRODUCTION

Historical Perspective

In October 1967, a task force was assembled under the auspices of the Defense
Science Board to address computer security safeguards that would protect
classified information in remote-access, resource-sharing computer systems.
The Task Force report, “Security Controls for Computer Systems,” published in
February 1970, made a number of policy and technical recommendations on
actions to be taken to reduce the threat of compromise of classified
information processed on remote-access computer systems.[34] Department of
Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published
in 1972 and 1973 respectivley, responded to one of these recommendations by
establishing uniform DoD policy, security requirements, administrative
controls, and technical measures to protect classified information processed
by DoD computer systems.[8;9] Research and development work undertaken by the
Air Force, Advanced Research Projects Agency, and other defense agencies in
the early and mid 70’s developed and demonstrated solution approaches for the
technical problems associated with controlling the flow of information in
resource and information sharing computer systems.[1] The DoD Computer
Security Initiative was started in 1977 under the auspices of the Under
Secretary of Defense for Research and Engineering to focus DoD efforts
addressing computer security issues.[33]

Concurrent with DoD efforts to address computer security issues, work was
begun under the leadership of the National Bureau of Standards (NBS) to define
problems and solutions for building, evaluating, and auditing secure computer
systems.[17] As part of this work NBS held two invitational workshops on the
subject of audit and evaluation of computer security.[20;28] The first was
held in March 1977, and the second in November of 1978. One of the products
of the second workshop was a definitive paper on the problems related to
providing criteria for the evaluation of technical computer security
effectiveness.[20] As an outgrowth of recommendations from this report, and in
support of the DoD Computer Security Initiative, the MITRE Corporation began
work on a set of computer security evaluation criteria that could be used to
assess the degree of trust one could place in a computer system to protect
classified data.[24;25;31] The preliminary concepts for computer security
evaluation were defined and expanded upon at invitational workshops and
symposia whose participants represented computer security expertise drawn from
industry and academia in addition to the government. Their work has since
been subjected to much peer review and constructive technical criticism from
the DoD, industrial research and development organizations, universities, and
computer manufacturers.

The DoD Computer Security Center (the Center) was formed in January 1981 to
staff and expand on the work started by the DoD Computer Security
Initiative.[15] A major goal of the Center as given in its DoD Charter is to
encourage the widespread availability of trusted computer systems for use by
those who process classified or other sensitive information.[10] The criteria
presented in this document have evolved from the earlier NBS and MITRE
evaluation material.

Scope

The trusted computer system evaluation criteria defined in this document apply
to both trusted general-purpose and trusted embedded (e.g., those dedicated to
a specific application) automatic data processing (ADP) systems. Included are
two distinct sets of requirements: 1) specific security feature requirements;
and 2) assurance requirements. The specific feature requirements encompass
the capabilities typically found in information processing systems employing
general-purpose operating systems that are distinct from the applications
programs being supported. The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments from
dedicated controllers to full range multilevel secure resource sharing
systems.

Purpose

As outlined in the Preface, the criteria have been developed for a number of
reasons:

* To provide users with a metric with which to evaluate the
degree of trust that can be placed in computer systems for
the secure processing of classified and other sensitive
information.

* To provide guidance to manufacturers as to what security
features to build into their new and planned, commercial
products in order to provide widely available systems that
satisfy trust requirements for sensitive applications.

* To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the first purpose for development of the criteria, i.e.,
providing users with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be performed on a computer
product from a perspective that excludes the application environment; or, (b)
it can be done to assess whether appropriate security measures have been taken
to permit the system to be used operationally in a specific environment. The
former type of evaluation is done by the Computer Security Center through the
Commercial Product Evaluation Process. That process is described in Appendix
A.

The latter type of evaluation, i.e., one done for the purpose of assessing a
system’s security attributes with respect to a specific operational mission,
is known as a certification evaluation. It must be understood that the
completion of a formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific application
environment. On the contrary, the evaluation report only provides a trusted
computer system’s evaluation rating along with supporting data describing the
product system’s strengths and weaknesses from a computer security point of
view. The system security certification and the formal approval/accreditation
procedure, done in accordance with the applicable policies of the issuing
agencies, must still be followed before a system can be approved for use in
processing or handling classified information.[8;9]

The trusted computer system evaluation criteria will be used directly and
indirectly in the certification process. Along with applicable policy, they
will be used directly as the basis for evaluation of the total system and for
specifying system security and certification requirements for new
acquisitions. Where a system being evaluated for certification employs a
product that has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation. Technical data
will be furnished to designers, evaluators and the Designated Approving
Authorities to support their needs for making decisions.

Fundamental Computer Security Requirements

Any discussion of computer security necessarily starts from a statement of
requirements, i.e., what it really means to call a computer system “secure.”
In general, secure systems will control, through use of specific security
features, access to information such that only properly authorized
individuals, or processes operating on their behalf, will have access to read,
write, create, or delete information. Six fundamental requirements are
derived from this basic statement of objective: four deal with what needs to
be provided to control access to information; and two deal with how one can
obtain credible assurances that this is accomplished in a trusted computer
system.

POLICY

Requirement 1 – SECURITY POLICY – There must be an explicit and well-defined
security policy enforced by the system. Given identified subjects and
objects, there must be a set of rules that are used by the system to determine
whether a given subject can be permitted to gain access to a specific object.
Computer systems of interest must enforce a mandatory security policy that can
effectively implement access rules for handling sensitive (e.g., classified)
information.[7] These rules include requirements such as: No person lacking
proper personnel security clearance shall obtain access to classified
information. In addition, discretionary security controls are required to
ensure that only selected users or groups of users may obtain access to data
(e.g., based on a need-to-know).

Requirement 2 – MARKING – Access control labels must be associated with
objects. In order to control access to information stored in a computer,
according to the rules of a mandatory security policy, it must be possible to
mark every object with a label that reliably identifies the object’s
sensitivity level (e.g., classification), and/or the modes of access accorded
those subjects who may potentially access the object.

ACCOUNTABILITY

Requirement 3 – IDENTIFICATION – Individual subjects must be identified. Each
access to information must be mediated based on who is accessing the
information and what classes of information they are authorized to deal with.
This identification and authorization information must be securely maintained
by the computer system and be associated with every active element that
performs some security-relevant action in the system.

Requirement 4 – ACCOUNTABILITY – Audit information must be selectively kept
and protected so that actions affecting security can be traced to the
responsible party. A trusted system must be able to record the occurrences of
security-relevant events in an audit log. The capability to select the audit
events to be recorded is necessary to minimize the expense of auditing and to
allow efficient analysis. Audit data must be protected from modification and
unauthorized destruction to permit detection and after-the-fact investigations
of security violations.

ASSURANCE

Requirement 5 – ASSURANCE – The computer system must contain hardware/software
mechanisms that can be independently evaluated to provide sufficient assurance
that the system enforces requirements 1 through 4 above. In order to assure
that the four requirements of Security Policy, Marking, Identification, and
Accountability are enforced by a computer system, there must be some
identified and unified collection of hardware and software controls that
perform those functions. These mechanisms are typically embedded in the
operating system and are designed to carry out the assigned tasks in a secure
manner. The basis for trusting such system mechanisms in their operational
setting must be clearly documented such that it is possible to independently
examine the evidence to evaluate their sufficiency.

Requirement 6 – CONTINUOUS PROTECTION – The trusted mechanisms that enforce
these basic requirements must be continuously protected against tampering
and/or unauthorized changes. No computer system can be considered truly
secure if the basic hardware and software mechanisms that enforce the security
policy are themselves subject to unauthorized modification or subversion. The
continuous protection requirement has direct implications throughout the
computer system’s life-cycle.

These fundamental requirements form the basis for the individual evaluation
criteria applicable for each evaluation division and class. The interested
reader is referred to Section 5 of this document, “Control Objectives for
Trusted Computer Systems,” for a more complete discussion and further
amplification of these fundamental requirements as they apply to
general-purpose information processing systems and to Section 7 for
amplification of the relationship between Policy and these requirements.

Structure of the Document

The remainder of this document is divided into two parts, four appendices, and
a glossary. Part I (Sections 1 through 4) presents the detailed criteria
derived from the fundamental requirements described above and relevant to the
rationale and policy excerpts contained in Part II.

Part II (Sections 5 through 10) provides a discussion of basic objectives,
rationale, and national policy behind the development of the criteria, and
guidelines for developers pertaining to: mandatory access control rules
implementation, the covert channel problem, and security testing. It is
divided into six sections. Section 5 discusses the use of control objectives
in general and presents the three basic control objectives of the criteria.
Section 6 provides the theoretical basis behind the criteria. Section 7 gives
excerpts from pertinent regulations, directives, OMB Circulars, and Executive
Orders which provide the basis for many trust requirements for processing
nationally sensitive and classified information with computer systems.
Section 8 provides guidance to system developers on expectations in dealing
with the covert channel problem. Section 9 provides guidelines dealing with
mandatory security. Section 10 provides guidelines for security testing.
There are four appendices, including a description of the Trusted Computer
System Commercial Products Evaluation Process (Appendix A), summaries of the
evaluation divisions (Appendix B) and classes (Appendix C), and finally a
directory of requirements ordered alphabetically. In addition, there is a
glossary.

Structure of the Criteria

The criteria are divided into four divisions: D, C, B, and A ordered in a
hierarchical manner with the highest division (A) being reserved for systems
providing the most comprehensive security. Each division represents a major
improvement in the overall confidence one can place in the system for the
protection of sensitive information. Within divisions C and B there are a
number of subdivisions known as classes. The classes are also ordered in a
hierarchical manner with systems representative of division C and lower
classes of division B being characterized by the set of computer security
mechanisms that they possess. Assurance of correct and complete design and
implementation for these systems is gained mostly through testing of the
security-relevant portions of the system. The security-relevant portions of
a system are referred to throughout this document as the Trusted Computing
Base (TCB). Systems representative of higher classes in division B and
division A derive their security attributes more from their design and
implementation structure. Increased assurance that the required features are
operative, correct, and tamperproof under all circumstances is gained through
progressively more rigorous analysis during the design process.

Within each class, four major sets of criteria are addressed. The first three
represent features necessary to satisfy the broad control objectives of
Security Policy, Accountability, and Assurance that are discussed in Part II,
Section 5. The fourth set, Documentation, describes the type of written
evidence in the form of user guides, manuals, and the test and design
documentation required for each class.

A reader using this publication for the first time may find it helpful to
read Part II first before continuing with Part I.

PART I: THE CRITERIA

Highlighting (UPPERCASE) is used in Part I to indicate criteria not contained
in a lower class or changes and additions to already defined criteria. Where
there is no highlighting, requirements have been carried over from lower
classes without addition or modification.

1.0 DIVISION D: MINIMAL PROTECTION

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

2.0 DIVISION C: DISCRETIONARY PROTECTION

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

2.1 CLASS (C1): DISCRETIONARY SECURITY PROTECTION

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity. The following are minimal
requirements for systems assigned a class (C1) rating:

2.1.1 SECURITY POLICY

2.1.1.1 Discretionary Access Control

THE TCB SHALL DEFINE AND CONTROL ACCESS BETWEEN NAMED USERS AND
NAMED OBJECTS (E.G., FILES AND PROGRAMS) IN THE ADP SYSTEM. THE
ENFORCEMENT MECHANISM (E.G., SELF/GROUP/PUBLIC CONTROLS, ACCESS
CONTROL LISTS) SHALL ALLOW USERS TO SPECIFY AND CONTROL SHARING
OF THOSE OBJECTS BY NAMED INDIVIDUALS OR DEFINED GROUPS OR BOTH.
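
As an informal illustration only (the criteria do not prescribe any
particular mechanism or data structure), such an enforcement mechanism might
be realized as an access control list keyed by named users and defined
groups; the following Python sketch uses hypothetical names throughout.

    # Illustrative sketch of a discretionary access check using an access
    # control list keyed by named users and defined groups.

    class AccessControlList:
        def __init__(self):
            self.user_modes = {}    # named user  -> set of modes, e.g. {"read"}
            self.group_modes = {}   # named group -> set of modes

        def grant_user(self, user, modes):
            self.user_modes.setdefault(user, set()).update(modes)

        def grant_group(self, group, modes):
            self.group_modes.setdefault(group, set()).update(modes)

        def permits(self, user, user_groups, mode):
            # Access is allowed if the mode is granted to the named individual
            # or to any defined group that individual belongs to.
            if mode in self.user_modes.get(user, set()):
                return True
            return any(mode in self.group_modes.get(g, set()) for g in user_groups)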

2.1.2 ACCOUNTABILITY

2.1.2.1 Identification and Authentication

THE TCB SHALL REQUIRE USERS TO IDENTIFY THEMSELVES TO IT BEFORE
BEGINNING TO PERFORM ANY OTHER ACTIONS THAT THE TCB IS EXPECTED
TO MEDIATE. FURTHERMORE, THE TCB SHALL USE A PROTECTED
MECHANISM (E.G., PASSWORDS) TO AUTHENTICATE THE USER’S IDENTITY.
THE TCB SHALL PROTECT AUTHENTICATION DATA SO THAT IT CANNOT BE
ACCESSED BY ANY UNAUTHORIZED USER.
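
As an informal illustration of one protected mechanism (the criteria do not
mandate any particular technique), authentication data might be held only in
salted, hashed form inside the TCB and compared as sketched below in Python;
the function names are hypothetical.

    # Hypothetical sketch: passwords are stored as salted hashes so that the
    # authentication data, even if disclosed, does not reveal the passwords.
    import hashlib
    import hmac
    import os

    def make_entry(password):
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + password.encode()).hexdigest()
        return salt.hex(), digest

    def authenticate(stored_salt, stored_digest, attempt):
        digest = hashlib.sha256(
            bytes.fromhex(stored_salt) + attempt.encode()).hexdigest()
        # A constant-time comparison avoids leaking information through timing.
        return hmac.compare_digest(digest, stored_digest)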

2.1.3 ASSURANCE

2.1.3.1 Operational Assurance

2.1.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
RESOURCES CONTROLLED BY THE TCB MAY BE A DEFINED SUBSET
OF THE SUBJECTS AND OBJECTS IN THE ADP SYSTEM.

2.1.3.1.2 System Integrity

HARDWARE AND/OR SOFTWARE FEATURES SHALL BE PROVIDED THAT
CAN BE USED TO PERIODICALLY VALIDATE THE CORRECT OPERATION
OF THE ON-SITE HARDWARE AND FIRMWARE ELEMENTS OF THE TCB.

2.1.3.2 Life-Cycle Assurance

2.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
TESTING SHALL BE DONE TO ASSURE THAT THERE ARE NO OBVIOUS
WAYS FOR AN UNAUTHORIZED USER TO BYPASS OR OTHERWISE
DEFEAT THE SECURITY PROTECTION MECHANISMS OF THE TCB.
(SEE THE SECURITY TESTING GUIDELINES.)

2.1.4 DOCUMENTATION

2.1.4.1 Security Features User’s Guide

A SINGLE SUMMARY, CHAPTER, OR MANUAL IN USER DOCUMENTATION
SHALL DESCRIBE THE PROTECTION MECHANISMS PROVIDED BY THE TCB,
GUIDELINES ON THEIR USE, AND HOW THEY INTERACT WITH ONE ANOTHER.

2.1.4.2 Trusted Facility Manual

A MANUAL ADDRESSED TO THE ADP SYSTEM ADMINISTRATOR SHALL
PRESENT CAUTIONS ABOUT FUNCTIONS AND PRIVILEGES THAT SHOULD BE
CONTROLLED WHEN RUNNING A SECURE FACILITY.

2.1.4.3 Test Documentation

THE SYSTEM DEVELOPER SHALL PROVIDE TO THE EVALUATORS A DOCUMENT
THAT DESCRIBES THE TEST PLAN AND RESULTS OF THE SECURITY
MECHANISMS’ FUNCTIONAL TESTING.

2.1.4.4 Design Documentation

DOCUMENTATION SHALL BE AVAILABLE THAT PROVIDES A DESCRIPTION OF
THE MANUFACTURER’S PHILOSOPHY OF PROTECTION AND AN EXPLANATION
OF HOW THIS PHILOSOPHY IS TRANSLATED INTO THE TCB. IF THE TCB
IS COMPOSED OF DISTINCT MODULES, THE INTERFACES BETWEEN THESE
MODULES SHALL BE DESCRIBED.

2.2 CLASS (C2): CONTROLLED ACCESS PROTECTION

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation. The following are minimal requirements for systems
assigned a class (C2) rating:

2.2.1 SECURITY POLICY

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing
of those objects by named individuals, or defined groups OF
INDIVIDUALS, or by both. THE DISCRETIONARY ACCESS CONTROL
MECHANISM SHALL, EITHER BY EXPLICIT USER ACTION OR BY DEFAULT,
PROVIDE THAT OBJECTS ARE PROTECTED FROM UNAUTHORIZED ACCESS.
THESE ACCESS CONTROLS SHALL BE CAPABLE OF INCLUDING OR EXCLUDING
ACCESS TO THE GRANULARITY OF A SINGLE USER. ACCESS PERMISSION
TO AN OBJECT BY USERS NOT ALREADY POSSESSING ACCESS PERMISSION
SHALL ONLY BE ASSIGNED BY AUTHORIZED USERS.

2.2.1.2 Object Reuse

WHEN A STORAGE OBJECT IS INITIALLY ASSIGNED, ALLOCATED, OR
REALLOCATED TO A SUBJECT FROM THE TCB’S POOL OF UNUSED STORAGE
OBJECTS, THE TCB SHALL ASSURE THAT THE OBJECT CONTAINS NO DATA
FOR WHICH THE SUBJECT IS NOT AUTHORIZED.
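
One common way of meeting this requirement, shown here only as a hypothetical
Python sketch, is for the TCB to scrub a storage object's previous contents
before it is handed to a new subject.

    # Hypothetical sketch: the allocator zeroes each storage object before it
    # is reassigned, so no residual data reaches the newly authorized subject.

    class StoragePool:
        def __init__(self, object_size, count):
            self.free = [bytearray(object_size) for _ in range(count)]

        def allocate(self):
            obj = self.free.pop()
            obj[:] = bytes(len(obj))   # scrub any residue from prior use
            return obj

        def release(self, obj):
            self.free.append(obj)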

2.2.2 ACCOUNTABILITY

2.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall use a protected
mechanism (e.g., passwords) to authenticate the user’s identity.
The TCB shall protect authentication data so that it cannot be
accessed by any unauthorized user. THE TCB SHALL BE ABLE TO
ENFORCE INDIVIDUAL ACCOUNTABILITY BY PROVIDING THE CAPABILITY TO
UNIQUELY IDENTIFY EACH INDIVIDUAL ADP SYSTEM USER. THE TCB
SHALL ALSO PROVIDE THE CAPABILITY OF ASSOCIATING THIS IDENTITY
WITH ALL AUDITABLE ACTIONS TAKEN BY THAT INDIVIDUAL.

2.2.2.2 Audit

THE TCB SHALL BE ABLE TO CREATE, MAINTAIN, AND PROTECT FROM
MODIFICATION OR UNAUTHORIZED ACCESS OR DESTRUCTION AN AUDIT
TRAIL OF ACCESSES TO THE OBJECTS IT PROTECTS. THE AUDIT DATA
SHALL BE PROTECTED BY THE TCB SO THAT READ ACCESS TO IT IS
LIMITED TO THOSE WHO ARE AUTHORIZED FOR AUDIT DATA. THE TCB
SHALL BE ABLE TO RECORD THE FOLLOWING TYPES OF EVENTS: USE OF
IDENTIFICATION AND AUTHENTICATION MECHANISMS, INTRODUCTION OF
OBJECTS INTO A USER’S ADDRESS SPACE (E.G., FILE OPEN, PROGRAM
INITIATION), DELETION OF OBJECTS, AND ACTIONS TAKEN BY
COMPUTER OPERATORS AND SYSTEM ADMINISTRATORS AND/OR SYSTEM
SECURITY OFFICERS. FOR EACH RECORDED EVENT, THE AUDIT RECORD
SHALL IDENTIFY: DATE AND TIME OF THE EVENT, USER, TYPE OF
EVENT, AND SUCCESS OR FAILURE OF THE EVENT. FOR
IDENTIFICATION/AUTHENTICATION EVENTS THE ORIGIN OF REQUEST
(E.G., TERMINAL ID) SHALL BE INCLUDED IN THE AUDIT RECORD. FOR
EVENTS THAT INTRODUCE AN OBJECT INTO A USER’S ADDRESS SPACE AND
FOR OBJECT DELETION EVENTS THE AUDIT RECORD SHALL INCLUDE THE
NAME OF THE OBJECT. THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE
TO SELECTIVELY AUDIT THE ACTIONS OF ANY ONE OR MORE USERS BASED
ON INDIVIDUAL IDENTITY.
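
The record contents required above might be captured by a structure such as
the following Python sketch; field and function names are illustrative only
and are not drawn from the criteria.

    # Hypothetical audit record carrying the fields required for every
    # recorded event at class (C2).
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass(frozen=True)
    class AuditRecord:
        timestamp: datetime                # date and time of the event
        user: str                          # individual user identity
        event_type: str                    # e.g. "login", "file_open", "object_delete"
        success: bool                      # success or failure of the event
        origin: Optional[str] = None       # e.g. terminal ID, for I&A events
        object_name: Optional[str] = None  # for object introduction or deletion

    def record_login(audit_trail, user, terminal, succeeded):
        audit_trail.append(AuditRecord(datetime.now(timezone.utc), user,
                                       "login", succeeded, origin=terminal))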

2.2.3 ASSURANCE

2.2.3.1 Operational Assurance

2.2.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL ISOLATE THE RESOURCES TO BE PROTECTED SO THAT THEY
ARE SUBJECT TO THE ACCESS CONTROL AND AUDITING
REQUIREMENTS.

2.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

2.2.3.2 Life-Cycle Assurance

2.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
Testing shall be done to assure that there are no obvious
ways for an unauthorized user to bypass or otherwise
defeat the security protection mechanisms of the TCB.
TESTING SHALL ALSO INCLUDE A SEARCH FOR OBVIOUS FLAWS THAT
WOULD ALLOW VIOLATION OF RESOURCE ISOLATION, OR THAT WOULD
PERMIT UNAUTHORIZED ACCESS TO THE AUDIT OR AUTHENTICATION
DATA. (See the Security Testing guidelines.)

2.2.4 DOCUMENTATION

2.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

2.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. THE PROCEDURES FOR
EXAMINING AND MAINTAINING THE AUDIT FILES AS WELL AS THE
DETAILED AUDIT RECORD STRUCTURE FOR EACH TYPE OF AUDIT EVENT
SHALL BE GIVEN.

2.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

2.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described.

3.0 DIVISION B: MANDATORY PROTECTION

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

3.1 CLASS (B1): LABELED SECURITY PROTECTION

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed. The following are minimal
requirements for systems assigned a class (B1) rating:

3.1.1 SECURITY POLICY

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups
of individuals, or by both. The discretionary access control
mechanism shall, either by explicit user action or by default,
provide that objects are protected from unauthorized access.
These access controls shall be capable of including or excluding
access to the granularity of a single user. Access permission
to an object by users not already possessing access permission
shall only be assigned by authorized users.

3.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.1.1.3 Labels

SENSITIVITY LABELS ASSOCIATED WITH EACH SUBJECT AND STORAGE
OBJECT UNDER ITS CONTROL (E.G., PROCESS, FILE, SEGMENT, DEVICE)
SHALL BE MAINTAINED BY THE TCB. THESE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. IN ORDER TO
IMPORT NON-LABELED DATA, THE TCB SHALL REQUEST AND RECEIVE FROM
AN AUTHORIZED USER THE SECURITY LEVEL OF THE DATA, AND ALL SUCH
ACTIONS SHALL BE AUDITABLE BY THE TCB.

3.1.1.3.1 Label Integrity

SENSITIVITY LABELS SHALL ACCURATELY REPRESENT SECURITY
LEVELS OF THE SPECIFIC SUBJECTS OR OBJECTS WITH WHICH THEY
ARE ASSOCIATED. WHEN EXPORTED BY THE TCB, SENSITIVITY
LABELS SHALL ACCURATELY AND UNAMBIGUOUSLY REPRESENT THE
INTERNAL LABELS AND SHALL BE ASSOCIATED WITH THE
INFORMATION BEING EXPORTED.

3.1.1.3.2 Exportation of Labeled Information

THE TCB SHALL DESIGNATE EACH COMMUNICATION CHANNEL AND
I/O DEVICE AS EITHER SINGLE-LEVEL OR MULTILEVEL. ANY
CHANGE IN THIS DESIGNATION SHALL BE DONE MANUALLY AND
SHALL BE AUDITABLE BY THE TCB. THE TCB SHALL MAINTAIN
AND BE ABLE TO AUDIT ANY CHANGE IN THE CURRENT SECURITY
LEVEL ASSOCIATED WITH A SINGLE-LEVEL COMMUNICATION
CHANNEL OR I/O DEVICE.
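
Purely as an illustration (the names below are hypothetical), the designation
of each channel or device, together with the auditing of any manual change to
it, might be kept as follows.

    # Hypothetical sketch: every communication channel or I/O device carries a
    # designation of single-level or multilevel, and each manual change to that
    # designation is written to the audit trail.

    class DeviceTable:
        def __init__(self, audit_trail):
            self.designation = {}   # device name -> "single-level" | "multilevel"
            self.audit_trail = audit_trail

        def set_designation(self, administrator, device, new_designation):
            old = self.designation.get(device)
            self.designation[device] = new_designation
            # The change in designation is recorded so it is auditable.
            self.audit_trail.append((administrator, device, old, new_designation))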

3.1.1.3.2.1 Exportation to Multilevel Devices

WHEN THE TCB EXPORTS AN OBJECT TO A MULTILEVEL I/O
DEVICE, THE SENSITIVITY LABEL ASSOCIATED WITH THAT
OBJECT SHALL ALSO BE EXPORTED AND SHALL RESIDE ON
THE SAME PHYSICAL MEDIUM AS THE EXPORTED
INFORMATION AND SHALL BE IN THE SAME FORM
(I.E., MACHINE-READABLE OR HUMAN-READABLE FORM).
WHEN THE TCB EXPORTS OR IMPORTS AN OBJECT OVER A
MULTILEVEL COMMUNICATION CHANNEL, THE PROTOCOL
USED ON THAT CHANNEL SHALL PROVIDE FOR THE
UNAMBIGUOUS PAIRING BETWEEN THE SENSITIVITY LABELS
AND THE ASSOCIATED INFORMATION THAT IS SENT OR
RECEIVED.

3.1.1.3.2.2 Exportation to Single-Level Devices

SINGLE-LEVEL I/O DEVICES AND SINGLE-LEVEL
COMMUNICATION CHANNELS ARE NOT REQUIRED TO
MAINTAIN THE SENSITIVITY LABELS OF THE INFORMATION
THEY PROCESS. HOWEVER, THE TCB SHALL INCLUDE A
MECHANISM BY WHICH THE TCB AND AN AUTHORIZED USER
RELIABLY COMMUNICATE TO DESIGNATE THE SINGLE
SECURITY LEVEL OF INFORMATION IMPORTED OR EXPORTED
VIA SINGLE-LEVEL COMMUNICATION CHANNELS OR I/O
DEVICES.

3.1.1.3.2.3 Labeling Human-Readable Output

THE ADP SYSTEM ADMINISTRATOR SHALL BE ABLE TO
SPECIFY THE PRINTABLE LABEL NAMES ASSOCIATED WITH
EXPORTED SENSITIVITY LABELS. THE TCB SHALL MARK
THE BEGINNING AND END OF ALL HUMAN-READABLE, PAGED,
HARDCOPY OUTPUT (E.G., LINE PRINTER OUTPUT) WITH
HUMAN-READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. THE TCB
SHALL, BY DEFAULT, MARK THE TOP AND BOTTOM OF EACH
PAGE OF HUMAN-READABLE, PAGED, HARDCOPY OUTPUT
(E.G., LINE PRINTER OUTPUT) WITH HUMAN-READABLE
SENSITIVITY LABELS THAT PROPERLY* REPRESENT THE
OVERALL SENSITIVITY OF THE OUTPUT OR THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE INFORMATION ON THE
PAGE. THE TCB SHALL, BY DEFAULT AND IN AN
APPROPRIATE MANNER, MARK OTHER FORMS OF HUMAN-
READABLE OUTPUT (E.G., MAPS, GRAPHICS) WITH HUMAN-
READABLE SENSITIVITY LABELS THAT PROPERLY*
REPRESENT THE SENSITIVITY OF THE OUTPUT. ANY
OVERRIDE OF THESE MARKING DEFAULTS SHALL BE
AUDITABLE BY THE TCB.

_____________________________________________________________
* THE HIERARCHICAL CLASSIFICATION COMPONENT IN HUMAN-READABLE
SENSITIVITY LABELS SHALL BE EQUAL TO THE GREATEST
HIERARCHICAL CLASSIFICATION OF ANY OF THE INFORMATION IN THE
OUTPUT THAT THE LABELS REFER TO; THE NON-HIERARCHICAL
CATEGORY COMPONENT SHALL INCLUDE ALL OF THE NON-HIERARCHICAL
CATEGORIES OF THE INFORMATION IN THE OUTPUT THE LABELS REFER
TO, BUT NO OTHER NON-HIERARCHICAL CATEGORIES.
_____________________________________________________________
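
The rule in the footnote amounts to a simple computation over the labels of
the information that appears in the output: take the greatest hierarchical
classification present and exactly the union of the non-hierarchical
categories present. A hypothetical Python sketch:

    # Hypothetical sketch of the footnoted rule: the printed label carries the
    # greatest hierarchical classification of the labeled information and the
    # union of its non-hierarchical categories, and no others.

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def output_label(content_labels):
        # content_labels: non-empty list of (classification, set_of_categories)
        classification = max((c for c, _ in content_labels), key=LEVELS.get)
        categories = set().union(*(cats for _, cats in content_labels))
        return classification, categories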

3.1.1.4 Mandatory Access Control

THE TCB SHALL ENFORCE A MANDATORY ACCESS CONTROL POLICY OVER
ALL SUBJECTS AND STORAGE OBJECTS UNDER ITS CONTROL (E.G.,
PROCESSES, FILES, SEGMENTS, DEVICES). THESE SUBJECTS AND
OBJECTS SHALL BE ASSIGNED SENSITIVITY LABELS THAT ARE A
COMBINATION OF HIERARCHICAL CLASSIFICATION LEVELS AND
NON-HIERARCHICAL CATEGORIES, AND THE LABELS SHALL BE USED AS
THE BASIS FOR MANDATORY ACCESS CONTROL DECISIONS. THE TCB
SHALL BE ABLE TO SUPPORT TWO OR MORE SUCH SECURITY LEVELS.
(SEE THE MANDATORY ACCESS CONTROL GUIDELINES.) THE FOLLOWING
REQUIREMENTS SHALL HOLD FOR ALL ACCESSES BETWEEN SUBJECTS AND
OBJECTS CONTROLLED BY THE TCB: A SUBJECT CAN READ AN OBJECT
ONLY IF THE HIERARCHICAL CLASSIFICATION IN THE SUBJECT’S
SECURITY LEVEL IS GREATER THAN OR EQUAL TO THE HIERARCHICAL
CLASSIFICATION IN THE OBJECT’S SECURITY LEVEL AND THE NON-
HIERARCHICAL CATEGORIES IN THE SUBJECT’S SECURITY LEVEL INCLUDE
ALL THE NON-HIERARCHICAL CATEGORIES IN THE OBJECT’S SECURITY
LEVEL. A SUBJECT CAN WRITE AN OBJECT ONLY IF THE HIERARCHICAL
CLASSIFICATION IN THE SUBJECT’S SECURITY LEVEL IS LESS THAN OR
EQUAL TO THE HIERARCHICAL CLASSIFICATION IN THE OBJECT’S
SECURITY LEVEL AND ALL THE NON-HIERARCHICAL CATEGORIES IN THE
SUBJECT’S SECURITY LEVEL ARE INCLUDED IN THE NON-HIERARCHICAL
CATEGORIES IN THE OBJECT’S SECURITY LEVEL.
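
The two access rules above are the familiar dominance relation over security
levels: one level dominates another when its hierarchical classification is
at least as high and its non-hierarchical categories include the other's. A
hypothetical Python sketch (the names are illustrative, not part of the
criteria):

    # Hypothetical sketch of the mandatory access rules stated above.
    # A security level is a (classification, set_of_categories) pair.

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(level_a, level_b):
        class_a, cats_a = level_a
        class_b, cats_b = level_b
        return LEVELS[class_a] >= LEVELS[class_b] and cats_a >= cats_b

    def may_read(subject_level, object_level):
        # Read is permitted only if the subject's level dominates the object's.
        return dominates(subject_level, object_level)

    def may_write(subject_level, object_level):
        # Write is permitted only if the object's level dominates the subject's.
        return dominates(object_level, subject_level)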

3.1.2 ACCOUNTABILITY

3.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall MAINTAIN AUTHENTICATION
DATA THAT INCLUDES INFORMATION FOR VERIFYING THE IDENTITY OF
INDIVIDUAL USERS (E.G., PASSWORDS) AS WELL AS INFORMATION FOR
DETERMINING THE CLEARANCE AND AUTHORIZATIONS OF INDIVIDUAL
USERS. THIS DATA SHALL BE USED BY THE TCB TO AUTHENTICATE the
user’s identity AND TO DETERMINE THE SECURITY LEVEL AND
AUTHORIZATIONS OF SUBJECTS THAT MAY BE CREATED TO ACT ON BEHALF
OF THE INDIVIDUAL USER. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. THE TCB SHALL ALSO BE ABLE TO AUDIT ANY OVERRIDE OF
HUMAN-READABLE OUTPUT MARKINGS. FOR each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address space
and for object deletion events the audit record shall include
the name of the object AND THE OBJECT’S SECURITY LEVEL. The
ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
AND/OR OBJECT SECURITY LEVEL.

3.1.3 ASSURANCE

3.1.3.1 Operational Assurance

3.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
Resources controlled by the TCB may be a defined subset
of the subjects and objects in the ADP system. THE TCB
SHALL MAINTAIN PROCESS ISOLATION THROUGH THE PROVISION OF
DISTINCT ADDRESS SPACES UNDER ITS CONTROL. The TCB shall
isolate the resources to be protected so that they are
subject to the access control and auditing requirements.

3.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct operation
of the on-site hardware and firmware elements of the TCB.

3.1.3.2 Life-Cycle Assurance

3.1.3.2.1 Security Testing

THE SECURITY MECHANISMS OF THE ADP SYSTEM SHALL BE TESTED
AND FOUND TO WORK AS CLAIMED IN THE SYSTEM DOCUMENTATION.
A TEAM OF INDIVIDUALS WHO THOROUGHLY UNDERSTAND THE
SPECIFIC IMPLEMENTATION OF THE TCB SHALL SUBJECT ITS
DESIGN DOCUMENTATION, SOURCE CODE, AND OBJECT CODE TO
THOROUGH ANALYSIS AND TESTING. THEIR OBJECTIVES SHALL BE:
TO UNCOVER ALL DESIGN AND IMPLEMENTATION FLAWS THAT WOULD
PERMIT A SUBJECT EXTERNAL TO THE TCB TO READ, CHANGE, OR
DELETE DATA NORMALLY DENIED UNDER THE MANDATORY OR
DISCRETIONARY SECURITY POLICY ENFORCED BY THE TCB; AS WELL
AS TO ASSURE THAT NO SUBJECT (WITHOUT AUTHORIZATION TO DO
SO) IS ABLE TO CAUSE THE TCB TO ENTER A STATE SUCH THAT
IT IS UNABLE TO RESPOND TO COMMUNICATIONS INITIATED BY
OTHER USERS. ALL DISCOVERED FLAWS SHALL BE REMOVED OR
NEUTRALIZED AND THE TCB RETESTED TO DEMONSTRATE THAT THEY
HAVE BEEN ELIMINATED AND THAT NEW FLAWS HAVE NOT BEEN
INTRODUCED. (SEE THE SECURITY TESTING GUIDELINES.)

3.1.3.2.2 Design Specification and Verification

AN INFORMAL OR FORMAL MODEL OF THE SECURITY POLICY
SUPPORTED BY THE TCB SHALL BE MAINTAINED THAT IS SHOWN TO
BE CONSISTENT WITH ITS AXIOMS.

3.1.4 DOCUMENTATION

3.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. THE MANUAL SHALL DESCRIBE THE OPERATOR AND
ADMINISTRATOR FUNCTIONS RELATED TO SECURITY, TO INCLUDE CHANGING
THE SECURITY CHARACTERISTICS OF A USER. IT SHALL PROVIDE
GUIDELINES ON THE CONSISTENT AND EFFECTIVE USE OF THE PROTECTION
FEATURES OF THE SYSTEM, HOW THEY INTERACT, HOW TO SECURELY
GENERATE A NEW TCB, AND FACILITY PROCEDURES, WARNINGS, AND
PRIVILEGES THAT NEED TO BE CONTROLLED IN ORDER TO OPERATE THE
FACILITY IN A SECURE MANNER.

3.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing.

3.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. If the TCB
is composed of distinct modules, the interfaces between these
modules shall be described. AN INFORMAL OR FORMAL DESCRIPTION
OF THE SECURITY POLICY MODEL ENFORCED BY THE TCB SHALL BE
AVAILABLE AND AN EXPLANATION PROVIDED TO SHOW THAT IT IS
SUFFICIENT TO ENFORCE THE SECURITY POLICY. THE SPECIFIC TCB
PROTECTION MECHANISMS SHALL BE IDENTIFIED AND AN EXPLANATION
GIVEN TO SHOW THAT THEY SATISFY THE MODEL.

3.2 CLASS (B2): STRUCTURED PROTECTION

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration. The
following are minimal requirements for systems assigned a class (B2) rating:

3.2.1 SECURITY POLICY

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined
groups of individuals, or by both. The discretionary access
control mechanism shall, either by explicit user action or by
default, provide that objects are protected from unauthorized
access. These access controls shall be capable of including
or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing
access permission shall only be assigned by authorized users.

3.2.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.2.1.3 Labels

Sensitivity labels associated with each ADP SYSTEM RESOURCE
(E.G., SUBJECT, STORAGE OBJECT) THAT IS DIRECTLY OR INDIRECTLY
ACCESSIBLE BY SUBJECTS EXTERNAL TO THE TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.2.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.2.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.2.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the
information they process. However, the TCB shall
include a mechanism by which the TCB and an
authorized user reliably communicate to designate
the single security level of information imported
or exported via single-level communication
channels or I/O devices.

3.2.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.
_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________

3.2.1.3.3 Subject Sensitivity Labels

THE TCB SHALL IMMEDIATELY NOTIFY A TERMINAL USER OF EACH
CHANGE IN THE SECURITY LEVEL ASSOCIATED WITH THAT USER
DURING AN INTERACTIVE SESSION. A TERMINAL USER SHALL BE
ABLE TO QUERY THE TCB AS DESIRED FOR A DISPLAY OF THE
SUBJECT’S COMPLETE SENSITIVITY LABEL.

3.2.1.3.4 Device Labels

THE TCB SHALL SUPPORT THE ASSIGNMENT OF MINIMUM AND
MAXIMUM SECURITY LEVELS TO ALL ATTACHED PHYSICAL DEVICES.
THESE SECURITY LEVELS SHALL BE USED BY THE TCB TO ENFORCE
CONSTRAINTS IMPOSED BY THE PHYSICAL ENVIRONMENTS IN WHICH
THE DEVICES ARE LOCATED.
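
As an illustration only (names hypothetical), the assigned minimum and
maximum levels define a range against which the TCB can check any security
level associated with information sent to or received from the device.

    # Hypothetical sketch: information may be associated with an attached
    # device only if its security level lies within the device's assigned
    # minimum-to-maximum range.

    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        # a and b are (classification, set_of_categories) pairs
        return LEVELS[a[0]] >= LEVELS[b[0]] and a[1] >= b[1]

    def within_device_range(level, device_min, device_max):
        return dominates(level, device_min) and dominates(device_max, level)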

3.2.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all RESOURCES (I.E., SUBJECTS, STORAGE OBJECTS, AND I/O DEVICES)
THAT ARE DIRECTLY OR INDIRECTLY ACCESSIBLE BY SUBJECTS EXTERNAL
TO THE TCB. These subjects and objects shall be assigned
sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the
labels shall be used as the basis for mandatory access control
decisions. The TCB shall be able to support two or more such
security levels. (See the Mandatory Access Control guidelines.)
The following requirements shall hold for all accesses between
ALL SUBJECTS EXTERNAL TO THE TCB AND ALL OBJECTS DIRECTLY OR
INDIRECTLY ACCESSIBLE BY THESE SUBJECTS: A subject can read an
object only if the hierarchical classification in the subject’s
security level is greater than or equal to the hierarchical
classification in the object’s security level and the non-
hierarchical categories in the subject’s security level include
all the non-hierarchical categories in the object’s security
level. A subject can write an object only if the hierarchical
classification in the subject’s security level is less than or
equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

3.2.2 ACCOUNTABILITY

3.2.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.2.2.1.1 Trusted Path

THE TCB SHALL SUPPORT A TRUSTED COMMUNICATION PATH
BETWEEN ITSELF AND USER FOR INITIAL LOGIN AND
AUTHENTICATION. COMMUNICATIONS VIA THIS PATH SHALL BE
INITIATED EXCLUSIVELY BY A USER.

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. THE TCB SHALL BE ABLE TO AUDIT
THE IDENTIFIED EVENTS THAT MAY BE USED IN THE EXPLOITATION OF
COVERT STORAGE CHANNELS.

3.2.3 ASSURANCE

3.2.3.1 Operational Assurance

3.2.3.1.1 System Architecture

THE TCB SHALL MAINTAIN A DOMAIN FOR ITS OWN EXECUTION
THAT PROTECTS IT FROM EXTERNAL INTERFERENCE OR TAMPERING
(E.G., BY MODIFICATION OF ITS CODE OR DATA STRUCTURES).
THE TCB SHALL MAINTAIN PROCESS ISOLATION THROUGH THE
PROVISION OF DISTINCT ADDRESS SPACES UNDER ITS CONTROL.
THE TCB SHALL BE INTERNALLY STRUCTURED INTO WELL-DEFINED
LARGELY INDEPENDENT MODULES. IT SHALL MAKE EFFECTIVE USE
OF AVAILABLE HARDWARE TO SEPARATE THOSE ELEMENTS THAT ARE
PROTECTION-CRITICAL FROM THOSE THAT ARE NOT. THE TCB
MODULES SHALL BE DESIGNED SUCH THAT THE PRINCIPLE OF LEAST
PRIVILEGE IS ENFORCED. FEATURES IN HARDWARE, SUCH AS
SEGMENTATION, SHALL BE USED TO SUPPORT LOGICALLY DISTINCT
STORAGE OBJECTS WITH SEPARATE ATTRIBUTES (NAMELY:
READABLE, WRITEABLE). THE USER INTERFACE TO THE TCB
SHALL BE COMPLETELY DEFINED AND ALL ELEMENTS OF THE TCB
IDENTIFIED.

3.2.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.2.3.1.3 Covert Channel Analysis

THE SYSTEM DEVELOPER SHALL CONDUCT A THOROUGH SEARCH FOR
COVERT STORAGE CHANNELS AND MAKE A DETERMINATION (EITHER
BY ACTUAL MEASUREMENT OR BY ENGINEERING ESTIMATION) OF
THE MAXIMUM BANDWIDTH OF EACH IDENTIFIED CHANNEL. (SEE
THE COVERT CHANNELS GUIDELINE SECTION.)
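
An engineering estimate of a covert storage channel's maximum bandwidth can
be as simple as dividing the number of bits conveyed per use of the shared
attribute by the time needed to set and then sense that attribute once, as in
the following hypothetical sketch.

    # Hypothetical sketch of an engineering estimate of covert storage channel
    # bandwidth: bits conveyed per symbol divided by the time to modulate and
    # then observe the shared attribute once.
    import math

    def estimated_bandwidth(states_per_symbol, set_time_s, sense_time_s):
        bits_per_symbol = math.log2(states_per_symbol)
        return bits_per_symbol / (set_time_s + sense_time_s)   # bits per second

    # Example: a two-state attribute that takes 5 ms to set and 5 ms to sense
    # yields estimated_bandwidth(2, 0.005, 0.005) == 100.0 bits per second.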

3.2.3.1.4 Trusted Facility Management

THE TCB SHALL SUPPORT SEPARATE OPERATOR AND ADMINISTRATOR
FUNCTIONS.

3.2.3.2 Life-Cycle Assurance

3.2.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall be:
to uncover all design and implementation flaws that would
permit a subject external to the TCB to read, change, or
delete data normally denied under the mandatory or
discretionary security policy enforced by the TCB; as well
as to assure that no subject (without authorization to do
so) is able to cause the TCB to enter a state such that it
is unable to respond to communications initiated by other
users. THE TCB SHALL BE FOUND RELATIVELY RESISTANT TO
PENETRATION. All discovered flaws shall be CORRECTED and
the TCB retested to demonstrate that they have been
eliminated and that new flaws have not been introduced.
TESTING SHALL DEMONSTRATE THAT THE TCB IMPLEMENTATION IS
CONSISTENT WITH THE DESCRIPTIVE TOP-LEVEL SPECIFICATION.
(See the Security Testing Guidelines.)

3.2.3.2.2 Design Specification and Verification

A FORMAL model of the security policy supported by the
TCB shall be maintained that is PROVEN consistent with
its axioms. A DESCRIPTIVE TOP-LEVEL SPECIFICATION (DTLS)
OF THE TCB SHALL BE MAINTAINED THAT COMPLETELY AND
ACCURATELY DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR
MESSAGES, AND EFFECTS. IT SHALL BE SHOWN TO BE AN
ACCURATE DESCRIPTION OF THE TCB INTERFACE.

3.2.3.2.3 Configuration Management

DURING DEVELOPMENT AND MAINTENANCE OF THE TCB, A
CONFIGURATION MANAGEMENT SYSTEM SHALL BE IN PLACE THAT
MAINTAINS CONTROL OF CHANGES TO THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION, OTHER DESIGN DATA, IMPLEMENTATION
DOCUMENTATION, SOURCE CODE, THE RUNNING VERSION OF THE
OBJECT CODE, AND TEST FIXTURES AND DOCUMENTATION. THE
CONFIGURATION MANAGEMENT SYSTEM SHALL ASSURE A CONSISTENT
MAPPING AMONG ALL DOCUMENTATION AND CODE ASSOCIATED WITH
THE CURRENT VERSION OF THE TCB. TOOLS SHALL BE PROVIDED
FOR GENERATION OF A NEW VERSION OF THE TCB FROM SOURCE
CODE. ALSO AVAILABLE SHALL BE TOOLS FOR COMPARING A
NEWLY GENERATED VERSION WITH THE PREVIOUS TCB VERSION IN
ORDER TO ASCERTAIN THAT ONLY THE INTENDED CHANGES HAVE
BEEN MADE IN THE CODE THAT WILL ACTUALLY BE USED AS THE
NEW VERSION OF THE TCB.
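
The comparison tools called for above can be approximated, purely as an
illustration, by recording a cryptographic digest of every file under
configuration control in each generated version and reporting the
differences; all names in the Python sketch below are hypothetical.

    # Hypothetical sketch: compare two generated TCB versions by hashing every
    # controlled file and reporting what was added, removed, or changed.
    import hashlib
    import os

    def manifest(root):
        digests = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digests[os.path.relpath(path, root)] = \
                        hashlib.sha256(f.read()).hexdigest()
        return digests

    def compare(old_root, new_root):
        old, new = manifest(old_root), manifest(new_root)
        added   = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
        return added, removed, changed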

3.2.4 DOCUMENTATION

3.2.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.2.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. THE TCB MODULES THAT CONTAIN
THE REFERENCE VALIDATION MECHANISM SHALL BE IDENTIFIED. THE
PROCEDURES FOR SECURE GENERATION OF A NEW TCB FROM SOURCE AFTER
MODIFICATION OF ANY MODULES IN THE TCB SHALL BE DESCRIBED.

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. IT SHALL INCLUDE RESULTS OF
TESTING THE EFFECTIVENESS OF THE METHODS USED TO REDUCE COVERT
CHANNEL BANDWIDTHS.

3.2.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. THE
interfaces between THE TCB modules shall be described. A
FORMAL description of the security policy model enforced by the
TCB shall be available and PROVEN that it is sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. THE DESCRIPTIVE TOP-LEVEL
SPECIFICATION (DTLS) SHALL BE SHOWN TO BE AN ACCURATE
DESCRIPTION OF THE TCB INTERFACE. DOCUMENTATION SHALL DESCRIBE
HOW THE TCB IMPLEMENTS THE REFERENCE MONITOR CONCEPT AND GIVE
AN EXPLANATION WHY IT IS TAMPERPROOF, CANNOT BE BYPASSED, AND
IS CORRECTLY IMPLEMENTED. DOCUMENTATION SHALL DESCRIBE HOW THE
TCB IS STRUCTURED TO FACILITATE TESTING AND TO ENFORCE LEAST
PRIVILEGE. THIS DOCUMENTATION SHALL ALSO PRESENT THE RESULTS
OF THE COVERT CHANNEL ANALYSIS AND THE TRADEOFFS INVOLVED IN
RESTRICTING THE CHANNELS. ALL AUDITABLE EVENTS THAT MAY BE
USED IN THE EXPLOITATION OF KNOWN COVERT STORAGE CHANNELS SHALL
BE IDENTIFIED. THE BANDWIDTHS OF KNOWN COVERT STORAGE CHANNELS,
THE USE OF WHICH IS NOT DETECTABLE BY THE AUDITING MECHANISMS,
SHALL BE PROVIDED. (SEE THE COVERT CHANNEL GUIDELINE SECTION.)

3.3 CLASS (B3): SECURITY DOMAINS

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration. The following are minimal requirements for systems assigned a
class (B3) rating:

3.3.1 SECURITY POLICY

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (E.G., ACCESS CONTROL LISTS) shall
allow users to specify and control sharing of those OBJECTS.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls shall
be capable of SPECIFYING, FOR EACH NAMED OBJECT, A LIST OF NAMED
INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS WITH THEIR
RESPECTIVE MODES OF ACCESS TO THAT OBJECT. FURTHERMORE, FOR
EACH SUCH NAMED OBJECT, IT SHALL BE POSSIBLE TO SPECIFY A LIST
OF NAMED INDIVIDUALS AND A LIST OF GROUPS OF NAMED INDIVIDUALS
FOR WHICH NO ACCESS TO THE OBJECT IS TO BE GIVEN. Access
permission to an object by users not already possessing access
permission shall only be assigned by authorized users.
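
At this class the access control list must carry, for each named object, both
the modes granted to named individuals and to groups of named individuals and
explicit lists of individuals and groups that are to receive no access. One
way to picture this, with hypothetical names, is sketched below in Python.

    # Hypothetical sketch of a class (B3) style access control list: per-user
    # and per-group modes plus explicit deny lists that override any grant.

    class B3AccessControlList:
        def __init__(self):
            self.user_modes = {}      # named individual -> set of access modes
            self.group_modes = {}     # named group      -> set of access modes
            self.denied_users = set()
            self.denied_groups = set()

        def permits(self, user, user_groups, mode):
            if user in self.denied_users or self.denied_groups & set(user_groups):
                return False          # explicit exclusion takes precedence
            if mode in self.user_modes.get(user, set()):
                return True
            return any(mode in self.group_modes.get(g, set()) for g in user_groups)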

3.3.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

3.3.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

3.3.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

3.3.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

3.3.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

3.3.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

_____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
_____________________________________________________________

3.3.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

3.3.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

3.3.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.
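
A minimal sketch of the two access rules just stated, assuming a label is
represented as an integer classification level paired with a set of
non-hierarchical categories (a representation chosen for the example, not
prescribed by the criteria):

    def can_read(subject_label, object_label):
        s_class, s_cats = subject_label
        o_class, o_cats = object_label
        # read: subject's classification dominates the object's and the
        # subject's categories include all of the object's categories
        return s_class >= o_class and o_cats <= s_cats

    def can_write(subject_label, object_label):
        s_class, s_cats = subject_label
        o_class, o_cats = object_label
        # write: object's classification dominates the subject's and the
        # object's categories include all of the subject's categories
        return s_class <= o_class and s_cats <= o_cats

    # e.g. a (2, {"CRYPTO"}) subject may read a (1, set()) object but not
    # write it, and may write a (3, {"CRYPTO", "NATO"}) object but not
    # read it.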

3.3.2 ACCOUNTABILITY

3.3.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

3.3.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and USERS for USE WHEN A POSITIVE TCB-TO-
USER CONNECTION IS REQUIRED (E.G., LOGIN, CHANGE SUBJECT
SECURITY LEVEL). Communications via this TRUSTED path
shall be ACTIVATED exclusively by a user OR THE TCB AND
SHALL BE LOGICALLY ISOLATED AND UNMISTAKABLY
DISTINGUISHABLE FROM OTHER PATHS.

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object and the object’s security level.
The ADP system administrator shall be able to selectively audit
the actions of any one or more users based on individual
identity and/or object security level. The TCB shall be able to
audit the identified events that may be used in the exploitation
of covert storage channels. THE TCB SHALL CONTAIN A MECHANISM
THAT IS ABLE TO MONITOR THE OCCURRENCE OR ACCUMULATION OF
SECURITY AUDITABLE EVENTS THAT MAY INDICATE AN IMMINENT
VIOLATION OF SECURITY POLICY. THIS MECHANISM SHALL BE ABLE TO
IMMEDIATELY NOTIFY THE SECURITY ADMINISTRATOR WHEN THRESHOLDS
ARE EXCEEDED.
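
One possible shape for the audit records and threshold mechanism described
above is sketched below. The record fields follow the list in this section;
the event names, threshold values, and storage routine are assumptions of the
example rather than anything specified by the criteria.

    from collections import defaultdict
    from datetime import datetime

    THRESHOLDS = {"authentication_failure": 5}   # hypothetical thresholds
    _event_counts = defaultdict(int)

    def audit(event_type, user, success, origin=None, obj=None,
              object_level=None, notify_administrator=print):
        record = {"time": datetime.now().isoformat(), "user": user,
                  "event": event_type, "success": success,
                  "origin": origin, "object": obj,
                  "object_level": object_level}
        append_to_protected_trail(record)
        if not success:
            _event_counts[event_type] += 1
            if _event_counts[event_type] >= THRESHOLDS.get(event_type,
                                                           float("inf")):
                notify_administrator("threshold exceeded: " + event_type)

    def append_to_protected_trail(record):
        # placeholder: a real TCB must protect the trail from modification
        # and limit read access to personnel authorized for audit data
        pass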

3.3.3 ASSURANCE

3.3.3.1 Operational Assurance

3.3.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB shall
be completely defined and all elements of the TCB
identified. THE TCB SHALL BE DESIGNED AND STRUCTURED TO
USE A COMPLETE, CONCEPTUALLY SIMPLE PROTECTION MECHANISM
WITH PRECISELY DEFINED SEMANTICS. THIS MECHANISM SHALL
PLAY A CENTRAL ROLE IN ENFORCING THE INTERNAL STRUCTURING
OF THE TCB AND THE SYSTEM. THE TCB SHALL INCORPORATE
SIGNIFICANT USE OF LAYERING, ABSTRACTION AND DATA HIDING.
SIGNIFICANT SYSTEM ENGINEERING SHALL BE DIRECTED TOWARD
MINIMIZING THE COMPLEXITY OF THE TCB AND EXCLUDING FROM
THE TCB MODULES THAT ARE NOT PROTECTION-CRITICAL.

3.3.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

3.3.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.)
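
One common engineering estimate for the bandwidth of an identified channel
(an illustration only, not a method prescribed here) is the information
conveyed per use divided by the minimum time per use:

    import math

    def estimated_bandwidth(distinguishable_states, min_seconds_per_use):
        # about log2(n) bits are conveyed each time the channel is used
        return math.log2(distinguishable_states) / min_seconds_per_use

    # e.g. a two-state storage channel exercisable ten times per second:
    # estimated_bandwidth(2, 0.1) == 10.0 bits per second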

3.3.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. THE FUNCTIONS PERFORMED IN THE ROLE OF A
SECURITY ADMINISTRATOR SHALL BE IDENTIFIED. THE ADP
SYSTEM ADMINISTRATIVE PERSONNEL SHALL ONLY BE ABLE TO
PERFORM SECURITY ADMINISTRATOR FUNCTIONS AFTER TAKING A
DISTINCT AUDITABLE ACTION TO ASSUME THE SECURITY
ADMINISTRATOR ROLE ON THE ADP SYSTEM. NON-SECURITY
FUNCTIONS THAT CAN BE PERFORMED IN THE SECURITY
ADMINISTRATION ROLE SHALL BE LIMITED STRICTLY TO THOSE
ESSENTIAL TO PERFORMING THE SECURITY ROLE EFFECTIVELY.

3.3.3.1.5 Trusted Recovery

PROCEDURES AND/OR MECHANISMS SHALL BE PROVIDED TO ASSURE
THAT, AFTER AN ADP SYSTEM FAILURE OR OTHER DISCONTINUITY,
RECOVERY WITHOUT A PROTECTION COMPROMISE IS OBTAINED.

3.3.3.2 Life-Cycle Assurance

3.3.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be FOUND RESISTANT TO penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the descriptive
top-level specification. (See the Security Testing
Guidelines.) NO DESIGN FLAWS AND NO MORE THAN A FEW
CORRECTABLE IMPLEMENTATION FLAWS MAY BE FOUND DURING
TESTING AND THERE SHALL BE REASONABLE CONFIDENCE THAT
FEW REMAIN.

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. It shall be shown to be an
accurate description of the TCB interface. A CONVINCING
ARGUMENT SHALL BE GIVEN THAT THE DTLS IS CONSISTENT WITH
THE MODEL.

3.3.3.2.3 Configuration Management

During development and maintenance of the TCB, a
configuration management system shall be in place that
maintains control of changes to the descriptive top-level
specification, other design data, implementation
documentation, source code, the running version of the
object code, and test fixtures and documentation. The
configuration management system shall assure a consistent
mapping among all documentation and code associated with
the current version of the TCB. Tools shall be provided
for generation of a new version of the TCB from source
code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in
order to ascertain that only the intended changes have
been made in the code that will actually be used as the
new version of the TCB.
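
A minimal sketch of such a comparison tool, assuming the previous and newly
generated TCB exist as two directory trees and using SHA-256 file digests
(both are assumptions of the example, not requirements of this class):

    import hashlib
    from pathlib import Path

    def tree_digests(root):
        digests = {}
        for path in sorted(Path(root).rglob("*")):
            if path.is_file():
                digests[str(path.relative_to(root))] = hashlib.sha256(
                    path.read_bytes()).hexdigest()
        return digests

    def changed_files(previous_root, new_root):
        old, new = tree_digests(previous_root), tree_digests(new_root)
        return sorted(name for name in set(old) | set(new)
                      if old.get(name) != new.get(name))

    # changed_files("tcb-v1", "tcb-v2") lists exactly the files that differ;
    # anything unexpected in the list indicates an unintended change.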

3.3.4 DOCUMENTATION

3.3.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

3.3.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. IT
SHALL INCLUDE THE PROCEDURES TO ENSURE THAT THE SYSTEM IS
INITIALLY STARTED IN A SECURE MANNER. PROCEDURES SHALL ALSO BE
INCLUDED TO RESUME SECURE SYSTEM OPERATION AFTER ANY LAPSE IN
SYSTEM OPERATION.

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths.

3.3.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and proven that it is sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. THE TCB IMPLEMENTATION (I.E., IN
HARDWARE, FIRMWARE, AND SOFTWARE) SHALL BE INFORMALLY SHOWN TO
BE CONSISTENT WITH THE DTLS. THE ELEMENTS OF THE DTLS SHALL BE
SHOWN, USING INFORMAL TECHNIQUES, TO CORRESPOND TO THE ELEMENTS
OF THE TCB. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)

4.0 DIVISION A: VERIFIED PROTECTION

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

4.1 CLASS (A1): VERIFIED DESIGN

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design.
Independent of the particular specification language or verification system
used, there are five important criteria for class (A1) design verification:

* A formal model of the security policy must be clearly
identified and documented, including a mathematical proof
that the model is consistent with its axioms and is
sufficient to support the security policy.

* An FTLS must be produced that includes abstract definitions
of the functions the TCB performs and of the hardware and/or
firmware mechanisms that are used to support separate
execution domains.

* The FTLS of the TCB must be shown to be consistent with the
model by formal techniques where possible (i.e., where
verification tools exist) and informal ones otherwise.

* The TCB implementation (i.e., in hardware, firmware, and
software) must be informally shown to be consistent with the
FTLS. The elements of the FTLS must be shown, using
informal techniques, to correspond to the elements of the
TCB. The FTLS must express the unified protection mechanism
required to satisfy the security policy, and it is the
elements of this protection mechanism that are mapped to the
elements of the TCB.

* Formal analysis techniques must be used to identify and
analyze covert channels. Informal techniques may be used to
identify covert timing channels. The continued existence of
identified covert channels in the system must be justified.

In keeping with the extensive design and development analysis of the TCB
required of systems in class (A1), more stringent configuration management is
required and procedures are established for securely distributing the system
to sites. A system security administrator is supported.

The following are minimal requirements for systems assigned a class (A1)
rating:

4.1.1 SECURITY POLICY

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system.
The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects.
The discretionary access control mechanism shall, either by
explicit user action or by default, provide that objects are
protected from unauthorized access. These access controls
shall be capable of specifying, for each named object, a list
of named individuals and a list of groups of named individuals
with their respective modes of access to that object.
Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of
named individuals for which no access to the object is to be
given. Access permission to an object by users not already
possessing access permission shall only be assigned by
authorized users.

4.1.1.2 Object Reuse

When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data
for which the subject is not authorized.

4.1.1.3 Labels

Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained
by the TCB. These labels shall be used as the basis for
mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an
authorized user the security level of the data, and all such
actions shall be auditable by the TCB.

4.1.1.3.1 Label Integrity

Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which
they are associated. When exported by the TCB,
sensitivity labels shall accurately and unambiguously
represent the internal labels and shall be associated
with the information being exported.

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any
change in this designation shall be done manually and
shall be auditable by the TCB. The TCB shall maintain
and be able to audit any change in the current security
level associated with a single-level communication
channel or I/O device.

4.1.1.3.2.1 Exportation to Multilevel Devices

When the TCB exports an object to a multilevel I/O
device, the sensitivity label associated with that
object shall also be exported and shall reside on
the same physical medium as the exported
information and shall be in the same form (i.e.,
machine-readable or human-readable form). When
the TCB exports or imports an object over a
multilevel communication channel, the protocol
used on that channel shall provide for the
unambiguous pairing between the sensitivity labels
and the associated information that is sent or
received.

4.1.1.3.2.2 Exportation to Single-Level Devices

Single-level I/O devices and single-level
communication channels are not required to
maintain the sensitivity labels of the information
they process. However, the TCB shall include a
mechanism by which the TCB and an authorized user
reliably communicate to designate the single
security level of information imported or exported
via single-level communication channels or I/O
devices.

4.1.1.3.2.3 Labeling Human-Readable Output

The ADP system administrator shall be able to
specify the printable label names associated with
exported sensitivity labels. The TCB shall mark
the beginning and end of all human-readable, paged,
hardcopy output (e.g., line printer output) with
human-readable sensitivity labels that properly*
represent the sensitivity of the output. The TCB
shall, by default, mark the top and bottom of each
page of human-readable, paged, hardcopy output
(e.g., line printer output) with human-readable
sensitivity labels that properly* represent the
overall sensitivity of the output or that
properly* represent the sensitivity of the
information on the page. The TCB shall, by
default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps,
graphics) with human-readable sensitivity labels
that properly* represent the sensitivity of the
output. Any override of these marking defaults
shall be auditable by the TCB.

____________________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________________

4.1.1.3.3 Subject Sensitivity Labels

The TCB shall immediately notify a terminal user of each
change in the security level associated with that user
during an interactive session. A terminal user shall be
able to query the TCB as desired for a display of the
subject’s complete sensitivity label.

4.1.1.3.4 Device Labels

The TCB shall support the assignment of minimum and
maximum security levels to all attached physical devices.
These security levels shall be used by the TCB to enforce
constraints imposed by the physical environments in which
the devices are located.

4.1.1.4 Mandatory Access Control

The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O
devices) that are directly or indirectly accessible by subjects
external to the TCB. These subjects and objects shall be
assigned sensitivity labels that are a combination of
hierarchical classification levels and non-hierarchical
categories, and the labels shall be used as the basis for
mandatory access control decisions. The TCB shall be able to
support two or more such security levels. (See the Mandatory
Access Control guidelines.) The following requirements shall
hold for all accesses between all subjects external to the TCB
and all objects directly or indirectly accessible by these
subjects: A subject can read an object only if the hierarchical
classification in the subject’s security level is greater than
or equal to the hierarchical classification in the object’s
security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical
categories in the object’s security level. A subject can write
an object only if the hierarchical classification in the
subject’s security level is less than or equal to the
hierarchical classification in the object’s security level and
all the non-hierarchical categories in the subject’s security
level are included in the non-hierarchical categories in the
object’s security level.

4.1.2 ACCOUNTABILITY

4.1.2.1 Identification and Authentication

The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected
to mediate. Furthermore, the TCB shall maintain authentication
data that includes information for verifying the identity of
individual users (e.g., passwords) as well as information for
determining the clearance and authorizations of individual
users. This data shall be used by the TCB to authenticate the
user’s identity and to determine the security level and
authorizations of subjects that may be created to act on behalf
of the individual user. The TCB shall protect authentication
data so that it cannot be accessed by any unauthorized user.
The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual
ADP system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by
that individual.

4.1.2.1.1 Trusted Path

The TCB shall support a trusted communication path
between itself and users for use when a positive TCB-to-
user connection is required (e.g., login, change subject
security level). Communications via this trusted path
shall be activated exclusively by a user or the TCB and
shall be logically isolated and unmistakably
distinguishable from other paths.

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit
trail of accesses to the objects it protects. The audit data
shall be protected by the TCB so that read access to it is
limited to those who are authorized for audit data. The TCB
shall be able to record the following types of events: use of
identification and authentication mechanisms, introduction of
objects into a user’s address space (e.g., file open, program
initiation), deletion of objects, and actions taken by computer
operators and system administrators and/or system security
officers. The TCB shall also be able to audit any override of
human-readable output markings. For each recorded event, the
audit record shall identify: date and time of the event, user,
type of event, and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record. For
events that introduce an object into a user’s address space and
for object deletion events the audit record shall include the
name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity
and/or object security level. The TCB shall be able to audit
the identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able to
immediately notify the security administrator when thresholds
are exceeded.

4.1.3 ASSURANCE

4.1.3.1 Operational Assurance

4.1.3.1.1 System Architecture

The TCB shall maintain a domain for its own execution
that protects it from external interference or tampering
(e.g., by modification of its code or data structures).
The TCB shall maintain process isolation through the
provision of distinct address spaces under its control.
The TCB shall be internally structured into well-defined
largely independent modules. It shall make effective use
of available hardware to separate those elements that are
protection-critical from those that are not. The TCB
modules shall be designed such that the principle of
least privilege is enforced. Features in hardware, such
as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely:
readable, writeable). The user interface to the TCB
shall be completely defined and all elements of the TCB
identified. The TCB shall be designed and structured to
use a complete, conceptually simple protection mechanism
with precisely defined semantics. This mechanism shall
play a central role in enforcing the internal structuring
of the TCB and the system. The TCB shall incorporate
significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward
minimizing the complexity of the TCB and excluding from
the TCB modules that are not protection-critical.

4.1.3.1.2 System Integrity

Hardware and/or software features shall be provided that
can be used to periodically validate the correct
operation of the on-site hardware and firmware elements
of the TCB.

4.1.3.1.3 Covert Channel Analysis

The system developer shall conduct a thorough search for
COVERT CHANNELS and make a determination (either by
actual measurement or by engineering estimation) of the
maximum bandwidth of each identified channel. (See the
Covert Channels Guideline section.) FORMAL METHODS SHALL
BE USED IN THE ANALYSIS.

4.1.3.1.4 Trusted Facility Management

The TCB shall support separate operator and administrator
functions. The functions performed in the role of a
security administrator shall be identified. The ADP
system administrative personnel shall only be able to
perform security administrator functions after taking a
distinct auditable action to assume the security
administrator role on the ADP system. Non-security
functions that can be performed in the security
administration role shall be limited strictly to those
essential to performing the security role effectively.

4.1.3.1.5 Trusted Recovery

Procedures and/or mechanisms shall be provided to assure
that, after an ADP system failure or other discontinuity,
recovery without a protection compromise is obtained.

4.1.3.2 Life-Cycle Assurance

4.1.3.2.1 Security Testing

The security mechanisms of the ADP system shall be tested
and found to work as claimed in the system documentation.
A team of individuals who thoroughly understand the
specific implementation of the TCB shall subject its
design documentation, source code, and object code to
thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that
would permit a subject external to the TCB to read,
change, or delete data normally denied under the
mandatory or discretionary security policy enforced by
the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter
a state such that it is unable to respond to
communications initiated by other users. The TCB shall
be found resistant to penetration. All discovered flaws
shall be corrected and the TCB retested to demonstrate
that they have been eliminated and that new flaws have
not been introduced. Testing shall demonstrate that the
TCB implementation is consistent with the FORMAL top-
level specification. (See the Security Testing
Guidelines.) No design flaws and no more than a few
correctable implementation flaws may be found during
testing and there shall be reasonable confidence that few
remain. MANUAL OR OTHER MAPPING OF THE FTLS TO THE
SOURCE CODE MAY FORM A BASIS FOR PENETRATION TESTING.

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by the
TCB shall be maintained that is proven consistent with
its axioms. A descriptive top-level specification (DTLS)
of the TCB shall be maintained that completely and
accurately describes the TCB in terms of exceptions, error
messages, and effects. A FORMAL TOP-LEVEL SPECIFICATION
(FTLS) OF THE TCB SHALL BE MAINTAINED THAT ACCURATELY
DESCRIBES THE TCB IN TERMS OF EXCEPTIONS, ERROR MESSAGES,
AND EFFECTS. THE DTLS AND FTLS SHALL INCLUDE THOSE
COMPONENTS OF THE TCB THAT ARE IMPLEMENTED AS HARDWARE
AND/OR FIRMWARE IF THEIR PROPERTIES ARE VISIBLE AT THE
TCB INTERFACE. THE FTLS shall be shown to be an accurate
description of the TCB interface. A convincing argument
shall be given that the DTLS is consistent with the model
AND A COMBINATION OF FORMAL AND INFORMAL TECHNIQUES SHALL
BE USED TO SHOW THAT THE FTLS IS CONSISTENT WITH THE
MODEL. THIS VERIFICATION EVIDENCE SHALL BE CONSISTENT
WITH THAT PROVIDED WITHIN THE STATE-OF-THE-ART OF THE
PARTICULAR COMPUTER SECURITY CENTER-ENDORSED FORMAL
SPECIFICATION AND VERIFICATION SYSTEM USED. MANUAL OR
OTHER MAPPING OF THE FTLS TO THE TCB SOURCE CODE SHALL BE
PERFORMED TO PROVIDE EVIDENCE OF CORRECT IMPLEMENTATION.
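
One simple form such a mapping might take (purely illustrative; the FTLS
operation names and source files below are hypothetical) is a recorded
correspondence that can itself be reviewed and kept under configuration
control:

    # Hypothetical correspondence from FTLS operations to TCB source files.
    FTLS_TO_SOURCE = {
        "get_access":       ["kernel/mac.c", "kernel/dac.c"],
        "release_access":   ["kernel/mac.c"],
        "create_object":    ["kernel/object.c"],
        "set_device_level": ["kernel/label.c"],
    }

    def unmapped_operations(ftls_operations):
        # FTLS elements with no identified implementation; a complete
        # mapping should leave this list empty
        return [op for op in ftls_operations if op not in FTLS_TO_SOURCE]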

4.1.3.2.3 Configuration Management

During THE ENTIRE LIFE-CYCLE, I.E., DURING THE DESIGN,
DEVELOPMENT, and maintenance of the TCB, a configuration
management system shall be in place FOR ALL SECURITY-
RELEVANT HARDWARE, FIRMWARE, AND SOFTWARE that maintains
control of changes to THE FORMAL MODEL, the descriptive
AND FORMAL top-level SPECIFICATIONS, other design data,
implementation documentation, source code, the running
version of the object code, and test fixtures and
documentation. The configuration management system shall
assure a consistent mapping among all documentation and
code associated with the current version of the TCB.
Tools shall be provided for generation of a new version
of the TCB from source code. Also available shall be
tools, MAINTAINED UNDER STRICT CONFIGURATION CONTROL, for
comparing a newly generated version with the previous TCB
version in order to ascertain that only the intended
changes have been made in the code that will actually be
used as the new version of the TCB. A COMBINATION OF
TECHNICAL, PHYSICAL, AND PROCEDURAL SAFEGUARDS SHALL BE
USED TO PROTECT FROM UNAUTHORIZED MODIFICATION OR
DESTRUCTION THE MASTER COPY OR COPIES OF ALL MATERIAL
USED TO GENERATE THE TCB.

4.1.3.2.4 Trusted Distribution

A TRUSTED ADP SYSTEM CONTROL AND DISTRIBUTION FACILITY
SHALL BE PROVIDED FOR MAINTAINING THE INTEGRITY OF THE
MAPPING BETWEEN THE MASTER DATA DESCRIBING THE CURRENT
VERSION OF THE TCB AND THE ON-SITE MASTER COPY OF THE
CODE FOR THE CURRENT VERSION. PROCEDURES (E.G., SITE
SECURITY ACCEPTANCE TESTING) SHALL EXIST FOR ASSURING
THAT THE TCB SOFTWARE, FIRMWARE, AND HARDWARE UPDATES
DISTRIBUTED TO A CUSTOMER ARE EXACTLY AS SPECIFIED BY
THE MASTER COPIES.
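
As an illustration of a site acceptance check (one possible procedure, not
mandated wording), each delivered TCB component can be compared against a
manifest of digests prepared from the master copies; the manifest format and
the use of SHA-256 are assumptions of the example:

    import hashlib

    def verify_delivery(manifest_lines, read_delivered_file):
        """manifest_lines: '<hex digest>  <relative path>' entries prepared
        from the master copies; read_delivered_file: returns the bytes of
        the delivered file with that path."""
        mismatches = []
        for line in manifest_lines:
            expected, name = line.split(maxsplit=1)
            name = name.strip()
            actual = hashlib.sha256(read_delivered_file(name)).hexdigest()
            if actual != expected:
                mismatches.append(name)
        return mismatches    # an empty list means the delivery matches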

4.1.4 DOCUMENTATION

4.1.4.1 Security Features User’s Guide

A single summary, chapter, or manual in user documentation
shall describe the protection mechanisms provided by the TCB,
guidelines on their use, and how they interact with one another.

4.1.4.2 Trusted Facility Manual

A manual addressed to the ADP system administrator shall
present cautions about functions and privileges that should be
controlled when running a secure facility. The procedures for
examining and maintaining the audit files as well as the
detailed audit record structure for each type of audit event
shall be given. The manual shall describe the operator and
administrator functions related to security, to include
changing the security characteristics of a user. It shall
provide guidelines on the consistent and effective use of the
protection features of the system, how they interact, how to
securely generate a new TCB, and facility procedures, warnings,
and privileges that need to be controlled in order to operate
the facility in a secure manner. The TCB modules that contain
the reference validation mechanism shall be identified. The
procedures for secure generation of a new TCB from source after
modification of any modules in the TCB shall be described. It
shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in
system operation.

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a document
that describes the test plan and results of the security
mechanisms’ functional testing. It shall include results of
testing the effectiveness of the methods used to reduce covert
channel bandwidths. THE RESULTS OF THE MAPPING BETWEEN THE
FORMAL TOP-LEVEL SPECIFICATION AND THE TCB SOURCE CODE SHALL BE
GIVEN.

4.1.4.4 Design Documentation

Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation
of how this philosophy is translated into the TCB. The
interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the
TCB shall be available and proven that it is sufficient to
enforce the security policy. The specific TCB protection
mechanisms shall be identified and an explanation given to show
that they satisfy the model. The descriptive top-level
specification (DTLS) shall be shown to be an accurate
description of the TCB interface. Documentation shall describe
how the TCB implements the reference monitor concept and give
an explanation why it is tamperproof, cannot be bypassed, and
is correctly implemented. The TCB implementation (i.e., in
hardware, firmware, and software) shall be informally shown to
be consistent with the FORMAL TOP-LEVEL SPECIFICATION (FTLS).
The elements of the FTLS shall be shown, using informal
techniques, to correspond to the elements of the TCB.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage channels,
the use of which is not detectable by the auditing mechanisms,
shall be provided. (See the Covert Channel Guideline section.)
HARDWARE, FIRMWARE, AND SOFTWARE MECHANISMS NOT DEALT WITH IN
THE FTLS BUT STRICTLY INTERNAL TO THE TCB (E.G., MAPPING
REGISTERS, DIRECT MEMORY ACCESS I/O) SHALL BE CLEARLY DESCRIBED.

4.2 BEYOND CLASS (A1)

Most of the security enhancements envisioned for systems that will provide
features and assurance in addition to that already provided by class (A1)
systems are beyond current technology. The discussion below is intended to
guide future work and is derived from research and development activities
already underway in both the public and private sectors. As more and better
analysis techniques are developed, the requirements for these systems will
become more explicit. In the future, use of formal verification will be
extended to the source level and covert timing channels will be more fully
addressed. At this level the design environment will become important and
testing will be aided by analysis of the formal top-level specification.
Consideration will be given to the correctness of the tools used in TCB
development (e.g., compilers, assemblers, loaders) and to the correct
functioning of the hardware/firmware on which the TCB will run. Areas to be
addressed by systems beyond class (A1) include:

* System Architecture

A demonstration (formal or otherwise) must be given showing
that requirements of self-protection and completeness for
reference monitors have been implemented in the TCB.

* Security Testing

Although beyond the current state-of-the-art, it is
envisioned that some test-case generation will be done
automatically from the formal top-level specification or
formal lower-level specifications.

* Formal Specification and Verification

The TCB must be verified down to the source code level,
using formal verification methods where feasible. Formal
verification of the source code of the security-relevant
portions of an operating system has proven to be a difficult
task. Two important considerations are the choice of a
high-level language whose semantics can be fully and
formally expressed, and a careful mapping, through
successive stages, of the abstract formal design to a
formalization of the implementation in low-level
specifications. Experience has shown that only when the
lowest level specifications closely correspond to the actual
code can code proofs be successfully accomplished.

* Trusted Design Environment

The TCB must be designed in a trusted facility with only
trusted (cleared) personnel.

PART II: RATIONALE AND GUIDELINES

5.0 CONTROL OBJECTIVES FOR TRUSTED COMPUTER SYSTEMS

The criteria are divided within each class into groups of requirements. These
groupings were developed to assure that three basic control objectives for
computer security are satisfied and not overlooked. These control objectives
deal with:

* Security Policy
* Accountability
* Assurance

This section provides a discussion of these general control objectives and
their implication in terms of designing trusted systems.

5.1 A Need for Consensus

A major goal of the DoD Computer Security Center is to encourage the Computer
Industry to develop trusted computer systems and products, making them widely
available in the commercial market place. Achievement of this goal will
require recognition and articulation by both the public and private sectors of
a need and demand for such products.

As described in the introduction to this document, efforts to define the
problems and develop solutions associated with processing nationally sensitive
information, as well as other sensitive data such as financial, medical, and
personnel information used by the National Security Establishment, have been
underway for a number of years. The criteria, as described in Part I,
represent the culmination of these efforts and describe basic requirements for
building trusted computer systems. To date, however, these systems have been
viewed by many as only satisfying National Security needs. As long as this
perception continues the consensus needed to motivate manufacture of trusted
systems will be lacking.

The purpose of this section is to describe, in some detail, the fundamental
control objectives that lay the foundations for requirements delineated in the
criteria. The goal is to explain the foundations so that those outside the
National Security Establishment can assess their universality and, by
extension, the universal applicability of the criteria requirements to
processing all types of sensitive applications whether they be for National
Security or the private sector.

5.2 Definition and Usefulness

The term “control objective” refers to a statement of intent with respect to
control over some aspect of an organization’s resources, or processes, or
both. In terms of a computer system, control objectives provide a framework
for developing a strategy for fulfilling a set of security requirements for
any given system. Developed in response to generic vulnerabilities, such as
the need to manage and handle sensitive data in order to prevent compromise,
or the need to provide accountability in order to detect fraud, control
objectives have been identified as a useful method of expressing security
goals.[3]

Examples of control objectives include the three basic design requirements for
implementing the reference monitor concept discussed in Section 6. They are:

* The reference validation mechanism must be tamperproof.

* The reference validation mechanism must always be invoked.

* The reference validation mechanism must be small enough to be
subjected to analysis and tests, the completeness of which can
be assured.[1]

5.3 Criteria Control Objectives

The three basic control objectives of the criteria are concerned with security
policy, accountability, and assurance. The remainder of this section provides
a discussion of these basic requirements.

5.3.1 Security Policy

In the most general sense, computer security is concerned with
controlling the way in which a computer can be used, i.e.,
controlling how information processed by it can be accessed and
manipulated. However, at closer examination, computer security
can refer to a number of areas. Symptomatic of this, FIPS PUB 39,
Glossary For Computer Systems Security, does not have a unique
definition for computer security.[16] Instead there are eleven
separate definitions for security which include: ADP systems
security, administrative security, data security, etc. A common
thread running through these definitions is the word “protection.”
Further declarations of protection requirements can be found in
DoD Directive 5200.28 which describes an acceptable level of
protection for classified data to be one that will “assure that
systems which process, store, or use classified data and produce
classified information will, with reasonable dependability,
prevent: a. Deliberate or inadvertent access to classified
material by unauthorized persons, and b. Unauthorized
manipulation of the computer and its associated peripheral
devices.”[8]

In summary, protection requirements must be defined in terms of
the perceived threats, risks, and goals of an organization. This
is often stated in terms of a security policy. It has been
pointed out in the literature that it is external laws, rules,
regulations, etc. that establish what access to information is to
be permitted, independent of the use of a computer. In particular,
a given system can only be said to be secure with respect to its
enforcement of some specific policy.[30] Thus, the control
objective for security policy is:

SECURITY POLICY CONTROL OBJECTIVE

A STATEMENT OF INTENT WITH REGARD TO CONTROL OVER ACCESS TO AND
DISSEMINATION OF INFORMATION, TO BE KNOWN AS THE SECURITY POLICY,
MUST BE PRECISELY DEFINED AND IMPLEMENTED FOR EACH SYSTEM THAT IS
USED TO PROCESS SENSITIVE INFORMATION. THE SECURITY POLICY MUST
ACCURATELY REFLECT THE LAWS, REGULATIONS, AND GENERAL POLICIES
FROM WHICH IT IS DERIVED.

5.3.1.1 Mandatory Security Policy

Where a security policy is developed that is to be applied
to control of classified or other specifically designated
sensitive information, the policy must include detailed
rules on how to handle that information throughout its
life-cycle. These rules are a function of the various
sensitivity designations that the information can assume
and the various forms of access supported by the system.
Mandatory security refers to the enforcement of a set of
access control rules that constrains a subject’s access to
information on the basis of a comparison of that
individual’s clearance/authorization to the information,
the classification/sensitivity designation of the
information, and the form of access being mediated.
Mandatory policies either require or can be satisfied by
systems that can enforce a partial ordering of
designations, namely, the designations must form what is
mathematically known as a “lattice.”[5]

A clear implication of the above is that the system must
assure that the designations associated with sensitive data
cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate authorization to
access sensitive information. Also implied is the
requirement that the system control the flow of information
so that data cannot be stored with lower sensitivity
designations unless its “downgrading” has been authorized.
The control objective is:

MANDATORY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SPECIFICALLY CATEGORIZED
SENSITIVE INFORMATION MUST INCLUDE PROVISIONS FOR THE
ENFORCEMENT OF MANDATORY ACCESS CONTROL RULES. THAT IS,
THEY MUST INCLUDE A SET OF RULES FOR CONTROLLING ACCESS
BASED DIRECTLY ON A COMPARISON OF THE INDIVIDUAL’S
CLEARANCE OR AUTHORIZATION FOR THE INFORMATION AND THE
CLASSIFICATION OR SENSITIVITY DESIGNATION OF THE
INFORMATION BEING SOUGHT, AND INDIRECTLY ON CONSIDERATIONS
OF PHYSICAL AND OTHER ENVIRONMENTAL FACTORS OF CONTROL.
THE MANDATORY ACCESS CONTROL RULES MUST ACCURATELY REFLECT
THE LAWS, REGULATIONS, AND GENERAL POLICIES FROM WHICH
THEY ARE DERIVED.

5.3.1.2 Discretionary Security Policy

Discretionary security is the principal type of access
control available in computer systems today. The basis of
this kind of security is that an individual user, or
program operating on his behalf, is allowed to specify
explicitly the types of access other users may have to
information under his control. Discretionary security
differs from mandatory security in that it implements an
access control policy on the basis of an individual’s
need-to-know as opposed to mandatory controls which are
driven by the classification or sensitivity designation of
the information.

Discretionary controls are not a replacement for mandatory
controls. In an environment in which information is
classified (as in the DoD) discretionary security provides
for a finer granularity of control within the overall
constraints of the mandatory policy. Access to classified
information requires effective implementation of both types
of controls as precondition to granting that access. In
general, no person may have access to classified
information unless: (a) that person has been determined to
be trustworthy, i.e., granted a personnel security
clearance — MANDATORY, and (b) access is necessary for the
performance of official duties, i.e., determined to have a
need-to-know — DISCRETIONARY. In other words,
discretionary controls give individuals discretion to
decide on which of the permissible accesses will actually
be allowed to which users, consistent with overriding
mandatory policy restrictions. The control objective is:

DISCRETIONARY SECURITY CONTROL OBJECTIVE

SECURITY POLICIES DEFINED FOR SYSTEMS THAT ARE USED TO
PROCESS CLASSIFIED OR OTHER SENSITIVE INFORMATION MUST
INCLUDE PROVISIONS FOR THE ENFORCEMENT OF DISCRETIONARY
ACCESS CONTROL RULES. THAT IS, THEY MUST INCLUDE A
CONSISTENT SET OF RULES FOR CONTROLLING AND LIMITING ACCESS
BASED ON IDENTIFIED INDIVIDUALS WHO HAVE BEEN DETERMINED TO
HAVE A NEED-TO-KNOW FOR THE INFORMATION.

5.3.1.3 Marking

To implement a set of mechanisms that will put into effect
a mandatory security policy, it is necessary that the
system mark information with appropriate classification or
sensitivity labels and maintain these markings as the
information moves through the system. Once information is
unalterably and accurately marked, comparisons required by
the mandatory access control rules can be accurately and
consistently made. An additional benefit of having the
system maintain the classification or sensitivity label
internally is the ability to automatically generate
properly “labeled” output. The labels, if accurately and
integrally maintained by the system, remain accurate when
output from the system. The control objective is:

MARKING CONTROL OBJECTIVE

SYSTEMS THAT ARE DESIGNED TO ENFORCE A MANDATORY SECURITY
POLICY MUST STORE AND PRESERVE THE INTEGRITY OF
CLASSIFICATION OR OTHER SENSITIVITY LABELS FOR ALL
INFORMATION. LABELS EXPORTED FROM THE SYSTEM MUST BE
ACCURATE REPRESENTATIONS OF THE CORRESPONDING INTERNAL
SENSITIVITY LABELS BEING EXPORTED.

5.3.2 Accountability

The second basic control objective addresses one of the
fundamental principles of security, i.e., individual
accountability. Individual accountability is the key to securing
and controlling any system that processes information on behalf
of individuals or groups of individuals. A number of requirements
must be met in order to satisfy this objective.

The first requirement is for individual user identification.
Second, there is a need for authentication of the identification.
Identification is functionally dependent on authentication.
Without authentication, user identification has no credibility.
Without a credible identity, neither mandatory nor discretionary
security policies can be properly invoked because there is no
assurance that proper authorizations can be made.

The third requirement is for dependable audit capabilities. That
is, a trusted computer system must provide authorized personnel
with the ability to audit any action that can potentially cause
access to, generation of, or effect the release of classified or
sensitive information. The audit data will be selectively
acquired based on the auditing needs of a particular installation
and/or application. However, there must be sufficient granularity
in the audit data to support tracing the auditable events to a
specific individual who has taken the actions or on whose behalf
the actions were taken. The control objective is:

ACCOUNTABILITY CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST ASSURE INDIVIDUAL ACCOUNTABILITY
WHENEVER EITHER A MANDATORY OR DISCRETIONARY SECURITY POLICY IS
INVOKED. FURTHERMORE, TO ASSURE ACCOUNTABILITY THE CAPABILITY
MUST EXIST FOR AN AUTHORIZED AND COMPETENT AGENT TO ACCESS AND
EVALUATE ACCOUNTABILITY INFORMATION BY A SECURE MEANS, WITHIN A
REASONABLE AMOUNT OF TIME, AND WITHOUT UNDUE DIFFICULTY.

5.3.3 Assurance

The third basic control objective is concerned with guaranteeing
or providing confidence that the security policy has been
implemented correctly and that the protection-relevant elements of
the system do, indeed, accurately mediate and enforce the intent
of that policy. By extension, assurance must include a guarantee
that the trusted portion of the system works only as intended. To
accomplish these objectives, two types of assurance are needed.
They are life-cycle assurance and operational assurance.

Life-cycle assurance refers to steps taken by an organization to
ensure that the system is designed, developed, and maintained
using formalized and rigorous controls and standards.[17]
Computer systems that process and store sensitive or classified
information depend on the hardware and software to protect that
information. It follows that the hardware and software themselves
must be protected against unauthorized changes that could cause
protection mechanisms to malfunction or be bypassed completely.
For this reason trusted computer systems must be carefully
evaluated and tested during the design and development phases and
reevaluated whenever changes are made that could affect the
integrity of the protection mechanisms. Only in this way can
confidence be provided that the hardware and software
interpretation of the security policy is maintained accurately
and without distortion.

While life-cycle assurance is concerned with procedures for
managing system design, development, and maintenance, operational
assurance focuses on features and system architecture used to
ensure that the security policy is uncircumventably enforced
during system operation. That is, the security policy must be
integrated into the hardware and software protection features of
the system. Examples of steps taken to provide this kind of
confidence include: methods for testing the operational hardware
and software for correct operation, isolation of protection-
critical code, and the use of hardware and software to provide
distinct domains. The control objective is:

ASSURANCE CONTROL OBJECTIVE

SYSTEMS THAT ARE USED TO PROCESS OR HANDLE CLASSIFIED OR OTHER
SENSITIVE INFORMATION MUST BE DESIGNED TO GUARANTEE CORRECT AND
ACCURATE INTERPRETATION OF THE SECURITY POLICY AND MUST NOT
DISTORT THE INTENT OF THAT POLICY. ASSURANCE MUST BE PROVIDED
THAT CORRECT IMPLEMENTATION AND OPERATION OF THE POLICY EXISTS
THROUGHOUT THE SYSTEM’S LIFE-CYCLE.

6.0 RATIONALE BEHIND THE EVALUATION CLASSES

6.1 The Reference Monitor Concept

In October of 1972, the Computer Security Technology Planning Study, conducted
by James P. Anderson & Co., produced a report for the Electronic Systems
Division (ESD) of the United States Air Force.[1] In that report, the concept
of “a reference monitor which enforces the authorized access relationships
between subjects and objects of a system” was introduced. The reference
monitor concept was found to be an essential element of any system that would
provide multilevel secure computing facilities and controls.

The Anderson report went on to define the reference validation mechanism as
“an implementation of the reference monitor concept . . . that validates
each reference to data or programs by any user (program) against a list of
authorized types of reference for that user.” It then listed the three design
requirements that must be met by a reference validation mechanism:

a. The reference validation mechanism must be tamper proof.

b. The reference validation mechanism must always be invoked.

c. The reference validation mechanism must be small enough to be
subject to analysis and tests, the completeness of which can
be assured.”[1]

Extensive peer review and continuing research and development activities have
sustained the validity of the Anderson Committee’s findings. Early examples
of the reference validation mechanism were known as security kernels. The
Anderson Report described the security kernel as “that combination of hardware
and software which implements the reference monitor concept.”[1] In this vein,
it will be noted that the security kernel must support the three reference
monitor requirements listed above.

6.2 A Formal Security Policy Model

Following the publication of the Anderson report, considerable research was
initiated into formal models of security policy requirements and of the
mechanisms that would implement and enforce those policy models as a security
kernel. Prominent among these efforts was the ESD-sponsored development of
the Bell and LaPadula model, an abstract formal treatment of DoD security
policy.[2] Using mathematics and set theory, the model precisely defines the
notion of secure state, fundamental modes of access, and the rules for
granting subjects specific modes of access to objects. Finally, a theorem is
proven to demonstrate that the rules are security-preserving operations, so
that the application of any sequence of the rules to a system that is in a
secure state will result in the system entering a new state that is also
secure. This theorem is known as the Basic Security Theorem.

The Bell and LaPadula model defines a relationship between clearances of
subjects and classifications of system objects, now referenced as the
“dominance relation.” From this definition, accesses permitted between
subjects and objects are explicitly defined for the fundamental modes of
access, including read-only access, read/write access, and write-only access.
The model defines the Simple Security Condition to control granting a subject
read access to a specific object, and the *-Property (read “Star Property”) to
control granting a subject write access to a specific object. Both the Simple
Security Condition and the *-Property include mandatory security provisions
based on the dominance relation between the clearance of the subject and the
classification of the object. The Discretionary Security Property is also
defined, and requires that a specific subject be authorized for the particular
mode of access required for the state transition. In its treatment of
subjects (processes acting on behalf of a user), the model distinguishes
between trusted subjects (i.e., not constrained within the model by the
*-Property) and untrusted subjects (those that are constrained by the
*-Property).

From the Bell and LaPadula model there evolved a model of the method of proof
required to formally demonstrate that all arbitrary sequences of state
transitions are security-preserving. It was also shown that the *-Property
is sufficient to prevent the compromise of information by Trojan Horse
attacks.
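
As a rough, non-normative illustration of the mandatory portions of the
model, the Python sketch below encodes the dominance relation together with
the mandatory parts of the Simple Security Condition and the *-Property for
an untrusted subject. The level ordering and category sets are assumptions
made for the example and are not drawn from [2]; the Discretionary Security
Property is omitted.

    # Labels are (hierarchical level, set of non-hierarchical categories).
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        # a dominates b: level at least as high and categories a superset.
        (lvl_a, cats_a), (lvl_b, cats_b) = a, b
        return LEVELS[lvl_a] >= LEVELS[lvl_b] and cats_a >= cats_b

    def may_read(subject_clearance, object_class):
        # Simple Security Condition (mandatory part): no "read up".
        return dominates(subject_clearance, object_class)

    def may_write(subject_clearance, object_class):
        # *-Property for an untrusted subject: no "write down".
        return dominates(object_class, subject_clearance)

    secret_nato = ("SECRET", frozenset({"NATO"}))
    confidential = ("CONFIDENTIAL", frozenset())
    assert may_read(secret_nato, confidential)        # read down: permitted
    assert not may_write(secret_nato, confidential)   # write down: blocked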

6.3 The Trusted Computing Base

In order to encourage the widespread commercial availability of trusted
computer systems, these evaluation criteria have been designed to address
those systems in which a security kernel is specifically implemented as well
as those in which a security kernel has not been implemented. The latter case
includes those systems in which objective (c) is not fully supported because
of the size or complexity of the reference validation mechanism. For
convenience, these evaluation criteria use the term Trusted Computing Base to
refer to the reference validation mechanism, be it a security kernel,
front-end security filter, or the entire trusted computer system.

The heart of a trusted computer system is the Trusted Computing Base (TCB)
which contains all of the elements of the system responsible for supporting
the security policy and supporting the isolation of objects (code and data) on
which the protection is based. The bounds of the TCB equate to the “security
perimeter” referenced in some computer security literature. In the interest
of understandable and maintainable protection, a TCB should be as simple as
possible consistent with the functions it has to perform. Thus, the TCB
includes hardware, firmware, and software critical to protection and must be
designed and implemented such that system elements excluded from it need not
be trusted to maintain protection. Identification of the interface and
elements of the TCB along with their correct functionality therefore forms the
basis for evaluation.

For general-purpose systems, the TCB will include key elements of the
operating system and may include all of the operating system. For embedded
systems, the security policy may deal with objects in a way that is meaningful
at the application level rather than at the operating system level. Thus, the
protection policy may be enforced in the application software rather than in
the underlying operating system. The TCB will necessarily include all those
portions of the operating system and application software essential to the
support of the policy. Note that, as the amount of code in the TCB increases,
it becomes harder to be confident that the TCB enforces the reference monitor
requirements under all circumstances.

6.4 Assurance

The third reference monitor design objective is currently interpreted as
meaning that the TCB “must be of sufficiently simple organization and
complexity to be subjected to analysis and tests, the completeness of which
can be assured.”

Clearly, as the perceived degree of risk increases (e.g., the range of
sensitivity of the system’s protected data, along with the range of clearances
held by the system’s user population) for a particular system’s operational
application and environment, so also must the assurances be increased to
substantiate the degree of trust that will be placed in the system. The
hierarchy of requirements that are presented for the evaluation classes in the
trusted computer system evaluation criteria reflect the need for these
assurances.

As discussed in Section 5.3, the evaluation criteria uniformly require a
statement of the security policy that is enforced by each trusted computer
system. In addition, it is required that a convincing argument be presented
that explains why the TCB satisfies the first two design requirements for a
reference monitor. It is not expected that this argument will be entirely
formal. This argument is required for each candidate system in order to
satisfy the assurance control objective.

The systems to which security enforcement mechanisms have been added, rather
than built-in as fundamental design objectives, are not readily amenable to
extensive analysis since they lack the requisite conceptual simplicity of a
security kernel. This is because their TCB extends to cover much of the
entire system. Hence, their degree of trustworthiness can best be ascertained
only by obtaining test results. Since no test procedure for something as
complex as a computer system can be truly exhaustive, there is always the
possibility that a subsequent penetration attempt could succeed. It is for
this reason that such systems must fall into the lower evaluation classes.

On the other hand, those systems that are designed and engineered to support
the TCB concepts are more amenable to analysis and structured testing. Formal
methods can be used to analyze the correctness of their reference validation
mechanisms in enforcing the system’s security policy. Other methods,
including less-formal arguments, can be used in order to substantiate claims
for the completeness of their access mediation and their degree of
tamper-resistance. More confidence can be placed in the results of this
analysis and in the thoroughness of the structured testing than can be placed
in the results for less methodically structured systems. For these reasons,
it appears reasonable to conclude that these systems could be used in
higher-risk environments. Successful implementations of such systems would be
placed in the higher evaluation classes.

6.5 The Classes

It is highly desirable that there be only a small number of overall evaluation
classes. Three major divisions have been identified in the evaluation
criteria with a fourth division reserved for those systems that have been
evaluated and found to offer unacceptable security protection. Within each
major evaluation division, it was found that “intermediate” classes of trusted
system design and development could meaningfully be defined. These
intermediate classes have been designated in the criteria because they
identify systems that:

* are viewed to offer significantly better protection and assurance
than would systems that satisfy the basic requirements for their
evaluation class; and

* offer reason to believe that they could eventually be evolved such
that they would satisfy the requirements for the next higher
evaluation class.

Except within division A, it is not anticipated that additional “intermediate”
evaluation classes satisfying the two characteristics described above will be
identified.

Distinctions in terms of system architecture, security policy enforcement, and
evidence of credibility between evaluation classes have been defined such that
the “jump” between evaluation classes would require a considerable investment
of effort on the part of implementors. Correspondingly, systems in the higher
evaluation classes are expected to be exposed to significantly greater risk in
their operational environments.

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security requirements and Section 5
presents the control objectives for Trusted Computer Systems. They are
general requirements, useful and necessary, for the development of all secure
systems. However, when designing systems that will be used to process
classified or other sensitive information, functional requirements for meeting
the Control Objectives become more specific. There is a large body of policy
laid down in the form of Regulations, Directives, Presidential Executive
Orders, and OMB Circulars that form the basis of the procedures for the
handling and processing of Federal information in general and classified
information specifically. This section presents pertinent excerpts from these
policy statements and discusses their relationship to the Control Objectives.

7.1 Established Federal Policies

A significant number of computer security policies and associated requirements
have been promulgated by Federal government elements. The interested reader
is referred to reference [32] which analyzes the need for trusted systems in
the civilian agencies of the Federal government, as well as in state and local
governments and in the private sector. This reference also details a number
of relevant Federal statutes, policies and requirements not treated further
below.

Security guidance for Federal automated information systems is provided by the
Office of Management and Budget. Two specifically applicable Circulars have
been issued. OMB Circular No. A-71, Transmittal Memorandum No. 1, “Security
of Federal Automated Information Systems,”[26] directs each executive agency
to establish and maintain a computer security program. It makes the head of
each executive branch department and agency responsible “for assuring an
adequate level of security for all agency data whether processed in-house or
commercially. This includes responsibility for the establishment of physical,
administrative and technical safeguards required to adequately protect
personal, proprietary or other sensitive data not subject to national security
regulations, as well as national security data.”[26, para. 4 p. 2]

OMB Circular No. A-123, “Internal Control Systems,”[27] issued to help
eliminate fraud, waste, and abuse in government programs, requires: (a) agency
heads to issue internal control directives and assign responsibility, (b)
managers to review programs for vulnerability, and (c) managers to perform
periodic reviews to evaluate strengths and update controls. Soon after
promulgation of OMB Circular A-123, the relationship of its internal control
requirements to building secure computer systems was recognized.[4] While not
stipulating computer controls specifically, the definition of Internal
Controls in A-123 makes it clear that computer systems are to be included:

“Internal Controls – The plan of organization and all of the methods and
measures adopted within an agency to safeguard its resources, assure the
accuracy and reliability of its information, assure adherence to
applicable laws, regulations and policies, and promote operational
economy and efficiency.”[27, sec. 4.C]

The matter of classified national security information processed by ADP
systems was one of the first areas given serious and extensive concern in
computer security. The computer security policy documents promulgated as a
result contain generally more specific and structured requirements than most,
keyed in turn to an authoritative basis that itself provides a rather clearly
articulated and structured information security policy. This basis, Executive
Order 12356, “National Security Information,” sets forth requirements for the
classification, declassification and safeguarding of “national security
information” per se.[14]

7.2 DoD Policies

Within the Department of Defense, these broad requirements are implemented and
further specified primarily through two vehicles: 1) DoD Regulation 5200.1-R
[7], which applies to all components of the DoD as such, and 2) DoD 5220.22-M,
“Industrial Security Manual for Safeguarding Classified Information” [11],
which applies to contractors included within the Defense Industrial Security
Program. Note that the latter transcends DoD as such, since it applies not
only to any contractors handling classified information for any DoD component,
but also to the contractors of eighteen other Federal organizations for whom
the Secretary of Defense is authorized to act in rendering industrial security
services.*

____________________________________________________________
* i.e., NASA, Commerce Department, GSA, State Department,
Small Business Administration, National Science Foundation,
Treasury Department, Transportation Department, Interior
Department, Agriculture Department, Health and Human
Services Department, Labor Department, Environmental
Protection Agency, Justice Department, U.S. Arms Control and
Disarmament Agency, Federal Emergency Management Agency,
Federal Reserve System, and U.S. General Accounting Office.
____________________________________________________________

For ADP systems, these information security requirements are further amplified
and specified in: 1) DoD Directive 5200.28 [8] and DoD Manual 5200.28-M [9],
for DoD components; and 2) Section XIII of DoD 5220.22-M [11] for contractors.
DoD Directive 5200.28, “Security Requirements for Automatic Data Processing
(ADP) Systems,” stipulates: “Classified material contained in an ADP system
shall be safeguarded by the continuous employment of protective features in
the system’s hardware and software design and configuration . . . .”[8,
sec. IV] Furthermore, it is required that ADP systems that “process, store,
or use classified data and produce classified information will, with
reasonable dependability, prevent:

a. Deliberate or inadvertent access to classified material by
unauthorized persons, and

b. Unauthorized manipulation of the computer and its associated
peripheral devices.”[8, sec. I B.3]

Requirements equivalent to these appear within DoD 5200.28-M [9] and in DoD
5220.22-M [11].

From requirements imposed by these regulations, directives and circulars, the
three components of the Security Policy Control Objective, i.e., Mandatory and
Discretionary Security and Marking, as well as the Accountability and
Assurance Control Objectives, can be functionally defined for DoD
applications. The following discussion provides further specificity in Policy
for these Control Objectives.

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking is: “Systems that are designed
to enforce a mandatory security policy must store and preserve the
integrity of classification or other sensitivity labels for all
information. Labels exported from the system must be accurate
representations of the corresponding internal sensitivity labels
being exported.”

DoD 5220.22-M, “Industrial Security Manual for Safeguarding
Classified Information,” explains in paragraph 11 the reasons for
marking information:

“Designation by physical marking, notation or other means
serves to inform and to warn the holder about the
classification designation of the information which requires
protection in the interest of national security. The degree
of protection against unauthorized disclosure which will be
required for a particular level of classification is directly
commensurate with the marking designation which is assigned
to the material.”[11]

Marking requirements are given in a number of policy statements.

Executive Order 12356 (Sections 1.5.a and 1.5.a.1) requires that
classification markings “shall be shown on the face of all
classified documents, or clearly associated with other forms of
classified information in a manner appropriate to the medium
involved.”[14]

DoD Regulation 5200.1-R (Section 1-500) requires that: “. . .
information or material that requires protection against
unauthorized disclosure in the interest of national security shall
be classified in one of three designations, namely: ‘Top Secret,’
‘Secret’ or ‘Confidential.’”[7] (By extension, for use in computer
processing, the unofficial designation “Unclassified” is used to
indicate information that does not fall under one of the other
three designations of classified information.)

DoD Regulation 5200.1-R (Section 4-304b) requires that: “ADP
systems and word processing systems employing such media shall
provide for internal classification marking to assure that
classified information contained therein that is reproduced or
generated, will bear applicable classification and associated
markings.” (This regulation provides for the exemption of certain
existing systems where “internal classification and applicable
associated markings cannot be implemented without extensive system
modifications.”[7] However, it is clear that future DoD ADP
systems must be able to provide applicable and accurate labels for
classified and other sensitive information.)

DoD Manual 5200.28-M (Section IV, 4-305d) requires the following:
“Security Labels – All classified material accessible by or within
the ADP system shall be identified as to its security
classification and access or dissemination limitations, and all
output of the ADP system shall be appropriately marked.”[9]

7.3.2 Mandatory Security

The control objective for mandatory security is: “Security
policies defined for systems that are used to process classified
or other specifically categorized sensitive information must
include provisions for the enforcement of mandatory access control
rules. That is, they must include a set of rules for controlling
access based directly on a comparison of the individual’s
clearance or authorization for the information and the
classification or sensitivity designation of the information being
sought, and indirectly on considerations of physical and other
environmental factors of control. The mandatory access control
rules must accurately reflect the laws, regulations, and general
policies from which they are derived.”

There are a number of policy statements that are related to
mandatory security.

Executive Order 12356 (Section 4.1.a) states that “a person is
eligible for access to classified information provided that a
determination of trustworthiness has been made by agency heads or
designated officials and provided that such access is essential
to the accomplishment of lawful and authorized Government
purposes.”[14]

DoD Regulation 5200.1-R (Chapter I, Section 3) defines a Special
Access Program as “any program imposing ‘need-to-know’ or access
controls beyond those normally provided for access to
Confidential, Secret, or Top Secret information. Such a program
includes, but is not limited to, special clearance, adjudication,
or investigative requirements, special designation of officials
authorized to determine ‘need-to-know’, or special lists of persons
determined to have a ‘need-to-know.’”[7, para. 1-328] This
passage distinguishes between a ‘discretionary’ determination of
need-to-know and formal need-to-know, which is implemented through
Special Access Programs. DoD Regulation 5200.1-R, paragraph 7-100
describes general requirements for trustworthiness (clearance) and
need-to-know, and states that the individual with possession,
knowledge or control of classified information has final
responsibility for determining if conditions for access have been
met. This regulation further stipulates that “no one has a right
to have access to classified information solely by virtue of rank
or position.”[7, para. 7-100]

DoD Manual 5200.28-M (Section II 2-100) states that, “Personnel
who develop, test (debug), maintain, or use programs which are
classified or which will be used to access or develop classified
material shall have a personnel security clearance and an access
authorization (need-to-know), as appropriate for the highest
classified and most restrictive category of classified material
which they will access under system constraints.”[9]

DoD Manual 5220.22-M (Paragraph 3.a) defines access as “the
ability and opportunity to obtain knowledge of classified
information. An individual, in fact, may have access to
classified information by being in a place where such information
is kept, if the security measures which are in force do not
prevent him from gaining knowledge of the classified
information.”[11]

The above mentioned Executive Order, Manual, Directives and
Regulations clearly imply that a trusted computer system must
assure that the classification labels associated with sensitive
data cannot be arbitrarily changed, since this could permit
individuals who lack the appropriate clearance to access
classified information. Also implied is the requirement that a
trusted computer system must control the flow of information so
that data from a higher classification cannot be placed in a
storage object of lower classification unless its “downgrading”
has been authorized.

7.3.3 Discretionary Security

The term discretionary security refers to a computer system’s
ability to control information on an individual basis. It stems
from the fact that even though an individual has all the formal
clearances for access to specific classified information, each
individual’s access to information must be based on a demonstrated
need-to-know. Because of this, it must be made clear that this
requirement is not discretionary in a “take it or leave it” sense.
The directives and regulations are explicit in stating that the
need-to-know test must be satisfied before access can be granted
to the classified information. The control objective for
discretionary security is: “Security policies defined for systems
that are used to process classified or other sensitive information
must include provisions for the enforcement of discretionary
access control rules. That is, they must include a consistent set
of rules for controlling and limiting access based on identified
individuals who have been determined to have a need-to-know for the
information.”

DoD Regulation 5200.1-R (Paragraph 7-100): In addition to excerpts
already provided that touch on need-to-know, this section of the
regulation stresses the need-to-know principle when it states “no
person may have access to classified information unless . . .
access is necessary for the performance of official duties.”[7]

Also, DoD Manual 5220.22-M (Section III 20.a) states that “an
individual shall be permitted to have access to classified
information only . . . when the contractor determines that access
is necessary in the performance of tasks or services essential to
the fulfillment of a contract or program, i.e., the individual has
a need-to-know.”[11]

7.4 Criteria Control Objective for Accountability

The control objective for accountability is: “Systems that are used to
process or handle classified or other sensitive information must assure
individual accountability whenever either a mandatory or discretionary
security policy is invoked. Furthermore, to assure accountability the
capability must exist for an authorized and competent agent to access and
evaluate accountability information by a secure means, within a reasonable
amount of time, and without undue difficulty.”

This control objective is supported by the following citations:

DoD Directive 5200.28 (VI.A.1) states: “Each user’s identity shall be
positively established, and his access to the system, and his activity in
the system (including material accessed and actions taken) controlled and
open to scrutiny.”[8]

DoD Manual 5200.28-M (Section V 5-100) states: “An audit log or file
(manual, machine, or a combination of both) shall be maintained as a
history of the use of the ADP System to permit a regular security review
of system activity. (e.g., The log should record security related
transactions, including each access to a classified file and the nature
of the access, e.g., logins, production of accountable classified
outputs, and creation of new classified files. Each classified file
successfully accessed [regardless of the number of individual references]
during each ‘job’ or ‘interactive session’ should also be recorded in the
audit log. Much of the material in this log may also be required to
assure that the system preserves information entrusted to it.)”[9]

DoD Manual 5200.28-M (Section IV 4-305f) states: “Where needed to assure
control of access and individual accountability, each user or specific
group of users shall be identified to the ADP System by appropriate
administrative or hardware/software measures. Such identification
measures must be in sufficient detail to enable the ADP System to provide
the user only that material which he is authorized.”[9]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(4) Maintenance of documentation on operating systems (O/S)
and all modifications thereto, and its retention for a
sufficient period of time to enable tracing of security-
related defects to their point of origin or inclusion in the
system.

. . . . . . . . . . . . . . . . .

(6) Establishment of procedures to discover, recover,
handle, and dispose of classified material improperly
disclosed through system malfunction or personnel action.

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.”[9]

DoD Manual 5220.22-M (Section XIII 111) states: “Audit Trails

a. The general security requirement for any ADP system audit
trail is that it provide a documented history of the use of
the system. An approved audit trail will permit review of
classified system activity and will provide a detailed
activity record to facilitate reconstruction of events to
determine the magnitude of compromise (if any) should a
security malfunction occur. To fulfill this basic
requirement, audit trail systems, manual, automated or a
combination of both must document significant events
occurring in the following areas of concern: (i) preparation
of input data and dissemination of output data (i.e.,
reportable interactivity between users and system support
personnel), (ii) activity involved within an ADP environment
(e.g., ADP support personnel modification of security and
related controls), and (iii) internal machine activity.

b. The audit trail for an ADP system approved to process
classified information must be based on the above three
areas and may be stylized to the particular system. All
systems approved for classified processing should contain
most if not all of the audit trail records listed below. The
contractor’s SPP documentation must identify and describe
those applicable:

1. Personnel access;

2. Unauthorized and surreptitious entry into the
central computer facility or remote terminal areas;

3. Start/stop time of classified processing indicating
pertinent systems security initiation and termination events
(e.g., upgrading/downgrading actions pursuant to paragraph
107);

4. All functions initiated by ADP system console
operators;

5. Disconnects of remote terminals and peripheral
devices (paragraph 107c);

6. Log-on and log-off user activity;

7. Unauthorized attempts to access files or programs,
as well as all open, close, create, and file destroy
actions;

8. Program aborts and anomalies including
identification information (i.e., user/program name, time
and location of incident, etc.);

9. System hardware additions, deletions and maintenance
actions;

10. Generations and modifications affecting the
security features of the system software.

c. The ADP system security supervisor or designee shall
review the audit trail logs at least weekly to assure that
all pertinent activity is properly recorded and that
appropriate action has been taken to correct any anomaly.
The majority of ADP systems in use today can develop audit
trail systems in accord with the above; however, special
systems such as weapons, communications, communications
security, and tactical data exchange and display systems,
may not be able to comply with all aspects of the above and
may require individualized consideration by the cognizant
security office.

d. Audit trail records shall be retained for a period of one
inspection cycle.”[11]

7.5 Criteria Control Objective for Assurance

The control objective for assurance is: “Systems that are used to process
or handle classified or other sensitive information must be designed to
guarantee correct and accurate interpretation of the security policy and
must not distort the intent of that policy. Assurance must be provided
that correct implementation and operation of the policy exists throughout
the system’s life-cycle.”

A basis for this objective can be found in the following sections of DoD
Directive 5200.28:

DoD Directive 5200.28 (IV.B.1) stipulates: “Generally, security of an ADP
system is most effective and economical if the system is designed
originally to provide it. Each Department of Defense Component
undertaking design of an ADP system which is expected to process, store,
use, or produce classified material shall: From the beginning of the
design process, consider the security policies, concepts, and measures
prescribed in this Directive.”[8]

DoD Directive 5200.28 (IV.C.5.a) states: “Provision may be made to permit
adjustment of ADP system area controls to the level of protection
required for the classification category and type(s) of material actually
being handled by the system, provided change procedures are developed and
implemented which will prevent both the unauthorized access to classified
material handled by the system and the unauthorized manipulation of the
system and its components. Particular attention shall be given to the
continuous protection of automated system security measures, techniques
and procedures when the personnel security clearance level of users
having access to the system changes.”[8]

DoD Directive 5200.28 (VI.A.2) states: “Environmental Control. The ADP
System shall be externally protected to minimize the likelihood of
unauthorized access to system entry points, access to classified
information in the system, or damage to the system.”[8]

DoD Manual 5200.28-M (Section I 1-102b) states:

“Component’s Designated Approving Authorities, or their designees
for this purpose . . . will assure:

. . . . . . . . . . . . . . . . .

(5) Supervision, monitoring, and testing, as appropriate, of
changes in an approved ADP System which could affect the
security features of the system, so that a secure system is
maintained.

. . . . . . . . . . . . . . . . .

(7) Proper disposition and correction of security
deficiencies in all approved ADP Systems, and the effective
use and disposition of system housekeeping or audit records,
records of security violations or security-related system
malfunctions, and records of tests of the security features
of an ADP System.

(8) Conduct of competent system ST&E, timely review of
system ST&E reports, and correction of deficiencies needed
to support conditional or final approval or disapproval of
an ADP System for the processing of classified information.

(9) Establishment, where appropriate, of a central ST&E
coordination point for the maintenance of records of
selected techniques, procedures, standards, and tests used
in the testing and evaluation of security features of ADP
Systems which may be suitable for validation and use by
other Department of Defense Components.”[9]

DoD Manual 5220.22-M (Section XIII 103a) requires: “the initial approval,
in writing, of the cognizant security office prior to processing any
classified information in an ADP system. This section requires
reapproval by the cognizant security office for major system
modifications made subsequent to initial approval. Reapprovals will be
required because of (i) major changes in personnel access requirements,
(ii) relocation or structural modification of the central computer
facility, (iii) additions, deletions or changes to main frame, storage or
input/output devices, (iv) system software changes impacting security
protection features, (v) any change in clearance, declassification, audit
trail or hardware/software maintenance procedures, and (vi) other system
changes as determined by the cognizant security office.”[11]

A major component of assurance, life-cycle assurance, is concerned with
testing ADP systems both in the development phase as well as during
operation. DoD Directive 5215.1 (Section F.2.C.(2)) requires
“evaluations of selected industry and government-developed trusted
computer systems against these criteria.”[10]

8.0 A GUIDELINE ON COVERT CHANNELS

A covert channel is any communication channel that can be exploited by a
process to transfer information in a manner that violates the system’s
security policy. There are two types of covert channels: storage channels and
timing channels. Covert storage channels include all vehicles that would
allow the direct or indirect writing of a storage location by one process and
the direct or indirect reading of it by another. Covert timing channels
include all vehicles that would allow one process to signal information to
another process by modulating its own use of system resources in such a way
that the change in response time observed by the second process would provide
information.

From a security perspective, covert channels with low bandwidths represent a
lower threat than those with high bandwidths. However, for many types of
covert channels, techniques used to reduce the bandwidth below a certain rate
(which depends on the specific channel mechanism and the system architecture)
also have the effect of degrading the performance provided to legitimate
system users. Hence, a trade-off between system performance and covert
channel bandwidth must be made. Because of the threat of compromise that
would be present in any multilevel computer system containing classified or
sensitive information, such systems should not contain covert channels with
high bandwidths. This guideline is intended to provide system developers with
an idea of just how high a “high” covert channel bandwidth is.

A covert channel bandwidth that exceeds a rate of one hundred (100) bits per
second is considered “high” because 100 bits per second is the approximate
rate at which many computer terminals are run. It does not seem appropriate
to call a computer system “secure” if information can be compromised at a rate
equal to the normal output rate of some commonly used device.

In any multilevel computer system there are a number of relatively
low-bandwidth covert channels whose existence is deeply ingrained in the
system design. Faced with the large potential cost of reducing the bandwidths
of such covert channels, it is felt that those with maximum bandwidths of less
than one (1) bit per second are acceptable in most application environments.
Though maintaining acceptable performance in some systems may make it
impractical to eliminate all covert channels with bandwidths of 1 or more bits
per second, it is possible to audit their use without adversely affecting
system performance. This audit capability provides the system administration
with a means of detecting — and procedurally correcting — significant
compromise. Therefore, a Trusted Computing Base should provide, wherever
possible, the capability to audit the use of covert channel mechanisms with
bandwidths that may exceed a rate of one (1) bit in ten (10) seconds.
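
The figures above reduce to simple arithmetic. The Python sketch below,
using an invented storage channel as its example, merely shows how an
engineering estimate of bandwidth might be compared with the 100 bit per
second, 1 bit per second, and 1 bit in 10 seconds figures; it is not part
of the guideline.

    def classify_channel(bits_per_event, events_per_second):
        """Compare an estimated covert channel bandwidth with the guideline figures."""
        bw = bits_per_event * events_per_second   # estimated bits per second
        if bw > 100.0:
            return bw, "high: should not be present in a multilevel system"
        if bw >= 1.0:
            return bw, "reduce or eliminate where practical; otherwise audit"
        if bw >= 0.1:                             # one bit in ten seconds
            return bw, "audit its use where possible"
        return bw, "generally acceptable"

    # A hypothetical channel signalling 1 bit per resource-exhaustion error,
    # exercisable twice per second:
    print(classify_channel(1, 2))
    # -> (2.0, 'reduce or eliminate where practical; otherwise audit')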

The covert channel problem has been addressed by a number of authors. The
interested reader is referred to references [5], [6], [19], [21], [22], [23],
and [29].

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement includes a capability to support an
unspecified number of hierarchical classifications and an unspecified number
of non-hierarchical categories at each hierarchical level. To encourage
consistency and portability in the design and development of the National
Security Establishment trusted computer systems, it is desirable for all such
systems to be able to support a minimum number of levels and categories. The
following suggestions are provided for this purpose:

* The number of hierarchical classifications should be greater than or
equal to eight (8).

* The number of non-hierarchical categories should be greater than or
equal to twenty-nine (29).
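
One conventional way of meeting these minimums, shown here purely as a
sketch and not as a requirement, is to carry each sensitivity label as a
small integer level together with a category bit vector. The Python
fragment below assumes exactly the suggested minimums of 8 levels and 29
categories.

    NUM_LEVELS = 8          # hierarchical classifications (3 bits)
    NUM_CATEGORIES = 29     # non-hierarchical categories (29-bit mask)

    def make_label(level, categories):
        assert 0 <= level < NUM_LEVELS
        mask = 0
        for c in categories:
            assert 0 <= c < NUM_CATEGORIES
            mask |= 1 << c
        return (level, mask)

    def dominates(a, b):
        # Same dominance test as in Section 6.2: higher-or-equal level and
        # a superset of categories.
        return a[0] >= b[0] and (a[1] & b[1]) == b[1]

    assert dominates(make_label(3, {0, 4}), make_label(2, {4}))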

10.0 A GUIDELINE ON SECURITY TESTING

These guidelines are provided to give an indication of the extent and
sophistication of testing undertaken by the DoD Computer Security Center
during the Formal Product Evaluation process. Organizations wishing to use
“Department of Defense Trusted Computer System Evaluation Criteria” for
performing their own evaluations may find this section useful for planning
purposes.

As in Part I, highlighting is used to indicate changes in the guidelines from
the next lower division.

10.1 Testing for Division C

10.1.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent. Team members shall be able to follow test plans
prepared by the system developer and suggest additions, shall
be familiar with the “flaw hypothesis” or equivalent security
testing methodology, and shall have assembly level programming
experience. Before testing begins, the team members shall have
functional knowledge of, and shall have completed the system
developer’s internals course for, the system being evaluated.

10.1.2 Testing

The team shall have “hands-on” involvement in an independent run
of the tests used by the system developer. The team shall
independently design and implement at least five system-specific
tests in an attempt to circumvent the security mechanisms of the
system. The elapsed time devoted to testing shall be at least
one month and need not exceed three months. There shall be no
fewer than twenty hands-on hours spent carrying out system
developer-defined tests and test team-defined tests.

10.2 Testing for Division B

10.2.1 Personnel

The security testing team shall consist of at least two
individuals with bachelor’s degrees in Computer Science or the
equivalent and at least one individual with a master’s degree in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall have previously
completed a security test on another system.

10.2.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least fifteen system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least two months and need not exceed four months.
There shall be no fewer than thirty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

10.3 Testing for Division A

10.3.1 Personnel

The security testing team shall consist of at least one
individual with a bachelor’s degree in Computer Science or the
equivalent and at least two individuals with master’s degrees in
Computer Science or equivalent. Team members shall be able to
follow test plans prepared by the system developer and suggest
additions, shall be conversant with the “flaw hypothesis” or
equivalent security testing methodology, shall be fluent in the
TCB implementation language(s), and shall have assembly level
programming experience. Before testing begins, the team members
shall have functional knowledge of, and shall have completed the
system developer’s internals course for, the system being
evaluated. At least one team member shall be familiar enough
with the system hardware to understand the maintenance diagnostic
programs and supporting hardware documentation. At least two
team members shall have previously completed a security test on
another system. At least one team member shall have
demonstrated system level programming competence on the system
under test to a level of complexity equivalent to adding a device
driver to the system.

10.3.2 Testing

The team shall have “hands-on” involvement in an independent run
of the test package used by the system developer to test
security-relevant hardware and software. The team shall
independently design and implement at least twenty-five system-
specific tests in an attempt to circumvent the security
mechanisms of the system. The elapsed time devoted to testing
shall be at least three months and need not exceed six months.
There shall be no fewer than fifty hands-on hours per team
member spent carrying out system developer-defined tests and
test team-defined tests.

APPENDIX A

Commercial Product Evaluation Process

“Department of Defense Trusted Computer System Evaluation Criteria” forms the
basis upon which the Computer Security Center will carry out the commercial
computer security evaluation process. This process is focused on commercially
produced and supported general-purpose operating system products that meet the
needs of government departments and agencies. The formal evaluation is aimed
at “off-the-shelf” commercially supported products and is completely divorced
from any consideration of overall system performance, potential applications,
or particular processing environments. The evaluation provides a key input to
a computer system security approval/accreditation. However, it does not
constitute a complete computer system security evaluation. A complete study
(e.g., as in reference [18]) must consider additional factors dealing with the
system in its unique environment, such as its proposed security mode of
operation, specific users, applications, data sensitivity, physical and
personnel security, administrative and procedural security, TEMPEST, and
communications security.

The product evaluation process carried out by the Computer Security Center has
three distinct elements:

* Preliminary Product Evaluation – An informal dialogue between a vendor
and the Center in which technical information is exchanged to create a
common understanding of the vendor’s product, the criteria, and the
rating that may be expected to result from a formal product evaluation.

* Formal Product Evaluation – A formal evaluation, by the Center, of a
product that is available to the DoD, and that results in that product
and its assigned rating being placed on the Evaluated Products List.

* Evaluated Products List – A list of products that have been subjected
to formal product evaluation and their assigned ratings.

PRELIMINARY PRODUCT EVALUATION

Since it is generally very difficult to add effective security measures late
in a product’s life cycle, the Center is interested in working with system
vendors in the early stages of product design. A preliminary product
evaluation allows the Center to consult with computer vendors on computer
security issues found in products that have not yet been formally announced.

A preliminary evaluation is typically initiated by computer system vendors who
are planning new computer products that feature security or major
security-related upgrades to existing products. After an initial meeting
between the vendor and the Center, appropriate non-disclosure agreements are
executed that require the Center to maintain the confidentiality of any
proprietary information disclosed to it. Technical exchange meetings follow
in which the vendor provides details about the proposed product (particularly
its internal designs and goals) and the Center provides expert feedback to the
vendor on potential computer security strengths and weaknesses of the vendor’s
design choices, as well as relevant interpretation of the criteria. The
preliminary evaluation is typically terminated when the product is completed
and ready for field release by the vendor. Upon termination, the Center
prepares a wrap-up report for the vendor and for internal distribution within
the Center. Those reports containing proprietary information are not
available to the public.

During preliminary evaluation, the vendor is under no obligation to actually
complete or market the potential product. The Center is, likewise, not
committed to conduct a formal product evaluation. A preliminary evaluation
may be terminated by either the Center or the vendor when one notifies the
other, in writing, that it is no longer advantageous to continue the
evaluation.

FORMAL PRODUCT EVALUATION

The formal product evaluation provides a key input to certification of a
computer system for use in National Security Establishment applications and is
the sole basis for a product being placed on the Evaluated Products List.

A formal product evaluation begins with a request by a vendor for the Center
to evaluate a product for which the product itself and accompanying
documentation needed to meet the requirements defined by this publication are
complete. Non-disclosure agreements are executed and a formal product
evaluation team is formed by the Center. An initial meeting is then held with
the vendor to work out the schedule for the formal evaluation. Since testing
of the implemented product forms an important part of the evaluation process,
access by the evaluation team to a working version of the system is negotiated
with the vendor. Additional support required from the vendor includes
complete design documentation, source code, and access to vendor personnel who
can answer detailed questions about specific portions of the product. The
evaluation team tests the product against each requirement, making any
necessary interpretations of the criteria with respect to the product being
evaluated.

The evaluation team writes a two-part final report on their findings about the
system. The first part is publicly available (containing no proprietary
information) and contains the overall class rating assigned to the system and
the details of the evaluation team’s findings when comparing the product
against the evaluation criteria. The second part of the evaluation report
contains vulnerability analyses and other detailed information supporting the
rating decision. Since this part may contain proprietary or other sensitive
information it will be distributed only within the U.S. Government on a
strict need-to-know and non-disclosure basis, and to the vendor. No portion
of the evaluation results will be withheld from the vendor.

APPENDIX B

Summary of Evaluation Criteria Divisions

The divisions of systems recognized under the trusted computer system
evaluation criteria are as follows. Each division represents a major
improvement in the overall confidence one can place in the system to protect
classified and other sensitive information.

Division (D): Minimal Protection

This division contains only one class. It is reserved for those systems that
have been evaluated but that fail to meet the requirements for a higher
evaluation class.

Division (C): Discretionary Protection

Classes in this division provide for discretionary (need-to-know) protection
and, through the inclusion of audit capabilities, for accountability of
subjects and the actions they initiate.

Division (B): Mandatory Protection

The notion of a TCB that preserves the integrity of sensitivity labels and
uses them to enforce a set of mandatory access control rules is a major
requirement in this division. Systems in this division must carry the
sensitivity labels with major data structures in the system. The system
developer also provides the security policy model on which the TCB is based
and furnishes a specification of the TCB. Evidence must be provided to
demonstrate that the reference monitor concept has been implemented.

Division (A): Verified Protection

This division is characterized by the use of formal security verification
methods to assure that the mandatory and discretionary security controls
employed in the system can effectively protect classified or other sensitive
information stored or processed by the system. Extensive documentation is
required to demonstrate that the TCB meets the security requirements in all
aspects of design, development and implementation.

APPENDIX C

Summary of Evaluation Criteria Classes

The classes of systems recognized under the trusted computer system evaluation
criteria are as follows. They are presented in the order of increasing
desirability from a computer security point of view.

Class (D): Minimal Protection

This class is reserved for those systems that have been evaluated but that
fail to meet the requirements for a higher evaluation class.

Class (C1): Discretionary Security Protection

The Trusted Computing Base (TCB) of a class (C1) system nominally satisfies
the discretionary security requirements by providing separation of users and
data. It incorporates some form of credible controls capable of enforcing
access limitations on an individual basis, i.e., ostensibly suitable for
allowing users to be able to protect project or private information and to
keep other users from accidentally reading or destroying their data. The
class (C1) environment is expected to be one of cooperating users processing
data at the same level(s) of sensitivity.

Class (C2): Controlled Access Protection

Systems in this class enforce a more finely grained discretionary access
control than (C1) systems, making users individually accountable for their
actions through login procedures, auditing of security-relevant events, and
resource isolation.

Class (B1): Labeled Security Protection

Class (B1) systems require all the features required for class (C2). In
addition, an informal statement of the security policy model, data labeling,
and mandatory access control over named subjects and objects must be present.
The capability must exist for accurately labeling exported information. Any
flaws identified by testing must be removed.

Class (B2): Structured Protection

In class (B2) systems, the TCB is based on a clearly defined and documented
formal security policy model that requires the discretionary and mandatory
access control enforcement found in class (B1) systems be extended to all
subjects and objects in the ADP system. In addition, covert channels are
addressed. The TCB must be carefully structured into protection-critical and
non-protection-critical elements. The TCB interface is well-defined and the
TCB design and implementation enable it to be subjected to more thorough
testing and more complete review. Authentication mechanisms are strengthened,
trusted facility management is provided in the form of support for system
administrator and operator functions, and stringent configuration management
controls are imposed. The system is relatively resistant to penetration.

Class (B3): Security Domains

The class (B3) TCB must satisfy the reference monitor requirements that it
mediate all accesses of subjects to objects, be tamperproof, and be small
enough to be subjected to analysis and tests. To this end, the TCB is
structured to exclude code not essential to security policy enforcement, with
significant system engineering during TCB design and implementation directed
toward minimizing its complexity. A security administrator is supported,
audit mechanisms are expanded to signal security-relevant events, and system
recovery procedures are required. The system is highly resistant to
penetration.

Class (A1): Verified Design

Systems in class (A1) are functionally equivalent to those in class (B3) in
that no additional architectural features or policy requirements are added.
The distinguishing feature of systems in this class is the analysis derived
from formal design specification and verification techniques and the resulting
high degree of assurance that the TCB is correctly implemented. This
assurance is developmental in nature, starting with a formal model of the
security policy and a formal top-level specification (FTLS) of the design. In
keeping with the extensive design and development analysis of the TCB required
of systems in class (A1), more stringent configuration management is required
and procedures are established for securely distributing the system to sites.
A system security administrator is supported.

APPENDIX D

Requirement Directory

This appendix lists requirements defined in “Department of Defense Trusted
Computer System Evaluation Criteria” alphabetically rather than by class. It
is provided to assist in following the evolution of a requirement through the
classes. For each requirement, three types of criteria may be present. Each
will be preceded by the word: NEW, CHANGE, or ADD to indicate the following:

NEW: Any criteria appearing in a lower class are superseded
by the criteria that follow.

CHANGE: The criteria that follow have appeared in a lower class
but are changed for this class. Highlighting is used
to indicate the specific changes to previously stated
criteria.

ADD: The criteria that follow have not been required for any
lower class, and are added in this class to the
previously stated criteria for this requirement.

Abbreviations are used as follows:

NR: (No Requirement) This requirement is not included in
this class.

NAR: (No Additional Requirements) This requirement does not
change from the previous class.

The reader is referred to Part I of this document when placing new criteria
for a requirement into the complete context for that class.

Figure 1 provides a pictorial summary of the evolution of requirements through
the classes.

Audit

C1: NR.

C2: NEW: The TCB shall be able to create, maintain, and protect from
modification or unauthorized access or destruction an audit trail of
accesses to the objects it protects. The audit data shall be
protected by the TCB so that read access to it is limited to those
who are authorized for audit data. The TCB shall be able to record
the following types of events: use of identification and
authentication mechanisms, introduction of objects into a user’s
address space (e.g., file open, program initiation), deletion of
objects, and actions taken by computer operators and system
administrators and/or system security officers. For each recorded
event, the audit record shall identify: date and time of the event,
user, type of event, and success or failure of the event. For
identification/authentication events the origin of request (e.g.,
terminal ID) shall be included in the audit record. For events that
introduce an object into a user’s address space and for object
deletion events the audit record shall include the name of the object.
The ADP system administrator shall be able to selectively audit the
actions of any one or more users based on individual identity.

B1: CHANGE: For events that introduce an object into a user’s address
space and for object deletion events the audit record shall include
the name of the object and the object’s security level. The ADP
system administrator shall be able to selectively audit the actions
of any one or more users based on individual identity and/or object
security level.

ADD: The TCB shall also be able to audit any override of
human-readable output markings.

B2: ADD: The TCB shall be able to audit the identified events that may be
used in the exploitation of covert storage channels.

B3: ADD: The TCB shall contain a mechanism that is able to monitor the
occurrence or accumulation of security auditable events that may
indicate an imminent violation of security policy. This mechanism
shall be able to immediately notify the security administrator when
thresholds are exceeded.

A1: NAR.
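
As a non-normative sketch of the record content called for at classes (C2)
and (B1), the Python fragment below gathers the enumerated fields into a
single audit entry; the field names are illustrative only.

    import datetime

    def audit_record(user, event_type, success, origin=None,
                     object_name=None, object_level=None):
        """Fields follow the C2 list above; object_level is the B1 addition."""
        return {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user,                  # individual accountability
            "event": event_type,           # e.g., login, file open, object deletion
            "success": success,
            "origin": origin,              # e.g., terminal ID for authentication events
            "object": object_name,         # for object introduction/deletion events
            "object_level": object_level,  # B1: the object's security level
        }

    rec = audit_record("jones", "file_open", True,
                       object_name="payroll.dat", object_level="SECRET")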

Configuration Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: During development and maintenance of the TCB, a configuration
management system shall be in place that maintains control of changes
to the descriptive top-level specification, other design data,
implementation documentation, source code, the running version of the
object code, and test fixtures and documentation. The configuration
management system shall assure a consistent mapping among all
documentation and code associated with the current version of the TCB.
Tools shall be provided for generation of a new version of the TCB
from source code. Also available shall be tools for comparing a
newly generated version with the previous TCB version in order to
ascertain that only the intended changes have been made in the code
that will actually be used as the new version of the TCB.

B3: NAR.

A1: CHANGE: During the entire life-cycle, i.e., during the design,
development, and maintenance of the TCB, a configuration management
system shall be in place for all security-relevant hardware, firmware,
and software that maintains control of changes to the formal model,
the descriptive and formal top-level specifications, other design
data, implementation documentation, source code, the running version
of the object code, and test fixtures and documentation. Also
available shall be tools, maintained under strict configuration
control, for comparing a newly generated version with the previous
TCB version in order to ascertain that only the intended changes have
been made in the code that will actually be used as the new version
of the TCB.

ADD: A combination of technical, physical, and procedural safeguards
shall be used to protect from unauthorized modification or
destruction the master copy or copies of all material used to
generate the TCB.

Covert Channel Analysis

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The system developer shall conduct a thorough search for covert
storage channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth of
each identified channel. (See the Covert Channels Guideline section.)

B3: CHANGE: The system developer shall conduct a thorough search for
covert channels and make a determination (either by actual
measurement or by engineering estimation) of the maximum bandwidth
of each identified channel.

A1: ADD: Formal methods shall be used in the analysis.
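
For illustration only: an engineering estimate of the kind B2 and B3 call for
usually reduces to dividing the number of bits signalled per use of the channel
by the time one use takes. The timing figures below are invented.

    def covert_storage_bandwidth(bits_per_transition: float,
                                 seconds_per_transition: float) -> float:
        """Upper-bound bandwidth in bits per second for a storage channel in
        which the sender encodes bits_per_transition bits by setting a shared
        attribute and the receiver needs seconds_per_transition to observe
        and reset it."""
        return bits_per_transition / seconds_per_transition

    # Invented figures: 1 bit signalled via a shared disk-quota flag, with a
    # 0.02 s set/observe/reset cycle, gives a 50 bit/s upper bound.
    print(covert_storage_bandwidth(1, 0.02))   # -> 50.0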

Design Documentation

C1: NEW: Documentation shall be available that provides a description of
the manufacturer’s philosophy of protection and an explanation of how
this philosophy is translated into the TCB. If the TCB is composed
of distinct modules, the interfaces between these modules shall be
described.

C2: NAR.

B1: ADD: An informal or formal description of the security policy model
enforced by the TCB shall be available and an explanation provided to
show that it is sufficient to enforce the security policy. The
specific TCB protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.

B2: CHANGE: The interfaces between the TCB modules shall be described. A
formal description of the security policy model enforced by the TCB
shall be available and proven that it is sufficient to enforce the
security policy.

ADD: The descriptive top-level specification (DTLS) shall be shown to
be an accurate description of the TCB interface. Documentation shall
describe how the TCB implements the reference monitor concept and
give an explanation why it is tamperproof, cannot be bypassed, and is
correctly implemented. Documentation shall describe how the TCB is
structured to facilitate testing and to enforce least privilege.
This documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting the
channels. All auditable events that may be used in the exploitation
of known covert storage channels shall be identified. The bandwidths
of known covert storage channels, the use of which is not detectable
by the auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

B3: ADD: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the DTLS.
The elements of the DTLS shall be shown, using informal techniques,
to correspond to the elements of the TCB.

A1: CHANGE: The TCB implementation (i.e., in hardware, firmware, and
software) shall be informally shown to be consistent with the formal
top-level specification (FTLS). The elements of the FTLS shall be
shown, using informal techniques, to correspond to the elements of
the TCB.

ADD: Hardware, firmware, and software mechanisms not dealt with in
the FTLS but strictly internal to the TCB (e.g., mapping registers,
direct memory access I/O) shall be clearly described.

Design Specification and Verification

C1: NR.

C2: NR.

B1: NEW: An informal or formal model of the security policy supported by
the TCB shall be maintained that is shown to be consistent with its
axioms.

B2: CHANGE: A formal model of the security policy supported by the TCB
shall be maintained that is proven consistent with its axioms.

ADD: A descriptive top-level specification (DTLS) of the TCB shall be
maintained that completely and accurately describes the TCB in terms
of exceptions, error messages, and effects. It shall be shown to be
an accurate description of the TCB interface.

B3: ADD: A convincing argument shall be given that the DTLS is consistent
with the model.

A1: CHANGE: The FTLS shall be shown to be an accurate description of the
TCB interface. A convincing argument shall be given that the DTLS is
consistent with the model and a combination of formal and informal
techniques shall be used to show that the FTLS is consistent with the
model.

ADD: A formal top-level specification (FTLS) of the TCB shall be
maintained that accurately describes the TCB in terms of exceptions,
error messages, and effects. The DTLS and FTLS shall include those
components of the TCB that are implemented as hardware and/or
firmware if their properties are visible at the TCB interface. This
verification evidence shall be consistent with that provided within
the state-of-the-art of the particular Computer Security Center-
endorsed formal specification and verification system used. Manual
or other mapping of the FTLS to the TCB source code shall be
performed to provide evidence of correct implementation.

Device Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support the assignment of minimum and maximum
security levels to all attached physical devices. These security
levels shall be used by the TCB to enforce constraints imposed by
the physical environments in which the devices are located.

B3: NAR.

A1: NAR.
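
For illustration only: the B2 device-label requirement amounts to a range
check in the dominance ordering. The level representation (classification
rank plus category set) is an assumption of this sketch.

    # A security level here is (hierarchical classification rank, frozenset of categories).
    def dominates(a, b):
        return a[0] >= b[0] and a[1] >= b[1]   # superset comparison on categories

    def device_accepts(level, dev_min, dev_max):
        # Information at `level` may pass through the device only if the
        # device minimum is dominated by `level` and `level` is dominated by
        # the device maximum imposed by the physical environment.
        return dominates(level, dev_min) and dominates(dev_max, level)

    SECRET_NATO = (2, frozenset({"NATO"}))
    printer_min, printer_max = (0, frozenset()), (2, frozenset({"NATO"}))
    print(device_accepts(SECRET_NATO, printer_min, printer_max))   # -> True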

Discretionary Access Control

C1: NEW: The TCB shall define and control access between named users and
named objects (e.g., files and programs) in the ADP system. The
enforcement mechanism (e.g., self/group/public controls, access
control lists) shall allow users to specify and control sharing of
those objects by named individuals or defined groups or both.

C2: CHANGE: The enforcement mechanism (e.g., self/group/public controls,
access control lists) shall allow users to specify and control
sharing of those objects by named individuals, or defined groups of
individuals, or by both.

ADD: The discretionary access control mechanism shall, either by explicit
user action or by default, provide that objects are protected from
unauthorized access. These access controls shall be capable of
including or excluding access to the granularity of a single user.
Access permission to an object by users not already possessing access
permission shall only be assigned by authorized users.

B1: NAR.

B2: NAR.

B3: CHANGE: The enforcement mechanism (e.g., access control lists) shall
allow users to specify and control sharing of those objects. These
access controls shall be capable of specifying, for each named
object, a list of named individuals and a list of groups of named
individuals with their respective modes of access to that object.

ADD: Furthermore, for each such named object, it shall be possible to
specify a list of named individuals and a list of groups of named
individuals for which no access to the object is to be given.

A1: NAR.
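
For illustration only: one possible shape for the B3 enforcement mechanism,
with per-object access lists for named individuals and groups plus the
required exclusion lists. Deny-overrides precedence is a common
interpretation, not something the criteria above prescribe; all identifiers
are invented.

    from dataclasses import dataclass, field
    from typing import Dict, Set

    @dataclass
    class ACL:
        # Modes are strings such as "read" or "write"; group membership is
        # resolved by the caller.
        user_modes: Dict[str, Set[str]] = field(default_factory=dict)
        group_modes: Dict[str, Set[str]] = field(default_factory=dict)
        denied_users: Set[str] = field(default_factory=set)
        denied_groups: Set[str] = field(default_factory=set)

    def dac_permits(acl: ACL, user: str, groups: Set[str], mode: str) -> bool:
        # Deny-overrides precedence; the criteria only require that exclusion
        # of named users and groups be expressible.
        if user in acl.denied_users or groups & acl.denied_groups:
            return False
        if mode in acl.user_modes.get(user, set()):
            return True
        return any(mode in acl.group_modes.get(g, set()) for g in groups)

    acl = ACL(user_modes={"alice": {"read", "write"}},
              group_modes={"analysts": {"read"}},
              denied_users={"mallory"})
    print(dac_permits(acl, "bob", {"analysts"}, "read"))     # -> True
    print(dac_permits(acl, "mallory", {"analysts"}, "read")) # -> False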

Exportation of Labeled Information

C1: NR.

C2: NR.

B1: NEW: The TCB shall designate each communication channel and I/O
device as either single-level or multilevel. Any change in this
designation shall be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit any change in the
current security level associated with a single-level communication
channel or I/O device.

B2: NAR.

B3: NAR.

A1: NAR.

Exportation to Multilevel Devices

C1: NR.

C2: NR.

B1: NEW: When the TCB exports an object to a multilevel I/O device, the
sensitivity label associated with that object shall also be exported
and shall reside on the same physical medium as the exported
information and shall be in the same form (i.e., machine-readable or
human-readable form). When the TCB exports or imports an object over
a multilevel communication channel, the protocol used on that channel
shall provide for the unambiguous pairing between the sensitivity
labels and the associated information that is sent or received.

B2: NAR.

B3: NAR.

A1: NAR.
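
For illustration only: the criteria above do not define a wire format, but
the following sketch shows one way a protocol can keep an exported
sensitivity label unambiguously paired with the information it covers.

    import json

    def frame_export(label: str, data: bytes) -> bytes:
        """Carry the label and the data it covers in one self-describing
        frame so the pairing cannot be lost or misattributed in transit."""
        header = json.dumps({"label": label, "length": len(data)}).encode()
        return len(header).to_bytes(4, "big") + header + data

    def parse_export(frame: bytes):
        hlen = int.from_bytes(frame[:4], "big")
        header = json.loads(frame[4:4 + hlen])
        data = frame[4 + hlen:4 + hlen + header["length"]]
        return header["label"], data

    frame = frame_export("SECRET//NATO", b"payload bytes")
    print(parse_export(frame))   # -> ('SECRET//NATO', b'payload bytes')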

Exportation to Single-Level Devices

C1: NR.

C2: NR.

B1: NEW: Single-level I/O devices and single-level communication channels
are not required to maintain the sensitivity labels of the
information they process. However, the TCB shall include a mechanism
by which the TCB and an authorized user reliably communicate to
designate the single security level of information imported or
exported via single-level communication channels or I/O devices.

B2: NAR.

B3: NAR.

A1: NAR.

Identification and Authentication

C1: NEW: The TCB shall require users to identify themselves to it before
beginning to perform any other actions that the TCB is expected to
mediate. Furthermore, the TCB shall use a protected mechanism (e.g.,
passwords) to authenticate the user’s identity. The TCB shall
protect authentication data so that it cannot be accessed by any
unauthorized user.

C2: ADD: The TCB shall be able to enforce individual accountability by
providing the capability to uniquely identify each individual ADP
system user. The TCB shall also provide the capability of
associating this identity with all auditable actions taken by that
individual.

B1: CHANGE: Furthermore, the TCB shall maintain authentication data that
includes information for verifying the identity of individual users
(e.g., passwords) as well as information for determining the
clearance and authorizations of individual users. This data shall be
used by the TCB to authenticate the user’s identity and to determine
the security level and authorizations of subjects that may be created
to act on behalf of the individual user.

B2: NAR.

B3: NAR.

A1: NAR.
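
For illustration only: a minimal sketch of how the B1 authentication data
might be consulted when a subject is created on a user's behalf. The
salted-hash storage and all helper names are assumptions of this sketch; the
criteria above only require that authentication data be protected and include
clearance and authorization information.

    import hashlib, hmac, os

    def make_entry(password: str, clearance):
        # Per-user authentication data: salted password hash plus clearance.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return {"salt": salt, "digest": digest, "clearance": clearance}

    def dominates(a, b):
        return a[0] >= b[0] and a[1] >= b[1]

    def login(entry, password: str, requested_level):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                        entry["salt"], 100_000)
        if not hmac.compare_digest(candidate, entry["digest"]):
            return None                            # authentication failed
        if not dominates(entry["clearance"], requested_level):
            return None                            # level not dominated by clearance
        return {"subject_level": requested_level}  # subject acts at this level

    users = {"alice": make_entry("correct horse", (2, frozenset({"NATO"})))}
    print(login(users["alice"], "correct horse", (1, frozenset())))  # subject created
    print(login(users["alice"], "wrong password", (1, frozenset()))) # -> None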

Label Integrity

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels shall accurately represent security levels of
the specific subjects or objects with which they are associated. When
exported by the TCB, sensitivity labels shall accurately and
unambiguously represent the internal labels and shall be associated
with the information being exported.

B2: NAR.

B3: NAR.

A1: NAR.

Labeling Human-Readable Output

C1: NR.

C2: NR.

B1: NEW: The ADP system administrator shall be able to specify the
printable label names associated with exported sensitivity labels.
The TCB shall mark the beginning and end of all human-readable,
paged, hardcopy output (e.g., line printer output) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. The TCB shall, by default, mark the top and bottom of
each page of human-readable, paged, hardcopy output (e.g., line
printer output) with human-readable sensitivity labels that
properly* represent the overall sensitivity of the output or that
properly* represent the sensitivity of the information on the page.
The TCB shall, by default and in an appropriate manner, mark other
forms of human-readable output (e.g., maps, graphics) with human-
readable sensitivity labels that properly* represent the sensitivity
of the output. Any override of these marking defaults shall be
auditable by the TCB.

B2: NAR.

B3: NAR.

A1: NAR.

____________________________________________________________
* The hierarchical classification component in human-readable
sensitivity labels shall be equal to the greatest
hierarchical classification of any of the information in the
output that the labels refer to; the non-hierarchical
category component shall include all of the non-hierarchical
categories of the information in the output the labels refer
to, but no other non-hierarchical categories.
____________________________________________________________
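
For illustration only: the footnoted rule is mechanical. The banner or page
label takes the greatest hierarchical classification present and exactly the
union of the non-hierarchical categories present. A sketch with an invented
level representation:

    def output_label(item_levels):
        """item_levels: iterable of (classification_rank, set_of_categories)
        pairs for the information appearing in the output (or on one page)."""
        max_rank = max(rank for rank, _ in item_levels)
        categories = set()
        for _, cats in item_levels:
            categories |= cats          # all, and only, the categories present
        return max_rank, categories

    page = [(1, {"CRYPTO"}), (2, set()), (1, {"NATO"})]
    print(output_label(page))   # -> (2, {'CRYPTO', 'NATO'})  (set order may vary)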

Labels

C1: NR.

C2: NR.

B1: NEW: Sensitivity labels associated with each subject and storage
object under its control (e.g., process, file, segment, device) shall
be maintained by the TCB. These labels shall be used as the basis
for mandatory access control decisions. In order to import non-
labeled data, the TCB shall request and receive from an authorized
user the security level of the data, and all such actions shall be
auditable by the TCB.

B2: CHANGE: Sensitivity labels associated with each ADP system resource
(e.g., subject, storage object) that is directly or indirectly
accessible by subjects external to the TCB shall be maintained by
the TCB.

B3: NAR.

A1: NAR.

Mandatory Access Control

C1: NR.

C2: NR.

B1: NEW: The TCB shall enforce a mandatory access control policy over all
subjects and storage objects under its control (e.g., processes,
files, segments, devices). These subjects and objects shall be
assigned sensitivity labels that are a combination of hierarchical
classification levels and non-hierarchical categories, and the labels
shall be used as the basis for mandatory access control decisions.
The TCB shall be able to support two or more such security levels.
(See the Mandatory Access Control guidelines.) The following
requirements shall hold for all accesses between subjects and objects
controlled by the TCB: A subject can read an object only if the
hierarchical classification in the subject’s security level is
greater than or equal to the hierarchical classification in the
object’s security level and the non-hierarchical categories in the
subject’s security level include all the non-hierarchical categories
in the object’s security level. A subject can write an object only
if the hierarchical classification in the subject’s security level is
less than or equal to the hierarchical classification in the object’s
security level and all the non-hierarchical categories in the
subject’s security level are included in the non-hierarchical
categories in the object’s security level.

B2: CHANGE: The TCB shall enforce a mandatory access control policy over
all resources (i.e., subjects, storage objects, and I/O devices) that
are directly or indirectly accessible by subjects external to the TCB.
The following requirements shall hold for all accesses between all
subjects external to the TCB and all objects directly or indirectly
accessible by these subjects:

B3: NAR.

A1: NAR.
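
For illustration only: the B1 read and write rules above are the simple
security property and the *-property, and both reduce to the dominance
comparison defined in the Glossary. The level representation below is an
assumption of the sketch.

    def dominates(a, b):
        """a dominates b: higher-or-equal classification and a superset of categories."""
        rank_a, cats_a = a
        rank_b, cats_b = b
        return rank_a >= rank_b and cats_a >= cats_b

    def may_read(subject_level, object_level):
        return dominates(subject_level, object_level)      # "no read up"

    def may_write(subject_level, object_level):
        return dominates(object_level, subject_level)      # "no write down"

    SECRET_NATO = (2, frozenset({"NATO"}))
    CONFIDENTIAL = (1, frozenset())
    print(may_read(SECRET_NATO, CONFIDENTIAL))   # -> True
    print(may_write(SECRET_NATO, CONFIDENTIAL))  # -> False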

Object Reuse

C1: NR.

C2: NEW: When a storage object is initially assigned, allocated, or
reallocated to a subject from the TCB’s pool of unused storage
objects, the TCB shall assure that the object contains no data for
which the subject is not authorized.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.
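
For illustration only: one conventional way to meet the C2 requirement is to
scrub a storage object at the moment it leaves the pool of unused objects.
The fixed-size "page" abstraction below is invented for the sketch.

    class StoragePool:
        """Fixed-size pages recycled among subjects; pages are scrubbed on
        allocation so no residue from a prior subject is ever visible."""
        def __init__(self, page_count: int, page_size: int):
            self.page_size = page_size
            self.free = [bytearray(page_size) for _ in range(page_count)]

        def allocate(self) -> bytearray:
            page = self.free.pop()
            page[:] = b"\x00" * self.page_size   # object contains no prior data
            return page

        def release(self, page: bytearray) -> None:
            self.free.append(page)               # scrubbing happens on reallocation

    pool = StoragePool(page_count=2, page_size=8)
    p = pool.allocate()
    p[:6] = b"secret"
    pool.release(p)
    print(bytes(pool.allocate()))   # -> b'\x00\x00\x00\x00\x00\x00\x00\x00'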

Security Features User’s Guide

C1: NEW: A single summary, chapter, or manual in user documentation shall
describe the protection mechanisms provided by the TCB, guidelines on
their use, and how they interact with one another.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Security Testing

C1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. Testing shall
be done to assure that there are no obvious ways for an unauthorized
user to bypass or otherwise defeat the security protection mechanisms
of the TCB. (See the Security Testing guidelines.)

C2: ADD: Testing shall also include a search for obvious flaws that would
allow violation of resource isolation, or that would permit
unauthorized access to the audit or authentication data.

B1: NEW: The security mechanisms of the ADP system shall be tested and
found to work as claimed in the system documentation. A team of
individuals who thoroughly understand the specific implementation of
the TCB shall subject its design documentation, source code, and
object code to thorough analysis and testing. Their objectives shall
be: to uncover all design and implementation flaws that would permit
a subject external to the TCB to read, change, or delete data
normally denied under the mandatory or discretionary security policy
enforced by the TCB; as well as to assure that no subject (without
authorization to do so) is able to cause the TCB to enter a state
such that it is unable to respond to communications initiated by
other users. All discovered flaws shall be removed or neutralized
and the TCB retested to demonstrate that they have been eliminated
and that new flaws have not been introduced. (See the Security
Testing Guidelines.)

B2: CHANGE: All discovered flaws shall be corrected and the TCB retested
to demonstrate that they have been eliminated and that new flaws have
not been introduced.

ADD: The TCB shall be found relatively resistant to penetration.
Testing shall demonstrate that the TCB implementation is consistent
with the descriptive top-level specification.

B3: CHANGE: The TCB shall be found resistant to penetration.

ADD: No design flaws and no more than a few correctable
implementation flaws may be found during testing and there shall be
reasonable confidence that few remain.

A1: CHANGE: Testing shall demonstrate that the TCB implementation is
consistent with the formal top-level specification.

ADD: Manual or other mapping of the FTLS to the source code may form
a basis for penetration testing.

Subject Sensitivity Labels

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall immediately notify a terminal user of each change
in the security level associated with that user during an interactive
session. A terminal user shall be able to query the TCB as desired
for a display of the subject’s complete sensitivity label.

B3: NAR.

A1: NAR.

System Architecture

C1: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). Resources controlled
by the TCB may be a defined subset of the subjects and objects in
the ADP system.

C2: ADD: The TCB shall isolate the resources to be protected so that they
are subject to the access control and auditing requirements.

B1: ADD: The TCB shall maintain process isolation through the provision
of distinct address spaces under its control.

B2: NEW: The TCB shall maintain a domain for its own execution that
protects it from external interference or tampering (e.g., by
modification of its code or data structures). The TCB shall maintain
process isolation through the provision of distinct address spaces
under its control. The TCB shall be internally structured into well-
defined largely independent modules. It shall make effective use of
available hardware to separate those elements that are protection-
critical from those that are not. The TCB modules shall be designed
such that the principle of least privilege is enforced. Features in
hardware, such as segmentation, shall be used to support logically
distinct storage objects with separate attributes (namely: readable,
writeable). The user interface to the TCB shall be completely
defined and all elements of the TCB identified.

B3: ADD: The TCB shall be designed and structured to use a complete,
conceptually simple protection mechanism with precisely defined
semantics. This mechanism shall play a central role in enforcing the
internal structuring of the TCB and the system. The TCB shall
incorporate significant use of layering, abstraction and data hiding.
Significant system engineering shall be directed toward minimizing
the complexity of the TCB and excluding from the TCB modules that are
not protection-critical.

A1: NAR.

System Integrity

C1: NEW: Hardware and/or software features shall be provided that can be
used to periodically validate the correct operation of the on-site
hardware and firmware elements of the TCB.

C2: NAR.

B1: NAR.

B2: NAR.

B3: NAR.

A1: NAR.

Test Documentation

C1: NEW: The system developer shall provide to the evaluators a document
that describes the test plan and results of the security mechanisms’
functional testing.

C2: NAR.

B1: NAR.

B2: ADD: It shall include results of testing the effectiveness of the
methods used to reduce covert channel bandwidths.

B3: NAR.

A1: ADD: The results of the mapping between the formal top-level
specification and the TCB source code shall be given.

Trusted Distribution

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NR.

A1: NEW: A trusted ADP system control and distribution facility shall be
provided for maintaining the integrity of the mapping between the
master data describing the current version of the TCB and the on-site
master copy of the code for the current version. Procedures (e.g.,
site security acceptance testing) shall exist for assuring that the
TCB software, firmware, and hardware updates distributed to a
customer are exactly as specified by the master copies.

Trusted Facility Management

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support separate operator and administrator
functions.

B3: ADD: The functions performed in the role of a security administrator
shall be identified. The ADP system administrative personnel shall
only be able to perform security administrator functions after taking
a distinct auditable action to assume the security administrator role
on the ADP system. Non-security functions that can be performed in
the security administration role shall be limited strictly to those
essential to performing the security role effectively.

A1: NAR.

Trusted Facility Manual

C1: NEW: A manual addressed to the ADP system administrator shall present
cautions about functions and privileges that should be controlled
when running a secure facility.

C2: ADD: The procedures for examining and maintaining the audit files as
well as the detailed audit record structure for each type of audit
event shall be given.

B1: ADD: The manual shall describe the operator and administrator
functions related to security, to include changing the
characteristics of a user. It shall provide guidelines on the
consistent and effective use of the protection features of the
system, how they interact, how to securely generate a new TCB, and
facility procedures, warnings, and privileges that need to be
controlled in order to operate the facility in a secure manner.

B2: ADD: The TCB modules that contain the reference validation mechanism
shall be identified. The procedures for secure generation of a new
TCB from source after modification of any modules in the TCB shall
be described.

B3: ADD: It shall include the procedures to ensure that the system is
initially started in a secure manner. Procedures shall also be
included to resume secure system operation after any lapse in system
operation.

A1: NAR.

Trusted Path

C1: NR.

C2: NR.

B1: NR.

B2: NEW: The TCB shall support a trusted communication path between
itself and the user for initial login and authentication. Communications
via this path shall be initiated exclusively by a user.

B3: CHANGE: The TCB shall support a trusted communication path between
itself and users for use when a positive TCB-to-user connection is
required (e.g., login, change subject security level).
Communications via this trusted path shall be activated exclusively
by a user or the TCB and shall be logically isolated and unmistakably
distinguishable from other paths.

A1: NAR.

Trusted Recovery

C1: NR.

C2: NR.

B1: NR.

B2: NR.

B3: NEW: Procedures and/or mechanisms shall be provided to assure that,
after an ADP system failure or other discontinuity, recovery without a
protection compromise is obtained.

A1: NAR.

(this page is reserved for Figure 1)

GLOSSARY

Access – A specific type of interaction between a subject and an object
that results in the flow of information from one to the other.

Approval/Accreditation – The official authorization that is
granted to an ADP system to process sensitive information in
its operational environment, based upon comprehensive
security evaluation of the system’s hardware, firmware, and
software security design, configuration, and implementation
and of the other system procedural, administrative,
physical, TEMPEST, personnel, and communications security
controls.

Audit Trail – A set of records that collectively provide
documentary evidence of processing used to aid in tracing
from original transactions forward to related records and
reports, and/or backwards from records and reports to their
component source transactions.

Authenticate – To establish the validity of a claimed identity.

Automatic Data Processing (ADP) System – An assembly of computer
hardware, firmware, and software configured for the purpose
of classifying, sorting, calculating, computing,
summarizing, transmitting and receiving, storing, and
retrieving data with a minimum of human intervention.

Bandwidth – A characteristic of a communication channel that is
the amount of information that can be passed through it in a
given amount of time, usually expressed in bits per second.

Bell-LaPadula Model – A formal state transition model of computer
security policy that describes a set of access control
rules. In this formal model, the entities in a computer
system are divided into abstract sets of subjects and
objects. The notion of a secure state is defined and it is
proven that each state transition preserves security by
moving from secure state to secure state; thus, inductively
proving that the system is secure. A system state is
defined to be “secure” if the only permitted access modes of
subjects to objects are in accordance with a specific
security policy. In order to determine whether or not a
specific access mode is allowed, the clearance of a subject
is compared to the classification of the object and a
determination is made as to whether the subject is
authorized for the specific access mode. The
clearance/classification scheme is expressed in terms of a
lattice. See also: Lattice, Simple Security Property, *-
Property.

Certification – The technical evaluation of a system’s security
features, made as part of and in support of the
approval/accreditation process, that establishes the extent
to which a particular computer system’s design and
implementation meet a set of specified security
requirements.

Channel – An information transfer path within a system. May also
refer to the mechanism by which the path is effected.

Covert Channel – A communication channel that allows a process to
transfer information in a manner that violates the system’s
security policy. See also: Covert Storage Channel, Covert
Timing Channel.

Covert Storage Channel – A covert channel that involves the
direct or indirect writing of a storage location by one
process and the direct or indirect reading of the storage
location by another process. Covert storage channels
typically involve a finite resource (e.g., sectors on a
disk) that is shared by two subjects at different security
levels.

Covert Timing Channel – A covert channel in which one process
signals information to another by modulating its own use of
system resources (e.g., CPU time) in such a way that this
manipulation affects the real response time observed by the
second process.

Data – Information with a specific physical representation.

Data Integrity – The state that exists when computerized data is
the same as that in the source documents and has not been
exposed to accidental or malicious alteration or
destruction.

Descriptive Top-Level Specification (DTLS) – A top-level
specification that is written in a natural language (e.g.,
English), an informal program design notation, or a
combination of the two.

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject.

Domain – The set of objects that a subject has the ability to
access.

Dominate – Security level S1 is said to dominate security level
S2 if the hierarchical classification of S1 is greater than
or equal to that of S2 and the non-hierarchical categories
of S1 include all those of S2 as a subset.

Exploitable Channel – Any channel that is useable or detectable
by subjects external to the Trusted Computing Base.

Flaw Hypothesis Methodology – A system analysis and penetration
technique where specifications and documentation for the
system are analyzed and then flaws in the system are
hypothesized. The list of hypothesized flaws is then
prioritized on the basis of the estimated probability that a
flaw actually exists and, assuming a flaw does exist, on the
ease of exploiting it and on the extent of control or
compromise it would provide. The prioritized list is used
to direct the actual testing of the system.

Flaw – An error of commission, omission, or oversight in a system
that allows protection mechanisms to be bypassed.

Formal Proof – A complete and convincing mathematical argument,
presenting the full logical justification for each proof
step, for the truth of a theorem or set of theorems. The
formal verification process uses formal proofs to show the
truth of certain properties of formal specification and for
showing that computer programs satisfy their specifications.

Formal Security Policy Model – A mathematically precise statement
of a security policy. To be adequately precise, such a
model must represent the initial state of a system, the way
in which the system progresses from one state to another,
and a definition of a “secure” state of the system. To be
acceptable as a basis for a TCB, the model must be supported
by a formal proof that if the initial state of the system
satisfies the definition of a “secure” state and if all
assumptions required by the model hold, then all future
states of the system will be secure. Some formal modeling
techniques include: state transition models, temporal logic
models, denotational semantics models, algebraic
specification models. An example is the model described by
Bell and LaPadula in reference [2]. See also: Bell-
LaPadula Model, Security Policy Model.

Formal Top-Level Specification (FTLS) – A Top-Level Specification
that is written in a formal mathematical language to allow
theorems showing the correspondence of the system
specification to its formal requirements to be hypothesized
and formally proven.

Formal Verification – The process of using formal proofs to
demonstrate the consistency (design verification) between a
formal specification of a system and a formal security
policy model or (implementation verification) between the
formal specification and its program implementation.

Functional Testing – The portion of security testing in which the
advertised features of a system are tested for correct
operation.

General-Purpose System – A computer system that is designed to
aid in solving a wide variety of problems.

Lattice – A partially ordered set for which every pair of
elements has a greatest lower bound and a least upper bound.
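
For security levels in particular the two bounds have a concrete form (a
standard construction, not spelled out in these criteria): with a level
written as a pair $(c, S)$ of a hierarchical classification and a category
set,

    \[
      (c_1, S_1) \sqcup (c_2, S_2) = (\max(c_1, c_2),\ S_1 \cup S_2), \qquad
      (c_1, S_1) \sqcap (c_2, S_2) = (\min(c_1, c_2),\ S_1 \cap S_2)
    \]

are the least upper bound and greatest lower bound under the dominance
ordering.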

Least Privilege – This principle requires that each subject in a
system be granted the most restrictive set of privileges (or
lowest clearance) needed for the performance of authorized
tasks. The application of this principle limits the damage
that can result from accident, error, or unauthorized use.

Mandatory Access Control – A means of restricting access to
objects based on the sensitivity (as represented by a label)
of the information contained in the objects and the formal
authorization (i.e., clearance) of subjects to access
information of such sensitivity.

Multilevel Device – A device that is used in a manner that
permits it to simultaneously process data of two or more
security levels without risk of compromise. To accomplish
this, sensitivity labels are normally stored on the same
physical medium and in the same form (i.e., machine-readable
or human-readable) as the data being processed.

Multilevel Secure – A class of system containing information with
different sensitivities that simultaneously permits access
by users with different security clearances and needs-to-
know, but prevents users from obtaining access to
information for which they lack authorization.

Object – A passive entity that contains or receives information.
Access to an object potentially implies access to the
information it contains. Examples of objects are: records,
blocks, pages, segments, files, directories, directory
trees, and programs, as well as bits, bytes, words, fields,
processors, video displays, keyboards, clocks, printers,
network nodes, etc.

Object Reuse – The reassignment to some subject of a medium
(e.g., page frame, disk sector, magnetic tape) that
contained one or more objects. To be securely reassigned,
such media must contain no residual data from the previously
contained object(s).

Output – Information that has been exported by a TCB.

Password – A private character string that is used to
authenticate an identity.

Penetration Testing – The portion of security testing in which
the penetrators attempt to circumvent the security features
of a system. The penetrators may be assumed to use all
system design and implementation documentation, which may
include listings of system source code, manuals, and circuit
diagrams. The penetrators work under no constraints other
than those that would be applied to ordinary users.

Process – A program in execution. It is completely characterized
by a single current execution point (represented by the
machine state) and address space.

Protection-Critical Portions of the TCB – Those portions of the
TCB whose normal function is to deal with the control of
access between subjects and objects.

Protection Philosophy – An informal description of the overall
design of a system that delineates each of the protection
mechanisms employed. A combination (appropriate to the
evaluation class) of formal and informal techniques is used
to show that the mechanisms are adequate to enforce the
security policy.

Read – A fundamental operation that results only in the flow of
information from an object to a subject.

Read Access – Permission to read information.

Reference Monitor Concept – An access control concept that refers
to an abstract machine that mediates all accesses to objects
by subjects.

Resource – Anything used or consumed while performing a function.
The categories of resources are: time, information, objects
(information containers), or processors (the ability to use
information). Specific examples are: CPU time; terminal
connect time; amount of directly-addressable memory; disk
space; number of I/O requests per minute, etc.

Security Kernel – The hardware, firmware, and software elements
of a Trusted Computing Base that implement the reference
monitor concept. It must mediate all accesses, be protected
from modification, and be verifiable as correct.

Security Level – The combination of a hierarchical classification
and a set of non-hierarchical categories that represents the
sensitivity of information.

Security Policy – The set of laws, rules, and practices that
regulate how an organization manages, protects, and
distributes sensitive information.

Security Policy Model – An informal presentation of a formal
security policy model.

Security Testing – A process used to determine that the security
features of a system are implemented as designed and that
they are adequate for a proposed application environment.
This process includes hands-on functional testing,
penetration testing, and verification. See also: Functional
Testing, Penetration Testing, Verification.

Sensitive Information – Information that, as determined by a
competent authority, must be protected because its
unauthorized disclosure, alteration, loss, or destruction
will at least cause perceivable damage to someone or
something.

Sensitivity Label – A piece of information that represents the
security level of an object and that describes the
sensitivity (e.g., classification) of the data in the
object. Sensitivity labels are used by the TCB as the basis
for mandatory access control decisions.

Simple Security Property – A Bell-LaPadula security model rule
allowing a subject read access to an object only if the
security level of the subject dominates the security level
of the object.

Single-Level Device – A device that is used to process data of a
single security level at any one time. Since the device
need not be trusted to separate data of different security
levels, sensitivity labels do not have to be stored with the
data being processed.

*-Property (Star Property) – A Bell-LaPadula security model rule
allowing a subject write access to an object only if the
security level of the subject is dominated by the security
level of the object. Also known as the Confinement
Property.
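
Written with the dominance ordering, the two rules just defined are, where
$L(s)$ and $L(o)$ are the security levels of subject $s$ and object $o$ and
$\geq$ means "dominates":

    \[
      \text{read}(s,o)\ \text{permitted} \;\Rightarrow\; L(s) \geq L(o)
      \qquad \text{(simple security property)}
    \]
    \[
      \text{write}(s,o)\ \text{permitted} \;\Rightarrow\; L(s) \leq L(o)
      \qquad \text{(*-property)}
    \]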

Storage Object – An object that supports both read and write
accesses.

Subject – An active entity, generally in the form of a person,
process, or device that causes information to flow among
objects or changes the system state. Technically, a
process/domain pair.

Subject Security Level – A subject’s security level is equal to
the security level of the objects to which it has both read
and write access. A subject’s security level must always be
dominated by the clearance of the user the subject is
associated with.

TEMPEST – The study and control of spurious electronic signals
emitted from ADP equipment.

Top-Level Specification (TLS) – A non-procedural description of
system behavior at the most abstract level. Typically a
functional specification that omits all implementation
details.

Trap Door – A hidden software or hardware mechanism that permits
system protection mechanisms to be circumvented. It is
activated in some non-apparent manner (e.g., special
“random” key sequence at a terminal).

Trojan Horse – A computer program with an apparently or actually
useful function that contains additional (hidden) functions
that surreptitiously exploit the legitimate authorizations
of the invoking process to the detriment of security. For
example, making a “blind copy” of a sensitive file for the
creator of the Trojan Horse.

Trusted Computer System – A system that employs sufficient
hardware and software integrity measures to allow its use
for processing simultaneously a range of sensitive or
classified information.

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system — including hardware,
firmware, and software — the combination of which is
responsible for enforcing a security policy. It creates a
basic protection environment and provides additional user
services required for a trusted computer system. The
ability of a trusted computing base to correctly enforce a
security policy depends solely on the mechanisms within the
TCB and on the correct input by system administrative
personnel of parameters (e.g., a user’s clearance) related
to the security policy.

Trusted Path – A mechanism by which a person at a terminal can
communicate directly with the Trusted Computing Base. This
mechanism can only be activated by the person or the Trusted
Computing Base and cannot be imitated by untrusted software.

Trusted Software – The software portion of a Trusted Computing
Base.

User – Any person who interacts directly with a computer system.

Verification – The process of comparing two levels of system
specification for proper correspondence (e.g., security
policy model with top-level specification, TLS with source
code, or source code with object code). This process may or
may not be automated.

Write – A fundamental operation that results only in the flow of
information from a subject to an object.

Write Access – Permission to write an object.

REFERENCES

1. Anderson, J. P. Computer Security Technology Planning
Study, ESD-TR-73-51, vol. I, ESD/AFSC, Hanscom AFB,
Bedford, Mass., October 1972 (NTIS AD-758 206).

2. Bell, D. E. and LaPadula, L. J. Secure Computer Systems:
Unified Exposition and Multics Interpretation, MTR-2997
Rev. 1, MITRE Corp., Bedford, Mass., March 1976.

3. Brand, S. L. “An Approach to Identification and Audit of
Vulnerabilities and Control in Application Systems,” in
Audit and Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

4. Brand, S. L. “Data Processing and A-123,” in Proceedings of
the Computer Performance Evaluation User’s Group 18th
Meeting, C. B. Wilson, ed., NBS Special Publication
#500-95, October 1982.

5. Denning, D. E. “A Lattice Model of Secure Information
Flow,” in Communications of the ACM, vol. 19, no. 5
(May 1976), pp. 236-243.

6. Denning, D. E. Secure Information Flow in Computer Systems,
Ph.D. dissertation, Purdue Univ., West Lafayette, Ind.,
May 1975.

7. DoD 5200.1-R, Information Security Program Regulation,
August 1982.

8. DoD Directive 5200.28, Security Requirements for Automatic
Data Processing (ADP) Systems, revised April 1978.

9. DoD 5200.28-M, ADP Security Manual — Techniques and
Procedures for Implementing, Deactivating, Testing, and
Evaluating Secure Resource-Sharing ADP Systems, revised
June 1979.

10. DoD Directive 5215.1, Computer Security Evaluation Center,
25 October 1982.

11. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, January 1983.

12. DoD 5220.22-R, Industrial Security Regulation, January 1983.

13. DoD Directive 5400.11, Department of Defense Privacy
Program, 9 June 1982.

14. Executive Order 12356, National Security Information,
6 April 1982.

15. Faurer, L. D. “Keeping the Secrets Secret,” in Government
Data Systems, November – December 1981, pp. 14-17.

16. Federal Information Processing Standards Publication (FIPS
PUB) 39, Glossary for Computer Systems Security,
15 February 1976.

17. Federal Information Processing Standards Publication (FIPS
PUB) 73, Guidelines for Security of Computer
Applications, 30 June 1980.

18. Federal Information Processing Standards Publication (FIPS
PUB) 102, Guideline for Computer Security Certification
and Accreditation.

19. Lampson, B. W. “A Note on the Confinement Problem,” in
Communications of the ACM, vol. 16, no. 10 (October
1973), pp. 613-615.

20. Lee, T. M. P., et al. “Processors, Operating Systems and
Nearby Peripherals: A Consensus Report,” in Audit and
Evaluation of Computer Security II: System
Vulnerabilities and Controls, Z. Ruthberg, ed., NBS
Special Publication #500-57, MD78733, April 1980.

21. Lipner, S. B. A Comment on the Confinement Problem, MITRE
Corp., Bedford, Mass.

22. Millen, J. K. “An Example of a Formal Flow Violation,” in
Proceedings of the IEEE Computer Society 2nd
International Computer Software and Applications
Conference, November 1978, pp. 204-208.

23. Millen, J. K. “Security Kernel Validation in Practice,” in
Communications of the ACM, vol. 19, no. 5 (May 1976),
pp. 243-250.

24. Nibaldi, G. H. Proposed Technical Evaluation Criteria for
Trusted Computer Systems, MITRE Corp., Bedford, Mass.,
M79-225, AD-A108-832, 25 October 1979.

25. Nibaldi, G. H. Specification of a Trusted Computing Base (TCB),
MITRE Corp., Bedford, Mass., M79-228, AD-A108-831,
30 November 1979.

26. OMB Circular A-71, Transmittal Memorandum No. 1, Security of
Federal Automated Information Systems, 27 July 1978.

27. OMB Circular A-123, Internal Control Systems, 5 November
1981.

28. Ruthberg, Z. and McKenzie, R., eds. Audit and Evaluation of
Computer Security, in NBS Special Publication #500-19,
October 1977.

29. Schaefer, M., Linde, R. R., et al. “Program Confinement in
KVM/370,” in Proceedings of the ACM National
Conference, October 1977, Seattle.

30. Schell, R. R. “Security Kernels: A Methodical Design of
System Security,” in Technical Papers, USE Inc. Spring
Conference, 5-9 March 1979, pp. 245-250.

31. Trotter, E. T. and Tasker, P. S. Industry Trusted Computer
Systems Evaluation Process, MITRE Corp., Bedford,
Mass., MTR-3931, 1 May 1980.

32. Turn, R. Trusted Computer Systems: Needs and Incentives for
Use in Government and Private Sector, (AD # A103399),
Rand Corporation (R-28811-DR&E), June 1981.

33. Walker, S. T. “The Advent of Trusted Computer Operating
Systems,” in National Computer Conference Proceedings,
May 1980, pp. 655-665.

34. Ware, W. H., ed., Security Controls for Computer Systems:
Report of Defense Science Board Task Force on Computer
Security, AD # A076617/0, Rand Corporation, Santa
Monica, Calif., February 1970, reissued October 1979.

DoD STANDARD 5200.28: SUMMARY OF THE DIFFERENCES
BETWEEN IT AND CSC-STD-001-83

Note: Text which has been added or changed is indented and preceded by a > sign.
Text which has been deleted is enclosed in slashes (/). “Computer Security
Center” was changed to “National Computer Security Center” throughout the
document.

The FOREWORD Section was rewritten and signed by Mr. Don Latham on
26 Dec 85. The ACKNOWLEDGEMENTS Section was updated.

The PREFACE was changed as follows:

PREFACE

The trusted computer system evaluation criteria defined in this
document classify systems into four broad hierarchical divisions
of enhanced security protection. The criteria provide a basis
for the evaluation of effectiveness of security controls built
into automatic data processing system products. The criteria
were developed with three objectives in mind: (a) to provide
users with a yardstick with which to assess the degree of trust
that can be placed in computer systems for the secure processing
of classified or other sensitive information; (b) to provide
guidance to manufacturers as to what to build into their new,
widely-available trusted commercial products in order to satisfy
trust requirements for sensitive applications; and (c) to provide
a basis for specifying security requirements in acquisition
specifications. Two types of requirements are delineated for
secure processing: (a) specific security feature requirements and
(b) assurance requirements. Some of the latter requirements
enable evaluation personnel to determine if the required features
are present and functioning as intended.

>The scope of these criteria is to be applied to
>the set of components comprising a trusted system, and is
>not necessarily to be applied to each system component
>individually. Hence, some components of a system may be
>completely untrusted, while others may be individually
>evaluated to a lower or higher evaluation class than the
>trusted product considered as a whole system. In trusted
>products at the high end of the range, the strength of the
>reference monitor is such that most of the system
>components can be completely untrusted.

Though the criteria are

>intended to be

application-independent, /it is recognized that/ the
specific security feature requirements may have to be
interpreted when applying the criteria to specific

>systems with their own functional requirements,
>applications or special environments (e.g., communications
>processors, process control computers, and embedded systems
>in general).

The underlying assurance requirements can be
applied across the entire spectrum of ADP system or
application processing environments without special
interpretation.

The SCOPE Section was changed as follows:

Scope

The trusted computer system evaluation criteria defined in this
document apply

>primarily

to /both/ trusted, commercially available
automatic data processing (ADP) systems.

>They are also applicable, as amplified below, to the
>evaluation of existing systems and to the specification of
>security requirements for ADP systems acquisition.

Included are two distinct sets of requirements: 1) specific security
feature requirements; and 2) assurance requirements. The specific
feature requirements encompass the capabilities typically found
in information processing systems employing general-purpose
operating systems that are distinct from the applications programs
being supported.

>However, specific security feature requirements
>may also apply to specific systems with their own functional
>requirements, applications or special environments (e.g.,
>communications processors, process control computers, and embedded
>systems in general).

The assurance requirements, on the other hand,
apply to systems that cover the full range of computing environments
from dedicated controllers to full range multilevel secure resource
sharing systems.

Changed the Purpose Section as follows:

Purpose

As outlined in the Preface, the criteria have been developed to
serve a number of intended purposes:

To provide

>a standard

to manufacturers as to what security features to build
into their new and planned, … trust requirements

>(with particular emphasis on preventing the
>disclosure of data)

for sensitive applications.

To provide

>DoD components

with a metric with which to evaluate
the degree of trust that can be placed in …

To provide a basis for specifying security requirements in
acquisition specifications.

With respect to the

>second

purpose for development of the criteria, i.e., providing

>DoD components

with a security evaluation metric, evaluations can be
delineated into two types: (a) an evaluation can be
performed on a computer product from a perspective that
excludes the application environment; or, (b) it can be
done to assess whether appropriate security measures …

The latter type of evaluation, i.e., those done for the purpose
of assessing a system’s security attributes with respect to a
specific operational mission, is known as a certification
evaluation. It must be understood that the completion of a
formal product evaluation does not constitute certification or
accreditation for the system to be used in any specific
application environment. On the contrary, the evaluation report
only provides a trusted computer system’s evaluation rating along
with supporting data describing the product system’s strengths
and weaknesses from a computer security point of view. The
system security certification and the formal
approval/accreditation procedure, done in accordance with the
applicable policies of the issuing agencies, must still be
followed before a system can be approved for use in processing or
handling classified information.[8;9]

>Designated Approving Authorities (DAAs) remain ultimately
>responsible for specifying security of systems they
>accredit.

The trusted computer system evaluation criteria will be used
directly and indirectly in the certification process. Along with
applicable policy, it will be used directly as

>technical guidance

for evaluation of the total system and for specifying system
security and certification requirements for new acquisitions. Where
a system being evaluated for certification employs a product that
has undergone a Commercial Product Evaluation, reports from that
process will be used as input to the certification evaluation.
Technical data will be furnished to designers, evaluators and the
Designated Approving Authorities to support their needs for
making decisions.

Changed Section 2.1.4.3 as follows:

2.1.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 2.2.1.1 as follows:

2.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely Reworded Section 2.2.1.2 as follows:

2.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Reworded Section 2.2.2.2 as follows:

2.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the
actions of any one or more users based on individual
identity.

Changed Section 2.2.4.3 as follows:

2.2.4.3 Test Documentation

The system developer will provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.1.1.1 as follows:

3.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.1.1.2 as follows:

3.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.1.1.3.2 as follows:

3.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/ communication
channel or I/O device.

Appended a sentence to Section 3.1.1.4 as follows:

3.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed one sentence in Section 3.1.2.1 as follows:

3.1.2.1. Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded Section 3.1.2.2 as follows:

3.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

> and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level.

‘Unbolded’ the first sentence of Section 3.1.3.2.1.

Reworded Section 3.1.3.2.2 as follows:

3.1.3.2.2 Design Specification and Verification

An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

Changed sentence as follows:

3.1.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

Changed Section 3.2.1.1 as follows:

3.2.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
self/group/public controls, access control lists) shall
allow users to specify and control sharing of those
objects by named individuals, or defined groups of
individuals, or by both,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall,
either by explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of including or excluding
access to the granularity of a single user. Access
permission to an object by users not already possessing
access permission shall only be assigned by authorized
users.

Completely reworded Section 3.2.1.2 as follows:

3.2.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.2.1.3 as follows:

3.2.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.2.1.3.2 as follows:

3.2.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.2.1.4 as follows:

3.2.1.4 Mandatory Access Control

… Identification and authentication data shall be
used by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.2.2.1 as follows:

3.2.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Reworded section 3.2.2.2 as follows:

3.2.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels.

Changed Section 3.2.3.2.2 as follows:

3.2.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its
axioms. A descriptive top-level specification
(DTLS) of the TCB shall be maintained that
completely and accurately describes the TCB in
terms of exceptions, error messages, and effects.
It shall be shown to be an accurate description of
the TCB interface.

Changed Section 3.2.4.3 as follows:

3.2.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant”:

3.2.4.4 Design Documentation

Documentation shall be available that provides a
description of the manufacturer’s philosophy of
protection and an explanation of how this philosophy is
translated into the TCB. The interfaces between the
TCB modules shall be described. A formal description
of the security policy model enforced by the TCB shall
be available and proven that it is sufficient to
enforce the security policy. The specific TCB
protection mechanisms shall be identified and an
explanation given to show that they satisfy the model.
The descriptive top-level specification (DTLS) shall be
shown to be an accurate description of the TCB
interface. Documentation shall describe how the TCB
implements the reference monitor concept and give an
explanation why it is

>tamper resistant,

cannot be bypassed, and is correctly implemented.
Documentation shall describe how the TCB is structured to
facilitate testing and to enforce least privilege. This
documentation shall also present the results of the covert
channel analysis and the tradeoffs involved in restricting
the channels. All auditable events that may be used in the
exploitation of known covert storage channels shall be
identified. The bandwidths of known covert storage
channels, the use of which is not detectable by the
auditing mechanisms, shall be provided. (See the Covert
Channel Guideline section.)

Changed Section 3.3.1.1 as follows:

3.3.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to limit
>propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.
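
At this class the access control lists must be able to name individuals and
groups that are explicitly denied access as well as those granted it. The
Python sketch below shows one way such an entry might be evaluated; the
deny-overrides precedence used here is an illustrative choice, not something
the criteria mandate.

    # Hypothetical B3-style entry with grant lists and explicit deny lists
    # for named individuals and groups of named individuals.
    GROUPS = {"engineering": {"erin", "frank"}}

    ENTRY = {
        "grant_users":  {"erin": {"read", "write"}},
        "grant_groups": {"engineering": {"read"}},
        "deny_users":   {"frank"},
        "deny_groups":  set(),
    }

    def b3_dac_permits(user, mode, entry=ENTRY, groups=GROUPS):
        member_of = {g for g, members in groups.items() if user in members}
        if user in entry["deny_users"] or member_of & entry["deny_groups"]:
            return False                 # explicit exclusion wins (illustrative rule)
        if mode in entry["grant_users"].get(user, set()):
            return True
        return any(mode in entry["grant_groups"].get(g, set()) for g in member_of)

    assert b3_dac_permits("erin", "write")
    assert not b3_dac_permits("frank", "read")   # in the group, but individually excluded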

Completely reworded Section 3.3.1.2 as follows:

3.3.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 3.3.1.3 as follows:

3.3.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object, ROM) that is
directly or indirectly accessible by subjects external
to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 3.3.1.3.2 as follows:

3.3.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level or
levels associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 3.3.1.4 as follows:

3.3.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 3.3.2.1 as follows:

3.3.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 3.3.2.2 as follows:

3.3.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and if the occurrence or accumulation
>of these security relevant events continues,
>the system shall take the least disruptive
>action to terminate the event.
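
The added threshold-and-escalation behavior can be sketched as a small
monitor over the stream of security auditable events. In the Python sketch
below the counting rule, the thresholds, the notification path, and the
choice of "least disruptive action" are all hypothetical; a real TCB would
define each of these for its own environment.

    # Hypothetical monitor: notify the security administrator when a per-user
    # threshold is reached, and escalate if the accumulation continues.
    from collections import Counter

    NOTIFY_AT, ESCALATE_AT = 5, 10           # invented thresholds
    counts = Counter()

    def notify_security_administrator(message):
        print("SECURITY ALERT:", message)    # stand-in for a real notification path

    def terminate_least_disruptively(user):
        # ending only the offending user's session is used here purely as an
        # example of a "least disruptive action"
        notify_security_administrator("terminating session of " + user)
        counts[user] = 0

    def record_security_event(user, event):
        counts[user] += 1
        if counts[user] == NOTIFY_AT:
            notify_security_administrator(f"{user}: threshold reached ({event})")
        elif counts[user] >= ESCALATE_AT:
            terminate_least_disruptively(user)

    for _ in range(12):
        record_security_event("mallory", "failed_mac_access")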

Changed the first sentence of Section 3.3.3.2.2 as follows:

3.3.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of
>the ADP system

that is proven consistent with its axioms. …

Changed Section 3.3.4.3 as follows:

3.3.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the
>security mechanisms were tested,

and results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths.

Replaced “tamperproof” with “tamper resistant” in Section 3.3.4.4.

Changed Section 4.1.1.1 as follows:

4.1.1.1 Discretionary Access Control

The TCB shall define and control access between named
users and named objects (e.g., files and programs) in
the ADP system. The enforcement mechanism (e.g.,
access control lists) shall allow users to specify and
control sharing of those objects,

>and shall provide controls to
>limit propagation of access rights.

The discretionary access control mechanism shall, either by
explicit user action or by default, provide that
objects are protected from unauthorized access. These
access controls shall be capable of specifying, for
each named object, a list of named individuals and a
list of groups of named individuals with their
respective modes of access to that object.
Furthermore, for each such named object, it shall be
possible to specify a list of named individuals and a
list of groups of named individuals for which no access
to the object is to be given. Access permission to an
object by users not already possessing access
permission shall only be assigned by authorized users.

Completely reworded Section 4.1.1.2 as follows:

4.1.1.2 Object Reuse

All authorizations to the information contained within
a storage object shall be revoked prior to initial
assignment, allocation or reallocation to a subject
from the TCB’s pool of unused storage objects. No
information, including encrypted representations of
information, produced by a prior subject’s actions is
to be available to any subject that obtains access to
an object that has been released back to the system.

Changed Section 4.1.1.3 as follows:

4.1.1.3 Labels

Sensitivity labels associated with each ADP system
resource (e.g., subject, storage object,

>ROM)

that is directly or indirectly accessible by subjects
external to the TCB shall be maintained by the TCB. These
labels shall be used as the basis for mandatory access
control decisions. In order to import non-labeled
data, the TCB shall request and receive from an
authorized user the security level of the data, and all
such actions shall be auditable by the TCB.

Changed Section 4.1.1.3.2 as follows:

4.1.1.3.2 Exportation of Labeled Information

The TCB shall designate each communication channel
and I/O device as either single-level or
multilevel. Any change in this designation shall
be done manually and shall be auditable by the
TCB. The TCB shall maintain and be able to audit
any change in the /current/ security level

>or levels

associated with a /single-level/
communication channel or I/O device.

Appended Sentence to Section 4.1.1.4 as follows:

4.1.1.4 Mandatory Access Control

… Identification and authentication data shall be used
by the TCB to authenticate the user’s identity
and to ensure that the security level and authorization
of subjects external to the TCB that may be created to
act on behalf of the individual user are dominated by
the clearance and authorization of that user.

Changed Section 4.1.2.1 as follows:

4.1.2.1 Identification and Authentication

… This data shall be used by the TCB to authenticate
the user’s identity and /to determine/

>to ensure that

the security level and authorizations of subjects

>external to the TCB

that may be created to act on
behalf of the individual user

>are dominated by the clearance
>and authorization of that user.

Changed Section 4.1.2.2 as follows:

4.1.2.2 Audit

The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction
an audit trail of accesses to the objects it protects.
The audit data shall be protected by the TCB so that
read access to it is limited to those who are
authorized for audit data. The TCB shall be able to
record the following types of events: use of
identification and authentication mechanisms,
introduction of objects into a user’s address space
(e.g., file open, program initiation), deletion of
objects, actions taken by computer operators and system
administrators and/or system security officers,

>and other security relevant events.

The TCB shall also be able to audit any override
of human-readable output markings. For each recorded
event, the audit record shall identify: date and time of
the event, user, type of event, and success or failure of
the event. For identification/authentication events the
origin of request (e.g., terminal ID) shall be included in
the audit record. For events that introduce an object into
a user’s address space and for object deletion events the
audit record shall include the name of the object and the
object’s security level. The ADP system administrator
shall be able to selectively audit the actions of any one
or more users based on individual identity and/or object
security level. The TCB shall be able to audit the
identified events that may be used in the exploitation of
covert storage channels. The TCB shall contain a mechanism
that is able to monitor the occurrence or accumulation of
security auditable events that may indicate an imminent
violation of security policy. This mechanism shall be able
to immediately notify the security administrator when
thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to
>terminate the event.

‘Unbolded’ the words “covert channels” in Section 4.1.3.1.3.

Changed the first sentence of Section 4.1.3.2.2 as follows:

4.1.3.2.2 Design Specification and Verification

A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms. …

Changed Section 4.1.4.3 as follows:

4.1.4.3 Test Documentation

The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested, and

results of the security mechanisms’ functional testing.
It shall include results of testing the effectiveness
of the methods used to reduce covert channel
bandwidths. The results of the mapping between the
formal top-level specification and the TCB source code
shall be given.

Replaced “tamperproof” with “tamper resistant” in Section 4.1.4.4.

Changed the last paragraph of Section 5.1 as follows:

5.1 A Need for Consensus

A major goal of …

As described …

>The purpose of this section is to describe in detail the
>fundamental control objectives. These objectives lay the
>foundation for the requirements outlined in the criteria.

The goal is to explain the foundations so that those outside
the National Security Establishment can assess their
universality and, by extension, the universal applicability
of the criteria requirements to processing all types of
sensitive applications whether they be for National Security
or the private sector.

Changed the second paragraph of Section 6.2 as follows:

6.2 A Formal Policy Model

Following the publication of …

>A subject can act on behalf of a user or another
>subject. The subject is created as a surrogate
>for the cleared user and is assigned a formal
>security level based on their classification.
>The state transitions and invariants of the formal
>policy model define the invariant relationships
>that must hold between the clearance of the user,
>the formal security level of any process that can
>act on the user’s behalf, and the formal security
>level of the devices and other objects to which any
>process can obtain specific modes of access.

The Bell and LaPadula model,

>for example,

defines a relationship between

>formal security levels of subjects and objects,

now referenced as the “dominance relation.” From this definition …
… Both the Simple Security Condition and the *-Property
include mandatory security provisions based on the dominance
relation between the

>formal security levels of subjects and objects.

The Discretionary Security Property …
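
The dominance relation and the two mandatory rules named above can be stated
concretely. The Python sketch below assumes a small example lattice (four
hierarchical levels plus free-form category sets); the level names and the
pair encoding are illustrative only.

    # Illustrative dominance relation and the two Bell-LaPadula mandatory rules.
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(a, b):
        """Formal security level a = (level, categories) dominates b."""
        (level_a, cats_a), (level_b, cats_b) = a, b
        return LEVELS[level_a] >= LEVELS[level_b] and set(cats_b) <= set(cats_a)

    def simple_security_condition(subject, obj):
        # read access only if the subject's level dominates the object's level
        return dominates(subject, obj)

    def star_property(subject, obj):
        # write access only if the object's level dominates the subject's level
        return dominates(obj, subject)

    process = ("SECRET", {"NATO"})
    report  = ("CONFIDENTIAL", set())
    assert simple_security_condition(process, report)    # reading down is permitted
    assert not star_property(process, report)            # writing down is not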

Added a sentence to the end of Section 7.0:

7.0 THE RELATIONSHIP BETWEEN POLICY AND THE CRITERIA

Section 1 presents fundamental computer security
requirements and Section 5 presents the control objectives
for Trusted Computer Systems. They are general
requirements, useful and necessary, for the development of
all secure systems. However, when designing systems that
will be used to process classified or other sensitive
information, functional requirements for meeting the Control
Objectives become more specific. There is a large body of
policy laid down in the form of Regulations, Directives,
Presidential Executive Orders, and OMB Circulars that form
the basis of the procedures for the handling and processing
of Federal information in general and classified information
specifically. This section presents pertinent excerpts from
these policy statements and discusses their relationship to
the Control Objectives.

>These excerpts are examples to illustrate the relationship
>of the policies to criteria and may not be complete.

Inserted the following

>as the next to last paragraph

of Section 7.2:

>DoD Directive 5200.28 provides the security requirements for
>ADP systems. For some types of information, such as
>Sensitive Compartmented Information (SCI), DoD Directive
>5200.28 states that other minimum security requirements also
>apply. These minima are found in DCID 1/16 (new reference
>number 5) which is implemented in DIAM 50-4 (new reference
>number 6) for DoD and DoD contractor ADP systems.

From requirements imposed by …

Changed Footnote #1 referenced by Section 7.2 as follows:

Replaced “Health and Human Services Department” with “U.S.
Information Agency.”

Changed (updated) the quote from DoD 5220.22-M, Section 7.3.1, as
follows:

7.3 Criteria Control Objective for Security Policy

7.3.1 Marking

The control objective for marking …

DoD 5220.22-M, “Industrial Security …

>”a. General. Classification designation by physical
>marking, notation or other means serves to warn and to
>inform the holder what degree of protection against
>unauthorized disclosure is required for that
>information or material.” (14)

Changed the

>last paragraph

of Section 7.5 as follows:

A major component of assurance, life-cycle assurance,

>as described in DoD Directive 7920.1,

is concerned with testing ADP systems both in the
development phase as well as during operation.

>(17)

DoD Directive 5215.1 …

Changed Section 9.0 as follows:

9.0 A GUIDELINE ON CONFIGURING MANDATORY ACCESS CONTROL FEATURES

The Mandatory Access Control requirement …

* The number of hierarchical classifications should be
greater than or equal to

>sixteen (16).

* The number of non-hierarchical categories should be
greater than or equal to

>sixty-four (64).
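
These two numbers suggest that a label can be held in a compact fixed-size
encoding, for example a 4-bit hierarchical level alongside a 64-bit category
mask. The Python sketch below shows one such packing together with the
dominance test over it; the layout is an assumption, not part of the
guideline.

    # One possible label encoding meeting the guideline: 16 hierarchical
    # classifications fit in 4 bits, and 64 non-hierarchical categories fit
    # in a 64-bit mask.  The packing is illustrative, not prescribed.
    def pack_label(level, categories):
        assert 0 <= level < 16
        mask = 0
        for category in categories:
            assert 0 <= category < 64
            mask |= 1 << category
        return (level << 64) | mask

    def dominates(a, b):
        level_a, cats_a = a >> 64, a & ((1 << 64) - 1)
        level_b, cats_b = b >> 64, b & ((1 << 64) - 1)
        return level_a >= level_b and (cats_b & ~cats_a) == 0

    secret_cat5  = pack_label(2, {5})
    confidential = pack_label(1, set())
    assert dominates(secret_cat5, confidential)
    assert not dominates(confidential, secret_cat5)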

Completely reworded the third paragraph of Formal Product
Evaluation, in Appendix A, as follows:

Formal Product Evaluation

The formal product evaluation provides …

A formal product evaluation begins with …

>The evaluation team writes a final report on their findings about
>the system. The report is publicly available (containing no
>proprietary or sensitive information) and contains the overall
>class rating assigned to the system and the details of the
>evaluation team’s findings when comparing the product against the
>evaluation criteria. Detailed information concerning
>vulnerabilities found by the evaluation team is furnished to the
>system developers and designers as each is found so that the
>vendor has a chance to eliminate as many of them as possible
>prior to the completion of the Formal Product Evaluation.
>Vulnerability analyses and other proprietary or sensitive
>information are controlled within the Center through the
>Vulnerability Reporting Program and are distributed only within
>the U.S. Government on a strict need-to-know and non-disclosure
>basis, and to the vendor.

Changed two paragraphs in Audit (Appendix D) as follows:

C2: NEW: The TCB shall be able to create, maintain, and protect
from modification or unauthorized access or destruction an
audit trail of accesses to the objects it protects. The
audit data shall be protected by the TCB so that read access
to it is limited to those who are authorized for audit data.
The TCB shall be able to record the following types of
events: use of identification and authentication mechanisms,
introduction of objects into a user’s address space (e.g.,
file open, program initiation), deletion of objects, actions
taken by computer operators and system administrators and/or
system security officers,

>and other security relevant events.

For each recorded event, the audit record shall
identify: date and time of the event, user, type of event,
and success or failure of the event. For
identification/authentication events the origin of request
(e.g., terminal ID) shall be included in the audit record.
For events that introduce an object into a user’s address
space and for object deletion events the audit record shall
include the name of the object. The ADP system
administrator shall be able to selectively audit the actions
of any one or more users based on individual identity.

B3: ADD: …when thresholds are exceeded,

>and, if the occurrence or accumulation of these
>security relevant events continues, the system
>shall take the least disruptive action to terminate
>the event.

Changed one paragraph in Design Documentation (Appendix D):

B2: ADD: Change “tamperproof” to “tamper resistant.”

Changed two paragraphs in Design Specification and Verification:

B1: NEW: An informal or formal model of the security policy
supported by the TCB shall be maintained

>over the life cycle of the ADP system and demonstrated

to be consistent with its axioms.

B2: CHANGE: A formal model of the security policy supported by
the TCB shall be maintained

>over the life cycle of the ADP system

that is proven consistent with its axioms.

Changed two paragraphs in Discretionary Access Control as follows:

C2: CHANGE: The enforcement mechanism (e.g., self/group/public
controls, access control lists) shall allow users to specify
and control sharing of those objects by named individuals,
or defined groups of individuals, or by both,

>and shall provide controls to limit propagation of access rights.

B3: CHANGE: The enforcement mechanism (e.g., access control
lists) shall allow users to specify and control sharing of
those objects,

>and shall provide controls to limit propagation of access rights.

These access controls shall be capable of specifying, for each
named object, a list of named individuals and a list of groups of
named individuals with their respective modes of access to that object.

Changed 1 paragraph in Exportation of Labeled Information:

B1: NEW: The TCB shall designate each communication channel and
I/O device as either single-level or multilevel. Any change
in this designation shall be done manually and shall be
auditable by the TCB. The TCB shall maintain and be able to
audit any change in the /current/ security level

>or levels

associated with a /single-level/ communication channel or
I/O device.

Changed 1 paragraph in Identification and Authentication:

B1: CHANGE: … This data shall be used by the TCB to authenticate
the user’s identity and

>to ensure that

the security level and authorizations of subjects external to
the TCB that may be created to act on behalf of the individual
user

>are dominated by the clearance and authorization
>of that user.

Changed 1 paragraph in Labels:

B2: CHANGE: … (e.g., subject, storage object, ROM) …

Changed 1 paragraph in Mandatory Access Control:

B1: NEW: … Identification and authentication data shall be used

>by the TCB to authenticate the user’s identity and to ensure
>that the security level and authorization of subjects external
>to the TCB that may be created to act on behalf of the
>individual user are dominated by the clearance and
>authorization of that user.

Rewrote 1 paragraph in Object Reuse:

C2: NEW:
>All authorizations to the information contained
>within a storage object shall be revoked prior to initial
>assignment, allocation or reallocation to a subject from the
>TCB’s pool of unused storage objects. No information,
>including encrypted representations of information, produced
>by a prior subject’s actions is to be available to any
>subject that obtains access to an object that has been
>released back to the system.

Changed 1 paragraph in Test Documentation:

C1: NEW: The system developer shall provide to the evaluators a
document that describes the test plan,

>test procedures that show how the security
>mechanisms were tested,

and results of the security mechanisms’ functional testing.

GLOSSARY

Changed Discretionary Access Control:

Discretionary Access Control – A means of restricting access to
objects based on the identity of subjects and/or groups to
which they belong. The controls are discretionary in the
sense that a subject with a certain access permission is
capable of passing that permission (perhaps indirectly) on
to any other subject

>(unless restrained by mandatory access control).

Added:

Front-End Security Filter – A process that is invoked to process
data according to a specified security policy prior to
releasing the data outside the processing environment or
upon receiving data from an external source.

Granularity – The relative fineness or coarseness by which a
mechanism can be adjusted. The phrase “the granularity of
a single user” means the access control mechanism can be
adjusted to include or exclude any single user.

Read-Only Memory (ROM) – A storage area in which the contents
can be read but not altered during normal computer
processing.

Security Relevant Event – Any event that attempts to change the
security state of the system, (e.g., change discretionary
access controls, change the security level of the subject,
change user password, etc.). Also, any event that attempts
to violate the security policy of the system, (e.g., too
many attempts to login, attempts to violate the mandatory
access control limits of a device, attempts to downgrade a
file, etc.).

Changed the name of the term:

Simple Security /Property/

>Condition

– A Bell-LaPadula security model rule allowing a subject
read access to an object only if the security level of the
subject dominates the security level of the object.

Changed definition:

Trusted Computing Base (TCB) – The totality of protection
mechanisms within a computer system – including hardware,
firmware, and software – the combination of which is
responsible for enforcing a security policy.

>A TCB consists of one or more components that together enforce
>a unified security policy over a product or system.

The ability of a TCB to correctly enforce a security
policy depends solely on the mechanisms within the TCB and
on the correct input by system administrative personnel of
parameters (e.g., a user’s clearance) related to the
security policy.

REFERENCES

Added: (References were renumbered as necessary)

5. DCID 1/16, Security of Foreign Intelligence in Automated
Data Processing Systems and Networks (U), 4 January 1983.

6. DIAM 50-4, Security of Compartmented Computer Operations (U),
24 June 1980.

9. DoD Directive 5000.29, Management of Computer Resources in
Major Defense Systems, 26 April 1976.

17. DoD Directive 7920.1, Life Cycle Management of Automated
Information Systems (AIS), 17 October 1978.

Corrected dates on the following References:

14. DoD 5220.22-M, Industrial Security Manual for Safeguarding
Classified Information, March 1984.

15. DoD 5220.22-R, Industrial Security Regulation, February
1984.

%

Guidelines for Writing Trusted Facility Manuals

————————————————————————

Table of Contents

FOREWORD
ACKNOWLEDGMENTS
PREFACE
1 INTRODUCTION
1.1 Purpose
1.2 Scope and Contents
1.3 Control Objectives
1.4 TFM Introduction
2 SYSTEM SECURITY OVERVIEW
2.1 Threats
2.2 Countermeasures Based on Security and Accountability Policies
and Procedures
2.3 Explicit Physical Security Assumptions
2.4 Protection Mechanisms Available to Administrative Users
2.5 Security Vulnerabilities and Warnings
2.6 Separation of Administrative Roles
3 SECURITY POLICY
4 ACCOUNTABILITY
4.1 Identification and Authentication Functions of Administrative
Users
4.2 Audit
5 ROUTINE OPERATIONS
6 SECURITY OF THE TCB
7 SATISFYING THE TCSEC REQUIREMENTS
7.1 Requirements and Recommendations for Security Class C1
7.1.1 TFM Introduction
7.1.2 System Security Overview
7.1.3 Accountability
7.1.4 Routine Operations
7.1.5 Security of the TCB
7.2 Requirements and Recommendations for Security Class C2
7.2.1 TFM Introduction
7.2.2 System Security Overview
7.2.3 Security Policy
7.2.4 Accountability
7.2.4.1 Identification and Authentication
7.2.4.2 Audit
7.2.5 Routine Operations
7.2.6 Security of the TCB
7.3 Requirements and Recommendations for Security Class B1
7.3.1 TFM Introduction
7.3.2 System Security Overview
7.3.3 Security Policy
7.3.4 Accountability
7.3.4.1 Identification and Authentication
7.3.4.2 Audit
7.3.5 Routine Operations
7.3.6 Security of the TCB
7.4 Requirements and Recommendations for Security Class B2
7.4.1 Introduction
7.4.2 System Security Overview
7.4.3 Security Policy
7.4.4 Accountability
7.4.4.1 Identification and Authentication
7.4.4.2 Audit
7.4.5 Routine Operations
7.4.6 Security of the TCB
7.5 Requirements and Recommendations for Security Class B3
7.5.1 TFM Introduction
7.5.2 System Overview
7.5.3 Security Policy
7.5.4 Accountability
7.5.4.1 Identification and Authentication
7.5.4.2 Audit
7.5.5 Routine Operations
7.5.6 Security of the TCB
7.6 Requirements of Security Class A1
GLOSSARY
REFERENCES

————————————————————————
NATIONAL COMPUTER SECURITY CENTER

FORT GEORGE G. MEADE, MARYLAND 20755-6000

NCSC-TG-016

Library No. S239,639

Version 1

FOREWORD

Guidelines for Writing Trusted Facility Manuals provides a set of good
practices related to the documentation of trusted facility management
functions of systems employed for processing classified and other sensitive
information. A Trusted Facility Manual (TFM) is a document written by a
system vendor that describes how to configure and install a specific secure
system, operate the system in a secure manner, and make effective use of the
system privileges and protection mechanisms to control access to
administrative functions and databases.

Guidelines for Writing Trusted Facility Manuals is the latest addition to
the “Rainbow Series” of documents. These publications are the product of the
Technical Guidelines Program. The National Computer Security Center designed
these technical guidelines to provide insight to the Trusted Computer System
Evaluation Criteria requirements and guidance for meeting each requirement.

Recommendations for revision to this guideline are encouraged and will be
reviewed by the National Computer Security Center through a formal review
process.

Patrick R. Gallagher, Jr. October 1992
Director
National Computer Security Center

ACKNOWLEDGMENTS

The National Computer Security Center wishes to extend special recognition
and acknowledgement for their contributions to this document to Infosystems
Technology, Inc., and to Dr. Virgil D. Gligor of the University of Maryland
as primary author and preparer of this document. Special thanks also go to
the many computer vendor representatives, and members of the National
Computer Security Center (NCSC) community who enthusiastically gave of their
time and technical expertise in reviewing the material and providing
valuable comments and suggestions.

Special recognition goes to Leon Neufeld, NCSC, who served as project
manager for the preparation and production of this document.

PREFACE

Throughout this guideline there will be recommendations made that are not
included in the Trusted Computer System Evaluation Criteria (TCSEC) as
requirements. Any recommendations that are not in the TCSEC are prefaced by
the word “should,” whereas all requirements are prefaced by the word
“shall.” It is hoped that this will help to avoid any confusion.

Examples in this document are not to be construed as the only implementation
that will satisfy the TCSEC requirement. The examples and literature
citations provided herein are merely suggestions of appropriate designs and,
possibly, implementations. The recommendations in this document are also not
to be construed as supplementary requirements to the TCSEC. The TCSEC is the
only metric against which systems are to be evaluated.

1 INTRODUCTION

The Department of Defense Computer Security Center (DoDCSC), established in
January 1981, expands on the work started by the DoD Security Initiative. In
1985, the DoDCSC became the National Computer Security Center (NCSC) to
reflect its responsibility for computer security throughout the Federal
Government. The Director, NCSC, has the responsibility for establishing and
publishing criteria and guidelines for all areas of computer security.

The principal goal of the NCSC is to encourage the widespread availability
of trusted computer systems. In support of that goal, the NCSC created a
metric, known as the DoD Trusted Computer System Evaluation Criteria
(TCSEC), against which computer systems could be evaluated for security. The
DoDCSC originally published the TCSEC on 15 August 1983 as CSC-STD-001-83.
In December 1985, the DoD adopted it, with a few changes, as a DoD Standard,
DoD 5200.28-STD. DoD Directive 5200.28, Security Requirements for Automated
Information Systems (AIS), requires the TCSEC to be used throughout the DoD.
The TCSEC is the standard used for evaluating the effectiveness of security
controls built into Automated Data Processing (ADP) systems. The TCSEC has
four divisions: D, C, B, and A, ordered in a hierarchical manner with the
highest division (A) being reserved for systems providing the best available
level of assurance. Within divisions C, B, and A, a number of subdivisions,
known as classes, are also ordered in a hierarchical manner to represent
different levels of assurance in these classes.

1.1 Purpose

A Trusted Facility Manual (TFM) is one of the documents necessary to satisfy
the requirements of any class in the TCSEC. The TFM is directed towards the
administrators of an installation, and its goal is to provide detailed,
accurate information on how to (1) configure and install a specific secure
system, (2) operate the system in a secure manner, (3) make effective use of
the system privileges and protection mechanisms to control access to
administrative functions and databases, and (4) avoid pitfalls and improper
use of the administrative functions that would compromise the Trusted
Computing Base (TCB) and user security.

The importance of the TFM in supporting the operation of a secure computer
system cannot be overestimated. Even if one assumes, hypothetically, that
all users of a system and their applications are trusted, and that they will
use all of the available protection mechanisms correctly, the system may
still be administered and operated in an insecure manner. This may be
especially true when administrative users lack the skill, the care, or the
interest to use the system properly. Furthermore, the security damage that
administrative users can cause by careless use, or deliberate misuse, of
administrative authority is significantly larger than that caused by
ordinary users. Although use of a detailed, accurate TFM cannot address or
counter deliberate misuse of administrative authority, it can help minimize
chances of misuse due to lack of awareness of proper system use. To help
minimize these instances of system misuse, the TFM should include examples
of both proper uses and warnings about consequences of misuse of
administrative functions, procedures, privileges, and databases.

This guideline presents the issues involved in writing TFMs. Its objectives
are (1) to provide guidance to manufacturers on how to document functions of
trusted facility management implemented by their systems and (2) recommend a
TFM structure, format, and content that would satisfy the TCSEC
requirements. The recommendations made herein should not be considered as
the only means to satisfy the TCSEC requirements. Additionally, this
document contains suggestions and recommendations derived from the TCSEC
objectives but which are not required by TCSEC in the TFM area. For example,
the TFM may include documentation required by the TCSEC in the areas of
System Architecture, Design Documentation, and Trusted Distribution. The
inclusion of this documentation in a TFM instead of other separate documents
is optional.

1.2 Scope and Contents

The TFM should give specific guidance to administrative users on how to
configure, install, and operate a secure computer system, and should clearly
illustrate the intended use of all security features, citing actual system
commands and procedures. Although a high level of detail in illustrating key
security concepts would benefit administrative users, the TFM cannot be
considered to be, nor can it be, a training manual in the area of computer
security in general, nor in the area of system administration in particular.
Instead, the TFM user is assumed to have some familiarity with the notion of
trusted systems within the realm of computer security. The TFM will provide
the user with detailed information on how to administer and operate a
specific trusted system in a secure manner.

Many different organizations of the TFM are possible. For example, an
acceptable TFM format would provide a separate section describing specific
security responsibilities of any separate administrative roles, such as
those of the security administrator, auditor, system programmer, and operator,
that are supported in the system; available commands for each role; use of
each command; parameter and default settings; specific warnings and advice
regarding the use of functions, privileges and databases of that role; and
the specific responsibilities of that role for TCB security. Use of this
format is advisable for manuals of systems in higher security classes,
namely B2, B3, and A1, where separation of administrative roles is required.

An equally acceptable TFM organization and section format would provide a
separate section for each functional requirement area of the TCSEC, namely,
for security policy (e.g., Discretionary Access Control (DAC), Mandatory
Access Control (MAC)), accountability, and TCB protection. Each section
would include available commands, system calls, and procedures relevant to
that area; use of each command (including the effects of each command when
used by different administrative roles); parameter and default settings; and
specific warnings and advice regarding the use of functions, privileges, and
databases available to commands of that area. Use of this alternate format
is advisable for lower security classes, namely C1-B1, where the TCSEC does
not mandate any separation of administrative roles. Either of the two
alternate TFM formats mentioned above is equally acceptable for all TCSEC
security classes as long as the TFM satisfies the TCSEC requirements.
Furthermore, other TFM formats would also be acceptable as long as they
satisfy the stated TCSEC requirements. The TCSEC neither requires nor
suggests a specific TFM format.

This guideline contains eight additional sections. Section 2 defines the
security and accountability policies and mechanisms of systems. Section 3
identifies and explains the security-relevant and security-irrelevant
functions of an administrator. Section 4 identifies and explains the use of
TCB commands and interfaces used by administrative users. Section 5 defines
day-to-day routine operations performed by administrative users and the
security vulnerabilities of these operations. Section 6 identifies all TCB
security and integrity responsibilities of administrative users.

Section 7 presents recommendations for writing the TFM that satisfy the
requirements of the TCSEC. Section 8 is a glossary. Section 9 lists the
references cited in the text. Each section consists of three parts: a
statement of purpose, an explanation of how that purpose can be achieved,
and an outline summarizing the recommendations made.

These guidelines apply to computer systems and products built or modified
with the intention of satisfying the TCSEC requirements.

1.3 Control Objectives

The control objectives for the TFM are similar to those of other
documentation areas of the TCSEC. They refer to what should be documented in
a particular area, such as the trusted facility management, and how this
documentation should be structured. Thus, the control objectives for writing
the TFM are:

(1) the TFM shall address all the requirements specified by the TCSEC
that are relevant to it; and
(2) the TFM shall provide detailed, accurate information on how to:
– configure and install a specific secure system;
– operate a system in a secure manner;
– avoid pitfalls and improper use of administrative functions that
would compromise system and user security.

1.4 TFM Introduction

The purpose of this section in the TFM is to explain the scope, use, and
contents of the TFM of a particular system. In general, the scope of the TFM
should include explanations of how to configure and maintain secure systems,
administer and operate them in a secure manner, make effective use of the
system’s privileges and protection mechanisms for administrative use, and
avoid pitfalls and misuse of administrative authority. Depending on the
particular computer system, the complexity of trusted facility management
may differ and thus the scope of the TFM may differ accordingly. For
example, in large systems, system configuration and installation is a
complex activity described in a separate system administration manual that
may, or may not, include the other important areas of the TFM. In contrast,
system configuration and installation is a relatively simple activity
defined in a single chapter of a TFM for a small system, such as a
multi-user workstation.

The introduction to the TFM should also discuss the recommended use of the
manual. In particular, this section should define the skills and general
computer systems and security background assumed for administrative
personnel. This is necessary because different administrative functions
require different levels of skill. For example, an individual in the system
programming staff that configures, installs, and maintains the TCB code
often needs considerably more technical skills than an individual in the
accounts management staff. Similarly, a security administrator needs more
detailed knowledge of the system security policy and accountability than
individuals assigned to operator’s roles. The definition of required skills
and background is important in aiding the management of a particular
organization in assigning appropriately trained individuals to various
administrative tasks.

In defining the use of the TFM, the introductory section should also include
a list of other system manuals that may be consulted by the administrative
staff. For example, most administrators may benefit from an understanding of
the Security Features User’s Guide (SFUG). Most system designs use the DAC
mechanisms described in the SFUG for protection of, at least, some
administrative files, and may use the trusted path mechanism to prevent
spoofing of administrative commands. Similarly, whenever manual sections
that logically belong in the TFM are in fact provided in other manuals —
system configuration and installation manuals, and system reference manuals
containing descriptive top-level specifications (DTLSs) of commands and
interfaces used by administrative users — the TFM Introduction should
include references to these additional manuals. The TFM should place the
references to these manuals in context and provide a brief synopsis of the
relevant information from the specific manual citation. This citation would
help narrow the reader’s focus to a few pages of the referenced manual.
Furthermore, references to documents, manuals, and standards that may be
beneficial to some administrative personnel, such as password management and
use guidelines and standards, should be made in this section. References to
educational and training documents that are helpful to administrative
personnel may also be included here.

The TFM writer may also want to define the limitations of the TFM in terms
of security scope. For example, some security issues such as personnel
background verification, assignment and maintenance of users’ trust levels,
physical system and environmental security, proper use of cryptographic
techniques and devices, and procedures that assign individuals to
administrative roles, generally fall outside the scope of TFM definition.
Explicit recognition of such limitations enables the management of a secure
facility to plan countermeasures for areas of vulnerability not countered by
the trusted systems.

Finally, the introductory section of the TFM should include a “road map”
defining the contents of each TFM section and possibly the relationships
between various manual sections. This road map may also identify the
self-contained sections of the manual that can be read independently of
other sections.

In summary, the introductory section of the TFM should include:

(1) Scope of the manual
– guide the configuration and installation of secure systems;
– guide the operation of a system in a secure manner;
– enable administrative personnel to make effective use of the
system’s privileges and protection mechanisms;
– issue warnings about possible misuse of administrative authority.
(2) Recommended use of the manual
– review skills and systems background necessary for
administrative personnel;
– suggest additional manuals, reference material, and standards
needed by administrative personnel;
– specify the limitations of security scope;
(3) TFM contents
– contents of each section;
– section relationships.

2 SYSTEM SECURITY OVERVIEW

The purpose of this section of the TFM is to define the security and
accountability policies and mechanisms of the system that are designed to
counter a set of perceived threats. The focus of this section should be on
the administrative-user functions available to counter threats, the
privileges and protection mechanisms available to administrative users, and
the general vulnerabilities associated with actions of administrative users.
This section should also include a list of dependencies on other security
measures, such as those for the maintenance of physical security, which,
although not required by the TCSEC, should be taken into account by the
management of the system installation and by system accreditors.

2.1 Threats

An example of the general security threat handled by systems built to
satisfy a TCSEC class is unauthorized disclosure of information, either
through unauthorized direct or indirect access to system and user objects
(by way of system failures, subversion, and TCB tampering) or through the
use of covert channels. The manual should describe some of the common
attacks that cause unauthorized disclosure of information, in the context
of the specific
system. These examples might include the use of Trojan horses in untrusted
shared programs, the use of covert channels by untrusted users and
applications, the use of known penetration methods that cause unauthorized
disclosure of sensitive or proprietary information, and the misuse of access
authorization to retrieve and disclose sensitive information (e.g., insider
attacks).

2.2 Countermeasures Based on Security and Accountability Policies and
Procedures

This section of the TFM should include a brief discussion of the protection
mechanisms available in the system that help counter the threats defined in
the above section. This discussion should serve as a summary of the
protection philosophy used in the design and implementation of the
protection mechanisms and should include a presentation of the role of
security policy (both discretionary and mandatory policy, if any),
accountability, and assurance (both operational and life-cycle assurance).
The dependency of the system security mechanisms on administrative-user
actions should be emphasized here.

This section should point out clearly the types of threats that can, or
cannot, be countered by a specific policy or mechanism. For example, this
section should state that DAC mechanisms cannot, and are not meant to,
prevent or contain threats posed by Trojan horses implementing time bombs,
trap doors, or viruses placed in shared, untrusted applications [2]. DAC
mechanisms cannot, nor are they meant to, detect or prevent access performed
by an authorized subject on behalf of an unauthorized subject (e.g., the
surrogate access problem [3]). Furthermore, DAC mechanisms are not, nor were
they ever claimed to be, capable of controlling information (as opposed to
access privilege) flows. Only MAC can handle these problems.

This section should discuss, in the context of the specific system, the role
of specific accountability mechanisms and policies in countering security
threats not handled by access control mechanisms. An example is the use of
audit mechanisms to complement access control mechanisms in the sense that
they can detect attacks initiated by authorized users (i.e., by “insiders”),
or that trusted-path mechanisms are required to prevent spoofing, a threat
not usually countered by access control mechanisms or policies.

The emphasis in describing the above-mentioned threats and countermeasures
should be on the identification of the TCB mechanisms and policies that
counter a specific threat. For example, the summary of the countermeasures
supported by the system should include the basic assertion (and in other
design documents, the justification) that the TCB itself is
non-circumventable and tamper proof. Additional points of emphasis may be
that all countermeasures supported in the system require the interaction of
both access control and accountability mechanisms, and that these mechanisms
should be employed by both ordinary and administrative users. This section
should provide examples of interaction between ordinary and administrative
user decisions to illustrate both the positive and negative consequences of
such interaction.

2.3 Explicit Physical Security Assumptions

The TCSEC does not include requirements for physical security of the system
installation. However, the TFM should include a section or a subsection that
states the physical security assumptions made by the system designers. These
assumptions should be satisfied by the management of the organization
responsible for deploying the system, as the evaluation of physical security
is the responsibility of the system’s accreditors.

The explicit inclusion of the physical security assumptions made by
designers in the TFM will provide the accreditors with the necessary input
for the deployment of the system in different operational environments and
provide the administrative users an important input for the sound definition
of the system security profile. For example, systems that do not provide
trusted paths for administrative users usually assume that a set of terminal
ports is reserved for the connection of administrative consoles that are
physically separated from the rest of the user terminals for the entire
lifetime of the system. Also, a common assumption is that the system
definition of the security profile ensures that the level of trust
associated with the physical environment containing a system’s peripheral
will always dominate the maximum sensitivity associated with that
peripheral. Similarly, this section should emphasize that systems allowing
legitimate users to access their components (e.g., removable media) should
be used only in environments where both administrative and ordinary users
are trusted to access all data in the system and are trusted not to misuse
their physical access permissions. (In such environments, the use of
untrusted applications may still require the use of trusted systems even
though all users are trusted to access all data.) In systems that do not
allow users to access the system components, or when the above level of user
trust cannot be guaranteed, the TFM should suggest the physical controls
necessary to counter, or deter, the potential threat of physical access to
system components. The presentation of the physical security assumptions
made by system designers should enable accreditors to determine the security
risks and exposures assumed by system use as well as the required
countermeasures.
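
The dominance relationship mentioned above, between the trust level of the
physical environment and a peripheral's maximum sensitivity, can be sketched
in Python as follows (the tuple encoding of levels and the example values are
assumptions made only for illustration): the environment's trust level must
dominate the peripheral's maximum sensitivity in both its hierarchical
classification and its category set.

    def dominates(a, b):
        # A level is a (hierarchical classification rank, category set) pair.
        rank_a, cats_a = a
        rank_b, cats_b = b
        return rank_a >= rank_b and cats_a >= cats_b

    environment_level = (3, {"NATO"})   # trust level of the area housing the device
    printer_max_level = (2, {"NATO"})   # maximum sensitivity assigned to the printer

    assert dominates(environment_level, printer_max_level)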

2.4 Protection Mechanisms Available to Administrative Users

The security of any system depends directly on the security of the
administrative commands, interfaces, and databases. For this reason,
administrative commands, privileges, and databases shall be protected from
ordinary users, and in some TCSEC security classes, shall be separated on a
role basis. This section should identify the protection mechanisms available
to administrative users to ensure that these users are aware of the means
available to control access to their commands, privileges, and databases.

All protection mechanisms that can be manipulated by ordinary users are also
usually available to administrative users. For example, all user
identification and authentication, and DAC mechanisms are available to
administrative users. In addition to mentioning these mechanisms, which the
SFUG already defines, this TFM section should include the description of the
mechanisms available only to the administrative users and the mode of their
safe use. For example, the use of special trusted-path mechanisms based on
physically protected, hard-wired consoles, which may allow the invocation of
command processors available only to administrative users, and the use of
audit mechanisms to detect potential intrusion by authorized users, are only
a few of the protection mechanisms specific to administrative users [7].

2.5 Security Vulnerabilities and Warnings

This section should describe the security vulnerabilities of administrative
commands and procedures, and should suggest specific ways to counter them.
Reference [7] cites generic examples of common vulnerabilities of
administrative roles and role-specific vulnerabilities. In addition to
similar examples, this TFM section should include a discussion of
system-specific vulnerabilities and countermeasures required in the assumed
environments of system use.

In any system, design and implementation assumptions are made about
administrative actions and their sequence of use. For example, the loading
of a system during the installation phase, and the installation itself, may
require the use of special administrative commands in a specific sequence.
The definition of a user security profile may require that administrators do
not reuse user and group identifiers, and that the definition of the system
security profile prohibits the reuse of bit encodings of sensitivity levels
without careful analysis of consequences. Other potential vulnerabilities,
such as those resulting from mismanagement of audit logs and post processing
of files (in on-line, off-line, and hard-copy form) should also be explained
here. Design and implementation assumptions should be stated explicitly in
this section to ensure that administrative users are aware of the negative
consequences of not satisfying these assumptions.

2.6 Separation of Administrative Roles

Security classes B2-A1 of the TCSEC require that the roles of the
administrative users be separated. This requirement means that the commands,
procedures, privileges, and databases of the various administrative roles
shall be separated by system design and shall be documented as such. Role
separation of classes B3 and A1 also requires the separation of
security-relevant functions from the security-irrelevant ones. Reference [7]
cites the rationale and the means of achieving role separation in trusted
systems.

The TFM shall define each separate role supported by the system. Each role
should be clearly defined in terms of the commands and TCB interfaces
available to the role, the use of each command, the command effects and
exceptions (whenever these are not defined in the DTLS of the TCB),
parameter and default settings, specific warnings for the command use, and
advice. The TFM should also define the specific security mechanisms used to
protect privileged commands and data used by administrators.

In summary, the TFM section presenting the system security overview for
administrative users should include the following subsections:

2.1 Threats to System Security
2.2 Countermeasures Based on Security Policy and Accountability
2.3 Explicit Physical Security Assumptions
2.4 Protection Mechanisms Available to Administrative Users
2.5 Security Vulnerabilities of Administrative Users and Warnings
2.6 Separation of Administrative Roles (for classes B2-A1)

3 SECURITY POLICY

The purpose of this section is to identify and explain the security-relevant
and security-irrelevant functions of the administrators. In particular, this
section should explain, in the area of security-relevant functions, the use
of the TCB commands and interfaces by administrative users to initialize
discretionary access privileges, to set default user accesses to system
objects after user registration, and to distribute, review, and revoke
access privileges on behalf of users in systems that implement DAC in a
centralized way [2]. In systems that support MAC, this section also
identifies and explains the use of TCB commands and interfaces by
administrators to define and change the system security profile (e.g., the
system-sensitivity map, sensitivity level limits for system devices, and
file systems), to define and change object sensitivity levels (e.g., label
imported, unlabeled data, and media), and to change the trust level of
active subjects, whenever such a function is supported. This section also
should define the administrator’s interfaces for other functions related to
the support of DAC and MAC, such as changing object ownership, restoring
privileges deleted accidentally, destroying errant processes, running
consistency checks on system and user security profiles, and managing user
accounts.

Reference [7] outlines the role of the security administrators in support of
the security policy defined in a system. The TFM should specify the
commands, system calls, functions, their parameters and default settings
provided for each area of security policy and support, and should provide
examples of use, potential misuse, and security implications of command
misuse. For example, the TFM should explain how the administrator can change
the sensitivity label of an object or a subject, and cite the expected
security consequences of such action, and also how the administrator may
determine the consequences of such a change in the given system. Similarly,
the administrator may decide to reuse a binary representation of a
sensitivity level to define a new sensitivity level. For this process, the
manual shall state the circumstances in which this change is allowed, if
ever, and should explain the conditions under which this change is safe. All
commands, system calls, and functions should be defined in terms of their
effects, exceptions, and parameters. The use of commands should be
illustrated by examples showing the correct settings of various command
options. This section should describe the recommended reactions of the
administrator to such exceptions (unless these reactions are already
described in the call/command DTLS).
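
The kind of command definition intended here can be illustrated with a
minimal Python sketch (the relabel_object function, the integer level
encoding, and the audit record layout are hypothetical): an administrative
relabel operation that records its effect and treats a downgrade as the
exception requiring the site's documented review procedure.

    def relabel_object(obj, new_level, admin, label_map, audit_log):
        # label_map maps object names to integer sensitivity levels;
        # audit_log is an append-only list of audit records.
        old_level = label_map[obj]
        if new_level < old_level:
            # Downgrading is the security-relevant case: refuse it here and
            # record the attempt so it can be reviewed under the documented
            # procedure before being redone.
            audit_log.append(("downgrade_refused", admin, obj, old_level, new_level))
            raise PermissionError("downgrade requires the documented review procedure")
        label_map[obj] = new_level
        audit_log.append(("relabel", admin, obj, old_level, new_level))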

The administrative functions and interfaces used in supporting the security
policy have potential vulnerabilities. Reference [7] outlines some of these
generic vulnerabilities. The TFM shall include warnings of all known
specific vulnerabilities in the given system and possibly suggest means of
reducing system risk associated with such vulnerabilities. Minimally, the
TFM should specify the dependencies of the administrative roles on external
policies and procedures that would help reduce system risk associated with
identified vulnerabilities.

In summary, the security policy section of the TFM should include the
following subsections (whose contents are discussed in more detail in
reference [7]):

3.1 Discretionary Access Control
– TCB commands and interfaces used to initialize DAC privileges and
defaults;
– TCB command interfaces to distribute, review, and revoke user
privileges in systems that support centralized DAC;
– group membership definition and impact on DAC;
– change of object ownership (if any), restoration of accidentally
deleted privileges, destruction of errant processes.
3.2 Mandatory Access Control
– TCB commands and interfaces to define and change system security
profile; classify, reclassify and import objects; and change trust
level of active subjects;
– consistency checking of system security and user profiles.
3.3 Management of User Accounts
– definition and deletion of user and group accounts and
identifiers.
3.4 Command, System Call, and Function Definitions
– effects and exceptions (if not defined in DTLSs);
– parameter and default settings;
– examples of command use and potential misuse.
3.5 Warnings of Specific Vulnerabilities of Administrative Procedures
and Activities Related to Security Policy.

4 ACCOUNTABILITY

4.1 Identification and Authentication Functions of Administrative Users

The purpose of this section is to identify and explain the use of TCB
commands and interfaces that should be used by administrative users to set
up user security profiles, and to determine authentication and authorization
parameters associated with the user identification and authentication
mechanism. Reference [7] defines the role of the security administrator in
the identification and authentication area. The TFM shall specify the
commands, system calls and functions, and their parameters and default
settings that are provided by the specific system, and should provide
examples of the use, or potential misuse of these commands, and the security
implications of command misuse. For example, the TFM should explain how the
administrator can initialize user passwords, can distribute special
passwords to other administrative users, and set up account restrictions
(e.g., restricted time intervals for login, account cutoff). The commands
that allow the definition of user and group identifiers shall include an
explanation of how these identifiers should be chosen, why they should not
be reused, and what the consequences of identifier reuse are.
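
One way to make identifier reuse impossible by construction can be sketched
in Python as follows (the IdAllocator class and its field names are
illustrative assumptions, not a description of any particular system):
identifiers are issued from a counter that only moves forward, and a deleted
account's identifier is retired rather than returned to a pool.

    class IdAllocator:
        def __init__(self, first_id=1000):
            self.next_id = first_id
            self.retired = set()

        def allocate(self):
            uid = self.next_id
            self.next_id += 1
            return uid

        def delete(self, uid):
            # The identifier is retired, never recycled, so objects still
            # owned by it remain attributable to the removed account instead
            # of silently passing to a newly registered user.
            self.retired.add(uid)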

In most systems, the setting of the user security profile also includes the
definition of some discretionary privileges associated with the user
account. For example, in systems that use groups to enforce DAC policies,
administrators define the group membership. The TFM shall explain the
consequences of adding a user identity to, or deleting it from, a group in
terms of
the added or lost discretionary privileges, and provide appropriate
warnings. In systems where the user security profile also includes the
specification of the maximum level of trust for each user, the TFM shall
also discuss the security implications of incorrect definition or change of
these levels and the interactions between these levels and the sensitivity
levels of various system components (defined in the system security
profile). It should also include examples of and warnings about such
changes. The commands available to system administrators also include those
to define and change the parameters of the login/logout mechanism used by a
system. Consequently, the TFM should explain how to define these parameters,
which include the time-out period, multiple login attributes, maximum login
time, and limits on unsuccessful logins from a terminal or into an account
[7] (e.g., specific commands, command options, formats, parameter ranges,
and default values). Whenever the trusted path mechanisms available to
administrative users require special procedures, such as use of specific
hard-wired consoles, the TFM shall specify how the administrative users can
use the trusted path mechanism in a secure manner.
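
A minimal sketch of such a set of login-mechanism parameters, and of the
point at which a lockout should be triggered, is given below in Python (all
parameter names and default values are assumptions chosen only for
illustration):

    login_policy = {
        "idle_timeout_seconds": 600,        # terminal locked after 10 idle minutes
        "max_session_seconds": 8 * 3600,    # maximum login time
        "max_failed_per_account": 5,        # lockout threshold for an account
        "max_failed_per_terminal": 10,      # lockout threshold for a terminal
    }

    def record_failed_login(counts, key, limit):
        # Returns True when the limit is reached, that is, when the account or
        # terminal should be locked and an audit alarm raised.
        counts[key] = counts.get(key, 0) + 1
        return counts[key] >= limit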

The TFM shall also explain the implications of the system security profile
definition in providing authorization data for user log ins. For example, a
terminal’s maximum and minimum sensitivity levels provide cutoff values for
whether a certain user login level can be used and whether a certain user
with a given user and group level clearance can log in at all from a given
terminal. The relationship between the terminal's minimum and maximum
sensitivity levels and the user’s clearance level shall be explained so that
consistent levels can be defined for both terminal sensitivity and user
level of trust.
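
Using integer levels for brevity (a simplifying assumption; a real system
would compare full classification and category pairs), the consistency check
described above reduces to the following Python sketch:

    def login_level_permitted(login_level, terminal_min, terminal_max, user_clearance):
        # The requested login level must fall within the terminal's
        # sensitivity range and must be dominated by the user's clearance.
        return (terminal_min <= login_level <= terminal_max
                and login_level <= user_clearance)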

Finally, administrator commands for temporarily terminating a user's access to
the system and for permanently deleting the user account shall be defined,
and the implications of such actions explained. This section should also
include warnings about potential vulnerabilities that persist when a user
account is not deleted correctly or completely, such as object ownership set
to the identity of a user or account that is no longer valid, or the reuse of
an old identifier, and should include examples of such vulnerabilities [7].

For all administrative commands defined in this and other system security
areas, this TFM section should include an explanation of all exceptions and,
possibly, a administrator’s recommended response to these exceptions. (This
reaction may already be described in the system call/command DTLS). All
administrative data bases that are accessed by these commands should be
identified showing how they are, or can be, protected. All mechanisms
available for the protection of the identification and authentication data
shall be clearly explained. The use of these mechanisms should be
illustrated by examples.

4.2 Audit

The purpose of this section of the TFM is to familiarize administrative
users with the TCB commands and interfaces of the system’s audit mechanism.
These commands include those that enable or disable the audit selectivity
mechanism (e.g., audit-event setup and change), those that help manage the
audit trails (logs), those that perform data compression and post processing
analysis, and, in classes B2-A1, those that set covert channel delays and
randomize variables.

Some systems include a set of audit events that should always be selected
for audit to ensure the consistency of subsequent events selected by the
auditor and the proper functioning of the post processing tools. These
events should be explicitly highlighted for special discussion in the list
of auditable events supported by the system. The complete list of events
shall be defined in the TFM. The audit selection mechanism should also be
presented, and examples of use should be provided. Commands of the audit
selectivity mechanism include those that turn on and off events on a
per-user, per-process, per-terminal, per-sensitivity-level, or per-object
basis. In TCSEC classes B3 and A1, the commands that turn on and off events
representing accumulations of other auditable events and audit-system alarms
(if any) shall also be presented.
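
A minimal sketch of such a selectivity mechanism follows (the event fields,
the ALWAYS_AUDIT set, and the event_selected function are assumptions made
only for illustration):

    # Event types that must always be audited so that subsequent selections
    # and the post-processing tools remain consistent.
    ALWAYS_AUDIT = {"login", "logout", "privilege_change", "label_change"}

    def event_selected(event, selection):
        # event carries "type", "user", "object", and "level" fields;
        # selection holds the per-user, per-object, and per-level criteria
        # set by the auditor.
        if event["type"] in ALWAYS_AUDIT:
            return True
        return (event["user"] in selection.get("users", set())
                or event["object"] in selection.get("objects", set())
                or event["level"] in selection.get("levels", set()))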

Systems that support audit mechanisms include commands that help manage the
audit files. These commands, which include those to create new and destroy
old audit logs, to change audit log size and warning points, to display,
format, and compress audit data, and to check the consistency of the audit
database after crashes, shall also be included in the TFM, along with an
indication of when such changes take effect. The procedures that shall be
used by auditors to ensure
that the audit files do not overflow shall also be presented. The format in
the audit log file of each record field and of each type of auditable event
shall be presented and explained. Commands for post processing of audit logs
(if any) shall also be included in the TFM. Systems designed to satisfy the
B2-A1 security requirements need to have covert channels restricted to
certain limits. One means of reducing covert channel bandwidths is by
placement of delays and by setting of randomization variables in system
kernels and trusted processes. Commands that accomplish this task should be
presented in the TFM of these systems along with a description of the covert
channel handling policy recommended for enforcement. These recommendations
should be derived from the covert-channel analysis guideline of the TCSEC
and are important because they affect not only the security policy and the
accountability areas of the system, but also system performance. Reference
[7] defines the administrative functions necessary to support audit
activities. As suggested in the covert channel guidelines of the TCSEC,
bandwidth reduction policy should be coordinated with audit policy. For this
reason, the TFM should present the bandwidth reduction policy in the same
section with that presenting the audit policy.
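
The delay-and-randomization technique can be sketched in Python as follows
(the delayed_release function and its default values are hypothetical; the
actual delays would be derived from the system's covert channel analysis):

    import random
    import time

    def delayed_release(operation, base_delay=0.05, jitter=0.05):
        # The fixed delay lowers the maximum signaling rate of a timing
        # channel; the random jitter adds noise to each observation a
        # receiving process could make.
        result = operation()
        time.sleep(base_delay + random.uniform(0.0, jitter))
        return result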

Recommendations on audit procedures should also be included in the TFM.
These procedures would suggest auditing groups of specific events that may
reveal misuse of access privileges, potential system-penetration attacks,
and covert channel usage. They may also suggest the frequency of audit
review and provide advice on how to manage audit files on-line and off-line.

For commands used by administrative users for audit, the TFM should include
a description of their effects and exceptions, and should provide examples
of use, potential misuse, and security implications of command misuses.
Recommendations for administrator’s reactions to command exceptions should
also be made. Reference [7] provides examples of vulnerabilities caused by
misuse of audit command and authority. These examples include loss of audit
log consistency, loss of audit logs, loss of user privacy, and various forms
of denial of service. Specific instances of vulnerability in a given system
and possible suggestions for reducing the system’s exposure to such
vulnerabilities should also be included in the audit section of the TFM.

In summary, the accountability section of the TFM should include the
following subsections:

4.1 Identification and Authentication
– TCB commands and interfaces for setting up user security
profiles and authentication and authorization parameters of the
login mechanism;
– password distribution to ordinary and administrative users,
management of password generation, and protection of passwords;
– account restrictions (e.g., restricted time intervals for log
in, and account cutoffs);
– choice of user and group identifiers;
– maximum levels of trust for users and groups;
– computation of the current level of trust for subjects (e.g.,
subject’s clearance).
4.2 Definition and Change of System Parameters of the Login Mechanism and
When They Take Effect
– timeout interval;
– multiple login attributes;
– maximum login time;
– limits on unsuccessful logins from a terminal or to an account;
– use of special trusted path mechanisms for administrative users.
4.3 Audit Mechanisms
– audit-event selection mechanisms (e.g., audit-event setup and
change);
– management of audit logs (e.g., protections of audit logs);
– functions for formatting, compressing, and postprocessing of
audit files;
– interfaces for setting of covert channel delays and
randomization of variables;
– description of audit log and event formats.
4.4 Commands, System Calls and Function Definition
– effects and exceptions of each command of the accountability
area (if not defined in DTLSs);
– parameter and default settings;
– examples of use and potential misuse.
4.5 Warnings of Specific Security Vulnerabilities of Administrative
Activities and Procedures Related to Identification, Authentication,
Trusted Path and Audit

5 ROUTINE OPERATIONS

The purpose of this section of the TFM is to define the routine operations
performed by administrative users, describe the operation’s security,
describe the vulnerabilities associated with these operations, and provide
appropriate warnings. These operations are carried out, in most cases, by
execution of appropriate commands from a system console. However, in some
instances, these operations involve manipulation of physical devices, such
as printers, storage devices, removable media, communication switches, and
modems. For this reason, this TFM section may differ from the rest of the
TFM. It should contain not only definitions of specific commands and TCB
interfaces, but also procedures and policies for secure use and manipulation
of hardware devices.

Routine operations of administrative personnel include both
security-relevant and security-irrelevant operations. Security-relevant
functions include those that boot and shut down the system, set system
clocks, identify damaged user volumes and files, perform TCB backups and
on-line device tests, run system integrity tests, and respond to user
requests to mount/unmount volumes. Routine security-irrelevant operations
include those that perform system metering and those that require operator
response to various user requests [7].

This section of the TFM should include a description of each command used for
routine operations, including its effects and exceptions, and should provide
examples of use, potential misuse, and security implications of command
misuse. Examples of vulnerabilities of security-relevant, routine operations
include the booting of an old version of the TCB, causing inconsistency
problems for users; system shutdown while still in normal operation causing
loss of files and file system inconsistencies; and inadequate use of devices
and device interfaces (e.g., printers).

This section of the TFM should also include descriptions of administrative
commands that perform security-irrelevant routine operations. These commands
include those traditionally performed by account administrators, such as
commands used for maintenance of accounting files, for turning on and off
accounting, for running accounting tools, for collecting statistics of
system and resource usage, and for collecting billing information.

Administrative policies and procedures that define security-relevant
handling of devices shall also be included in the TFM. For example,
procedures to install, activate, and set the current sensitivity level of a
printer within the pre-defined range should be defined, and examples of the
installation procedure should be given. In summary, the TFM section defining
the routine administrative operations and procedures should include the
following subsections:

5.1 Security-Relevant Procedures and Operations
– running of system diagnostics;
– system boot and shutdown;
– setting of system clocks;
– identification of damaged user files and volumes;
– routine backup of TCB files;
– on-line device testing;
– response to user requests to mount/unmount volumes;
– handling of peripheral devices, removable storage, and output
(e.g., printers, printer output, diskpacks, tape reels).
5.2 Security-Irrelevant Procedures and Operations
– back-up of user volumes;
– system metering;
– response to various user requests;
– user account administration;
5.3 Commands, System Calls and Function Definitions
– effects and exceptions of each command of the routine operations
area (unless defined in the DTLSs);
– parameter and default settings;
– examples of use and potential misuse.
5.4 Warning of Specific Security Vulnerabilities of Routine Operations

6 SECURITY OF THE TCB

The purpose of this TFM section is to identify and explain all aspects
of TCB security and integrity that become the responsibility of
administrative users. Because the security of all user programs, data, and
application subsystems is provided by the TCB, the maintenance of TCB
security and integrity is one of the most sensitive administrative
functions.

Maintenance of TCB security spans the entire system life cycle. It includes
procedures for strict configuration management during system development and
use, and for secure system distribution, installation, and local
maintenance. In some cases, administrative users are allowed and required to
generate another evaluated version of the TCB from source code (e.g., make
changes to the TCB source code and regenerate the TCB on site). In such
cases, the TFM shall include detailed descriptions of procedures that
generate a new TCB version from source code, the necessary system commands,
the list of approved tools (e.g., compilers, linkers, loaders) for TCB
generation, examples of command use, warnings of possible problems in
generating a new TCB, vulnerabilities that may affect TCB security, and
configuration management.

The TFM shall also provide, or reference a separate document that provides,
a description of command exceptions, appropriate warnings, and possible
exception handling advice. The TFM should also provide, or reference a
separate document that describes, the configuration management tools. The
TFM shall include descriptions of the procedures that must be followed by
site administrators to install new releases of the TCB.

TCB security may be violated during installation and maintenance (see [7]).
For this reason, the TFM shall provide a description of the TCB installation
procedures, including the required commands, exceptions, parameter settings,
required system configuration, warnings, and advice. The installation
procedures should contain descriptions of the TCB data structures that must
be initialized by the user, and of the TCB loading. Also, the installation
procedures should include a list of tools (e.g., editors, loaders) approved
for TCB installation and an appropriate description of secure installation
assumptions (e.g., administrative procedures, such as those that require
physical audit of the installation procedure by independent personnel).

All TCB maintenance procedures shall be defined in the TFM. These procedures
should include analyzing system “dumps” after crashes, conducting
crash-recovery and restart actions, performing consistency checking of TCB
files and directories, changing system configuration parameters (e.g., table
sizes, devices, and device drivers), running periodic system integrity
checks, and repairing damaged labels. A list of the approved tools for TCB
maintenance, relevant commands, exceptions, warnings, and advice should also
be included in this section.
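
One form such a periodic integrity check could take is sketched below (the
manifest format, the file paths, and the check_tcb_integrity function are
assumptions made only for illustration, not the interface of any evaluated
system):

    import hashlib
    import json

    def check_tcb_integrity(manifest_path):
        # The manifest is a protected file mapping TCB file paths to their
        # known-good SHA-256 digests, e.g. {"/tcb/bin/login": "ab12..."}.
        with open(manifest_path) as f:
            manifest = json.load(f)
        damaged = []
        for path, expected_digest in manifest.items():
            with open(path, "rb") as f:
                if hashlib.sha256(f.read()).hexdigest() != expected_digest:
                    damaged.append(path)
        return damaged   # a non-empty list should raise an administrative alarm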

The ability to install and maintain a system’s TCB in a secure manner
requires that administrative users be cognizant of all TCB modules.
Administrators should especially be cognizant of those hardware modules
containing the reference monitor mechanism, and of all the default file
protections for TCB files or objects. If available, the command needed to
run a tool that checks the correct privilege and sensitivity-level
initialization for TCB files or objects shall be identified and its use
illustrated. Thus, either the TFM itself shall provide a list of all TCB
modules, including their interfaces, and shall specify the TCB file or
object privileges necessary to protect the TCB, or the TFM shall reference a
separate document that does.
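
A minimal sketch of such a verification tool is given below (the paths, the
EXPECTED_TCB_PROTECTION table, and the misconfigured_tcb_files function are
illustrative assumptions):

    import os
    import stat

    # Hypothetical policy table: the protection mode and sensitivity level
    # that each TCB file is expected to carry after installation.
    EXPECTED_TCB_PROTECTION = {
        "/tcb/etc/auth_db": {"mode": 0o600, "level": "system_high"},
        "/tcb/bin/tcb_admin": {"mode": 0o750, "level": "system_high"},
    }

    def misconfigured_tcb_files(current_levels):
        # current_levels maps file paths to the sensitivity level recorded by
        # the system; protection modes are read from the file system itself.
        bad = []
        for path, expected in EXPECTED_TCB_PROTECTION.items():
            mode = stat.S_IMODE(os.stat(path).st_mode)
            if mode != expected["mode"] or current_levels.get(path) != expected["level"]:
                bad.append(path)
        return bad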

The TFM shall include warnings and advice on how to handle both generic and
system-specific vulnerabilities (if any) of TCB installation and
maintenance. For example, administrative users should be warned that
interchanges of dedicated-console and user-terminal communication lines can
cause potential loss of trusted path for administrative users, that
placement of extraneous code in the TCB configuration may result from using
an unapproved tool, and that running a borrowed untrusted program under
administrative identity may cause an untold number of TCB security problems
[7].

Finally, the TFM shall include a description of policies and procedures that
define the distribution procedures for a trusted system (i.e., a class A1
requirement). These policies and procedures shall be used to maintain the
integrity of the mapping between the master copy defining the current
version of the TCB and the on-site installed copy.

In summary, the TFM section that defines the security measures necessary for
protection of the TCB should include the following subsections:

6.1 The Generation of the TCB from Source Code
– list of TCB code modules, module interface and data (including
modules of the reference monitor);
– list of approved tools for TCB generation;
– procedures for TCB generation;
– vulnerabilities.
6.2 Configuration Management Policy (if required, reference to a
separate configuration management document)
6.3 Ratings-maintenance Plan (if applicable, reference to a separate
rating maintenance document)
6.4 TCB Installation Procedure
– TCB generation from source code (whenever allowed by the system
manufacturer);
– TCB hardware installation;
– TCB data structure initialization;
– TCB loading;
– setting of TCB file protection;
– list of approved tools.
6.5 TCB Maintenance Procedures
– analysis of system dumps;
– crash recovery and restart;
– changes of configuration parameters;
– repair of damaged TCB data structures;
– consistency-checking procedures;
– running of periodic system-integrity checks
6.6 Trusted Distribution of the TCB
– policies and procedures;
– correspondence between master copy and installed copy
6.7 Commands, System Calls, and Function Definitions for TCB Generation
from Source Code, Installation, Maintenance, and Trusted Distribution
– effects and exceptions (unless defined in DTLSs);
– parameter and default settings;
– examples of use and potential misuse.
6.8 Warnings of Specific Security Vulnerabilities of TCB Generation,
Installation, Maintenance, and Distribution

7 SATISFYING THE TCSEC REQUIREMENTS

This section of the TFM should contain the definition of the TFM
requirements on a TCSEC class basis. All of the requirements listed below
derive from corresponding documentation requirements and objectives of the
TCSEC. Although similar TFM requirements appear in multiple classes, the
contents of TFM sections shall reflect the complexity of policy,
accountability, assurance, and documentation of the evaluation class.
Consequently, this section should contain suggestions and recommendations
that may not be found in the TFM requirements area but that derive from
other TCSEC areas. These suggestions and recommendations illustrate the
added complexity of various TCSEC classes.

7.1 Requirements and Recommendations for Security Class C1

The TFM of a C1 class system may have the following structure:

7.1.1 TFM Introduction

The TFM introduction may include the following topics:

Scope of the TFM
– guide to configure and install secure systems;
– guide to operate a system in a secure manner;
– enable administrative personnel to make effective use of the
system’s privileges and protection mechanisms;
– issue warnings about possible misuse of administrative
authority.
Recommended use of the TFM
– review skills and systems background necessary for
administrative personnel, suggest additional manuals, reference
material, and standards needed by administrative personnel;
– specify the limitations of security scope;
Contents of the TFM
– contents of each section;
– section relationships.

(For specific DAC requirements, the reader should refer to [2].)

7.1.2 System Security Overview

This section of the TFM shall include a brief description of the system
administration vulnerabilities specific to the given system, warnings, and
advice on how to counter these vulnerabilities.

“A manual addressed to the ADP system administrator shall present
cautions about functions and privileges that should be controlled when
running a secure facility [6].”

The above TCSEC requirement suggests that the administrative functions and
privileges that need to be controlled when running a secure facility shall
be identified, and the vulnerabilities associated with those functions and
privileges shall be determined. Warnings relating to these vulnerabilities
shall be presented.

The administrative functions and privileges that need to be controlled when
running a class C1 secure facility include those supporting security policy
(i.e., DAC), accountability (i.e., identification and authentication), and
operational assurance (i.e., system integrity).

Security Policy

This section of the TFM shall include descriptions of the TCB commands,
interfaces, and procedures to:

– initialize discretionary access privileges and defaults for
individual users and groups;
– distribute, review, and revoke privileges on an individual user or
group basis;
– change object ownership (if any), restore accidentally deleted
privileges, and kill errant processes;
– define and change group membership (whenever groups are supported),
and explain the effect of such action on DAC;
– explain the implications of creating and deleting user and group
accounts on DAC.

7.1.3 Accountability

Identification and Authentication

This section of the TFM shall include descriptions of the TCB commands,
interfaces and procedures to perform the following functions:

– conduct setup of user/group security profiles, and authentication and
authorization parameters of the login mechanism;
– conduct password distribution and management for ordinary and
administrative users or groups (see [4]);
– define account restrictions (e.g., time intervals for login, account
cutoff time).

This section shall also include descriptions of the definition and change of
log in mechanism parameters. These parameters include:

– types of terminals supported and terminal interface initialization;
– time-out interval;
– multiple login attributes (if supported);
– maximum login time;
– limits on unsuccessful logins from a terminal or to an account.

7.1.4 Routine Operations

Although the TCSEC does not cite specific requirements in this area, the TFM
should include commands and procedures for the following activities:

– perform system boot and shut down;
– set system clocks;
– conduct on-line device testing;
– perform backup of user volumes;
– perform system metering;
– respond to various user requests.

7.1.5 Security of the TCB

This section of the TFM shall include descriptions of the TCB command
procedures that are provided “to validate periodically the correct operation
of the on-site hardware and firmware elements of the TCB.”[6]

In all areas of the TFM, and for all security classes where TCB commands and
interface descriptions are required, the TFM shall include:

– effects and exceptions of each command (if not already defined in the
DTLS);
– parameter and default setting;
– examples of potential use and misuse.

In all areas of the TFM, and for all security classes, warnings (i.e.,
cautions) shall be provided for specific security vulnerabilities of the
relevant administrative commands, interfaces, and procedures. Any
modification to the TCB, for all security classes, may invalidate the
system's rating [5].

7.2 Requirements and Recommendations for Security Class C2

Security class C2 includes all the TFM requirements of security class C1. In
addition, the following documentation requirements are added.

7.2.1 TFM Introduction

No Additional Requirements/Recommendations (NAR)

7.2.2 System Security Overview

The first design documentation requirement of TCSEC is that:

“Documentation shall be available that provides a description of the
manufacturer’s philosophy of protection and an explanation of how this
philosophy is translated into the TCB.”[6]

The above requirement suggests that the system security overview section
should include an additional subsection on security philosophy. This section
should contain a discussion of the security threats that could be countered
by the use of this system, and of specific countermeasures based on security
policy and accountability.

7.2.3 Security Policy

(NAR)

7.2.4 Accountability

The second documentation requirement is: “The procedures for examining and
maintaining the audit files as well as the detailed audit record structure
for each type of audit event shall be given” [6]. This requirement implies
that the following sections should be added to the accountability area:

7.2.4.1 Identification and Authentication

(NAR)

7.2.4.2 Audit

The TFM should include a section describing the audit mechanisms, TCB
commands, interfaces, and procedures for the following activities:

– determine audit selection mechanisms; these mechanisms include the
commands and procedures necessary to display all security-relevant
auditable events, to select the required and the optional audit events,
and to turn on and off events selectively on a per-user and per-process
basis;
– conduct audit log management; this activity includes commands and
procedures to create, save, and destroy saved audit logs; to change
audit log size and warning point for audit log overflow; to format,
compress and display audit logs;
– protect audit commands and databases;
– ensure maintenance of audit consistency;
– perform post processing of audit data; this is an optional
feature of a system and of the TFM, and includes mostly
application-specific commands and procedures for intrusion
detection. (However, all of these commands and procedures, and
also the available tools and their protection from unauthorized
user access, should be described whenever they are provided).

The audit section of the TFM shall include a detailed description of the
audit record structure for each type of audit event and of the entire audit
log. (For specific details of audit requirements, the reader should refer
to [1].)

7.2.5 Routine Operations

(NAR)

7.2.6 Security of the TCB

An additional requirement that is relevant to TCB protection is included here.

7.3 Requirements and Recommendations for Security Class B1

All TFM requirements of a class C2 system are included here. The
documentation requirements of class B1 suggest significant additions to the
TFM contents beyond those implied by the TCSEC requirements of security
policy and accountability.

The TFM of a class B1 system should include the following additional
documentation:

7.3.1 TFM Introduction

(NAR)

7.3.2 System Security Overview

This section should include any additional requirement referring to the
system security overview. That is, this section of the TFM “shall provide
guidelines on the consistent and effective use of the protection features of
the system; [and] how they interact.” [6] This suggests that the TFM should
include a discussion of the interaction between the protection mechanisms
and functions available to administrative users and those available to
ordinary users. As mentioned in Section 2 above, this interaction is
particularly important in the areas of security policy and accountability.

7.3.3 Security Policy

The additional security policy requirements of MAC and labeling suggest that
additional administrative responsibilities should be documented in the TFM.
The TFM requirement that the “manual shall describe the operator and
administrator functions related to security,”[6] suggests that the TFM
should include a description of all TCB commands, interfaces and procedures
to perform the following functions:

– define and change system security profiles;
– classify, reclassify, import, and export objects;
– perform consistency checks on system and user security profiles.

7.3.4 Accountability

7.3.4.1 Identification and Authentication

The B1 requirements mandate the identification and authentication
recommendations of classes C1 and C2 (i.e., they mandate identification
and authentication on a per-individual-user basis). In addition, they require
that the TFM include TCB commands and procedures to define and change the
user (and, possibly, group) levels of trust. They also require that the
computation of a subject's login level of trust be included in the TFM.

7.3.4.2 Audit

The additional B1 requirements that shall be included in the TFM
documentation include:

– a description of how the audit mechanism records any override of
output markings;
– a description of how the TCB commands, interfaces, and procedures
support audit on a per-object sensitivity-level basis.

7.3.5 Routine Operations

(NAR)

7.3.6 Security of the TCB

The additional TFM requirement in this area is that the TFM “shall provide
guidelines on […] how to securely generate a new TCB” [6].

This requirement suggests that the TFM include:

– a list of approved tools for TCB generation;
– a description of procedures for TCB generation;
– a description of the vulnerabilities in generating a new TCB.

The B1 requirements of the TFM also state that the TFM “shall provide
guidelines on […] privileges needed to be controlled in order to operate the
facility in a secure manner” [6]. This implies that the settings and the
defaults for the protection privileges of the TCB files should be specified.
Warnings about the improper setting of such privileges should be included.

7.4 Requirements and Recommendations for Security Class B2

All TFM requirements of the class B1 are included here. The documentation
requirements of class B2 suggest additions to the TFM contents beyond those
implied by the TCSEC requirements of security policy, accountability, and
operational assurance.

The TFM of a B2 system should include the following additional
documentation.

7.4.1 Introduction

(NAR)

7.4.2 System Security Overview

The only additional requirement for inclusion in this section is the
separation of administrative functions into two roles, namely that of the
administrator and that of the operator. Section 3 discusses the
documentation requirements for B2 role separation.

7.4.3 Security Policy

The two additional security-policy requirements that should be documented in
the TFM address the areas of subject sensitivity and device labels. The TFM
shall include the TCB commands and procedures to:

– change the security label of an active subject (if this function is
provided);
– assign and change the device sensitivity levels.

This section of the TFM shall also include a discussion of the security
vulnerabilities associated with change of trust level of an active subject.
Also it shall include a discussion of the relationship between the device
sensitivity levels and the level of trust associated with the physical
environment in which the devices are located.

7.4.4 Accountability

7.4.4.1 Identification and Authentication

The only additional TFM requirement here is that of documenting the trusted
path mechanisms available to administrative users whenever this mechanism
differs from that available to ordinary users (and documented in the SFUG).

7.4.4.2 Audit

The only additional TFM requirement of the audit area is that of defining
the TCB commands and interfaces for auditing covert channels, for setting
delays in covert channels, and for randomizing covert-channel variables.

7.4.5 Routine Operations

The routine operations performed by administrative users should be presented
according to the separation of roles required by the trusted facility
management area of the TCSEC and suggested by [7].

7.4.6 Security of the TCB

The additional TFM requirements for this section include:

– the list of TCB modules shall identify the modules of the reference
monitor mechanism;
– “[…] the procedures for secure generation of a new TCB from source
after modification of any modules in the TCB shall be described” [6].
(This requirement implies that configuration management shall be in
place. References to additional documents defining these procedures and
plans could be included in the TFM).

7.5 Requirements and Recommendations for Security Class B3

The only additional requirements of class B3 that shall be included in the
TFM are in the areas of system overview, audit, routine operations, and
security of the TCB.

7.5.1 TFM Introduction

(NAR)

7.5.2 System Overview

The TFM should include a discussion of the physical security assumptions
made by the system designers and implementers that must be satisfied by
the installed system. Also, this section shall include a discussion of the
separation between the security-relevant and security-irrelevant functions
of the administrators and operators (see [7]).

7.5.3 Security Policy

(NAR)

7.5.4 Accountability

7.5.4.1 Identification and Authentication

(NAR)

7.5.4.2 Audit

The TFM should describe the TCB commands and interfaces available to the
auditor that enable him or her to monitor the accumulation of auditable
events and to respond effectively to such event signals.

7.5.5 Routine Operations

The additional routine operations carried out by secure and ordinary
operators should be specified in the TFM. These should include:

– the identification of damaged user files and volumes;
– the routine backup of TCB files;
– the mounting and unmounting of volumes.

Security-irrelevant administrator and operator actions, such as handling
user requests and managing the accounting system, should also be documented
here.

7.5.6 Security of the TCB

Two additional TFM requirements are included here. The first is that “[The
TFM] shall include procedures to ensure that the system is initially started
in a secure manner” [6]. This requirement suggests that the TFM must
document procedures for:

– TCB hardware installation (using the list of approved hardware
modules);
– TCB loading;
– TCB data structure initialization;
– initialization of privileges for TCB files;
– use of approved initialization tools.

The second requirement is that “procedures shall also be included to resume
secure system operation after any lapse in system operation” [6].

This requirement suggests that the TFM should document procedures for:

– analysis of system dumps;
– crash recovery and restart in a secure state;
– repair of damaged TCB data structures (e.g., labels);
– changes of configuration parameters (e.g., table sizes);
– consistency checking procedures.

7.6 Requirements of Security Class A1

Although no additional explicit TFM requirements beyond those required for B3
are included here, the TFM should define procedures for trusted distribution
consistent with the [6] requirements.

GLOSSARY

Access –
A specific type of interaction between a subject and an object that
results in the flow of information from one to the other.
Account Administrator –
An administrative role or user assigned to maintain accounting files,
tools, user accounts, and system statistics.
Administrative User –
A user assigned to supervise all or a portion of an AIS system.
Approval/Accreditation –
The official authorization that is granted to an AIS system to process
sensitive information in its operational environment, based upon
comprehensive security evaluation of the system’s hardware, firmware,
and software security design, configuration, and implementation and of
the other system procedural, administrative, physical, TEMPEST,
personnel, and communications security controls.
Audit –
To conduct the independent review and examination of system records and
activities.
Audit Event Selection –
Selection, by authorized personnel, of the auditable events that are to
be recorded on the audit trail.
Audit Mechanism –
The part of the TCB used to collect, review, and/or examine system
activities.
Audit Post Processing –
Processing, by authorized personnel, of specified events that had been
recorded on the audit trail.
Audit Trail –
A chronological record of system activities that is sufficient to
enable the reconstruction, reviewing, and examination of the sequence
of environments and activities surrounding or leading to an operation,
a procedure, or an event in transaction from its inception to final
results.
Auditable Event –
Any event that can be selected for inclusion in the audit trail. These
events should include, in addition to security-relevant events, events
taken to recover the system after failure and any events that might
prove to be security relevant at a later time.
Auditor –
An authorized individual, or role, with administrative duties, which
include selecting the events to be audited on the system, setting up
the audit flags that enable the recording of those events, and
analyzing the audit trail.
Authenticate –
(1) To verify the identity of a user, device, or other entity in a
computer system, often as a prerequisite to allowing access to
resources in a system. (2) To verify the integrity of data that has
been stored, transmitted, or otherwise exposed to possible unauthorized
modification.
Authenticated User –
A user who has accessed an AIS system with a valid identifier and
authenticator.
Automated Information System (AIS) –
An assembly of computer hardware, firmware, and software configured to
collect, create, communicate, compute, disseminate, process, store, and
/or control data or information.
Bandwidth –
A characteristic of a communication channel that is the amount of
information that can be passed through it in a given amount of time,
usually expressed in bits per second.
Category –
A restrictive label that has been applied to classified or unclassified
data as a means of increasing the protection of the data and further
restricting access to the data.
Channel –
An information transfer path within a system. May also refer to the
mechanism by which the path is effected.
Covert Channel –
A communication channel that allows two cooperating processes to
transfer information in a manner that violates the system’s security
policy. Synonymous with Confinement Channel.
Covert Storage Channel –
A covert channel that involves the direct or indirect writing of a
storage location by one process and the direct or indirect reading of
the storage location by another process. Covert storage channels
typically involve a finite resource (e.g., sectors on a disk) that is
shared by two subjects at different security levels.
Covert Timing Channel –
A covert channel in which one process signals information to another by
modulating its own use of system resources (e.g., Central Processing
Unit time) in such a way that this manipulation affects the real
response time observed by the second process.
Data –
Information with a specific physical representation.
Data Integrity –
The state that exists when computerized data is the same as that in the
source documents and has not been exposed to accidental or malicious
alteration or destruction.
Descriptive Top-Level Specification (DTLS) –
A top-level specification that is written in a natural language (e.g.,
English), an informal program design notation, or a combination of the
two.
Discretionary Access Control –
A means of restricting access to objects based on the identity and
need-to-know of the user, process, and/or groups to which they belong.
The controls are discretionary in the sense that a subject with a
certain access permission is capable of passing that permission
(perhaps indirectly) on to any other subject.
Formal Security Policy Model –
A mathematically precise statement of a security policy. To be
adequately precise, such a model shall represent the initial state of a
system, the way in which the system progresses from one state to
another, and a definition of a “secure” state of the system. To be
acceptable as a basis for a TCB, the model shall be supported by a
formal proof that if the initial state of the system satisfies the
definition of a “secure” state and if all assumptions required by the
model hold, then all future states of the system will be secure. Some
formal modeling techniques include state transition models, temporal
logic models, denotational semantics models, and algebraic
specification models.
Formal Top-Level Specification (FTLS) –
A top-level specification that is written in a formal mathematical
language to allow theorems showing the correspondence of the system
specification to its formal requirements to be hypothesized and
formally proven.
Functional Testing –
The portion of security testing in which the advertised features of a
system are tested, under operational conditions, for correct operation.
Least Privilege –
The principle that requires that each subject in a system be granted
the most restrictive set of privileges (or lowest clearance) needed for
the performance of authorized tasks. The application of this principle
limits the damage that can result from accident, error, or unauthorized
use.
Mandatory Access Control –
A means of restricting access to objects based on the sensitivity (as
represented by a label) of the information contained in the objects and
the formal authorization (i.e., clearance) of subjects to access
information of such sensitivity.
Multilevel Device –
A device that is used in a manner that permits simultaneous processing
of data of two or more security levels without risk of compromise. To
accomplish this, sensitivity labels are normally stored on the same
physical medium and in the same form (i.e., machine-readable or
human-readable) as the data being processed.
Multilevel Secure –
A class of system containing information with different sensitivities
that simultaneously permits access by users with different security
clearances and need-to-know, but prevents users from obtaining access
to information for which they lack authorization.
Object –
A passive entity that contains or receives information. Access to an
object potentially implies access to the information it contains.
Examples of objects are records, blocks, pages, segments, files,
directories, directory trees, and programs, as well as bits, bytes,
words, fields, processors, video displays, keyboards, clocks, printers,
and network nodes.
Operator –
An administrative role or user assigned to perform routine maintenance
operations of the AIS system and to respond to routine user requests.
Output –
Information that has been exported by a TCB.
Password –
A private character string that is used to authenticate an identity.
Process –
A program in execution. It is completely characterized by a single
current execution point (represented by the machine state) and address
space.
Read –
A fundamental operation that results only in the flow of information
from an object to a subject.
Read Access (Privilege) –
Permission to read information.
Security Administrator –
An administrative role or user responsible for the security of an AIS
and having the authority to enforce the security safeguards on all
others who have access to the AIS (with the possible exception of the
Auditor).
Security Level –
The combination of a hierarchical classification and a set of
non-hierarchical categories that represents the sensitivity of
information.
Security Policy –
The set of laws, rules, and practices that regulate how an organization
manages, protects, and distributes sensitive information.
Security Policy Model –
A formal (informal, in the case of B1) presentation of the security
policy enforced by the system. It must identify the set of rules and
practices that regulate how a system manages, protects, and distributes
sensitive information.
Security-Relevant Event –
Any event that attempts to change the security state of the system
(e.g., change the DAC, change the security level of the subject, change
user password). Also, any event that attempts to violate the security
policy of the system (e.g., too many attempts to log in, attempts to
violate the MAC limits of a device, attempts to downgrade a file).
Security Testing –
A process used to determine that the security features of a system are
implemented as designed and that they are adequate for a proposed
application environment.
Sensitive Information –
Any information the loss, misuse, or unauthorized access to or
modification of which could affect the national interest or the conduct of
Federal programs, or the privacy to which individuals are entitled
under Section 552a of Title 5, U.S. Code, but that has not been
specifically authorized under criteria established by an Executive
order or act of Congress to be kept classified in the interest of
national defense or foreign policy.
Sensitivity Label –
A piece of information that represents the security level of an object
and that describes the sensitivity (e.g., classification) of the data
in the object. Sensitivity labels are used by the TCB as the basis for
MAC decisions.
Subject –
An active entity, generally in the form of a person, process, or device
that causes information to flow among objects or changes in the system
state. Technically, a process/domain pair.
Subject Security Level –
A subject’s security level is equal to the security level of the
objects to which it has both read and write access. A subject’s
security level shall always be dominated by the clearance of the user
associated with the subject.
System Programmer –
An administrative role or user responsible for the trusted system
distribution, configuration, installation, and non-routine maintenance.
System Security Map –
A map defining the correspondence between the binary and ASCII formats
of security levels (e.g., between binary format of security levels and
sensitivity labels).
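
A minimal sketch, with an invented encoding, of such a correspondence
table in Python:

    # Hypothetical encoding; actual binary formats are system-defined.
    BITS_TO_LABEL = {0b00: "UNCLASSIFIED", 0b01: "CONFIDENTIAL",
                     0b10: "SECRET",       0b11: "TOP SECRET"}
    LABEL_TO_BITS = {label: bits for bits, label in BITS_TO_LABEL.items()}

    assert BITS_TO_LABEL[0b10] == "SECRET" and LABEL_TO_BITS["SECRET"] == 0b10
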
Top-Level Specification (TLS) –
A non-procedural description of system behavior at the most abstract
level; typically, a functional specification that omits all
implementation details.
Trap Door –
A hidden software or hardware mechanism that can be triggered to
permit system protection mechanisms to be circumvented. It is
activated in some innocent-appearing manner (e.g., special “random” key
sequence at a terminal).
Trojan Horse –
A computer program with an apparently or actually useful function that
contains additional (hidden) functions that surreptitiously exploit the
legitimate authorizations of the invoking process to the detriment of
security; for example, making a “blind copy” of a sensitive file for
the creator of the Trojan horse.
Trusted Computer System –
A system that employs sufficient hardware and software assurance
measures to allow its use for the simultaneous processing of a range of
sensitive or classified information.
Trusted Computing Base (TCB) –
The totality of protection mechanisms within a computer system
— including hardware, firmware, and software — the combination of
which is responsible for enforcing a security policy. A TCB consists of
one or more components that together enforce a unified security policy
over a product or system. The ability of a TCB to enforce a security
policy correctly depends solely on the mechanisms within the TCB and on
the correct input by system administrative personnel of parameters
(e.g., a user’s clearance) related to the security policy.
Trusted Path –
A mechanism by which a person at a terminal can communicate directly
with the TCB. This mechanism can only be activated by the person or the
TCB and cannot be imitated by untrusted software.
User –
Person or process accessing an AIS either by direct connections (i.e.,
via terminals) or indirect connections (i.e., preparing input data or
receiving output that is not reviewed for content or classification by a
responsible individual).
Verification –
The process of comparing two levels of system specification for proper
correspondence (e.g., security policy model with top-level
specification, TLS with source code, or source code with object code).
This process may or may not be automated.
Write –
A fundamental operation that results only in the flow of information
from a subject to an object.
Write Access (Privilege) –
Permission to write an object.

REFERENCES

[1]
National Computer Security Center, A Guide to Understanding Audit in
Trusted Systems, NCSC-TG-001, Version 2, June 1988.
[2]
National Computer Security Center, A Guide to Understanding
Discretionary Access Control in Trusted Systems, NCSC-TG-003,
Version 1, September 1987.
[3]
Gligor V. D., J. C. Huskamp, S. R. Welke, C. J. Linn, W. T. Mayfield,
Traditional Capability-Based Systems: An Analysis of their Ability to
Meet the Trusted Computer Security Evaluation Criteria, Institute for
Defense Analyses, IDA Paper P-1935, February 1987.
[4]
Department of Defense, Password Management Guideline, CSC-STD-002-85,
April 1985.
[5]
National Computer Security Center, The Rating Maintenance Phase,
NCSC-TG-013-89, 23 June 1989.
[6]
National Computer Security Center, Department of Defense Trusted
Computer System Evaluation Criteria, DoD 5200.28-STD, 1985.
[7]
National Computer Security Center, Guidelines for Trusted Facility
Management, NCSC-TG-015-89, 18 October 1989.

Federal Criteria for Information Technology Security Volume II

FEDERAL CRITERIA

for

INFORMATION TECHNOLOGY SECURITY

VOLUME II

Registry of Protection Profiles

Version 1.0

December 1992

This document is undergoing review and 
is subject to modification or withdrawal.

The contents of this document should not 
be referenced in other publications.

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY

&

NATIONAL SECURITY AGENCY

NOTES TO REVIEWERS

This is the first public draft of work in progress by the joint 
National Institute of Standards and Technology (NIST) and 
National Security Agency (NSA) Federal Criteria (FC) Project. 
This draft Federal Criteria for Information Technology Security 
is provided for preliminary review and comment by members of the 
national and international computer security community.  The 
document will evolve into a new Federal Information Processing 
Standard (FIPS) intended principally for use by the United States 
Federal Government, and also by others as desired and 
appropriate.  The FIPS is intended to replace the Trusted Computer 
System Evaluation Criteria (TCSEC) or "Orange Book."

Our objectives in presenting this draft material are threefold: 
first, to give the community a clear view of the FC Project's 
direction in moving beyond the TCSEC method of expressing 
requirements in order to meet new IT security challenges; second, 
to obtain feedback on the innovative approaches taken, the method 
of presentation, and granularity; and third, to make a 
substantial contribution to the dialogue among nations leading to 
the harmonization of IT security requirements and evaluations.

It is important to note a few things about this preliminary FC 
draft. First, it is a new and unpolished document and not intended 
for any purpose except review and comment. Organizations should 
not adopt any contents of this draft document for their use.  It 
is anticipated that the document will undergo extensive revision 
as it works its way through the public FIPS approval process over 
the next year or two.  Second, the FC is being distributed in two 
volumes. Volume I addresses the criteria development process and 
is intended principally for use by developers of protection 
profiles. The information in Volume I may also be of use to IT 
product manufacturers and product evaluators. Volume II presents 
completed IT product security criteria in the form of accepted 
protection profiles.

The protection profiles associated with the final FIPS will help 
consumers identify types of products that meet the protection 
requirements within their particular organizations and 
environments.  However, the FIPS will be supplemented by a series 
of implementing guidance documents, many of which will be 
designed to help consumers make cost-effective decisions about 
obtaining and appropriately using security-capable IT products.

As a preliminary draft of the new FC-FIPS, this document is not 
intended for general distribution or compliance.  The document 
should not be considered a complete or finished product.  Your 
comments will be used by the Federal Criteria Working Group to 
help raise the maturity level of this material prior to being 
circulated for further public comment in the FIPS development 
process.

ADDITIONAL NOTES TO REVIEWERS

Reviewers who provide substantive comments on the enclosed draft 
FC by March 31, 1993 will be invited to attend an Invitational 
Workshop on the Federal Criteria. This two-day workshop will be 
held in the last week of April 1993 in the Washington-Baltimore 
area at a location to be announced. All comments received by the 
cut-off date will be correlated into major themes for discussion 
by break-out groups at the workshop. The results will be used as 
input into the process of re-drafting the FC for a second round of 
comment prior to its being formalized as a FIPS.

Please send your comments (electronic format preferred) to 
Nickilyn Lynch at the U.S. National Institute of Standards and 
Technology (NIST), Computer Systems Laboratory (CSL).

Phone:	(301) 975-4267
FAX:	(301) 926-2733.

(Internet) Electronic Mail:

	lynch@csmes.ncsl.nist.gov

Postal or Express Mail
(Hardcopy or 3.5", 1.44M diskette in MSDOS, Macintosh, or Sun 
format):

	Federal Criteria Comments
	Attn: Nickilyn Lynch
	NIST/CSL, Bldg 224/A241
	Gaithersburg, MD 20899

 NIST   National Institute of Standards and Technology

    Gaithersburg, MD 20899

COMMERCIAL SECURITY REQUIREMENTS

 FOR

 MULTI-USER OPERATING SYSTEMS

 A family of Protection Profiles for the

 Federal Criteria for Information Technology Security

Issue 1.1

January  1993

Supersedes Minimum Security Requirements

for Multi-User Operating Systems

Computer Security Division

Computer Systems Laboratory

National Institute of Standards and Technology

Chapter  1.  
Commercial Security Requirements (CSR)
1.1  Introduction 
1.1.1  CS Description 
1.1.2  Background 
1.1.2.1  Trusted Computer System Evaluation Criteria (TCSEC) 
1.1.2.2  Commercial Security Efforts 
1.1.2.3  System Security Study Committee 
1.1.2.4  Minimum Security Functionality Requirements (MSFR) 
1.1.2.5  Commercial Security (CS) requirements 
1.1.3  Document Organization 
COMMERCIAL SECURITY 1 (CS1)
CS1 Rationale
2.2  Introduction 
2.2.1  Protection Philosophy 
2.2.1.1  Access Authorization 
2.2.1.2  Accountability 
2.2.1.2.1  Identification and Authentication 
2.2.1.2.2  Audit 
2.2.1.3  Assurance 
2.2.2  Intended Method of Use 
2.2.3  Environmental Assumptions 
2.2.4  Expected Threats 
CS1 Functionality
3.  Introduction 
3.1  Identification & Authentication 
3.2  Audit 
3.3  Access Control 
3.4  Reference Mediation 
3.5  TCB Protection 
3.6  TCB Self-Checking 
CS1 Assurance
4.  Introduction 
4.1  TCB Property Definition 
4.2  TCB Element Identification 
4.3  TCB Interface Definition 
4.4  Developer Functional Testing 
4.5  User's Guidance 
4.6  Administrative Guidance 
4.7  Evidence of TCB Protection Properties 
4.8  Evidence of Product Development 
4.9  Evidence of Functional Testing 
4.10  Test Analysis 
4.11  Independent Testing 
COMMERCIAL SECURITY 2 (CS2)
CS2 Rationale
2.12  Introduction 
2.12.1  Protection Philosophy 
2.12.1.1  Access Authorization 
2.12.1.1.1  System Entry 
2.12.1.1.2  Subject and Object Access Mediation 
2.12.1.1.3  Privileges 
2.12.1.2  Accountability 
2.12.1.2.1  Identification and Authentication 
2.12.1.2.2  Audit 
2.12.1.3  Assurance 
2.12.1.4  Intended Method of Use 
2.12.2  Environmental Assumptions 
2.12.3  Expected Threats 
CS2 Functionality
3.  Introduction 
3.1  Identification & Authentication 
3.2  System Entry 
3.3  Trusted Path 
3.4  Audit 
3.5  Access Control 
3.6  Security Management 
3.7  Reference Mediation 
3.8  Logical TCB Protection 
3.9  TCB Self-Checking 
3.10  TCB Initialization and Recovery 
3.11  Privileged Operation 
3.12  Ease-of-TCB-Use 
CS2 Assurance
4.  Introduction 
4.1  TCB Property Definition 
4.2  TCB Element Identification 
4.3  TCB Interface Definition 
4.4  TCB Structuring Support 
4.5  Developer Functional Testing 
4.6  User's Guidance 
4.7  Administrative Guidance 
4.8  Flaw Remediation Procedures 
4.9  Trusted Generation 
4.10  Evidence of TCB Protection Properties 
4.11  Evidence of Product Development 
4.12  Evidence of Functional Testing 
4.13  Evidence of Product Support 
4.14  Test Analysis 
4.15  Independent Testing 
4.16  Operational Support Review 
COMMERCIAL SECURITY 3 (CS3)
CS3 Rationale
2.17  Introduction 
2.17.1  Protection Philosophy 
2.17.1.1  Access Authorization 
2.17.1.1.1  System Entry 
2.17.1.1.2  Subject and Object Access Mediation 
2.17.1.1.3  Privileges 
2.17.1.2  Accountability 
2.17.1.2.1  Identification and Authentication 
2.17.1.2.2  Audit 
2.17.1.3  Availability of Service 
2.17.1.4  Assurance 
2.17.1.5  Intended Method of Use 
2.17.2  Environmental Assumptions 
2.17.3  Expected Threats 
CS3 Functionality
3.  Introduction 
3.1  Identification & Authentication 
3.2  System Entry 
3.3  Trusted Path 
3.4  Audit 
3.5  Access Control 
3.6  Security Management 
3.7  Reference Mediation 
3.8  Resource-Allocation Requirements 
3.9  TCB Protection 
3.10  Physical TCB Protection 
3.11  TCB Self-Checking 
3.12  TCB Initialization and Recovery 
3.13  Privileged Operation 
3.14  Ease-of-TCB-Use 
CS3 Assurance
4.  Introduction 
4.1  TCB Property Definition 
4.2  TCB Element Identification 
4.3  TCB Interface Definition 
4.4  Developer Functional Testing 
4.5  Penetration Analysis 
4.6  User's Guidance 
4.7  Administrative Guidance 
4.8  Flaw Remediation Procedures 
4.9  Trusted Generation 
4.10  Life Cycle Definition 
4.11  Configuration Management 
4.12  Evidence of TCB Protection Properties 
4.13  Evidence of Product Development 
4.14  Evidence of Functional Testing 
4.15  Evidence of Penetration Analysis 
4.16  Evidence of Product Support 
4.17  Test Analysis 
4.18  Independent Testing 
4.19  Development Environment Review 
4.20  Operational Support Review 
4.21  Design Analysis 
GLOSSARY 
CSR References

Chapter  1.
Commercial Security Requirements (CSR)

1.1	Introduction

Government and commercial institutions rely heavily on 
information technology (IT) products to meet their 
operational, financial, and information requirements. The 
corruption, unauthorized disclosure, or theft of 
electronically-maintained resources can have a disruptive 
effect on an organization's operations as well as serious and 
immediate financial, legal, and public confidence impact.

Products conforming to the Commercial Security (CS) 
requirements contained in this document are intended to be 
useful to a broad base of users in the private, civil 
government, and defense sectors. This includes application 
developers, end users, and system administrators. The 
Protection Profiles specified in this document provide 
organizations with three sets of security requirements, 
defined as CS1, CS2, and CS3, with CS3 offering the highest 
degree of trust.

The Protection Profiles as a whole specify "baseline" 
requirements that meet generally accepted security 
expectations for a class of products colloquially called 
"general purpose, multi-user operating systems." These 
requirements apply to multi-user workstations, minicomputers, 
and mainframes. Most required mechanisms are configurable so 
that customers can satisfy their unique security policies and 
objectives.

The intent of the Protection Profiles is to promote the wide 
availability of products possessing security enforcing 
functions that are of such broad applicability and 
effectiveness that they become part of the "normal" mode of 
operation. It is anticipated that vendors will respond to user 
expectations by increasing the availability of operating 
systems that meet these general security requirements. These 
requirements represent the integration of a number of security 
requirement specifications from various sources into a single 
set that is expected to have wide acceptance.

1.1.1	CS Description

The Protection Profiles address the security features and 
their development. The Protection Profiles were written to 
meet several objectives: to serve as a "metric" for the amount 
of security present in a computer system processing sensitive 
information; to provide guidance to the developers as to what 
security features to build into their planned products; and 
to provide a method for uniformly specifying security 
requirements in acquisition specifications. 

The CS requirements are divided into three hierarchical 
Protection Profiles. The profiles are CS1, CS2, and CS3, with 
CS3 providing the greatest degree of security. Each profile 
represents a level of trust that can be placed in a product 
and specifies a collection of requirements in the form of 
features and assurances. Each profile includes most of the 
features and assurances of the previous profile along with 
additional, more stringent features and assurances. The 
reasoning for requirements leveling for each Protection 
Profile can be found in the rationale in Chapter 2. This 
reasoning is based on the overall effectiveness of each 
Protection Profile in addressing the threats identified in 
that chapter. 

The Protection Profiles specify computer-based protection 
mechanisms for the design, use, and management of information 
systems. The Protection Profiles include technical measures 
that can be incorporated into multi-user, remote-access, 
resource-sharing, and information sharing computer systems. 
CS-conformant computer products provide system administrators 
with tools to control the sharing of information and resources 
based primarily on the identity of users, or, in the case of 
CS3, the role associated with the user, as well as the time 
of day, terminal location, or type of access requested. The 
technical measures also provide tools to protect against both 
common user actions that may compromise security and 
deliberate penetration attempts by "hackers." In addition, 
there are requirements to log events that may impact the 
security of either the product or the information that it is 
processing. All functionality requirements are based on 
existing and well understood security practices.

1.1.2	Background

These Protection Profiles have been developed by the CS 
Working Group of the Federal Criteria Project under NIST 
leadership with a high level of private sector participation. 
They are based on the Trusted Computer System Evaluation 
Criteria (TCSEC) [1] C2 criteria class, with additions from 
current computer industry practice, from commercial security 
requirements specifications, and from the on-going work of the 
Federal Criteria Project. Their development has also been 
guided by international security standards efforts and by the 
recommendations of the System Security Study Committee.

The following sub-sections provide descriptions of each of 
these sources, and give further background on the motivation 
for and development of the Protection Profiles.

1.1.2.1	Trusted Computer System Evaluation Criteria (TCSEC) 

The TCSEC [1], originally published in 1983 and revised in 
1985, was the first publicly available document that expressed 
general security requirements that could apply to a specific 
class of technology (e.g., operating systems). It represents 
the culmination of many years of effort to address Information 
Technology (IT) security issues within the Department of 
Defense (DoD) classified world. The TCSEC is made up of IT 
security features and assurances that have been derived and 
engineered to support a very specific DoD security policy - 
the prevention of unauthorized disclosure of classified 
information (i.e., confidentiality). 

During the past few years, commercial enterprises and 
government organizations processing sensitive information 
have begun to pay increasing attention to IT security needs. 
Although the TCSEC-motivated security features have proven 
valuable in addressing their security problems, these features 
have often been viewed as imperfect and incomplete, and as having 
been specified only because a more appropriate set of security 
functions was not available.

The Protection Profiles are intended to be the first step 
in "filling this gap" by providing a set of security 
requirements appropriate for commercial enterprises and 
government organizations concerned with protecting sensitive 
information.

1.1.2.2	Commercial Security Efforts

Recognizing that the TCSEC was a valuable starting point, 
but not sufficient for their security needs, two commercial 
companies - Bellcore and American Express Travel Related 
Services (TRS) - independently initiated efforts to develop 
security requirements for their environments. At Bellcore, 
these efforts resulted in a Bellcore Standard Operating 
Environment Security Requirements [3] document and at TRS the 
efforts resulted in the internal C2-Plus company security 
standard.

The Bellcore document was developed to meet the security 
needs of Bellcore and its client companies, the Regional Bell 
Operating Companies (RBOCs). The requirements specified in 
the Bellcore document were derived both from commonly 
recurring security requirements for RBOC computer 
applications and from experiences of Bellcore's computer 
security assessment group.

 In developing the C2-Plus document, TRS found that, while 
the TCSEC met many requirements of the commercial sector, the 
prescribed features at the C2-level (and its F2-level 
counterpart in the ITSEC [2]) fell short in several areas that 
were either introduced at higher TCSEC levels or were not 
addressed at all in the respective standards. Consequently, 
the TRS document was developed as an enhanced, commercialized 
version of the C2-level security requirements of the TCSEC. 

Using the TRS document as input, the International 
Information Integrity Institute (I-4), a consortium of large 
international corporations, developed the Commercial 
International Security Requirements (CISR) [4]. The rationale 
for the development of the CISR includes the following:

"Military-oriented information security 
requirements (i. e., TCSEC) are not suitable in 
many respects for the needs of international 
businesses." [4]

The final version of the CISR was published in April 1992.

1.1.2.3	System Security Study Committee

The System Security Study Committee was formed in 1988 in 
response to a request from the Defense Advanced Research 
Projects Agency (DARPA) to address the security and 
trustworthiness of U.S. computing and communications systems. 
The Committee, which was composed of 16 individuals from 
industry and academia, including computer and communications 
security researchers, practitioners, and software engineers, 
was charged with developing a national research, engineering, 
and policy agenda to help the United States achieve a more 
trustworthy computing technology base by the end of the 
century. In 1991, the Committee published the Computers at 
Risk [5] report, which presents the Committee's assessment of 
key computer and communications security issues and its 
recommendations for enhancing the security and 
trustworthiness of the U.S. computing and communications 
infrastructure. 

The development of the Protection Profiles was guided by 
one of the recommendations from this report that:

"...a basic set of security-related principles for 
the design, use, and management of systems that are 
of such broad applicability and effectiveness that 
they ought to be a part of any system with 
significant operational requirements" [5] should be 
developed.

1.1.2.4	Minimum Security Functionality Requirements (MSFR)

The second draft of the Minimum Security Functionality 
Requirements for Multi-User Operating Systems (MSFR) [10] was 
published in January of 1992. The MSFR was developed as part 
of a project to stimulate the development of IT products 
broadly useful to the diverse security needs of the US 
Government (civilian and military) and the private sector. 

The MSFR specified the minimum level of security that NIST 
and NSA felt should be available in any commercially available 
multi-user operating system. The MSFR represents an extension 
of the TCSEC controlled access protection class, level C2, 
with additions based on current industry practice and security 
requirements specifications developed in the commercial 
environment. Much of the MSFR is derived from the TCSEC, the 
Bellcore Standard Operating Environment Security 
Requirements, and the CISR with overall guidance from the 
Computers at Risk report [5]. 

1.1.2.5	Commercial Security (CS) requirements

To help support the Federal Criteria, the CS Working Group 
was tasked with developing a family of Protection Profiles, 
based on an updated version of the MSFR. The three Protection 
Profiles included in this document have been developed in 
compliance with the prescribed approach and format of the 
Federal Criteria [11]. Components of the Federal Criteria were 
selected for each Protection Profile and were enhanced with 
refinements and assignments that were taken from the November 
1992 version of the MSFR. The Protection Profiles are intended 
to satisfy the most common security needs of computer system 
users. 

1.1.3	Document Organization

Chapter 1 (this chapter) provides introductory and 
background information. The rest of this document is divided 
into three Protection Profiles, CS1, CS2, and CS3. The 
development of these Protection Profiles is in accordance 
with the Protection Profile format specified by the Federal 
Criteria. Chapter 2 provides the rationale for the selection 
of the security features and assurance evidence. This 
rationale also includes descriptions of the intended use of 
the product, the environmental assumptions that were made for 
a CS-compliant system, and the expected threats. Chapter 3 
specifies the security functionality that a CS-compliant 
system is required to provide, and Chapter 4 specifies the 
assurance requirements. At the end of the CS requirements, 
there is a Glossary and a list of references. 

COMMERCIAL SECURITY 1 (CS1)

 Products that comply with this Protection Profile 
provide access control capabilities to separate 
users and data based on finely grained access con-
trols. They incorporate credible controls capable of 
enforcing access limitations on an individual 
basis, i.e., ostensibly suitable for allowing users 
to protect sensitive information and to 
keep other users from reading or destroying their 
data. Users are individually accountable for their 
actions through login procedures, auditing of secu-
rity relevant events, and resource isolation. This 
CS1 Protection Profile is equivalent to a Class C2 
- Controlled Access Protection from the TCSEC [1]. 
It consists of TCSEC requirements plus those eval-
uation interpretations that a product must meet 
before it can be evaluated at the C2 level.

COMPONENT SUMMARY: 

            CS1 Functional Component Summary
.------------------------------------------------------.
|                                  | Component |       |
| Component Name                   |   Code    | Level |           
|======================================================|
| Security Policy Support:                             |
|----------------------------------+-----------+-------|
|  Identification & Authentication |    I&A    |   1   |
|----------------------------------+-----------+-------|
|  Audit                           |    AD     |   1   |
|----------------------------------+-----------+-------|
|  Access Control                  |    AC     |   1   |
|----------------------------------+-----------+-------|
|  Reference Mediation             |    RM     |   1   |
|----------------------------------+-----------+-------|
|  TCB Protection                  |    P      |   1   |
|----------------------------------+-----------+-------|
|  Self Checking                   |    SC     |   1   |
`------------------------------------------------------'

      CS1 Assurance Package Summary
.---------------------------------------.
| Assurance Components           |  T1  |
|================================|======|
| Development Assurance Components      |     
|=======================================|
| Development Process                   |
|--------------------------------+------|
| TCB Property Definition        | PD-1 |
|--------------------------------+------|
| TCB Design                            |
|--------------------------------+------|
|   TCB Element Identification   | ID-1 |
|--------------------------------+------|
|   TCB Interface Definition     | IF-1 |
|--------------------------------+------|
|   TCB Modular Decomposition    | ---- |
|--------------------------------+------|
|   TCB Structuring Support      | ---- |
|--------------------------------+------|
|   TCB Design Disciplines       | ---- |
|--------------------------------+------|
| TCB Implementation Support     | ---- |
|--------------------------------+------|
| TCB Testing and Analysis              |
|--------------------------------+------|
|   Functional Testing           | FT-1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Operational Support                   |
|--------------------------------+------|
| User Security Guidance         | UG-1 |
|--------------------------------+------|
| Administrative Guidance        | AG-1 |
|--------------------------------+------|
| Trusted Generation             | ---- |
|--------------------------------+------|
| Development Environment               |
|--------------------------------+------|
| Life Cycle Definition          | ---- |
|--------------------------------+------|
| Configuration Management       | ---- |
|--------------------------------+------|
| Trusted Distribution           | ---- |
|--------------------------------+------|
| Development Evidence                  |
|--------------------------------+------|
| TCB Protection Properties      | EPP1 |
|--------------------------------+------|
| Product Development            | EPD1 |
|--------------------------------+------|
| Product Testing & Analysis            |
|--------------------------------+------|
|   Functional Testing           | EFT1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Product Support                | ---- |
`---------------------------------------'
|=======================================|
| Evaluation Assurance Components       |
|=======================================|
| Testing                               |
|--------------------------------+------|
|   Test Analysis                | TA-1 |
|--------------------------------+------|
|   Independent Testing          | IT-1 |
|--------------------------------+------|
| Review                                |
|--------------------------------+------|
|   Development Environment      | ---- |
|--------------------------------+------|
|   Operational Support          | ---- |
|--------------------------------+------|
| Analysis                              |
|--------------------------------+------|
|   Protection Properties        | ---- |
|--------------------------------+------|
|   Design                       | ---- |
|--------------------------------+------|
|   Implementation               | ---- |
`---------------------------------------'

CS1 Rationale

2.2	Introduction

As outlined in the Federal Criteria, this rationale de-
scribes the protection philosophy, how the security features 
are intended to be used, the assumptions about the environment 
in which a compliant product is intended to operate, the 
threats within that environment, and the security features and 
assurances that counter these threats.

The component levels chosen for the CS1 Pro-
tection Profile are equivalent to Class C2 of the TCSEC [1]. 
They consist of TCSEC requirements plus those evaluation in-
terpretations that a product must meet before it can be eval-
uated at the C2 level.

2.2.1	Protection Philosophy

Any discussion of protection necessarily starts from a pro-
tection philosophy, i.e., what it really means to call the 
product "secure." In general, products will control access to 
information and other resources through the use of specific 
security features so that only properly authorized individu-
als or processes acting on their behalf will be granted ac-
cess. For CS1, three fundamental requirements are derived from 
this statement of protection:

o	Access authorization

o	Accountability

o	Assurance 

The totality of the functionality that enforces the access 
authorization and accountability protection philosophy is 
comprised of the hardware, software, and firmware of the 
Trusted Computing Base (TCB). CS1 requires the TCB to be pro-
tected from external interference and tampering so that it is 
effective at countering identified threats. The assurance 
protection philosophy is comprised of the development pro-
cess, operational support, development evidence, and evalua-
tion process assurances. Each of these is explained below.

2.2.1.1	Access Authorization

The access authorization portion of the philosophy of pro-
tection for this profile addresses subject and object access 
mediation. CS1 provides protected access to resources and ob-
jects. As defined in the TCSEC and specified in this profile, 
access control permits system users and the processes that 
represent them to allow or disallow other users access to 
objects under their control:

Access control is "a means of restricting access to 
objects based on the identity of subjects and/or 
groups to which they belong. The controls are dis-
cretionary in the sense that a subject with a cer-
tain access permission is capable of passing that 
permission (perhaps indirectly) on to any other 
subject." [1]

These controls permit the granting and revoking of access 
privileges to be left to the discretion of the individual us-
ers. 
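
A minimal sketch, assuming a toy in-memory access control list with
invented names, of discretionary granting and revoking by an object's
owner:

    # Illustrative only; not drawn from any CS1-conformant product.
    owner_of = {"payroll.dat": "alice"}
    acl = {}                                  # (object, user) -> set of rights

    def grant(requester, obj, user, right):
        # Discretionary control: the owner decides who else gets access.
        if owner_of[obj] == requester:
            acl.setdefault((obj, user), set()).add(right)

    def revoke(requester, obj, user, right):
        if owner_of[obj] == requester:
            acl.get((obj, user), set()).discard(right)

    grant("alice", "payroll.dat", "bob", "read")
    assert "read" in acl[("payroll.dat", "bob")]
    revoke("alice", "payroll.dat", "bob", "read")
    assert "read" not in acl[("payroll.dat", "bob")]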

2.2.1.2	Accountability

The accountability portion of the philosophy of protection 
for this profile addresses user Identification and Authenti-
cation (I&A) and requirements for security auditing. Each of 
these is explained below.

2.2.1.2.1	Identification and Authentication

User identification is required to support access control 
and security auditing. This includes the capability to estab-
lish, maintain, and protect a unique identifier for each au-
thorized user. User identification is functionally dependent 
on authentication. Authentication is a method of validating a 
person as a legitimate user.

2.2.1.2.2	Audit

For most secure products, a capability must exist to audit 
the security relevant events. As each user performs security 
relevant tasks, the product must record the user identifier, 
the action performed, and the result in a security log. For 
CS1 compliant products, a capability is specified to allow a 
system administrator to access and evaluate audit informa-
tion. This capability provides a method of protection in the 
sense that security relevant events that occur within a com-
puter system can be logged and the responsible user held ac-
countable for his/her actions. Audit trails are used to detect 
and deter penetration of a computer system and to reveal ac-
tivity that identifies misuse.

CS1 provides for an effective audit mechanism by supporting 
the following basic security characteristics. It provides the 
ability to:

o	 review the use of I&A mechanisms;

o	 discover the introduction of objects into a user's 
address space;

o	 discover the deletion of objects; and

o	 discover actions taken by computer operators and sys-
tem administrators.

2.2.1.3	Assurance

Assurance addresses threats and vulnerabilities that can 
affect the product during its development and it addresses 
evaluation assurance. Assurance Package T1 was selected for 
the CS1 level. This minimal assurance level is intended to 
include most commercial computer products that incorporate 
protection components today. Minimal assurance refers to the 
fact that this package includes the lowest levels of develop-
ment and evaluation assurance components and only those com-
ponents deemed important to provide the necessary minimal 
understanding of the product. 

The intent of the product development assurance for this 
package is to establish that the external behavior of the 
product conforms to its user level and administrative docu-
mentation without any analysis of the internal structure of 
the product's TCB. For this reason, only the claimed TCB pro-
tection properties, TCB interface description, and TCB ele-
ment list are required to enable security functional testing. 

The intent of the operational support assurance for this 
package is to establish a minimal level of user and adminis-
trative guidance and product information that enables the cor-
rect product installation, use of product security features, 
and remediation of flaws. 

The development evidence is commensurate with the assuranc-
es required. The intent is to require the type of assurance 
evidence that is generated during the normal commercial de-
velopment process. 

 Evaluation support assurance establishes that the product, 
and the context in which it is developed and supported, is 
commensurate with the development assurance requirements. At 
the T1 level, testing analysis and the requirement for inde-
pendent testing determine whether the product minimally 
meets the functional protection requirements. Operational 
support evaluation assurance determines whether the product 
documentation correctly describes the security relevant oper-
ations.

2.2.2	Intended Method of Use

All individual users (both administrative and non-adminis-
trative) are assigned a unique user identifier. This user 
identifier supports individual accountability and access con-
trol. The operating system authenticates the claimed identity 
of the user before allowing the user to perform any further 
actions.

A CS1 compliant product imposes controls on authorized us-
ers and on processes acting on their behalf to prevent users 
from gaining access to information and other resources for 
which they are not authorized. The product provides the capa-
bility for users to allow or disallow other users access 
to objects under their control. The objects are files that may 
be read or written to or programs which may be executed. The 
granularity of control is to the level of individual users 
(although groups made up of individual users may be specified) 
and individual objects. CS1 access controls permit the grant-
ing and revoking of access to be left to the discretion of the 
individual users.

Products that comply with CS1 specifications are intended 
to be used within the following operational constraints:

o	The information system is designed to be administered 
as a unique entity by a single organization.

o	The information system is designed to manage comput-
ing, storage, and input/output, and to control the sharing 
of resources among multiple users and computer pro-
cesses.

o	The administrative and non-administrative users are 
identified as distinct individuals.

o	The granting and revoking of access control permis-
sions are left to the discretion of individual users.

o	The information system provides facilities for real-
time interaction with users that have access to input/
output devices.

2.2.3	Environmental Assumptions

A product designed to meet the CS1 Protection Profile is 
intended to be a general purpose, multi-user operating system 
that runs on a workstation, minicomputer, or mainframe. 
CS1 compliant products are expected to be used in commercial 
and government environments. For government environments, CS1 
conforms to the TCSEC C2 class of trust [1]. The information 
being processed may be unclassified, sensitive-but-unclassi-
fied, or single-level classified, but not multi-level classi-
fied information.

The following specific environmental conditions have been 
assumed in specifying CS1:

o	The product hardware base (e.g., CPU, printers, ter-
minals, etc.), firmware, and software will be pro-
tected from unauthorized physical access.

o	There will be one or more personnel assigned to manage 
the product including the security of the information 
it contains.

o	The operational environment will be managed according 
to the operational environment documentation that is 
required in the assurance chapter of the Protection 
Profile. 

o	The IT product provides a cooperative environment for 
users to accomplish some task or group of tasks.

o	The processing resources of the IT product, including 
all terminals, are assumed to be located within user 
spaces that have physical access controls established.

2.2.4	Expected Threats

In general, the choice of which Protection Profile to 
use depends upon the level of security that is required for 
that particular organizational environment. The lowest level, 
the CS1 level, is intended for those commercial and government 
environments where all the system personnel are trusted and 
all the data on the system is at the same classification lev-
el. For example, a government agency where all personnel have 
a government clearance, all data is unclassified, and there 
are no outside network connections would be an ideal candidate 
for CS1, i.e., the threats to be countered are such that only 
a minimal level of trust is needed. However, most commercial 
and government environments are more complex and require a 
higher degree of trust. CS2 addresses the security needs for 
the mainstream commercial and government environments. It 
provides a higher level of trust for those organizations that 
need to enforce a security policy where there is no need for 
different classifications of data. CS3 is intended to provide 
the highest level of trust for commercial and government en-
vironments. It is intended to be used in those environments 
where a great deal of trust is required, such as in law en-
forcement agencies, nuclear facilities, or commercial air-
ports. It provides the strongest features, mechanisms, and 
assurances to counter these threats.

A product that is designed to meet the CS1 Protection Pro-
file and operate within its assumed environment will provide 
capabilities to counter threats. It should be noted, however, 
that although a product may faithfully implement all the fea-
tures and assurances specified in this Protection Profile, the 
complete elimination of any one threat should not be assumed.

The following threats have been assumed in specifying this 
CS1 Protection Profile:

1.	AN UNAUTHORIZED USER MAY ATTEMPT TO GAIN ACCESS TO THE 
SYSTEM

For CS1 compliant products, the threat of an unauthorized 
user gaining access to the system is primarily addressed by 
I&A. I&A features allow the TCB to verify the identity of in-
dividuals attempting to gain access to the system. This is 
accomplished through the use of passwords.

Although not a direct countermeasure, auditing requirements 
are specified at the CS1 level to provide the capability to 
perform an after-the-fact analysis of unauthorized system en-
try and login attempts. This provides an opportunity for the 
system administrators to take corrective actions, such as 
strengthening existing user authentication methods or requir-
ing users to change their passwords.

2.	AN AUTHORIZED USER MAY ATTEMPT TO GAIN ACCESS TO 
RESOURCES WHEN THE USER IS NOT ALLOWED ACCESS

An authorized user can try to gain access to unauthorized 
resources by assuming the user identifier of another user and 
thus gaining their associated access rights. This is addressed 
through the use of passwords.

Once an authorized user has gained access to the system, 
the threat still remains for a user to gain access to resourc-
es when the user is not authorized. At the resource level, CS1 
specifies access control features to mediate (i.e., distrib-
ute, review, and revoke) user access to a subset of resources. 

The object reuse feature has been specified to ensure that 
resource contents are cleared before they are reused. This re-
duces the vulnerability that the resource contents can be read 
before they are overwritten. 

3.	SECURITY RELEVANT ACTIONS MAY NOT BE TRACEABLE TO THE 
USER ASSOCIATED WITH THE EVENT

CS1 accountability and audit requirements are specified to 
provide the capability to track security relevant actions per-
formed by users and link such actions, if possible, to the 
responsible identifier. Audit mechanisms are responsible for 
the monitoring and detecting of real or potential security vi-
olations or events. These audit events can include successful 
or unsuccessful: I&A events, the introduction of objects into 
a user's address space, the deletion of objects, and actions 
taken by system administrators. Each audit record includes the 
date, time, location, type of event, identity of the user and 
object involved, and the success or failure of the event. 

4.	SECURITY BREACHES MAY OCCUR BECAUSE OF TCB PENETRATION

TCB protection is a fundamental capability of CS compliant 
products. The security components and mechanisms described in 
this Protection Profile depend upon the integrity of the TCB 
and on the TCB being isolated and non-circumventable. CS1 
specifies requirements for a common and basic set of security 
features to protect the TCB from outside penetration.

This threat is also countered through product assurance. 
TCB interface definition establishes the boundary between the 
TCB and its internal users. Security functional testing es-
tablishes that these TCB definitions and properties satisfy 
the requirements of this Protection Profile. 

5.	USERS MAY BE ABLE TO BYPASS THE SECURITY FEATURES OF 
THE SYSTEM

This threat is countered by authentication, access control, 
audit, TCB isolation, TCB non-circumventability, and refer-
ence mediation requirements. Authentication requirements pro-
tect authentication data from unauthorized users. Resource 
access control requirements protect access control data.

Audit requirements provide for the logging of successful 
and unsuccessful accesses to resources as well as for changes 
made to the system security configuration and system software 
in the event that the system security features have been by-
passed.

The CS1 specification for reference mediation protects the 
integrity of the access control mechanism and the TCB's func-
tionality. Starting at CS1, requirements exist for TCB medi-
ation of user references to objects and to security relevant 
services. 

A CS1-compliant product maintains a domain for its own exe-
cution to protect it from external interference and tampering. 
Such requirements address TCB isolation and non-circumvent-
ability of TCB isolation functions. 

This threat is also countered through product assurance. 
The definition of TCB properties assures the consistency of 
the TCB's behavior. The identification of TCB elements pro-
vides the set of elements that determine the protection char-
acteristics of a product. The TCB interface definition 
establishes the boundary between the TCB and its internal us-
ers. Security functional testing establishes that these TCB 
definitions and properties satisfy the requirements of this 
Protection Profile, and provide evidence against users being 
able to bypass the security features of the system.
CS1 Functionality

3.	Introduction

This section provides detailed functionality requirements 
that must be satisfied by a Commercial Security 1 (CS1) 
compliant product. Note that all plain text is taken 
directly from the Federal Criteria [11]. Any assignments or 
refinements made to the text in the Federal Criteria for this 
Protection Profile are indicated by the use of bold italics. 
A Protection Profile requirement is an assignment when it is 
directly taken as stated from the Federal Criteria component 
without change or when a binding is made to a Federal Criteria 
threshold definition. A Protection Profile requirement is a 
refinement when a Federal Criteria requirement is taken to a 
lower level of abstraction. The characterization of 
Protection Profile requirements as being either assignments 
or refinements can be found at each component level.

This Protection Profile for CS1 utilizes the following 
levels from the Federal Criteria. Note that not all the 
components from the Federal Criteria are reflected in this 
Protection Profile; there are no specific requirements for 
those components that are not listed.

           CS1 Functional Component Summary
.------------------------------------------------------.
|                                  | Component |       |
| Component Name                   |   Code    | Level |           
|======================================================|
| Security Policy Support:                             |
|----------------------------------+-----------+-------|
|  Identification & Authentication |    I&A    |   1   |
|----------------------------------+-----------+-------|
|  Audit                           |    AD     |   1   |
|----------------------------------+-----------+-------|
|  Access Control                  |    AC     |   1   |
|----------------------------------+-----------+-------|
|  Reference Mediation             |    RM     |   1   |
|----------------------------------+-----------+-------|
|  TCB Protection                  |    P      |   1   |
|----------------------------------+-----------+-------|
|  Self Checking                   |    SC     |   1   |
`------------------------------------------------------'

3.1	Identification & Authentication 

All users of the product must be identified and 
authenticated. A login process is established that the user 
interacts with in order to provide the information necessary 
for identification and authentication. The identification and 
authentication process begins the user's interaction with the 
target product. First, the user supplies a unique user 
identifier to the TCB. Then, the user is asked by the TCB to 
authenticate that claimed identity. The user identifier is 
used for both access control and accountability. 
Therefore, the proper maintenance and control of the 
identification mechanism and the identification databases are 
vital to product security. Once a user has supplied an 
identifier to the TCB, the TCB must verify that the user 
really corresponds to the claimed identifier. This is done by 
the authentication mechanism as described by the following 
requirements. 

For the CS1 level, I&A-1 was assigned from the Federal 
Criteria. This I&A component level has not been refined from 
the Federal Criteria.

I&A-1 Minimal Identification and Authentication

1.	The TCB shall require users to identify 
themselves to it before beginning to perform any 
other actions that the TCB is expected to mediate. 
The TCB shall be able to enforce individual 
accountability by providing the capability to 
uniquely identify each individual user. The TCB 
shall also provide the capability of associating 
this identity with all auditable actions taken by 
that individual.

2. 	The TCB shall use a protected mechanism (e.g., 
passwords) to authenticate the user's identity.

3. 	The TCB shall protect authentication data so 
that it cannot be used by any unauthorized user.
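
The following Python fragment is a minimal sketch, with invented names
and a simple salted hash, of the identify-then-authenticate sequence and
of keeping authentication data unusable by unauthorized users; it is not
a prescribed mechanism:

    # Illustrative only; a real TCB protects this data far more rigorously.
    import hashlib, hmac, os

    _auth_db = {}    # user identifier -> (salt, password hash), TCB-internal

    def enroll(user_id, password):
        salt = os.urandom(16)
        _auth_db[user_id] = (salt, hashlib.sha256(salt + password.encode()).digest())

    def authenticate(user_id, password):
        # The claimed identity must be verified before any mediated action.
        if user_id not in _auth_db:
            return False
        salt, stored = _auth_db[user_id]
        candidate = hashlib.sha256(salt + password.encode()).digest()
        return hmac.compare_digest(candidate, stored)

    enroll("alice", "correct horse")
    assert authenticate("alice", "correct horse")
    assert not authenticate("alice", "wrong guess")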

3.2	Audit 

Audit supports accountability by providing a trail of user 
actions. Actions are associated with individual users for 
security relevant events and are stored in an audit trail. 
This audit trail can be examined to determine what happened 
and what user was responsible for a security relevant event. 
The audit trail data must be protected from unauthorized 
access, modification, or destruction. In addition, the audit 
trail data must be available in a useful and timely manner for 
analysis. 

Audit data is recorded from several sources (such as from 
the TCB or a privileged application) to produce a complete 
picture of a user's security relevant actions. Therefore, 
audit data must be correlated across audit collection systems. 
The mechanisms providing audit data recording must be 
tailorable to each product's needs. Both the audit data itself 
and the mechanisms to determine what audit data is recorded 
are protected by privileges.

Once the audit data is recorded, it is analyzed and 
reported. At the CS1 level, reports are generated on request.

For the CS1 level, AD-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

AD-1 - Minimal Audit

1.	The TCB shall be able to create, maintain, and 
protect from modification or unauthorized access 
or destruction an audit trail of accesses to the 
objects it protects. The audit data shall be 
protected by the TCB so that read access to it is 
limited to those who are authorized for audit 
data.

2.	The TCB shall be able to record the following 
types of events:

	- use of the identification and authentication 
mechanisms;

	- introduction of objects into a user's address 
space (e.g., file open, program initiation), and 
deletion of objects;

	- actions taken by computer operators and system 
administrators and/or system security officers.

3.	For each recorded event, the audit record shall 
identify: date and time of the event, user, type 
of event, and success or failure of the event. For 
identification/authentication events the origin of 
request (e.g., terminal ID) shall be included in 
the audit record. For events that introduce an 
object into a user's address space and for object 
deletion events the audit record shall include the 
name and policy attributes of the object (e.g., 
object security level).

4.	The system administrator shall be able to 
selectively audit the actions of one or more users 
based on individual identity and/or object policy 
attributes (e.g., object security level).
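
As a purely illustrative sketch (the field names are invented; AD-1
dictates the content of an audit record, not its representation), a
record carrying the required information might be assembled as follows:

    # Illustrative only.
    import datetime

    def audit_record(user, event_type, success, origin=None,
                     obj=None, obj_attributes=None):
        record = {
            "date_time": datetime.datetime.now().isoformat(),
            "user": user,
            "event_type": event_type,
            "outcome": "success" if success else "failure",
        }
        if event_type == "identification_authentication":
            record["origin"] = origin                     # e.g., terminal ID
        if event_type in ("object_introduced", "object_deleted"):
            record["object"] = obj                        # object name
            record["object_attributes"] = obj_attributes  # e.g., security level
        return record

    print(audit_record("bob", "identification_authentication",
                       False, origin="tty07"))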

3.3	Access Control 

Once the user has been granted access, the question of which 
objects that authenticated user may access still remains. The 
requirements below describe these subject accesses to 
objects. 

For the CS1 level, AC-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

AC-1 Minimal Access Control

1.	Definition of Access Control Attributes

The TCB shall define and protect access control 
attributes for subjects and objects. Subject 
attributes shall include named individuals or 
defined groups or both. Object attributes shall 
include defined access rights (e.g., read, write, 
execute) that can be assigned to subject 
attributes.

2.	Administration of Access Control Attributes.

The TCB shall define and enforce rules for 
assignment and modification of access control 
attributes for subjects and objects. The effect of 
these rules shall be that access permission to an 
object by users not already possessing access 
permission is assigned only by authorized users. 
These rules shall allow authorized users to 
specify and control sharing of objects by named 
individuals or defined groups of individuals, or 
by both, and shall provide controls to limit 
propagation of access rights. These controls shall 
be capable of including or excluding access to the 
granularity of a single user.

If different rules of assignment and modification 
of access control attributes apply to different 
subjects and/or objects, the totality of these 
rules shall be shown to support the defined 
policy.

3.	Authorization of Subject References to Objects

The TCB shall define and enforce authorization 
rules for the mediation of subject references to 
objects. These rules shall be based on the access 
control attributes of subjects and objects. These 
rules shall, either by explicit user action or by 
default, provide that objects are protected from 
unauthorized access.

The scope of the authorization rules shall include 
a defined subset of the product's subjects and 
objects and associated access control attributes. 
The coverage of authorization rules shall specify 
the types of objects and subjects to which these 
rules apply. If different rules apply to different 
subjects and objects, the totality of these rules 
shall be shown to support the defined policy.

4.	Subject and Object Creation and Destruction

The TCB shall control the creation and destruction 
of subjects and objects. These controls shall 
include object reuse. That is, all authorizations 
to the information contained within a storage 
object shall be revoked prior to initial 
assignment, allocation or reallocation to a 
subject from the TCB's pool of unused storage 
objects; information, including encrypted 
representations of information, produced by a 
prior subject's actions shall be unavailable to 
any subject that obtains access to an object that 
has been released back to the system.
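
The following sketch, which is not part of the criteria and 
uses invented names, illustrates two ideas from AC-1: 
authorization of a subject reference based on access control 
attributes (item 3), and scrubbing of a storage object before 
it is returned to the pool of unused storage (item 4).

    # Hypothetical illustration only; not a real TCB.
    ACCESS_RULES = {
        # object name -> {user or "group:<name>": set of access rights}
        "payroll.dat": {"jdoe": {"read"}, "group:payroll": {"read", "write"}},
    }

    def authorized(subject, groups, obj, right):
        # Mediate a subject reference to an object; default is denial.
        entry = ACCESS_RULES.get(obj, {})
        names = {subject} | {"group:" + g for g in groups}
        return any(right in entry.get(n, set()) for n in names)

    FREE_POOL = []

    def release_object(storage: bytearray):
        # Object reuse: clear the contents before the storage is returned
        # to the pool, so no prior subject's information remains.
        for i in range(len(storage)):
            storage[i] = 0
        FREE_POOL.append(storage)

    print(authorized("jdoe", ["payroll"], "payroll.dat", "write"))  # True
    print(authorized("guest", [], "payroll.dat", "read"))           # False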

3.4	Reference Mediation 

Reference mediation, that is, the control by the TCB of 
subject accesses to objects, must be ensured so that users 
can have confidence in the TCB's access control decisions. 
Users must also be assured that all accesses to security 
services are mediated by the TCB.

For the CS1 level, RM-1 was assigned from the Federal 
Criteria. No further refinements were made from the Federal 
Criteria.

RM-1 Mediation of References to a Defined Subject/Object 
Subset

1. 	The TCB shall mediate all references to 
subjects, objects, resources, and services (e.g., 
TCB functions) described in the TCB 
specifications. The mediation shall ensure that 
all references are directed to the appropriate 
security-policy functions.

2.	Reference mediation shall include references to 
the defined subset of subjects, objects, and 
resources protected under the TCB security policy, 
and to their policy attributes (e.g., access 
rights, security and/or integrity levels, role 
identifiers).

3. 	References issued by privileged subjects shall 
be mediated in accordance with the policy 
attributes defined for those subjects.
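
As an illustration of RM-1 (hypothetical and not drawn from 
the criteria), the sketch below routes every reference through 
a single mediation point that dispatches to the appropriate 
security-policy function; references issued by privileged 
subjects are mediated in the same way against their own 
attributes.

    # Hypothetical reference mediation chokepoint (sketch only).
    def file_policy(subject, obj, right):
        return right == "read"                     # stand-in policy function

    def tcb_service_policy(subject, obj, right):
        return subject.get("privileged", False)    # stand-in policy function

    POLICY_FUNCTIONS = {"file": file_policy, "tcb_service": tcb_service_policy}

    def mediate(subject, obj_type, obj, right):
        # Every reference passes through this single point and is directed
        # to the security-policy function for the object type.
        policy = POLICY_FUNCTIONS[obj_type]
        if not policy(subject, obj, right):
            raise PermissionError(f"{subject['name']} denied {right} on {obj}")
        return True

    mediate({"name": "jdoe", "privileged": False}, "file", "notes.txt", "read")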

3.5	TCB Protection 

TCB protection is a fundamental requirement for a secure 
product. All of the security components and mechanisms that 
have been described depend upon the integrity of the TCB and 
on the TCB being isolated and non-circumventable. The TCB must 
be resistant to outside penetration. 

For the CS1 level, P-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

P-1 Basic TCB Isolation

The TCB shall maintain a domain for its own 
execution that protects it from external 
interference and tampering (e.g., by reading or 
modification of its code and data structures). The 
protection of the TCB shall provide TCB isolation 
and noncircumventability of TCB isolation 
functions as follows:

	1. TCB Isolation requires that (1) the address 
spaces of the TCB and those of unprivileged 
subjects are separated such that users, or 
unprivileged subjects operating on their behalf, 
cannot read or modify TCB data structures or code, 
(2) the transfers between TCB and non-TCB domains 
are controlled such that arbitrary entry to or 
return from the TCB are not possible; and (3) the 
user or application parameters passed to the TCB 
by addresses are validated with respect to the TCB 
address space, and those passed by value are 
validated with respect to the values expected by 
the TCB.

	2. Noncircumventability of TCB isolation 
functions requires that the permission to objects 
(and/or to non-TCB data) passed as parameters to 
the TCB are validated with respect to the 
permissions required by the TCB, and references to 
TCB objects implementing TCB isolation functions 
are mediated by the TCB.
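
The parameter-validation portion of item 1 can be pictured 
with the hypothetical sketch below: a buffer address passed to 
the TCB is checked against the caller's (non-TCB) address 
space before use, and a value parameter is checked against the 
values the TCB expects. The address range and function names 
are invented for illustration.

    # Hypothetical sketch of parameter validation at the TCB boundary.
    USER_SPACE = range(0x0001_0000, 0x4000_0000)   # assumed non-TCB addresses

    def validate_user_buffer(addr, length):
        # Reject buffers that fall wholly or partly outside the caller's
        # address space (e.g., buffers that overlap TCB code or data).
        if length <= 0:
            raise ValueError("length must be positive")
        if addr not in USER_SPACE or (addr + length - 1) not in USER_SPACE:
            raise PermissionError("buffer not within the caller's address space")

    def tcb_write(addr, length, mode):
        validate_user_buffer(addr, length)          # parameter passed by address
        if mode not in (0, 1):                      # parameter passed by value
            raise ValueError("unexpected mode value")
        # ... the TCB acts on the request only after both checks pass ...

    tcb_write(0x0002_0000, 4096, 1)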

3.6	TCB Self-Checking 

Validating the correct operation of the TCB firmware and 
hardware is an important aspect of guaranteeing the integrity 
of the product. Hardware and software features that validate 
the correct operation of the product will be delivered with 
the product to ensure that the hardware and firmware are 
installed properly and are in working order.

For the CS1 level, SC-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

SC-1 Minimal Self Checking

Hardware and/or software features shall be 
provided that can be used to periodically validate 
the correct operation of the on-site hardware and 
firmware elements of the TCB.
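
One way to picture SC-1 (a hypothetical sketch, not a 
requirement of the criteria) is a periodic check that 
recomputes a digest of each on-site firmware or TCB element 
and compares it with a reference value recorded at 
installation; on real hardware this would be complemented by 
diagnostic routines supplied with the product.

    import hashlib

    # Reference digests recorded when the TCB elements were installed
    # (hypothetical names and images, for illustration only).
    REFERENCE = {"boot_firmware.bin": hashlib.sha256(b"firmware image").hexdigest()}
    CURRENT_IMAGES = {"boot_firmware.bin": b"firmware image"}

    def self_check():
        # Periodically validate the on-site hardware/firmware elements.
        failures = []
        for name, expected in REFERENCE.items():
            if hashlib.sha256(CURRENT_IMAGES[name]).hexdigest() != expected:
                failures.append(name)
        return failures

    print("self-check failures:", self_check())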

CS1 Assurance

4.	Introduction

This chapter provides the CS1 development and evaluation 
assurance requirements package using the development and 
evaluation assurance components defined in Volume I and the 
package contained in Volume I, Appendix G of the Federal 
Criteria. The structure of each assurance package follows that 
of the assurance components (i.e., each package consists of 
development process, operational support, development 
environment, development evidence, and evaluation process 
components). 

Assurance Package T1 

This minimal assurance level is intended to include most 
commercial computer products that incorporate protection 
components. Minimal assurance refers to the fact that this 
package includes the lowest levels of development and 
evaluation assurance components and only those components 
deemed important to provide the necessary minimal 
understanding of the product. 

The intent of product development assurance for this 
package is to establish that the external behavior of the 
product conforms to its user level and administrative 
documentation without any analysis of the internal structure 
of the product's TCB. For this reason, only the claimed TCB 
protection properties, TCB interface description, and TCB 
element list are required to enable functional testing. 

The intent of the operational support assurance for this 
package is to establish a minimal level of user and 
administrative guidance and product information that enables 
the correct product installation, use of product security 
features, and remediation of flaws. 

The development evidence required for this package is 
commensurate with the assurances required. The intent of this 
package is to require the type of assurance evidence that is 
generated during the normal commercial development process.

 The intent of evaluation support assurance is to establish 
that the product, and the context in which it is developed and 
supported, is commensurate with the development assurance 
requirements. At the T1 level, testing analysis and the 
requirement for independent testing determine whether the 
product minimally meets the functional protection 
requirements. Operational support evaluation assurance 
determines whether the product documentation correctly 
describes the security relevant operations.

The following table  summarizes the generic assurance 
components that comprise the minimal development assurance 
package (T1):


      CS1 Assurance Package Summary
.---------------------------------------.
| Assurance Components           |  T1  |
|================================|======|
| Development Assurance Components      |     
|=======================================|
| Development Process                   |
|--------------------------------+------|
| TCB Property Definition        | PD-1 |
|--------------------------------+------|
| TCB Design                            |
|--------------------------------+------|
|   TCB Element Identification   | ID-1 |
|--------------------------------+------|
|   TCB Interface Definition     | IF-1 |
|--------------------------------+------|
|   TCB Modular Decomposition    | ---- |
|--------------------------------+------|
|   TCB Structuring Support      | ---- |
|--------------------------------+------|
|   TCB Design Disciplines       | ---- |
|--------------------------------+------|
| TCB Implementation Support     | ---- |
|--------------------------------+------|
| TCB Testing and Analysis              |
|--------------------------------+------|
|   Functional Testing           | FT-1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Operational Support                   |
|--------------------------------+------|
| User Security Guidance         | UG-1 |
|--------------------------------+------|
| Administrative Guidance        | AG-1 |
|--------------------------------+------|
| Trusted Generation             | ---- |
|--------------------------------+------|
| Development Environment               |
|--------------------------------+------|
| Life Cycle Definition          | ---- |
|--------------------------------+------|
| Configuration Management       | ---- |
|--------------------------------+------|
| Trusted Distribution           | ---- |
|--------------------------------+------|
| Development Evidence                  |
|--------------------------------+------|
| TCB Protection Properties      | EPP1 |
|--------------------------------+------|
| Product Development            | EPD1 |
|--------------------------------+------|
| Product Testing & Analysis            |
|--------------------------------+------|
|   Functional Testing           | EFT1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Product Support                | ---- |
`---------------------------------------'
|=======================================|
| Evaluation Assurance Components       |
|=======================================|
| Testing                               |
|--------------------------------+------|
|   Test Analysis                | TA-1 |
|--------------------------------+------|
|   Independent Testing          | IT-1 |
|--------------------------------+------|
| Review                                |
|--------------------------------+------|
|   Development Environment      | ---- |
|--------------------------------+------|
|   Operational Support          | ---- |
|--------------------------------+------|
| Analysis                              |
|--------------------------------+------|
|   Protection Properties        | ---- |
|--------------------------------+------|
|   Design                       | ---- |
|--------------------------------+------|
|   Implementation               | ---- |
`---------------------------------------'

4.1	TCB Property Definition

The definition of TCB properties assures the consistency of 
the TCB's behavior. It determines a baseline set of properties 
that can be used by system developers and evaluators to assure 
that the TCB satisfies the defined functional requirements.

For CS1, PD-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

PD-1 Property Description

The developer shall interpret the functional 
requirements of the protection profile within the 
product TCB. For each functional requirement, the 
developer shall: (1) identify the TCB elements and 
their TCB interfaces (if any) that implement that 
requirement; (2) describe the operation of these 
TCB elements; and (3) explain why the operation of 
these elements is consistent with the functional 
requirement. 

4.2	TCB Element Identification

The identification of TCB elements (hardware, firmware, 
software, code, and data structures) provides the set of 
elements that determine the protection characteristics of a 
product. All assurance methods rely on the correct 
identification of TCB elements either directly or indirectly.

For CS1, ID-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 ID-1: TCB Element Identification

The developer shall identify the TCB elements 
(i.e., software, hardware/firmware code and data 
structures). Each element must be unambiguously 
identified by its name, type, release, and version 
number (if any).

4.3	TCB Interface Definition

The TCB interface establishes the boundary between the TCB 
and its external users and application programs. It consists 
of several components, such as command interfaces (i.e., user 
oriented devices such as the keyboard and mouse), application 
program interfaces (system calls), and machine/processor 
interfaces (processor instructions).

For CS1, IF-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 IF-1: Interface Description

The developer shall describe all external (e.g., 
command, software, and I/O) administrative (i.e., 
privileged) and non-administrative interfaces to 
the TCB. The description shall include those 
components of the TCB that are implemented as 
hardware and/or firmware if their properties are 
visible at the TCB interface.

The developer shall identify all call conventions 
(e.g., parameter order, call sequence 
requirements) and exceptions signaled at the TCB 
interface.

4.4	Developer Functional Testing

Functional testing establishes that the TCB interface 
exhibits the properties necessary to satisfy the requirements 
of the protection profile. It provides assurance that the TCB 
satisfies at least its functional protection requirements.

For CS1, FT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 FT-1: Conformance Testing

The developer shall test the TCB interface to show 
that all claimed protection functions work as 
stated in the TCB interface description.

The developer shall correct all flaws discovered 
by testing and shall retest the TCB until the 
protection functions are shown to work as claimed.
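
As a hypothetical illustration of FT-1 style conformance 
testing (the TCB interface stub and test names below are 
invented), a test exercises a claimed protection function at 
the TCB interface and checks the behavior stated in the 
interface description.

    import unittest

    # Stub of a TCB interface call, included only to make the example
    # self-contained; a real test would exercise the product's interface.
    def tcb_open(subject, obj, right):
        acl = {"notes.txt": {"jdoe": {"read"}}}
        if right not in acl.get(obj, {}).get(subject, set()):
            raise PermissionError("access denied")
        return object()

    class ConformanceTests(unittest.TestCase):
        def test_documented_denial(self):
            # The interface description claims unauthorized access is denied.
            with self.assertRaises(PermissionError):
                tcb_open("guest", "notes.txt", "read")

        def test_documented_grant(self):
            self.assertIsNotNone(tcb_open("jdoe", "notes.txt", "read"))

    if __name__ == "__main__":
        unittest.main()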

4.5	User's Guidance

User's guidance is an operational support assurance 
component that ensures that usage constraints assumed by the 
protection profile are understood by the users of the product. 
It is the primary means available for providing product users 
with the necessary background and specific information on how 
to correctly use the product's protection functionality.

For CS1, UG-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 UG-1: Users' Guide

The developer shall provide a User Guide which 
describes all protection services provided and 
enforced by the TCB. The User Guide shall describe 
the interaction between these services and provide 
examples of their use. The User Guide may be in the 
form of a summary, chapter or manual. The User 
Guide shall specifically describe user 
responsibilities. These shall encompass any user 
responsibilities identified in the protection 
profile.

4.6	Administrative Guidance

Administrative guidance is an operational support assurance 
component that ensures that the environmental constraints 
assumed by the protection profile are understood by 
administrative users and operators of the IT product. It is 
the primary means available to the developer for providing 
administrators and operators with detailed, accurate 
information on how to configure and install the product, 
operate the IT product in a secure manner, make effective use 
of the product's privileges and protection mechanisms to 
control access to administrative functions and data bases, 
and avoid pitfalls and improper use of the administrative 
functions that would compromise the TCB and user security.

For CS1, AG-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 AG-1: Basic Administrative Guidance

The developer shall provide a Trusted Facility 
Manual intended for the product administrators 
that describes how to use the TCB security 
services (e.g., Access Control, System Entry, or 
Audit) to enforce a system security policy. The 
Trusted Facility Manual shall include the 
procedures for securely configuring, starting, 
maintaining, and halting the TCB. The Trusted 
Facility Manual shall explain how to analyze audit 
data generated by the TCB to identify and document 
user and administrator violations of this policy. 
The Trusted Facility Manual shall explain the 
privileges and functions of administrators. The 
Trusted Facility Manual shall describe the 
administrative interaction between security 
services.

The Trusted Facility Manual shall be distinct from 
User Guidance, and encompass any administrative 
responsibilities identified in security 
management.

4.7	Evidence of TCB Protection Properties

The documentation of the TCB protection properties includes 
the definition of the functional component requirements, 
their modeling (if any), and their interpretation within a 
product's TCB. For each requirement of a protection profile, 
a description, definition (an informal, descriptive 
specification), or a formal specification of the TCB 
components and their operation corresponding to the 
requirement must be provided.

For CS1, EPP-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EPP-1 Evidence of TCB Correspondence to the Functional 
Requirements

The developer shall provide documentation which 
describes the correspondence between the 
functional component requirements and the TCB 
elements and interfaces. The TCB properties, which 
are defined by this correspondence, shall be 
explained in this documentation.

4.8	Evidence of Product Development

Product development evidence consists of the TCB design 
evidence including the documentation of the TCB interface, TCB 
elements, TCB structure, TCB structuring support, and TCB 
design disciplines. The TCB implementation evidence includes 
TCB source code, and the processor hardware and firmware 
specifications.

For CS1, EPD-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EPD-1: Description Of The TCB External Interface

The developer shall provide an accurate 
description of the functions, effects, exceptions 
and error messages visible at the TCB interface.

The developer shall provide a list of the TCB 
elements (hardware, software, and firmware).

4.9	Evidence of Functional Testing

Functional testing evidence includes the testing itself, 
the test plans, and the documented test results. Test plans 
consist of: the description, definition, or specification of 
the test conditions; the test data, which consists of the test 
environment set-up, the test parameters, and the expected 
outcomes; and a description of the test coverage. 

For CS1, EFT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EFT-1: Evidence of Conformance Testing

The developer shall provide evidence of the 
functional testing that includes the test plan, 
the test procedures, and the results of the 
functional testing.

4.10	Test Analysis

Test analysis determines whether the product meets the 
functional protection requirements defined in the protection 
profile. Functional testing is based on the operational product, 
the TCB's functional properties, the product's operational 
support guidance, and other producer's documentation as 
defined by the development evidence requirements. Functional 
test analysis is based on the achieved test results as 
compared to the expected results derived from the development 
evidence.

For CS1, TA-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 TA-1: Elementary Test Analysis

The evaluator shall assess whether the producer 
has performed the activities defined in the 
development assurance requirements of the 
protection profile for functional testing and 
whether the producer has documented these 
activities as defined in the development evidence 
requirements of the protection profile. The 
evaluator shall analyze the results of the 
producer's testing activities for completeness of 
coverage and consistency of results. The evaluator 
shall determine whether the product's protection 
properties, as described in the product 
documentation, have been tested. The evaluator 
shall assess testing results to determine whether 
the product's TCB works as claimed.

4.11	Independent Testing 

Independent testing determines whether the product's TCB 
meets the functional protection requirements as defined in the 
functionality chapter of this Protection Profile. Testing is 
based on the operational product, the TCB's functional 
properties, the product's operational support guidance, and 
other producer's documentation as defined by the Development 
Evidence requirements.

For CS1, IT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 IT-1: Elementary Independent Testing

A tester, independent of the producer or 
evaluator, shall perform functional and elementary 
penetration testing. This testing shall be based 
on the product's user and administrative 
documentation, and on relevant known penetration 
flaws. Satisfactory completion consists of 
demonstrating that all user-visible security 
enforcing functions and security-relevant 
functions work as described in the product's user 
and administrative documentation and that no 
discrepancies exist between the documentation and 
the product. Test results of the producer shall be 
confirmed by the results of independent testing. 
The evaluator may selectively reconfirm any test 
result.

If the independent testing is performed at beta-
test sites, the producer shall supply the beta-
test plan and the test results. The evaluator 
shall review the scope and depth of beta testing 
with respect to the required protection 
functionality, and shall verify independence of 
both the test sites and the producer's and beta-
test user's test results. The evaluator shall 
confirm that the test environment of the beta-test 
site(s) adequately represents the environment 
specified in the protection profile.

COMMERCIAL SECURITY 2 (CS2)

CS2 compliant products provide protection beyond 
that of the CS1 Protection Profile by providing for 
the separation of administrative functions and 
access controls based on groups and access control 
lists (ACLs). Identification and authentication 
mechanisms include support for a rigorous password 
management program (if desired). System entry and 
availability and recovery requirements are also 
specified. Secure administrative tools are 
included, audit mechanisms are expanded, and data 
reduction tools are listed.

           CS2 Functional Component Summary
.------------------------------------------------------.
|                                  | Component |       |
| Component Name                   |   Code    | Level |
|======================================================|
| Security Policy Support:                             |
|----------------------------------+-----------+-------|
|  Identification & Authentication |    I&A    |   3   |
|----------------------------------+-----------+-------|
|  System Entry                    |    SE     |   2   |
|----------------------------------+-----------+-------|
|  Trusted Path                    |    TP     |   1   |
|----------------------------------+-----------+-------|
|  Audit                           |    AD     |   3   |
|----------------------------------+-----------+-------|
|  Access Control                  |    AC     |   2+  |
|----------------------------------+-----------+-------|
|  Security Management             |    SM     |   2   |
|----------------------------------+-----------+-------|
| Reference Mediation              |    RM     |   1   |
|----------------------------------+-----------+-------|
| TCB Protection                   |    P      |   1   |
|----------------------------------+-----------+-------|
| Self Checking                    |    SC     |   2   |
|----------------------------------+-----------+-------|
| TCB Initialization & Recovery    |    TR     |   2   |
|----------------------------------+-----------+-------|
| Privileged Operations            |    PO     |   1   |
|----------------------------------+-----------+-------|
| Ease-of-Use                      |    EU     |   2   |
`------------------------------------------------------'

      CS2 Assurance Package Summary
.---------------------------------------.
| Assurance Components           |  T2+ |
|================================|======|
| Development Assurance Components      |     
|=======================================|
| Development Process                   |
|--------------------------------+------|
| TCB Property Definition        | PD-2 |
|--------------------------------+------|
| TCB Design                            |
|--------------------------------+------|
|   TCB Element Identification   | ID-2 |
|--------------------------------+------|
|   TCB Interface Definition     | IF-1 |
|--------------------------------+------|
|   TCB Modular Decomposition    | ---- |
|--------------------------------+------|
|   TCB Structuring Support      | SP-1 |
|--------------------------------+------|
|   TCB Design Disciplines       | ---- |
|--------------------------------+------|
| TCB Implementation Support     | ---- |
|--------------------------------+------|
| TCB Testing and Analysis              |
|--------------------------------+------|
|   Functional Testing           | FT-1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Operational Support                   |
|--------------------------------+------|
| User Security Guidance         | UG-1 |
|--------------------------------+------|
| Administrative Guidance        | AG-1 |
|--------------------------------+------|
| Flaw Remediation               | FR-1 |
|--------------------------------+------|
| Trusted Generation             | TG-2 |
|--------------------------------+------|
| Development Environment               |
|--------------------------------+------|
| Life Cycle Definition          | ---- |
|--------------------------------+------|
| Configuration Management       | ---- |
|--------------------------------+------|
| Trusted Distribution           | ---- |
|--------------------------------+------|
| Development Evidence                  |
|--------------------------------+------|
| TCB Protection Properties      | EPP2 |
|--------------------------------+------|
| Product Development            | EPD1 |
|--------------------------------+------|
| Product Testing & Analysis            |
|--------------------------------+------|
|   Functional Testing           | EFT1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Product Support                | EPS1 |
`---------------------------------------'
|=======================================|
| Evaluation Assurance Components       |
|=======================================|
| Testing                               |
|--------------------------------+------|
|   Test Analysis                | TA-1 |
|--------------------------------+------|
|   Independent Testing          | IT-1 |
|--------------------------------+------|
| Review                                |
|--------------------------------+------|
|   Development Environment      | ---- |
|--------------------------------+------|
|   Operational Support          | OSR1 |
|--------------------------------+------|
| Analysis                              |
|--------------------------------+------|
|   Protection Properties        | ---- |
|--------------------------------+------|
|   Design                       | ---- |
|--------------------------------+------|
|   Implementation               | ---- |
`---------------------------------------'

CS2 Rationale

2.12	Introduction

As outlined in the Federal Criteria, this rationale 
describes the protection philosophy, how the security 
features are intended to be used, the assumptions about the 
environment in which a compliant product is intended to 
operate, the threats within that environment, and the security 
features and assurances that counter these threats. At the CS2 
level, the features used to counter threats and the strength 
of the assurance evidence are enhanced over CS1 and are 
indicated in the text through bold italics.

2.12.1	Protection Philosophy

Any discussion of protection necessarily starts from a 
protection philosophy, i.e., what it really means to call the 
product "secure." In general, products will control access to 
information and other resources through the use of specific 
security features so that only properly authorized 
individuals or processes acting on their behalf will be 
granted access. For CS1, three fundamental requirements are 
derived from this statement of protection:

o	Access authorization

o	Accountability

o	Assurance 

The totality of the functionality that enforces the access 
authorization and accountability protection philosophy is 
comprised of the hardware, software, and firmware of the 
Trusted Computing Base (TCB). CS2 requires the TCB to be self-
protecting and resistant to bypass so that it is effective at 
countering identified threats. CS2 also requires effective 
management of security attributes and configuration 
parameters. The assurance protection philosophy is comprised 
of the development process, operational support, development 
evidence, and evaluation process assurances. Each of these is 
explained below.

2.12.1.1	Access Authorization

The access authorization portion of the philosophy of 
protection for this profile addresses subject and object 
access mediation. For CS2 compliant products, access 
authorization has been further refined to include system 
entry, subject and object mediation based on Access Control 
Lists (ACLs), and privileged operations.

2.12.1.1.1	System Entry

CS2 provides the capability for a system administrator to 
establish, maintain, and protect from unauthorized access the 
information that defines the identities of, and the conditions 
under which, users may gain entry into the system. These 
system entry controls are based on user identification, time, 
location, and method of entry. 
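
A minimal sketch of such entry conditions is shown below; the 
condition table, names, and values are invented and only 
illustrate the idea that, after authentication, entry is 
granted only when the user's identity, the time, the location 
(port of entry), and the method of entry satisfy 
administrator-defined conditions.

    from datetime import time

    # Hypothetical per-user system entry conditions set by an administrator.
    ENTRY_CONDITIONS = {
        "jdoe": {
            "hours": (time(7, 0), time(19, 0)),   # allowed time-of-day range
            "locations": {"tty03", "console"},    # allowed ports of entry
            "methods": {"interactive"},           # allowed entry methods
        },
    }

    def entry_allowed(user, now, location, method):
        cond = ENTRY_CONDITIONS.get(user)
        if cond is None:
            return False                          # no conditions defined: deny
        start, end = cond["hours"]
        return (start <= now <= end
                and location in cond["locations"]
                and method in cond["methods"])

    print(entry_allowed("jdoe", time(9, 30), "tty03", "interactive"))  # True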

2.12.1.1.2	Subject and Object Access Mediation

CS2 provides protected access to resources and objects. As 
defined in the TCSEC and specified in this profile, access 
control permits system users and the processes that represent 
them to allow or disallow other users access to objects 
under their control:

Access control is "a means of restricting access to 
objects based on the identity of subjects and/or 
groups to which they belong. The controls are 
discretionary in the sense that a subject with a 
certain access permission is capable of passing 
that permission (perhaps indirectly) on to any 
other subject." [1]

These controls permit the granting and revoking of access 
privileges to be left to the discretion of the individual 
users. The creator of the object becomes, by default, the 
owner of the object. The owner can grant access as well as 
specify the mode of access (read, write, execute) to the 
object.

ACLs are defined that can effectively specify, for each 
named object, a list of user identifiers with their respective 
modes of access (read, write, and execute) to that object. 
ACLs allow for control of:

o	objects 

o	access modes that protect these objects

o	specific access permissions to be passed onto 
identified authorized subjects.

CS2 also allows for the specification and maintenance of 
groups. Groups are a convenient means of logically associating 
user identifiers. Groups can be referenced when specifying 
ACLs.
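
The sketch below (hypothetical names, illustration only) shows 
an ACL of this kind: each entry names a user identifier or a 
group together with read/write/execute modes, the creator 
becomes the owner by default, and only the owner grants or 
revokes entries.

    # Hypothetical ACL sketch with user and group entries.
    GROUPS = {"payroll": {"jdoe", "asmith"}}

    class ProtectedObject:
        def __init__(self, name, owner):
            self.name, self.owner = name, owner
            self.acl = {owner: {"read", "write", "execute"}}  # owner by default

        def grant(self, grantor, who, modes):
            if grantor != self.owner:
                raise PermissionError("only the owner may change the ACL")
            self.acl.setdefault(who, set()).update(modes)

        def allows(self, user, mode):
            if mode in self.acl.get(user, set()):
                return True
            return any(mode in self.acl.get("group:" + g, set())
                       for g, members in GROUPS.items() if user in members)

    obj = ProtectedObject("report.txt", owner="asmith")
    obj.grant("asmith", "group:payroll", {"read"})
    print(obj.allows("jdoe", "read"))    # True, through group membership
    print(obj.allows("guest", "read"))   # False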

2.12.1.1.3	Privileges

CS2 supports and promotes the separation and use of 
privileges. A privilege enables a subject to perform a 
security relevant operation that, by default, is denied. 
Privileges cover all security aspects of a product. CS2 
compliant products have tightly controlled privilege 
definitions as well as control over subjects that hold 
privileges. 
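
The idea that a privilege enables an operation that is denied 
by default, and that only tightly controlled subjects hold 
privileges, is sketched below with invented names.

    import functools

    # Hypothetical privilege table: operations are denied unless the
    # calling subject explicitly holds the named privilege.
    SUBJECT_PRIVILEGES = {"audit_admin": {"read_audit_trail"}, "operator": set()}

    def requires_privilege(name):
        def decorator(func):
            @functools.wraps(func)
            def wrapper(subject, *args, **kwargs):
                if name not in SUBJECT_PRIVILEGES.get(subject, set()):
                    raise PermissionError(f"{subject} lacks privilege {name!r}")
                return func(subject, *args, **kwargs)
            return wrapper
        return decorator

    @requires_privilege("read_audit_trail")
    def read_audit_trail(subject):
        return ["...audit records..."]

    print(read_audit_trail("audit_admin"))   # permitted
    # read_audit_trail("operator")           # would raise PermissionError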

2.12.1.2	Accountability

The accountability portion of the philosophy of protection 
for this profile addresses user Identification and 
Authentication (I&A), requirements for security auditing, and 
a Trusted Path between a user and the operating system. Each 
of these is explained below.

2.12.1.2.1	Identification and Authentication

User identification is required to support access control 
and security auditing. This includes the capability to 
establish, maintain, and protect a unique identifier for each 
authorized user. User identification is functionally 
dependent on authentication. Authentication is a method of 
validating a person as a legitimate user.

User authentication in most computer systems has been 
provided primarily through the use of passwords. CS2 supports 
a variety of password features that give the product a great 
amount of flexibility in password generation, password 
security, and password administration. For most products, a 
great deal of confidence 
is placed on maintaining the privacy of passwords belonging 
to individuals. I&A prevents unauthorized individuals from 
logging into the product; therefore, password management is 
essential to secure product operations. The risk of losing a 
password is addressed within CS2 through promoting the use of 
stringent password management practices.

In addition, CS2 allows for stronger authentication 
approaches. CS2 specifies that a unique identifier be 
associated with each trusted subject such as print spoolers 
and database management system services. It also requires the 
TCB to maintain, protect, and display status information for 
all active users and all enabled or disabled user identities 
or accounts.

2.12.1.2.2	Audit

For most secure products, a capability must exist to audit 
the security relevant events. As each user performs security 
relevant tasks, the product must record the user identifier, 
the action performed, and the result in a security log. For 
CS2 compliant products, a capability is specified to allow a 
system administrator to access and evaluate audit 
information. This capability provides a method of protection 
in the sense that security relevant events that occur within 
a computer system can be logged and the responsible user held 
accountable for his/her actions. Audit trails are used to 
detect and deter penetration of a computer system and to 
reveal activity that identifies misuse.

CS2 provides for an effective audit mechanism by supporting 
the following basic security characteristics. It provides the 
ability to:

o	review the use of I&A mechanisms;

o	discover the introduction of objects into a user's 
address space;

o	discover the deletion of objects; 

o	discover actions taken by computer operators and 
system administrators;

o	audit attempts to violate resource allocation limits;

o	protect the audit data so that access to it is limited 
to system administrators who are authorized to 
examine audit information;

o	discover the use of privileges, such as changing the 
ownership of an object;

o	have the audit mechanism act as a deterrent against 
penetrators or hackers; and

o	use audit reduction tools for assessing the damage 
that may result in the event of a violation of the 
implemented security policy. These tools have the 
capability of selectively reviewing the actions of one 
or more users or groups, actions performed on a 
specific object or system resource, and actions 
associated with specific access control attributes.

2.12.1.3	Assurance

Assurance addresses all areas of product development 
assurance and evaluation assurance. Development assurance 
addresses the development process, operational support, the 
development environment, and the development evidence. 
Development process assurance defines the additional efforts 
that a developer must undertake to satisfy the assurance 
objectives while creating the product. It specifies how the 
TCB should be designed and supported by the implementation as 
well as how it should be tested. Operational support assurance 
defines the documentation of the security features for both 
administrative and non-administrative users as well as 
requirements for TCB flaw remediation and TCB generation. 
Development environment assurance includes requirements for 
defining the product's life cycle and specific features for 
configuration management. Development evidence assurance 
defines the TCB's protection properties, details the 
requirements for product testing and analysis, and defines the 
requirements for product support. Evaluation assurance 
establishes that the product, and the context in which it is 
developed and supported, is commensurate with the development 
assurance requirements.

The T2+ Assurance Package was chosen for CS2. This package 
is indicated as being T2+ since an additional component was 
included for flaw remediation and a higher level was used for 
trusted generation. This level is intended to include most 
commercial computer products that are designed to satisfy 
functional requirements. Although most development assurance 
components are required at their lowest levels, the 
requirements of several product development components are 
extended to capture (1) specific TCB properties, and (2) a 
rudimentary notion of support for product structure. The 
operational support component is also extended to enable 
systematic flaw discovery, tracking, and repair.

The intent of the product development assurance for this 
package is to establish that the external behavior of the 
product conforms to its user level and administrative 
documentation without analysis of the internal structure of 
the product TCB. For this reason, only the claimed TCB 
protection properties and their informal models, TCB 
interface description, and TCB element list are required to 
enable functional and penetration testing. Support for TCB 
structuring is limited to process isolation and separation of 
the protection critical TCB elements from the protection non-
critical ones.

The intent of the operational support assurance for this 
package is to establish a minimal level of user and 
administrative guidance and product information that enables 
the correct product installation, use of product security 
features, and remediation of flaws. Similarly, the 
development environment assurances are intended to provide a 
minimal level of control over the product configuration and 
production. This level of development environment assurance 
is similar to that already present in most established 
commercial development organizations.The development evidence 
required for this package is commensurate with the assurances 
required. The intent of this package is to require the type 
of assurance evidence that is generated during the normal 
commercial development process.

At the T2+ level, evaluation support assurance determines 
whether the product meets the functional protection 
requirements for testing analysis and independent testing. 
Operational support evaluation assurance determines whether 
the product documentation correctly describes the security 
relevant operations. 

Also for CS2, flaw remediation was included in this 
assurance package. Flaw remediation is important for 
commercial environments since it ensures that flaws (i.e., 
deficiencies in a product that enable a user external to the 
TCB to violate the functional requirements of a protection 
profile) that are discovered by the product consumers will be 
tracked, corrected, and disseminated to the affected 
customers.

2.12.1.4	Intended Method of Use

All individual users (both administrative and non-
administrative users) are assigned a unique user 
identifier. This user identifier supports individual 
accountability and access control. The operating system 
authenticates the claimed identity of the user before allowing 
the user to perform any further actions. 

Products that comply with the CS2 Protection Profile are 
provided with the capability of assigning privileges to secure 
functions. These privileges are used to control access to 
user and password files, and audit trails. This capability is 
particularly important to prevent a "privileged user" or 
"superuser" from having a wide set of privileges when only a 
subset is needed.

A CS1 compliant product imposes controls on authorized 
users and on processes acting on their behalf to prevent users 
from gaining access to information and other resources for 
which they are not authorized. The product provides the 
capability for users to allow or disallow to other users 
access to objects under their control. The objects are files 
that may be read or written to or programs which may be 
executed. The granularity of control is to the level of 
individual users (although groups made up of individual users 
may be specified) and individual objects. CS1 access controls 
permit the granting and revoking of access to be left to the 
discretion of the individual users.

Products that comply with CS2 specifications are intended 
to be used within the following operational constraints:

o	The information system is designed to be administered 
as a unique entity by a single organization.

o	The information system is designed to manage 
computing, storage, input/output, and to control the 
sharing of resources among multiple users and computer 
processes.

o	The administrative and non-administrative users are 
identified as distinct individuals.

o	The granting and revoking of access control 
permissions (read, write, execute, and deny) are left 
to the discretion of individual users.

o	The information system provides facilities for real-
time interaction with users that have access to input/
output devices.

2.12.2	Environmental Assumptions

A product designed to meet the CS2 Protection Profile is 
intended to be a general purpose, multi-user operating system 
that runs on either a workstation, minicomputer, or mainframe. 
CS2 compliant products are expected to be used in both 
commercial and government environments. The information being 
processed may be unclassified, sensitive-but-unclassified, or 
single-level classified, but not multi-level classified 
information.

The following specific environmental conditions have been 
assumed in specifying CS2:

o	The product hardware base (e.g., CPU, printers, 
terminals, etc.), firmware, and software will be 
protected from unauthorized physical access.

o	There will be one or more personnel assigned to manage 
the product including the security of the information 
it contains.

o	The operational environment will be managed according 
to the operational environment documentation that is 
required in the assurance chapter of the Protection 
Profile.

o	The IT product provides a cooperative environment for 
users to accomplish some task or group of tasks.

o	The processing resources of the IT product, including 
all terminals, are assumed to be located within user 
spaces that have physical access controls established.

o	The IT product provides facilities for some or all of 
the authorized users to create programs that use an 
Application Programming Interface (API) to enable them 
to protect themselves and their objects from 
unauthorized use.

o	Fail-safe defaults are included for the access control 
attributes for the defined subjects and objects for 
the product.

2.12.3	Expected Threats

In general, the choice of which Protection Profile to 
choose depends upon the level of security that is required for 
that particular organizational environment. The lowest level, 
the CS1 level, is intended for those commercial and government 
environments where all the system personnel are trusted and 
all the data on the system is at the same classification 
level. For example, a government agency where all personnel 
have a government clearance, all data is unclassified, and 
there are no outside network connections would be an ideal 
candidate for CS1, i.e., the threats to be countered are such 
that only a minimal level of trust is needed. However, most 
commercial and government environments are more complex and 
require a higher degree of trust. CS2 addresses the security 
needs of mainstream commercial and government 
environments. It provides a higher level of trust for those 
organizations that need to enforce a security policy where 
there is no need for different classifications of data. CS3 
is intended to provide the highest level of trust for 
commercial and government environments. It is intended to be 
used in those environments where a great deal of trust is 
required, such as in law enforcement agencies, nuclear 
facilities, or commercial airports. It provides the strongest 
features, mechanisms, and assurances to counter these 
threats.

A product that is designed to meet the CS2 Protection 
Profile and operate within its assumed environment will 
provide capabilities to counter these threats. It should be 
noted, however, that although a product may faithfully 
implement all the features and assurances specified in this 
Protection Profile, the complete elimination of any one threat 
should not be assumed. A product that is designed to meet the 
CS2 Protection Profile is generally known to be more effective 
at countering the threats than products that meet the CS1 
Protection Profile. CS2 products counter all the CS1 threats, 
and contain stronger features and more assurance evidence than 
CS1 products. In addition to countering CS1 threats, CS2 
compliant products provide protection capabilities to counter 
four additional threats:

1.	AN UNAUTHORIZED USER MAY ATTEMPT TO GAIN ACCESS TO THE 
SYSTEM

For CS1 compliant products, the threat of an unauthorized 
user gaining access to the system is primarily addressed by 
I&A. I&A features allow the TCB to verify the identity of 
individuals attempting to gain access to the system. This is 
accomplished through the use of passwords.

Although not a direct countermeasure, auditing requirements 
are specified at the CS1 level to provide the capability to 
perform an after-the-fact analysis of unauthorized system 
entry and login attempts. This provides an opportunity for the 
system administrators to take corrective actions, such as 
strengthening existing user authentication methods or 
requiring users to change their passwords.

For CS2 compliant systems, the threat of an unauthorized 
user gaining access to the system is primarily addressed by 
stronger I&A features and system entry requirements. 

CS2 specifies password requirements that promote a strong 
organizational password management program. These 
requirements specify that: null passwords cannot be used 
during normal operations; passwords be stored in a one-way 
encrypted form; the clear text representation of a password 
be automatically suppressed; passwords have a minimum-length; 
and that the system utilize a password complexity-checking 
algorithm. An advisory capability is also provided to exclude 
a list of customer-specified passwords. Such requirements 
support the use of passwords that are effective against 
password guessing. To further reduce the probability of a 
password being guessed, requirements limit the number of 
login attempts that can be made by a user associated 
with a specific user identifier. The probability of a single 
password being guessed is further reduced by requirements for 
password aging, by having limitations on password reuse, and 
by allowing users to choose a password that is already 
associated with another user identifier. 
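
Several of these requirements are pictured in the hypothetical 
sketch below: one-way encrypted storage, a minimum length, a 
complexity check, and a limit on failed attempts per user 
identifier. The parameter values and helper names are 
invented; a real TCB would also suppress the clear-text 
password, age passwords, and limit reuse.

    import hashlib, os, string

    MIN_LENGTH = 8                 # assumed minimum password length
    MAX_FAILED_ATTEMPTS = 5        # assumed lockout threshold
    failed_attempts = {}
    password_store = {}            # user -> (salt, one-way digest)

    def complexity_ok(pw):
        classes = [string.ascii_lowercase, string.ascii_uppercase,
                   string.digits, string.punctuation]
        return (len(pw) >= MIN_LENGTH
                and sum(any(c in cls for c in pw) for cls in classes) >= 3)

    def set_password(user, pw):
        if not complexity_ok(pw):
            raise ValueError("password rejected by complexity check")
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", pw.encode(), salt, 100_000)
        password_store[user] = (salt, digest)     # stored one-way only

    def authenticate(user, pw):
        if failed_attempts.get(user, 0) >= MAX_FAILED_ATTEMPTS:
            return False                          # identity disabled
        salt, digest = password_store[user]
        if hashlib.pbkdf2_hmac("sha256", pw.encode(), salt, 100_000) == digest:
            failed_attempts[user] = 0
            return True
        failed_attempts[user] = failed_attempts.get(user, 0) + 1
        return False

    set_password("jdoe", "Tr1cky-Pass")
    print(authenticate("jdoe", "Tr1cky-Pass"))    # True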

CS2 also allows for a password generating capability. 
Because random passwords can be difficult to remember and 
users are tempted to write them down, requirements are 
specified for the generation of passwords that are easy to 
remember (i.e., pronounceable). Additionally, an advisory 
requirement is specified to allow users to choose from a list 
of alternative passwords.

To minimize the threat that a password has been 
compromised, a requirement exists to allow a user to change 
the password. Because a password can be compromised by 
observing the characters on a terminal screen as it is being 
typed, there is a requirement to blot out the clear-text 
representation of the password on the display device.

In addition, requirements are specified to display an 
advisory warning message to all users prior to system logon 
to discourage a would-be system penetrator from attempting an 
unauthorized system entry. Such a message can also provide a 
basis for subsequent prosecution. System entry requirements 
also specify additional controls on identified and 
authenticated users entering the system. Once a user is 
authenticated, a check is made to determine if the user is 
allowed further entry. System entry is granted only in 
accordance with the authenticated user's access control 
attributes. These conditions are in terms of a user's identity 
and his/her membership in groups (if they exist). In addition, 
CS2 specifies system entry requirements to display to an 
authorized user, upon successful system entry, the date and 
time, method of access or port of entry, and the number of 
failed logon attempts since the last successful system entry 
by that user identifier. These requirements provide a user 
with the capability to detect attempted or successful system 
penetrations. In addition, requirements are specified to lock 
and terminate an interactive session after an administrator-
specified period of user inactivity, and also for the TCB to 
appear to perform the entire user authentication procedure 
even if the user identification entered is invalid. The TCB 
also provides a protected mechanism to allow or deny system 
entry based on specified ranges of time. Also, conditions for 
system entry via dial-up lines are required to be specified.

I&A requirements are also enhanced over those of CS1 by 
specifying requirements for the identification of each 
trusted user, and by specifying requirements for system 
administrators to disable a user's identity or account when 
the number of unsuccessful logon attempts exceeds an 
administrator specified threshold. This is intended to 
mitigate the effectiveness of successive system penetration 
attacks. 

2.	AN AUTHORIZED USER MAY ATTEMPT TO GAIN ACCESS TO 
RESOURCES WHEN THE USER IS NOT ALLOWED ACCESS

An authorized user can try to gain access to unauthorized 
resources by assuming the user identifier of another user and 
thus gaining their associated access rights. This is addressed 
through the use of passwords.

Once an authorized user has gained access to the system, 
the threat still remains for a user to gain access to 
resources when the user is not authorized. At the resource 
level, CS2 specifies access control features to mediate (i.e., 
distribute, review, and revoke) user access to a subset of 
resources. 

The object reuse feature has been specified to ensure that 
resource contents are cleared before they are reused. This 
reduces the vulnerability that the resource contents can be 
read before it is overwritten. 

To address the vulnerability associated with passwords, CS2 
specifies password requirements that promote a strong 
organizational password management program. Besides those 
password requirements that address penetration threats from 
unauthorized users, other password requirements have been 
specified to counter the threat of an insider (authorized 
user) attack. There are password requirements that specify 
that passwords must always be stored in encrypted format and 
that passwords can never be included in audit trail data. 
Also, in the event that a user selects a password that is 
already in use by another user, requirements disallow the 
system from acknowledging the dual association.

In addition, CS2 specifies access control features to limit 
the user identifiers that may change to another user 
identifier that provides any additional privileges to that 
user. These controls are based on the user identifier and the 
mode of access (i.e., read, write, and execute). Also, 
administrators are provided with capabilities through the use 
of protected mechanisms to set and control security related 
parameters, defaults, thresholds, attributes, and other 
security related data. This provides the ability to 
effectively specify and control access to resources based on 
site specific protection policies. 

CS2 also specifies that privileges must be associated with 
TCB functions, TCB calls, and accesses to privileged TCB 
objects (e.g., user and group registration files, password 
files, audit log files). 

CS2 specifies requirements for a direct communication 
channel, i.e., a trusted path, between the user and the 
operating system to counter spoofing threats. This security 
feature provides confidence that a user at a terminal will 
communicate directly with the TCB rather than to malicious 
code. In particular, to counter the threat of an authorized 
user creating a spoof of legitimate user identifier 
authorization prompts, CS2 specifies requirements for a 
direct communication path between the user and the 
authentication system.

Requirements are also specified to display an advisory 
warning message to all users prior to system logon to 
discourage unauthorized system entry. Such a message can also 
provide a basis for subsequent prosecution.

Once an authorized user has been identified and 
authenticated, system entry control can help counter threats 
of inadvertent, deliberate, and coerced entry performed in an 
unauthorized manner by an authorized user. At the end of 
system entry control, the user bears the access-control 
attributes determined during the I&A process, provided that 
the system entry conditions are satisfied. These conditions 
can be specified in terms of a user's identity, group 
membership, or mode of access.

CS2 also provides other security features. Application 
programming interfaces are provided so that applications can 
protect themselves and their objects from unauthorized use. 
CS2 specifies lists of user identities authorized to enter the 
system via dial-up lines. CS2 also specifies general 
authentication facilities for use by application developers, 
system administrators, and users for the protection of 
resources.

3.	SECURITY RELEVANT ACTIONS MAY NOT BE TRACEABLE TO THE 
USER ASSOCIATED WITH THE EVENT

CS2 accountability and audit requirements are specified to 
provide the capability to track security relevant actions 
performed by users, and link such actions, if possible, to the 
responsible identifier. Audit mechanisms are responsible for 
the monitoring and detecting of real or potential security 
violations or events. These audit events can include 
successful or unsuccessful: I&A events, the introduction of 
objects into a user's address space, the deletion of objects, 
and actions taken by system administrators. Each audit record 
includes the date, time, location, type of event, identity of 
the user and object involved, and the success or failure of 
the event. 

Requirements are specified to protect audit trail data and 
the audit control mechanism from unauthorized access, 
modification, or destruction. Audit features are specified to 
provide post-collection audit analysis on specific data 
items, users, and privileged operations. Also, a capability 
is provided for trusted application programs to append data 
to the security audit trail. 

System entry control helps to enhance accountability by 
providing a time, space, and mode-of-entry context to each 
action for which the user is held accountable. These added 
constraints help to give additional assurance that the proper 
user is held responsible for a set of authorized actions.

At the CS2 level, tools are specified to enhance the 
effectiveness of user accountability. CS3 specifies 
requirements to provide tools to verify the consistency of the 
audit trail data and the selection of audit events. Tools are 
also specified for post-collection analysis to selectively 
review various actions. 

4.	THE PRODUCT MAY BE DELIVERED, INSTALLED, AND THEN USED 
IN AN UNSECURED MANNER

This threat is countered by explicitly requiring that the 
product be delivered with all security features turned on. 
This ensures that the product is secure by default rather than 
insecure by default. This is complemented by allowing many 
security features to be configurable so that, as a specific 
organization gains experience with the actual threats in its 
environment, the organization can adjust the degree of 
security in its system. There are several requirements that 
reinforce the "security by default" perspective during 
initial installation. Requirements for security 
administrative documentation are specified to increase the 
likelihood that the administrator will install and start the 
system in a secure manner.

5.	SECURITY BREACHES MAY OCCUR BECAUSE AVAILABLE SECURITY 
FEATURES ARE NOT USED OR ARE USED IMPROPERLY

Requirements for authentication, system and access control, 
security management, and product documentation provide a 
basis for countering this threat. Authentication requirements 
provide for password management procedures to reduce the 
possibility of easy-to-guess passwords and to initialize 
passwords for users. Password generation algorithms are 
provided that generate easy-to-remember passwords and that 
give the user a choice of passwords. In addition, CS2 provides 
for a capability to import and export objects and subjects 
with defined access control attributes. This ensures that 
access control attributes are maintained with the subject or 
object during import and export operations.

Security management requirements are specified for listing, 
setting, and updating all of the system security parameters 
and attributes. These parameters and attributes pertain to 
identification, authentication, system entry, access control, 
audit trail analysis and availability features for the system 
and for individual users. This allows a system administrator 
to confirm that the system is properly configured and, if 
necessary, to modify the existing configuration and 
attributes. In addition, security management requirements 
provide for routine control and maintenance of system 
resources.

Product documentation requirements for users and 
administrators describe how to perform security relevant 
functions in a secure manner.

6.	SECURITY BREACHES MAY OCCUR BECAUSE OF TCB PENETRATION

TCB protection is a fundamental capability of CS compliant 
products. The security components and mechanisms described in 
this Protection Profile depend upon the integrity of the TCB 
and on the TCB being isolated and non-circumventable. CS1 
specifies requirements for a common and basic set of security 
features to protect the TCB from outside penetration.

This threat is also countered through product assurance. 
The TCB interface definition establishes the boundary between 
the TCB and its external users. Security functional testing 
establishes that these TCB definitions and properties satisfy 
the requirements of the Protection Profile. 

7.	USERS MAY BE ABLE TO BYPASS THE SECURITY FEATURES OF 
THE SYSTEM

This threat is countered by authentication, access control, 
audit, TCB isolation, TCB non-circumventability, and 
reference mediation requirements. Authentication requirements 
protect authentication data from unauthorized users. Resource 
access control requirements protect access control data.

Audit requirements provide for the logging of successful 
and unsuccessful accesses to resources as well as for changes 
made to the system security configuration and system software 
in the event that the system security features have been 
bypassed.

CS1 specifications for reference mediation protect the 
integrity of the access control mechanism and the TCB's 
functionality. Starting at CS1, requirements exist for TCB 
mediation of user references to objects and to security 
relevant services. 

CS1-compliant products maintain a domain for the TCB's own 
execution to protect it from external interference and 
tampering. Such requirements address TCB isolation and non-
circumventability of TCB isolation functions. 

This threat is also countered through product assurance. 
The definition of TCB properties assures the consistency of 
the TCB's behavior. The identification of TCB elements 
provides the set of elements that determine the protection 
characteristics of a product. The TCB interface definition 
establishes the boundary between the TCB and its external 
users. Security functional testing establishes that these TCB 
definitions and properties satisfy the requirements of this 
Protection Profile, and provide evidence against users being 
able to bypass the security features of the system. At the CS2 
level, procedures also have to be established for developers 
to accept customer reports of protection problems and requests 
for corrections to those problems. Also, when the product is 
delivered, all security-related parameters must be set to 
their fail-safe defaults.

8.	SUBJECTS MAY BE DENIED CONTINUED ACCESSIBILITY TO THE 
RESOURCES OF THE SYSTEM (I.E., DENIAL OF SERVICE)

Reliability of service requirements promote the continued 
accessibility of system resources by authorized subjects. 
These requirements principally counter threats related to 
intentional or unintentional denial of service attacks. The 
requirements include detecting and reporting facilities, 
controls to limit systematically the disabling of user 
identifiers, mechanisms for recovery in the event of a system 
crash, resource quotas, and data backup and restoration. In 
particular, mechanisms are specified for recovery and system 
start-up, and for a maintenance mode of operation.

CS2 compliant systems provide the capability to detect and 
recover from discontinuity of service using some combination 
of automatic and procedural techniques. This capability is 
intended to counter the threat that subjects may be denied 
continued accessibility to the resources of the system (i.e., 
denial of service). Also, users are notified in advance to 
change their password, so that access to the system is not 
denied without warning. An advisory capability exists to allow 
a system administrator to use null passwords during system 
start-up. This allows a system administrator to access the 
system even if the password mechanism has been compromised. 
In addition, audit trails are compressed to avoid excessive 
consumption of disk space. 

9.	THE INTEGRITY OF THE SYSTEM MAY BE COMPROMISED

At the CS2 level, requirements are specified for TCB 
recovery and start-up to promote the secure state of the 
system in the event of a system failure or discontinuity of 
service. These features are intended to minimize the 
likelihood of the loss of user objects during system recovery.

To protect audit trail data, a mechanism is specified to 
automatically copy the audit trail file to an alternative 
storage area. 

CS2 compliant products also provide the capability to 
validate the correct operation of the TCB software, firmware, 
and hardware. Such features are important to ensure that the 
software, hardware, and firmware are in working order. 

CS2 Functionality 

3.	Introduction

This section provides detailed functionality requirements 
that must be satisfied by a Commercial Security 2 (CS2) 
compliant product. Note that all plain text is taken 
directly from the Federal Criteria. Any assignments or 
refinements made to the Federal Criteria text are 
indicated in bold italics. A Protection Profile requirement 
is an assignment when it is directly taken as stated from the 
Federal Criteria component without change or when a binding 
is made to a Federal Criteria threshold definition. A 
Protection Profile requirement is a refinement when the 
Federal Criteria requirement is taken to a lower level of 
abstraction. The characterization of Protection Profile 
requirements as being either assignments or refinements can 
be found at each component level. Also, note that, unlike the 
Federal Criteria, there are some items that are considered to 
be "advisory," i.e., an item marked advisory is a desirable 
feature but is not required for that component. Each advisory 
item is marked with an "(A)".

This Protection Profile for CS2 utilizes the following 
levels from the Federal Criteria. Note that not all the 
components from the Federal Criteria are reflected in this 
Protection Profile; there are no specific requirements for 
those components that are not listed. Also note that a "+" 
after the component level number indicates that a requirement 
was included from a higher level of that component.    

           CS2 Functional Component Summary
.------------------------------------------------------.
|                                  | Component |       |
| Component Name                   |   Code    | Level |
|======================================================|
| Security Policy Support:                             |
|----------------------------------+-----------+-------|
|  Identification & Authentication |    I&A    |   3   |
|----------------------------------+-----------+-------|
|  System Entry                    |    SE     |   2   |
|----------------------------------+-----------+-------|
|  Trusted Path                    |    TP     |   1   |
|----------------------------------+-----------+-------|
|  Audit                           |    AD     |   3   |
|----------------------------------+-----------+-------|
|  Access Control                  |    AC     |   2+  |
|----------------------------------+-----------+-------|
|  Security Management             |    SM     |   2   |
|----------------------------------+-----------+-------|
| Reference Mediation              |    RM     |   1   |
|----------------------------------+-----------+-------|
| TCB Protection                   |    P      |   1   |
|----------------------------------+-----------+-------|
| Self Checking                    |    SC     |   2   |
|----------------------------------+-----------+-------|
| TCB Initialization & Recovery    |    TR     |   2   |
|----------------------------------+-----------+-------|
| Privileged Operations            |    PO     |   1   |
|----------------------------------+-----------+-------|
| Ease-of-Use                      |    EU     |   2   |
`------------------------------------------------------'

3.1	Identification & Authentication 

All users of the product must be identified and 
authenticated. A login process is established that interacts 
with the user in order to provide the information necessary 
for identification and authentication. The identification and 
authentication process begins the user's interaction with the 
target product. First, the user supplies a unique user 
identifier to the TCB. Then, the user is asked to authenticate 
that claimed identity by the TCB. The user identifier is used 
for both access control and accountability. 
Therefore, the proper maintenance and control of the 
identification mechanism and the identification databases are 
vital to TCB security. Once a user has supplied an identifier 
to the TCB, the TCB must verify that the user really 
corresponds to the claimed identifier. This is done by the 
authentication mechanism as described by the following 
requirements. 

 For the CS2 level, I&A-3 was assigned from the Federal 
Criteria. This I&A component level has been refined from the 
Federal Criteria by requiring that only system administrators 
perform certain actions. Password requirements have also been 
refined to reflect the importance of this protected mechanism 
to commercial products. An additional refinement was made 
regarding error feedback when an invalid user identification 
is entered. 
Assignments were made for default thresholds for the number 
of login attempts and login time intervals.

 I&A-3 Exception-Controlled Identification and 
Authentication

1.	 The TCB shall require users to identify 
themselves to it before beginning to perform any 
other actions that the TCB is expected to mediate. 
The TCB shall be able to enforce individual 
accountability by providing the capability to 
uniquely identify each individual user. The TCB 
shall also provide the capability of associating 
this identity with all auditable actions taken by 
that individual.

2. 	The TCB shall maintain authentication data that 
includes information for verifying the identity of 
individual users (e.g., passwords) as well as 
information for determining the product policy 
attributes of individual users, i.e., groups. These 
data shall be used by the TCB to authenticate the 
user's identity and to ensure that the attributes 
of subjects external to the TCB that may be 
created to act on behalf of the individual user 
satisfy the product policy. The control of user 
identification data shall be limited to system 
administrators, except that a user shall be 
allowed to modify his/her own authentication data 
within prescribed limits (e.g., changing his/her 
own password).

3. 	The TCB shall protect authentication data so 
that it cannot be used by any unauthorized user. 
The TCB shall appear to perform the entire user 
authentication procedure even if the user 
identification entered is invalid. Error feedback 
shall contain no information regarding which part 
of the authentication information is incorrect.

The TCB shall end the attempted login session if 
the user performs the authentication procedure 
incorrectly for a number of successive times 
(i.e., a threshold) specified by an authorized 
system administrator. The default threshold shall 
be three times. When the threshold is exceeded, 
the TCB shall send an alarm message to the system 
console and/or to the administrator's terminal, 
log this event in the audit trail, and delay the 
next login by an interval of time specified by the 
authorized system administrator. The default time 
interval shall be 60 seconds. The TCB shall 
provide a protected mechanism to disable the user 
identity or account when the threshold of 
successive, unsuccessful login attempts is 
violated more than a number of times specified by 
the administrator. By default, this mechanism 
shall be disabled (as it may cause unauthorized 
denial of service).
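
As an informal illustration only, the following Python sketch 
shows one way the threshold, alarm, delay, and account-disable 
behavior above could be realized. The names (LoginThrottle, 
FAILURE_THRESHOLD, and the audit and alarm callables) are 
hypothetical; the defaults are the ones assigned in this 
requirement (three attempts, a 60-second delay, and a disable 
mechanism that is off by default).

# Hypothetical sketch of the failed-login behavior in I&A-3 item 3:
# three successive failures end the session, raise an alarm, log the
# event, and delay the next login; the disable mechanism is off by default.
import time

FAILURE_THRESHOLD = 3            # default threshold (administrator-settable)
LOCKOUT_DELAY_SECONDS = 60       # default delay before the next login attempt
DISABLE_AFTER_VIOLATIONS = None  # disabled by default (may cause denial of service)

class LoginThrottle:
    def __init__(self):
        self.failures = {}    # user id -> successive failed attempts
        self.violations = {}  # user id -> times the threshold was exceeded
        self.disabled = set() # user ids whose accounts have been disabled

    def record_failure(self, user_id, audit, alarm):
        # audit and alarm are caller-supplied callables standing in for the
        # audit trail and the console/administrator alarm mechanisms.
        if user_id in self.disabled:
            return "disabled"
        self.failures[user_id] = self.failures.get(user_id, 0) + 1
        if self.failures[user_id] >= FAILURE_THRESHOLD:
            alarm(f"login threshold exceeded for {user_id}")
            audit("threshold_exceeded", user_id)
            self.failures[user_id] = 0
            self.violations[user_id] = self.violations.get(user_id, 0) + 1
            if (DISABLE_AFTER_VIOLATIONS is not None
                    and self.violations[user_id] > DISABLE_AFTER_VIOLATIONS):
                self.disabled.add(user_id)
            time.sleep(LOCKOUT_DELAY_SECONDS)   # delay the next login attempt
            return "session_ended"
        return "retry"

    def record_success(self, user_id):
        self.failures[user_id] = 0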

4. 	The TCB shall have the capability to maintain, 
protect, and display status information for all 
active users (e.g., users currently logged on, 
current policy attributes) and of all user 
accounts (i.e., enabled or disabled user identity 
or account). 

5. Whenever passwords are used as a protection 
mechanism, then, at a minimum:

a.	The TCB shall not indicate to the user if he/she 
has chosen a password already associated with 
another user. 

b.	The TCB shall store passwords in a one-way 
encrypted form. 

(1)	The TCB shall require privilege to access 
encrypted passwords. 

c.	The TCB shall automatically suppress or fully 
blot out the clear-text representation of the 
password on the data entry/display device. 

d.	The TCB shall, by default, prohibit the use of 
null passwords during normal operation. 

(1)	A capability, accessible only to a system 
administrator, to allow null passwords during 
non-normal operations, such as system start-
up, manual recovery, or maintenance mode, on a 
per-user identifier or per-port basis may be 
provided. (A)

e.	The TCB shall provide a protected mechanism to 
allow a user to change his or her password. This 
mechanism shall require re-authentication of the 
user identity. 

(1)	The TCB shall provide a protected mechanism to 
set or initialize passwords for users. The use 
of this mechanism shall be limited to system 
administrators.   

f.	The TCB shall enforce password aging on a per-
user identifier or per-group basis (i.e., a user 
shall be required to change his or her password 
after a system-specifiable minimum time). The 
default for all non-system administrators shall 
be sixty days. 

(1)	The default for system administrator 
identifiers shall be thirty days. 

(2)	After the password aging threshold has been 
reached, the password shall no longer be 
valid, except as provided in 5 g below. 

The control of password aging shall be limited to 
system administrators. 

g.	The TCB shall provide a protected mechanism to 
notify users in advance of requiring them to 
change their passwords. This can be done by 
either:

(1)	Notifying users a system-specifiable period 
of time prior to their password expiring. The 
default shall be seven days. 

- or -

(2)	Upon password expiration, notifying the user 
but allowing a system-specifiable subsequent 
number of additional logons prior to requiring 
a new password. The default shall be two 
additional logons. 

The control of user password expiration defaults 
shall be limited to system administrators. 

h.	Passwords shall not be reusable by the same user 
identifier for a system-specifiable period of 
time. The default shall be six months. The 
control of password re-use shall be limited to 
system administrators. 

i.	The TCB shall provide an algorithm for ensuring 
the complexity of user-entered passwords that 
meets the following requirements:

(1)	Passwords shall meet a system-specifiable 
minimum length requirement. The default 
minimum length shall be eight characters. 

(2)	The password complexity-checking algorithm 
shall be modifiable by the TCB. The default 
algorithm shall require passwords to include 
at least one alphabetic character, one numeric 
character, and one special character. 

(3)	The TCB should provide a protected mechanism 
that allows systems to specify a list of 
excluded passwords (e.g., company acronyms, 
common surnames). (A)

(a)	The TCB should prevent users from selecting 
a password that matches any of those on the 
list of excluded passwords. (A)

The control of password complexity shall be limited 
to system administrators. 

j.	If password generation algorithms are present, 
they shall meet the following requirements:

(1)	The password generation algorithm shall 
generate passwords that are easy to remember 
(i.e., pronounceable). 

(2)	The TCB should give the user a choice of 
alternative passwords from which to choose. 
(A)

(3)	Passwords shall be reasonably resistant to 
brute-force password guessing attacks. 

(4)	If the "alphabet" used by the password 
generation algorithm consists of syllables 
rather than characters, the security of the 
password shall not depend on the secrecy of the 
alphabet. 

(5)	The generated sequence of passwords shall have 
the property of randomness (i.e., consecutive 
instances shall be uncorrelated and the 
sequences shall not display periodicity). 
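
Purely as an illustration of the password rules above, the 
sketch below (in Python, with hypothetical function names) 
enforces the default complexity rule of item 5.i (a minimum of 
eight characters with at least one alphabetic, one numeric, 
and one special character), stores only a salted one-way hash 
as in item 5.b, and rejects null passwords as in item 5.d.

import hashlib, hmac, os, string

MIN_LENGTH = 8  # default minimum password length (item 5.i.1)

def complexity_ok(password):
    # Default complexity rule (item 5.i.2): at least one alphabetic,
    # one numeric, and one special character, plus the minimum length.
    if len(password) < MIN_LENGTH:
        return False
    has_alpha = any(c.isalpha() for c in password)
    has_digit = any(c.isdigit() for c in password)
    has_special = any(c in string.punctuation for c in password)
    return has_alpha and has_digit and has_special

def store_password(password):
    # One-way storage (item 5.b): keep only a salted hash, never clear text.
    if not password:
        raise ValueError("null passwords are prohibited during normal operation")  # item 5.d
    if not complexity_ok(password):
        raise ValueError("password does not meet the complexity requirements")
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)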

3.2	System Entry 

Once a user is authenticated, a check is made to see if the 
user is allowed to enter the product. The qualifying checks 
for system entry at the SE-2 level can include time-of-day, 
day-of-week, date, location of terminal, or means of access 
(e.g., dial-up port). 

For the CS2 level, SE-2 was assigned from the Federal 
Criteria. This component has been refined from the Federal 
Criteria by specifying a default advisory warning to be 
displayed before user logon, by limiting the control of system 
entry requirements to system administrators, and by further 
limiting the use of protected mechanisms to system 
administrators. Also, default values for terminal locking and 
session termination and for user policy attributes were 
assigned.

 SE-2 Time and Location Based Entry Control

1. 	Prior to initiating the system login procedure, 
the TCB shall display an advisory warning message 
to the user regarding unauthorized use of the 
system and the possible consequences of failure to 
heed this warning.

a.	The message shall be system-specifiable. 

b.	The TCB shall be able to display a message of up 
to twenty lines in length. 

c.	The following message shall be displayed by 
default: 

"NOTICE: This is a private computer system. 
All users of this system are subject to 
having their activities audited. Anyone 
using this system consents to such 
auditing. All unauthorized entries or 
activities revealed by this auditing can be 
used as evidence and may lead to criminal 
prosecution." 

The control of system entry messages shall be 
limited to system administrators. 

2. 	Before system entry is granted to a user, the 
identity of that user shall be authenticated by 
the TCB. If the TCB is designed to support 
multiple login sessions per user identity, the TCB 
shall provide a protected mechanism to enable 
limiting the number of login sessions per user 
identity or account with a default of a single 
login session. The control of this mechanism to 
limit the number of login sessions shall be 
limited to system administrators. 

3. 	The TCB shall grant system entry only in 
accordance with the authenticated user's policy 
attributes. The system entry conditions shall be 
expressed in terms of users' policy attributes, 
i.e., user identity and membership to groups. If 
no explicit system-entry conditions are defined, 
the system-entry default shall be used (e.g., the 
correct user authentication). The TCB shall 
provide a protected mechanism to allow or deny 
system entry based on specified ranges of time. 
Entry conditions using these ranges shall be 
specified using time-of-day, day-of-week, and 
calendar dates. The control of system entry 
conditions shall be limited to system 
administrators. 

The TCB shall provide a protected mechanism to 
allow or deny system entry based on location or 
port of entry. Conditions for system entry via 
dial-up lines (e.g., lists of user identities 
authorized to enter the system via dial-up lines), 
if any, shall be specified. The control of these 
mechanisms shall be limited to system 
administrators.

4. 	The TCB shall provide a protected mechanism that 
enables authorized administrators to display and 
modify the policy attributes used in system-entry 
control for each user. The conditions under which 
an unprivileged user may display these attributes 
shall be specified.

5.	Upon a user's successful entry to the system, 
the TCB shall display the following data to the 
user and shall not remove them without user 
intervention: (1) the date, time, means of access 
and port of entry of the last successful entry to 
the system; and (2) the number of successive, 
unsuccessful attempts to access the system since 
the last successful entry by the identified user.

6. 	The TCB shall either lock or terminate an 
interactive session after an administrator-
specified interval of user inactivity. The default 
value for the lock interval shall be five minutes. 
The default value for session termination shall be 
fifteen minutes.   
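
A minimal sketch, with hypothetical names and data, of the 
time- and location-based entry conditions of SE-2 item 3 and 
the locking and termination defaults of item 6:

from datetime import datetime

# Defaults for terminal locking and session termination (SE-2 item 6).
LOCK_INTERVAL_MINUTES = 5
TERMINATE_INTERVAL_MINUTES = 15

def entry_allowed(conditions, now=None, port=None):
    # conditions is a hypothetical dict with optional "days", "hours",
    # and "ports" keys; any condition that is present must be satisfied.
    now = now or datetime.now()
    if "days" in conditions and now.strftime("%a") not in conditions["days"]:
        return False
    if "hours" in conditions:
        start, end = conditions["hours"]
        if not (start <= now.hour < end):
            return False
    if "ports" in conditions and port not in conditions["ports"]:
        return False
    return True  # no explicit condition failed, so entry may proceed

# Example: weekday office-hours access from two terminal ports.
policy = {"days": {"Mon", "Tue", "Wed", "Thu", "Fri"},
          "hours": (8, 18),
          "ports": {"tty01", "tty02"}}
# entry_allowed(policy, port="tty01") -> True during weekday office hours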

3.3	Trusted Path 

A Trusted Path ensures that users have direct, unencumbered 
communication with the TCB. A Trusted Path may be required at 
various times during a subject session and also may be 
initiated by a user during a TCB interaction.

For the CS2 level, TP-1 was assigned from the Federal 
Criteria. This level was refined by requiring that there be a 
direct Trusted Path connection to the authentication 
mechanism. 

 TP-1 Login Trusted Path

The TCB shall support a trusted communication path 
between itself and the user for initial 
identification and authentication. Communications 
via this path shall be initiated exclusively by a 
user.

a.	The TCB shall provide a protected mechanism by 
which a display device may force a direct 
connection between the port to which it is 
connected and the authentication mechanism. 

3.4	Audit 

Audit supports accountability by providing a trail of user 
actions. Actions are associated with individual users for 
security-relevant events and are stored in an audit trail. 
This audit trail can be examined to determine what happened 
and what user was responsible for a security relevant event. 
The audit trail data must be protected from unauthorized 
access, modification, or destruction. In addition, the audit 
trail data must be available in a useful and timely manner for 
analysis. 

Audit data is recorded from several sources (such as the 
TCB or privileged applications) to produce a complete picture 
of a user's security relevant actions. Therefore, audit data 
must be correlated across audit collection systems. The 
mechanisms providing audit data recording must be tailorable 
to each product's needs. Both the audit data itself and the 
mechanisms to determine what audit data is recorded are 
protected by privileges. Once the audit data is recorded, it 
is analyzed and reported. At the CS2 level, reporting can be 
generated on request. 
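
As a hedged illustration of the audit trail described here 
(the class and field names are hypothetical, not drawn from 
the Federal Criteria), the sketch below shows a record carrying 
the event fields required later in AD-3 item 3 and an append 
operation restricted to the TCB and privileged applications:

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AuditRecord:
    # Fields required for each recorded event (AD-3 item 3).
    event_type: str
    user: str
    success: bool
    origin: Optional[str] = None        # e.g., terminal ID for I&A events
    object_name: Optional[str] = None   # for object introduction/deletion events
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AuditTrail:
    def __init__(self):
        self._records = []   # in a real TCB this would be protected storage

    def append(self, record, caller_is_privileged):
        # Only the TCB or privileged applications may append (AD-3 item 1).
        if not caller_is_privileged:
            raise PermissionError("audit trail append requires privilege")
        self._records.append(record)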

For the CS2 level, AD-3 was assigned from the Federal 
Criteria. This level was refined from the Federal Criteria by 
specifying that: password character strings not be recorded 
in the audit trail; privileged applications be allowed to 
append data to the audit trail; audit trail files be copied 
to an alternative storage area after a system-specifiable 
period of time; and the TCB provide a protected mechanism for 
the automatic deletion of security audit trail files. 
Assignments were made to the subject-to-object access control 
rules so that they include user access to disk files, tape 
volumes, and tape files.

AD-3 Audit Tools

1.	The TCB shall be able to create, maintain, and 
protect from modification or unauthorized access 
or destruction an audit trail of accesses to the 
objects it protects. The audit data shall be 
protected by the TCB so that read access to it is 
limited to those who are authorized for audit 
data.

The TCB shall support an application program 
interface that allows a privileged application to 
append data to the security audit trail or to an 
applications-specified alternative security audit 
trail. 

The TCB should support an option to maintain the 
security audit trail data in encrypted format. (A)

2.	The TCB shall be able to record the following 
types of events:

	- use of the identification and authentication 
mechanisms, and system entry events;

	- access control events selectable on a per 
user, per subject, per object, per group, and/or 
per policy attribute basis; i.e., introduction of 
objects into a user's address space (e.g., file 
open, program initiation), creation and deletion 
of subjects and objects; distribution and 
revocation of access rights; changes of subject 
and object policy attributes; acquisition and 
deletion of system privileges. 

	- actions taken by computer operators and system 
administrators and/or system security officers; 
i.e., privileged operations such as the 
modification of TCB elements; accesses to TCB 
objects (at a minimum, access to an object shall 
include disk file access, tape volume, or tape 
file access); changes of policy attributes of 
users, TCB configuration and security 
characteristics, and system privileges; selection 
and modification of audited events.

The events that are auditable by default, and 
those that are required for successful auditing of 
other events, which may not be disabled, shall be 
defined. The TCB shall provide a protected 
mechanism that displays the currently selected 
events and their defaults. The use of this 
mechanism shall be restricted to authorized system 
administrators.

3.	For each recorded event, the audit record shall 
identify: date and time of the event, user, type 
of event, and success or failure of the event. For 
identification/authentication events the origin of 
request (e.g., terminal ID) shall be included in 
the audit record. For events that introduce an 
object into a user's address space and for object 
deletion events the audit record shall include the 
name and policy attributes of the object.

The character strings input as a response to a 
password prompt shall not be recorded in the 
security audit trail. 

4.	The TCB shall provide a protected mechanism to 
turn auditing on and off, and to select and change 
the events to be audited and their defaults, 
during the system operation. The use of this 
mechanism shall be restricted to authorized system 
administrators. The system administrator shall be 
able to selectively audit the actions of one or 
more users based on individual identity and/or 
object policy attributes. Audit review tools shall 
be available to authorized system administrators 
to assist in the inspection and review of audit 
data, and shall be protected from unauthorized 
use, modification, or destruction.

The TCB shall provide tools for audit data 
processing. These shall include specifically 
designed tools: for verifying the consistency of 
the audit data; for verifying the selection of 
audit events; for audit trail management. The 
audit trail management tools shall enable:

	- creation, destruction, and emptying of audit 
trails; use of warning points regarding the size 
of the audit data, and modification of the audit 
trail size;

	- formatting and compressing of event records;

	- displaying of formatted audit trail data; and

	- maintaining the consistency of the audit trail 
data after system failures and discontinuity of 
operation.

The TCB shall provide a protected mechanism for 
the automatic copying of security audit trail 
files to an alternative storage area after a 
system-specifiable period of time. 

The TCB shall provide a protected mechanism for 
the automatic deletion of security audit trail 
files after a system-specifiable period of time. 
The default shall be thirty days. 

(a)	It shall not be possible to delete the 
security audit trail before it gets copied 
to an alternate storage area. 

(b)	It shall be possible to disable this mechanism. 

The use of audit trail management functions shall 
be limited to system administrators. 

5. 	Audit review tools shall be available to 
authorized users to assist in the inspection and 
review of audit data, and shall be protected from 
unauthorized modification or destruction. The TCB 
shall also provide tools for post-collection audit 
analysis (e.g., intrusion detection) that shall be 
able to selectively review (1) the actions of one 
or more users (e.g., identification, 
authentication, system-entry, and access control 
actions); (2) the actions performed on a specific 
object or system resource; (3) all, or a 
specified set of, audited exceptions; and (4) 
actions associated with specific policy 
attributes. The review tools shall be able to 
operate concurrently with the system operation.

3.5	Access Control 

Once a user has been granted system entry, the question of 
which objects that authenticated user may access still 
remains. An object's owner, or another authorized user, allows 
or denies other users access to that object. The requirements 
below describe subject accesses to objects. 

For the CS2 level, AC-2+ was assigned from the Federal 
Criteria. This level is indicated as being AC-2+ because a 
requirement was included from level AC-4 (the distribution, 
revocation, and review of access control attributes rules). 
This is indicated in the text by an "[AC-4]" in front of the 
requirement. This component level was refined from the Federal 
Criteria by specifying: a protected mechanism for groups; a 
limitation on the changes an active subject can make to a 
privileged user identifier; a definition of an access control 
list; and minimum authorization rules.

 AC-2+ Basic Access Control

1.	Definition of Access Control Attributes

The TCB shall define and protect access control 
attributes for subjects and objects. Subject 
attributes shall include named individuals or 
defined groups or both. Object attributes shall 
include defined access rights (i.e., read, write, 
execute) that can be assigned to subject 
attributes.

The TCB shall be able to assign access rights to 
group identities.

If multiple access control policies are supported, 
the access control attributes corresponding to 
each individual policy shall be identified. 

The subject and/or object attributes shall 
accurately reflect the sensitivity and integrity 
of the subject or object.

2.	Administration of Access Control Attributes

The TCB shall define and enforce rules for 
assignment and modification of access control 
attributes for subjects and objects.

The TCB shall provide a protected mechanism for 
groups as follows: 

a.	A user identifier shall be able to be associated 
with one or more groups. 

b.	The TCB shall provide a protected mechanism to 
list the names of all groups. 

c.	The TCB shall provide a protected mechanism to 
list the membership of any group. 

Rules for maintaining group membership shall be 
provided. These rules shall include those for 
displaying and modifying the list of users 
belonging to a group and the group attributes of 
those users. 

The effect of these rules shall be that access 
permission to an object by users not already 
possessing access permission is assigned only by 
authorized users. 

Only the current owner or system administrators 
shall modify access control attributes on objects.

(a)	There should be a distinct access right to 
modify the contents of an object's access 
control list (e.g., an "ownership" or 
"control" access right). (A)

The TCB shall provide a protected mechanism to 
modify group membership. The use of this mechanism 
shall be under the control of system 
administrators. Authority to modify specific group 
membership may be delegated. 

The TCB shall provide a protected mechanism by which 
the user identifier associated with a subject 
attribute can be changed while the subject is 
active. It shall also provide a protected mechanism 
for limiting the user identifiers that may change 
to a user identifier that would provide any 
additional access rights. The control of these 
mechanisms shall be limited to system 
administrators. 

[AC-4]: These rules shall allow authorized users 
to specify and control sharing of objects by named 
individuals or defined groups of individuals, or 
by both, and shall provide controls to limit 
propagation of access rights, (i.e., these rules 
shall define the distribution, revocation, and 
review of access control attributes). The controls 
defined by these rules shall be capable of 
specifying for each named object, a list of 
individuals and a list of groups of named 
individuals, with their respective access rights 
to that object. Furthermore, for each named 
object, it shall be possible to specify a list of 
named individuals and a list of groups of named 
individuals for which no access to the object is 
given. These controls shall be capable of 
including or excluding access to the granularity 
of a single user.

The rules for assignment and modification of 
access control attributes shall include those for 
attribute assignment to objects during import and 
export operations. If different rules of 
assignment and modification of access control 
attributes apply to different subjects and/or 
objects, the totality of these rules shall be 
shown to support the defined policy.

3.	Authorization of Subject References to Objects

The TCB shall define and enforce authorization 
rules for the mediation of subject references to 
objects. These rules shall be based on the access 
control attributes of subjects and objects. These 
rules shall, either by explicit user action or by 
default, provide that objects are protected from 
unauthorized access. 

For each object, the authorization rules of the TCB 
shall be based on a protected mechanism to specify 
a list of user identifiers or groups with their 
specific access rights to that object (i.e., an 
access control list). 

At a minimum, the authorization rules shall be 
defined as follows:

a.	The access rights associated with a user 
identifier shall take precedence over the access 
rights associated with any groups of which that 
user identifier is a member. 

b.	When a user identifier can be an active member of 
multiple groups simultaneously, or if the access 
rights associated with the user identifier 
conflict with the access rights associated with 
any group in which the user is a member, it shall 
be possible for a system administrator to 
configure rules that combine the access rights to 
make a final access control decision. 

c.	The TCB shall provide a protected mechanism to 
specify default access rights for user 
identifiers not otherwise specified either 
explicitly by a user identifier or implicitly by 
group membership. 

The scope of the authorization rules shall include 
a defined subset of the product's subjects and 
objects and associated access control attributes. 
The coverage of authorization rules shall specify 
the types of objects and subjects to which these 
rules apply. If different rules apply to different 
subjects and objects, the totality of these rules 
shall be shown to support the defined policy. 

If multiple policies are supported, the 
authorization rules for each policy shall be 
defined separately. The TCB shall define and 
enforce the composition of policies, including the 
enforcement of the authorization rules (e.g., 
subject and object type coverage, enforcement 
precedence).

4.	Subject and Object Creation and Destruction

The TCB shall control the creation and destruction 
of subjects and objects. These controls shall 
include object reuse. That is, all authorizations 
to the information contained within a storage 
object shall be revoked prior to initial 
assignment, allocation or reallocation to a 
subject from the TCB's pool of unused storage 
objects; information, including encrypted 
representations of information, produced by a 
prior subject's actions shall be unavailable to 
any subject that obtains access to an object that 
has been released back to the system.
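
The authorization rules of item 3 above can be illustrated 
informally as follows; the sketch is in Python, the entry 
format is hypothetical, and the group-combination rule shown 
is only one of the administrator-configurable possibilities 
allowed by rule b.

# Hypothetical ACL entry: ("user", name, rights) or ("group", name, rights),
# where rights is a set such as {"read", "write"}; an empty set denies access.
def decide(acl, user, groups, requested, default_rights=frozenset()):
    user_rights = None
    group_rights = []
    for kind, name, rights in acl:
        if kind == "user" and name == user:
            user_rights = rights
        elif kind == "group" and name in groups:
            group_rights.append(rights)
    if user_rights is not None:
        granted = user_rights                   # user entries take precedence (rule a)
    elif group_rights:
        granted = set().union(*group_rights)    # one configurable way to combine groups (rule b)
    else:
        granted = default_rights                # default for unspecified users (rule c)
    return requested in granted

# decide([("user", "alice", {"read"}), ("group", "staff", {"read", "write"})],
#        user="alice", groups={"staff"}, requested="write")  -> False
#        (the explicit user entry overrides the group entry)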

3.6	Security Management 

The management of security attributes and configuration 
parameters is an important aspect of a secure product. 
Mechanisms have to be provided to easily maintain the product, 
and they must be protected so that only system administrators 
can manage the security aspects of the product. 

For the CS2 level, SM-2 was assigned from the Federal 
Criteria. This level was refined from the Federal Criteria by 
specifying that sessions be terminated rather than locked. An 
assignment was made for the definition and maintenance of 
groups as a security policy attribute. 

 SM-2 Basic Security Management

1. 	The TCB shall provide an installation mechanism 
for the setting and updating of its configuration 
parameters, and for the initialization of its 
protection-relevant data structures before any 
user or administrator policy attributes are 
defined. It shall allow the configuration of TCB 
internal databases and tables.

The TCB shall distinguish between normal mode of 
operation and maintenance mode, and shall provide 
a maintenance-mode mechanism for recovery and 
system start-up.

2. 	The TCB shall provide protected mechanisms for 
displaying and modifying the security policy 
parameters. These parameters shall include 
identification, authentication, system entry and 
access control parameters for the entire system 
and for individual users.

The TCB shall have a capability to define the 
identification and authentication policy on a 
system-wide basis (e.g., password minimum and 
maximum lifetime, password length and complexity 
parameters). The TCB mechanisms shall have the 
capability to limit: (1) maximum period of 
interactive session inactivity, (2) maximum login 
or session time, and (3) successive unsuccessful 
attempts to log in to the system. In particular, 
the TCB shall provide a protected mechanism to 
specify that sessions be terminated rather than 
locked after a period of inactivity. The control 
of these mechanisms shall be limited to system 
administrators. 

3. 	The TCB shall provide protected mechanisms for 
manually displaying, modifying, or deleting user 
registration and account parameters. These 
parameters shall include unique user identifiers, 
their account, and their associated user name and 
affiliation. The TCB shall allow the manual 
enabling and disabling of user identities and/or 
accounts. 

The TCB shall provide a means to uniquely identify 
security policy attributes. It shall also provide 
a means of listing all these attributes for a 
user, and all the users associated with an 
attribute. It shall be capable of defining and 
maintaining the security policy attributes for 
subjects including: defining and maintaining 
privileges for privileged subjects, discretionary 
(i.e., definition and maintenance of groups) and 
non-discretionary attributes and centralized 
distribution, review and revocation of policy 
attributes. 

4. 	The TCB shall provide protected mechanisms for 
routine control and maintenance of system 
resources. It shall allow the enabling and 
disabling of peripheral devices, mounting of 
removable storage media, backing-up and recovering 
user objects; maintaining the TCB hardware and 
software elements (e.g., on site testing); and 
starting and shutting down the system.

5. 	 The use of the protected mechanisms for system 
administration shall be limited to authorized 
administrative users. The control of access-
control attributes shall be limited to the object 
owner and to system administrators. 
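
As an informal sketch of the system-wide parameters SM-2 
items 2 and 3 call for, the structure below gathers the 
defaults assigned elsewhere in this profile into one 
administrator-controlled object; the field names are 
illustrative only.

from dataclasses import dataclass

@dataclass
class SecurityPolicyParameters:
    # System-wide identification/authentication and session parameters (SM-2 item 2).
    password_min_length: int = 8              # from I&A-3 item 5.i
    password_max_lifetime_days: int = 60      # from I&A-3 item 5.f
    password_reuse_period_months: int = 6     # from I&A-3 item 5.h
    login_failure_threshold: int = 3          # from I&A-3 item 3
    login_delay_seconds: int = 60             # from I&A-3 item 3
    session_lock_minutes: int = 5             # from SE-2 item 6
    session_terminate_minutes: int = 15       # from SE-2 item 6
    terminate_instead_of_lock: bool = False   # SM-2 refinement option

def display_parameters(params, requester_is_admin):
    # Display and modification of these parameters is limited to
    # system administrators (SM-2 item 5).
    if not requester_is_admin:
        raise PermissionError("security management functions require administrator role")
    return vars(params)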

3.7	Reference Mediation 

Reference mediation, that is, the control by the TCB of 
subject accesses to objects, must be ensured so that users 
can have faith in the TCB's access control decisions. Also, 
users must be assured that all accesses to security services 
are mediated by the TCB. 

For the CS2 level, RM-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

 RM-1 Mediation of References to a Defined Subject/Object 
Subset

1. 	The TCB shall mediate all references to 
subjects, objects, resources, and services (e.g., 
TCB functions) described in the TCB 
specifications. The mediation shall ensure that 
all references are directed to the appropriate 
security-policy functions.

2.	Reference mediation shall include references to 
the defined subset of subjects, objects, and 
resources protected under the TCB security policy, 
and to their policy attributes (e.g., access 
rights, security and/or integrity levels, role 
identifiers).

3. 	References issued by privileged subjects shall 
be mediated in accordance with the policy 
attributes defined for those subjects.
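
A minimal sketch of the mediation idea in RM-1: every subject 
reference is routed through a single mediation point that 
consults the policy before dispatching the operation. The 
function names and the example policy are hypothetical.

# Hypothetical mediation point: every subject reference to an object or
# TCB service passes through mediate(), which consults the policy before
# dispatching to the real operation (RM-1 items 1-3).
def mediate(subject, obj, operation, policy, dispatch):
    if not policy(subject, obj, operation):
        raise PermissionError(f"{subject} may not {operation} {obj}")
    return dispatch(subject, obj, operation)

# Example policy and dispatch stand-ins:
def example_policy(subject, obj, operation):
    return (subject, obj, operation) == ("alice", "report.txt", "read")

def example_dispatch(subject, obj, operation):
    return f"{operation} performed on {obj} for {subject}"

# mediate("alice", "report.txt", "read", example_policy, example_dispatch)
#   -> "read performed on report.txt for alice"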

3.8	Logical TCB Protection 

TCB protection is a fundamental requirement for a secure 
product. All of the security components and mechanisms that 
have been described depend upon the integrity of the TCB and 
on the TCB being isolated and non-circumventable. The TCB must 
be resistant to outside penetration. 

For the CS2 level, P-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

 P-1 Basic TCB Isolation

The TCB shall maintain a domain for its own 
execution that protects it from external 
interference and tampering (e.g., by reading or 
modification of its code and data structures). The 
protection of the TCB shall provide TCB isolation 
and noncircumventability of TCB isolation 
functions as follows:

	1. TCB Isolation requires that (1) the address 
spaces of the TCB and those of unprivileged 
subjects are separated such that users, or 
unprivileged subjects operating on their behalf, 
cannot read or modify TCB data structures or code, 
(2) the transfers between TCB and non-TCB domains 
are controlled such that arbitrary entry to or 
return from the TCB are not possible; and (3) the 
user or application parameters passed to the TCB 
by addresses are validated with respect to the TCB 
address space, and those passed by value are 
validated with respect to the values expected by 
the TCB.

	2. Noncircumventability of TCB isolation 
functions requires that the permission to objects 
(and/or to non-TCB data) passed as parameters to 
the TCB are validated with respect to the 
permissions required by the TCB, and references to 
TCB objects implementing TCB isolation functions 
are mediated by the TCB.
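
Address-space separation itself is enforced by hardware and 
the TCB design, but the parameter-validation clause of item 1 
can be illustrated informally: before using a caller-supplied 
address range, the TCB checks that the range lies entirely 
within the caller's space and not within its own. The region 
bounds below are hypothetical.

# Hypothetical address regions (start, end) for illustration only.
TCB_REGION = (0x0000_0000, 0x0010_0000)    # TCB code and data
USER_REGION = (0x0010_0000, 0x8000_0000)   # unprivileged subject space

def validate_user_buffer(addr, length):
    # A parameter passed by address must refer to the caller's own space,
    # never to TCB code or data (P-1, TCB isolation clause 3).
    start, end = addr, addr + length
    if length <= 0 or end <= start:
        return False
    inside_user = USER_REGION[0] <= start and end <= USER_REGION[1]
    overlaps_tcb = start < TCB_REGION[1] and end > TCB_REGION[0]
    return inside_user and not overlaps_tcb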

3.9	TCB Self-Checking 

Validating the correct operation of the TCB firmware and 
hardware is an important aspect of guaranteeing the integrity 
of the product. Hardware and software features that validate 
the correct operation of the product will be delivered with 
the product to ensure that the hardware and firmware are 
installed properly and are in working order.

For the CS2 level, SC-2 was assigned from the Federal 
Criteria. The Federal Criteria was refined to limit the 
execution of operator-controlled tests to system 
administrators.

 SC-2 Basic Self Checking

Hardware and/or software features shall be 
provided that can be used to periodically validate 
the correct operation of the on-site hardware and 
firmware elements of the TCB. These features shall 
include: power-on tests, loadable tests, and 
operator-controlled tests. 

The power-on tests shall test all basic components 
of the TCB hardware and firmware elements 
including memory boards and memory 
interconnections; data paths; busses; control 
logic and processor registers; disk adapters; 
communication ports; system consoles, and the 
keyboard speaker. These tests shall cover all 
components that are necessary to run the loadable 
tests and the operator-controlled tests.

The loadable tests shall cover: processor 
components (e.g., arithmetic and logic unit, 
floating point unit, instruction decode buffers, 
interrupt controllers, register transfer bus, 
address translation buffer, cache, and processor-
to-memory bus controller); backplane busses; 
memory controllers; and writable control memory 
for operator-controlled and remote system-
integrity testing.

Operator-controlled tests shall be able to 
initiate a series of one-time or repeated tests, 
to log the results of these tests and, if any fault 
is detected, to direct the integrity-test programs 
to identify and isolate the failure. The execution 
of operator-controlled tests shall be limited to 
system administrators. 
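
As a hedged sketch (the test names are placeholders), the 
organization of such tests might look like the following, with 
operator-controlled tests restricted to system administrators 
and all results logged:

def run_self_tests(tests, log, requester_is_admin=False, operator_controlled=False):
    # Operator-controlled tests are limited to system administrators (SC-2).
    if operator_controlled and not requester_is_admin:
        raise PermissionError("operator-controlled tests require administrator role")
    failures = []
    for name, test in tests:
        ok = test()
        log(f"self-test {name}: {'pass' if ok else 'FAIL'}")
        if not ok:
            failures.append(name)   # identify and isolate the failing component
    return failures

# Example placeholder test suite:
power_on_tests = [("memory", lambda: True), ("data-paths", lambda: True)]
# run_self_tests(power_on_tests, print)  -> []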

3.10	TCB Initialization and Recovery 

The recovery and start-up of the TCB must be ensured so that 
the product always remains in a secure state, whether the 
recovery is performed manually or automatically. 

For the CS2 level, TR-2 was assigned from the Federal 
Criteria. No further refinements were made from the Federal 
Criteria.

 TR-2 Basic for Recovery or Start-up

1. Procedures and/or mechanisms shall be provided 
to assure that, after a TCB failure or other 
discontinuity, recovery without protection 
compromise is obtained.

2. If automated recovery and start-up is not 
possible, the TCB shall enter a state where the 
only system access method is via administrative 
interfaces, terminals, or procedures. 
Administrative procedures shall exist to restore 
the system to a secure state (i.e., a state in 
which all the security-policy properties hold).
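
Informally, the recovery decision above might be organized as 
in the sketch below; the callables passed in are hypothetical 
stand-ins for product-specific recovery and maintenance-mode 
mechanisms.

def start_up(try_automated_recovery, enter_maintenance_mode, log):
    # TR-2: recover without protection compromise, or fall back to a
    # state in which only administrative access is possible.
    log("TCB start-up: attempting automated recovery")
    if try_automated_recovery():
        log("secure state restored automatically")
        return "normal"
    log("automated recovery not possible; entering maintenance mode")
    enter_maintenance_mode()   # only administrative interfaces remain available
    return "maintenance"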

3.11	Privileged Operation 

Privileges are associated with functional components so 
that at any given time only those operations that are 
associated with the privilege can be performed. The privileges 
that a product needs must be identified and must cover all the 
security aspects of the product, including the secure 
administration of the product, and should be defined so that 
there is not a single privileged mode for all of the TCB's 
operations. 

For the CS2 level, PO-1 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

 PO-1 Privilege Association with TCB Modules

1. TCB privileges needed by individual functions, 
or groups of functions, shall be identified. 
Privileged TCB calls or access to privileged TCB 
objects, such as user and group registration 
files, password files, security and integrity-
level definition file, role definition file, or 
audit-log file shall also be identified.

2. The identified privileged functions of a TCB 
functional component shall be associated only with 
the privileges necessary to complete their task.
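
A minimal sketch, with hypothetical function and privilege 
names, of associating privileged TCB functions and objects 
with exactly the privileges they need:

# Hypothetical table associating privileged TCB functions and objects
# with the privileges needed to use them (PO-1 item 1).  Each function
# is granted only the privileges necessary for its task (PO-1 item 2).
REQUIRED_PRIVILEGES = {
    "modify_user_registration": {"account_admin"},
    "read_password_file":       {"auth_admin"},
    "append_audit_log":         {"audit_write"},
}

def invoke_privileged(function_name, caller_privileges):
    needed = REQUIRED_PRIVILEGES.get(function_name)
    if needed is None:
        raise KeyError(f"{function_name} is not a recognized privileged function")
    if not needed <= caller_privileges:
        raise PermissionError(f"{function_name} requires privileges {needed}")
    return f"{function_name} authorized"

# invoke_privileged("append_audit_log", {"audit_write"}) -> "append_audit_log authorized"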

3.12	Ease-of-TCB-Use 

If security mechanisms are not easy to use and maintain, 
then both administrative and non-administrative users may be 
tempted to disable the security mechanisms. Therefore, ease 
of use becomes an important element in the administration of 
a secure product and in the creation of privileged 
applications. Ease of use also minimizes errors on the part 
of both administrative and non-administrative users, and can 
serve to minimize the consequences of these errors. 

For the CS2 level, EU-2 was assigned from the Federal 
Criteria. No refinements were made from the Federal Criteria.

EU-2 Ease of Application Programming

1. The TCB shall provide well-defined actions to 
undertake administrative functions. Default 
options shall be provided for security parameters 
of administrative functions.

The TCB shall include fail-safe defaults for the 
policy attributes of the defined subjects and 
objects, as well as user-settable defaults for the 
defined subjects and objects.

2. The TCB shall provide well-defined application 
programming interfaces and programming functions 
(e.g., libraries) for all its policies to support 
the development of applications that can define 
and enforce security policies on application-
controlled subjects and objects. The TCB shall 
enable user-controlled reduction of access rights 
available to applications.
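
The "user-controlled reduction of access rights available to 
applications" in item 2 could look, informally, like the 
sketch below: an application is launched with at most the 
subset of rights the user chooses to pass on. The names are 
illustrative only.

def launch_application(user_rights, requested_rights):
    # The application receives at most the rights the user chose to pass
    # on; anything the user does not hold is excluded (fail-safe default).
    granted = set(requested_rights) & set(user_rights)
    return granted

# A user holding {"read", "write"} starts a viewer with read-only rights:
# launch_application({"read", "write"}, {"read"})  -> {"read"}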

CS2 Assurance 

4.	Introduction

This chapter provides the CS2 development and evaluation 
assurance requirements package using the development and 
evaluation assurance components defined in Volume I and the 
package contained in Volume I, Appendix G of the Federal 
Criteria. The structure of each assurance package follows that 
of the assurance components (i.e., each package consists of 
development process, operational support, development 
environment, development evidence, and evaluation process 
components).

Assurance Package T2+

Assurance package T2+ was chosen for CS2. This package is 
indicated as being T2+ since an additional component was 
included for flaw remediation and a higher level was chosen 
for trusted generation. This basic assurance level is intended 
to include most commercial computer products that are designed 
to satisfy functional requirements. Although most development 
assurance components are required at their lowest levels, the 
requirements of several product-development components are 
extended to capture (1) specific TCB properties, and (2) a 
rudimentary notion of support for product structure. The 
operational support component is also extended to enable 
systematic flaw discovery, tracking, and repair.

The intent of the product development assurance for this 
package is to establish that the external behavior of the 
product conforms to its user level and administrative 
documentation without analysis of the internal structure of 
the product TCB. For this reason, only the claimed TCB 
protection properties and their informal models, TCB 
interface description, and TCB element list are required to 
enable functional testing. Support for TCB structuring is 
limited to process isolation and separation of the protection 
critical TCB elements from the protection non-critical ones. 

The intent of the operational support assurance for this 
package is to establish a minimal level of user and 
administrative guidance and product information that enables 
the correct product installation and use of product security 
features. Similarly, the development environment assurances 
are intended to provide a minimal level of control over 
the product configuration and production. This level of 
development environment assurance is similar to that already 
present in most established commercial development 
organizations.

The development evidence required for this package is 
commensurate with the assurances required. The intent of this 
package is to require the type of assurance evidence that is 
generated during the normal commercial development process. 

The intent of evaluation support assurance is to establish 
that the product, and the context in which it is developed and 
supported, is commensurate with the development assurance 
requirements. At the T2+ level, testing analysis and the 
requirement for independent testing determine whether the 
product meets the functional protection requirements. 
Operational support evaluation assurance determines whether 
the product documentation correctly describes the security 
relevant operations. 

Also for CS2, flaw remediation was included in this 
package. Flaw remediation is important for commercial 
environments since it ensures that flaws (i.e., deficiencies 
in a product that enable a user external to the TCB to violate 
the functional requirements of a protection profile) that are 
discovered by the product consumers will be tracked, 
corrected, and disseminated to the affected customers.

The following table  summarizes the generic assurance 
components that comprise the Basic Development Assurance 
Package (T2+). 

      CS2 Assurance Package Summary
.---------------------------------------.
| Assurance Components           |  T2+ |
|================================|======|
| Development Assurance Components      |
|=======================================|
| Development Process                   |
|--------------------------------+------|
| TCB Property Definition        | PD-2 |
|--------------------------------+------|
| TCB Design                            |
|--------------------------------+------|
|   TCB Element Identification   | ID-2 |
|--------------------------------+------|
|   TCB Interface Definition     | IF-1 |
|--------------------------------+------|
|   TCB Modular Decomposition    | ---- |
|--------------------------------+------|
|   TCB Structuring Support      | SP-1 |
|--------------------------------+------|
|   TCB Design Disciplines       | ---- |
|--------------------------------+------|
| TCB Implementation Support     | ---- |
|--------------------------------+------|
| TCB Testing and Analysis              |
|--------------------------------+------|
|   Functional Testing           | FT-1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Operational Support                   |
|--------------------------------+------|
| User Security Guidance         | UG-1 |
|--------------------------------+------|
| Administrative Guidance        | AG-1 |
|--------------------------------+------|
| Flaw Remediation               | FR-1 |
|--------------------------------+------|
| Trusted Generation             | TG-2 |
|--------------------------------+------|
| Development Environment               |
|--------------------------------+------|
| Life Cycle Definition          | ---- |
|--------------------------------+------|
| Configuration Management       | ---- |
|--------------------------------+------|
| Trusted Distribution           | ---- |
|--------------------------------+------|
| Development Evidence                  |
|--------------------------------+------|
| TCB Protection Properties      | EPP2 |
|--------------------------------+------|
| Product Development            | EPD1 |
|--------------------------------+------|
| Product Testing & Analysis            |
|--------------------------------+------|
|   Functional Testing           | EFT1 |
|--------------------------------+------|
|   Penetration Analysis         | ---- |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Product Support                | EPS1 |
|=======================================|
| Evaluation Assurance Components       |
|=======================================|
| Testing                               |
|--------------------------------+------|
|   Test Analysis                | TA-1 |
|--------------------------------+------|
|   Independent Testing          | IT-1 |
|--------------------------------+------|
| Review                                |
|--------------------------------+------|
|   Development Environment      | ---- |
|--------------------------------+------|
|   Operational Support          | OSR1 |
|--------------------------------+------|
| Analysis                              |
|--------------------------------+------|
|   Protection Properties        | ---- |
|--------------------------------+------|
|   Design                       | ---- |
|--------------------------------+------|
|   Implementation               | ---- |
`---------------------------------------'

4.1	TCB Property Definition

The definition of TCB properties assures the consistency of 
the TCB's behavior. It determines a baseline set of properties 
that can be used by system developers and evaluators to assure 
that the TCB satisfies the defined functional requirements.

For CS2, PD-2 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

PD-2 Informal Property Identification

The developer shall provide informal models for 
the functional components and sub-components of 
the profile. At a minimum, an informal model of 
the access control components shall be provided. 
Each informal model shall include (abstract) data 
structures and operations defining each functional 
component or sub-component, and a description of 
the model properties. The developer shall 
interpret (e.g., trace) the informal models within 
the product TCB. For each model entity, the 
developer shall: (1) identify the TCB elements and 
their TCB interfaces (if any) that implement that 
entity; (2) define the operation of these TCB 
elements, and (3) explain why the operation of 
these elements is consistent with the model 
properties. The developer's interpretation of each 
informal model, which defines the TCB properties, 
shall identify all TCB elements that do not 
correspond to any model entity and shall explain 
why these elements do not render the TCB 
properties invalid.

For the components that are not informally 
modeled, the developer shall interpret the 
functional requirements of the protection profile 
within the product TCB. For each functional 
requirement, the developer shall: (1) identify the 
TCB elements and their TCB interfaces (if any) 
that implement that requirement; (2) describe the 
operation of these TCB elements, and (3) explain 
why the operation of these elements is consistent 
with the functional requirement. The developer's 
interpretation of each functional requirement, 
which describes the TCB properties, shall identify 
all TCB elements that do not correspond to any 
functional requirement and shall explain why these 
elements do not render the TCB properties invalid.
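
As an illustration only (PD-2 mandates no particular format, and the
element, interface, and entity names below are hypothetical), the
required correspondence between model entities and TCB elements can be
recorded as a simple traceability table, sketched here in Python:

    # Hypothetical traceability table: model entities mapped to the TCB
    # elements and TCB interfaces that implement them, with the
    # rationale PD-2 asks for.  Elements that match no model entity are
    # listed separately with an argument for why they leave the TCB
    # properties intact.
    correspondence = {
        "access_control.check": {
            "tcb_elements":   ["acl_manager", "object_label_store"],
            "tcb_interfaces": ["open", "chmod"],
            "rationale": "Every open/chmod is mediated against the ACL, "
                         "consistent with the informal access control model.",
        },
        "audit.record_event": {
            "tcb_elements":   ["audit_daemon", "audit_log"],
            "tcb_interfaces": [],  # internal; not visible at the TCB interface
            "rationale": "Appends a protected record for each mediated event.",
        },
    }
    unmapped_elements = {
        "clock_driver": "Supplies timestamps only; cannot alter access decisions.",
    }

    for entity, entry in correspondence.items():
        print(entity, "->", ", ".join(entry["tcb_elements"]))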

4.2	TCB Element Identification

The identification of TCB elements (hardware, firmware, 
software, code, and data structures) provides the set of 
elements that determine the protection characteristics of a 
product. All assurance methods rely on the correct 
identification of TCB elements either directly or indirectly.

For CS2, ID-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 ID-1: TCB Element Identification

The developer shall identify the TCB elements 
(i.e., software, hardware/firmware code and data 
structures). Each element must be unambiguously 
identified by its name, type, release, and version 
number (if any).
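
A minimal sketch, assuming a hypothetical product: the ID-1
identification can be kept as a machine-readable manifest in which
every TCB element carries a name, type, release, and version.

    # Hypothetical TCB element manifest in the spirit of ID-1: each
    # element is unambiguously identified by name, type, release, and
    # version.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TCBElement:
        name: str     # e.g., a kernel module, data structure, or chip
        kind: str     # "software", "hardware", "firmware", or "data"
        release: str
        version: str

    tcb_manifest = [
        TCBElement("security_kernel",  "software", "R4", "4.2.1"),
        TCBElement("audit_log_format", "data",     "R4", "1.0"),
        TCBElement("mmu_microcode",    "firmware", "B",  "2.3"),
    ]

    for e in tcb_manifest:
        print(f"{e.name} ({e.kind}) release {e.release}, version {e.version}")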

4.3	TCB Interface Definition

The TCB interface establishes the boundary between the TCB 
and its external users and application programs. It consists 
of several components, such as command interfaces (i.e., user 
oriented devices such as the keyboard and mouse), application 
program interfaces (system calls), and machine/processor 
interfaces (processor instructions).

For CS2, IF-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

IF-1: Interface Description

The developer shall describe all external (e.g., 
command, software, and I/O), administrative (i.e., 
privileged), and non-administrative interfaces to 
the TCB. The description shall include those 
components of the TCB that are implemented as 
hardware and/or firmware if their properties are 
visible at the TCB interface.

The developer shall identify all call conventions 
(e.g., parameter order, call sequence 
requirements) and exceptions signaled at the TCB 
interface.
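
For illustration only (the interface names, parameters, and exception
codes below are invented), one entry of such a TCB interface
description might record the calling convention and the exceptions
signaled for each call:

    # Hypothetical fragment of a TCB interface description: each entry
    # records whether the interface is administrative, its parameter
    # order, any call-sequence requirements, and the exceptions it can
    # signal at the TCB interface.
    tcb_interface = {
        "sys_open": {
            "administrative": False,
            "parameters": ["path", "mode"],     # parameter order matters
            "call_sequence": "no special sequencing requirements",
            "exceptions": ["EACCES", "ENOENT", "EMFILE"],
        },
        "sys_set_audit_mask": {
            "administrative": True,             # privileged interface
            "parameters": ["event_mask"],
            "call_sequence": "caller must hold the audit privilege",
            "exceptions": ["EPERM"],
        },
    }

    for name, entry in tcb_interface.items():
        kind = "administrative" if entry["administrative"] else "non-administrative"
        print(f"{name}: {kind}, signals {', '.join(entry['exceptions'])}")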

4.4	TCB Structuring Support

Structuring the TCB into modules is necessary. However, the 
modular decomposition does not necessarily reflect the run-
time enforcement of the TCB structuring, since the separation 
of modules may not be supported by run-time mechanisms. The 
run-time enforcement of internal TCB structuring adds a 
measure of assurance that the TCB elements that are critical 
to the enforcement of the protection functions are separate 
from the non-critical elements. Also, run-time enforcement of 
TCB structuring helps separate protection-critical TCB 
elements from each other, thereby enforcing the separation of 
protection concerns and minimizing the common mechanisms 
shared between protection-critical elements.

For CS2, SP-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 SP-1: Process Isolation

The TCB shall maintain process isolation.

4.5	Developer Functional Testing

Functional testing establishes that the TCB interface 
exhibits the properties necessary to satisfy the requirements 
of the protection profile. It provides assurance that the TCB 
satisfies at least its functional protection requirements.

For CS2, FT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 FT-1: Conformance Testing

The developer shall test the TCB interface to show 
that all claimed protection functions work as 
stated in the TCB interface description.

The developer shall correct all flaws discovered 
by testing and shall retest the TCB until the 
protection functions are shown to work as claimed.
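
A minimal conformance-test sketch, assuming a hypothetical tcb_open()
stub in place of a real TCB interface: the test exercises one claimed
protection function (default-deny access) and checks the outcome
against the interface description.

    # The tcb_open() stub below stands in for the real TCB interface;
    # it is not part of the profile.  The test passes only if
    # unauthorized access is refused with the documented exception.
    def tcb_open(user, path):
        acl = {"/etc/audit.log": {"root"}}      # stand-in access control list
        if user not in acl.get(path, set()):
            raise PermissionError("EACCES")     # documented exception
        return "handle"

    def test_unauthorized_open_is_denied():
        try:
            tcb_open("alice", "/etc/audit.log")
        except PermissionError:
            return "PASS"
        return "FAIL: access granted contrary to the interface description"

    print(test_unauthorized_open_is_denied())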

4.6	User's Guidance

User's guidance is an operational support assurance 
component that ensures that usage constraints assumed by the 
protection profile are understood by the users of the product. 
It is the primary means available for providing product users 
with the necessary background and specific information on how 
to correctly use the product's protection functionality.

For CS2, UG-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 UG-1: Users' Guide

The developer shall provide a Users' Guide which 
describes all protection services provided and 
enforced by the TCB. The Users' Guide shall 
describe the interaction between these services 
and provide examples of their use. The Users' 
Guide may be in the form of a summary, chapter, or 
manual. The Users' Guide shall specifically 
describe user responsibilities. These shall 
encompass any user responsibilities identified in 
the protection profile.

4.7	Administrative Guidance

Administrative guidance is an operational support assurance 
component that ensures that the environmental constraints 
assumed by the protection profile are understood by 
administrative users and operators of the IT product. It is 
the primary means available to the developer for providing 
administrators and operators with detailed, accurate 
information on how to configure and install the product, 
operate the IT product in a secure manner, make effective use 
of the product's privileges and protection mechanisms to 
control access to administrative functions and data bases, 
and avoid pitfalls and improper use of the administrative 
functions that would compromise the TCB and user security.

For CS2, AG-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 AG-1: Basic Administrative Guidance

The developer shall provide a Trusted Facility 
Manual intended for the product administrators 
that describes how to use the TCB security 
services (e.g., Access Control, System Entry, or 
Audit) to enforce a system security policy. The 
Trusted Facility Manual shall include the 
procedures for securely configuring, starting, 
maintaining, and halting the TCB. The Trusted 
Facility Manual shall explain how to analyze audit 
data generated by the TCB to identify and document 
user and administrator violations of this policy. 
The Trusted Facility Manual shall explain the 
privileges and functions of administrators. The 
Trusted Facility Manual shall describe the 
administrative interaction between security 
services.

The Trusted Facility Manual shall be distinct from 
User Guidance, and encompass any administrative 
responsibilities identified in security 
management.

4.8	Flaw Remediation Procedures

Flaw remediation is an operational support assurance 
component that ensures that flaws (i.e., deficiencies in a 
product that enable a user external to the TCB to violate the 
functional requirements of a protection profile) that are 
discovered by the product consumers will be tracked, 
corrected, and disseminated to the affected customers.

For CS2, FR-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 FR-1: Basic Flaw Remediation

Flaw Tracking Procedures: The developer shall 
establish a procedure to track all reported 
protection flaws in each release of the product. 
The tracking system shall include a description of 
the nature and effect of each flaw and the status 
of finding a correction to the flaw.

Flaw Repair Procedures: The developer shall 
establish a procedure to identify corrective 
actions for protection flaws.

Customer Interaction Procedures: The developer 
shall provide flaw information and corrections to 
registered customers.
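
As a sketch of the bookkeeping FR-1 calls for (the fields and flaw
identifier are hypothetical), a flaw-tracking record can capture the
nature and effect of each reported protection flaw, the affected
release, and the status of its correction:

    # Hypothetical flaw-tracking records; one entry per reported
    # protection flaw and product release.
    flaw_reports = [
        {
            "id": "FLAW-0042",
            "release": "4.2.1",
            "nature": "Audit record dropped when the log volume fills",
            "effect": "Loss of accountability during the overflow period",
            "status": "correction identified",  # open / identified / shipped
            "shipped_to_registered_customers": False,
        },
    ]

    pending = [f["id"] for f in flaw_reports
               if not f["shipped_to_registered_customers"]]
    print("Corrections awaiting distribution:", ", ".join(pending))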

4.9	Trusted Generation

Trusted generation is an operational support assurance 
component that ensures that the copy of the product's TCB that 
is configured and activated by the consumer exhibits the same 
protection properties as the master copy of the product's TCB 
that was evaluated for compliance with the protection profile. 
The trusted generation procedures must provide some 
confidence that the consumer will be aware of what product 
configuration parameters can affect the protection properties 
of the TCB. The procedures must encourage the consumer to 
choose parameter settings that are within the bounds assumed 
during the product evaluation.

For CS2, TG-2 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

TG-2: Trusted Generation With Fail-Safe Defaults

The developer shall establish and document the 
procedures that a customer must perform to 
generate an operational TCB from the delivered 
copy of the master TCB. The customer documentation 
shall identify any system parameters, which are 
initialized or set during system generation, that 
affect the TCB's conformance to the protection 
profile and state the acceptable ranges of values 
for those parameters. The product shall be 
delivered with each of these parameters set to its 
fail-safe defaults.
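
A minimal sketch, with invented parameter names and ranges: the
generation documentation can pair each security relevant parameter
with its fail-safe default and the range of values that keeps the TCB
within the evaluated configuration, and a check can flag settings that
drift outside those bounds.

    # Hypothetical system-generation parameters delivered at fail-safe
    # defaults, each with its documented acceptable range.
    generation_parameters = {
        "audit_enabled":        {"default": True, "acceptable": [True]},
        "max_failed_logins":    {"default": 3,    "acceptable": range(1, 6)},
        "session_lock_minutes": {"default": 10,   "acceptable": range(1, 31)},
    }

    def out_of_bounds(settings):
        """Return the parameters set outside their documented ranges."""
        return [name for name, value in settings.items()
                if value not in generation_parameters[name]["acceptable"]]

    # A customer configuration that disables auditing drifts outside
    # the evaluated bounds and is reported:
    print(out_of_bounds({"audit_enabled": False,
                         "max_failed_logins": 3,
                         "session_lock_minutes": 10}))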

4.10	Evidence of TCB Protection Properties

The documentation of the TCB protection properties includes 
the definition of the functional component requirements, 
their modeling (if any), and their interpretation within a 
product's TCB. For each requirement of a protection profile, 
a description, definition (an informal, descriptive 
specification), or a formal specification of the TCB 
components and their operation corresponding to the 
requirement must be provided.

For CS2, EPP-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EPP-1 Evidence of TCB Correspondence to the Functional 
Requirements

The developer shall provide documentation which 
describes the correspondence between the 
functional component requirements and the TCB 
elements and interfaces. The TCB properties, which 
are defined by this correspondence, shall be 
explained in this documentation.

4.11	Evidence of Product Development

Product development evidence consists of the TCB design 
evidence, including the documentation of the TCB interface, 
TCB elements, TCB structure, TCB structuring support, and TCB 
design disciplines. The TCB implementation evidence includes 
TCB source code and the processor hardware and firmware 
specifications.

For CS2, EPD-2 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EPD-2: Description Of The TCB External Interface

The developer shall provide documentation which 
describes the correspondence between the 
functional component requirements and the TCB 
elements and interfaces. The developer shall also 
provide an informal access control model and its 
interpretation within the TCB. The TCB properties, 
which are defined by this correspondence, shall be 
explained in this documentation.

4.12	Evidence of Functional Testing

Functional testing evidence includes the testing itself, 
the test plans, and the test result documentation. Test plans 
consist of: the description, definition, or specification of 
the test conditions; the test data, which consists of the test 
environment set-up, the test parameters, and the expected 
outcomes; and a description of the test coverage. 

For CS2, EFT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EFT-1: Evidence of Conformance Testing

The developer shall provide evidence of the 
functional testing that includes the test plan, 
the test procedures and the results of the 
functional testing.

4.13	Evidence of Product Support

Product support evidence consists of the development 
environment and operational support documentation and tools. 
The development environment evidence includes the 
documentation of the product life-cycle process, 
configuration management procedures enforced, and the trusted 
distribution mechanisms and procedures used. It also 
includes: the identification of the tools used in the product 
development, configuration management, and trusted 
distribution; and the characteristics that make those tools 
suitable for the development of product protection.

For CS2, EPS-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 EPS-1: Evidence of Basic Product Support

The developer shall provide evidence that 
describes the policies, procedures, and plans 
established by the developer to satisfy the 
Operational Support and Development Environment 
requirements of the protection profile. 

4.14	Test Analysis

Test analysis determines whether the product meets the 
functional protection requirements defined in the protection 
profile. Functional testing is based on the operational 
product, the TCB's functional properties, the product's 
operational support guidance, and other producer 
documentation as defined by the development evidence 
requirements. Functional test analysis compares the achieved 
test results against the expected results derived from the 
development evidence.

For CS2, TA-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 TA-1: Elementary Test Analysis

The evaluator shall assess whether the producer 
has performed the activities defined in the 
development assurance requirements of the 
protection profile for functional testing and 
whether the producer has documented these 
activities as defined in the development evidence 
requirements of the protection profile. The 
evaluator shall analyze the results of the 
producer's testing activities for completeness of 
coverage and consistency of results. The evaluator 
shall determine whether the product's protection 
properties, as described in the product 
documentation, have been tested. The evaluator 
shall assess testing results to determine whether 
the product's TCB works as claimed.

4.15	Independent Testing 

Independent testing determines whether the product's TCB 
meets the functional protection requirements as defined in the 
functionality chapter of this Protection Profile. Testing is 
based on the operational product, the TCB's functional 
properties, the product's operational support guidance, and 
other producer documentation as defined by the Development 
Evidence requirements.

For CS2, IT-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 IT-1: Elementary Independent Testing

A tester, independent of the producer or 
evaluator, shall perform functional and elementary 
penetration testing. This testing shall be based 
on the product's user and administrative 
documentation, and on relevant known penetration 
flaws. Satisfactory completion consists of 
demonstrating that all user-visible security 
enforcing functions and security-relevant 
functions work as described in the product's user 
and administrative documentation and that no 
discrepancies exist between the documentation and 
the product. Test results of the producer shall be 
confirmed by the results of independent testing. 
The evaluator may selectively reconfirm any test 
result.

If the independent testing is performed at beta-
test sites, the producer shall supply the beta-
test plan and the test results. The evaluator 
shall review the scope and depth of beta testing 
with respect to the required protection 
functionality, and shall verify independence of 
both the test sites and the producer's and beta-
test user's test results. The evaluator shall 
confirm that the test environment of the beta-test 
site(s) adequately represents the environment 
specified in the protection profile.

4.16	Operational Support Review

Operational support review establishes the level of review 
required to determine whether the product meets the 
requirements as defined in the protection profile's 
Development Assurance subsections for Operational Support, 
including, at the CS2 level, the User and Administrative 
Guidance documents.

For CS2, OSR-1 was assigned from the Federal Criteria. No 
refinements were made from the Federal Criteria.

 OSR-1 Elementary Operational Support Review

The evaluator shall review all documentation 
focused on the activities of product use (e.g., 
Users Manuals) and product administration 
including installation, operation, maintenance, 
and trusted recovery (e.g., Trusted Facility 
Management Manuals). This review shall assess the 
clarity of presentation, difficulty in locating 
topics of interest, ease of understanding, and 
completeness of coverage. The need for separate 
manuals dedicated to protection-relevant aspects 
of the product should be assessed for 
effectiveness.

COMMERCIAL SECURITY 3 (CS3)

CS3 compliant products provide enhanced protection 
beyond that of the CS1 and CS2 Protection Profiles 
by providing administrative and access control 
features to centrally control access to information 
and other resources based on roles. Through the use 
of role based access controls, a variety of 
organization specific non-discretionary integrity 
and confidentiality policies can be specified and 
enforced. In addition, CS3 provides stronger 
authentication measures and more administrative 
tools, and requires a greater degree of assurance 
evidence.

          CS3 Functional Component Summary
.------------------------------------------------------.
|                                  | Component |       |
| Component Name                   |   Code    | Level |
|======================================================|
| Security Policy Support:                             |
|----------------------------------+-----------+-------|
|  Identification & Authentication |    I&A    |   4   |
|----------------------------------+-----------+-------|
|  System Entry                    |    SE     |   3   |
|----------------------------------+-----------+-------|
|  Trusted Path                    |    TP     |   1   |
|----------------------------------+-----------+-------|
|  Audit                           |    AD     |   3   |
|----------------------------------+-----------+-------|
|  Access Control                  |    AC     |   2+  |
|----------------------------------+-----------+-------|
|  Availability:                                       |
|----------------------------------+-----------+-------|
|    Resource Allocation           |    AR     |   1   |
|----------------------------------+-----------+-------|
|  Security Management             |    SM     |   3   |
|----------------------------------+-----------+-------|
| Reference Mediation              |    RM     |   1   |
|----------------------------------+-----------+-------|
| TCB Protection                   |    P      |   1   |
|----------------------------------+-----------+-------|
| Physical Protection              |    PP     |   1   |
|----------------------------------+-----------+-------|
| Self Checking                    |    SC     |   3   |
|----------------------------------+-----------+-------|
| TCB Initialization & Recovery    |    TR     |   3   |
|----------------------------------+-----------+-------|
| Privileged Operations            |    PO     |   2   |
|----------------------------------+-----------+-------|
| Ease-of-Use                      |    EU     |   3   |
`------------------------------------------------------'

    CS3 Assurance Package Summary
.---------------------------------------.
| Assurance Components           |  T3+ |
|================================|======|
| Development Assurance Components      |     
|=======================================|
| Development Process                   |
|--------------------------------+------|
| TCB Property Definition        | PD-2 |
|--------------------------------+------|
| TCB Design                            |
|--------------------------------+------|
|   TCB Element Identification   | ID-2 |
|--------------------------------+------|
|   TCB Interface Definition     | IF-1 |
|--------------------------------+------|
|   TCB Modular Decomposition    | ---- |
|--------------------------------+------|
|   TCB Structuring Support      | SP-1 |
|--------------------------------+------|
|   TCB Design Disciplines       | ---- |
|--------------------------------+------|
| TCB Implementation Support     | ---- |
|--------------------------------+------|
| TCB Testing and Analysis              |
|--------------------------------+------|
|   Functional Testing           | FT-1 |
|--------------------------------+------|
|   Penetration Analysis         | PA-1 |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Operational Support                   |
|--------------------------------+------|
| User Security Guidance         | UG-1 |
|--------------------------------+------|
| Administrative Guidance        | AG-2+|
|--------------------------------+------|
| Flaw Remediation               | FR-2 |
|--------------------------------+------|
| Trusted Generation             | TG-2 |
|--------------------------------+------|
| Development Environment               |
|--------------------------------+------|
| Life Cycle Definition          | LC-1 |
|--------------------------------+------|
| Configuration Management       | CM-1 |
|--------------------------------+------|
| Trusted Distribution           | ---- |
|--------------------------------+------|
| Development Evidence                  |
|--------------------------------+------|
| TCB Protection Properties      | EPP2 |
|--------------------------------+------|
| Product Development            | EPD1 |
|--------------------------------+------|
| Product Testing & Analysis            |
|--------------------------------+------|
|   Functional Testing           | EFT1 |
|--------------------------------+------|
|   Penetration Analysis         | EPA1 |
|--------------------------------+------|
|   Covert Channel Analysis      | ---- |
|--------------------------------+------|
| Product Support                | EPS1 |
`---------------------------------------'
|=======================================|
| Evaluation Assurance Components       |
|=======================================|
| Testing                               |
|--------------------------------+------|
|   Test Analysis                | TA-2 |
|--------------------------------+------|
|   Independent Testing          | IT-1 |
|--------------------------------+------|
| Review                                |
|--------------------------------+------|
|   Development Environment      | DER1 |
|--------------------------------+------|
|   Operational Support          | OSR1 |
|--------------------------------+------|
| Analysis                              |
|--------------------------------+------|
|   Protection Properties        | ---- |
|--------------------------------+------|
|   Design                       | DA-1 |
|--------------------------------+------|
|   Implementation               | ---- |
`---------------------------------------'

CS3 Rationale

2.17	Introduction

As outlined in the Federal Criteria, this rationale 
describes the protection philosophy, how the security 
features are intended to be used, the assumptions about the 
environment in which a compliant product is intended to 
operate, the threats within that environment, and the security 
features and assurances that counter these threats. At the CS3 
level, the features used to counter threats and the strength 
of the assurance evidence are enhanced over CS1 and CS2; these 
enhancements are indicated in the text through bold italics.

2.17.1	Protection Philosophy

Any discussion of protection necessarily starts from a 
protection philosophy, i.e., what it really means to call the 
product "secure." In general, products will control access to 
information and other resources through the use of specific 
security features so that only properly authorized 
individuals or processes acting on their behalf will be 
granted access. For CS3, four fundamental requirements are 
derived from this statement of protection:

o	Access authorization

o	Accountability

o	Assurance 

o	Availability of Service

The totality of the functionality that enforces the access 
authorization and accountability protection philosophy 
comprises the hardware, software, and firmware of the Trusted 
Computing Base (TCB). CS3 requires the TCB to be self-
protecting and resistant to bypass so that it is effective at 
countering identified threats. CS3 also requires effective 
management of security attributes and configuration 
parameters. The assurance protection philosophy comprises the 
development process, operational support, development 
environment, development evidence, and evaluation process 
assurances. Each of these is explained below.

2.17.1.1	Access Authorization

The access authorization portion of the philosophy of 
protection for this profile addresses subject and object 
access mediation. For CS3 compliant products, access 
authorization has been further refined to include system 
entry, subject and object mediation based on system entry, 
subject and object mediation based on role identifiers, and 
privileged operations.

2.17.1.1.1	System Entry

CS3 provides the capability for a system administrator to 
establish, maintain, and protect information from 
unauthorized access, and defines the identities of and 
conditions under which users may gain entry into the system. 
These system entry controls are based on user identification, 
role membership, time, location, and method of entry. CS3 
strengthens the requirement for locking interactive sessions 
by requiring the display device to be cleared or overwritten 
to make the current contents of the screen unreadable.

2.17.1.1.2	Subject and Object Access Mediation

CS1 and CS2 provide protected access to resources and 
objects. CS3 compliant products also provide the capability 
of specifying and enforcing access control decisions based on 
roles [12][13]. In many organizations, the end users do not 
"own" the information and the programs to which they are 
allowed access. For these organizations, the corporation or 
agency is the actual "owner" of the system objects as well as 
the programs that process them. Control is often based on 
employee functions rather than on ownership.

Access control decisions are often determined by the roles 
individual users take on as part of an organization. The 
definition of a role includes the specification of duties, 
responsibilities, and qualifications. For example, the roles 
an individual associated with a hospital can assume include 
doctor, nurse, clinician, and pharmacist. Roles in a bank 
include teller, loan officer, and accountant. Roles can also 
apply to military systems; for example, target analyst, 
situation analyst, and traffic analyst are common roles in 
tactical systems. A Role Based Access Control (RBAC) policy 
bases access control decisions on the functions a user is 
allowed to perform within an organization. Under this policy, 
the users cannot pass access permissions to other users at 
their discretion.

For each role, a set of transactions associated with the 
role is maintained. A transaction can be thought of as a 
transformation procedure [12] (a program or a portion of a 
program) plus a set of associated data items. In addition, 
each role has an associated set of individual members. 

The determination of membership and the allocation of 
transactions to a role is in compliance with organization 
specific non-discretionary policies. These policies can be 
derived from existing laws, ethics, regulations, or generally 
accepted practices. These policies are non-discretionary in 
the sense that they are unavoidably imposed on all users. 

For subject and object access mediation, CS3 also provides 
for additional time and location access control attributes. 
At a minimum, these attributes include the user's port of 
entry.
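
A minimal RBAC sketch under the description above, with invented role, 
user, and transaction names: each role carries a set of transactions 
and a set of members, both administered centrally, and the access 
decision is non-discretionary in that users cannot extend it 
themselves.

    # Roles map to the transactions they may invoke and the individuals
    # who hold the role; both sets are maintained by administrators.
    roles = {
        "teller":       {"members": {"alice"},
                         "transactions": {"post_deposit", "post_withdrawal"}},
        "loan_officer": {"members": {"bob"},
                         "transactions": {"approve_loan"}},
    }

    def permitted(user, transaction):
        """Allow only if some role held by the user includes the transaction."""
        return any(user in r["members"] and transaction in r["transactions"]
                   for r in roles.values())

    print(permitted("alice", "post_deposit"))   # True
    print(permitted("alice", "approve_loan"))   # False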

2.17.1.1.3	Privileges

CS3 supports and promotes the separation and use of 
privileges for TCB modules. A privilege enables a subject to 
perform a security relevant operation that, by default, is 
denied. Privileges cover all security aspects of a product, 
including TCB operations performed by system administrators. 
CS3 compliant products have tightly controlled privilege 
definitions as well as control over subjects that hold 
privileges. 
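
Sketched below with hypothetical module and privilege names, the 
default-deny character of privileges: a TCB module may perform a 
security relevant operation only if the specific privilege has been 
assigned to it.

    # Privileges assigned per TCB module; anything not listed is denied.
    module_privileges = {
        "audit_daemon":   {"write_audit_trail"},
        "user_registrar": {"modify_user_registry"},
    }

    def authorized(module, privilege):
        return privilege in module_privileges.get(module, set())  # default deny

    print(authorized("audit_daemon", "write_audit_trail"))     # True
    print(authorized("audit_daemon", "modify_user_registry"))  # False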

2.17.1.2	Accountability

The accountability portion of the philosophy of protection 
for this profile addresses user identification and 
authentication (I&A), requirements for security auditing, and 
a Trusted Path between a user and the operating system. Each 
of these is explained below.

2.17.1.2.1	Identification and Authentication

User identification is required to support access control 
and security auditing. This includes the capability to 
establish, maintain, and protect a unique identifier for each 
authorized user. User identification is functionally 
dependent on authentication. Authentication is a method of 
validating a person as a legitimate user.

User authentication in most computer systems has been 
provided primarily through the use of passwords. CS2 supports 
a variety of password features that give the product a great 
amount of flexibility in the generation of passwords, in 
password security, password features, and password 
administration. For most products, a great deal of confidence 
is placed on maintaining the privacy of passwords belonging 
to individuals. I&A prevents unauthorized individuals from 
logging into the product; therefore, password management is 
essential to secure product operations. The risk of losing a 
password is addressed within CS2 through promoting the use of 
stringent password management practices.

In addition, CS2 allows for stronger authentication 
approaches. CS2 specifies that a unique identifier be 
associated with each trusted subject such as print spoolers, 
database management system services, and transaction 
processing monitors. It also requires the TCB to maintain, 
protect, and display status information for all active users 
and all enabled or disabled user identities or accounts.

CS3 also provides for stronger authentication mechanisms 
for those commercial and government environments that need 
such assurance, such as law enforcement agencies, nuclear 
facilities, and commercial airports. These other approaches 
can be categorized as "something a user is," which can be 
indicated through the use of a unique characteristic that a 
person possesses, or "something a user has," such as a smart 
card. For example, biometrics is a "something you are" 
approach for identifying individuals through the use of a 
unique physical characteristic associated with a person such 
as a fingerprint or a retina pattern. In many respects, the 
biometrics approach to user identification is cleaner and 
more secure than a password mechanism. This method 
eliminates the concern over the compromise of a password. 
However, while biometric devices are currently available, 
their expense makes them impractical for most applications. 
"Something a user has" requires a physical device that users 
must have in their possession at authentication time. Usually, 
these devices require the user to enter a Personal 
Identification Number (PIN) in case the device is lost or 
stolen.
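
The sketch below is illustrative only and not a specified mechanism: 
it combines "something a user knows" (a password) with "something a 
user has" (a token that computes a response to a challenge, released 
by a PIN). All names and secrets are invented, and a real product 
would use salted, iterated password hashing and a vetted token 
protocol.

    import hashlib
    import hmac

    # Hypothetical registration data for one user.
    password_digests = {"carol": hashlib.sha256(b"correct horse").hexdigest()}
    token_secrets    = {"carol": b"token-secret"}
    token_pins       = {"carol": "4821"}

    def authenticate(user, password, pin, token_response, challenge):
        # Factor 1: something the user knows.
        knows = hmac.compare_digest(
            hashlib.sha256(password.encode()).hexdigest(),
            password_digests.get(user, ""))
        # Factor 2: something the user has (token unlocked by a PIN).
        expected = hmac.new(token_secrets.get(user, b""),
                            challenge, "sha256").hexdigest()
        has = (token_pins.get(user) == pin and
               hmac.compare_digest(expected, token_response))
        return knows and has

    challenge = b"nonce-123"
    response = hmac.new(token_secrets["carol"], challenge, "sha256").hexdigest()
    print(authenticate("carol", "correct horse", "4821", response, challenge))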

2.17.1.2.2	Audit

For most secure products, a capability must exist to audit 
the security relevant events. As each user performs security 
relevant tasks, the product must record the user identifier, 
the action performed, and the result in a security log. For 
CS31compliant products, a capability is specified to allow a 
system administrator to access and evaluate audit 
information. This capability provides a method of protection 
in the sense that all security relevant events that occur 
within a computer system can be logged and the responsible 
user held accountable for his/her actions. Audit trails are 
used to detect and deter penetration of a computer system and 
to reveal activity that identifies misuse.

CS3 provides for an effective audit mechanism by supporting 
the following basic security characteristics (a minimal 
audit-record sketch follows this list). It provides the 
ability to:

o	review the use of I&A mechanisms;

o	discover the introduction of objects into a user's 
address space;

o	discover the deletion of objects; 

o	discover actions taken by computer operators and 
system administrators;

o	audit attempts to violate resource allocation limits;

o	protect the audit data so that access to it is limited 
to system administrators that are authorized to 
examine audit information;

o	discover the use of privileges, such as changing the 
ownership of an object;

o	have the audit mechanism act as a deterrent against 
penetrators or hackers; and

o	use audit reduction tools for assessing the damage 
that may result in the event of a violation of the 
implemented security policy. These tools have the 
capability of selectively reviewing the actions of one 
or more users or roles, actions performed on a 
specific object or system resource, and actions 
associated with specific access control attributes.
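
A minimal audit-record sketch, with hypothetical field values: each 
security relevant event records the user, role, action, object, and 
outcome, and a reduction query selects records by any combination of 
these attributes.

    # Hypothetical audit trail; each record corresponds to one mediated,
    # security relevant event.
    audit_trail = [
        {"user": "alice", "role": "teller", "action": "open",
         "object": "/accounts/1001", "outcome": "granted"},
        {"user": "bob", "role": "loan_officer", "action": "chown",
         "object": "/accounts/1001", "outcome": "denied"},
    ]

    def reduce_audit(trail, **criteria):
        """Select the records matching every given attribute."""
        return [r for r in trail
                if all(r.get(k) == v for k, v in criteria.items())]

    print(reduce_audit(audit_trail, user="bob"))
    print(reduce_audit(audit_trail, object="/accounts/1001", outcome="denied"))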

2.17.1.3	Availability of Service

CS3 promotes the continuous accessibility and usability of 
resources. The TCB provides the capability to detect and 
recover from discontinuity of service using some combination 
of automatic and procedural techniques. Also, resource 
allocation requirements place restrictions on the number of 
subjects and objects a user may have allocated at any given 
time. This prevents one individual user from denying access 
to another user's subject and object space.

2.17.1.4	Assurance

Assurance addresses all areas of product development 
assurance and evaluation assurance. Development assurance 
addresses the development process, operational support, the 
development environment, and the development evidence. 
Development process assurance defines the additional efforts 
that a developer must undertake to satisfy the assurance 
objectives while creating the product. It specifies how the 
TCB should be designed and supported by the implementation as 
well as how it should be tested. Operational support assurance 
defines the documentation of the security features for both 
administrative and non-administrative users as well as 
requirements for TCB flaw remediation and TCB generation. 
Development environment assurance includes requirements for 
defining the product's life cycle and specific features for 
configuration management. Development evidence assurance 
defines the TCB's protection properties, details the 
requirements for product testing and analysis, and defines the 
requirements for product support. Evaluation assurance 
establishes that the product, and the context in which it is 
developed and supported, is commensurate with the development 
assurance requirements.

The T3+ Assurance Package was chosen for CS3. This package 
is indicated as being T3+ since an additional component was 
included for flaw remediation. This enhanced assurance level 
is intended to include the best of the commercial computer 
products designed to satisfy functional requirements. As 
such, this package includes several extensions to the 
assurance components of the previous two packages. 

The intent of product development assurance for this 
package is both to establish that the external behavior of the 
product conforms to its user level and administrative 
documentation and to provide visibility into the internal 
structure of the product's TCB. For this reason, requirements 
for Descriptive Interface Specifications (DIS) and modular 
decomposition have been added. TCB element identification and 
security functional testing have also been extended and 
penetration testing requirements have been provided to 
support the added assurances of external behavior. 

The intent of the operational support assurance for this 
package is to establish a level of user and administrative 
guidance and product information that enables the correct 
product installation and the use of product security features. 
The developer is required to establish and document a policy 
for responding to customer inquiries and flaw remediation. 
Similarly, the development environment assurances are 
intended to provide a level of control over the product 
configuration and production, including well-defined coding 
standards and strict configuration management processes. This 
level of development environment assurance is similar to that 
used in the most advanced commercial development 
organizations.

The development evidence required for this package is 
commensurate with the assurances required. The intent of this 
package is to require the type of assurance evidence that is 
generated during commercial development oriented toward 
high-quality products. 

At the T3+ level, evaluation support assurance determines 
whether the product meets the functional requirements for 
testing analysis and for independent testing. Operational 
support evaluation assurance determines whether the product 
documentation correctly describes the security relevant 
operations. Development environment assurance determines 
whether the product meets the requirements as defined in the 
Protection Profile's development assurance subsections. 
Design assurance determines whether the product meets the 
design requirements as defined in the Development Process 
Assurance section of this Protection Profile.

Also for CS3, flaw remediation was included in this 
package. Flaw remediation is important for commercial 
environments since it ensures that flaws (i.e., deficiencies 
in a product that enable a user external to the TCB to violate 
the functional requirements of a protection profile) that are 
discovered by the product consumers will be tracked, 
corrected, and disseminated to the affected customers. 
Vendors are required to separate protection-relevant fixes 
from those that are not protection-relevant and must document 
points of contact for customer error reports.

2.17.1.5	Intended Method of Use

All individual users (both administrative and non-
administrative users) are assigned a unique user identifier. 
This user identifier supports individual accountability. The 
operating system authenticates the claimed identity of the 
user before allowing the user to perform any further actions. 
Upon successful authentication, users are restricted to 
accessing programs, transactions, and information in a manner 
that is consistent with their assigned role(s). 

Products that comply with the CS3 Protection Profile are 
provided with the capability of assigning privileges to TCB 
modules. These privileges are used to control access to user 
and role registration files, password files, and audit trails. 
Privileges are associated with functional components so that 
only the privileges necessary to complete a security relevant 
task can be assigned at a given time. Also, privileges are 
associated with TCB operations performed by system 
administrators. This capability is particularly important to 
prevent a "privileged user" or "superuser" from having a wide 
set of privileges when only a subset is needed.

In addition, CS3 provides administrative and access control 
capabilities that allow for the central administration of a 
non-discretionary access control policy based on roles. A role 
specifies a user's set of transactions that allow the user to 
access resources through specific functions. Transactions can 
only be allocated to roles by system administrators. 
Membership in a role can only be granted and revoked by system 
administrators.

Products that comply with CS3 specifications are intended 
to be used within the following operational constraints:

o	The information system is designed to be administered 
as a unique entity by a single organization.

o	The information system is designed to manage 
computing, storage, input/output, and to control the 
sharing of resources among multiple users and computer 
processes.

o	The administrative and non-administrative users are 
identified as distinct individuals.

o	For role based access control, administrators are 
responsible for interpreting and enforcing 
organizational policies and protection guidelines 
that are derived from existing laws, ethics, 
regulations, or generally accepted practices.

o	The information system provides facilities for real-
time interaction with users that have access to input/
output devices.

o	System administrators are selectively assigned 
privileges that are minimally necessary to perform 
their security-related tasks.

2.17.2	Environmental Assumptions

A product designed to meet the CS3 Protection Profile is 
intended to be a general purpose, multi-user operating system 
that runs on a workstation, minicomputer, or mainframe. 
CS3 compliant products are expected to be used for both 
commercial and government environments. The information being 
processed for both commercial and government environments may 
be unclassified, sensitive-but-unclassified, or single-level 
classified, but not multi-level classified information.

The following specific environmental conditions have been 
assumed in specifying CS3:

o	The product hardware base (e.g., CPU, printers, 
terminals, etc.), firmware, and software will be 
protected from unauthorized physical access.

o	There will be one or more personnel assigned to manage 
the product including the security of the information 
it contains.

o	The operational environment will be managed according 
to the operational environment documentation that is 
required in the assurance chapter of the Protection 
Profile.

o	Access control to information and other resources is 
determined by the roles that individual users have.

o	The IT product provides a cooperative environment for 
users to accomplish some task or group of tasks.

o	The processing resources of the IT product, including 
all terminals, are assumed to be located within user 
spaces that have physical access controls established.

o	The IT product provides facilities for some or all of 
the authorized users to create programs that use an 
Application Programming Interface (API) to enable them 
to protect themselves and their objects from 
unauthorized use.

o	Fail-safe defaults are included for the access control 
attributes for the defined subjects and objects for 
the product.

2.17.3	Expected Threats

In general, the choice of Protection Profile depends upon 
the level of security that is required for that particular 
organizational environment. The lowest level, the CS1 level, 
is intended for those commercial and government environments 
where all the system personnel are trusted and all the data 
on the system is at the same classification level. For 
example, a government agency where all personnel have a 
government clearance, all data is unclassified, and there are 
no outside network connections would be an ideal candidate 
for CS1, i.e., the threats to be countered are such that only 
a minimal level of trust is needed. However, most 
commercial and government environments are more complex and 
require a higher degree of trust. CS2 addresses the security 
needs for the mainstream commercial and government 
environments. It provides a higher level of trust for those 
organizations that need to enforce a security policy where 
there is no need for different classifications of data. CS3 
is intended to provide the highest level of trust for 
commercial and government environments. It is intended to be 
used in those environments where a great deal of trust is 
required, such as in law enforcement agencies, nuclear 
facilities, or commercial airports. It provides the strongest 
features, mechanisms, and assurances to counter these 
threats.

A product that is designed to meet the CS3 Protection 
Profile and operate within its assumed environment will 
provide capabilities to counter these threats. It should be 
noted, however, that although a product may faithfully 
implement all the features and assurances specified in this 
Protection Profile, the complete elimination of any one threat 
should not be assumed. A product that is designed to meet the 
CS3 Protection Profile is intended to be more effective 
at countering the threats than products that meet the CS1 and 
CS2 Protection Profiles. CS3 products counter all the CS1 and 
CS2 threats, and contain stronger features and more assurance 
evidence than CS1 and CS2 products. In addition to countering 
CS1 and CS2 threats, CS3 compliant products provide protection 
capabilities to counter one additional threat as follows:

1.	AN UNAUTHORIZED USER MAY ATTEMPT TO GAIN ACCESS TO THE 
SYSTEM

For CS1 compliant products, the threat of an unauthorized 
user gaining access to the system is primarily addressed by 
I&A features that allow the TCB to verify the identity of 
individuals attempting to gain access to the system. This is 
accomplished through the use of passwords.

Although not a direct countermeasure, auditing requirements 
are specified at the CS1 level to provide the capability to 
perform an after-the-fact analysis of unauthorized system