0.1.2 An important reason for wishing to produce these internationally harmonised criteria was that such harmonisation is one of the prerequisites of international mutual recognition of the certificates which summarise the results of Information Technology (IT) security evaluations and confirm that the evaluations have been properly carried out. It is also a prerequisite of mutual recognition that the methods used to apply these harmonised criteria should themselves be harmonised. On completion of the ITSEC, therefore, the four countries continued to co-operate, with the aim of agreeing a common approach to the conduct of IT security evaluations, at least to the extent necessary to provide the confidence required to facilitate mutual recognition.
0.1.3 Much work had already been done, and some of it published, on the development of IT security evaluation methods. In the UK this included CESG Memorandum Number 2 [CESG2], developed for government use, and the "Green Books" series of the Department of Trade and Industry, including V23 - Evaluation and Certification Manual [DTI23], for commercial IT security products. In Germany, the German Information Security Agency published its IT Evaluation Manual [GISA1].
0.1.4 The basic approach was to harmonise the existing security evaluation methods of the four countries to the extent necessary to ensure that national evaluation methods conform to a single philosophy. It was initially felt that the work should be limited to the harmonisation of existing methods. However, it has been necessary to extend the existing work and to develop some new ideas in order to achieve these objectives.
0.1.6 The ITSEM is a technical document, aimed predominantly at the partners in evaluation (primarily evaluators, but also sponsors and certifiers), although it is also of interest to vendors, developers, system accreditors and users. It contains sufficient detail of evaluation methods and procedures to enable the technical equivalence of evaluations performed in different environments to be demonstrated. The document will be freely available. The ITSEM will apply to evaluations carried out in both the commercial and government sectors.
0.1.7 For the purposes of mutual recognition it is necessary that some parts of the ITSEM be prescriptive on evaluators. However, most of the ITSEM is descriptive or intended to provide guidance.
0.1.8 In order to put the evaluation methods prescribed and described here into context, it is necessary to include in the ITSEM some outline information on certification and how it may be organised.
0.1.9 This document stresses the importance of the independence of evaluation from any commercial pressures from a sponsor or developer of a TOE. However, first party evaluation, in the sense of evaluation performed by another part of the sponsoring or developing organisation, is not precluded, provided that the requirements of the national scheme are fulfilled.
0.1.10 The ITSEM has been written from the perspective that certification follows evaluation. The case in which an evaluation is followed by a supplier's declaration is outside the scope of this document, although even in this case use of the ITSEM may still be helpful.
0.1.12 Part 1 of the ITSEM describes an IT security framework, providing background and rationale for IT security, evaluation, certification and system accreditation. This part is of a general nature and is intended for a management audience.
0.1.13 Part 2 of the ITSEM gives basic information on the establishment and running of an evaluation and certification scheme, describing the general features of the certification process and its organisation. It is of interest to those wishing to understand the certification process.
0.1.14 Part 3 of the ITSEM explains the evaluation philosophy which underlies the ITSEC. It contains the principles which must be followed by the evaluators when evaluations are performed. It gives further explanation and clarification of the ITSEC concepts to provide a better basis for understanding the technical issues underlying evaluation.
0.1.15 Part 4 of the ITSEM is the key part for those closely involved in evaluation. All mandatory text is in this part. It gives an overview of how evaluation is performed and describes evaluation in terms of input, process and output. However, it does not provide guidance on all details of evaluation.
0.1.16 Part 5 of the ITSEM provides examples demonstrating how the ITSEC can be applied to the evaluation of systems and products.
0.1.17 Part 6 of the ITSEM gives guidance on evaluation to sponsors, vendors, developers, system accreditors and users. It is particularly concerned with preparing the inputs to, and using the outputs from, evaluation.
0.1.20 The current version of the ITSEM benefits from significant revisions arising from widespread international review. The review process was assisted by the Commission of the European Communities, which organised an international workshop in September 1992 at which version 0.2 was discussed. This event was supplemented by written comments and contributions from reviewers, which the authors have sought to take into account in preparing version 1.0. The authors recognise that detailed guidance is still lacking in some areas of the ITSEM; where appropriate, additional information will appear in later versions, as both this document and the ITSEC evolve in line with experience.
Commission of the European Communities
DIRECTORATE GENERAL XIII: Telecommunications, Information Market and Exploitation of Research
DIRECTORATE B: Advanced Communications Technologies and Services
Rue de la Loi 200
B-1049 BRUSSELS
Belgium
For France:
Service Central de la Sécurité des Systèmes d'Information
18 Rue du Docteur Zamenhof
F-92131 ISSY LES MOULINEAUX
For Germany:
Bundesamt für Sicherheit in der Informationstechnik
Postfach 20 03 63
D-53133 BONN
For the Netherlands:
Netherlands National Comsec Agency
Bezuidenhoutseweg 67
P.O. Box 20061
NL-2500 EB THE HAGUE
National Security Service
P.O. Box 20010
NL-2500 EA THE HAGUE
For the United Kingdom:
Head of the Certification Body
UK IT Security Evaluation and Certification Scheme
Certification Body
PO Box 152
CHELTENHAM
Glos GL52 5UF
0.2.4 CAD - Computer Aided Design
0.2.5 CASE - Computer Aided Software Engineering
0.2.6 CB - Certification Body
0.2.7 CRC - Cyclic Redundancy Check
0.2.8 DAC - Discretionary Access Control
0.2.9 ETR - Evaluation Technical Report
0.2.10 EWP - Evaluation Work Programme
0.2.11 FMEA - Failure Mode and Effects Analysis
0.2.12 FMSP - Formal Model of Security Policy
0.2.13 I&A - Identification and Authentication
0.2.14 IPSE - Integrated Project Support Environment
0.2.15 ISO - International Organization for Standardization
0.2.16 ITSEC - Information Technology Security Evaluation Criteria
0.2.17 ITSEF - Information Technology Security Evaluation Facility
0.2.18 ITSEM - Information Technology Security Evaluation Manual
0.2.19 MAC - Mandatory Access Control
0.2.20 MARION - Méthode d'Analyse de Risques Informatiques et d'Optimisation par Niveau
0.2.21 MELISA - Méthode d'Evaluation de la Vulnérabilité Résiduelle des Systèmes
0.2.22 MMI - Man Machine Interface
0.2.23 PCB - Printed Circuit Board
0.2.24 PDL - Program Description Language
0.2.25 PID - Personal Identification Device
0.2.26 PIPSE - Populated Integrated Project Support Environment
0.2.27 SSADM - Structured Systems Analysis and Design Method
0.2.28 SEISP - System Electronic Information Security Policy
0.2.29 SEF - Security Enforcing Function
0.2.30 SoM - Strength of Mechanisms
0.2.31 SPM - Security Policy Model
0.2.32 SSP - System Security Policy
0.2.33 TCB - Trusted Computing Base
0.2.34 TOE - Target of Evaluation
0.2.35 T&T - Technique and Tool
0.2.37 Audit Trail: the set of records generated by a TOE in response to accountable operations, providing the basis for audit
0.2.38 Authentication: the verification of a claimed identity
0.2.39 Binding Analysis: the determination of whether the totality of security enforcing functions, together with the description of their inter-working as described in the architectural design, fulfils the totality of security objectives, i.e. covers all threats enumerated in the security target
0.2.40 Certificate/Certification Report: the public document issued by a CB as a formal statement confirming the results of the evaluation and that the evaluation criteria, methods and procedures were correctly applied; including appropriate details about the evaluation based on the ETR
0.2.41 Certification Body: a national organisation, often the National Security Authority, responsible for administering ITSEC evaluations within that country
0.2.42 Construction Vulnerability: a vulnerability which takes advantage of some property of the TOE which was introduced during its construction
0.2.43 Correct Refinement: the refinement of a function described at one abstraction level is said to be correct if the totality of effects described at the lower abstraction level at least exhibits all the effects described at the higher abstraction level
0.2.44 Countermeasure: a technical or non-technical security measure which contributes to meeting the security objective(s) of a TOE
0.2.45 Deliverable: an item or resource that is required to be made available to the evaluators for the purpose of evaluation
0.2.46 Error: a failure to meet the correctness criteria
0.2.47 Evaluation Technical Report: a report produced by an ITSEF and submitted to the CB detailing the findings of an evaluation and forming the basis of the certification of a TOE
0.2.48 Evaluation Work Programme: a description of how the work required for an evaluation is organised; that is, a description of the work packages involved in the evaluation and the relationships between them
0.2.49 Exploitable Vulnerability: a vulnerability which can be exploited in practice to defeat a security objective of a TOE
0.2.50 Impact Analysis: an activity performed by a sponsor to determine if a re-evaluation of a changed TOE is necessary
0.2.51 Impartiality: freedom from bias towards achieving any particular result
0.2.52 Information Technology Security Evaluation Facility: an organisation accredited in accordance with some agreed rules (e.g. [EN45]) and licensed by the CB to perform ITSEC security evaluations
0.2.53 Information Technology Security Evaluation Manual: a technical document containing sufficient detail of evaluation methods and procedures to enable mutual recognition
0.2.54 National Scheme: a set of national rules and regulations for evaluation and certification in accordance with the ITSEC and ITSEM
0.2.55 Object: a passive entity that contains or receives information
0.2.56 Objectivity: a property of a test whereby the result is obtained with the minimum of subjective judgement or opinion
0.2.57 Operational Vulnerability: a vulnerability which takes advantage of weaknesses in non-technical countermeasures to violate the security of the TOE
0.2.58 Potential Vulnerability: a suspected vulnerability which may be used to defeat a security objective of a TOE, but the exploitability or existence of which has not yet been demonstrated
0.2.59 Problem Report: a concise report, produced by the evaluators and sent to the CB, outlining an error or a potential or actual vulnerability in the TOE
0.2.60 Re-evaluation: an evaluation of a previously evaluated TOE after changes have been made
0.2.61 Re-use: the use of previous evaluation results when one or more previously evaluated components are incorporated into a system or product
0.2.62 Repeatability: the property that a repeated evaluation of the same TOE against the same security target by the same ITSEF yields the same overall verdict (e.g. E0 or E5) as the first evaluation
0.2.63 Representation: the specification of a TOE at a particular phase of the development process (one of requirements, architectural design, a level of detailed design, implementation)
0.2.64 Reproducibility: the property that evaluation of the same TOE against the same security target by a different ITSEF yields the same overall verdict (e.g. E0 or E5) as that reached by the first ITSEF
0.2.65 Subject: an active entity, generally in the form of a person, process, or device [TCSEC]
0.2.66 Suitability Analysis: the determination that the security enforcing functions described in the security target are able to act as countermeasures to the threat(s) identified in the security target (suitability is only assessed at this level)
0.2.67 Vulnerability: a security weakness in a TOE (for example, due to failures in analysis, design, implementation or operation).
BDSS Risk Quantification Problems and Bayesian Decision Support System Solutions, Will Ozier, Information Age, Vol. 11, No. 4, October 1989.
BOE Characteristics of Software Quality, B.W. Boehm, TRW Series of Software Technology, North-Holland, 1978; and Software Engineering Economics, B.W. Boehm, Prentice-Hall, 1981.
CESG2 Handbook of Security Evaluation, CESG Memorandum No. 2, Communications-Electronics Security Group, United Kingdom, November 1989.
CRAMM CCTA Risk Analysis and Management Methodology, Guidance on CRAMM for Management, Version 2.0, CCTA, February 1991.
DTI23 V23 - Evaluation and Certification Manual, Department of Trade and Industry, United Kingdom, Version 3.0, February 1989.
ECMA A Reference Model for Frameworks of Computer-Assisted Software Engineering Environments, ECMA TR/55.
EN45 General Criteria for the Operation of Testing Laboratories, EN 45001.
GASSER Building a Secure Computer System, Morrie Gasser, Van Nostrand Reinhold, 1988.
GISA1 IT Evaluation Manual, GISA 1990.
GISA2 IT-Sicherheitshandbuch (IT Security Handbook), BSI 7105, Version 1.0, March 1992.
GUI25 General Requirements for the Technical Competence of Testing Laboratories, International Organization for Standardization, ISO Guide 25, 1982.
ISO65A Software for Computers in the Application of Industrial Safety Related Systems, ISO/IEC JTC1/SC27 N381, November 1991.
ITSEC Information Technology Security Evaluation Criteria - Harmonised Criteria of France, Germany, the Netherlands, and the United Kingdom, Version 1.2, June 1991.
LINDE Operating System Penetration, R. Linde, Proceedings of the AFIPS National Computer Conference, pp. 361-368, 1975.
MCC Factors in Software Quality, J.A. McCall, General Electric, n.77C1502, June 1977.
MS1629A Procedures for Performing a Failure Mode, Effects and Criticality Analysis, MIL-STD-1629A, US DoD, November 1980.
NIS35 Interpretation of Accreditation Requirements for IT Test Laboratories for Software and Communications Testing Services, NAMAS Information Sheet NIS35, NAMAS Executive, National Physical Laboratory, United Kingdom, November 1990.
OSI OSI Basic Reference Model, Part 2 - Security Architecture, ISO 7498 (1988(E)).
PCTE Portable Common Tool Environment Abstract Specification, ECMA-149, December 1990.
PCTE+ Portable Common Tool Environment (Extended), Definition Team Final Report, 14 December 1992.
SRMM Shared Resource Matrix Methodology: An Approach to Identifying Storage and Timing Channels, R A Kemmerer, ACM Transactions on Computer Systems, Vol. 1, No. 3, August 1983.
TCSEC Trusted Computer System Evaluation Criteria, DoD 5200.28-STD, Department of Defense, United States of America, December 1985.
TNI Trusted Network Interpretation of the TCSEC, National Computer Security Center, United States of America, NCSC-TG-005, Version 1, 31 July 1987.
TDI Trusted Database Interpretation of the TCSEC, National Computer Security Center, United States of America, NCSC-TG-021, April 1991.
1.1.1 Information Technology (IT) has become essential to the effective conduct of business and the affairs of state, and is becoming increasingly important to the affairs of private individuals who are affected by the use of IT. Information is something to be gained and protected in order to advance one's business or private affairs, and should therefore be regarded as an asset. The importance of such assets is usually expressed in terms of the consequential damage resulting from the manifestation of threats.