Fixed time cybersecurity evaluation methodology for ICT products

This document describes a cybersecurity evaluation methodology for ICT products that can be carried out with pre-defined time and workload resources. It is intended to be applicable to all three assurance levels defined in the Cybersecurity Act (CSA), i.e. basic, substantial and high.
The methodology comprises different evaluation blocks, including assessment activities that comply with the evaluation requirements of the CSA for the three levels.
Where appropriate, it can be applied both to third-party evaluation and self-assessment.
This methodology is expected to be usable by different candidate schemes and verticals, providing a common framework for evaluating ICT products.

Cybersicherheitsevaluationsmethodologie für IKT-Produkte

Dieses Dokument beschreibt eine Methodologie zur Evaluierung der Cybersicherheit von IKT-Produkten, die mit vorab festgelegten Ressourcen für Zeit und Arbeitsaufwand durchgeführt werden kann. Es soll für alle drei im CSA definierten Vertrauenswürdigkeitsstufen (d. h. niedrig, mittel und hoch) anwendbar sein.
Die Methodologie umfasst verschiedene Evaluierungsblöcke mit Bewertungsaufgaben, die den Evaluierungsanforderungen des CSA für die genannten drei Vertrauenswürdigkeitsstufen entsprechen.
Gegebenenfalls kann sie sowohl bei der Fremd- als auch bei der Selbstbeurteilung angewendet werden.

Méthode d'évaluation de la cybersécurité pour produits TIC

Le présent document décrit une méthodologie d'évaluation de la cybersécurité des produits TIC qui peut être implémentée à l'aide d'une durée et de ressources de charge de travail prédéfinies. Il est destiné à s'appliquer aux trois niveaux d'assurance définis dans le CSA (c'est-à-dire élémentaire, substantiel et élevé).
La méthodologie comprend différents blocs d'évaluation contenant des activités d'évaluation qui sont conformes aux exigences d'évaluation du CSA pour les trois niveaux d'assurance mentionnés.
Le cas échéant, elle peut être appliquée à la fois à une évaluation tierce et à une autoévaluation.

Metodologija ocenjevanja kibernetske varnosti za izdelke IKT za določeno obdobje

Ta dokument opisuje metodologijo ocenjevanja kibernetske varnosti za izdelke IKT. Namenjen je za vse tri ravni varnosti, ki so opredeljene v Zakonu o kibernetski varnosti (tj. osnovno, znatno in visoko).
Metodologijo sestavljajo različni ocenjevalni bloki, vključno z ocenjevalnimi dejavnostmi, ki so skladne z ocenjevalnimi zahtevami CSA za tri ravni.
Metodologijo je mogoče uporabiti za ocenjevanje tretjih oseb ali za samoocenjevanje.
Pričakuje se, da bodo to metodologijo lahko uporabljali v različnih shemah za kibernetsko varnost in specializiranih panogah za zagotovitev skupnega okvira za vrednotenje izdelkov IKT.

General Information

Status: Published
Public Enquiry End Date: 16-Sep-2021
Publication Date: 04-Dec-2022
Technical Committee:
Current Stage: 6060 - National Implementation/Publication (Adopted Project)
Start Date: 18-Nov-2022
Due Date: 23-Jan-2023
Completion Date: 05-Dec-2022

Buy Standard

Standard: EN 17640:2023 - BARVE, English language, 54 pages
Draft: prEN 17640:2021 - BARVE, English language, 56 pages

Standards Content (Sample)

SLOVENSKI STANDARD
SIST EN 17640:2023
01-januar-2023
Metodologija ocenjevanja kibernetske varnosti za izdelke IKT za določeno obdobje
Fixed time cybersecurity evaluation methodology for ICT products
Cybersicherheitsevaluationsmethodologie für IKT-Produkte
Méthode d'évaluation de la cybersécurité pour produits TIC
Ta slovenski standard je istoveten z: EN 17640:2022
ICS:
35.030 Informacijska varnost IT Security
SIST EN 17640:2023 en,fr,de
2003-01.Slovenski inštitut za standardizacijo. Razmnoževanje celote ali delov tega standarda ni dovoljeno.

EUROPEAN STANDARD EN 17640

NORME EUROPÉENNE

EUROPÄISCHE NORM
October 2022
ICS 35.030

English version

Fixed-time cybersecurity evaluation methodology for ICT products

Méthode d'évaluation de la cybersécurité pour produits TIC
Zeitlich festgelegte Cybersicherheitsevaluationsmethodologie für IKT-Produkte
This European Standard was approved by CEN on 15 August 2022.

CEN and CENELEC members are bound to comply with the CEN/CENELEC Internal Regulations which stipulate the conditions for
giving this European Standard the status of a national standard without any alteration. Up-to-date lists and bibliographical
references concerning such national standards may be obtained on application to the CEN-CENELEC Management Centre or to
any CEN and CENELEC member.

This European Standard exists in three official versions (English, French, German). A version in any other language made by
translation under the responsibility of a CEN and CENELEC member into its own language and notified to the CEN-CENELEC
Management Centre has the same status as the official versions.

CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium,
Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy,
Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia,
Slovakia, Slovenia, Spain, Sweden, Switzerland, Türkiye and United Kingdom.

CEN-CENELEC Management Centre:
Rue de la Science 23, B-1040 Brussels
© 2022 CEN/CENELEC All rights of exploitation in any form and by any means reserved worldwide for CEN national Members and for CENELEC Members.
Ref. No. EN 17640:2022 E

Contents Page
European foreword . 4
Introduction . 5
1 Scope . 7
2 Normative references . 7
3 Terms and definitions . 7
4 Conformance . 9
5 General concepts . 11
5.1 Usage of this methodology . 11
5.2 Knowledge of the TOE . 12
5.3 Development process evaluation . 12
5.4 Attack Potential . 12
5.5 Knowledge building . 13
6 Evaluation tasks . 13
6.1 Completeness check . 13
6.1.1 Aim . 13
6.1.2 Evaluation method . 13
6.1.3 Evaluator competence . 13
6.1.4 Evaluator work units . 13
6.2 FIT Protection Profile Evaluation . 14
6.2.1 Aim . 14
6.2.2 Evaluation method . 14
6.2.3 Evaluator competence . 14
6.2.4 Evaluator work units . 14
6.3 Review of security functionalities . 15
6.3.1 Aim . 15
6.3.2 Evaluation method . 15
6.3.3 Evaluator competence . 15
6.3.4 Evaluator work units . 15
6.4 FIT Security Target Evaluation . 16
6.4.1 Aim . 16
6.4.2 Evaluation method . 16
6.4.3 Evaluator competence . 16
6.4.4 Evaluator work units . 16
6.5 Development documentation . 17
6.5.1 Aim . 17
6.5.2 Evaluation method . 17
6.5.3 Evaluator competence . 17
6.5.4 Work units . 17
6.6 Evaluation of TOE Installation . 17
6.6.1 Aim . 17
6.6.2 Evaluation method . 18
6.6.3 Evaluator competence . 18
6.6.4 Evaluator work units . 18
6.7 Conformance testing . 18
6.7.1 Aim . 18
6.7.2 Evaluation method . 18
6.7.3 Evaluator competence . 19
6.7.4 Evaluator work units . 19
6.8 Vulnerability review . 20
6.8.1 Aim . 20
6.8.2 Evaluation method . 20
6.8.3 Evaluator competence . 21
6.8.4 Evaluator work units . 21
6.9 Vulnerability testing . 21
6.9.1 Aim . 21
6.9.2 Evaluation method . 22
6.9.3 Evaluator competence . 22
6.9.4 Evaluator work units . 22
6.10 Penetration testing . 24
6.10.1 Aim . 24
6.10.2 Evaluation method . 24
6.10.3 Evaluator competence . 25
6.10.4 Evaluator work units . 25
6.11 Basic crypto analysis . 26
6.11.1 Aim . 26
6.11.2 Evaluation method . 26
6.11.3 Evaluator competence . 26
6.11.4 Evaluator work units . 26
6.12 Extended crypto analysis . 27
6.12.1 Aim . 27
6.12.2 Evaluation method . 27
6.12.3 Evaluator competence . 28
6.12.4 Evaluator work units . 28
Annex A (informative) Example for a structure of a FIT Security Target (FIT ST) . 30
Annex B (normative) The concept of a FIT Protection Profile (FIT PP) . 32
Annex C (informative) Acceptance Criteria . 33
Annex D (informative) Guidance for integrating the methodology into a scheme . 40
Annex E (informative) Parameters of the methodology and the evaluation tasks . 45
Annex F (normative) Calculating the Attack Potential . 47
Annex G (normative) Reporting the results of an evaluation . 52
Bibliography . 54
European foreword
This document (EN 17640:2022) has been prepared by Technical Committee CEN/CLC/JTC 13
“Cybersecurity and Data Protection”, the secretariat of which is held by DIN.
This European Standard shall be given the status of a national standard, either by publication of an
identical text or by endorsement, at the latest by April 2023, and conflicting national standards shall be
withdrawn at the latest by April 2023.
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. CEN shall not be held responsible for identifying any or all such patent rights.
Any feedback and questions on this document should be directed to the users’ national standards body.
A complete listing of these bodies can be found on the CEN website.
According to the CEN-CENELEC Internal Regulations, the national standards organisations of the
following countries are bound to implement this European Standard: Austria, Belgium, Bulgaria, Croatia,
Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland,
Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North
Macedonia, Romania, Serbia, Slovakia, Slovenia, Spain, Sweden, Switzerland, Türkiye and the United
Kingdom.
Introduction
The foundation for a sound product certification is a reliable, transparent and repeatable evaluation
methodology. Several product or scheme dependent evaluation methodologies exist. The Cybersecurity
Act (CSA) [1] will cause new schemes to be created which in turn require (new) methodologies to
evaluate the cybersecurity functionalities of products. These new methodologies are required to describe
evaluation tasks defined in the CSA. This methodology also adds a concept, independent of the
requirements of the CSA, namely the evaluation in a fixed time. Existing cybersecurity evaluation
methodologies (e.g. EN ISO/IEC 15408 in combination with EN ISO/IEC 18045) are not explicitly
designed to be used in a fixed time.
Scheme developers are encouraged to implement the evaluation methodology in their schemes. This can
be done for general purpose schemes or in dedicated (vertical domain) schemes, by selecting aspects for
self-assessment at CSA assurance level “basic” or third-party assessments. The self-assessment may be
performed at CSA assurance level “basic”, the third-party evaluations at CSA assurance level “basic”,
“substantial” or “high”. The evaluation criteria and methodology might also be subject to extra tailoring, depending on the requirements of the individual scheme. This cybersecurity evaluation methodology
caters for all of these needs. This methodology has been designed so that it can (and needs to be) adapted
to the requirements of each scheme.
Scheme developers are encouraged to implement the evaluation methodology for the intended use of
the scheme, applicable for general purpose or in dedicated (vertical) domains, by selecting those aspects
needed for self-assessment at CSA assurance level “basic” or third-party evaluation at any CSA assurance
level required by the scheme.
This document provides the minimal set of evaluation activities defined in the CSA to achieve the desired
CSA assurance level as well as optional tasks, which might be required by the scheme. Selection of the
various optional tasks is accompanied by guidelines so scheme developers can estimate the impact of
their choices. Further adaptation to the risk situation in the scheme can be achieved by choosing the
different evaluation tasks defined in the methodology or using the parameters of the evaluation tasks, e.g.
the number of days for performing certain tasks.
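
As an illustration of this kind of parameterization, the following sketch shows how a scheme developer might record evaluation task parameters such as a fixed number of evaluator days. The task names follow Clause 6, but the parameter names and the concrete numbers are hypothetical assumptions, not values defined by this document (the actual parameters are listed in Annex E).

```python
# Illustrative sketch only: the parameter names and numbers below are
# hypothetical, not values defined by EN 17640. A scheme defines its own
# evaluation task parameters (cf. Annex E) based on its risk assessment.

scheme_parameters = {
    "conformance_testing": {      # evaluation task 6.7
        "evaluator_days": 5,      # assumed fixed-time budget
    },
    "vulnerability_testing": {    # evaluation task 6.9
        "evaluator_days": 10,
    },
    "penetration_testing": {      # evaluation task 6.10
        "evaluator_days": 15,
    },
}

def total_evaluation_days(parameters: dict) -> int:
    """Sum the fixed-time budgets over all selected evaluation tasks."""
    return sum(task["evaluator_days"] for task in parameters.values())

print(total_evaluation_days(scheme_parameters))  # 30 days in this example
```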
If scheme developers choose tasks that are not defined in this evaluation methodology, it will be the
responsibility of the scheme developer to define a set of companion requirements or re-use another
applicable evaluation methodology.
Nonetheless, it is expected that individual schemes will instantiate the general requirements laid out in
this evaluation methodology and provide extensive guidance for manufacturers (and all other parties)
about the concrete requirements to be fulfilled within the scheme.
Evaluators, testers and certifiers can use this methodology to conduct the assessment, testing or
evaluation of the products and to perform the actual evaluation/certification according to the
requirements set up by a given scheme. It also contains requirements for the level of skills and knowledge
of the evaluators and thus will also be used by accreditation bodies or National Cybersecurity
Certification Authorities during accreditation or authorization, where appropriate, and monitoring of
conformity assessment bodies.
Manufacturers and developers will find the generic type of evidence required by each evaluation task
listed in the evaluation methodology to prepare for the assessment or evaluation. The evidence and
evaluation tasks are independent of whether the evaluation is done by the
manufacturer/developer (i.e. 1st party) or by someone else (2nd/3rd party).
Users of certified products (regulators, user associations, governments, companies, consumers,
etc.) may also use this document to inform themselves about the assurance drawn from certain
certificates using this evaluation methodology. Again, it is expected that scheme developers provide
additional information, tailored to the domain of the scheme, about the assurance obtained by
evaluations / assessments under this methodology.
Furthermore, this methodology is intended to enable scheme developers to create schemes which
attempt to reduce the burden on the manufacturer as much as possible (implying additional burden on
the evaluation lab and the certification body).
NOTE In this document the term “Conformity Assessment body” (CAB) is used for CABs doing the evaluation.
Other possible roles for CABs are not considered in this document.
It should be noted that this document cannot be used “stand alone”. Each domain (scheme) needs to
provide domain specific cybersecurity requirements (“technical specifications”) for the objects to be
evaluated / certified. This methodology is intended to be used in conjunction with those technical
specifications containing such cybersecurity requirements. The relationship of the methodology
provided in this document to the activities in product conformity assessment is shown in Figure 1.

Figure 1 — Relationship of this document to the activities in product conformity assessment
1 Scope
This document describes a cybersecurity evaluation methodology that can be implemented using pre-
defined time and workload resources, for ICT products. It is intended to be applicable for all three
assurance levels defined in the CSA (i.e. basic, substantial and high).
The methodology comprises different evaluation blocks including assessment activities that comply with
the evaluation requirements of the CSA for the mentioned three assurance levels. Where appropriate, it
can be applied both to third-party evaluation and self-assessment.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
• IEC Electropedia: available at https://www.electropedia.org/
• ISO Online browsing platform: available at https://www.iso.org/obp
3.1
evaluator
individual that performs an evaluation
Note 1 to entry: Under accreditation the term “tester” is used for this individual.
3.2
auditor
individual that performs an audit
3.3
certifying function
people or group of people responsible for deciding upon certification
Note 1 to entry: Depending on the scheme the certifying function may use evidence beyond the ETR (3.13) as a basis
for the certification decision.
3.4
scheme developer
person or organization responsible for a conformity assessment scheme
Note 1 to entry: For schemes developed under the umbrella of the CSA the so-called “ad hoc group” helps the scheme
developer.
Note 2 to entry: This definition is based on and aligned with the definition of “scheme owner” in EN ISO/IEC 17000.
3.5
confirm
declare that something has been reviewed in detail with an independent
determination of sufficiency
[SOURCE: ISO/IEC 18045:2022, definition 3.2 with NOTE removed]
3.6
verify
rigorously review in detail with an independent determination of sufficiency
Note 1 to entry: Also see “confirm”. This term has more rigorous connotations. The term “verify” is used in the
context of evaluator actions where an independent effort is required of the evaluator.
[SOURCE: ISO/IEC 18045:2022, definition 3.22]
3.7
determine
affirm a particular conclusion based on independent analysis with the objective of
reaching a particular conclusion
Note 1 to entry: The usage of this term implies a truly independent analysis, usually in the absence of any previous
analysis having been performed. Compare with the terms “confirm” or “verify” which imply that an analysis has
already been performed which needs to be reviewed.
[SOURCE: ISO/IEC 18045:2022, definition 3.5]
3.8
ICT product
product with information and/or communication technology
Note 1 to entry: ICT covers any product that will store, retrieve, handle, transmit, or receive digital information
electronically in a digital form (e.g., personal computers, smartphones, digital television, email systems, robots).
3.9
Target of Evaluation
TOE
product (or parts thereof, if product is not fully evaluated) with a clear boundary, which is subject to the
evaluation
3.10
FIT Security Target
FIT ST
documented information describing the security properties and the operational environment of the TOE
(3.9)
Note 1 to entry: The FIT ST may have different content, structure and size depending on the CSA assurance level.
3.11
FIT Protection Profile
FIT PP
implementation-independent statement of security needs for a TOE (3.9) type
[SOURCE: ISO/IEC 15408-1:2022, definition 3.68]
3.12
Secure User Guide
SUG
documented information describing the steps necessary to set up the TOE (3.9) into the intended secure
state (3.16)
3.13
Evaluation Technical Report
ETR
documented information describing the results of the evaluation
3.14
scheme-specific checklist
list of items defining the required level of detail and granularity of the documentation, specified by the
scheme
3.15
knowledge
facts, information, truths, principles or understanding acquired through experience or education
Note 1 to entry: An example of knowledge is the ability to describe the various parts of an information assurance
standard.
Note 2 to entry: This concept is different from the concept “Knowledge of the TOE”.
[SOURCE: ISO/IEC TS 17027:2014, 2.56, modified — Note 1 to entry has been added from
ISO/IEC 19896-1:2018, Note 2 to entry is new]
3.16
secure state
state in which all data related to the TOE (3.9) security functionality are correct, and security functionality
remains in place
3.17
self-assessment
conformance assessment activity that is performed by the person or organization that provides the TOE
(3.9) or that is the object of conformity assessment
[SOURCE: EN ISO/IEC 17000:2020, definition 4.3 with Notes and Examples removed]
3.18
evaluation task parameter
parameter required to be set when using this document to define how the evaluation task shall be
executed by the evaluator (3.1)
4 Conformance
The following Table 1 provides a reference on how the evaluation tasks should be chosen for a certain
scheme for the different CSA assurance levels:
Table 1 — Evaluation tasks vs. CSA assurance level conformance claim

| Evaluation tasks                    | Reference | Basic       | Substantial                 | High     |
|-------------------------------------|-----------|-------------|-----------------------------|----------|
| Completeness check                  | 6.1       | Required    | Required                    | Required |
| Review of security functionalities  | 6.3       | Required    |                             |          |
| FIT Security Target Evaluation      | 6.4       |             | Required                    | Required |
| Development documentation 1)        | 6.5       | Required    | Required                    | Required |
| Evaluation of TOE Installation      | 6.6       | Recommended | Required                    | Required |
| Conformance testing                 | 6.7       | Recommended | Required                    | Required |
| Vulnerability review                | 6.8       | Recommended | Required (or done with 6.9) |          |
| Vulnerability testing               | 6.9       |             | Recommended                 |          |
| Penetration testing                 | 6.10      |             |                             | Required |
| Basic crypto analysis 2)            | 6.11      | Recommended | Recommended                 |          |
| Extended crypto analysis            | 6.12      |             |                             | Required |
1) The scheme-specific checklist may be empty for a particular scheme; in that case this evaluation task would not apply.
2) If crypto functionality is at the core of the product, then it is sensible for scheme developers to include it, e.g. by defining appropriate product classes.
NOTE 1 FIT Protection Profile Evaluation is a dedicated process and not part of the evaluation of a TOE. While
the FIT PP specifies for which CSA assurance level it is applicable, the evaluation of a FIT PP is agnostic to this.
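
For readers who prefer a programmatic view, the sketch below encodes Table 1 as a simple data structure and selects the evaluation tasks for a claimed CSA assurance level. It is an informal illustration under the simplifying assumption that each cell is either required, recommended or empty; the normative selection rules remain those of Table 1 and the implementation steps that follow.

```python
# Informal mirror of Table 1: each evaluation task maps to its status at the
# CSA assurance levels (basic, substantial, high). None means the task is not
# listed for that level; special cells are kept as free text.

TABLE_1 = {
    "Completeness check (6.1)":                 ("Required", "Required", "Required"),
    "Review of security functionalities (6.3)": ("Required", None, None),
    "FIT Security Target Evaluation (6.4)":     (None, "Required", "Required"),
    "Development documentation (6.5)":          ("Required", "Required", "Required"),
    "Evaluation of TOE Installation (6.6)":     ("Recommended", "Required", "Required"),
    "Conformance testing (6.7)":                ("Recommended", "Required", "Required"),
    "Vulnerability review (6.8)":               ("Recommended", "Required (or done with 6.9)", None),
    "Vulnerability testing (6.9)":              (None, "Recommended", None),
    "Penetration testing (6.10)":               (None, None, "Required"),
    "Basic crypto analysis (6.11)":             ("Recommended", "Recommended", None),
    "Extended crypto analysis (6.12)":          (None, None, "Required"),
}

LEVELS = {"basic": 0, "substantial": 1, "high": 2}

def tasks_with_status(level: str, prefix: str) -> list[str]:
    """Return the tasks whose Table 1 cell at the given level starts with prefix."""
    col = LEVELS[level]
    return [name for name, cells in TABLE_1.items()
            if cells[col] is not None and cells[col].startswith(prefix)]

# Step 3 of the implementation procedure: the tasks required at a level.
print(tasks_with_status("substantial", "Required"))
# Step 5: the recommended tasks the scheme developer reviews for inclusion.
print(tasks_with_status("substantial", "Recommended"))
```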
To implement the methodology for a certain scheme, the following steps shall be performed:
1. The scheme developer needs to perform a (domain) risk assessment, reviewing the domain under
consideration.
2. The scheme developer shall assign the Attack Potential (cf. Clause 5.4 and Annex F) to each CSA
assurance level used in the scheme
3. For each CSA assurance level the scheme developer shall select those evaluation tasks required for this level; these are indicated as “Required” in Table 1.
4. For each task chosen, the scheme developers shall review the parameters for this evaluation task and
set them suitably based on the risk assessment and the determined attack potential. For the
evaluation task “development documentation”, this includes setting up a scheme-specific checklist (which may be empty).
5. For each CSA assurance level the scheme developer shall review those evaluation tasks
recommended for this level if inclusion is sensible, these task contain t
...

SLOVENSKI STANDARD
oSIST prEN 17640:2021
01-september-2021
Metodologija ocenjevanja kibernetske varnosti za izdelke IKT za določen čas
Fixed time cybersecurity evaluation methodology for ICT products
Cybersicherheitsevaluationsmethodologie für IKT-Produkte
Méthode d'évaluation de la cybersécurité pour produits TIC
Ta slovenski standard je istoveten z: prEN 17640
ICS:
35.030 Informacijska varnost IT Security
oSIST prEN 17640:2021 en,fr,de
2003-01.Slovenski inštitut za standardizacijo. Razmnoževanje celote ali delov tega standarda ni dovoljeno.

EUROPEAN STANDARD
DRAFT
prEN 17640
NORME EUROPÉENNE

EUROPÄISCHE NORM

July 2021
ICS 35.030

English version

Fixed time cybersecurity evaluation methodology for ICT products

Méthode d'évaluation de la cybersécurité pour produits TIC
Cybersicherheitsevaluationsmethodologie für IKT-Produkte
This draft European Standard is submitted to CEN members for enquiry. It has been drawn up by the Technical Committee
CEN/CLC/JTC 13.

If this draft becomes a European Standard, CEN and CENELEC members are bound to comply with the CEN/CENELEC Internal
Regulations which stipulate the conditions for giving this European Standard the status of a national standard without any
alteration.

This draft European Standard was established by CEN and CENELEC in three official versions (English, French, German). A
version in any other language made by translation under the responsibility of a CEN and CENELEC member into its own
language and notified to the CEN-CENELEC Management Centre has the same status as the official versions.

CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium,
Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy,
Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia,
Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey and United Kingdom.

Recipients of this draft are invited to submit, with their comments, notification of any relevant patent rights of which they are aware and to provide supporting documentation.

Warning : This document is not a European Standard. It is distributed for review and comments. It is subject to change without
notice and shall not be referred to as a European Standard.

CEN-CENELEC Management Centre:
Rue de la Science 23, B-1040 Brussels
© 2021 CEN/CENELEC All rights of exploitation in any form and by any means reserved worldwide for CEN national Members and for CENELEC Members.
Ref. No. prEN 17640:2021 E

Contents Page
European foreword . 5
Introduction . 6
1 Scope . 8
2 Normative references . 8
3 Terms and definitions . 8
4 Conformance . 10
5 General concepts . 12
5.1 Usage of this methodology . 12
5.2 Knowledge of the TOE . 12
5.3 Development process evaluation . 13
5.4 Attack Potential . 13
5.5 Knowledge building . 13
6 Evaluation tasks . 14
6.1 Completeness check . 14
6.1.1 Aim . 14
6.1.2 Evaluation method . 14
6.1.3 Evaluator qualification . 14
6.1.4 Evaluator work units . 14
6.2 Protection Profile Evaluation . 14
6.2.1 Aim . 14
6.2.2 Evaluation method . 14
6.2.3 Evaluator qualification . 15
6.2.4 Evaluator work units . 15
6.3 Security Target Evaluation . 16
6.3.1 Aim . 16
6.3.2 Evaluation method . 16
6.3.3 Evaluator qualification . 16
6.3.4 Evaluator work units . 16
6.4 Review of security functionalities . 17
6.4.1 Aim . 17
6.4.2 Evaluation method . 17
6.4.3 Evaluator qualification . 17
6.4.4 Evaluator work units - Work unit 1 . 17
6.5 Development documentation . 17
6.5.1 Aim . 17
6.5.2 Evaluation method . 18
6.5.3 Evaluator qualification . 18
6.5.4 Work units . 18
6.6 Evaluation of TOE Installation . 18
6.6.1 Aim . 18
6.6.2 Evaluation method . 18
6.6.3 Evaluator qualification . 18
6.6.4 Evaluator work units . 18
6.7 Conformance testing . 19
6.7.1 Aim . 19
6.7.2 Evaluation method . 19
6.7.3 Evaluator qualification . 19
6.7.4 Evaluator work units . 20
6.8 Vulnerability review . 21
6.8.1 Aim . 21
6.8.2 Evaluation method . 21
6.8.3 Evaluator qualification . 21
6.8.4 Evaluator work units . 21
6.9 Vulnerability testing . 22
6.9.1 Aim . 22
6.9.2 Evaluation method . 22
6.9.3 Evaluator qualification . 22
6.9.4 Evaluator work units . 23
6.10 Penetration testing . 24
6.10.1 Aim . 24
6.10.2 Evaluation method . 24
6.10.3 Evaluator qualification . 25
6.10.4 Evaluator work units . 26
6.11 Basic crypto analysis . 26
6.11.1 Aim . 26
6.11.2 Evaluation method . 26
6.11.3 Evaluator qualification . 27
6.11.4 Evaluator work units . 27
6.12 Extended crypto analysis . 28
6.12.1 Aim . 28
6.12.2 Evaluation method . 28
6.12.3 Evaluator qualification . 28
6.12.4 Evaluator work units . 28
Annex A (informative) Example for a structure of a Security Target. 31
A.1 General . 31
A.2 Example structure . 31
A.3 Typical content of an ST . 32
Annex B (normative) The concept of a Protection Profile . 33
B.1 General . 33
B.2 Aim and basic principles of a Protection Profile (PP) . 33
B.3 Guidance for schemes to implement the PP concept . 33
Annex C (informative) Acceptance Criteria . 34
C.1 Introduction . 34
C.2 Identification, Authentication Control, and Access Control . 34
C.3 Secure Boot . 37
C.4 Cryptography . 38
C.5 Secure State After Failure . 39
C.6 Least Functionality . 40
C.7 Update Mechanism . 41
Annex D (informative) Guidance for integrating the methodology into a scheme . 42
D.1 General . 42
D.1.1 Introduction . 42
D.1.2 Perform a risk assessment, reviewing the vertical domain under consideration . 42
D.1.3 Assign the attack potential to the CSA levels . 42
D.1.4 Select the evaluation tasks required for this level . 42
D.1.5 Review and set the parameters for the tasks . 42
D.1.6 Possible selection of additional or higher tasks . 43
D.1.7 Review and set the parameters for the additional tasks . 43
D.1.8 Set up and maintain further scheme requirements and guidelines . 43
D.2 Example . 44
Annex E (informative) Parameters of the methodology and the evaluation tasks . 47
E.1 General. 47
E.2 Parameters of the methodology . 47
E.3 Parameters of the evaluation tasks . 47
E.3.1 Parameters for 6.1 “Completeness check” . 47
E.3.2 Parameters for 6.2 “Protection Profile Evaluation” . 47
E.3.3 Parameters for 6.3 “Security Target Evaluation” . 47
E.3.4 Parameters for 6.4 “Review of security functionalities” . 47
E.3.5 Parameters for 6.5 “Development documentation” . 47
E.3.6 Parameters for 6.6 “Evaluation of TOE Installation” . 47
E.3.7 Parameters for 6.7 “Conformance testing” . 48
E.3.8 Parameters for 6.8 “Vulnerability review” . 48
E.3.9 Parameters for 6.9 “Vulnerability testing” . 48
E.3.10 Parameters for 6.10 “Penetration testing” . 48
E.3.11 Parameters for 6.11 “Basic crypto analysis” . 48
E.3.12 Parameters for 6.12 “Extended crypto analysis” . 48
Annex F (normative) Calculating the Attack Potential . 49
F.1 General. 49
F.2 Factors for Attack Potential . 49
F.3 Numerical factors for attack potential . 49
F.3.1 Default rating table . 50
F.3.2 Adaptation of the rating table . 51
Annex G (normative) Reporting the results of an evaluation . 54
G.1 General. 54
G.2 Written reporting . 54
G.3 Oral defence of the results obtained . 54
Bibliography . 56
European foreword
This document (prEN 17640:2021) has been prepared by Technical Committee CEN/JTC 13
“Cybersecurity and Data Protection”, the secretariat of which is held by DIN.
This document is currently submitted to the CEN Enquiry.
Introduction
The foundation for a sound product certification is a reliable, transparent and repeatable evaluation
methodology. Several product or scheme dependent evaluation methodologies exist; however, with the advent of the CSA [1], new schemes need new methodologies to evaluate the cybersecurity functionalities
of products. These new methodologies are required to describe evaluation tasks defined in the CSA (e.g.
Technical Documentation Review, Check Against Known Vulnerabilities). In addition, existing
cybersecurity evaluation methodologies (e.g. EN ISO/IEC 15408 and EN ISO/IEC 18045) are not
designed to be used in a fixed time, i.e. the duration of the evaluation can be extended considerably during
execution.
The CSA enables scheme developers to consider self-assessment as well as third party evaluations. The
self-assessment may be performed at assurance level “basic”, the third-party evaluations at assurance
level “basic”, “substantial” or “high”. Depending on the requirements of the individual scheme, the
evaluation criteria and methodology might be subject to extra tailoring. This cybersecurity evaluation
methodology caters for all of these needs. This methodology has been designed so that it can (and needs
to be) adapted to the requirements of each scheme.
Scheme developers are encouraged to implement the evaluation methodology for the intended use of
the scheme, applicable for general purpose or in dedicated (vertical) domains, by selecting those aspects
needed for self-assessment at level “basic” or third party evaluation at any level required by the scheme.
This document provides the minimal set of evaluation activities defined in the CSA to achieve the desired
assurance level as well as optional tasks, which might be required by the scheme. Selection of the various
optional tasks is accompanied by guidelines so scheme developers can estimate the impact of their
choices. Further adaption to the risk situation in the scheme can be achieved by choosing the different
evaluation tasks defined in the methodology or using the parameters of the evaluation tasks, e.g. the
number of days for performing certain tasks.
If scheme developers choose tasks that are not defined in this evaluation methodology, it will be
the responsibility of the scheme developer to define a set of companion requirements or re-use an applicable
evaluation methodology.
Nonetheless, it is expected that individual schemes will instantiate the general requirements laid out in
this evaluation methodology and provide extensive guidance for manufacturers (and all other parties)
about the concrete requirements to be fulfilled within the scheme.
Evaluators, testers and certifiers can use this methodology to conduct the assessment, testing or
evaluation of the products and to perform the actual evaluation/certification according to the
requirements set up by a given scheme. It also contains requirements for the level of skills and knowledge
of the evaluators/testers and thus will also be used by accreditation bodies or National Cybersecurity
Certification Authorities during accreditation or authorization, where appropriate, and monitoring of
conformity assessment bodies.
Manufacturers and developers will find the generic type of evidence required by each evaluation task
listed in the evaluation methodology to prepare for the assessment or evaluation. The evidence and
evaluation tasks are independent from the fact of whether the evaluation is done by the
manufacturer/developer (i.e. first party) or by someone else (2nd/3rd party).
Users of certified products (regulators, user associations, governments, companies, consumers,
etc.) may also use this document to inform themselves about the assurance drawn from certain
certificates using this evaluation methodology. Again, it is expected that scheme developers provide
additional information, tailored to the domain of the scheme, about the assurance obtained by
evaluations / assessments under this methodology.
Furthermore, this methodology is intended to enable scheme developers to create schemes which
attempt to reduce the burden on the manufacturer as much as possible (implying additional burden on
the evaluation lab and the certification body).
NOTE In this document the term “Conformity Assessment body” (CAB) is used for CABs doing the evaluation.
Other possible roles for CABs are not considered in this document.
It should be noted that this document cannot be used “stand alone”. Each domain (scheme) needs to
provide domain specific cybersecurity requirements for the objects to be evaluated / certified. This
methodology is intended to be used in conjunction with specifications containing such cybersecurity
requirements.

Figure 1 — Relationship of this document to the activities in product conformity assessment
1 Scope
This document describes a cybersecurity evaluation methodology that can be implemented using pre-
defined time and workload resources, for ICT products. It is intended to be applicable for all three
assurance levels defined in the CSA (i.e. basic, substantial and high).
The methodology comprises different evaluation blocks including assessment activities that comply with
the evaluation requirements of the CSA for the mentioned three assurance levels.
Where appropriate, it can be applied both to 3rd party evaluation and self-assessment.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
• IEC Electropedia: available at https://www.electropedia.org/
• ISO Online browsing platform: available at https://www.iso.org/obp
3.1
confirm
declare that something has been reviewed in detail with an independent
determination of sufficiency
[SOURCE: EN ISO/IEC 15408-1:2009, definition 3.1.4 with NOTE removed]
3.2
certifying function
people or group of people responsible for deciding upon certification
Note 1 to entry: Depending on the scheme the certifying function may use evidence beyond the ETR as basis for the
certification decision.
3.3
determine
affirm a particular conclusion based on independent analysis with the objective of
reaching a particular conclusion
Note 1 to entry: The usage of this term implies a truly independent analysis, usually in the absence of any previous
analysis having been performed. Compare with the terms “confirm” or “verify” which imply that an analysis has
already been performed which needs to be reviewed.
[SOURCE: EN ISO/IEC 15408-1:2009, definition 3.1.22]
3.4
evaluation task parameter
parameter required to be set when using this document to define how the evaluation task shall be
executed by the evaluator
3.5
Evaluation Technical Report (ETR)
documented information describing the results of the evaluation
3.6
evaluator
individual that performs an evaluation
3.7
ICT produ
...
