Foreword
This is a Supporting Document (SD), intended to complement the Common Criteria CC:2022 Release 1 and the associated Common Evaluation Methodology for Information Technology Security Evaluation.
SDs may be "Guidance Documents," which highlight specific approaches and applications of the standard in areas where no mutual recognition of its application is required; as such, they are not of a normative nature. SDs may also be "Mandatory Technical Documents," whose application is mandatory for evaluations whose scope is covered by that of the SD. The usage of SDs of the latter class is not only mandatory, but certificates issued as a result of their application are recognized under the CCRA.
This supporting document has been developed by the Dedicated Security Component iTC and is designed to be used to support the evaluations of products against the cPPs identified in section 1.1.
Technical Editor
Dedicated Security Component iTC
General Purpose
The purpose of a DSC is to provide seven security services to devices that incorporate them: Parse, Provision, Protect, Process, Prove, Purge, and Propagate. The DSC acts as a secure and separate processing area that stores and manipulates SDOs and SDEs that are isolated from the remainder of the device.
Field of special use
Vendors and integrators of mobile devices and other hardware devices that use DSCs.
Acknowledgements
This Supporting Document was developed by the DSC international Technical Community with representatives from industry, government agencies, Common Criteria Test Laboratories, and members of academia.
Revision History
Version | Date | Description |
---|---|---|
1.0 | September 10, 2020 | Initial publication |
2.0 | October 28, 2024 | Final publication for v2.0 (conversion to asciidoc, respond to CCDB comments) |
1. Introduction
1.1. Technology Area and Scope of Supporting Document
This Supporting Document (SD) is mandatory for evaluations of products that claim conformance to the collaborative Protection Profile for Dedicated Security Component.
A Dedicated Security Component (DSC), in the context of this cPP, is the combination of a hardware component and its controlling OS or firmware dedicated to the protection and safe use by the rich OS of Security Data Objects (SDOs) consisting of keys, identities, attributes, and Security Data Elements (SDEs).
Although Evaluation Activities (EAs) are defined mainly for the evaluators to follow, in general they will also help developers prepare for evaluation by identifying specific requirements for their Target of Evaluation (TOE). The specific requirements in EAs may in some cases clarify the meaning of Security Functional Requirements (SFRs), and may identify particular requirements for the content of Security Targets (especially the TOE Summary Specification), user guidance documentation, and possibly required supplementary information (e.g. for entropy analysis or cryptographic key management architecture).
1.2. Structure of the Document
Evaluation Activities can be defined for both SFRs and Security Assurance Requirements (SARs). These are defined in separate sections of this SD. The EAs associated with the SFRs are considered to be interpretations of applying the appropriate SAR activity. For instance, activities associated with testing are representative of what is required by ATE_IND.1.
If any Evaluation Activity cannot be successfully completed in an evaluation then the overall verdict for the evaluation is a 'fail'. In rare cases, there may be acceptable reasons why an Evaluation Activity may be modified or deemed not applicable for a particular TOE, but this must be agreed with the Certification Body for the evaluation.
In general, if all EAs (for both SFRs and SARs) are successfully completed in an evaluation then it would be expected that the overall verdict for the evaluation is a 'pass'.
In some cases, the Common Evaluation Methodology (CEM) work units have been interpreted to require the evaluator to perform specific EAs. In these instances, EAs will be specified in Section 2, Evaluation Activities for SFRs, Section 5, Evaluation Activities for SARs, and possibly Section 3, Evaluation Activities for Optional Requirements and Section 4, Evaluation Activities for Selection-Based Requirements. In cases where there are no CEM interpretations, the CEM activities are to be used to determine if SARs are satisfied and references to the CEM work units are identified as being the sole EAs to be performed.
Finally, there are cases where EAs have rephrased CEM work units to provide clarity on what is required. The EAs are reworded for clarity and interpret the CEM work units such that they will result in more objective and repeatable actions by the evaluator. In these cases, the EA supplements the CEM work unit. These EAs will be specified in Section 5, Evaluation Activities for SARs.
Note that certain parts of EAs may or may not be required depending on whether certain selections are made in the Security Target for the corresponding SFR. Underlined text is used to represent specific selection items.
1.3. Terminology
1.3.1. Glossary
For definitions of standard CC terminology, see [CC] part 1.
Term | Meaning |
---|---|
Access | In the context of SDOs, access to an SDO represents the list of actions permissible with an SDO, including its generation, use, modification, propagation, and destruction. |
Administrator | A type of user that has special privileges to manage the TSF. |
Assurance | Grounds for confidence that a TOE meets the SFRs [CC1]. |
Attestation | The process of presenting verifiable evidence describing those characteristics that affect integrity. Examples of these characteristics are boot firmware and boot critical data which, combined, describe the way the DSC booted. [SA] |
Attributes | Indications of characteristics or properties of the SDEs bound in an SDO. |
Authorization Value | Critical data bound to an action by itself or to an action on a subject. Such data, when presented to the TOE, authorizes the action by itself or authorizes the action on or with the subject, respectively. |
Authorization Data | Collective term for authentication tokens and authorization values. |
Authentication Token | Critical data bound to a user. Such data, when presented to the TOE and successfully verified by it, authenticates the user. The TOE may use the successful authentication of a user as an authorization to execute an action on its behalf. |
Authenticator | A shortened name for Authentication Token. |
Boot Critical Data | Critical data that persists across power cycles and determines characteristics of the DSC. Examples of boot critical data can be DSC configuration settings, certificates, and the results of measurements obtained by the root of trust for measurement. |
Boot Firmware | The first firmware that executes during the boot process. |
Chain of Trust | A Chain of Trust is anchored in a Root of Trust and extends a trust boundary by verifying the authenticity and integrity of successive components before passing control to those components. [SA] |
Client Application | An entity that relies on the services provided by the platform or DSC. |
Data Encryption Key | An encryption key, usually for a symmetric algorithm, that encrypts data that is not keying material. |
Integrity | Assurance of trustworthiness and accuracy. |
Immutable | Unchangeable. |
Key Encryption Key | An encryption key that encrypts other keying material. This is sometimes called a key wrapping key. A KEK can be either symmetric or asymmetric. |
Known Answer Tests (KATs) | Test vectors or data generated to determine the correctness of an implementation. |
Operator | A human being who has physical possession of the platform on which the DSC is located. [GD] |
Owner | A human being who controls or manages the platform on which the DSC is located. May be remote. [GD] |
Platform | A platform consists of the hardware and firmware of a computing entity. |
Pre-installed SDO | An SDO installed on the DSC by the manufacturer. The SDO consists of an SDE and attributes, which, if not explicitly expressed in a data structure, are implicit based on the functions that have exclusive access to the SDE. |
Privileged Function | Functions restricted to the role of administrator, which may include, but are not limited to, provisioning keys, provisioning user authorization values, de-provisioning user authorization values, provisioning administrator authorization values, changing authorization values, disabling key escrow, and configuring cryptography. |
Protected Data Blob | Data in an encrypted structure that protects its confidentiality or integrity (as required by the context in which it is used). |
Protected Storage | Refers to DSC hardware used to store SDEs or SDOs, providing integrity protection for all items and confidentiality for those items that require it. Protected Storage may also refer to storage external to the DSC, which is usually encrypted by keys maintained by the DSC’s internal protected storage capabilities. |
Protections | Mechanisms that ensure components of a DSC (executable firmware code and critical data) remain in a state of integrity and are protected from modification outside of authorized, authenticated processes and entities. [NIST-ROTM] |
Remote Secure Channel | Logical channel to the DSC from a remote entity, which cryptographically protects the confidentiality and integrity of the channel content. |
Required Supplementary Information | Information that is not necessarily included in the Security Target or operational guidance, and that may not necessarily be public. Examples of such information could be entropy analysis, or a description of a cryptographic key management architecture used in (or in support of) the TOE. The requirement for any such supplementary information will be identified in the relevant cPP (see description in Section 6). |
Root Encryption Key | An encryption key that serves as the anchor of a hierarchy of keys. |
Root of Trust | A root of trust performs one or more security-specific functions, establishing the foundation on which all trust in a system is placed. [NIST-ROTM] |
Root of Trust for Authorization | (As defined by [GP_ROT]) The Root of Trust for Authorization provides reliable capabilities to assess authorization tokens and determine whether or not they satisfy policies for access control. |
Root of Trust for Confidentiality | (As defined by [GP_ROT]) The Root of Trust for Confidentiality maintains shielded locations for the purpose of storing sensitive data, such as secret keys and passwords. |
Root of Trust for Integrity | (As defined by [GP_ROT]) The Root of Trust for Integrity maintains shielded locations for the purpose of storing and protecting the integrity of non-secret critical security parameters and platform characteristics. Critical security parameters include, but are not limited to, authorization values, public keys, and public key certificates. |
Root of Trust for Measurement | (As defined by [GP_ROT]) The Root of Trust for Measurement provides the ability to reliably create platform characteristics. |
Root of Trust for Reporting | (As defined by [GP_ROT]) The Root of Trust for Reporting reliably reports platform characteristics. It provides an interface that limits its services to providing reports on its platform characteristics authenticated by a platform identity. |
Root of Trust for Storage | A root of trust that acts as the Root of Trust for Confidentiality and the Root of Trust for Integrity. |
Root of Trust for Update | A root of trust responsible for updating the firmware. |
Root of Trust for Verification | A root of trust responsible for verifying digital signatures. |
Security Data Element | A Critical Security Parameter, such as a cryptographic key or authorization token. |
Security Data Object | A Security Data Object (SDO) may include one or more SDEs. SDOs bind SDEs with a set of attributes. |
Symmetric Encryption Key | A value intended for use as the key to a symmetric encryption algorithm, such as AES. |
System | A system consists of the platform hardware and firmware in addition to the higher-level software running on top of it (kernel, user-space processes, etc.). |
Target of Evaluation | A set of software, firmware or hardware possibly accompanied by guidance. [CC1] |
Test Case | In the context of cryptographic algorithm testing, a set of inputs to the algorithm and their corresponding expected outputs. Depending on the test, a test case may have one, many, or no expected outputs. |
Test Group | In the context of cryptographic algorithm testing, a configuration of algorithm properties (key size, hash algorithm, etc.) and a set of test cases which correspond to these properties. |
TOE Security Functionality (TSF) | A set consisting of all hardware, software, and firmware of the TOE that must be relied upon for the correct enforcement of the SFRs. [CC1] |
Trusted Local Channel | Physical channel to the DSC within the platform of which the DSC is a part, which is protected by the operational environment to ensure confidentiality and integrity. |
TSF Data | Data for the operation of the TSF upon which the enforcement of the requirements relies. |
User | An administrator or client application. |
1.3.2. Acronyms
Acronym | Meaning |
---|---|
AES | Advanced Encryption Standard |
CApp | Client Application |
CBC | Cipher Block Chaining |
CCM | Counter with CBC-Message Authentication Code |
CCMP | CCM Protocol |
CPU | Central Processing Unit |
CSP | Critical Security Parameter |
DAR | Data At Rest |
DEK | Data Encryption Key |
DH | Diffie-Hellman |
DSA | Digital Signature Algorithm |
ECDH | Elliptic Curve Diffie-Hellman |
ECDSA | Elliptic Curve Digital Signature Algorithm |
EEPROM | Electrically Erasable Programmable Read-Only Memory |
FIPS | Federal Information Processing Standards |
FQDN | Fully Qualified Domain Name |
GCM | Galois Counter Mode |
HMAC | Keyed-Hash Message Authentication Code |
HTTPS | Hypertext Transfer Protocol Secure |
IEEE | Institute of Electrical and Electronics Engineers |
IP | Internet Protocol |
IPsec | Internet Protocol Security |
KEK | Key Encryption Key |
KMAC | KECCAK Message Authentication Code |
KMD | Key Management Document |
NIST | National Institute of Standards and Technology |
OS | Operating System |
PBKDF | Password-Based Key Derivation Function |
PP | Protection Profile |
RA | Registration Authority |
RBG | Random Bit Generator |
REK | Root Encryption Key |
ROM | Read-Only Memory |
RSA | Rivest-Shamir-Adleman Algorithm |
SDE | Security Data Element |
SDO | Security Data Object |
SFP | Security Function Policy |
SFR | Security Functional Requirement |
SHA | Secure Hash Algorithm |
SK | Symmetric Key or Symmetric Encryption Key |
SPI | Security Parameter Index |
SSH | Secure Shell |
ST | Security Target |
TLS | Transport Layer Security |
TOE | Target of Evaluation |
TSF | TOE Security Functionality |
TSS | TOE Summary Specification |
USB | Universal Serial Bus |
1.4. Test Environment Note
Many of the test EAs in this Supporting Document require the evaluator to directly exercise low-level interfaces to the DSC to manipulate it in a manner that may not be feasible with the commercially-available (production) model of the DSC and associated tools. When this is the case, the TOE developer may provide models of the DSC and associated tools which allow for the required tests to be executed by the evaluator (or as necessary, executed by the TOE developer and observed by the evaluator).
For any tests that are executed using non-commercially-available versions of the TOE provided by the developer, the evaluator shall ensure the following:
- The test report shall document the measures the evaluator took to gain assurance that, if the TOE itself is modified to allow certain tests to be performed, the security of the unmodified TOE is not reduced (i.e. if the TOE is modified to use a special firmware build, this should not create a situation where the modified build enforces required security functionality that the unmodified build does not).
- Any tools used to conduct the required testing shall produce sufficient evidence to demonstrate that the test was successful (e.g., if a tool is designed to erase a particular key, it should also attempt to perform some operation that requires the use of that key to provide evidence that the key destruction succeeded).
- The evaluator shall ensure that the tool actually performs the intended action and does not create a contrived outcome that imitates the results of a passing test without performing the actual operation (e.g. if a tool is designed to erase a particular key and output its value as proof of this, the tool should be obtaining the actual key value and not simply returning a static result).
The test EAs for individual SFRs identify cases where developer tools may be needed to execute the test as written.
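As an illustration of the evidence requirement above, a destruction tool might pair the erase operation with a follow-up use attempt that must fail. The sketch below is purely hypothetical: the keystore class and its methods stand in for a developer-provided tool and do not correspond to any real DSC interface.

```python
# Hypothetical mock standing in for a developer-provided DSC test tool.
# A real tool would drive the device's low-level interface instead.

class KeyDestroyedError(Exception):
    """Raised when an operation references a key that no longer exists."""

class MockKeystore:
    def __init__(self):
        self._keys = {}

    def provision(self, handle, key):
        self._keys[handle] = key

    def destroy(self, handle):
        # Overwrite before dropping the reference, mirroring destruction-by-overwrite.
        self._keys[handle] = b"\x00" * len(self._keys[handle])
        del self._keys[handle]

    def encrypt(self, handle, data):
        if handle not in self._keys:
            raise KeyDestroyedError(handle)
        key = self._keys[handle]
        # Toy XOR "cipher": enough to show the evidence pattern, not real crypto.
        return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def destroy_with_evidence(store, handle):
    """Destroy a key, then attempt an operation requiring it as evidence."""
    store.destroy(handle)
    try:
        store.encrypt(handle, b"probe")
        return False          # key still usable: destruction not evidenced
    except KeyDestroyedError:
        return True           # follow-up use failed: evidence of destruction
```

The point of the pattern is that the tool's output includes the failed use attempt, not merely a claim that the erase command completed.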
2. Evaluation Activities for SFRs
The EAs presented in this section capture the actions the evaluator performs to address technology specific aspects covering specific SARs (e.g., ASE_TSS.1, ADV_FSP.1, AGD_OPE.1, and ATE_IND.1) - this is in addition to the CEM work units that are performed in Section 5, Evaluation Activities for SARs.
Regarding design descriptions (designated by the subsections labelled TSS, as well as any required supplementary material that may be treated as proprietary designated by the subsections labelled KMD), the evaluator must ensure there is specific information that satisfies the EA. For findings regarding the TSS and KMD sections, the evaluator’s verdicts will be associated with the CEM work unit ASE_TSS.1-1. Evaluator verdicts associated with the supplementary evidence will also be associated with ASE_TSS.1-1, since the requirement to provide such evidence is specified in ASE in the cPP.
For ensuring the guidance documentation provides sufficient information for the administrators/users as it pertains to SFRs, the evaluator’s verdicts will be associated with CEM work units ADV_FSP.1-7, AGD_OPE.1-4, and AGD_OPE.1-5.
Finally, the subsection labelled Tests is where the iTC has determined that testing of the product in the context of the associated SFR is necessary. While the evaluator is expected to develop tests, there may be instances where it is more practical for the developer to construct tests, or where the developer may have existing tests, as mentioned in section 1.4 above. Therefore, it is acceptable for the evaluator to witness developer-generated tests in lieu of executing the tests. In this case, the evaluator must ensure the developer’s tests are executing both in the manner declared by the developer and as mandated by the EA. The CEM work units that are associated with the EAs specified in this section are: ATE_IND.1-3, ATE_IND.1-4, ATE_IND.1-5, ATE_IND.1-6, and ATE_IND.1-7.
2.1. Cryptographic Support (FCS)
2.1.1. Cryptographic Key Management (FCS_CKM)
2.1.1.1. FCS_CKM.1 Cryptographic Key Generation
2.1.1.1.1. TSS
The evaluator shall examine the TSS to determine whether it describes any supported key generation or derivation functionality consistent with the claims made in FCS_CKM.1.1.
[conditional] If the key is generated according to an asymmetric key scheme, the evaluator shall review the TSS to determine that it describes how the functionality described by FCS_CKM.1/AKG is invoked. The evaluator uses the description of the key generation functionality in FCS_CKM.1/AKG or documentation available for the operational environment to determine that the key strength being requested is greater than or equal to 112 bits.
[conditional] If the key is generated according to a symmetric key scheme, the evaluator shall review the TSS to determine that it describes how the functionality described by FCS_CKM.1/SKG is invoked. The evaluator uses the description of the RBG functionality in FCS_RBG.1, to determine that the key size being requested is greater than or equal to the key size and mode to be used for the encryption/decryption of the data.
[conditional] If the key is formed from derivation, the evaluator shall verify that the TSS describes the method of derivation and that this method is consistent with FCS_CKM.5 or FCS_CKM_EXT.8, depending on the key derivation method claimed.
2.1.1.1.2. AGD
There are no guidance evaluation activities for this component.
2.1.1.1.3. Test
The evaluator shall iterate through each of the methods selected by the ST and perform all applicable tests from the selected methods.
2.1.1.1.4. KMD
The evaluator shall iterate through each of the methods selected by the ST and confirm that the KMD describes the applicable selected methods.
2.1.1.2. FCS_CKM.2 Cryptographic Key Distribution
2.1.1.2.1. TSS
The evaluator shall examine the TSS to determine whether it describes the supported methods for key distribution functionality consistent with the selection(s) made in FCS_CKM.2.1. The evaluator shall confirm that the matching SFRs are included in the ST based on the selection(s) made in FCS_CKM.2.1.
2.1.1.2.2. AGD
The evaluator shall verify that the guidance instructs the administrator how to configure the TOE to use the selected key distribution method.
2.1.1.2.3. Test
Testing for this SFR is performed under the corresponding functions based on the selection(s) made in the ST.
2.1.1.2.4. KMD
There are no KMD evaluation activities for this component.
2.1.1.3. FCS_CKM.6 Cryptographic Key Destruction
2.1.1.3.1. TSS
The evaluator shall examine the TSS to ensure it lists all relevant keys and keying material (describing the source of the data, all memory types in which the data is stored (covering storage both during and outside of a session, and both plaintext and non-plaintext forms of the data)), all relevant destruction situations (including the point in time at which the destruction occurs; e.g. factory reset or device wipe function, change of authorization data, change of DEK, completion of use of an intermediate key) and the destruction method used in each case. The evaluator shall confirm that the description of the data and storage locations is consistent with the functions carried out by the TOE (e.g. that all keys in the key chain are accounted for[1]).
The evaluator shall check that the TSS identifies any configurations or circumstances that may not conform to the key destruction requirement (see further discussion in the AGD section below). Note that reference may be made to the AGD for description of the detail of such cases where destruction may be prevented or delayed.
Where the ST specifies the use of "some value that does not contain any CSP" to overwrite keys, the evaluator shall examine the TSS to ensure that it describes how that pattern is obtained and used, and that this justifies the claim that the pattern does not contain any sensitive data.
2.1.1.3.2. AGD
The evaluator shall check that the guidance documentation for the TOE requires users to ensure that the TOE remains under the user’s control while a session is active.
A TOE may be subject to situations that could prevent or delay data destruction in some cases. The evaluator shall check that the guidance documentation identifies configurations or circumstances that may not strictly conform to the key destruction requirement, and that this description is consistent with the relevant parts of the TSS (and KMD). The evaluator shall check that the guidance documentation provides guidance on situations where key destruction may be delayed at the physical layer, identifying any additional mitigation actions for the user (e.g. there might be some operation the user can invoke, or the user might be advised to retain control of the device for some particular time to maximise the probability that garbage collection will have occurred).
For example, when the TOE does not have full access to the physical memory, it is possible that the storage may implement wear-levelling and garbage collection. This may result in additional copies of the data that are logically inaccessible but persist physically. Where available, the TOE might then describe use of the TRIM command[2] and garbage collection to destroy these persistent copies upon their deletion (this would be explained in TSS and guidance documentation).
2.1.1.3.3. Test
The following tests require the developer to provide access to a test platform that provides the evaluator with tools that are typically not found on factory products.
The evaluator shall perform the following tests:
Test 1 [conditional]: If the TOE supports directly examining the SDO/SDE memory, this test is applied to each key or keying material held as plaintext in volatile memory and subject to destruction by overwrite by the TOE (whether or not the plaintext value is subsequently encrypted for storage in volatile or non-volatile memory).
1. Record the value of the key or keying material.
2. Cause the TOE to dump the SDO/SDE memory of the TOE into a binary file.
3. Search the content of the binary file created in Step #2 to locate all instances of the known key value from Step #1.

Note that the primary purpose of Step #3 is to demonstrate that appropriate search commands are being used for Steps #8 and #9.

4. Cause the TOE to perform normal cryptographic processing with the key from Step #1.
5. Cause the TOE to destroy the key.
6. Cause the TOE to stop execution but not exit.
7. Cause the TOE to dump the SDO/SDE memory of the TOE into a binary file.
8. Search the content of the binary file created in Step #7 for instances of the known key value from Step #1.
9. Break the key value from Step #1 into an evaluator-chosen set of fragments and perform a search using each fragment. (Note that the evaluator shall first confirm with the developer how the key is normally stored, in order to choose fragment sizes that are the same or smaller than any fragmentation of the data that may be implemented by the TOE. The endianness or byte-order should also be taken into account in the search.)
Steps #1-#8 ensure that the complete key does not exist anywhere in volatile memory. If a copy is found, then the test fails.

Step #9 ensures that partial key fragments do not remain in memory. If the evaluator finds a fragment of 32 or more consecutive bits, the test fails immediately. For shorter matches, there is a chance that the match is not within the context of a key (e.g., some random bits that happen to coincide); in this case the test should be repeated with a different key in Step #1. If a fragment is also found in this repeated run, the test fails unless the developer provides a reasonable explanation for the collision, in which case the evaluator may give a pass on this test.
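The searches in Steps #3, #8, and #9 can be sketched as follows. Acquiring the memory dump is TOE-specific, so the sketch simply assumes the dump is already available as a byte string; the function names are illustrative, not from any standard tool.

```python
def find_key(dump: bytes, key: bytes) -> list[int]:
    """Return every offset at which the full key value occurs in the dump."""
    offsets, start = [], dump.find(key)
    while start != -1:
        offsets.append(start)
        start = dump.find(key, start + 1)
    return offsets

def find_fragments(dump: bytes, key: bytes, frag_len: int = 4) -> list[int]:
    """Search for frag_len-byte (32-bit by default) fragments of the key,
    in both byte orders, as Step #9 requires."""
    hits = []
    for i in range(0, len(key) - frag_len + 1):
        frag = key[i:i + frag_len]
        for candidate in (frag, frag[::-1]):   # account for endianness
            hits.extend(find_key(dump, candidate))
    return sorted(set(hits))
```

A dump taken before destruction should yield hits (confirming the search commands work, per Step #3); a dump taken after destruction should yield none from either function.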
Test 2 [conditional]: If the TOE supports directly examining the non-volatile memory, this test is applied to each key and keying material held in non-volatile memory and subject to destruction by overwrite by the TOE.
1. Record the value of the key or keying material.
2. Cause the TOE to perform normal cryptographic processing with the key from Step #1.
3. Search the non-volatile memory the key was stored in for instances of the known key value from Step #1.

Note that the primary purpose of Step #3 is to demonstrate that appropriate search commands are being used for Steps #5 and #6.

4. Cause the TOE to destroy the key.
5. Search the non-volatile memory in which the key was stored for instances of the known key value from Step #1. If a copy is found, then the test fails.
6. Break the key value from Step #1 into an evaluator-chosen set of fragments and perform a search using each fragment. (Note that the evaluator shall first confirm with the developer how the key is normally stored, in order to choose fragment sizes that are the same or smaller than any fragmentation of the data that may be implemented by the TOE. The endianness or byte-order should also be taken into account in the search.)
Step #6 ensures that partial key fragments do not remain in non-volatile memory. If the evaluator finds a fragment of 32 or more consecutive bits, the test fails immediately. For shorter matches, there is a chance that the match is not within the context of a key (e.g., some random bits that happen to coincide); in this case the test should be repeated with a different key in Step #1. If a fragment is also found in this repeated run, the test fails unless the developer provides a reasonable explanation for the collision, in which case the evaluator may give a pass on this test.
Test 3 [conditional]: If the TOE supports directly examining the non-volatile memory, this test is applied to each key and keying material held in non-volatile memory and subject to destruction by overwrite by the TOE.
1. Record the memory location of the key or keying material.
2. Cause the TOE to perform normal cryptographic processing with the key from Step #1.
3. Cause the TOE to destroy the key. Record the value to be used for the overwrite of the key.
4. Examine the memory location from Step #1 to ensure the appropriate pattern (recorded in Step #3) is used.
The test succeeds if the correct pattern is found at the memory location. If the pattern is not found, then the test fails.
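The final comparison in Step #4 reduces to checking that the recorded location now holds the recorded overwrite pattern. A minimal sketch, assuming the post-destruction memory contents are available as a byte string:

```python
def pattern_overwritten(dump: bytes, offset: int, pattern: bytes) -> bool:
    """True iff the memory at the offset recorded in Step #1 now holds
    the overwrite pattern recorded in Step #3."""
    return dump[offset:offset + len(pattern)] == pattern
```

For example, if a 16-byte key at offset 8 was overwritten with zeros, `pattern_overwritten(dump, 8, b"\x00" * 16)` must return True for the test to pass.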
Test 4 [conditional]: If the TOE does not support directly examining the SDO/SDE memory or non-volatile memory, this test is applied to each key and keying material held in volatile memory or non-volatile memory.
1. Record the corresponding checksum value of the key or keying material for a given key reference. The checksum value must be deterministically computed by the TOE on the entire key or keying material. Possible methods include error detection codes, cryptographic hashes, and encryption using a fixed key.
2. Cause the TOE to perform normal cryptographic processing with the key from Step #1.
3. Cause the TOE to destroy the key.
4. Record the corresponding checksum value of the key or keying material for the key reference from Step #1.
5. Verify that the corresponding checksum values obtained from Step #1 and Step #4 are different.
The test succeeds if the corresponding checksum values are found to be different. If they are found to be identical, then the test fails.
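In Test 4 the TOE computes the checksum internally; the sketch below only models the verdict logic, using SHA-256 as an example of a permitted deterministic checksum (a cryptographic hash).

```python
import hashlib

def key_checksum(key_material: bytes) -> bytes:
    # A cryptographic hash is one of the permitted deterministic checksums.
    return hashlib.sha256(key_material).digest()

def destruction_changed_key(before: bytes, after: bytes) -> bool:
    """Test 4 verdict: pass iff the checksums from Step #1 and Step #4 differ."""
    return key_checksum(before) != key_checksum(after)
```

Because the checksum is computed over the entire key, any surviving copy of the original value would reproduce the original checksum and fail the comparison.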
Note that each key and keying material held in volatile or non-volatile memory must be tested using either Test 1, Tests 2 and 3, or Test 4. Tests 1 through 3 are preferred and shall be performed where possible.
2.1.1.3.4. KMD
The evaluator shall examine the KMD to verify that it identifies and describes the interfaces that are used to service commands to read/write memory. The evaluator shall examine the interface description for each different media type to ensure that the interface supports the selections made by the ST author.
The evaluator shall examine the KMD to ensure that all keys and keying material identified in the TSS and KMD have been accounted for.
Note that where selections include 'destruction of reference to the key directly followed by a request for garbage collection' (for volatile memory) then the evaluator shall examine the KMD to ensure that it explains the nature of the destruction of the reference, the request for garbage collection, and of the garbage collection process itself.
2.1.1.4. FCS_CKM_EXT.7 Cryptographic Key Agreement
2.1.1.4.1. TSS
The evaluator shall ensure that the selected RSA, DH, and ECDH key agreement schemes correspond to the key generation schemes selected in FCS_CKM.1/AKG. If the ST selects DH, the TSS shall describe how the implementation meets the relevant sections of RFC 3526 (Sections 3-7) or RFC 7919 (Appendices A.1-A.5). If the ST selects ECIES, the TSS shall describe the key sizes and algorithms (e.g. elliptic curve point multiplication, ECDH with either NIST or Brainpool curves, a symmetric cipher permitted by FCS_COP.1/SKC, a hash algorithm permitted by FCS_COP.1/Hash, and a MAC algorithm permitted by FCS_COP.1/KeyedHash or FCS_COP.1/CMAC) that are supported for the ECIES implementation.
The evaluator shall ensure that, for each key agreement scheme, the size of the derived keying material is at least the same as the intended strength of the key agreement scheme, and where feasible this should be twice the intended security strength of the key agreement scheme.
For each key agreement scheme, the TSS shall list the associated key derivation function if applicable.
2.1.1.4.2. AGD
The evaluator shall verify that the guidance instructs the administrator how to configure the TOE to use the selected key agreement scheme.
2.1.1.4.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Modulus size (KAS2)
- Domain parameters (DH)
- Curve (ECDH, ECIES)
- Key agreement role (initiator or responder)
- Hash algorithm (if applicable)
- Associated data pattern (if applicable)
- KDF configuration (if applicable)
Each test group shall consist of at least 10 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary "owner" private key, an arbitrary "peer" public key, an arbitrary associated data string (if applicable), and the corresponding shared secret or derived key.
- A representative sample of the domain of supported associated data sizes shall be tested.
- A representative sample of the domain of supported shared secret sizes shall be tested.
- A representative sample of the domain of supported derived key sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct shared secret or derived key.
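The shared-secret comparison at the heart of these test cases can be sketched as follows, assuming the Python `cryptography` package is available as the known-good implementation (in a real test, one side's computation would be performed by the TSF):

```python
# Shared-secret agreement sketch on P-256, assuming the Python
# "cryptography" package as the known-good implementation; in a real
# test one side's computation would be performed by the TSF.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

owner_priv = ec.generate_private_key(ec.SECP256R1())  # "owner" key pair
peer_priv = ec.generate_private_key(ec.SECP256R1())   # "peer" key pair

# Both sides must derive the identical shared secret.
secret_owner = owner_priv.exchange(ec.ECDH(), peer_priv.public_key())
secret_peer = peer_priv.exchange(ec.ECDH(), owner_priv.public_key())
assert secret_owner == secret_peer

# Example KDF step: 32 bytes of keying material (HKDF-SHA-256 shown).
keying_material = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"example",
).derive(secret_owner)
```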
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.1.4.4. KMD
There are no KMD evaluation activities for this component.
2.1.2. Cryptographic Operation (FCS_COP)
2.1.2.1. FCS_COP.1/Hash Cryptographic Operation (Hashing)
2.1.2.1.1. TSS
The evaluator shall check that the association of the hash function with other TSF cryptographic functions (for example, the digital signature verification function) is documented in the TSS. The evaluator shall also check that the TSS identifies whether the implementation is bit-oriented or byte-oriented.
2.1.2.1.2. AGD
The evaluator checks the AGD documents to determine that any configuration that is required to configure the required hash sizes is present. The evaluator also checks the AGD documents to confirm that the instructions for establishing the evaluated configuration use only those hash algorithms selected in the ST.
2.1.2.1.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Hash algorithm
Each test group shall consist of at least 150 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary input message and its corresponding digest value.
- A representative sample of the domain of supported input message sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct digest value.
Additionally, each test group shall contain at least one Monte Carlo Test (MCT) test case, consisting of an arbitrary seed and a list of 100 digest values. For hash algorithms, the MCT is defined as follows:
for i = 1 through 100
    msg = seed
    for j = 1 through 1000
        digest = hash(msg)
        msg = digest
    output digest
    seed = digest
For each test case in each test group, the evaluator shall verify that the TSF generates the correct 100 digest values.
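The MCT above can be sketched directly from the standard library; SHA-256 is used here as an illustrative hash algorithm:

```python
# Hash Monte Carlo Test sketch (SHA-256 shown): each outer iteration
# chains 1000 hash computations and outputs one checkpoint digest.
import hashlib

def hash_mct(seed: bytes, algorithm: str = "sha256") -> list:
    checkpoints = []
    for _ in range(100):
        msg = seed
        for _ in range(1000):
            digest = hashlib.new(algorithm, msg).digest()
            msg = digest
        checkpoints.append(digest)  # output digest
        seed = digest               # seed = digest
    return checkpoints

checkpoints = hash_mct(b"\x00" * 32)
```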
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.2.1.4. KMD
There are no KMD evaluation activities for this component.
2.1.2.2. FCS_COP.1/KeyedHash Cryptographic Operation (Keyed Hash)
2.1.2.2.1. TSS
The evaluator shall examine the TSS to ensure that it specifies the following values used by the HMAC and KMAC functions: output MAC length used.
2.1.2.2.2. AGD
There are no guidance evaluation activities for this component.
2.1.2.2.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Hash algorithm
Each test group shall consist of at least 75 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary key, an arbitrary input message, and its corresponding MAC tag.
- A representative sample of the domain of supported key sizes shall be tested.
- A representative sample of the domain of supported input message sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct MAC tag.
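A single keyed-hash test case can be illustrated with the standard library (HMAC-SHA-256 shown; the key and message sizes are arbitrary examples):

```python
# One keyed-hash test case built from the standard library: an arbitrary
# key and message, and the HMAC-SHA-256 tag the TSF must reproduce.
import hashlib, hmac, os

key = os.urandom(32)      # arbitrary key from the supported key-size domain
message = os.urandom(57)  # arbitrary input message size
tag = hmac.new(key, message, hashlib.sha256).digest()

# The evaluator-side check against the TSF output (shown here against
# the reference itself) uses a constant-time comparison.
assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
```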
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.2.2.4. KMD
There are no KMD evaluation activities for this component.
2.1.2.3. FCS_COP.1/SigGen Cryptographic Operation (Signature Generation)
2.1.2.3.1. TSS
The evaluator shall examine the TSS to ensure that all signature generation functions use the approved algorithms and key sizes.
2.1.2.3.2. AGD
There are no AGD evaluation activities for this component.
2.1.2.3.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Modulus size (RSA)
- Curve (ECDSA, EC-KCDSA, EdDSA)
- Group size (KCDSA)
- Private key size (LMS, HSS, XMSS, XMSSMT)
- Padding scheme (RSA)
- Hash or XOF algorithm
Each test group shall consist of at least 6 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary private key and an arbitrary input message to be signed.
- A representative sample of the domain of supported message sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates a signature that is structurally well-formed according to the relevant standard. Additionally, the evaluator shall use a known-good implementation to verify that the generated signature is indeed a valid signature on the input message.
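One ECDSA test case might look like the following sketch, assuming the Python `cryptography` package serves as the known-good verifier (the `sign()` call stands in for the TSF signing operation):

```python
# One ECDSA P-256 signature-generation test case, assuming the Python
# "cryptography" package as the known-good verifier; sign() is a
# stand-in for the TSF signing operation.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes

private_key = ec.generate_private_key(ec.SECP256R1())  # arbitrary private key
message = b"arbitrary input message to be signed"

signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Known-good verification: verify() raises InvalidSignature on failure,
# confirming the generated signature is valid for the input message.
private_key.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))
```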
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.2.3.4. KMD
There are no KMD evaluation activities for this component.
2.1.2.4. FCS_COP.1/SigVer Cryptographic Operation (Signature Verification)
2.1.2.4.1. TSS
The evaluator shall check the TSS to ensure that it describes the overall flow of the signature verification. This should at least include identification of the format and general location (e.g., "firmware on the hard drive device" rather than "memory location 0x00007A4B") of the data to be used in verifying the digital signature; how the data received from the operational environment are brought onto the device; and any processing that is performed that is not part of the digital signature algorithm (for instance, checking of certificate revocation lists).
2.1.2.4.2. AGD
There are no AGD evaluation activities for this component.
2.1.2.4.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Modulus size (RSA)
- Curve (ECDSA, EC-KCDSA, EdDSA)
- Group size (DSA, KCDSA)
- Private key size (LMS, HSS, XMSS, XMSSMT)
- Padding scheme (RSA)
- Hash or XOF algorithm
Each test group shall consist of at least 6 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary public key, an arbitrary input message, and a signature. For 5 of the 6 test cases, the public key or input message shall be modified such that the signature is invalid.
- A representative sample of the domain of supported message sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF correctly reports the validity of the signature for the input message.
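The invalid-signature cases can be sketched by modifying the message (or public key) after signing and confirming rejection; this sketch assumes the Python `cryptography` package:

```python
# Positive and negative verification cases for ECDSA P-256, assuming the
# Python "cryptography" package; a tampered message must be rejected.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes

priv = ec.generate_private_key(ec.SECP256R1())
message = b"arbitrary input message"
signature = priv.sign(message, ec.ECDSA(hashes.SHA256()))
public_key = priv.public_key()

# Valid case: verification completes without raising.
public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))

# Invalid case: the modified message must be reported as invalid.
try:
    public_key.verify(signature, message + b"X", ec.ECDSA(hashes.SHA256()))
    rejected = False
except InvalidSignature:
    rejected = True
assert rejected
```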
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.2.4.4. KMD
There are no KMD evaluation activities for this component.
2.1.2.5. FCS_COP.1/SKC Cryptographic Operation (Symmetric-Key Cryptography)
2.1.2.5.1. TSS
The evaluator shall check that the TSS includes a description of the encryption functions used for symmetric key encryption. The evaluator should check that this description of the selected encryption function includes the key sizes and modes of operation as specified in the [DSC cPP] Table "Symmetric-Key Cryptography".
The evaluator shall check that the TSS describes the means by which the TOE satisfies constraints on algorithm parameters included in the selections made for 'cryptographic algorithm' and 'list of standards'.
2.1.2.5.2. AGD
If the product supports multiple modes, the evaluator shall examine the vendor’s documentation to determine that the method of choosing a specific mode/key size by the end user is described.
2.1.2.5.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Mode of operation
- Algorithm direction (encrypt/decrypt)
- Key size
For each test group, the evaluator shall generate at least 50 test cases meeting the following requirements:
- Each test case shall consist of an arbitrary key, an arbitrary input plaintext/ciphertext, and its corresponding ciphertext/plaintext.
- Depending on the mode of operation, an initialization vector, counter, or tweak value shall also be included in the test case.
- A representative sample of the domain of supported input plaintext/ciphertext sizes shall be tested.
- A representative sample of the domain of supported initialization vector sizes shall be tested (CBC, CFB, OFB modes).
- A representative sample of the domain of supported counter sizes shall be tested (CTR mode).
- A representative sample of the domain of supported tweak sizes shall be tested (XTS mode).
For each test case in each test group, the evaluator shall verify that the TSF generates the correct ciphertext/plaintext.
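A single AES-128-CBC test case reduces to the following round-trip sketch, again assuming the Python `cryptography` package as the known-good implementation:

```python
# One AES-128-CBC test case: encrypt with the known-good implementation
# (Python "cryptography" package assumed) and confirm decryption inverts it.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)        # arbitrary key from the supported domain
iv = os.urandom(16)         # CBC requires a 16-byte initialization vector
plaintext = os.urandom(48)  # arbitrary block-aligned plaintext

cipher = Cipher(algorithms.AES(key), modes.CBC(iv))
enc = cipher.encryptor()
ciphertext = enc.update(plaintext) + enc.finalize()

dec = cipher.decryptor()
recovered = dec.update(ciphertext) + dec.finalize()
assert recovered == plaintext
```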
Additionally, each test group corresponding to a CBC, CFB, or OFB mode of operation shall contain at least one Monte Carlo Test (MCT) test case, consisting of an arbitrary key, an arbitrary initialization vector, an arbitrary input plaintext/ciphertext, and a list of 100 corresponding ciphertexts/plaintexts.
For encryption with a 128-bit block size, the MCT is defined as follows:
for i = 1 through 100
    output key
    output iv
    output pt
    encrypt_init(key, iv)
    ct[1] = encrypt(pt)
    pt = iv
    for j = 2 through 1000
        ct[j] = encrypt(pt)
        pt = ct[j - 1]
    output ct[j]
    if key_len == block_size
        key = key xor ct[j]
    if 2 * key_len == 3 * block_size
        key = key xor (lsb(ct[j - 1], block_size / 2) || ct[j])
    if key_len == 2 * block_size
        key = key xor (ct[j - 1] || ct[j])
    iv = ct[j]
    pt = ct[j - 1]
For decryption with a 128-bit block size, the MCT is defined as follows:
for i = 1 through 100
    output key
    output iv
    output ct
    decrypt_init(key, iv)
    pt[1] = decrypt(ct)
    ct = iv
    for j = 2 through 1000
        pt[j] = decrypt(ct)
        ct = pt[j - 1]
    output pt[j]
    if key_len == block_size
        key = key xor pt[j]
    if 2 * key_len == 3 * block_size
        key = key xor (lsb(pt[j - 1], block_size / 2) || pt[j])
    if key_len == 2 * block_size
        key = key xor (pt[j - 1] || pt[j])
    iv = pt[j]
    ct = pt[j - 1]
For each test case in each test group, the evaluator shall verify that the TSF generates the correct 100 ciphertexts/plaintexts.
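The encryption MCT can be transliterated for the AES-128 case (the key_len == block_size branch) as the following illustrative sketch, assuming the Python `cryptography` package:

```python
# Encryption MCT sketch for AES-128-CBC (the key_len == block_size
# branch of the MCT pseudocode), assuming the Python "cryptography"
# package; produces the 100 checkpoint ciphertexts.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def aes128_cbc_encrypt_mct(key: bytes, iv: bytes, pt: bytes) -> list:
    checkpoints = []
    for _ in range(100):
        enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
        ct = enc.update(pt)              # ct[1] = encrypt(pt)
        pt = iv                          # pt = iv
        ct_prev = None
        for _ in range(999):             # j = 2 through 1000
            ct_prev, ct = ct, enc.update(pt)
            pt = ct_prev                 # pt = ct[j - 1]
        checkpoints.append(ct)           # output ct[j]
        key = xor(key, ct)               # key = key xor ct[j]
        iv = ct                          # iv = ct[j]
        pt = ct_prev                     # pt = ct[j - 1]
    return checkpoints

cts = aes128_cbc_encrypt_mct(os.urandom(16), os.urandom(16), os.urandom(16))
```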
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.2.5.4. KMD
The evaluator shall examine the KMD to ensure that the points at which symmetric key encryption and decryption occurs are described, and that the complete data path for symmetric key encryption is described. The evaluator checks that this description is consistent with the relevant parts of the TSS.
Assessment of the complete data path for symmetric key encryption includes confirming that the KMD describes the data flow from the device’s host interface to the device’s non-volatile memory storing the data, and gives information enabling the user data datapath to be distinguished from those situations in which data bypasses the data encryption engine (e.g. read-write operations to an unencrypted Master Boot Record area). The evaluator shall ensure that the documentation of the data path is detailed enough that it thoroughly describes the parts of the TOE that the data passes through (e.g. different memory types, processors and co-processors), its encryption state (i.e. encrypted or unencrypted) in each part, and any places where the data is stored. For example, any caching or buffering of the data should be identified and distinguished from the final destination in non-volatile memory (the latter represents the location from which the host will expect to retrieve the data in future).
If support for AES-CTR is claimed and the counter value source is internal to the TOE, the evaluator shall verify that the KMD describes the internal counter mechanism used to ensure that it provides unique counter block values.
2.1.3. Random Bit Generation (FCS_RBG)
2.1.3.1. FCS_RBG.1 Random Bit Generation (RBG)
In addition to the materials below, documentation shall be produced—and the evaluator shall perform the activities—in accordance with Appendix D of [DSC cPP].
2.1.3.1.1. TSS
The evaluator shall examine the TSS to determine that it specifies the DRBG type, identifies the entropy sources seeding the DRBG, and states the assumed or calculated min-entropy supplied either separately by each source or contained in the combined seed value.
2.1.3.1.2. AGD
There are no AGD evaluation activities for this component.
2.1.3.1.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Hash algorithm (Hash_DRBG or HMAC_DRBG)
- Block cipher algorithm (CTR_DRBG)
- Presence of a derivation function (CTR_DRBG)
- Prediction resistance support
Each test group shall consist of at least 15 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary entropy input value, an arbitrary nonce (if supported), an arbitrary personalization string (if supported), two arbitrary additional input values (if supported), and the corresponding pseudorandom output.
- If the DRBG implementation supports prediction resistance or reseeding, each test case shall also include one entropy input value and one additional input value for reseeding.
- A representative sample of the domain of supported entropy input sizes shall be tested.
- A representative sample of the domain of supported nonce sizes shall be tested.
- A representative sample of the domain of supported personalization string sizes shall be tested.
- A representative sample of the domain of supported additional input sizes shall be tested.
- A representative sample of the domain of supported output sizes shall be tested.
For each test case in each test group, the evaluator shall perform the following steps:
- Instantiate the DRBG implementation using the entropy input value, nonce (if present), and personalization string (if present).
- If the DRBG implementation supports prediction resistance or reseeding, reseed the DRBG using the entropy input value and additional input value.
- Generate bits from the DRBG matching the pseudorandom output length from the test case. Discard the generated bits.
- Generate bits from the DRBG matching the pseudorandom output length from the test case. Verify the generated bits are equal to the pseudorandom output.
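These steps can be sketched with a minimal HMAC_DRBG (SHA-256) built from the standard library following NIST SP 800-90A; a second, identically seeded instance stands in for the known-good implementation, so the comparison demonstrates the procedure rather than conformance:

```python
# Minimal HMAC_DRBG (SHA-256) per NIST SP 800-90A, for illustration
# only; it omits reseed counters and health tests a real DRBG requires.
import hashlib, hmac

class HmacDrbg:
    def __init__(self, entropy: bytes, nonce: bytes = b"", pers: bytes = b""):
        self.K = b"\x00" * 32
        self.V = b"\x01" * 32
        self._update(entropy + nonce + pers)

    def _mac(self, data: bytes) -> bytes:
        return hmac.new(self.K, data, hashlib.sha256).digest()

    def _update(self, provided: bytes) -> None:
        self.K = self._mac(self.V + b"\x00" + provided)
        self.V = self._mac(self.V)
        if provided:
            self.K = self._mac(self.V + b"\x01" + provided)
            self.V = self._mac(self.V)

    def reseed(self, entropy: bytes, addin: bytes = b"") -> None:
        self._update(entropy + addin)

    def generate(self, nbytes: int, addin: bytes = b"") -> bytes:
        if addin:
            self._update(addin)
        out = b""
        while len(out) < nbytes:
            self.V = self._mac(self.V)
            out += self.V
        self._update(addin)
        return out[:nbytes]

# Evaluator steps: instantiate, reseed, generate-and-discard, then
# generate-and-compare against the expected pseudorandom output.
seed, nonce, pers = bytes(range(32)), b"\x10" * 16, b"personalization"
drbg = HmacDrbg(seed, nonce, pers)
drbg.reseed(b"\x22" * 32, b"reseed additional input")
drbg.generate(64)                 # Step 3: discard the first output
observed = drbg.generate(64)      # Step 4: compare this output

reference = HmacDrbg(seed, nonce, pers)  # stand-in for known-good impl
reference.reseed(b"\x22" * 32, b"reseed additional input")
reference.generate(64)
expected = reference.generate(64)
assert observed == expected
```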
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
2.1.3.1.4. KMD
There are no KMD evaluation activities for this component.
2.1.4. One-Time Value (Extended - FCS_OTV_EXT)
2.1.4.1. FCS_OTV_EXT.1 One-Time Value
2.1.4.1.1. TSS
The evaluator shall ensure the TSS describes how salts, nonces, and other one-time values are generated using the RBG.
2.1.4.1.2. AGD
There are no AGD evaluation activities for this component.
2.1.4.1.3. Test
The evaluator shall confirm by testing that the one-time values used by cryptographic operations are of the length specified in FCS_OTV_EXT.1, are obtained from the RBG, and are fresh on each invocation.
Note: in general these tests may be carried out as part of the tests of the relevant cryptographic operations.
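Where one-time values are exposed through a test interface, the length and freshness checks reduce to a sketch like the following (`os.urandom` stands in for the TOE's RBG output; the length and sample count are illustrative):

```python
# Freshness/length sketch: os.urandom stands in for the one-time values
# obtained from the TOE's RBG via a test interface.
import os

NONCE_LEN = 16  # use the length specified in FCS_OTV_EXT.1
nonces = [os.urandom(NONCE_LEN) for _ in range(10_000)]

assert all(len(n) == NONCE_LEN for n in nonces)  # correct length
assert len(set(nonces)) == len(nonces)           # fresh on each invocation
```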
2.1.4.1.4. KMD
There are no KMD evaluation activities for this component.
2.1.5. Cryptographic Key Storage (Extended - FCS_STG_EXT)
2.1.5.1. FCS_STG_EXT.1 Protected Storage
2.1.5.1.1. TSS
The evaluator shall review the TSS to determine that the TOE implements the required protected storage. The evaluator shall ensure that the TSS contains a description of the protected storage mechanism that justifies the selection of mutable hardware-based or software-based.
2.1.5.1.2. AGD
The evaluator shall examine the operational guidance to ensure that it describes the process for generating keys, importing keys, or both, based on what is claimed by the ST. The evaluator shall also examine the operational guidance to ensure that it describes the process for destroying keys that have been imported or generated.
2.1.5.1.3. Test
The evaluator shall test the functionality of each security function as described below. If the TOE supports both import and generation of keys, the evaluator shall repeat the testing as needed to demonstrate that the keys resulting from both operations are treated in the same manner. The devices used with the tooling may need to be non-production devices in order to enable the execution and gathering of evidence.
Test 1: The evaluator shall import or generate keys/secrets of each supported type according to the operational guidance. The evaluator shall write, or the developer shall provide access to, an application that generates a key/secret of each supported type and calls the import functions. The evaluator shall verify that no errors occur during import.
Test 2: The evaluator shall write, or the developer shall provide access to, an application that uses a generated or imported key/secret:
- For RSA, the secret shall be used to sign data.
- For ECDSA, the secret shall be used to sign data.
The evaluator shall repeat this test with keys/secrets imported or generated by the application itself and with keys/secrets imported or generated by a different application. The evaluator shall verify that the TOE requires approval before allowing the application to use a key/secret imported or generated by the user or by a different application:
- The evaluator shall deny the approvals to verify that the application is not able to use the key/secret as described.
- The evaluator shall repeat the test, allowing the approvals to verify that the application is able to use the key/secret as described.
2.1.5.1.4. KMD
There are no KMD evaluation activities for this component.
2.2. User Data Protection (FDP)
2.2.1. Access Control Policy (FDP_ACC)
2.2.1.1. FDP_ACC.1 Subset Access Control
2.2.1.1.1. TSS
The evaluator shall confirm that the TSS contains the access control policy implemented by the TOE, where the ST author lists each object and identifies, for each object, which operations the TSF permits for each subject (i.e., what different roles can do).
2.2.1.1.2. AGD
There are no guidance evaluation activities for this component.
2.2.1.1.3. Test
This component is tested as part of FDP_ACF.1.
2.2.1.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.2. Access Control Functions (FDP_ACF)
2.2.2.1. FDP_ACF.1 Security Attribute Based Access Control
2.2.2.1.1. TSS
The evaluator shall examine the TSS to verify that it describes the policy rules for the Access Control SFP. Specifically, the evaluator should be able to identify, for any arbitrary subject-object-operation pairing, which of the following is true:
a. The subject can always perform the desired operation.
b. The subject can never perform the desired operation, either because they lack sufficient permission or because the TSF includes no interface to support the operation.
c. The subject can only perform the desired operation under certain conditions (which the evaluator shall verify are described in the TSS). For example, "the S.CApp subject may only perform the OP.Destroy operation on an OB.SDO object if it was the subject that originally created or imported the SDO."
d. The subject can only perform the desired operation on one or more attributes of the object as opposed to the entire object itself (which the evaluator shall verify are identified in the TSS).
e. Whether the subject can perform the desired operation depends on TSF configuration (which the evaluator shall verify is described in the TSS as part of the evaluation of FMT_SMF.1).
f. Some combination of c, d, and e.
Given that this SFR requires a large number of potential subject-object-operation pairings to be identified, it is not the expectation that the TSS contain an exhaustive list of these pairings. It is possible that large numbers of pairings are addressed by blanket statements of policy rules, such as "the subjects S.DSC and S.CApp are never able to perform any operation on the OB.AntiReplay object." For any rules that are not addressed in this manner, the evaluator shall verify the TSS includes sufficient data for the evaluator to determine how the TSF will evaluate the action. This can be presented in the form of a table, flowchart, list, or other manner that the ST author finds suitable.
Note that the TOE developer may not use the same terminology for its subjects, objects, and operations as the PP. If this is the case, the evaluator shall verify that the TSS includes a mapping that unambiguously shows how the vendor’s preferred terminology corresponds to what the PP defines.
2.2.2.1.2. AGD
For any access control policy enforcement behavior that is configurable, the evaluator shall ensure that the operational guidance describes how to perform the configuration, including any restrictions on permissible configurable settings.
2.2.2.1.3. Test
The following testing may require the TOE developer to make a test harness available to the evaluator that allows the evaluator to interface directly with the TOE. Due to the large volume of potential testing that this requires, this test may require the use of an automated script. If a test script is made available, the evaluator shall verify that it includes sufficient detail to validate the claims made in the TSS.
For each subject/object/operation/attribute combination, the evaluator shall attempt to perform the operation or determine that no interface is present to attempt the operation, consistent with the limitations described in the TSS.
For each case where an operation is always permitted or never permitted, both positive and negative testing will be conducted implicitly by attempting the operation with all possible subjects and determining that the intended results occur in each case.
For each case where the operation succeeds or fails based on the target object attribute, the evaluator shall ensure that both positive and negative testing is performed such that only the correct target attributes can be operated upon.
For each case where the operation succeeds or fails based on one or more specific conditions, the evaluator shall ensure that both positive and negative testing is performed such that the presence of the conditions causes the test to succeed while the absence of the conditions causes the test to fail.
For each case where the operation succeeds or fails based on an administratively configured setting, the evaluator shall ensure that both positive and negative testing is performed such that the configuration setting can be shown to affect whether or not the operation succeeds.
2.2.2.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.3. Export from the TOE (Extended - FDP_ETC_EXT)
2.2.3.1. FDP_ETC_EXT.2 Propagation of SDOs
2.2.3.1.1. TSS
The evaluator shall examine the TSS to ensure that it describes how the TOE protects SDO references and authorization data against access by unauthorized entities. In particular, it should describe how the TOE provides confidentiality of the data while it resides outside the TOE after export.
2.2.3.1.2. AGD
There are no guidance evaluation activities for this component.
2.2.3.1.3. Test
There are no test evaluation activities for this component.
2.2.3.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.4. Factory Reset (Extended - FDP_FRS_EXT)
2.2.4.1. FDP_FRS_EXT.1 Factory Reset
2.2.4.1.1. TSS
The evaluator shall examine the TSS to determine that it describes each of the conditions which may lead to a factory reset.
2.2.4.1.2. AGD
The evaluator shall examine the operational guidance to ensure that it describes the ways the administrator can set the conditions to initiate a factory reset if this is supported.
2.2.4.1.3. Test
Test 1 [conditional]: If activation by external interface is selected in FDP_FRS_EXT.1.1, the evaluator shall identify all external interfaces that reset the DSC to factory settings. For each external interface, the evaluator shall perform the following steps:
1. Create an SDE or SDO.
2. Verify the presence of the item created in the previous step.
3. Initiate the factory reset using the given external interface.
4. Verify that the item created in Step 1 no longer exists.
Test 2 [conditional]: If presentation of … is selected in FDP_FRS_EXT.1.1, the evaluator shall identify all functions that reset the DSC to factory settings and their respective required authorization data. For each function and for each authorization method (i.e., type of required authorization data), the evaluator shall perform the following steps:
1. Create an SDE or SDO.
2. Verify the presence of the item created in the previous step.
3. Initiate the factory reset using the given function and authorization data.
4. Verify that the item created in Step 1 no longer exists.
2.2.4.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.5. Import from Outside the TOE (Extended - FDP_ITC_EXT)
2.2.5.1. FDP_ITC_EXT.1 Parsing of SDEs
2.2.5.1.1. TSS
The evaluator shall confirm the TSS contains descriptions of the supported methods the TSF uses to import SDEs into the TOE. For each import method selected, the TSS shall describe integrity verification schemes employed. The TSS shall also list the ways the TSF generates and binds security attributes to the SDEs.
2.2.5.1.2. AGD
There are no AGD evaluation activities for this component.
2.2.5.1.3. Test
For each supported import method selected in FDP_ITC_EXT.1.1 and for each supported integrity verification method selected in FDP_ITC_EXT.1.2. used by the selected import method, the evaluator shall perform the following steps:
- Import one SDE with valid integrity data (e.g. valid hash, MAC tag, or signature).
- Verify that an SDO is created with valid security attributes in accordance with FDP_ITC_EXT.1.3 and FDP_ITC_EXT.1.4.
- Import one SDE with invalid integrity data.
- Verify that the operation results in an error and no SDO is created.
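As an illustration of the valid/invalid pair above, the following sketch models MAC-based integrity verification at import using Python's standard `hmac` module; the function names and the attribute set are assumptions for illustration, not the TOE's actual interface:

```python
import hashlib
import hmac
import secrets

KEY = secrets.token_bytes(32)  # integrity key held by the TSF (illustrative)

def export_sde(data: bytes):
    """Produce an SDE blob together with its MAC tag (integrity data)."""
    tag = hmac.new(KEY, data, hashlib.sha256).digest()
    return data, tag

def import_sde(data: bytes, tag: bytes):
    """Import an SDE only if its integrity data verifies; otherwise raise."""
    expected = hmac.new(KEY, data, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("integrity verification failed; no SDO created")
    # Security attributes generated and bound at import (illustrative set).
    return {"value": data, "attributes": {"type": "SDE", "origin": "imported"}}

data, tag = export_sde(b"secret key material")
sdo = import_sde(data, tag)             # valid integrity data: SDO created
assert sdo["attributes"]["origin"] == "imported"
try:
    import_sde(data, b"\x00" * 32)      # invalid integrity data
except ValueError:
    pass                                # error raised, no SDO created
```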
2.2.5.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.5.2. FDP_ITC_EXT.2 Parsing of SDOs
2.2.5.2.1. TSS
The evaluator shall confirm the TSS contains descriptions of the supported methods the TSF uses to import SDOs into the TOE. For each import method selected, the TSS shall describe integrity verification schemes employed. The TSS shall also list the ways the TSF generates and binds security attributes to the SDOs.
2.2.5.2.2. AGD
There are no AGD evaluation activities for this component.
2.2.5.2.3. Test
For each supported import method selected in FDP_ITC_EXT.2.1 and for each supported integrity verification method selected in FDP_ITC_EXT.2.2 used by the selected import method, the evaluator shall perform the following steps:
- Import one SDO with valid integrity data (e.g. valid hash, MAC tag, or signature).
- Verify that an SDO is created with valid security attributes in accordance with FDP_ITC_EXT.2.3, FDP_ITC_EXT.2.4, and FDP_ITC_EXT.2.5.
- Import one SDO with invalid integrity data.
- Verify that the operation results in an error and no SDO is created.
2.2.5.2.4. KMD
There are no KMD evaluation activities for this component.
2.2.6. Residual Information Protection (FDP_RIP)
2.2.6.1. FDP_RIP.1 Subset Residual Information Protection
2.2.6.1.1. TSS
The evaluator shall check to ensure that the TSS describes resource deallocation to the extent that they can determine that no data will be reused when reallocating resources following the destruction of an SDE or SDO. The evaluator shall ensure that this description at a minimum describes how the previous data is destroyed. The evaluator shall also ensure that this destruction method is consistent with FCS_CKM.6.
2.2.6.1.2. AGD
There are no AGD evaluation activities for this component.
2.2.6.1.3. Test
Testing for FCS_CKM.6 is sufficient to address this component.
2.2.6.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.7. Stored data confidentiality with dedicated method (FDP_SDC)
2.2.7.1. FDP_SDC.2 Stored data confidentiality with dedicated method
2.2.7.1.1. TSS
The evaluator shall examine the TSS to determine that it describes the protection for SDEs and authorization data and the methods of protection (e.g. protected storage, symmetric encryption, key wrapping, etc.).
The evaluator shall also examine the TSS to determine whether the TSF stores this data inside the TOE boundary or in its operational environment. The evaluator shall examine that the methods of protection used for the various SDEs and authorization data are specified based on where the data resides (inside or outside the TOE boundary), and that the protection method is appropriate to provide confidentiality.
2.2.7.1.2. AGD
There are no AGD evaluation activities for this component.
2.2.7.1.3. Test
The tests for FCS_COP.1 (as applicable), FPT_PHP.3 and FCS_STG_EXT.1 shall suffice for this component.
2.2.7.1.4. KMD
There are no KMD evaluation activities for this component.
2.2.8. Stored Data Integrity (FDP_SDI)
2.2.8.1. FDP_SDI.2 Stored Data Integrity Monitoring and Action
2.2.8.1.1. TSS
The evaluator shall confirm that the ST author describes the methods for protecting the integrity of SDOs stored with the TOE, and shall identify the iteration of FCS_COP.1/Hash, FCS_COP.1/KeyedHash, FCS_COP.1/SigVer or FCS_COP.1/KeyWrap that covers any cryptographic algorithm used. The evaluator shall also confirm that the TSS describes the response upon the detection of an integrity error.
The evaluator shall confirm that the TSS describes the actions the TSF takes when the integrity verification fails for an SDO, including the circumstances that cause a notification to be sent when this occurs.
The evaluator shall confirm that the TSS describes how the integrity of SDOs is protected during initialization (FMT_MSA.3), and how the integrity of SDOs is verified during parsing (import) in FDP_ITC_EXT.1 and FDP_ITC_EXT.2.
2.2.8.1.2. AGD
The evaluator shall examine the operational guidance to verify that it describes the conditions that cause a notification to be sent when an integrity error is detected, and what the contents of the notification are.
2.2.8.1.3. Test
The tests for FDP_ITC_EXT.1, FDP_ITC_EXT.2 and FMT_MSA.3 shall suffice for this component.
2.2.8.1.4. KMD
There are no KMD evaluation activities for this component.
2.3. Identification and Authentication (FIA)
2.3.1. Authorization Failure Handling (Extended - FIA_AFL_EXT)
2.3.1.1. FIA_AFL_EXT.1 Authorization Failure Handling
2.3.1.1.1. TSS
The evaluator shall examine the TSS to determine that it contains a description of how successive unsuccessful authorization attempts are detected and tracked. The evaluator shall examine the TSS to determine that it contains a description of the actions taken in the event that the authorization attempt threshold is met or exceeded.
The evaluator shall also examine the TSS to determine that it describes how the failed authorization attempt counter is incremented before the authorization is verified.
The evaluator shall also examine the TSS to determine the behaviour that will occur if there are excessive failed authorization attempts, specifically whether future attempts are prevented for a static or configurable amount of time, future attempts are prevented indefinitely, or a factory reset is triggered.
2.3.1.1.2. AGD
If the administrator is able to configure any of the variables for authorization attempts, the evaluator shall examine the guidance documentation to ensure that instructions for configuring the number of successive unsuccessful authentication attempts and time period (if implemented) are provided, and that the process of unlocking the SDOs is described for each "action" specified (if that option is chosen).
The evaluator shall examine the guidance documentation to confirm that it describes, and identifies the importance of, any actions that are required in order to ensure that access to SDOs can be maintained, unless it is made permanently unavailable due to a factory reset.
2.3.1.1.3. Test
The evaluator shall perform the following tests for each method by which the TSF authorizes access to the SDOs (e.g. any passwords entered as part of establishing authorization):
Test 1: If the administrator is able to configure any of the variables for authorization attempts, the evaluator shall use the operational guidance to configure the number of successive unsuccessful authorization attempts allowed by the TOE. The evaluator shall test that once the authorization attempts limit is reached (whether configured by the administrator or statically specified), authorization attempts with valid credentials are no longer successful.
Test 2: After reaching the limit for unsuccessful authorization attempts as in Test 1 above, the evaluator shall proceed as follows. If the action selected in FIA_AFL_EXT.1.3 is included in the ST then the evaluator shall confirm by testing that following the operational guidance and performing each action specified in the ST to re-enable access results in successful access. If the time period selection in FIA_AFL_EXT.1.3 is included in the ST, then the evaluator shall wait for just less than the time period configured in Test 1 and show that an authorization attempt using valid credentials does not result in successful access. The evaluator shall then wait until just after the time period configured in Test 1 and show that an authorization attempt using valid credentials results in successful access.
Test 3 [conditional]: If 'factory reset the TOE, wiping out all non-permanent SDEs and SDOs, as described by FDP_FRS_EXT.2' is selected in FIA_AFL_EXT.1.3, the evaluator shall perform the test required by FDP_FRS_EXT.2 with step 5 replaced with "The evaluator shall initiate a factory reset by deliberately meeting or surpassing the threshold for unsuccessful authorization attempts, depending on whether 'meets' or 'surpasses' is selected in FIA_AFL_EXT.1.3."
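Tests 1 and 2 can be pictured with a toy model of the lockout behaviour. The threshold, lockout period, and method names below are illustrative only; note that, per the TSS activity above, the failure counter is incremented before the credential is verified:

```python
# Toy model of FIA_AFL_EXT.1-style lockout; parameters are illustrative.
THRESHOLD = 3
LOCKOUT_SECONDS = 60

class AuthModel:
    def __init__(self, secret, clock):
        self.secret, self.clock = secret, clock
        self.failures, self.locked_until = 0, 0.0

    def authorize(self, credential):
        now = self.clock()
        if now < self.locked_until:
            return False                  # locked: reject even valid credentials
        self.failures += 1                # incremented before verification
        if credential == self.secret:
            self.failures = 0
            return True
        if self.failures >= THRESHOLD:
            self.locked_until = now + LOCKOUT_SECONDS
        return False

t = [0.0]                                 # fake clock for the sketch
auth = AuthModel("pin-1234", clock=lambda: t[0])
for _ in range(THRESHOLD):
    assert not auth.authorize("wrong")    # reach the attempt limit
assert not auth.authorize("pin-1234")     # Test 1: valid credential now rejected
t[0] = LOCKOUT_SECONDS - 1
assert not auth.authorize("pin-1234")     # Test 2: just before the period ends
t[0] = LOCKOUT_SECONDS + 1
assert auth.authorize("pin-1234")         # Test 2: just after the period ends
```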
2.3.1.1.4. KMD
There are no KMD evaluation activities for this component.
2.3.2. Specification of Secrets (FIA_SOS)
2.3.2.1. FIA_SOS.2 TSF Generation of Secrets
2.3.2.1.1. TSS
The evaluator shall ensure that the TSS describes, for each of the TSF functions listed in FIA_SOS.2.2, whether the available key space is configurable, and the size (or range) of the key space employed to generate authorization values.
The evaluator shall ensure that the TSS states that the quality metric provided is based on the assumption of sufficient entropy being provided in accordance with the information given in [DSC cPP] Annex D.
The evaluator shall ensure that the TSS describes the mechanism used to generate authorization values and documents the quality metric that the mechanism provides. The information provided in the TSS shall demonstrate that the probability that a random single authentication attempt will be successful is less than one in 1,000,000.
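For instance, the quality-metric claim can be checked arithmetically. With an authorization value drawn uniformly from an assumed 62-symbol alphabet at length 8 (example parameters only; the TSS defines the actual alphabet and length), the single-attempt guessing probability is well under one in 1,000,000:

```python
# Example parameters only; not cPP-mandated values.
alphabet_size = 62                       # e.g. [A-Za-z0-9]
length = 8
key_space = alphabet_size ** length      # number of possible values
p_single_guess = 1 / key_space           # uniform generation assumed
assert p_single_guess < 1 / 1_000_000    # FIA_SOS.2 quality metric satisfied
print(f"key space = {key_space:.3e}, P(single guess) = {p_single_guess:.3e}")
```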
2.3.2.1.2. AGD
The evaluator shall examine the guidance documentation to determine that it describes any configuration necessary to enforce the use of TSF generated authorization values listed in FIA_SOS.2.2.
The evaluator shall ensure that the guidance documentation provides any instructions needed to set parameters affecting the available key spaces.
2.3.2.1.3. Test
The evaluator shall perform the following tests.
Test 1: The evaluator shall compose a set of 50 authorization values that meet the requirements, and 50 authorization values that fail to meet the requirements.
- For each authorization value that meets the requirements, the evaluator shall verify that the TOE supports the authorization value.
- For each authorization value that does not meet the requirements, the evaluator shall verify that the TOE does not support the authorization value.
While the evaluator is not required (nor is it feasible) to test all possible compositions of authorization values, the evaluator shall ensure that the key space identified in the TSS is valid.
Test 2: For each TSF function listed in FIA_SOS.2.2, the TOE shall be configured to generate authorization values; the evaluator shall check that the TOE produces the authorization values.
2.3.2.1.4. KMD
There are no KMD evaluation activities for this component.
2.3.3. User Authentication (FIA_UAU)
2.3.3.1. FIA_UAU.2 User Authentication before Any Action
2.3.3.1.1. TSS
The evaluator shall examine the TSS to determine that it describes the identification and authentication process for each supported method (PIN/try-PIN, salted hash, etc.), the circumstances in which each supported method is used, and what constitutes "successful authentication."
The evaluator shall examine the TSS to determine that it describes which actions are allowed before user identification and authentication. The evaluator shall also determine that the TSS describes, for each action that does require identification and authentication, the method and circumstances by which the authentication is performed (e.g., as per the application note, the TSF may authenticate a user once rather than each time access to an SDO is attempted; the TSS shall describe when authentication is or is not required in order to perform a TSF-mediated action).
2.3.3.1.2. AGD
The evaluator shall examine the guidance documentation to determine that any preparatory steps necessary for logging in (e.g., establishing valid credential material such as a PIN) are described. For each supported login method, the evaluator shall ensure the guidance documentation provides clear instructions for successfully logging on.
2.3.3.1.3. Test
The evaluator shall use the guidance documentation to configure the appropriate credentials supported for each authentication method. For that authentication method, the evaluator shall attempt to perform TSF-mediated actions that require successful use of that authentication method and subsequently show that providing correct identification and authentication information results in the ability to perform the requested action, while providing incorrect information results in denial of access.
2.3.3.1.4. KMD
There are no KMD evaluation activities for this component.
2.3.3.2. FIA_UAU.5 Multiple Authentication Mechanisms
2.3.3.2.1. TSS
The evaluator shall examine the TSS and ensure that it describes the authentication mechanisms used to support user authentication, including how each mechanism enforces the authentication.
2.3.3.2.2. AGD
If the supported authentication mechanisms are configurable, the evaluator shall examine the operational guidance to verify that it describes how to configure the authentication mechanisms used to provide authentication.
2.3.3.2.3. Test
For each supported authentication mechanism, the evaluator shall verify that valid credentials result in successful authentication and invalid credentials result in a rejected authentication attempt. If the supported authentication mechanisms are configurable, the evaluator shall follow the operational guidance to enable/disable the various mechanisms and ensure that valid credentials do not result in successful authentication if that mechanism is disabled, or that there is no interface to provide authentication credentials if that mechanism is disabled.
2.3.3.2.4. KMD
There are no KMD evaluation activities for this component.
2.3.3.3. FIA_UAU.6 Re-Authenticating
2.3.3.3.1. TSS
The evaluator shall examine the TSS to determine that it describes each of the options for reauthorization.
2.3.3.3.2. AGD
There are no AGD evaluation activities for this component.
2.3.3.3.3. Test
The evaluator shall use the configuration guidance to create an SDO with each of the options for reauthorization, then identify functions to exercise each of these options, then execute these options providing the correct authorization confirming that the operation succeeded with respect to the reauthorization option chosen. The evaluator shall then attempt to execute these functions while providing the incorrect authorization and confirming that the operation fails.
2.3.3.3.4. KMD
There are no KMD evaluation activities for this component.
2.4. Security Management (FMT)
2.4.1. Management of Functions in TSF (Extended - FMT_MOF_EXT)
2.4.1.1. FMT_MOF_EXT.1 Management of Security Functions Behavior
2.4.1.1.1. TSS
The evaluator shall verify that the TSS describes those management functions that may be performed by the ADM-R or MFGADM-R roles, including how the CApp-R role is prevented from accessing, performing, or relaxing the function (if applicable) and from modifying the administrator configuration. The TSS shall also describe any functionality that is affected by administrator-configured policy and how it is affected. This activity will be performed in conjunction with FMT_SMF.1.
2.4.1.1.2. AGD
There are no AGD evaluation activities for this component.
2.4.1.1.3. Test
For each management function described in FMT_SMF.1.1, the evaluator shall perform the function with administrator authorization data and confirm it succeeds, and again with client application authorization data and confirm that it fails.
2.4.1.1.4. KMD
There are no KMD evaluation activities for this component.
2.4.2. Management of Security Attributes (FMT_MSA)
2.4.2.1. FMT_MSA.1 Management of Security Attributes
2.4.2.1.1. TSS
The evaluator shall confirm that the TSS describes the modification constraints for each SDO security attribute.
2.4.2.1.2. AGD
There are no AGD evaluation activities for this component.
2.4.2.1.3. Test
The evaluator shall confirm that the evaluation activities for FDP_ACF.1 contain tests for the OP.Modify operation on objects OB.P_SDO and OB.T_SDO.
2.4.2.1.4. KMD
There are no KMD evaluation activities for this component.
2.4.2.2. FMT_MSA.3 Static Attribute Initialization
2.4.2.2.1. TSS
The evaluator shall confirm that the TSS describes the initialization process for importing and generating SDOs. The TSS shall describe each type of SDO.Type and any additional attributes beyond the ones listed. In particular, it shall describe all attributes assigned to the SDO during parsing. Additionally, the TSS shall list any further restrictions on the allowed values for the minimum list of attributes.
The evaluator shall confirm that the TSS describes the allowed values for each of the attributes.
2.4.2.2.2. AGD
When applicable, the evaluator shall confirm that the user guidance describes all attributes assigned by the TOE to any parsed SDOs.
2.4.2.2.3. Test
The evaluator shall confirm that the evaluation activities for FDP_ACF.1 contain tests for the OP.Import and OP.Create operations on objects OB.P_SDO and OB.T_SDO.
2.4.2.2.4. KMD
There are no KMD evaluation activities for this component.
2.4.3. Specification of Management Functions (FMT_SMF)
2.4.3.1. FMT_SMF.1 Specification of Management Functions
2.4.3.1.1. TSS
The evaluator shall verify that the TSS describes all management functions.
2.4.3.1.2. AGD
The evaluator shall verify that the AGD describes how the ADM-R or MFGADM-R roles configure the management functions.
2.4.3.1.3. Test
Testing for this component is performed through evaluation of FMT_MOF_EXT.1.
2.4.3.1.4. KMD
There are no KMD evaluation activities for this component.
2.4.4. Security Management Roles (FMT_SMR)
2.4.4.1. FMT_SMR.1 Security Roles
2.4.4.1.1. TSS
The evaluator shall confirm that the TSS describes the mechanisms by which CApp-R roles can exclusively access their own encrypted data and administrators cannot access client application encrypted data. The evaluator shall also confirm the TSS describes the mechanisms that allow only ADM-R and MFGADM-R roles to perform privileged functions.
2.4.4.1.2. AGD
The evaluator shall verify that the AGD describes how the administrator roles configure the management functions.
2.4.4.1.3. Test
Testing for this component is performed through evaluation of FMT_MOF_EXT.1.
2.4.4.1.4. KMD
There are no KMD evaluation activities for this component.
2.5. Protection of the TSF (FPT)
2.5.1. Fail Secure (FPT_FLS)
2.5.1.1. FPT_FLS.1/FI Failure with Preservation of Secure State (Fault Injection)
2.5.1.1.1. TSS
The evaluator shall examine the TSS to verify that it describes the actions taken when the TOE experiences fault injection and how the TOE preserves a secure state.
The evaluator shall verify that the TSS describes the state of the TOE when the firmware validity checks fail, including the various failure modes assumed.
2.5.1.1.2. AGD
The evaluator shall examine the operational guidance to verify that it describes what actions should be taken to attempt to resolve the failed state.
2.5.1.1.3. Test
This component is tested as part of FPT_PHP.3.
2.5.1.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.2. Mutable/Immutable Firmware (Extended - FPT_MFW_EXT)
2.5.2.1. FPT_MFW_EXT.1 Mutable/Immutable Firmware
2.5.2.1.1. TSS
The evaluator shall examine the TSS and ensure that it describes which firmware components are considered mutable and which are considered immutable, as well as how these firmware components can or cannot be modified or altered. For example, DSC firmware components that are stored in ROM would be considered immutable.
2.5.2.1.2. AGD
If the TOE has mutable firmware, the evaluator shall examine the operational guidance to ensure that it describes how to modify the firmware.
2.5.2.1.3. Test
If the TOE has mutable firmware, the evaluator shall perform the activities described in the operational guidance to modify the firmware.
2.5.2.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.3. Debug Modes (Extended - FPT_MOD_EXT)
2.5.3.1. FPT_MOD_EXT.1 Debug Modes
2.5.3.1.1. TSS
The evaluator shall examine the TSS to ensure it describes the mechanisms the TSF employs to prevent access to debug modes with a brief description of each debug mode supported.
2.5.3.1.2. AGD
There are no AGD evaluation activities for this component.
2.5.3.1.3. Test
The evaluator shall attempt to exercise any single function from each supported debug mode. If the evaluator is able to exercise any function from any of the supported debug modes, the test is a 'Fail', otherwise, the test is a 'Pass'.
2.5.3.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.4. TSF Physical Protection (FPT_PHP)
2.5.4.1. FPT_PHP.3 Resistance to Physical Attack
2.5.4.1.1. TSS
The evaluator shall examine the TSS to ensure it describes the methods used by the TOE to detect physical tampering and how the TOE will respond when physical tampering has been detected.
The evaluator shall also examine the TSS to ensure that it documents the temperature and voltage ranges in which the TSF is assured to operate properly.
2.5.4.1.2. AGD
There are no AGD evaluation activities for this component.
2.5.4.1.3. Test
All tests here are based on an attempted Fault Injection.
The evaluator shall perform fault injection on the TOE and attempt to extract a known SDO/SDE.
The evaluator shall cause the TOE to parse or generate an SDO/SDE with a known value. The evaluator will then cause the TOE to process the SDO/SDE, possibly multiple times, while injecting faults on the TOE.
This test is repeated for each type of Fault Injection.
If the evaluator is able to acquire the original SDO/SDE or a known result from the TOE processing the SDO/SDE, the test is a 'Fail', otherwise, the test is a 'Pass'.
Test 1: Temperature
The following testing is derived from [ISO 24759] test procedures TE07.77.01 through TE07.77.03:
The evaluator shall configure the temperature of the TOE close to the approximate extreme of the normal operating range specified in the TSS and verify that the TSF continues to function as expected. This may be done via ambient temperature or induced locally as is most appropriate for the TOE (and the accessibility to the physical component).
The evaluator shall determine 'expected functionality' based on how the TSS describes the TOE’s reaction to an environmental failure. For example, if the TSS states that the TOE’s response is to shut down, it can be assumed that the TOE functions as expected if it does not shut down. If the TSS states that the TOE’s response is to zeroize certain data, it can be assumed that the TOE functions as expected if the evaluator performs functions that rely on known data values and obtains results that indicate non-zero values.
The evaluator shall then extend the temperature outside of the specified normal range and verify that the TOE responds in the manner specified in the ST. If the TOE’s response is to zeroize known data, the evaluator shall return the ambient temperature to a normal range, perform functions that rely on known data values, and observe that the results of these functions are consistent with known values of zero.
Test 2: Power Analysis
The following testing is derived from [ISO 24759] test procedures TE07.77.01 through TE07.77.03:
The evaluator shall configure the voltage of the TOE close to the approximate extreme of the normal operating range specified in the TSS and verify that the TSF continues to function as expected.
The evaluator shall determine 'expected functionality' based on how the TSS describes the TOE’s reaction to an environmental failure. For example, if the TSS states that the TOE’s response is to shut down, it can be assumed that the TOE functions as expected if it does not shut down. If the TSS states that the TOE’s response is to zeroize certain data, it can be assumed that the TOE functions as expected if the evaluator performs functions that rely on known data values and obtains results that indicate non-zero values.
The evaluator shall then extend the voltage outside of the specified normal range and verify that the TOE responds in the manner specified in the ST. If the TOE’s response is to zeroize known data, the evaluator shall return the voltage to a normal range, perform functions that rely on known data values, and observe that the results of these functions are consistent with known values of zero.
The evaluator shall then induce a voltage glitch outside of the specified normal range and verify that the TOE responds in the manner specified in the ST. If the TOE’s response is to zeroize known data, the evaluator shall return the voltage to a normal range, perform functions that rely on known data values, and observe that the results of these functions are consistent with known values of zero.
2.5.4.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.5. Root of Trust (Extended - FPT_PRO_EXT)
2.5.5.1. FPT_PRO_EXT.1 Root of Trust
2.5.5.1.1. TSS
The evaluator shall examine the TSS to ensure that it describes how Root of Trust data is immutable or otherwise mutable if and only if controlled by a unique identifiable owner, the roles this owner assumes in doing so (manufacturer administrator, owner administrator, etc.), as well as the circumstances in which Root of Trust data is mutable.
[conditional] For immutable Root of Trust data, the evaluator shall ensure there are no mechanisms to update the Root of Trust.
[conditional] For mutable Root of Trust data, the evaluator shall ensure the Root of Trust update mechanism uses an approved method for authenticating the source of the update.
2.5.5.1.2. AGD
For mutable Root of Trust data, the evaluator shall confirm the AGD contains an approved authenticated method for modifying the Root of Trust identity.
2.5.5.1.3. Test
Immutability
For immutable Root of Trust data, the evaluator shall confirm a successful evaluation of FPT_PHP.3 (Resistance to Physical Attack).
Mutability
For mutable Root of Trust data, the evaluator shall perform the following tests:
- Create or use an authenticated Root of Trust identity and confirm that the authenticated method for modifying the Root of Trust identity succeeds.
- Create or use an unauthenticated Root of Trust identity and confirm that the target fails to modify the Root of Trust identity.
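The two mutability tests can be sketched as follows. HMAC stands in here for whatever approved authentication mechanism (e.g. digital signature verification) the TOE actually employs; the class and key names are illustrative:

```python
import hashlib
import hmac
import secrets

OWNER_KEY = secrets.token_bytes(32)  # key of the uniquely identifiable owner

class RootOfTrust:
    """Toy model: RoT identity changes only via an authenticated update."""
    def __init__(self, identity: bytes):
        self.identity = identity

    def update(self, new_identity: bytes, tag: bytes) -> bool:
        expected = hmac.new(OWNER_KEY, new_identity, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tag):
            return False                 # unauthenticated: update refused
        self.identity = new_identity
        return True

rot = RootOfTrust(b"factory identity")
new_id = b"owner identity"
ok_tag = hmac.new(OWNER_KEY, new_id, hashlib.sha256).digest()
assert rot.update(new_id, ok_tag)                 # authenticated update succeeds
assert not rot.update(b"attacker", b"\x00" * 32)  # unauthenticated update fails
assert rot.identity == new_id                     # identity unchanged by attack
```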
2.5.5.1.4. KMD
The evaluator shall ensure that the KMD describes either a pre-installed identity (contained within an SDO), or a process on how the TOE creates an identity. IEEE 802.1ar is one example of a standard which a device can use to create such an identity.
2.5.6. Root of Trust Services (Extended - FPT_ROT_EXT)
2.5.6.1. FPT_ROT_EXT.1 Root of Trust Services
2.5.6.1.1. TSS
The evaluator shall ensure that the TSS identifies the Roots of Trust it provides (including but not limited to the Roots of Trust identified in the selections in this requirement) and describes their function in the context of the TOE. The TSS shall describe the cryptographic algorithms in use for the Roots of Trust.
2.5.6.1.2. AGD
There are no AGD evaluation activities for this component.
2.5.6.1.3. Test
Root of Trust for Storage
Testing for this component is performed through evaluation of FCS_CKM.1, FCS_STG_EXT.1 and FPT_PHP.3.
Root of Trust for Authorization
Testing for this component is performed through evaluation of FIA_AFL_EXT.1.
Root of Trust for Measurement
Testing for this component is performed through evaluation of FCS_COP.1/Hash.
Root of Trust for Reporting
Testing for this component is performed through evaluation of FCS_COP.1/SigGen.
2.5.6.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.6.2. FPT_ROT_EXT.2 Root of Trust for Storage
2.5.6.2.1. TSS
The evaluator shall ensure that the TSS describes how the Root of Trust for Storage prevents unauthorized access (including unauthorized disclosure and unauthorized modification) to SDOs. The evaluator shall also examine the TSS to verify that it uses approved mechanisms to protect the confidentiality of secret SDOs and to protect the integrity of SDOs.
2.5.6.2.2. AGD
There are no AGD evaluation activities for this component.
2.5.6.2.3. Test
Testing for this component is performed through evaluation of FCS_CKM.1, FCS_STG_EXT.1, FDP_SDC.2, FDP_SDI.2 and FPT_PHP.3.
2.5.6.2.4. KMD
There are no KMD evaluation activities for this component.
2.5.7. Replay Prevention (FPT_RPL)
2.5.7.1. FPT_RPL.1/Authorization Replay Prevention
2.5.7.1.1. TSS
The evaluator shall examine the TSS to verify that it describes the mechanism employed for preventing replay of user authorization of operations on SDOs and that access is denied when replay is detected.
2.5.7.1.2. AGD
The evaluator shall examine the operational guidance to verify that it describes how to enforce Replay Prevention if configuration is necessary.
2.5.7.1.3. Test
The evaluator shall perform an authorization of an operation on an SDO and capture or retain that authorization for reuse. The evaluator shall then attempt to replay that same authorization and ensure that the DSC does not allow the authorization to take place. If the replay of the authorization is allowed to take place for an operation on SDOs, the test is a 'Fail', otherwise, the test is a 'Pass'.
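The test above assumes some freshness mechanism in the authorization. One common approach, sketched here with illustrative names, is a one-time nonce bound to each authorization that the DSC refuses to accept twice:

```python
import secrets

# Toy anti-replay model; a real TOE may instead use counters, timestamps,
# or session binding.
class ReplayGuard:
    def __init__(self):
        self.seen = set()                 # nonces already consumed

    def new_authorization(self):
        return secrets.token_hex(16)      # fresh nonce for one operation

    def authorize_operation(self, nonce):
        if nonce in self.seen:
            return False                  # replay detected: access denied
        self.seen.add(nonce)
        return True

dsc = ReplayGuard()
token = dsc.new_authorization()
assert dsc.authorize_operation(token)       # first use succeeds
assert not dsc.authorize_operation(token)   # replayed authorization denied
```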
2.5.7.1.4. KMD
There are no KMD evaluation activities for this component.
2.5.8. TSF Self Test (FPT_TST)
2.5.8.1. FPT_TST.1 TSF Testing
2.5.8.1.1. TSS
The evaluator shall examine the TSS and other vendor documentation and ensure they describe the methods used to verify the integrity of the TSF and TSF data, covering both automatic checks and, if supported, checks requested by the user. If more than the power-on check is selected, then each condition for the self test shall be examined.
2.5.8.1.2. AGD
The evaluator shall examine the operational guidance to ensure it describes how authorized users can verify the integrity of the TSF and its data.
2.5.8.1.3. Test
Test 1: The evaluator shall verify that the DSC performs an integrity check of all TSF code and data, as well as performing KATs for those functions. The evaluator shall verify failure behavior using malformed known answer test data (for example, unexpected input or output values).
Test 2: The evaluator shall ensure that when an integrity check failure occurs specific to failing KATs and failure to verify the integrity of the TSF, the TOE will prevent any further processing of the current TSF and user data.
Test 3 [conditional]: For each condition where a self test may be initiated, the evaluator shall verify that the self tests are run as specified.
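A known answer test of the kind exercised in Tests 1 and 2 can be illustrated with the published SHA-256 test vector for the input `b"abc"`; passing malformed expected data, as in Test 1, must cause the self test to fail:

```python
import hashlib

# Published SHA-256 digest of b"abc" (standard test vector).
KNOWN_ANSWER = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

def sha256_kat(expected: str = KNOWN_ANSWER) -> bool:
    """Run the hash KAT; return True only if the computed digest matches."""
    return hashlib.sha256(b"abc").hexdigest() == expected

assert sha256_kat()                          # self test passes
assert not sha256_kat(expected="00" * 32)    # malformed KAT data is detected
```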
2.5.8.1.4. KMD
There are no KMD evaluation activities for this component.
2.6. Resource Utilization (FRU)
2.6.1. Fault Tolerance (FRU_FLT)
2.6.1.1. FRU_FLT.1 Degraded Fault Tolerance
2.6.1.1.1. TSS
The evaluator shall examine the TSS and other vendor documentation and ensure they describe the response and state of TSF data to each type of fault injection into the TOE.
2.6.1.1.2. AGD
There are no AGD evaluation activities for this component.
2.6.1.1.3. Test
The evaluator shall process SDOs/SDEs while applying each type of identified Fault Injection into the TSF. The evaluator will note whether the TSF response is as noted in the TSS and whether the state can be confirmed. If the response and state are as documented, the test is a 'Pass', otherwise, the test is a 'Fail'.
2.6.1.1.4. KMD
There are no KMD evaluation activities for this component.
3. Evaluation Activities for Optional Requirements
3.1. Random Bit Generation (FCS_RBG)
3.1.1. FCS_RBG.2 Random Bit Generation (External Seeding)
3.1.1.1. TSS
The evaluator shall verify that this function is included as an interface to the RBG in the TSS or the proprietary Entropy Analysis Report and that the behavior of the RBG following a call to this interface is described. The evaluator shall also verify that the documentation of the DRBG describes the conditions of use and possible values for the Personalization String input to the SP 800-90A specified DRBG.
3.1.1.2. AGD
The evaluator shall verify that the operational guidance describes the process for supplying an external seed to the TOE’s DRBG.
3.1.1.3. Test
The evaluator shall develop and execute or verify and observe the developer tooling which adds data to the RBG via the Personalization String. The evaluator shall verify that the request succeeds.
3.1.1.4. KMD
There are no KMD evaluation activities for this component.
3.1.2. FCS_RBG.3 Random Bit Generation (Internal Seeding - Single Source)
3.1.2.1. TSS
The evaluator shall verify that this function is included as an interface to the RBG in the TSS or the proprietary Entropy Analysis Report and that the behavior of the RBG following a call to this interface is described. The evaluator shall also verify that the documentation of the DRBG describes how the internal seeding is initiated to the SP 800-90A specified DRBG.
3.1.2.2. AGD
The evaluator shall verify that the operational guidance describes the process for how the internally generated seed is supplied to the TOE’s DRBG.
3.1.2.3. Test
The evaluator shall develop and execute or verify and observe the developer tooling which generates the internal seed and provides it to the DRBG. The evaluator shall verify that the request succeeds.
3.1.2.4. KMD
There are no KMD evaluation activities for this component.
3.1.3. FCS_RBG.4 Random Bit Generation (Internal Seeding - Multiple Sources)
3.1.3.1. TSS
The evaluator shall verify that this function is included as an interface to the RBG in the TSS or the proprietary Entropy Analysis Report and that the behavior of the RBG following a call to this interface is described. The evaluator shall also verify that the documentation of the DRBG describes how to choose the seeding source to be used and that this source is used to seed the SP 800-90A specified DRBG.
3.1.3.2. AGD
The evaluator shall verify that the operational guidance describes the process for how the seed source is selected and the output from the source supplied to the TOE’s DRBG.
3.1.3.3. Test
The evaluator shall develop and execute or verify and observe the developer tooling which selects the seed source and provides its output to the DRBG. The evaluator shall verify that the request succeeds.
3.1.3.4. KMD
There are no KMD evaluation activities for this component.
3.1.4. FCS_RBG.5 Random Bit Generation (Combining Noise Sources)
3.1.4.1. TSS
The evaluator shall verify that this function is included as an interface to the RBG in the TSS or the proprietary Entropy Analysis Report and that the behavior of the RBG following a call to this interface is described. The evaluator shall also verify that the documentation of the DRBG describes the seeding sources, how their outputs are combined according to the specified standard, and that this combined output is used to seed the SP 800-90A specified DRBG.
3.1.4.2. AGD
The evaluator shall verify that the operational guidance describes the process for how the generated seed is supplied to the TOE’s DRBG.
3.1.4.3. Test
The evaluator shall develop and execute or verify and observe the developer tooling which generates each separate seed for view and then combines them before providing the output to the DRBG. The evaluator shall verify that the request succeeds.
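For illustration, one way the outputs of multiple noise sources may be combined before seeding is hash conditioning; the sketch below assumes SHA-256 conditioning (a hypothetical choice; the TOE must combine sources according to the standard selected in the ST):

```python
import hashlib

def combine_noise_sources(samples: list[bytes]) -> bytes:
    # Concatenate the raw output of each noise source and condition the
    # result with SHA-256, one of the vetted conditioning components in
    # SP 800-90B. The digest then serves as the combined seed material.
    return hashlib.sha256(b"".join(samples)).digest()
```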
3.1.4.4. KMD
There are no KMD evaluation activities for this component.
3.1.5. FCS_RBG.6 Random Bit Generation Service
3.1.5.1. TSS
The evaluator shall verify that the TSS describes the interface provided by the TOE for DRBG output.
3.1.5.2. AGD
The evaluator shall verify that the operational guidance describes how to access the RBG output interface including the method(s) for interacting with the service.
3.1.5.3. Test
The evaluator shall develop and execute or verify and observe the developer tooling which will access the output interface and verify the output from the interface matches the results that would be expected if used internally. The evaluator shall verify that the request succeeds.
3.1.5.4. KMD
There are no KMD evaluation activities for this component.
3.2. Protection of the TSF (FPT)
3.2.1. Internal TOE TSF Data Transfer (FPT_ITT)
3.2.1.1. FPT_ITT.1 Basic Internal TSF Data Transfer Protection
3.2.1.1.1. TSS
The evaluator shall examine the TSS to determine that it describes the TOE in terms of its distributed components and identifies the communications that take place between these components. The evaluator shall also examine the TSS to ensure that it describes the methods by which each of these communications are protected.
The evaluator shall examine the TSS to determine that it describes the type of separation between distributed components, such as physical connections (circuits, ports, etc.) or protocols that may need configuration.
3.2.1.1.2. AGD
If applicable, the evaluator shall confirm that the guidance documentation contains instructions for establishing the relevant allowed communication channels between TOE components, and that it contains recovery instructions should a connection be unintentionally broken.
3.2.1.1.3. Test
The evaluator shall perform the following tests:
Test 1: The evaluator shall ensure that all communications between distributed TOE components are tested. If communication channels require configuration, the evaluator shall follow the guidance documentation and ensure that communication is successful.
Test 2: The evaluator shall ensure, for each communication channel, that the channel data is not sent in plaintext or that there is no interface by which it is possible to extract the channel data.
Test 3 [conditional]: If modification is selected in FPT_ITT.1.1, the evaluator shall ensure, for each communication channel, that the channel data is authenticated using an approved message authentication code, digital signature, or authenticated encryption mechanism, or that there is no interface by which it is possible to modify the channel data.
Test 4 [conditional]: The evaluator shall, where possible, physically interrupt communications between TOE components. The evaluator shall ensure that when physical connectivity is restored, communications are appropriately protected.
Note that test 4 shall only be performed for those interfaces where physical interruption of communications is typical operational behavior. If the TOE consists of multiple components embedded within the same physical chassis such that interrupting communications requires physical destruction of the device (as opposed to disconnecting a re-connectable wire or temporarily disabling a radio), this test is not applicable.
3.2.1.1.4. KMD
There are no KMD evaluation activities for this component.
3.2.2. Root of Trust (Extended - FPT_PRO_EXT)
3.2.2.1. FPT_PRO_EXT.2 Data Integrity Measurements
3.2.2.1.1. TSS
The evaluator shall examine the TSS and ensure that it describes the contents and structure of any measurements (including but not limited to cryptographic self-tests) and assertions that are used to quantify the integrity of the data protected by the DSC and establish trust. The evaluator shall also ensure that the TSS describes the order and importance of integrity measurements and records that comprise platform characteristics to prove the integrity of the SDOs in the DSC. In addition, the evaluator shall ensure the TSS lists the possible values for the platform characteristics and how these values relate to the integrity of the data protected by the DSC. Finally, the evaluator shall ensure that the TSS gives justification as to how the process in accumulating platform characteristics is consistent between restarts.
3.2.2.1.2. AGD
There are no AGD evaluation activities for this component.
3.2.2.1.3. Test
For each platform characteristic described in the TSS, the evaluator shall perform the following steps:
1. Identify the integrity measurements and assertions that comprise the platform characteristic.
2. Start the DSC and record the value of the platform characteristic.
3. Shut down the DSC.
4. Modify the data protected by the integrity measurements and assertions identified in Step 1.
5. Start the DSC and record the value of the platform characteristic. The evaluator shall verify the newly obtained value is different from the value obtained in Step 2.
6. Shut down the DSC.
7. Revert the modifications made in Step 4.
8. Start the DSC and record the value of the platform characteristic. The evaluator shall verify the newly obtained value is equal to the value obtained in Step 2.
9. Shut down the DSC.
Refer to [DSC cPP] for more information about what could constitute DSC data or platform characteristics.
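The accumulate-and-compare behavior exercised by the steps above can be sketched as a PCR-style extend chain (a hypothetical model of measurement accumulation, not the TOE's actual format):

```python
import hashlib

def extend(characteristic: bytes, measurement: bytes) -> bytes:
    # PCR-style extend: the new value binds the previous value and a
    # digest of the newly measured data, so order and content both matter.
    return hashlib.sha256(characteristic + hashlib.sha256(measurement).digest()).digest()

def boot_characteristic(data_items: list[bytes]) -> bytes:
    # Accumulate measurements in a fixed order from a well-known initial
    # value, as the TSS must justify for consistency between restarts.
    value = b"\x00" * 32
    for item in data_items:
        value = extend(value, item)
    return value
```

Modifying any measured item yields a different characteristic, and reverting the modification restores the baseline value, mirroring Steps 4 through 8.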
3.2.2.1.4. KMD
There are no KMD evaluation activities for this component.
3.2.3. Root of Trust Services (Extended - FPT_ROT_EXT)
3.2.3.1. FPT_ROT_EXT.3 Root of Trust for Reporting Mechanisms
3.2.3.1.1. TSS
The evaluator shall examine the TSS and ensure that it describes the cryptographically verifiable identities for the Root of Trust for Reporting and how they are used, verifying that they are not visible outside the subset of DSC scope corresponding to the Roots of Trust.
[conditional] If these cryptographically verifiable identities come in the form of resident keys or aliases, the evaluator shall verify that the possession of such aliases or keys can only be proved indirectly by using them to decrypt a value that has been encrypted with a corresponding public key. The evaluator shall also verify that such keys or aliases are not used for producing digital signatures.
In addition, the evaluator shall examine the TSS and ensure that it describes how these cryptographically verifiable identities are unique between individual DSCs. The evaluator shall also ensure that the TSS describes how the Root of Trust for Reporting reports on the state of the platform and how the values reported satisfy the scope indicated by the report.
3.2.3.1.2. AGD
The evaluator shall examine the guidance documentation to confirm it describes how to enable the root of trust for reporting mechanism on the TOE, how to generate a report, and how reports are verified as genuine.
3.2.3.1.3. Test
Note that the following test may require the developer to provide access to a test platform that provides the evaluator with tools that are not typically available to end users.
The evaluator shall perform the following tests:
Test 1: The evaluator shall enable the root of trust for reporting mechanism on the TOE and generate a report.
Test 2: The evaluator shall confirm the generated report is valid.
Test 3: The evaluator shall modify the signature of a valid report and confirm the report is invalid.
Test 4: The evaluator shall modify the contents of a valid report and confirm the report is invalid.
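Tests 2 through 4 can be sketched as follows, using an Ed25519 key from the `cryptography` package as a stand-in for the Root of Trust for Reporting identity (a hypothetical harness; the TOE's report format and identity mechanism are defined by the ST):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical stand-in for the RoT for Reporting identity key.
identity_key = Ed25519PrivateKey.generate()

def generate_report(state: bytes) -> tuple[bytes, bytes]:
    # A "report" here is simply the platform state plus a signature over it.
    return state, identity_key.sign(state)

def report_is_valid(state: bytes, signature: bytes) -> bool:
    try:
        identity_key.public_key().verify(signature, state)
        return True
    except InvalidSignature:
        return False
```

A valid report verifies; modifying either the signature (Test 3) or the report contents (Test 4) must cause verification to fail.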
3.2.3.1.4. KMD
There are no KMD evaluation activities for this component.
4. Evaluation Activities for Selection-Based Requirements
4.1. Cryptographic Support (FCS)
4.1.1. Cryptographic Key Generation (FCS_CKM)
4.1.1.1. FCS_CKM.1/AKG Cryptographic Key Generation (Asymmetric Keys)
4.1.1.1.1. TSS
The evaluator shall examine the TSS to verify that it describes how the TOE generates an asymmetric key based on the methods selected from the [DSC cPP] Table "Asymmetric Cryptographic Key Generation". The evaluator shall examine the TSS to verify that it describes how the TOE invokes the methods selected in the ST from the same table. The evaluator shall examine the TSS to verify that it identifies the usage for each row identifier (key type, key size, and list of standards) selected in the ST.
4.1.1.1.2. AGD
The evaluator shall verify that the AGD guidance instructs the administrator how to configure the TOE to use the selected key types for all uses identified in the ST.
4.1.1.1.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Modulus size (RSA)
- Curve (ECC, EdDSA, EC-KCDSA)
- Domain parameters (FFC)
- Group size (KCDSA)
- Private key size (LMS, HSS, XMSS, XMSSMT)
Each test group shall consist of at least 10 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
For each test case in each test group, the evaluator shall verify that the TSF generates a key pair that is structurally well-formed according to the relevant standard. Additionally, the evaluator shall use a known-good implementation to derive a public key from the private key provided by the TSF, and verify that the result matches the public key provided by the TSF.
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
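For example, the structural well-formedness and private-to-public consistency checks can be sketched for RSA with toy-sized primes (illustrative only; real evaluations use moduli from the [DSC cPP] table and a vetted known-good implementation):

```python
import math

def rsa_keypair_consistent(p: int, q: int, e: int, n: int, d: int) -> bool:
    # Structural checks in the style of FIPS 186 key generation:
    # the modulus is the product of the two primes, the exponents are
    # inverses modulo lcm(p-1, q-1), and a round trip m -> m^e -> (m^e)^d
    # recovers m. Toy-sized numbers only; do not use for real keys.
    lam = math.lcm(p - 1, q - 1)
    if n != p * q or (e * d) % lam != 1:
        return False
    m = 0x42
    return pow(pow(m, e, n), d, n) == m
```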
4.1.1.1.4. KMD
If the TOE uses the generated key in a key chain/hierarchy then the evaluator shall confirm that the KMD describes:
- If AK1 is selected, then the KMD describes which methods for generating p and q are used.
- How the key is used as part of the key chain/hierarchy.
4.1.1.2. FCS_CKM.1/SKG Cryptographic key generation (Symmetric Key)
4.1.1.2.1. TSS
The evaluator shall examine the TSS to verify that it describes how the TOE obtains an SK through direct generation as specified in FCS_RBG.1, FCS_CKM.5, or FCS_CKM_EXT.8. The evaluator shall review the TSS to verify that it describes how the TSF invokes the functionality described by FCS_RBG.1 and FCS_CKM_EXT.8 where applicable.
[conditional] If the symmetric key is generated by an RBG, the evaluator shall review the TSS to determine that it describes how the functionality described by FCS_RBG.1 is invoked. The evaluator uses the description of the RBG functionality in FCS_RBG.1 or documentation available for the operational environment to determine that the key size being requested is greater than or equal to the key size and mode to be used for the encryption/decryption of the data.
4.1.1.2.2. AGD
The evaluator shall verify that the AGD guidance instructs the administrator how to configure the TOE to use the selected key types for all uses identified in the ST.
4.1.1.2.3. Test
For each selected key generation method, the evaluator shall configure the selected generation capability. The evaluator shall use the description of the RBG interface to verify that the TOE requests and receives an amount of RBG output greater than or equal to the requested key size.
4.1.1.2.4. KMD
The evaluator shall confirm that the KMD describes the RBG interface and how the TSF uses it in symmetric key generation.
If the TOE uses the generated key in a key chain/hierarchy then the KMD shall describe how the TSF uses the key as part of the key chain/hierarchy.
4.1.1.3. FCS_CKM.5 Cryptographic Key Derivation
4.1.1.3.1. TSS
The evaluator shall check that the TSS includes a description of the key derivation functions and shall check that each uses a key derivation algorithm and key sizes according to the specification selected in the ST, for each selected row of the [DSC cPP] Table "Cryptographic Key Derivation". The evaluator shall confirm that the TSS supports the selected methods.
If Concatenated keys is selected for any key derivation function, the evaluator shall verify that the TSS shows that the total length of the concatenated keys used as input to the KDF is greater than or equal to the length of the output from the KDF.
[conditional] If key combination is used to form a KEK, the evaluator shall verify that the TSS describes the method of combination and that this method is either an XOR, a KDF, or encryption.
[conditional] If a KDF is used to form a KEK, the evaluator shall ensure that the TSS includes a description of the key derivation function and shall verify the key derivation uses an approved derivation mode and key expansion algorithm according to SP 800-108 Rev. 1 or SP 800-56C Rev. 2.
[conditional] If KDF-MAC-2S is selected, the evaluator shall ensure the TSS includes a description of the randomness extraction step, including the following:
- The description must include how an approved untruncated MAC function is being used for the randomness extraction step, and the evaluator must verify the TSS describes that the output length (in bits) of the MAC function is at least as large as the targeted security strength (in bits) of the parameter set employed by the key establishment scheme (see Tables 1-3 of SP 800-56C Rev. 2).
- The description must include how the MAC function being used for the randomness extraction step is related to the PRF used in the key expansion, and the evaluator shall verify the TSS description includes the correct MAC function:
  - If an HMAC-hash is used in the randomness extraction step, then the same HMAC-hash (with the same hash function hash) is used as the PRF in the key expansion step.
  - If an AES-CMAC (with key length 128, 192, or 256 bits) is used in the randomness extraction step, then AES-CMAC with a 128-bit key is used as the PRF in the key expansion step.
- The description must include the lengths of the salt values being used in the randomness extraction step, and the evaluator shall verify the TSS description includes correct salt lengths:
  - If an HMAC-hash is being used as the MAC, the salt length can be any value up to the maximum bit length permitted for input to the hash function hash.
  - If an AES-CMAC is being used as the MAC, the salt length shall be the same length as the AES key (i.e. 128, 192, or 256 bits).
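The two-step extraction/expansion relationship described for KDF-MAC-2S can be sketched with HMAC-SHA-256 serving as both the extraction MAC and the expansion PRF (a simplified illustration of SP 800-56C Rev. 2; the fixed-info encoding is abbreviated relative to the standard):

```python
import hmac
import hashlib

def extract_then_expand(salt: bytes, shared_secret: bytes,
                        info: bytes, length: int) -> bytes:
    # Step 1 - randomness extraction: condense the shared secret into a
    # key derivation key using HMAC-SHA-256 keyed with the salt.
    kdk = hmac.new(salt, shared_secret, hashlib.sha256).digest()
    # Step 2 - key expansion: counter-mode expansion with the SAME
    # HMAC-hash as the PRF, as the TSS must describe.
    out, counter = b"", 1
    while len(out) < length:
        out += hmac.new(kdk, counter.to_bytes(4, "big") + info,
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]
```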
4.1.1.3.2. AGD
The evaluator shall verify that the AGD guidance instructs the administrator how to configure the TOE to use the selected key types for all uses identified in the ST.
4.1.1.3.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- PRF (KDF-CTR, KDF-FB, KDF-DPI)
- Encryption algorithm (KDF-ENC)
- Hash algorithm (KDF-HASH)
- MAC algorithm (KDF-MAC-1S, KDF-MAC-2S, KDF-KMAC)
- Counter size (KDF-CTR, KDF-FB, KDF-DPI)
- Counter location (KDF-CTR, KDF-FB, KDF-DPI)
- Salt length (KDF-MAC-1S, KDF-MAC-2S)
- Derived key size
If the algorithm implementation supports a range of derived key sizes, it is sufficient to define test groups for the start and end of the range.
Each test group shall consist of at least 5 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of at least one arbitrary key derivation key or shared secret, an arbitrary salt (KDF-MAC-1S, KDF-MAC-2S), an arbitrary fixed info (KDF-MAC-1S, KDF-MAC-2S), and the corresponding derived key.
- A representative sample of the domain of supported key derivation key sizes shall be tested.
- A representative sample of the domain of supported shared secret sizes shall be tested.
- A representative sample of the domain of supported salt sizes shall be tested.
- A representative sample of the domain of supported fixed info patterns / sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct derived key.
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
4.1.1.3.4. KMD
The evaluator shall examine the KMD to ensure that:
- The KMD describes the complete key derivation chain and the description must be consistent with the description in the TSS. For all key derivations the TOE must use a method as described in the [DSC cPP] table. There should be no uncertainty about how a key is derived from another in the chain.
- The length of the key derivation key is defined by the PRF. The evaluator should check whether the key derivation key length is consistent with the length provided by the selected PRF.
- If a key is used as an input to several KDFs, each invocation must use a distinct context string. If the output of a KDF execution is used for multiple cryptographic keys, those keys must be disjoint segments of the output.
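The last two requirements, a distinct context string per KDF invocation and disjoint output segments per derived key, can be illustrated as follows (a hypothetical HMAC-based PRF derivation, not the TOE's KDF):

```python
import hmac
import hashlib

def derive_block(kdk: bytes, context: bytes, length: int) -> bytes:
    # Illustrative counter-mode derivation with HMAC-SHA-256 as the PRF.
    out, i = b"", 1
    while len(out) < length:
        out += hmac.new(kdk, i.to_bytes(4, "big") + context,
                        hashlib.sha256).digest()
        i += 1
    return out[:length]

kdk = b"\x01" * 32
# One KDF execution feeding two keys: the keys are DISJOINT segments.
block = derive_block(kdk, b"storage-keys", 64)
enc_key, mac_key = block[:32], block[32:64]
# A second use of the same KDK uses a DISTINCT context string.
auth_key = derive_block(kdk, b"auth-keys", 32)
```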
4.1.1.4. FCS_CKM_EXT.3 Cryptographic Key Access
4.1.1.4.1. TSS
The evaluator shall verify that for each selected cryptographic method the ST contains the corresponding SFR. The evaluator shall ensure the TSS describes the key access operations which utilize the specified cryptographic methods. Where multiple methods may be selected, the evaluator shall verify that each method’s usage is described.
4.1.1.4.2. AGD
There are no guidance evaluation activities for this component.
4.1.1.4.3. Test
Testing for this SFR is performed under the corresponding functions based on the selection(s) made in the ST.
4.1.1.4.4. KMD
The developer is allowed to provide additional details of the key access operations and cryptographic methods in the KMD. The evaluator shall verify the accuracy of any additional descriptions (if present).
4.1.1.8. FCS_CKM_EXT.8 Password-Based Key Derivation
4.1.1.8.1. TSS
The evaluator shall review the TSS to verify that it contains a description of the PBKDF. The evaluator shall also confirm the TSF supports the selected hash function itself.
If the TSF performs additional conditioning, whitening, or manipulation of the password or passphrase before applying the PBKDF, or to the output of the PBKDF, the evaluator shall ensure that the TSS describes the actions and provides assurance that the TSF does not negatively impact the entropy of the PBKDF output.
If any manipulation of the key is performed in forming the submask that will be used to form the KEK, that process shall be described in the TSS.
4.1.1.8.2. AGD
There are no AGD evaluation activities for this component.
4.1.1.8.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Hash algorithm
Each test group shall consist of at least 10 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary password, an arbitrary salt, an arbitrary iteration count, and the corresponding derived key.
- A representative sample of the domain of supported password lengths shall be tested.
- A representative sample of the domain of supported salt sizes shall be tested.
- A representative sample of the domain of supported iteration counts shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct derived key.
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
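As an example of a known-good implementation for comparison, Python's standard library provides PBKDF2; evaluator tooling might compare TSF-derived keys against it (RFC 6070 publishes test vectors for the HMAC-SHA-1 case). A minimal sketch:

```python
import hashlib

def known_good_pbkdf2(hash_name: str, password: bytes, salt: bytes,
                      iterations: int, dklen: int) -> bytes:
    # hashlib's PBKDF2 serves as the known-good reference; TSF output for
    # the same test case parameters must match this value exactly.
    return hashlib.pbkdf2_hmac(hash_name, password, salt, iterations, dklen)
```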
4.1.1.8.4. KMD
There are no KMD evaluation activities for this component.
4.1.2. Cryptographic Operation (FCS_COP)
4.1.2.1. FCS_COP.1/AEAD Cryptographic Operation (Authenticated Encryption with Associated Data)
4.1.2.1.1. TSS
The evaluator shall check that the TSS includes a description of encryption functions used for authenticated encryption with associated data. The evaluator should check that this description of the selected encryption function includes the key sizes and modes of operations as specified in the [DSC cPP] Table "AEAD Symmetric-Key Cryptography".
The evaluator shall check that the TSS describes the means by which the TOE satisfies constraints on algorithm parameters included in the selections made for 'cryptographic algorithm' and 'list of standards'.
4.1.2.1.2. AGD
If the product supports multiple modes, the evaluator shall examine the vendor’s documentation to determine that the method of choosing a specific mode/key size by the end user is described.
4.1.2.1.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Mode of operation
- Algorithm direction (encrypt/decrypt)
- Key size
- MAC tag size
For each test group, the evaluator shall generate at least 10 test cases meeting the following requirements:
- For encryption, each test case shall consist of an arbitrary key, an arbitrary initialization vector, an arbitrary input plaintext, an arbitrary associated data string (if supported), and the corresponding ciphertext and MAC tag.
- For decryption, each test case shall consist of an arbitrary key, an arbitrary initialization vector, an arbitrary input ciphertext, an arbitrary associated data string (if supported), the corresponding MAC tag, and the corresponding plaintext. For 3 of the 10 test cases, the key, ciphertext, or associated data string shall be modified such that the MAC tag is invalid.
- A representative sample of the domain of supported input plaintext/ciphertext sizes shall be tested.
- A representative sample of the domain of supported initialization vector sizes shall be tested.
- A representative sample of the domain of supported associated data sizes shall be tested (if supported).
For each test case in each test group, the evaluator shall verify that the TSF generates the correct ciphertext/plaintext, or outputs a failure if the MAC tag is invalid (decryption).
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
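The decryption-failure cases above can be sketched with AES-256-GCM from the `cryptography` package serving as the known-good AEAD implementation (a hypothetical harness; `decrypts_ok` and the test data are illustrative, not a required interface):

```python
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)
nonce = os.urandom(12)          # 96-bit IV; must never repeat under this key
aad = b"record-header"
ct = aead.encrypt(nonce, b"secret record", aad)   # ciphertext || MAC tag

def decrypts_ok(n: bytes, c: bytes, a: bytes) -> bool:
    # Decryption must succeed only when ciphertext, MAC tag, and
    # associated data are all unmodified.
    try:
        aead.decrypt(n, c, a)
        return True
    except InvalidTag:
        return False
```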
4.1.2.1.4. KMD
The evaluator shall examine the KMD to ensure that the points at which authenticated encryption and decryption occurs are described, and that the complete data path for authenticated encryption with associated data is described. The evaluator checks that this description is consistent with the relevant parts of the TSS.
Assessment of the complete data path for authenticated encryption with associated data includes confirming that the KMD describes the data flow from the device’s host interface to the device’s non-volatile memory storing the data, and gives information enabling the user data datapath to be distinguished from those situations in which data bypasses the data encryption engine (e.g. read-write operations to an unencrypted Master Boot Record area). The evaluator shall ensure that the documentation of the data path is detailed enough that it thoroughly describes the parts of the TOE that the data passes through (e.g. different memory types, processors and co-processors), its encryption state (i.e. encrypted or unencrypted) in each part, and any places where the data is stored. For example, any caching or buffering of the data should be identified and distinguished from the final destination in non-volatile memory (the latter represents the location from which the host will expect to retrieve the data in future).
The evaluator shall examine the KMD to ensure that it describes how the IV is generated and that the same initialization vector is never reused to encrypt distinct plaintexts under the same key. Moreover, in the case of GCM, the evaluator must ensure that, at each invocation of GCM, the length of the plaintext is at most (2^32)-2 blocks.
4.1.2.2. FCS_COP.1/CMAC Cryptographic Operation (CMAC)
4.1.2.2.1. TSS
The evaluator shall examine the TSS to ensure that it specifies the following values used by the CMAC functions: output MAC length used.
4.1.2.2.2. AGD
There are no guidance evaluation activities for this component.
4.1.2.2.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least three test groups (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- AES key size
Each test group shall consist of at least 8 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary key, an arbitrary input message, and its corresponding MAC tag.
- A representative sample of the domain of supported key sizes shall be tested.
- A representative sample of the domain of supported input message sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct MAC tag.
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
4.1.2.2.4. KMD
There are no KMD evaluation activities for this component.
4.1.2.3. FCS_COP.1/KeyEncap Cryptographic Operation (Key Encapsulation)
4.1.2.3.1. TSS
The evaluator shall ensure that the selected RSA key encapsulation (transport) schemes correspond to the key generation schemes selected in FCS_CKM.1/AKG.
The evaluator shall ensure that, for each key encapsulation (transport) scheme, the security strength of the key encapsulation key is greater than or equal to the security strength of the encapsulated keying material.
For each key encapsulation (transport) scheme, the TSS shall list the associated key derivation function if applicable.
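The strength comparison can be illustrated using the comparable-security-strength estimates of SP 800-57 Part 1 (subset shown; the helper function is hypothetical):

```python
# Comparable security strengths (bits) for RSA moduli, per the
# SP 800-57 Part 1 equivalence table (illustrative subset).
RSA_STRENGTH = {2048: 112, 3072: 128, 7680: 192, 15360: 256}

def encapsulation_allowed(modulus_bits: int, wrapped_key_strength: int) -> bool:
    # The key-encapsulation key must be at least as strong as the
    # keying material it protects.
    return RSA_STRENGTH[modulus_bits] >= wrapped_key_strength
```

For example, a 3072-bit RSA key (128-bit strength) may encapsulate an AES-128 key, but a 2048-bit RSA key (112-bit strength) may not.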
4.1.2.3.2. AGD
There are no guidance evaluation activities for this component.
4.1.2.3.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Modulus size
- Key agreement role (initiator or responder)
- Hash algorithm (if applicable)
- Associated data pattern (if applicable)
- KDF configuration (if applicable)
Each test group shall consist of at least 10 test cases meeting the following requirements:
- Test cases shall be generated according to the parameter configuration of the test group.
- Each test case shall consist of an arbitrary "owner" private key, an arbitrary "peer" public key, an arbitrary associated data string (if applicable), and the corresponding shared secret or derived key.
- A representative sample of the domain of supported associated data sizes shall be tested.
- A representative sample of the domain of supported shared secret sizes shall be tested.
- A representative sample of the domain of supported derived key sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct shared secret or derived key.
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
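Where the scheme has an associated key derivation function, a known-good reference for the derived-key check can be as simple as a standard-library implementation of a published KDF. The sketch below implements HKDF (RFC 5869) over HMAC-SHA-256 and cross-checks it against the RFC's first test vector; it is illustrative only, not a required tool:

```python
import hashlib
import hmac

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int) -> bytes:
    """HKDF (RFC 5869) over HMAC-SHA-256: extract, then expand."""
    prk = hmac.new(salt if salt else b"\x00" * 32, ikm, hashlib.sha256).digest()
    okm, t = b"", b""
    for i in range((length + 31) // 32):
        t = hmac.new(prk, t + info + bytes([i + 1]), hashlib.sha256).digest()
        okm += t
    return okm[:length]

# Cross-check against RFC 5869 test case 1.
okm = hkdf_sha256(b"\x0b" * 22, bytes(range(13)), bytes(range(0xF0, 0xFA)), 42)
assert okm.hex() == ("3cb25f25faacd57a90434f64d0362f2a"
                     "2d2d0a90cf1a5a4c5db02d56ecc4c5bf"
                     "34007208d5b887185865")
```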
4.1.2.3.4. KMD
There are no KMD evaluation activities for this component.
4.1.2.4. FCS_COP.1/KeyWrap Cryptographic Operation (Key Wrap)
4.1.2.4.1. TSS
The evaluator shall check that the TSS includes a description of the encryption functions used for key wrapping. The evaluator should check that this description of the selected encryption function includes the key sizes and modes of operation as specified in the [DSC cPP] Table "Key Wrap".
The evaluator shall ensure that, for each key wrapping function, the security strength of the key wrapping key is greater than or equal to the security strength of the wrapped keying material.
4.1.2.4.2. AGD
If the product supports multiple modes, the evaluator shall examine the vendor’s documentation to determine that the method of choosing a specific mode/key size by the end user is described.
4.1.2.4.3. Test
The algorithm tests might require access to a development version of the TOE or a test platform that provides tools not typically found on factory products.
The developer shall provide sufficient information to the evaluator to properly define the implementation of the algorithm. The evaluator shall define at least one test group (a configuration of algorithm properties and associated test cases) for each combination of the following parameters, according to the implementation of the algorithm:
- Mode of operation
- Algorithm direction (encrypt/decrypt)
- Key size
For each test group, the evaluator shall generate at least 50 test cases meeting the following requirements:
- For encryption, each test case shall consist of an arbitrary key, an arbitrary input plaintext, and its corresponding ciphertext.
- For decryption, each test case shall consist of an arbitrary key, an arbitrary input ciphertext, and its corresponding plaintext. For 10 of the 50 test cases, the key or ciphertext shall be modified such that the ciphertext is invalid.
- A representative sample of the domain of supported input plaintext/ciphertext sizes shall be tested.
For each test case in each test group, the evaluator shall verify that the TSF generates the correct ciphertext/plaintext, or outputs a failure if the ciphertext is invalid (decryption).
The evaluator shall be able to provide a rationale as to the sufficiency of the test groups and test cases in exercising the algorithm. Test cases shall be generated using a known-good implementation to ensure the correct result is produced by the TSF. The evaluator shall determine the proper known-good implementation and provide a rationale for its use for testing. Internal or external implementations are acceptable with a sufficient rationale.
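The invalid decryption cases can be produced by bit-flipping otherwise valid test data. The following non-normative sketch assumes a `reference_wrap` callable standing in for the evaluator's known-good wrap implementation; all names are hypothetical:

```python
import secrets

def corrupt(data: bytes) -> bytes:
    """Flip one random bit so a ciphertext (or key) becomes invalid."""
    i = secrets.randbelow(len(data))
    return data[:i] + bytes([data[i] ^ (1 << secrets.randbelow(8))]) + data[i + 1:]

def build_decrypt_cases(reference_wrap, keys, plaintexts, bad_count=10):
    """Build decryption test cases from a known-good wrap function;
    the first `bad_count` cases are invalidated, and the TSF is
    expected to output a failure for each of them."""
    cases = []
    for n, (key, pt) in enumerate(zip(keys, plaintexts)):
        ct = reference_wrap(key, pt)
        if n < bad_count:
            ct = corrupt(ct)
        cases.append((key, ct, pt, n < bad_count))
    return cases
```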
4.1.2.4.4. KMD
The evaluator shall examine the KMD to ensure that it describes when key wrapping occurs, that the KMD description is consistent with the description in the TSS, and that for all keys that are wrapped the TOE uses a method as described in the [DSC cPP] table. There should be no uncertainty about which key is the wrapping key, which key is being wrapped, or where the wrapping key comes from (i.e., how it is derived).
4.2. User Data Protection (FDP)
4.2.1. Data Authentication (FDP_DAU)
4.2.1.1. FDP_DAU.1/Prove Basic Data Authentication (for Use with the Prove Service)
4.2.1.1.1. TSS
The evaluator shall examine the TSS and ensure it describes the data that is validity-stamped and, where applicable, authenticity-stamped to the level of understanding the DSC has about the data or its origin (from the user providing it to the Prove service). The evaluator shall also ensure that the TSS describes how the evidence of validity or authenticity is generated, including the subjects who perform the verification, and the form in which the validity or authenticity stamp is represented (e.g., a cryptographic signature or a MAC using a symmetric key shared with the receiver).
4.2.1.1.2. AGD
The evaluator shall ensure that the operational guidance describes how to configure validity-stamping on the TOE.
4.2.1.1.3. Test
The following testing may require the TOE developer to make a test harness available to the evaluator that allows the evaluator to interface directly with the DSC. Tests may also require the use of an automated script provided by either the vendor or the evaluator. If a test script is made available, the evaluator shall verify that it includes sufficient detail to validate the claims made in the TSS.
Test 1: Demonstrate the TOE can validity-stamp data.
For each configurable option to validity-stamp data, the evaluator shall configure the TOE to create a data object or import a data object which has not yet been validity-stamped. The evaluator shall then instruct the TOE to validity-stamp this data object. The evaluator shall then export each data object and demonstrate it has been validity-stamped in accordance with the configured options.
Test 2: Demonstrate the TOE can disable validity-stamping of data objects.
The evaluator shall disable validity-stamping, or ensure it has been disabled, on the TOE.
The evaluator shall create a data object on the TOE or import an already-created data object which has not been validity-stamped.
The evaluator shall export that data object and verify it has not been validity-stamped.
4.2.1.1.4. KMD
There are no KMD evaluation activities for this component.
4.2.2. Factory Reset (Extended - FDP_FRS_EXT)
4.2.2.1. FDP_FRS_EXT.2 Factory Reset Behavior
4.2.2.1.1. TSS
The evaluator shall examine the TSS to determine the pre-installed SDOs that are reverted to their factory settings when a factory reset occurs, what the factory settings are for those SDOs, and that the TSS states that all non-permanent SDEs and SDOs are destroyed.
4.2.2.1.2. AGD
The evaluator shall examine the operational guidance and verify that it identifies the pre-installed SDOs that are reverted to their initial values when a factory reset has been performed.
4.2.2.1.3. Test
The evaluator shall perform the following test:
1. The evaluator shall use each supported role to create or import an SDE or SDO that has known data.
2. The evaluator shall then verify that the created SDE/SDO resides either within the DSC, or under the control of the DSC.
3. The evaluator shall perform some operation for each SDE/SDO created or imported in step 1 that demonstrates that the SDE/SDO has been set to the indicated value.
4. For each pre-installed SDO that is identified in FDP_FRS_EXT.2.1, the evaluator shall perform some operation to verify the current value of that SDO.
5. The evaluator shall initiate a factory reset.
6. For each created or imported SDE/SDO that is identified in step 3, the evaluator shall re-attempt the operation and verify that it no longer completes successfully because the SDE/SDO data has been erased.
7. For each pre-installed SDO that is identified in step 4, the evaluator shall re-attempt the operation and verify that the SDOs have been set to their factory default values.
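Purely for illustration, the expected factory-reset behavior can be modeled as follows; the class, SDO names, and values are hypothetical, not taken from any particular DSC:

```python
class DSCModel:
    """Toy model of FDP_FRS_EXT.2 behavior: pre-installed SDOs revert to
    factory values and all user-created, non-permanent SDEs/SDOs are
    destroyed on factory reset."""
    FACTORY_SDOS = {"attestation_policy": "factory-default"}  # hypothetical

    def __init__(self):
        self.preinstalled = dict(self.FACTORY_SDOS)
        self.user_objects = {}

    def create(self, name, value):
        # corresponds to step 1: create an SDE/SDO with known data
        self.user_objects[name] = value

    def factory_reset(self):
        self.user_objects.clear()                    # destroy non-permanent SDEs/SDOs
        self.preinstalled = dict(self.FACTORY_SDOS)  # revert pre-installed SDOs

dsc = DSCModel()
dsc.create("user_key", b"known data")
dsc.preinstalled["attestation_policy"] = "custom"
dsc.factory_reset()
assert "user_key" not in dsc.user_objects                       # step 6
assert dsc.preinstalled["attestation_policy"] == "factory-default"  # step 7
```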
4.2.2.1.4. KMD
There are no KMD evaluation activities for this component.
4.3. Identification and Authentication (FIA)
4.3.1. Authorization Failure Handling (Extended - FIA_AFL_EXT)
4.3.1.1. FIA_AFL_EXT.2 Authorization Failure Response
4.3.1.1.1. TSS
The evaluator shall examine the TSS to determine that it describes the method by which access to an SDO is restored following a lockout that results from excessive authentication failures.
4.3.1.1.2. AGD
The evaluator shall examine the guidance to ensure that it describes the method by which an administrator unlocks access to an SDO following a lockout that results from excessive authentication failures.
4.3.1.1.3. Test
The evaluator shall intentionally fail authentication attempts to access an SDO until they are locked out from interacting with it. The evaluator shall then follow the operational guidance to unlock access to the SDO and verify that it was successful by subsequently using valid credentials to access the SDO.
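The lockout-and-unlock behavior being exercised can be sketched as a toy model; all names, the failure limit, and the string-based credential are hypothetical simplifications of what a real DSC enforces internally:

```python
class SDOAccessGuard:
    """Toy model of FIA_AFL_EXT.2: after `limit` consecutive
    authentication failures the SDO locks until an administrator
    explicitly unlocks it."""

    def __init__(self, secret: str, limit: int = 5):
        self.secret = secret
        self.limit = limit
        self.failures = 0
        self.locked = False

    def authenticate(self, attempt: str) -> bool:
        if self.locked:
            return False  # even valid credentials are rejected while locked
        if attempt == self.secret:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.limit:
            self.locked = True
        return False

    def admin_unlock(self):
        """Administrator action described in the operational guidance."""
        self.failures = 0
        self.locked = False
```

A test run would fail authentication `limit` times, confirm the lock, follow the unlock procedure, then confirm access with valid credentials.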
4.3.1.1.4. KMD
There are no KMD evaluation activities for this component.
4.4. Protection of the TSF (FPT)
4.4.1. Fail Secure (FPT_FLS)
4.4.1.1. FPT_FLS.1/FW Failure with Preservation of Secure State (Firmware)
4.4.1.1.1. TSS
The evaluator shall examine the TSS to verify that it describes the actions taken when the TOE experiences each of the stated failures and how these actions ensure the DSC preserves a secure state.
The evaluator shall verify that the TSS describes the state of the DSC when the firmware validity checks fail, including the various failure modes assumed.
4.4.1.1.2. AGD
For each failure state, the evaluator shall examine the operational guidance to verify that it describes what actions should be taken to attempt to resolve the failure state.
4.4.1.1.3. Test
Note that this test requires firmware builds that are deliberately invalidated to cause authenticity, integrity, and rollback violation failures.
The evaluator shall examine the TOE’s behavior when it is loaded with a firmware build that causes a firmware failure. The evaluator shall ensure that when the failure occurs, the TOE prevents further processing of TSF and user data and performs any actions consistent with maintaining a secure state as described in the TSS.
The evaluator shall repeat this test as necessary to observe each of the specific firmware failures identified in the SFR.
4.4.1.1.4. KMD
There are no KMD evaluation activities for this component.
4.4.2. Mutable/Immutable Firmware (Extended - FPT_MFW_EXT)
4.4.2.1. FPT_MFW_EXT.2 Basic Firmware Integrity
4.4.2.1.1. TSS
The evaluator shall verify that the TSS describes which critical memory is measured for these integrity values and how the measurement is performed (including which TOE software measures the memory integrity values, how that software accesses the critical memory, and which algorithms are used).
4.4.2.1.2. AGD
If the integrity values are provided to the administrator, the evaluator shall verify that the AGD guidance contains instructions for retrieving these values and information for interpreting them. For example, if multiple measurements are taken, what those measurements are and how changes to those values relate to changes in the device state.
4.4.2.1.3. Test
Note that the following test may require the developer to provide access to a test platform that provides the evaluator with tools that are not typically available to end users.
The evaluator shall repeat the following test for each measurement:
The evaluator shall boot the TOE in an approved state and record the measurement taken. The evaluator shall modify the critical memory or value that is measured. The evaluator shall reboot the TOE and verify that the measurement changed.
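One plausible form of such a measurement is a cryptographic digest of the measured critical-memory region. The sketch below assumes SHA-256; the actual algorithm and measured regions are TOE-specific:

```python
import hashlib

def measure(critical_memory: bytes) -> str:
    """One possible integrity measurement: a SHA-256 digest of the
    measured critical-memory region (algorithm is TOE-specific)."""
    return hashlib.sha256(critical_memory).hexdigest()

# A single-byte modification changes the recorded measurement.
baseline = measure(b"approved firmware image")
tampered = measure(b"approved firmware imagX")
assert baseline != tampered
```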
4.4.2.1.4. KMD
There are no KMD evaluation activities for this component.
4.4.2.2. FPT_MFW_EXT.3 Firmware Authentication with Identity of Guarantor
4.4.2.2.1. TSS
The evaluator shall examine the TSS to ensure it describes the methods and identities used to verify integrity and authenticity of the firmware. The TSS shall identify the Guarantor and how to verify its identity.
4.4.2.2.2. AGD
There are no guidance activities for this component.
4.4.2.2.3. Test
The TOE guarantees the authenticity of the firmware using the identity of the Guarantor. This prevents an attacker from impersonating a Guarantor when sending firmware to a device, or from modifying the firmware in transit.
Test 1: Verify Authentic Firmware
The evaluator shall trigger the TOE to load and evaluate the authenticity of authentic firmware according to the methods described in the TSS. The evaluator shall ensure that the TOE provides a clear indication of the success of the evaluation to consider the test a 'Pass'; otherwise, the test is a 'Fail'.
Test 2: Verify Unauthentic Firmware
The evaluator shall deliberately modify authentic firmware.
The evaluator shall trigger the TOE to load and evaluate the authenticity of the deliberately modified firmware according to the methods described in the TSS. The evaluator shall ensure that the TOE provides a clear indication of the failure of the evaluation to consider the test a 'Pass'; otherwise, the test is a 'Fail'.
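The pass/fail logic of these two tests can be sketched as follows. Note the heavy simplification: an HMAC tag stands in for the Guarantor's digital signature purely to keep the example self-contained, and the key and names are hypothetical:

```python
import hashlib
import hmac

# Hypothetical stand-in for the Guarantor's verification key; a real DSC
# would verify a digital signature against the Guarantor's public key.
GUARANTOR_KEY = b"guarantor-verification-key"

def firmware_authentic(image: bytes, tag: bytes) -> bool:
    """Illustrative authenticity check mirroring Tests 1 and 2."""
    expected = hmac.new(GUARANTOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

image = b"authentic firmware"
tag = hmac.new(GUARANTOR_KEY, image, hashlib.sha256).digest()
assert firmware_authentic(image, tag)             # Test 1: authentic -> Pass
assert not firmware_authentic(image + b"X", tag)  # Test 2: modified -> rejected
```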
4.4.2.2.4. KMD
There are no KMD evaluation activities for this component.
4.4.3. Replay Detection (FPT_RPL)
4.4.3.1. FPT_RPL.1/Rollback Replay Detection (Rollback)
4.4.3.1.1. TSS
The evaluator shall examine the TSS and other vendor documentation and ensure that they describe the methods used to guarantee the validity of firmware identifiers and to prevent the TSF from executing older instances than the one that is currently authorized.
4.4.3.1.2. AGD
There are no AGD evaluation activities for this component.
4.4.3.1.3. Test
The evaluator shall repeat the following tests to cover all allowed firmware verification mechanisms as described in the TSS. For example, if the firmware verification mechanism replaces an entire partition or subset of the DSC scope containing many separate code files, the evaluator does not need to repeat the test for each individual file.
Test 1: The evaluator shall attempt to execute an earlier instance or build of the firmware (as determined by the vendor). The evaluator shall verify that this attempt fails by checking the version identifiers or cryptographic hashes of the firmware against those previously recorded and checking that the values correspond to an unauthorized build.
Test 2: The evaluator shall attempt to execute a current or later instance and shall verify that the firmware execution succeeds.
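A minimal sketch of the monotonic version check underlying these tests, assuming integer version counters (one common anti-rollback design; the function name is illustrative):

```python
def rollback_allowed(current_version: int, candidate_version: int) -> bool:
    """Monotonic anti-rollback check: only firmware at or above the
    currently authorized version may execute."""
    return candidate_version >= current_version

assert not rollback_allowed(7, 6)  # earlier build rejected (Test 1)
assert rollback_allowed(7, 7)      # current build accepted (Test 2)
assert rollback_allowed(7, 8)      # later build accepted   (Test 2)
```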
4.4.3.1.4. KMD
There are no KMD evaluation activities for this component.
4.4.4. Time Counting (FPT_STM_EXT)
4.4.4.1. FPT_STM_EXT.1 Reliable Time Counting
4.4.4.1.1. TSS
The evaluator shall examine the TSS to ensure that it lists each security function that makes use of time. The evaluator shall verify that the TSS provides a description of how time is maintained and why it is considered reliable in the context of each of the time-related functions.
The evaluator shall examine the TSS and other vendor documentation and ensure they describe the location of the time source for the TSF.
[conditional] For a time source that is outside the TOE but contained inside the same physical enclosure, the evaluator shall ensure there is a description of how the time is obtained from the source outside the TOE.
4.4.4.1.2. AGD
The evaluator shall examine the guidance documentation to ensure it instructs the administrator how to set the time or indicates any configuration steps required for the TSF to receive time data from an external source.
4.4.4.1.3. Test
The evaluator shall perform the following tests:
Test 1 [conditional]: If the TSF provides a mechanism to manually set the time, the evaluator shall use the guidance documentation to set the time. The evaluator shall then use an available interface to observe that the time was set correctly.
Test 2 [conditional]: If the TSF receives time data from some source outside the TOE, the evaluator shall use the guidance documentation to configure the external time source (if applicable). The evaluator shall observe that the time has been set to the expected value.
Test 3 [conditional]: If the TSF configures time data inside the TOE, the evaluator shall use the guidance documentation to configure the internal time source (if applicable). The evaluator shall observe that the time has been set to the expected value.
4.4.4.1.4. KMD
There are no KMD evaluation activities for this component.
4.5. Trusted Path/Channels (FTP)
4.5.1. CCM Protocol (Extended - FTP_CCMP_EXT)
4.5.1.1. FTP_CCMP_EXT.1 CCM Protocol
4.5.1.1.1. TSS
The evaluator shall verify that the TSS includes a description of the TOE’s expected responses to CCMP authentication failures and malformed or invalid CCMP data units.
4.5.1.1.2. AGD
There are no AGD evaluation activities for this component.
4.5.1.1.3. Test
The evaluator shall perform the following tests:
Test 1: The evaluator shall attempt to establish a CCMP connection to the TOE from an external entity, observe the return traffic with a network traffic analyzer, and verify that the connection succeeds and that the traffic is identified as properly constructed CCMP data units.
Test 2: The evaluator shall attempt to establish a CCMP connection to the TOE using five messages with incorrect or invalid authentication factors and verify that an authentication failure or error status is returned.
Test 3: The evaluator shall attempt to establish a CCMP connection to the TOE using five different messages that are malformed or invalid due to noncompliance with the CCMP standard and observe that all connection attempts are unsuccessful.
Test 4: The evaluator shall establish a valid CCMP connection to the TOE. Once this has been established, the evaluator shall send ten different messages that are malformed or invalid due to noncompliance with the CCMP standard and observe that each of these messages are rejected.
4.5.1.1.4. KMD
There are no KMD evaluation activities for this component.
4.5.2. GCM Protocol (Extended - FTP_GCMP_EXT)
4.5.2.1. FTP_GCMP_EXT.1 GCM Protocol
4.5.2.1.1. TSS
The evaluator shall verify that the TSS includes a description of the TOE’s expected responses to GCMP authentication failures and malformed or invalid GCMP data units.
4.5.2.1.2. AGD
There are no AGD evaluation activities for this component.
4.5.2.1.3. Test
The evaluator shall perform the following tests:
Test 1: The evaluator shall attempt to establish a GCMP connection to the TOE from an external entity, observe the return traffic with a network traffic analyzer, and verify that the connection succeeds and that the traffic is identified as properly constructed GCMP data units.
Test 2: The evaluator shall attempt to establish a GCMP connection to the TOE using five messages with incorrect or invalid authentication factors and verify that an authentication failure or error status is returned.
Test 3: The evaluator shall attempt to establish a GCMP connection to the TOE using five different messages that are malformed or invalid due to noncompliance with the GCMP standard and observe that all connection attempts are unsuccessful.
Test 4: The evaluator shall establish a valid GCMP connection to the TOE. Once this has been established, the evaluator shall send ten different messages that are malformed or invalid due to noncompliance with the GCMP standard and observe that each of these messages are rejected.
4.5.2.1.4. KMD
There are no KMD evaluation activities for this component.
4.5.3. Inter-TSF Trusted Channel (Extended - FTP_ITC_EXT)
4.5.3.1. FTP_ITC_EXT.1 Cryptographically Protected Communications Channels
4.5.3.1.1. TSS
The evaluator shall review the TSS to determine that it lists all trusted channels the TOE uses for remote communications, including both the external entities and remote users that use each channel as well as the protocol that is used for each.
4.5.3.1.2. AGD
There are no AGD evaluation activities for this component.
4.5.3.1.3. Test
The evaluator shall configure the TOE to communicate with each external IT entity or type of remote user identified in the TSS. The evaluator shall monitor network traffic while the TSF performs communication with each of these destinations. The evaluator shall ensure that for each session a trusted channel was established in conformance with the protocols identified in the selection.
4.5.3.1.4. KMD
There are no KMD evaluation activities for this component.
4.5.4. Encrypted Data Communications (Extended - FTP_ITE_EXT)
4.5.4.1. FTP_ITE_EXT.1 Encrypted Data Communications
4.5.4.1.1. TSS
The evaluator shall review the TSS to determine that it lists all encryption mechanisms the TOE uses for protected external communications, along with the types of communications protected using each mechanism.
4.5.4.1.2. AGD
There are no AGD evaluation activities for this component.
4.5.4.1.3. Test
The evaluator shall configure the TOE to communicate with each external entity identified in the TSS. The evaluator shall initiate a transaction that will result in data being transferred to the TOE through the mechanism and other data returned to the initiating entity through the mechanism. The evaluator shall verify that the data returned to the entity, when received, was encrypted using the documented mechanism.
4.5.4.1.4. KMD
There are no KMD evaluation activities for this component.
4.5.5. Physically Protected Channel (Extended - FTP_ITP_EXT)
4.5.5.1. FTP_ITP_EXT.1 Physically Protected Channel
4.5.5.1.1. TSS
The evaluator shall review the TSS to determine that it lists all mechanisms the TOE uses for physically protected external communications, along with the types of communications protected using each mechanism.
4.5.5.1.2. AGD
There are no AGD evaluation activities for this component.
4.5.5.1.3. Test
There are no test activities for this component.
4.5.5.1.4. KMD
There are no KMD evaluation activities for this component.
5. Evaluation Activities for SARs
The sections below specify EAs for the Security Assurance Requirements (SARs) included in the related cPPs. The EAs in Section 2, Evaluation Activities for SFRs, Section 3, Evaluation Activities for Optional Requirements, and Section 4, Evaluation Activities for Selection-Based Requirements are an interpretation of the more general CEM assurance requirements as they apply to the specific technology area of the TOE.
In this section, each SAR that is contained in the [DSC cPP] is listed, and the EAs that are not associated with an SFR are captured here, or a reference is made to the CEM, and the evaluator is expected to perform the CEM work units.
5.1. ASE: Security Target Evaluation
When evaluating a Security Target, the evaluator performs the work units as presented in the CEM. In addition, the evaluator ensures the content of the TSS in the ST satisfies the EAs specified in Section 2 as well as the EAs for the optional and selection-based SFRs claimed by the ST and specified in Section 3 and Section 4.
5.2. ADV: Development
5.2.1. Basic Functional Specification (ADV_FSP.1)
The EAs for this assurance component focus on understanding the interfaces (e.g., application programming interfaces, command line interfaces, graphical user interfaces, network interfaces) described in the AGD documentation, and possibly identified in the TOE Summary Specification (TSS) and Key Management Documentation (KMD) in response to the SFRs. Specific evaluator actions to be performed against this documentation are identified (where relevant) for each SFR in Section 2, Evaluation Activities for SFRs, and in EAs for AGD, ATE and AVA SARs in other parts of Section 5.
The EAs presented in this section address the CEM work units ADV_FSP.1-1, ADV_FSP.1-2, ADV_FSP.1-3, and ADV_FSP.1-5.
The EAs are reworded for clarity and interpret the CEM work units such that they will result in more objective and repeatable actions by the evaluator. The EAs in this SD are intended to ensure the evaluators are consistently performing equivalent actions.
The documents to be examined for this assurance component in an evaluation are therefore the Security Target, AGD documentation, and any required supplementary information required by the [DSC cPP]: no additional "functional specification" documentation is necessary to satisfy the EAs. The interfaces that need to be evaluated are also identified by reference to the EAs listed for each SFR, and are expected to be identified in the context of the Security Target, AGD documentation, and any required supplementary information defined in the [DSC cPP] rather than as a separate list specifically for the purposes of CC evaluation. The direct identification of documentation requirements and their assessment as part of the EAs for each SFR also means that the tracing required in ADV_FSP.1.2D (work units ADV_FSP.1-4, ADV_FSP.1-6 and ADV_FSP.1-7) is treated as implicit and no separate mapping information is required for this element.
| CEM ADV_FSP.1 Work Units | Evaluation Activities |
|---|---|
| ADV_FSP.1-1 The evaluator shall examine the functional specification to determine that it states the purpose of each SFR-supporting and SFR-enforcing TSFI. | Section 5.2.1.1, “Evaluation Activity”: The evaluator shall examine the interface documentation to ensure it describes the purpose and method of use for each TSFI that is identified as being security relevant. |
| ADV_FSP.1-2 The evaluator shall examine the functional specification to determine that the method of use for each SFR-supporting and SFR-enforcing TSFI is given. | Section 5.2.1.2, “Evaluation Activity”: The evaluator shall check the interface documentation to ensure it identifies and describes the parameters for each TSFI that is identified as being security relevant. |
| ADV_FSP.1-3 The evaluator shall examine the presentation of the TSFI to determine that it identifies all parameters associated with each SFR-enforcing and SFR-supporting TSFI. | Section 5.2.1.3, “Evaluation Activity”: The evaluator shall examine the interface documentation to develop a mapping of the interfaces to SFRs. |
| ADV_FSP.1-4 The evaluator shall examine the rationale provided by the developer for the implicit categorisation of interfaces as SFR-non-interfering to determine that it is accurate. | Paragraph 609 from the CEM: "In the case where the developer has provided adequate documentation to perform the analysis called for by the rest of the work units for this component without explicitly identifying SFR-enforcing and SFR-supporting interfaces, this work unit should be considered satisfied." Since the rest of the ADV_FSP.1 work units will have been satisfied upon completion of the EAs, it follows that this work unit is satisfied as well. |
| ADV_FSP.1-5 The evaluator shall check that the tracing links the SFRs to the corresponding TSFIs. | Section 5.2.1.4, “Evaluation Activity for Specifying DSC Interface for Use in Composite Evaluations”: The evaluator shall examine the interface documentation to develop a mapping of the interfaces to SFRs. |
| ADV_FSP.1-6 The evaluator shall examine the functional specification to determine that it is a complete instantiation of the SFRs. | EAs that are associated with the SFRs in Section 2, and, if applicable, Section 3 and Section 4, are performed to ensure that all the SFRs where the security functionality is externally visible (i.e., at the TSFI) are covered. Therefore, the intent of this work unit is covered. |
| ADV_FSP.1-7 The evaluator shall examine the functional specification to determine that it is an accurate instantiation of the SFRs. | EAs that are associated with the SFRs in Section 2, and, if applicable, Section 3 and Section 4, are performed to ensure that all the SFRs where the security functionality is externally visible (i.e., at the TSFI) are addressed, and that the description of the interfaces is accurate with respect to the specification captured in the SFRs. Therefore, the intent of this work unit is covered. |
5.2.1.1. Evaluation Activity
The evaluator shall examine the interface documentation to ensure it describes the purpose and method of use for each TSFI that is identified as being security relevant.
In this context, TSFI are deemed security relevant if they are used by the administrator to configure the TOE, or if they are used by the TSF to send or receive security-relevant data or to invoke an API or other service that supports the execution of a security function. The intent is that these interfaces will be adequately tested, and having an understanding of how these interfaces are used in the TOE is necessary to ensure proper test coverage is applied.
The set of TSFI that are provided as evaluation evidence are contained in the operational guidance.
5.2.1.2. Evaluation Activity
The evaluator shall check the interface documentation to ensure it identifies and describes the parameters for each TSFI that is identified as being security relevant.
5.2.1.3. Evaluation Activity
The evaluator shall examine the interface documentation to develop a mapping of the interfaces to SFRs.
The evaluator uses the provided documentation and first identifies, and then examines a representative set of interfaces to perform the EAs presented in Section 2, Evaluation Activities for SFRs, including the EAs associated with testing of the interfaces.
It should be noted that there may be some SFRs that do not have an interface that is explicitly "mapped" to invoke the desired functionality. For example, generating a random bit string, destroying a cryptographic key that is no longer needed, or the TSF failing to a secure state, are capabilities that may be specified in SFRs, but are not invoked by an interface and are performed entirely within the TOE boundary.
However, if the evaluator is unable to perform some other required EA because there is insufficient design and interface information, then the evaluator is entitled to conclude that an adequate functional specification has not been provided, and hence that the verdict for the ADV_FSP.1 assurance component is a 'fail'.
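A mapping developed this way can be checked mechanically for coverage of the security-relevant TSFI. The interface and SFR names below are hypothetical examples, not a required format:

```python
# Hypothetical TSFI-to-SFR mapping; names are illustrative only.
mapping = {
    "GenerateKeyPair": ["FCS_CKM.1/AKG"],
    "WrapKey": ["FCS_COP.1/KeyWrap"],
    "FactoryReset": ["FDP_FRS_EXT.1"],
}

# Every security-relevant TSFI should map to at least one SFR; the
# converse need not hold, since some SFRs have no invoking interface.
security_relevant_tsfi = {"GenerateKeyPair", "WrapKey", "FactoryReset"}
unmapped = security_relevant_tsfi - mapping.keys()
assert not unmapped, f"security-relevant TSFI with no SFR mapping: {unmapped}"
```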
5.2.1.4. Evaluation Activity for Specifying DSC Interface for Use in Composite Evaluations
Table 4 below lists all potentially exportable DSC services supported by this [DSC cPP] along with the SFRs that implement the services. The ST author should include this table in the ST after removing the rows for services not supported by the DSC or whose interfaces are not available to dependent components. SFRs surrounded in brackets ("[SFR]") are optional or selection-based. All the SFRs listed in the resulting table must be selected in the ST.
| Service | DSC SFR |
|---|---|
| Asymmetric Key Generation | [FCS_CKM.1/AKG] |
| Symmetric Key Generation | [FCS_CKM.1/SKG] |
| Cryptographic Key Distribution | FCS_CKM.2 |
| Cryptographic Key Derivation | [FCS_CKM.5] |
| Cryptographic Key Destruction | FCS_CKM.6 |
| Cryptographic Key Agreement | FCS_CKM_EXT.7 |
| Password-Based Key Derivation | [FCS_CKM_EXT.8] |
| CMAC | [FCS_COP.1/CMAC] |
| Cryptographic Hashing | FCS_COP.1/Hash |
| Cryptographic Keyed Hash | FCS_COP.1/KeyedHash |
| Cryptographic Key Encapsulation | [FCS_COP.1/KeyEncap] |
| Cryptographic Key Wrapping | [FCS_COP.1/KeyWrap] |
| Cryptographic Signature Generation | FCS_COP.1/SigGen |
| Cryptographic Signature Verification | FCS_COP.1/SigVer |
| Symmetric-Key Cryptography | FCS_COP.1/SKC, [FTP_CCMP_EXT.1, FTP_GCMP_EXT.1] |
| Random Bit Generation | FCS_RBG.1, [FCS_RBG.6] |
| RBG Seeding | [FCS_RBG.2] |
| Cryptographic One-Time Value Generation | FCS_OTV_EXT.1 |
| Protected Storage | FCS_STG_EXT.1 |
| Factory Reset | FDP_FRS_EXT.1, [FDP_FRS_EXT.2] |
| Generation of Data Integrity Measurements | [FPT_PRO_EXT.2] |
| Generation of Secrets | FIA_SOS.2 |
| Get Time Stamp | [FPT_STM_EXT.1] |
| Cryptographic Self-Tests | FPT_TST.1 |
The ST must include a description of the interface for each exported service. Interface descriptions must include the purpose of the interface, method of use, parameters, and parameter descriptions. The descriptions must provide sufficient detail to allow the ST author of a composite evaluation to align the DSC interfaces with the dependent component SFRs that they satisfy. Multiple DSC operations may be combined to satisfy dependent component SFRs.
A DSC is not required to expose interfaces to all of the Table 4 services to a dependent component, nor is it limited to providing only these services. However, all services that are provided must be documented and their interfaces described.
5.3. AGD: Guidance Documents
It is not necessary for a TOE to provide separate documentation to meet the individual requirements of AGD_OPE and AGD_PRE. Although the EAs in this section are described under the traditionally separate AGD families, the mapping between the documentation provided by the developer and the AGD_OPE and AGD_PRE requirements may be many-to-many, as long as all requirements are met in documentation that is delivered to administrators and users (as appropriate) as part of the TOE.
Note that a DSC is not a direct-to-consumer product; it is more typically integrated into a larger system that is distributed to end customers. The guidance documentation may reflect this. For example, information on how to administer the TOE may not take the form of user-facing instructions; rather, it may be presented as APIs or other information that a third-party integrator could use to develop user-facing functions that interface with the DSC to perform the required behavior.
Note that unique aspects of the DSC’s design or architecture may mean that some of the EAs below are not applicable to the TOE because the relevant behavior is enforced by default. If a given EA is not applicable, the evaluator shall ensure that adequate rationale is provided to justify the lack of applicability.
5.3.1. Operational User Guidance (AGD_OPE.1)
The evaluator performs the CEM work units associated with the AGD_OPE.1 SAR. Specific requirements and EAs on the guidance documentation are identified (where relevant) in the individual EAs for each SFR.
In addition, the evaluator performs the EAs specified below.
5.3.1.1. Evaluation Activity
The evaluator shall ensure the Operational guidance documentation is distributed to administrators and users (as appropriate) as part of the TOE, so that there is a reasonable guarantee that administrators and users are aware of the existence and role of the documentation in establishing and maintaining the evaluated configuration.
5.3.1.2. Evaluation Activity
The evaluator shall ensure that Operational guidance is provided for every Operational Environment that the product supports as claimed in the Security Target, and that it adequately addresses all platforms claimed for the TOE in the Security Target.
5.3.1.3. Evaluation Activity
The evaluator shall ensure that the Operational guidance contains instructions for configuring any cryptographic engine associated with the evaluated configuration of the TOE. It shall warn the administrator that the use of other cryptographic engines was neither evaluated nor tested during the CC evaluation of the TOE.
5.3.1.4. Evaluation Activity
The evaluator shall ensure the Operational guidance makes it clear to an administrator which security functionality and interfaces have been assessed and tested by the EAs.
5.3.2. Preparative Procedures (AGD_PRE.1)
The evaluator performs the CEM work units associated with the AGD_PRE.1 SAR. Specific requirements and EAs on the preparative documentation are identified (and where relevant are captured in the Guidance Documentation portions of the EAs) in the individual EAs for each SFR.
Preparative procedures are distributed to administrators and users (as appropriate) as part of the TOE, so that there is a reasonable guarantee that administrators and users are aware of the existence and role of the documentation in establishing and maintaining the evaluated configuration.
In addition, the evaluator performs the EAs specified below. Note that unique aspects of the DSC’s design or architecture may mean that some of these EAs are not applicable to the TOE because they may be enforced by default. If a given EA is not applicable, the evaluator shall ensure that adequate rationale is provided to justify the lack of applicability.
5.3.2.1. Evaluation Activity
The evaluator shall examine the preparative procedures to ensure they include a description of how the administrator verifies that the operational environment can fulfil its role to support the security functionality (including the requirements of the Security Objectives for the Operational Environment specified in the Security Target).
The documentation should be written in an informal style, with sufficient detail and explanation that it can be understood and used by the target audience (which will typically include the product developers/engineers responsible for integrating the DSC into a larger system).
5.3.2.2. Evaluation Activity
The evaluator shall examine the preparative procedures to ensure they are provided for every Operational Environment that the product supports as claimed in the Security Target, and that they adequately address all platforms claimed for the TOE in the Security Target.
5.3.2.3. Evaluation Activity
The evaluator shall examine the preparative procedures to ensure they include instructions to successfully install the TSF in each Operational Environment.
5.3.2.4. Evaluation Activity
The evaluator shall examine the preparative procedures to ensure they include instructions to manage the security of the TSF as a product and as a component of the larger operational environment.
5.4. ALC: Life-cycle Support
5.4.1. Labelling of the TOE (ALC_CMC.1)
When evaluating that the TOE has been provided and is labelled with a unique reference, the evaluator performs the work units as presented in the CEM.
5.4.2. TOE CM coverage (ALC_CMS.1)
When evaluating the developer’s coverage of the TOE in their CM system, the evaluator performs the work units as presented in the CEM.
5.4.3. Basic Flaw Remediation (ALC_FLR.1) (optional)
When evaluating the developer’s procedures regarding basic flaw remediation, the evaluator performs the work units as presented in the CEM.
5.4.4. Flaw Reporting Procedures (ALC_FLR.2) (optional)
When evaluating the developer’s flaw reporting procedures, the evaluator performs the work units as presented in the CEM.
5.4.5. Systematic Flaw Remediation (ALC_FLR.3) (optional)
When evaluating the developer’s procedures regarding systematic flaw remediation, the evaluator performs the work units as presented in the CEM.
5.5. ATE: Tests
5.5.1. Independent Testing - Conformance (ATE_IND.1)
The focus of the testing is to confirm that the requirements specified in the SFRs are being met. Testing is performed to confirm that the functionality, as described in the AGD (operational guidance), TSS, and KMD (as specified for the functional specification), is accurate.
The evaluator performs the CEM work units associated with the ATE_IND.1 SAR. Specific testing requirements and EAs are captured for each SFR in Section 2, Evaluation Activities for SFRs, Section 3, Evaluation Activities for Optional Requirements and Section 4, Evaluation Activities for Selection-Based Requirements.
5.6. AVA: Vulnerability Assessment
5.6.1. Vulnerability Survey (AVA_VAN.1)
While vulnerability analysis is inherently a subjective activity, a minimum level of analysis can be defined and some measure of objectivity and repeatability (or at least comparability) can be imposed on the vulnerability analysis process. In order to achieve such objectivity and repeatability it is important that the evaluator follows a set of well-defined activities, and documents their findings so others can follow their arguments and come to the same conclusions as the evaluator. While this does not guarantee that different evaluation facilities will identify exactly the same type of vulnerabilities or come to exactly the same conclusions, the approach defines the minimum level of analysis and the scope of that analysis, and provides Certification Bodies a measure of assurance that the minimum level of analysis is being performed by the evaluation facilities.
In order to meet these goals, some refinement of the AVA_VAN.1 CEM work units is needed. Table 5, “Mapping of AVA_VAN.1 CEM Work Units to Evaluation Activities” indicates, for each work unit in AVA_VAN.1, whether the CEM work unit is to be performed as written or whether it has been clarified by an Evaluation Activity. Where clarification has been provided, the table includes a reference to that clarification.
CEM AVA_VAN.1 Work Units | Evaluation Activities |
---|---|
AVA_VAN.1-1 The evaluator shall examine the TOE to determine that the test configuration is consistent with the configuration under evaluation as specified in the ST. | The evaluator shall perform the CEM activity as specified. |
AVA_VAN.1-2 The evaluator shall examine the TOE to determine that it has been installed properly and is in a known state. | The evaluator shall perform the CEM activity as specified. |
AVA_VAN.1-3 The evaluator shall examine sources of information publicly available to identify potential vulnerabilities in the TOE. | Replace the CEM work unit with the activities outlined in Section A.1, “Sources of vulnerability information”. |
AVA_VAN.1-4 The evaluator shall record in the ETR the identified potential vulnerabilities that are candidates for testing and applicable to the TOE in its operational environment. | Replace the CEM work unit with the analysis activities on the list of potential vulnerabilities in Section A.1, “Sources of vulnerability information”, and documentation as specified in Section A.3, “Reporting”. |
AVA_VAN.1-5 The evaluator shall devise penetration tests, based on the independent search for potential vulnerabilities. | Replace the CEM work unit with the activities specified in Section A.2, “Process for Evaluator Vulnerability Analysis”. |
AVA_VAN.1-6 The evaluator shall produce penetration test documentation for the tests based on the list of potential vulnerabilities in sufficient detail to enable the tests to be repeatable. | Replace the CEM work unit with the activities specified in Section A.3, “Reporting”. |
AVA_VAN.1-7 The evaluator shall conduct penetration testing. | The evaluator shall perform the CEM activity as specified. See Section A.3, “Reporting”, for guidance related to attack potential for confirmed flaws. |
AVA_VAN.1-8 The evaluator shall record the actual results of the penetration tests. | The evaluator shall perform the CEM activity as specified. |
AVA_VAN.1-9 The evaluator shall report in the ETR the evaluator penetration testing effort, outlining the testing approach, configuration, depth and results. | Replace the CEM work unit with the reporting called for in Section A.3, “Reporting”. |
AVA_VAN.1-10 The evaluator shall examine the results of all penetration testing to determine that the TOE, in its operational environment, is resistant to an attacker possessing a Basic attack potential. | This work unit is not applicable for Type 1 and Type 2 flaws (as defined in Section A.1, “Sources of vulnerability information”), as inclusion in this Supporting Document by the iTC makes any confirmed vulnerabilities stemming from these flaws subject to an attacker possessing a Basic attack potential. This work unit is replaced for Type 3 and Type 4 flaws by the activities defined in Section A.3, “Reporting”. |
AVA_VAN.1-11 The evaluator shall report in the ETR all exploitable vulnerabilities and residual vulnerabilities, detailing for each: | Replace the CEM work unit with the reporting called for in Section A.3, “Reporting”. |
Because of the level of detail required for the evaluation activities, the bulk of the instructions are contained in Appendix A, while an "outline" of the evaluation activity is provided below.
5.6.1.1. Evaluation Activity (Documentation):
The developer shall provide documentation identifying the list of software and hardware components that compose the TOE. The hardware component list applies to all systems claimed in the ST and should identify, at a minimum, the processors used by the TOE. Software components include any libraries used by the TOE, such as cryptographic libraries. This additional documentation is merely a list of the names and version numbers of the components, and will be used by the evaluators in formulating hypotheses during their analysis.
The evaluator shall examine the documentation outlined below provided by the vendor to confirm that it contains all required information. This documentation is in addition to the documentation already required to be supplied in response to the EAs listed previously.
In addition to the activities specified by the CEM in accordance with Table 5, “Mapping of AVA_VAN.1 CEM Work Units to Evaluation Activities” above, the evaluator shall perform the following activities.
5.6.1.2. Evaluation Activity
The evaluator formulates hypotheses in accordance with the process defined in Section A.1 and documents the flaw hypotheses generated for the TOE in the report in accordance with the guidelines in Section A.3. The evaluator shall perform vulnerability analysis in accordance with Section A.2. The results of the analysis shall be documented in the report according to Section A.3.
6. Required Supplementary Information
This Supporting Document refers in various places to the possibility that 'required supplementary information' may need to be supplied as part of the deliverables for an evaluation. This term describes information that is not necessarily included in the Security Target or operational guidance, and that may not necessarily be public. Examples of such information include an entropy analysis or a description of a cryptographic key management architecture used in (or in support of) the TOE. The requirement for any such supplementary information is identified in the relevant cPP.
The supplementary information required by this SD is:
- Entropy Documentation, which is evaluated against the guidance specified in Appendix D of the [DSC cPP].
- A Key Management Document (KMD), which is evaluated against all relevant KMD Evaluation Activities in this SD.
7. References
ID | Title | Source |
---|---|---|
[CC1] | Common Criteria for Information Technology Security Evaluation, Part 1: Introduction and General Model, CCMB-2022-11-001, CC:2022 Revision 1, November 2022 | |
[CC2] | Common Criteria for Information Technology Security Evaluation, Part 2: Security Functional Components, CCMB-2022-11-002, CC:2022 Revision 1, November 2022 | |
[CC3] | Common Criteria for Information Technology Security Evaluation, Part 3: Security Assurance Components, CCMB-2022-11-003, CC:2022 Revision 1, November 2022 | |
[CC4] | Common Criteria for Information Technology Security Evaluation, Part 4: Framework for the specification of evaluation methods and activities, CCMB-2022-11-004, CC:2022 Revision 1, November 2022 | |
[CC5] | Common Criteria for Information Technology Security Evaluation, Part 5: Pre-defined packages of security requirements, CCMB-2022-11-005, CC:2022 Revision 1, November 2022 | |
[CEM] | Common Methodology for Information Technology Security Evaluation, CCMB-2022-11-006, CC:2022 Revision 1, November 2022 | |
[CC-E&I] | Errata and Interpretation for CC:2022 (Release 1) and CEM:2022 (Release 1), 001, Version 1.1, February 1, 2024 | |
[ISO-TR] | ISO/IEC 24759-3:2017, Information Technology - Security Techniques - Test Requirements for Cryptographic Modules, ISO/IEC, 2017 | |
[DSC cPP] | collaborative Protection Profile for Dedicated Security Component, Version 2.0, October 28, 2024 | |
Appendix A: Vulnerability Analysis
A.1. Sources of vulnerability information
CEM work unit AVA_VAN.1-3 has been supplemented in this Supporting Document to provide a better-defined set of flaws to investigate and procedures to follow based on this particular technology. Terminology used is based on the flaw hypothesis methodology, where the evaluation team hypothesizes flaws and then either proves or disproves those flaws (a flaw is equivalent to a "potential vulnerability" as used in the CEM). Flaws are categorized into four "types" depending on how they are formulated:
- Type 1: A list of flaw hypotheses applicable to the technology described by the [DSC cPP], derived from public sources as documented in Section A.1.1; this fixed set has been agreed to by the iTC. Additionally, it is supplemented with entries from a set of public sources (as indicated below) that are directly applicable to the TOE or its identified components (as defined by the process in Section A.1.1 below); this ensures that the evaluators include in their assessment applicable entries that have been discovered since the [DSC cPP] was published;
- Type 2: A list of flaw hypotheses contained in this document that are derived from lessons learned specific to this technology and other iTC input (which might be derived from other open sources and vulnerability databases, for example), as documented in Section A.1.2;
- Type 3: A list of flaw hypotheses derived from information available to the evaluators; this includes the baseline evidence provided by the vendor described in this Supporting Document (documentation associated with EAs, documentation described in Section 5.6.1.1, documentation described in Section 6), as well as other information (public or based on evaluator experience), as documented in Section A.1.3; and
- Type 4: A list of flaw hypotheses generated through the use of iTC-defined tools (e.g. network mapping utilities, protocol testers), whose application is specified in Section A.1.4.
A.1.1. Type 1 Hypotheses—Public-Vulnerability-based
The following public sources of vulnerability information were selected by the iTC (in no particular order):
- Common Vulnerabilities and Exposures: https://cve.mitre.org/cve/
- The National Vulnerability Database: https://nvd.nist.gov/
- CVE Details: https://www.cvedetails.com/vulnerability-search.php
- Tipping Point Zero Day Initiative: https://www.zerodayinitiative.com/advisories
- Offensive Security Exploit Database: https://www.exploit-db.com/
The list of sources above should be searched with the following search terms:
- Vendor name
- Product name
- Third-party or OEM components, if known
- If the DSC is embedded in specific commercially-available products, any terms relevant to a sample of those products
In order to successfully complete this activity, the evaluator will use the developer-provided list of all third-party library information used as part of the product, along with versions and any other identifying information (this is required in the [DSC cPP] as part of the ASE_TSS.1.1C requirement). This also applies to hardware (chipsets, etc.) that a vendor uses as part of the TOE. This TOE-unique information will be used in the evaluator’s search terms in addition to those listed above.
The evaluator will also consider the requirements that are chosen and the appropriate guidance that is tied to each requirement.
In order to supplement this list, the evaluators shall also perform a search on the sources listed above to determine a list of potential flaw hypotheses that are more recent than the publication date of the [DSC cPP], and those that are specific to the TOE and its components as specified by the additional documentation mentioned above. Any duplicates (either in a specific entry, or in a flaw hypothesis generated from an entry in the same or a different source) can be noted and removed from consideration by the evaluation team.
As part of type 1 flaw hypothesis generation for the specific components of the TOE, the evaluator shall also search the component manufacturer’s websites to determine if flaw hypotheses can be generated on this basis (for instance, if security patches have been released for the version of the component being evaluated, the subject of those patches may form the basis for a flaw hypothesis).
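As an illustration only, the public-source search described above can be partially automated. The sketch below builds keyword queries for the NVD REST API; the endpoint and `keywordSearch` parameter exist in NVD API v2.0, but the exact request shape should be verified against the current NVD documentation, and the vendor/product names used here are placeholders, not real products.

```python
# Illustrative sketch: build one NVD keyword query per A.1.1 search term.
# The endpoint and keywordSearch parameter are from NVD API v2.0; verify
# the exact request shape against current NVD documentation before use.
from urllib.parse import urlencode

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def build_queries(vendor: str, product: str, components: list[str]) -> list[str]:
    """Return one NVD query URL per search term required by Section A.1.1."""
    # Terms: vendor name, product name, then third-party/OEM components.
    terms = [vendor, product, *components]
    return [f"{NVD_API}?{urlencode({'keywordSearch': t})}" for t in terms]

# Placeholder names; each URL can then be fetched (e.g. with urllib.request)
# and the returned CVE entries triaged into flaw hypotheses, with duplicates
# noted and removed as described above.
urls = build_queries("ExampleVendor", "ExampleDSC", ["ExampleCryptoLib"])
```

Note that per Section A.1.1 the results must also be filtered for entries newer than the [DSC cPP] publication date; that filtering step is left out of this sketch.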
A.1.2. Type 2 Hypotheses—iTC-Sourced
The following list of flaw hypotheses generated by the iTC for this technology must be considered by the evaluation team in performing the vulnerability assessment.
There are currently no iTC-sourced flaw hypotheses. A future revision of the [DSC cPP] may update this section for relevant findings made by evaluation laboratories.
If the evaluators discover a Type 3 or Type 4 flaw that they believe should be considered as a Type 2 flaw in future versions of this [DSC cPP], they should work with their Certification Body to determine the appropriate means of submitting the flaw for consideration by the iTC.
A.1.3. Type 3 Hypotheses—Evaluation-Team-Generated
The iTC has leveraged the expertise of developers and evaluation labs to develop the appropriate search terms and vulnerability databases, and has considered the iTC-sourced hypotheses the evaluators should use based upon the applicable use case and the threats to be mitigated by the SFRs. It is therefore the intent of the iTC that the evaluation focus its effort on Type 1 and Type 2 hypotheses; the iTC has decided that Type 3 hypotheses are not necessary.
However, if the evaluators discover a Type 3 potential flaw that they believe should be considered, they should work with their Certification Body to determine the feasibility of pursuing the hypothesis. The Certification Body may determine whether the potential flaw hypothesis is worth submitting to the iTC for consideration as a Type 2 hypothesis in future drafts of the [DSC cPP]/SD.
A.1.4. Type 4 Hypotheses—Tool-Generated
The iTC has called out the tools that should be used as part of the Type 2 hypothesis process; the use of tools is therefore covered within the Type 2 construct, and the iTC does not see any additional tools as necessary. The use case for Version 2 of this [DSC cPP] is rather straightforward: the device is found in a powered-down state and has not been subjected to revisit/"evil maid" attacks. Since this use case is so narrow, and is not a typical model for penetration or fuzz testing, the normal types of testing do not apply. The relevant types of tools are accordingly referenced under Type 2.
A.2. Process for Evaluator Vulnerability Analysis
As flaw hypotheses are generated from the activities described above, the evaluation team will disposition them; that is, attempt to prove, disprove, or determine the non-applicability of the hypotheses. This process is as follows.
The evaluator shall refine each flaw hypothesis for the TOE and attempt to disprove it using the information provided by the developer or through penetration testing. During this process, the evaluator is free to interact directly with the developer to determine whether the flaw exists, including requests to the developer for additional evidence (e.g., detailed design information, consultation with engineering staff); however, the Certification Body (CB) should be included in these discussions. Should the developer object that the requested information is not compatible with the overall level of the evaluation activity/[DSC cPP], and be unable to provide evidence otherwise that the flaw is disproved, the evaluator prepares an appropriate set of materials as follows:
- The source documents used in formulating the hypothesis, and why it represents a potential compromise against a specific TOE function;
- An argument why the flaw hypothesis could not be proven or disproved by the evidence provided so far; and
- The type of information required to investigate the flaw hypothesis further.
The Certification Body (CB) will then either approve or disapprove the request for additional information. If approved, the developer provides the requested evidence to disprove the flaw hypothesis (or acknowledges the flaw).
For each hypothesis, the evaluator shall note whether the flaw hypothesis has been successfully disproved, successfully proven to have identified a flaw, or requires further investigation. It is important to have the results documented as outlined in Section A.3 below.
If the evaluator finds a flaw, it must be reported to the developer. All reported flaws must be addressed as follows:
If the developer confirms that the flaw exists and that it is exploitable at Basic Attack Potential, then a change is made by the developer, and the resulting resolution is agreed by the evaluator and noted as part of the evaluation report.
If the developer, the evaluator, and the CB agree that the flaw is exploitable only above Basic Attack Potential and does not require resolution for any other reason, then no change is made and the flaw is noted as a residual vulnerability in the CB-internal report (ETR).
If the developer and evaluator agree that the flaw is exploitable only above Basic Attack Potential, but it is deemed critical to fix because of specific aspects such as typical use cases or operational environments, then a change is made by the developer, and the resulting resolution is agreed by the evaluator and noted as part of the evaluation report.
Disagreements between evaluator and vendor regarding questions of the existence of a flaw, its attack potential, or whether it should be deemed critical to fix are resolved by the CB.
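The three resolution rules above can be sketched as a simple decision function. This is a minimal illustration; the function name, parameter names, and outcome strings are hypothetical, not terminology from this SD.

```python
# Sketch of the flaw-resolution rules as a decision table.
# All identifiers and outcome strings are illustrative only.

def disposition(confirmed: bool, within_basic_ap: bool, critical_to_fix: bool) -> str:
    """Map the agreed assessment of a reported flaw to the required outcome.

    confirmed       -- the developer confirms the flaw exists
    within_basic_ap -- the flaw is exploitable at Basic Attack Potential
    critical_to_fix -- deemed critical to fix despite a higher attack potential
    """
    if not confirmed:
        return "no action (flaw disproved)"
    if within_basic_ap or critical_to_fix:
        # Developer makes a change; the resolution is agreed by the
        # evaluator and noted as part of the evaluation report.
        return "fix and record in evaluation report"
    # Exploitable only above Basic Attack Potential and not deemed critical.
    return "record as residual vulnerability in CB-internal ETR"
```

As the text notes, the inputs to this decision (existence, attack potential, criticality) must be agreed among developer, evaluator, and CB, with the CB resolving disagreements.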
Any testing performed by the evaluator shall be documented in the test report as outlined in Section A.3 below.
As indicated in Section A.3, the public statement with respect to vulnerability analysis performed on TOEs conformant to the [DSC cPP] is constrained to coverage of flaws associated with Type 1 and Type 2 flaw hypotheses (defined in Section A.1) only. The fact that the iTC generates these candidate hypotheses indicates that they must be addressed.
A.3. Reporting
The evaluators shall produce two reports on the testing effort: one that is public-facing (that is, included in the non-proprietary evaluation report, which is a subset of the Evaluation Technical Report (ETR)), and the complete ETR that is delivered to the overseeing CB.
The public-facing report contains:
- The flaw identifiers returned when the procedures for searching public sources were followed according to instructions in the Supporting Document per Section A.1.1;
- A statement that the evaluators have examined the Type 1 flaw hypotheses specified in this Supporting Document in Section A.1.1 (i.e. the flaws listed in the previous bullet) and the Type 2 flaw hypotheses specified in this Supporting Document by the iTC in Section A.1.2.
No other information is provided in the public-facing report.
The internal CB report contains, in addition to the information in the public-facing report:
- A list of all of the flaw hypotheses generated (cf. AVA_VAN.1-4);
- The evaluator penetration testing effort, outlining the testing approach, configuration, depth and results (cf. AVA_VAN.1-9);
- All documentation used to generate the flaw hypotheses:
  - The documentation shall be characterized so that a reader can determine whether it is strictly required by this Supporting Document, and the nature of the documentation (design information, developer engineering notebooks, etc.);
- All exploitable vulnerabilities and residual vulnerabilities, detailing for each:
  - Its source (e.g. CEM activity being undertaken when it was conceived, known to the evaluator, read in a publication);
  - The SFRs not met;
  - A description;
  - Whether it is exploitable in its operational environment or not (i.e. exploitable or residual); and
  - The amount of time, level of expertise, level of knowledge of the TOE, level of opportunity and the equipment required to exploit the identified vulnerabilities (cf. AVA_VAN.1-11);
- How each flaw hypothesis was resolved (this includes whether the original flaw hypothesis was confirmed or disproved, and any analysis relating to whether a residual vulnerability is exploitable by an attacker with Basic Attack Potential) (cf. AVA_VAN.1-10); and
- In the case that actual testing was performed in the investigation (either as part of flaw hypothesis generation using tools specified by the iTC in Section A.1.4, or in proving/disproving a particular flaw), the steps followed in setting up the TOE (and any required test equipment), executing the test, and the post-test procedures, together with the actual results, to a level of detail that allows repetition of the test, including the following:
  - Identification of the potential vulnerability the TOE is being tested for;
  - Instructions to connect and set up all required test equipment as required to conduct the penetration test;
  - Instructions to establish all penetration test prerequisite initial conditions;
  - Instructions to stimulate the TSF;
  - Instructions for observing the behaviour of the TSF;
  - Descriptions of all expected results and the necessary analysis to be performed on the observed behaviour for comparison against expected results; and
  - Instructions to conclude the test and establish the necessary post-test state for the TOE (cf. AVA_VAN.1-6, AVA_VAN.1-8).
Appendix B: Equivalency Considerations
B.1. Introduction
This appendix provides a foundation for evaluators to determine whether a vendor’s request for equivalency of products is allowed.
For the purpose of this evaluation, equivalency can be broken into two categories:
- Variations in models: Separate TOE models/variations may include differences that could necessitate separate testing across each model. If there are no variations in any of the categories listed below, the models may be considered equivalent.
- Variations in TOE dependencies on the environment (e.g., the OS/platform on which the product is tested): The manner in which a TOE provides functionality (or the functionality itself) may vary depending upon the environment in which it is installed. If there is no difference in the TOE-provided functionality or in the manner in which the TOE provides it, the models may be considered equivalent.
Determination of equivalency for each of the above specified categories can result in several different testing outcomes.
If a set of TOE models is determined to be equivalent, testing may be performed on a single variation of the TOE. However, if the TOE variations have security-relevant functional differences, each TOE model that exhibits either functional or structural differences must be separately tested. Generally speaking, only the differences between TOE variations must be separately tested; other, equivalent functionality may be tested on a representative model rather than across multiple platforms.
If it is determined that a TOE operates the same regardless of the environment, testing may be performed on a single instance for all equivalent configurations. However, if the TOE is determined to provide environment-specific functionality, testing must take place in each environment for which a difference in functionality exists. Similar to the above scenario, only the functionality affected by environment differences must be retested.
If a vendor disagrees with the evaluator’s assessment of equivalency, the Scheme arbitrates between the two parties whether equivalency exists.
B.2. Evaluator guidance for determining equivalence
B.2.1. Strategy
When performing the equivalency analysis, the evaluator should consider each factor independently. A factor may be any number of things at various levels of abstraction, ranging from the processor a device uses to the underlying operating system and hardware platform a software application relies upon. Examples include the various chip sets employed by the product, the type of network interface (different device drivers), and the storage media (solid-state drive, spinning disk, EEPROM). It is important to consider how differences in these factors may influence the TOE’s ability to enforce the SFRs. Each analysis of an individual factor will result in one of two outcomes:
- For the particular factor, all variations of the TOE on all supported platforms are equivalent. In this case, testing may be performed on a single model in a single test environment to cover all supported models and environments.
- For the particular factor, a subset of the product has been identified as requiring separate testing to ensure that it operates identically to all other equivalent TOEs. The analysis would identify the specific combinations of models/testing environments that need to be tested.
Complete CC testing of the product would encompass the totality of each individual analysis performed for each of the identified factors.
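The per-factor strategy above can be sketched as follows. This is a hypothetical illustration only: the factor names, the data layout, and the notion of a single "representative" configuration are assumptions made for the sketch, not terminology from this SD.

```python
# Sketch: per-factor equivalency analysis. The required test set is the
# union, over all factors, of each factor's non-equivalent variations
# (equivalent factors are covered by one representative configuration).

def combinations_to_test(factors: dict[str, list[str]],
                         all_equivalent: dict[str, bool],
                         representative: str) -> set[tuple[str, str]]:
    """Compute the (factor, variation) pairs that must be exercised.

    factors        -- factor name -> its variations across models/environments
    all_equivalent -- factor name -> True if every variation was judged equivalent
    representative -- the single model/environment used for equivalent factors
    """
    to_test: set[tuple[str, str]] = set()
    for name, variations in factors.items():
        if all_equivalent[name]:
            # One representative model in one environment covers this factor.
            to_test.add((name, representative))
        else:
            # Each non-equivalent variation requires its own testing.
            to_test.update((name, v) for v in variations)
    return to_test

# Example: chipsets differ in a security-relevant way; storage media do not.
needed = combinations_to_test(
    {"chipset": ["A", "B"], "storage": ["SSD", "EEPROM"]},
    {"chipset": False, "storage": True},
    "model-1",
)
```

The union over factors mirrors the statement above that complete CC testing encompasses the totality of each individual per-factor analysis.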
B.3. Test presentation/Truth in advertising
In addition to determining what to test, the evaluation results and resulting validation report must identify the actual module and testing environment combinations that were tested. The analysis used to determine the testing subset may be considered proprietary and need not be included publicly.