6.10 eCPRI
6.10.0 Overview
This clause emphasizes the importance of securing the eCPRI protocol to ensure robust and reliable communication within the O-RAN system. The tests target multiple dimensions of eCPRI's security framework, from session management to auditing capabilities.
6.10.1 eCPRI Session Management
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_SESSION_MANAGEMENT
Test description and applicability
Purpose: The purpose of this test is to verify that the eCPRI protocol properly manages sessions and prevents session-related vulnerabilities.
Test setup and configuration
• eCPRI API is accessible.
• Authentication credentials are available.
Test procedure
1) Positive Case:
a) Authenticate with the eCPRI API and establish a session.
b) Perform valid API requests within the session.
c) Verify that the session remains active and valid for a reasonable duration.
d) Perform subsequent API requests using the same session.
e) Verify that the API responds with the expected results without re-authentication.
2) Negative Case:
a) Authenticate with the eCPRI API and establish a session.
b) Wait for the session to expire or become inactive.
c) Attempt to perform API requests using the expired or inactive session.
d) Verify that the API responds with an appropriate error message.
Expected Result: The eCPRI protocol manages sessions effectively, allowing authorized requests to be performed within valid sessions while preventing unauthorized access to expired or inactive sessions.
Expected format of evidence
• Test log: A log file documenting the steps performed during the test, including session establishment, API requests, and responses.
• Screenshots: Screenshots of the API responses showing successful session establishment and subsequent API interactions.
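The following informative sketch (Python, using the widely available requests library) illustrates one way the positive and negative cases could be automated. It assumes, purely for illustration, that the DUT exposes an HTTP-based management endpoint for eCPRI session control; the URL, credential fields, token field and session lifetime are hypothetical test-bench values, not defined by the eCPRI specification.

    # Illustrative sketch only; endpoint names, token field and lifetime are assumptions.
    import time
    import requests

    MGMT_URL = "https://dut.example.internal"               # hypothetical DUT address
    CREDS = {"username": "tester", "password": "secret"}    # test credentials

    # Positive case: establish a session and reuse it without re-authentication.
    resp = requests.post(f"{MGMT_URL}/auth", json=CREDS, verify=False, timeout=10)
    resp.raise_for_status()
    token = resp.json()["token"]                            # hypothetical token field
    headers = {"Authorization": f"Bearer {token}"}
    for _ in range(3):
        r = requests.get(f"{MGMT_URL}/ecpri/status", headers=headers, timeout=10)
        assert r.status_code == 200, "valid session was rejected"

    # Negative case: wait past the configured session lifetime, then retry.
    SESSION_LIFETIME_S = 300                                # assumed DUT setting
    time.sleep(SESSION_LIFETIME_S + 30)
    r = requests.get(f"{MGMT_URL}/ecpri/status", headers=headers, timeout=10)
    assert r.status_code in (401, 403), "expired session was still accepted"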
6.10.2 eCPRI Input Validation
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_INPUT_VALIDATION
Test description and applicability
Purpose: The purpose of this test is to ensure that the eCPRI protocol properly validates and sanitizes user input to prevent common security vulnerabilities such as injection attacks.
Test setup and configuration
• eCPRI API is accessible.
• Input fields requiring validation are identified.
Test procedure
1) Positive Case:
a) Send API requests with valid and expected input values.
b) Verify that the API processes the requests successfully and provides the expected responses.
2) Negative Case:
a) Generate API requests by systematically applying fuzzing techniques to introduce deliberately malicious input values containing potential security threats.
b) Verify that the eCPRI API detects and rejects the malicious input, responding with appropriate error messages or status codes.
NOTE: Ensuring comprehensive coverage against malicious inputs is challenging due to the boundless variety of potential inputs. A more pragmatic approach is to adopt a risk-focused testing strategy. This method emphasizes inputs that pose significant threats to security. Such inputs commonly fall under categories like data breaches (inputs that might unveil confidential information, encryption keys, or credentials), unauthorized entry (inputs that could circumvent authentication or exploit privileges to gain unauthorized access), and system infiltration (inputs that might activate code execution vulnerabilities). It is worth noting that fuzzing tools play a pivotal role in generating malicious inputs for APIs to pinpoint potential vulnerabilities. The importance of the fuzzing methodology has been accentuated in this context, and it is relevant to all O-RAN APIs (e.g. SCTP, eCPRI and RESTful APIs).
Expected Result: The eCPRI protocol validates and sanitizes user input to prevent security vulnerabilities related to improper input handling.
Expected format of evidence
• Test log: A log file documenting the requests sent to the API, including valid and malicious inputs.
• Screenshots: Screenshots of the API responses showing the handling of valid inputs and appropriate error messages for malicious inputs.
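As a complement to the procedure above, the sketch below shows a minimal random-input generator for eCPRI PDUs in Python. It assumes the DUT accepts eCPRI over UDP/IP on a test-bench port (eCPRI is more commonly carried directly over Ethernet); the DUT address and port are hypothetical, and a real campaign would use a protocol-aware fuzzer.

    # Illustrative sketch only; transport, address and port are assumptions.
    import os
    import random
    import socket
    import struct

    DUT_ADDR = ("192.0.2.10", 49152)   # hypothetical DUT IP and eCPRI-over-UDP port

    def random_ecpri_pdu() -> bytes:
        """eCPRI common header (revision/reserved/C bit, message type, payload size)
        with randomized fields, followed by a random payload."""
        first_byte = random.randrange(256)
        msg_type = random.randrange(256)            # includes undefined message types
        claimed_size = random.randrange(65536)      # deliberately inconsistent length field
        payload = os.urandom(random.randrange(1500))
        return struct.pack("!BBH", first_byte, msg_type, claimed_size) + payload

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(10_000):
        sock.sendto(random_ecpri_pdu(), DUT_ADDR)
    # DUT health (process liveness, error counters, log entries) is then checked
    # out of band, e.g. over the M-Plane, to confirm malformed PDUs were rejected.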
6.10.3 eCPRI Error Handling
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3].
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_ERROR_HANDLING
Test description and applicability
Purpose: The purpose of this test is to ensure that the eCPRI protocol handles errors securely and does not disclose sensitive information.
Test setup and configuration
• eCPRI API is accessible.
• Various error scenarios are identified.
Test procedure
1) Attempt to force error conditions by sending unexpected or malicious requests, or by simulating a high-latency or slow network connection between the client and the eCPRI API server.
2) Verify that the eCPRI API detects and handles the errors appropriately, responding with informative error messages without revealing sensitive information.
3) Validate that the error messages provide helpful and actionable information for troubleshooting.
4) Restore normal connectivity.
5) Resend a normal request to the eCPRI API.
6) Verify that the API processes the request successfully and provides the expected response.
Expected Result: The eCPRI protocol handles errors securely, providing meaningful error messages without disclosing sensitive information and recovering seamlessly when the connection is restored.
Expected format of evidence
• Screenshots: Screenshots of the error messages or status codes received from the API in response to triggered errors.
• Test log: A log file documenting the requests and responses during error scenarios.
6.10.4 eCPRI Access Control
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3].
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_ACCESS_CONTROL
Test description and applicability
Purpose: The purpose of this test is to verify that the eCPRI protocol enforces access controls consistently across all relevant resources and endpoints.
Test setup and configuration
• eCPRI API is accessible.
• User roles and permissions are defined.
Test procedure
1) Positive Case:
a) Authenticate with different roles.
b) Send requests to various API endpoints associated with different levels of access rights.
c) Verify that the API allows access to authorized resources and returns the expected results.
d) Repeat the test with different authenticated user roles and ensure consistent access control enforcement.
2) Negative Case:
a) Attempt to access resources or perform actions that require higher access privileges than the authenticated user possesses.
b) Verify that the eCPRI API responds with appropriate access control-related error messages or status codes.
c) Repeat the test with different scenarios and confirm consistent behaviour.
Expected Result: The eCPRI protocol enforces access controls consistently, granting access only to authorized users based on their assigned roles and permissions.
Expected format of evidence
• Test log: A log file documenting the user authentication process, access requests, and the responses received from the API.
• Screenshots: Screenshots of successful access to authorized resources and error messages for unauthorized access attempts.
6.10.5 eCPRI Logging and Auditing
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3].
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_LOGGING_AUDITING
Test description and applicability
Purpose: The purpose of this test is to validate that the eCPRI protocol logs relevant security events and activities and supports auditing capabilities.
Test setup and configuration
• eCPRI API is accessible.
• Logging and auditing mechanisms are enabled and configured.
Test procedure
1) Perform various API actions (e.g. authentication, access control, data retrieval, configuration changes).
2) Verify that the eCPRI API generates appropriate log entries for each action, capturing relevant security-related information.
3) Access and review the generated logs to ensure they contain the necessary details for security auditing purposes.
Expected Result: The eCPRI protocol generates accurate and tamper-resistant logs, recording security-related events and activities for auditing and forensic analysis.
Expected format of evidence
• Log files: The generated log files containing recorded security events and activities during the testing process.
• Screenshots: Screenshots of log entries highlighting relevant security events and timestamps.
6.10.6 eCPRI Timeout Error Handling
Requirement Name: eCPRI security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-FRHAUL-01, T-FRHAUL-02' clause 7.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3].
DUT/s: O-RU, O-DU
Test Name: TC_eCPRI_TIMEOUT_ERROR_HANDLING
Test description and applicability
Purpose: The purpose of this test is to verify that the eCPRI protocol handles timeout errors gracefully and provides appropriate error messages.
Test setup and configuration
• eCPRI API is running and accessible.
• A request with a long processing time or a simulated delay is prepared.
Test procedure
1) Positive Case:
a) Send a request to the eCPRI API with a normal processing time.
b) Verify that the API responds within a reasonable time frame and provides the expected response.
2) Negative Case:
a) Send a request to the eCPRI API that triggers a timeout condition (e.g. requesting a resource that requires a long processing time).
b) Verify that the API responds with an appropriate error message or status code indicating the timeout condition.
c) Adjust the timeout settings or optimize the processing time.
d) Resend the request to the eCPRI API.
e) Verify that the API processes the request successfully and provides the expected response within the adjusted timeout duration.
Expected Result: The eCPRI protocol handles timeout errors gracefully, providing meaningful error messages or status codes when a request exceeds the configured or reasonable processing time. Once the timeout issue is addressed, the API processes requests within the specified time limits.
Expected format of evidence
• Test log: A log file documenting the requests sent to the eCPRI API and their corresponding responses, including timestamps.
• Screenshots or videos: Screenshots or video recordings showing the requests being sent to the eCPRI API and the received error messages or status codes indicating the timeout error.
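A minimal sketch of the negative case, assuming a hypothetical HTTP-based management endpoint and using a deliberately short client-side timeout to provoke and observe timeout handling, is shown below.

    # Illustrative sketch only; the endpoint URL is an assumption.
    import requests

    SLOW_URL = "https://dut.example.internal/ecpri/long-report"   # hypothetical slow resource

    try:
        r = requests.get(SLOW_URL, verify=False, timeout=1)       # 1 s budget forces a timeout
    except requests.exceptions.Timeout:
        print("Timeout observed, as expected for the negative case")
    else:
        print("Completed within budget:", r.status_code)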
6.11 SCTP
6.11.0 Overview
SCTP is pivotal in ensuring reliable, secure, and efficient communication within O-RAN networks, particularly between various endpoints such as the O-CU, O-DU, and Near-RT RIC. To validate its operational and security robustness, the following test cases are designed, emphasizing diverse facets of SCTP, namely association management, data transfer, authentication, authorization, and resilience against potential threats and attacks.
6.11.1 Void
6.11.2 Void
6.11.3 Void
6.11.4 Void
6.11.5 SCTP DoS Prevention Rate Limiting
Requirement Name: SCTP security
Requirement Reference & Description: 'REQ-SEC-TRAN-1' clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5].
Threat References: 'T-E2-01, T-E2-02, T-E2-03' clause 7.4.1.12 in O-RAN Security Threat Modeling and Risk Assessment [3].
DUT/s: O-CU, O-DU, Near-RT RIC
Test Name: TC_SCTP_DOS_PREVENTION_RATE_LIMITING
Test description and applicability
Purpose: The purpose of this test is to verify that the SCTP protocol effectively handles DoS attacks and prevents resource exhaustion.
Test setup and configuration
• Enable DoS prevention mechanisms.
• The rate limiting parameters, such as the maximum number of connections or allowed data transfer rate, are properly defined.
• Use an SCTP library.
EXAMPLE 1: The sctplib library in the C programming language.
Test procedure
1) Simulate a DoS attack by overwhelming the SCTP protocol with a large number of connection requests (send data at a rate that exceeds the defined rate limiting parameters).
EXAMPLE 2: Sample SCTP code:
for (int i = 0; i < num_connections; i++) {
    int sctp_socket = socket(AF_INET, SOCK_STREAM, IPPROTO_SCTP);
    connect(sctp_socket, (struct sockaddr *)&dut_addr, sizeof(dut_addr)); /* establish associations rapidly, beyond the configured limits */
}
2) Monitor the SCTP protocol's response and behaviour during the excessive connection and data transfer attempts.
Expected Results
• The SCTP protocol detects the excessive usage and applies rate limiting measures to restrict or reject connections or data transfers that exceed the defined limits.
• The system handles the rate limiting effectively, ensuring that resources are not exhausted or overwhelmed.
Expected format of evidence
• Test logs showing successful handling of the DoS attack, such as connection limits or rejection messages.
• System performance metrics or logs indicating the proper handling of excessive connection requests.
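For testers not working in C, the sketch below shows the same rapid-association idea in Python. It requires a Linux host with kernel SCTP support and is subject to local file-descriptor limits; the DUT address and connection count are hypothetical, and a realistic test would distribute the load across several source hosts.

    # Illustrative sketch only; address, port and limits are assumptions.
    import socket

    DUT = ("192.0.2.20", 36422)          # hypothetical SCTP endpoint on the DUT
    NUM_CONNECTIONS = 10_000

    open_socks = []
    for i in range(NUM_CONNECTIONS):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
        s.settimeout(1.0)
        try:
            s.connect(DUT)               # INIT / INIT-ACK handshake attempt
            open_socks.append(s)         # hold the association open
        except OSError as exc:
            # Rejections or timeouts here are evidence of rate limiting on the DUT.
            print(f"attempt {i}: rejected ({exc})")
            s.close()
    print(f"{len(open_socks)} associations accepted before limiting took effect")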
6.11.6 SCTP Input Validation
Requirement Name: SCTP security
Requirement Reference & Description: 'REQ-SEC-TRAN-1', clause 5.3.4.1 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-E2-01, T-E2-02, T-E2-03' clause 7.4.1.12 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-CU, O-DU, Near-RT RIC
Test Name: TC_SCTP_INPUT_VALIDATION
Test description and applicability
Purpose: To verify that the SCTP protocol performs proper input validation to prevent security vulnerabilities such as buffer overflows or injection attacks.
Test setup and configuration
• The SCTP protocol is configured with input validation enabled.
• Use an SCTP library.
EXAMPLE 1: The sctplib library in the C programming language.
Test procedure
1) Attempt to establish a connection using the SCTP protocol and provide invalid or malicious input.
EXAMPLE 2: Sample SCTP command: int sctp_socket = socket(AF_INET, SOCK_STREAM, IPPROTO_SCTP);
2) Send data containing invalid or malicious content over the connection.
EXAMPLE 3: Sample SCTP command: sctp_sendmsg(sctp_socket, malicious_data_buffer, data_length, NULL, 0, 0, 0, stream_id, 0, 0);
Expected Results
• The SCTP protocol performs input validation and rejects or sanitizes the invalid or malicious input.
• The connection is not established, or the malicious data is handled safely.
Expected format of evidence
• Test logs showing the rejection or sanitization of invalid or malicious input.
• Output from the application indicating the successful validation and rejection of malicious data.
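A small Python variant of the examples above is sketched below; it assumes a Linux host with kernel SCTP support and a hypothetical DUT address, and uses a plain send() on a one-to-one SCTP socket once the association is established.

    # Illustrative sketch only; the payload and endpoint are assumptions.
    import os
    import socket

    DUT = ("192.0.2.20", 36422)                      # hypothetical SCTP endpoint
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
    s.connect(DUT)
    malicious = b"\x00" * 4 + os.urandom(4096)       # oversized / nonsense payload
    s.send(malicious)
    s.close()
    # The DUT is expected to reject or safely discard the payload; its logs and
    # process state are checked afterwards for crashes or error indications.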
6.11.7 Void
6.11.8 Void
6.12 RESTful
6.12.0 Overview
This clause emphasizes the necessity of robust security controls to safeguard RESTful APIs within the O-RAN NFs against various threats and vulnerabilities. The outlined test cases aim to validate the security mechanisms deployed in RESTful API implementations, ensuring that authentication, authorization, input validation, and secure logging and monitoring are upheld to the highest standards, thereby securing the NFs from malicious actors and potential breaches.
6.12.1 REST API Authentication
Requirement Name: RESTful API protection
Requirement Reference & Description: 'REQ-SEC-O-CLOUD-NotifAPI-1, REQ-SEC-O-CLOUD-NotifAPI-2' clause 5.1.8.9.1.3 [5], 'REQ-SEC-API-1, REQ-SEC-API-2, REQ-SEC-API-3, REQ-SEC-API-4, REQ-SEC-API-5, REQ-SEC-API-6, REQ-SEC-API-8, REQ-SEC-API-9, REQ-SEC-API-10, REQ-SEC-API-13, REQ-SEC-API-15' clause 5.3.10.2 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_REST_API_AUTHENTICATION
Test description and applicability
Purpose: The purpose of this test is to verify the authentication mechanism of an O-RAN NF supporting a RESTful API.
Test setup and configuration
• An O-RAN NF supporting the RESTful API is provisioned and running.
• Access to the O-RAN NF management system or command-line interface.
Test procedure
1) Positive Case:
a) Authenticate using valid credentials or API tokens:
EXAMPLE 1: curl -X POST -H "Content-Type: application/json" -d '{"username":"<username>", "password":"<password>"}' http://<ORAN_IP>/auth
b) Capture the authentication token from the response.
c) Execute an authenticated request against an O-RAN NF resource (e.g. get cell status).
d) Verify that the request is successful and returns the expected response.
2) Negative Case:
a) Attempt to access the O-RAN RESTful API without providing valid authentication credentials:
EXAMPLE 2: curl http://<ORAN_IP>/cell-status
b) Verify that the request fails and returns an unauthorized response.
Expected Results
1) Positive Case:
- Authentication using valid credentials or API tokens is successful.
- Authorized requests to O-RAN NF resources return the expected responses.
2) Negative Case:
- Requests without valid authentication credentials are rejected with an unauthorized response.
Expected format of evidence
• Screenshots or logs showing the successful authentication and authorized requests.
• Screenshots or logs showing the failed authentication attempts.
6.12.2 REST Authorization and Access Control
Requirement Name: RESTful API protection
Requirement Reference & Description: 'REQ-SEC-O-CLOUD-NotifAPI-1, REQ-SEC-O-CLOUD-NotifAPI-2' clause 5.1.8.9.1.3 [5], 'REQ-SEC-API-1, REQ-SEC-API-2, REQ-SEC-API-3, REQ-SEC-API-4, REQ-SEC-API-5, REQ-SEC-API-6, REQ-SEC-API-8, REQ-SEC-API-9, REQ-SEC-API-10, REQ-SEC-API-13, REQ-SEC-API-15' clause 5.3.10.2 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_REST_AUTHORIZATION_ACCESS_CONTROL
Test description and applicability
Purpose: The purpose of this test is to ensure that the RESTful API enforces proper authorization and access control mechanisms.
Test setup and configuration
• An O-RAN NF supporting the RESTful API is provisioned and running.
• Access to the O-RAN NF management system or command-line interface.
• User roles and permissions are defined and configured.
Test procedure
1) Positive Case:
a) Authenticate using credentials associated with a user assigned to a role with necessary permissions:
EXAMPLE 1: curl -X POST -H "Content-Type: application/json" -d '{"username":"<username>", "password":"<password>"}' http://<ORAN_IP>/auth
b) Capture the authentication token from the response.
c) Execute a request that requires the permissions granted by the user's role (e.g. update configuration).
d) Verify that the request is successful and returns the expected response.
2) Negative Case:
a) Authenticate using credentials associated with a user not assigned to a role with necessary permissions:
EXAMPLE 2: curl -X POST -H "Content-Type: application/json" -d '{"username":"<username>", "password":"<password>"}' http://<ORAN_IP>/auth
b) Capture the authentication token from the response.
c) Execute a request that requires permissions beyond the user's role (e.g. perform a restricted operation).
d) Verify that the request fails and returns a forbidden response.
Expected Results
1) Positive Case:
- Users with appropriate roles and permissions can perform authorized actions.
- Requests requiring specific permissions return the expected responses.
2) Negative Case:
- Users without necessary roles or permissions are restricted from performing unauthorized actions.
- Requests requiring permissions beyond the user's role return a forbidden response.
Expected format of evidence
• Screenshots or logs showing the successful authorization and access control enforcement.
• Screenshots or logs showing the failed authorization attempts.
6.12.3 REST Input Validation and Sanitization
Requirement Name: RESTful API protection
Requirement Reference & Description: 'REQ-SEC-O-CLOUD-NotifAPI-1, REQ-SEC-O-CLOUD-NotifAPI-2' clause 5.1.8.9.1.3 [5], 'REQ-SEC-API-1, REQ-SEC-API-2, REQ-SEC-API-3, REQ-SEC-API-4, REQ-SEC-API-5, REQ-SEC-API-6, REQ-SEC-API-8, REQ-SEC-API-9, REQ-SEC-API-10, REQ-SEC-API-13, REQ-SEC-API-15' clause 5.3.10.2 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_REST_INPUT_VALIDATION_SANITIZATION
Test description and applicability
Purpose: The purpose of this test is to validate that the RESTful API properly validates and sanitizes input data to prevent common security vulnerabilities.
Test setup and configuration
• An O-RAN NF supporting the RESTful API is provisioned and running.
• Access to the O-RAN NF management system or command-line interface.
Test procedure
1) Positive Case:
a) Construct a valid request with appropriate input data:
EXAMPLE 1: curl -X POST -H "Content-Type: application/json" -d '{"parameter1":"value1", "parameter2":"value2"}' http://<ORAN_IP>/api-endpoint
b) Verify that the request is successful and returns the expected response.
2) Negative Case:
a) Construct a request with invalid or malicious input data:
EXAMPLE 2: curl -X POST -H "Content-Type: application/json" -d '{"parameter1":"<script>alert(1)</script>", "parameter2":"value2"}' http://<ORAN_IP>/api-endpoint
b) Verify that the request fails and returns an error response or rejects the malicious input.
Expected Results
1) Positive Case:
- Requests with valid and appropriate input data are successfully processed.
- Responses from the O-RAN NF RESTful API are as expected.
2) Negative Case:
- Requests with invalid or malicious input data are rejected or handled properly to prevent security vulnerabilities.
Expected format of evidence
• Screenshots or logs showing the successful input validation and sanitization.
• Screenshots or logs showing failed input validation or sanitization attempts.
6.12.4 REST Security Logging and Monitoring
Requirement Name: RESTful API protection
Requirement Reference & Description: 'REQ-SEC-O-CLOUD-NotifAPI-1, REQ-SEC-O-CLOUD-NotifAPI-2' clause 5.1.8.9.1.3 [5], 'REQ-SEC-API-1, REQ-SEC-API-2, REQ-SEC-API-3, REQ-SEC-API-4, REQ-SEC-API-5, REQ-SEC-API-6, REQ-SEC-API-8, REQ-SEC-API-9, REQ-SEC-API-10, REQ-SEC-API-13, REQ-SEC-API-15' clause 5.3.10.2 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_REST_SECURITY_LOGGING_MONITORING
Test description and applicability
Purpose: The purpose of this test is to verify that the O-RAN NF logs and monitors API activities for security and compliance purposes.
Test setup and configuration
• An O-RAN NF supporting the RESTful API is provisioned and running.
• Access to the O-RAN NF management system or command-line interface.
Test procedure
1) Positive Case:
a) Enable API logging and monitoring for the O-RAN NF.
b) Generate a series of API requests and actions.
c) Review the logs or monitoring system for the recorded activities.
2) Negative Case:
a) Attempt unauthorized API actions or exploit security vulnerabilities.
b) Verify that the logs or monitoring system captures and raises alerts for these activities.
Expected Results
1) Positive Case:
- API activities are logged and monitored by the O-RAN NF.
- The logs or monitoring system records the expected API requests and actions.
2) Negative Case:
- Unauthorized or malicious API actions trigger alerts in the logs or monitoring system.
- The logs or monitoring system captures and records failed security attempts.
Expected format of evidence
Screenshots or logs from the O-RAN NF management system showing the successful or failed API logging and monitoring settings.
7 Common Network Security Tests for O-RAN components
7.1 Overview
This clause contains a set of security evaluations that are performed from outside and inside of the network function in a network capacity. It is used to measure the external exposure and risk(s) of the function in place and leverages common techniques used in cyber security to evaluate the risk(s) the device under test faces.
The objects in scope of these network-based security tests are SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-eNB and O-Cloud.
7.2 Network Protocol and Service Enumeration
7.2.1 Network Protocol and Service Enumeration
Requirement Name: Network protocol and service enumeration
Requirement Reference: REQ-SEC-NET-1, clause 5.3.3.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: "A list of network protocols and services supported on the O-RAN component shall be clearly documented by its vendor. Unused protocols shall be disabled."
Threat References: T-O-RAN-01, T-O-RAN-02
DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud
Test name: TC_Network_Protocol_And_Enumeration
Test description and applicability
Purpose: To verify that the list of active network protocols and services on the running O-RAN component is in line with the vendor-provided list of network protocols and services supported by the O-RAN component. Probing of network protocols and services on the running O-RAN component provides the information on whether a service is active or not.
NOTE 1: In practice, such probing is often referred to as network scanning or port scanning.
This test case probes all possible TCP and SCTP ports in the range 0-65535 using a port scanner for the presence of active services. This test case probes all documented UDP ports from the vendor-provided list using a port scanner for the presence of active services. Optionally, additional UDP ports may be scanned as well.
The result of probing the running O-RAN component is a list of active network protocols and services. Each item contains the network protocol (TCP, UDP, SCTP), port number (from the range 0-65535) and service name. If the service type cannot be determined during probing, the service name is "unknown". The service name is in line with the Service Name and Transport Protocol Port Number Registry defined by IANA [i.1]. If the service name is not defined in [i.1], the vendor-provided service name should be used.
NOTE 2: In practice, services may also run on ports different from the ports defined in [i.1].
The O-RAN component configuration influences which network protocols and services are exposed as active. A service that is supported by the O-RAN component may be disabled and therefore be detected during probing as not active. A comparison between the vendor-provided list of all supported network protocols and services and the list of active network protocols and services found by the port scanner is performed.
Test setup and configuration
This test is executed against the running O-RAN component as the DUT.
Test prerequisites:
• Port scanner with capabilities as defined in clause 5.3 of the present document.
• Network access to DUT
• Vendor-provided list of network protocols and services supported by DUT
Test procedure
1) The list of open ports is determined as follows:
- The port scanner scans all TCP ports in the range 0-65535 on the IP interface of the DUT. A TCP SYN/ACK response by the DUT is interpreted as an open port.
- The port scanner scans all SCTP ports in the range 0-65535 on the IP interface of the DUT. An SCTP INIT-ACK response by the DUT is interpreted as an open port.
- All UDP ports documented in the vendor-provided list are interpreted as open ports. Other UDP ports may be considered as open for the purpose of service detection.
NOTE 3: Due to the nature of the UDP protocol, there is no simple method of open port detection similar to the TCP/SCTP methods based on analysis of the response message type (TCP: SYN/ACK, SCTP: INIT-ACK). In the case of UDP, open port detection inevitably relies on service detection, which is discussed in step 2 of this test procedure. In practice, port scans of the entire UDP port range 0-65535 are impractical and time consuming. Typically, service detection is performed only for a subset of UDP ports. UDP port subset selection is arbitrary and not standardized. Service detection in this test procedure is required for UDP ports from the vendor-provided list and is optional for other UDP ports.
2) For each open port from the previous step, the port scanner performs service detection by sending service probe(s) as follows:
- If the open port is listed in the vendor-provided list, the port scanner uses the service probe from its built-in database that exactly matches the service documented in the vendor-provided list.
- If the open port is not listed in the vendor-provided list, the port scanner should use the service probe from its built-in database that exactly matches the service defined in [i.1] for that open port. If such a service is not defined in [i.1], the port scanner may report the service as "unknown". Alternatively, the port scanner may perform further service detection attempts based on other service probes from its built-in database.
NOTE 4: Service detection for open ports that are also listed in the vendor-provided list requires only one probe. Finding any open ports that are not listed in the vendor-provided list means this test case fails. However, service information can be helpful in discussion with the DUT vendor. This test procedure therefore accommodates optional service detection based on one probe or multiple probes.
3) The port scanner shall produce a list of detected active network protocols, ports and services on the DUT.
Expected results
All services found by the port scanner are documented in the vendor-provided list. This test case ends with success if:
• both lists match exactly;
• the list of network protocols and services found by the port scanner has fewer items than the vendor-provided list, and all items found by the port scanner exactly match items from the vendor-provided list.
If any service is found by the port scanner and it is not documented in the vendor-provided list, this test case shall fail. It means that the vendor-provided list is incorrect and an undocumented attack surface exists.
Expected format of evidence: Report file, log files and/or screenshots.
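A minimal sketch of step 1 and the final comparison, restricted to a plain TCP connect sweep in Python, is given below. A real execution would use a full-featured port scanner (TCP SYN scan, SCTP INIT scan, UDP service probes) as described above; the DUT address and the vendor-provided list contents are hypothetical.

    # Illustrative sketch only: TCP connect sweep plus comparison with the vendor list.
    import socket

    DUT_IP = "192.0.2.30"                                            # hypothetical DUT
    VENDOR_LIST = {("tcp", 22, "ssh"), ("tcp", 830, "netconf-ssh")}  # hypothetical

    found = set()
    for port in range(0, 65536):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.2)
            if s.connect_ex((DUT_IP, port)) == 0:    # 0 means the connection was accepted
                found.add(("tcp", port))

    vendor_ports = {(proto, port) for proto, port, _name in VENDOR_LIST}
    undocumented = found - vendor_ports
    print("open TCP ports:", sorted(found))
    print("undocumented (test fails if non-empty):", sorted(undocumented))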
7.3 Password-Based Authentication
7.3.1 Password guessing
Requirement Name: Password-Based Authentication
Requirement Reference: REQ-SEC-PASS-1, clause 5.3.7.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Password guessing protection mechanism is present on the DUT
Threat References: T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06
DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud
Test Name: TC_Password_Guessing
Test description and applicability
Purpose: To verify that the running O-RAN component has protection mechanism(s) implemented to prevent password guessing attacks against services using password-based authentication.
NOTE 1: In practice, brute-forcing and dictionary attacks are the most common classes of password guessing attacks. The traditional approach to brute-forcing and dictionary attacks uses a fixed username with various candidate passwords. Password spraying is another approach that can be combined with brute-forcing and dictionary attacks; a fixed password is tested with various candidate usernames.
An example of a protection mechanism is enforcing a delay before the next authentication attempt(s) by the same client. This test case cannot list all possible techniques that protection mechanisms can use. However, the following list provides an overview of the most common approaches:
• Increase the delay after each unsuccessful authentication attempt.
• Implement challenge-response authentication (example of such measure: CAPTCHA).
• In order to prevent more attempts, impose a temporary lockout on the client when a threshold of consecutive failed authentication attempts is reached. During a defined period of time, all authentication attempts by the locked-out client shall be rejected.
Simulation of password guessing attacks against services on the running O-RAN component provides the information on whether any protection mechanism is present. This test case is run against all services on the running O-RAN component that use password-based authentication. The vendor-provided list of all supported network protocols and services is used as a source.
NOTE 2: The vendor-provided list of all supported network protocols and services may not include specific information about the presence of password-based authentication, as it includes network protocol, port and service name. In practice, only a subset of services from the vendor-provided list will use password-based authentication.
This test case does not mandate any specific list of passwords to be used for testing.
Test setup and configuration
This test is executed against the running O-RAN component as the DUT.
Test prerequisites:
• Valid username for each tested service
• Network access to DUT
• Physical access to DUT (applicable if the DUT is in physical form)
• Vendor-provided list of network protocols and services supported by DUT
Test procedure
1) The list of services using password-based authentication is determined by analysing the vendor-provided list as well as by analysing local services that are not remotely accessible.
2) For the services identified in the previous step, the presence of a protection mechanism is tested as follows:
- a combination of a valid username and an invalid password (or various invalid passwords) is used for authentication repeatedly;
- after a certain number of authentication attempts, the protection mechanism of the DUT is detected;
- the minimum number of authentication attempts is 11;
- the protection mechanism(s) is expected to trigger after 10 authentication attempts or fewer.
EXAMPLE: If the DUT uses a protection mechanism based on delaying authentication attempts, such a delay is observed at the latest when the DUT receives the 11th consecutive invalid authentication attempt.
Expected results
In the context of each of the services using password-based authentication, protection mechanism(s) is present. Applicable to local services and to remotely accessible services.
This test case fails if one or more services using password-based authentication have no protection mechanism present.
Expected format of evidence: Report file, log files and/or screenshots.
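The sketch below illustrates the repeated-attempt loop for a service with an HTTP login endpoint; the endpoint, field names and credentials are hypothetical, and equivalent loops would be built for SSH, NETCONF or other password-protected services on the DUT.

    # Illustrative sketch only; endpoint and field names are assumptions.
    import time
    import requests

    LOGIN_URL = "https://dut.example.internal/auth"      # hypothetical endpoint
    USER = "admin"                                       # valid test username

    for attempt in range(1, 12):                         # at least 11 attempts
        start = time.monotonic()
        r = requests.post(LOGIN_URL,
                          json={"username": USER, "password": f"wrong-{attempt}"},
                          verify=False, timeout=30)
        elapsed = time.monotonic() - start
        print(f"attempt {attempt}: status={r.status_code} time={elapsed:.2f}s")
        # Evidence of a protection mechanism: growing delay, HTTP 429, a CAPTCHA
        # challenge, or a client lockout at or before the 11th attempt.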
7.3.2 Unauthorized Password Reset
Requirement Name: Password-Based Authentication
Requirement Reference: REQ-SEC-PASS-1, clause 5.3.7.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Out-of-band password recovery mechanism absent or deactivated on DUT
Threat References: T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06
DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud
Test Name: TC_Unauthorized_Password_Reset
Test description and applicability
Purpose: To verify that the password reset mechanism of the running O-RAN component cannot be circumvented, disabled, or misused to gain access to the O-RAN component, its configuration, and data.
The test covers services using password-based authentication and out-of-band mechanisms of password reset present in O-RAN components in physical form. If a password reset is required, a factory reset of the O-RAN component is performed. A factory reset wipes the O-RAN component, its configuration and data.
Test setup and configuration
This test is executed against the running O-RAN component as the DUT.
Test prerequisites:
• Network access to DUT
• Physical access to DUT (applicable if the DUT is in physical form)
• Vendor-provided list of network protocols and services supported by DUT
Test procedure
The list of services using password-based authentication is determined by analysing the vendor-provided list as well as by analysing local services that are not remotely accessible:
1) For the services identified in the previous step, the presence of a password reset mechanism is tested.
2) For a DUT that has physical form, it is verified that use of the hardware factory reset switch or switches results in a factory reset. Using any out-of-band mechanism, it is not possible to reset the password only.
Expected results
In the context of each of the services using password-based authentication, no out-of-band password reset mechanism is present. Applicable to local services and to remotely accessible services.
This test case fails if one or more services using password-based authentication have a password reset mechanism exposed.
This test case fails if the DUT in physical form has a hardware switch or switches that can be used to reset the password without triggering a factory reset of the DUT.
Expected format of evidence: Report file, log files and/or screenshots.
7.3.3 Password Policy Enforcement
Requirement Name: Password-Based Authentication
Requirement Reference: REQ-SEC-PASS-1, clause 5.3.7.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Secure password policy is supported and enforced on the DUT
Threat References: T-O-RAN-02, T-O-RAN-03, T-O-RAN-05, T-O-RAN-06
DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud
Test Name: TC_Password_Policy_Enforcement
Test description and applicability
Purpose: To verify that the password policy applied for services using password-based authentication is effectively enforced by the running O-RAN component.
Test setup and configuration
This test is executed against the running O-RAN component as the DUT.
Test prerequisites:
• Set of valid username and valid password for each tested service
• Network access to DUT
• Physical access to DUT (applicable if the DUT is in physical form)
• Vendor-provided list of network protocols and services supported by DUT
Test procedure
1) The list of services using password-based authentication is determined by analysing the vendor-provided list as well as by analysing local services that are not remotely accessible.
2) For the services identified in the previous step, the effectiveness of password policy enforcement is verified as follows:
- a combination of a valid username and a valid password is used to authenticate;
- a password change is performed using a password that does not conform to the applied password policy.
EXAMPLE: The DUT uses a password policy to set rules for password length, type of characters used (allowed and disallowed characters), complexity (character groups), and denied passwords (deny-list of passwords that cannot be set). A candidate password that does not conform to the rules is chosen for this test. As the password policy may be a complex set of rules, multiple candidate passwords should be tested to fully cover possible password policy violations.
Expected results
In the context of each of the services using password-based authentication, the applied password policy is effectively enforced and non-compliant passwords are rejected by the DUT during password change. Applicable to local services and to remotely accessible services.
Expected format of evidence: Report file, log files and/or screenshots.
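An informative sketch of step 2 for a service with an HTTP-based password-change endpoint is shown below; the endpoint, field names, token and candidate rule violations are assumptions chosen to illustrate typical policy rules, and every attempt is expected to be rejected.

    # Illustrative sketch only; endpoint, fields and policy rules are assumptions.
    import requests

    CHANGE_URL = "https://dut.example.internal/users/tester/password"  # hypothetical
    TOKEN = "<session token obtained from a valid login>"
    NON_COMPLIANT = [
        "short",              # violates the minimum-length rule
        "alllowercase1234",   # violates character-group (complexity) rules
        "password",           # on the deny-list of common passwords
        "ValidPrefix\x00",    # contains a disallowed character
    ]

    for candidate in NON_COMPLIANT:
        r = requests.put(CHANGE_URL,
                         headers={"Authorization": f"Bearer {TOKEN}"},
                         json={"new_password": candidate},
                         verify=False, timeout=10)
        print(repr(candidate), "->", r.status_code)      # expected: 4xx rejection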
7.4 Network Protocol Fuzzing
Fuzzing is an automated process of sending invalid or random inputs to a SUT to cause it to malfunction or crash. Fuzzing is effective for finding vulnerabilities because, while most modern programs have extensive input fields, the test coverage of these areas is relatively small. Even though this process can be a powerful capability to ensure robustness, it needs to be sufficiently defined and implemented throughout the system development lifecycle to be helpful and achieve the required results in a multi-vendor environment.
While traditional fuzzing techniques involve fuzzing piece(s) of software and generating inputs through the command line or input files, fuzzing telecommunication network protocols tends to be different, requiring sending information via network ports. Furthermore, the complex nature of network protocols in the SUT, resulting from how they are layered over each other, adds to the challenges of fuzzing such SUTs. In the case of an O-RAN SUT, fuzzing covers protocols rather than application-specific targets (web applications and services, etc.). The following are examples of the protocols that fuzzing will cover:
General Transport Protocols:
• SCTP
• IP
• TCP
• UDP
• SSH
• HTTP
• HTTP/2
and O-RAN Specific Protocols:
• NETCONF
• E1AP
• E2AP
• A1
• CTI
• eCPRI
• PTP
It is anticipated that many O-RAN components utilize common software frameworks for the lower-level general communication. In this case it should be evaluated whether these General Transport Protocols are already being tested in extensive fuzzing tests in other activities and can therefore be considered to have lower risk profiles compared to the O-RAN Specific Protocols, which receive less testing in the general industry. Many of the O-RAN specific protocols are state-machine based protocols that can have multiple endpoints served at the same time, e.g. the protocol needs to be tested at scale to understand whether memory leaks or other similar issues are present that could lead to buffer overflows (opening up possible code execution) or software crashes of the O-RAN specific software. Fuzzing on the M-Plane protocol inside the configuration of the O-RAN Fronthaul can be a significant area, as this combines multiple technologies from many domains into a single solution.
In order for the fuzzing to be time and resource efficient, it is important that this fuzzing is protocol and state-machine aware so that the fuzzing can focus on the relevant aspects of the SUT representing the most significant risk exposure. Further effectiveness can be achieved if the fuzzing capability is able to intelligently respond to the SUT behaviour. The fuzzing tool should be able to perform tests both with and without access to relevant credentials. Many possible vulnerabilities would be present on the inside of the authenticated session of the management protocols and would lead to escalation of privileges. In order to identify the possible risk of memory leaks, Denial of Service (DoS) or other similar issues, robust logging of the underlying platform (hardware and software), the virtualization or container platform and the O-RAN function is needed; the logging needs to be detailed enough to evaluate the trends early but not so intrusive as to degrade the performance of the platform and lead to inaccurate results.
As general guidance, vendors and operators running fuzzing tests aim to document the list of all of the protocols of the SUTs reachable externally on an IP-based interface, together with indications of whether adequate available robustness and fuzz testing tools have been used against them. The tool's name, its unambiguous version (also for plug-ins if applicable), user settings, and the relevant output should be documented as evidence. Additionally, any input causing unspecified, undocumented, or unexpected behaviour and a description of this behaviour should be highlighted in the testing documentation.
Since fuzzing test cases are not exhaustive and difficult to define and replicate, it is likely that test results even from testing the same set of protocols by different vendors may end up resulting in different outputs. So further effort and time need to be invested in fuzzing activities until a satisfactory approach, based on the vendor's and/or operator's adopted risk-based model, is reached.
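The sketch below shows the bare skeleton of a mutation-based fuzz loop for a framed TCP-carried protocol, written in Python. It is intentionally simplistic: real O-RAN protocol fuzzing would be protocol- and state-machine-aware (for example, completing a valid session setup before mutating later messages), and the seed message, address and port below are hypothetical.

    # Illustrative sketch only; seed message and endpoint are assumptions.
    import random
    import socket

    DUT = ("192.0.2.40", 38472)            # hypothetical DUT endpoint
    SEED = bytes.fromhex("0015002d")       # hypothetical captured valid message

    def mutate(data: bytes) -> bytes:
        """Flip a few random bytes of a valid seed message."""
        buf = bytearray(data)
        for _ in range(random.randint(1, 4)):
            buf[random.randrange(len(buf))] = random.randrange(256)
        return bytes(buf)

    for i in range(1000):
        try:
            with socket.create_connection(DUT, timeout=2) as s:
                s.sendall(mutate(SEED))
                s.settimeout(2)
                try:
                    s.recv(4096)           # any response (or silence) is logged
                except socket.timeout:
                    pass
        except OSError as exc:
            # A refused or reset connection after earlier successes may indicate a
            # crash or restart of the DUT service and is recorded as a finding.
            print(f"iteration {i}: connection failed ({exc})")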
7.5 Denial of Service/Message Flooding
7.5.1 Protocol, Application and Volumetric Based DDoS Attacks
Requirement Name: Robustness against Volumetric DDoS Attack
Requirement Reference: REQ-SEC-DOS-1, clause 5.3.5.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: "An O-RAN component with external network interface shall be able to withstand network transport protocol based volumetric DDoS attack without system crash and returning to normal service level after the attack"
Threat References: T-O-RAN-04, T-O-RAN-09, T-SMO-03
DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud
Test Name: TC_Robustness_DDoS
Test description and applicability
Purpose: To verify the DUT is able to recover from a DDoS attack.
Each component interface is tested to validate how the handling of large amounts of requests is done, similar to what is seen from denial of service (DoS) and/or distributed denial of service (DDoS) attempts. A DoS/DDoS scenario can occur as a result of a malicious attack or because of a network/operator error. DoS/DDoS attacks may come in these forms: protocol layer attacks (e.g. SYN floods, UDP floods, TCP floods), volume based attacks (e.g. ICMP floods, Smurf DDoS) and application layer attacks (e.g. GET/POST floods, low-and-slow attacks, attacks that target specific software such as applications with exposed network services or operating system network services).
Test setup and configuration
This test is executed against the running O-RAN component or O-RAN system as the DUT.
Test prerequisites:
• Network access to DUT
• Vendor-provided list of network protocols and services supported by DUT
Test procedure
1) In case the call flow needs authentication:
1.1) Set up a call flow that will send repeated requests after the authentication at an increasing rate over time. Mark the failure point of receiving rejection or response messages.
1.2) Stop the attack.
1.3) Set up a call flow that will send repeated requests before the authentication at an increasing rate over time. Mark the failure point of receiving acceptance or response messages.
1.4) Stop the attack.
2) In case the call flow does not need authentication:
2.1) Set up a call flow that will send repeated requests at an increasing rate over time. Mark the failure point of receiving response messages.
2.2) Stop the attack.
Expected results
It is expected the component fails to serve requests after steps 1.1, 1.3 and/or step 2.1. After the attack/test stops, the DUT returns to a functional state, being able to respond to service requests again.
This test case fails if the DUT does not return to a functional state after the test stops.
Expected format of evidence: Report file, log files and/or screenshots.
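A minimal sketch of the ramp-and-recover logic for an HTTP-exposed service is given below; it issues sequential bursts of increasing size as a simple stand-in for a rate ramp. The endpoint is hypothetical, and a realistic campaign would use dedicated traffic generators and also cover SYN, UDP and ICMP floods.

    # Illustrative sketch only; endpoint and thresholds are assumptions.
    import time
    import requests

    TARGET = "https://dut.example.internal/health"       # hypothetical endpoint

    def probe() -> bool:
        """Single request; False on any error or non-200 status."""
        try:
            return requests.get(TARGET, verify=False, timeout=2).status_code == 200
        except requests.RequestException:
            return False

    burst = 10
    while burst <= 10_000:                   # double the burst size each round
        failures = sum(0 if probe() else 1 for _ in range(burst))
        print(f"burst={burst} failures={failures}")
        if failures:
            print("failure point reached; stopping the load")
            break
        burst *= 2

    time.sleep(60)                           # give the DUT time to recover
    print("DUT serving requests again:", probe())        # expected: True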
7.5.2 O-CU DoS protection and recovery
Requirement Name: O-CU DoS protection and recovery
Requirement Reference & Description: 'REQ-SEC-OCU-1' clause 5.1.4, 'REQ-SEC-DOS-1' clause 5.3.5 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-04, T-O-RAN-09' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-CU
Test Name: TC_DoS_RECOV_OCU
Test description and applicability
Purpose: The purpose of this test is to evaluate the resilience of the O-CU against Denial-of-Service attacks and the recovery process from those attacks.
Test setup and configuration
• The O-CU is powered on and operational.
• DoS protection mechanisms are implemented on the O-CU.
• The testing environment is isolated and does not impact production systems.
Test procedure
Refer to TC_Robustness_DDoS for the detailed test procedure.
Expected Results
• O-CU detects and demonstrates robustness against the DoS attack, maintaining normal operations with acceptable performance and rejecting requests, regardless of whether they are malicious or not.
• O-CU successfully recovers from the DoS attack and resumes normal operation within a reasonable recovery time.
Expected format of evidence:
• Observation logs during the DoS attack, including any triggered countermeasures or rate limiting mechanisms, and validate that the O-CU effectively defends against the attack.
• Observation logs of the recovery process, including the time taken for the O-CU to regain stable operation, and validate that the recovery is timely and effective.
NOTE: Recovery time specifies the maximum acceptable recovery time after the attack ceases (e.g. "O-CU recovers and returns to normal operation within 5 minutes after the attack stops").
7.5.3 O-DU DoS protection and recovery
Requirement Name: O-DU DoS protection and recovery
Requirement Reference & Description: 'REQ-SEC-ODU-1' clause 5.1.5, 'REQ-SEC-DOS-1' clause 5.3.5 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-04, T-O-RAN-09' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-DU
Test Name: TC_DoS_RECOV_ODU
Test description and applicability
Purpose: The purpose of this test is to verify the resilience of the user plane to bandwidth exhaustion and packet flooding DoS attacks.
Test setup and configuration
• A valid eCPRI connection between the O-RU and O-DU.
• Test environment capable of generating high bandwidth traffic (e.g. high volume of packets).
Test procedure
Refer to TC_Robustness_DDoS for the detailed test procedure.
Expected Results
• The O-DU maintains acceptable performance levels despite increased traffic.
• It handles the excess traffic without experiencing significant degradation or failure.
• Once the load is reduced, the O-DU recovers and returns to normal operation.
Expected format of evidence:
• Steps performed with detailed execution logs
• Metrics and performance measurements (e.g. recovery time, packet loss, CPU utilization) during the DoS attack
7.5.4 O-RU DoS protection and recovery
Requirement Name: O-RU DoS protection and recovery
Requirement Reference & Description: 'REQ-SEC-ORU-1, REQ-SEC-ORU-2' clause 5.1.6, 'REQ-SEC-DOS-1' clause 5.3.5 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-04, T-O-RAN-09' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU
Test Name: TC_DoS_RECOV_ORU
Test description and applicability
Purpose: The purpose of this test is to evaluate the resilience of the O-RU against Denial-of-Service attacks and the recovery process from those attacks.
Test setup and configuration
• The O-RU is powered on and operational.
• DoS protection mechanisms are implemented.
• The testing environment is isolated and does not impact production systems.
Test procedure
Refer to TC_Robustness_DDoS for the detailed test procedure.
Expected Results
• O-RU detects and demonstrates robustness against the DoS attack, maintaining normal operations with acceptable performance and rejecting requests, regardless of whether they are malicious or not.
• O-RU successfully recovers from the DoS attack and resumes normal operation within a reasonable recovery time.
Expected format of evidence:
• Observation logs during the DoS attack, including any triggered countermeasures or rate limiting mechanisms, and validate that the O-RU effectively defends against the attack.
• Observation logs of the recovery process, including the time taken for the O-RU to regain stable operation, and validate that the recovery is timely and effective.
NOTE: Recovery time specifies the maximum acceptable recovery time after the attack ceases (e.g. "O-RU recovers and returns to normal operation within 5 minutes after the attack stops").
7.5.5 Near-RT RIC DoS protection and recovery
Requirement Name: Near-RT RIC DoS protection and recovery
Requirement Reference & Description: 'REQ-SEC-NEAR-RT-6, REQ-SEC-NEAR-RT-7' clause 5.1.3, 'REQ-SEC-DOS-1' clause 5.3.5 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-04, T-O-RAN-09' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: Near-RT RIC
Test Name: TC_DoS_RECOV_NEAR_RT_RIC
Test description and applicability
Purpose: The purpose of this test is to evaluate the resilience of the Near-RT RIC against Denial-of-Service attacks and the recovery process from those attacks.
Test setup and configuration
• The Near-RT RIC is powered on and operational.
• DoS protection mechanisms are implemented on the Near-RT RIC.
• The testing environment is isolated and does not impact production systems.
Test procedure
Refer to TC_Robustness_DDoS for the detailed test procedure.
Expected Results
• Near-RT RIC detects and demonstrates robustness against the DoS attack, maintaining normal operations with acceptable performance and rejecting requests, regardless of whether they are malicious or not.
• Near-RT RIC successfully recovers from the DoS attack and resumes normal operation within a reasonable recovery time.
Expected format of evidence:
• Observation logs during the DoS attack, including any triggered countermeasures or rate limiting mechanisms, and validate that the Near-RT RIC effectively defends against the attack.
• Observation logs of the recovery process, including the time taken for the Near-RT RIC to regain stable operation, and validate that the recovery is timely and effective.
NOTE: Recovery time specifies the maximum acceptable recovery time after the attack ceases (e.g. "Near-RT RIC recovers and returns to normal operation within 5 minutes after the attack stops").
7.6 Input validation and error handling
7.6.0 Overview
Input validation and error handling are pivotal security practices that guard against malformed or malicious data inputs, ensuring that systems behave predictably and securely. This clause elucidates a series of tests designed to validate the efficacy of the input validation and error handling mechanisms implemented in various O-RAN network functions (O-CU, O-DU, Near-RT RIC), safeguarding them from a myriad of potential vulnerabilities and ensuring robust, secure, and stable operations.
7.6.1 O-CU input validation and error handling
Requirement Name: Input validation and error handling on data provided through O1 and E2 interfaces. Requirement Reference & Description: 'REQ-SEC-OCU-1' clause 5.1.4.1 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-05' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-CU Test Name: TC_INPUT_VALIDATION_ERR_HANDL_OCU Test description and applicability Purpose: The purpose of this test is to verify that the O-CU performs proper input validation on provided data via E2/O1 interfaces and rejects invalid or malicious inputs. It verifies that the O-CU correctly handles errors and responds appropriately. Test setup and configuration • The O-CU is powered on and operational. • Test environment is set up with E2 and O1 interfaces configured. • Input validation mechanisms are implemented on O-CU. • Error handling mechanisms (e.g. error codes, error messages) are implemented by O-CU. Test procedure 1) Case of malformed input data: a) The tester provides invalid or malformed input data to the O-CU via E2/O1 interfaces, violating the specified format or containing unexpected values. b) The tester captures and analyses the response from the E2/O1 interfaces. c) The tester verifies that the O-CU detects the invalid input and rejects it appropriately, returning an error message or taking necessary actions to mitigate the impact. EXAMPLE: Actions could be rejecting the message, sending an error indication, etc. 2) Case of malicious input data: a) The tester provides malicious input data to the O-CU, aiming to exploit known vulnerabilities (e.g. CVE database, OWASP Top Ten, NIST National Vulnerability Database (NVD), vendor-specific vulnerability database) or perform unauthorized actions. b) The tester verifies that the O-CU identifies the malicious input and implements security measures to prevent exploitation, such as input sanitization, access controls, or anomaly detection. 3) Boundary case: a) Provide input data at the boundaries of the allowed range or limits defined for specific inputs. b) Verify that the O-CU handles the boundary cases correctly, without encountering any unexpected behaviour or errors due to boundary conditions. Expected Results 1) For case 'malformed input data', the O-CU properly validates incoming inputs from O1/E2 interfaces and rejects those with invalid or malformed data, returning an appropriate error response and preventing any potential security risks or system failures. 2) For case 'malicious input data', the O-CU detects and mitigates the malicious input, preventing any potential security breaches or unauthorized operations. 3) For case 'boundary', the O-CU properly handles the boundary cases, ensuring that inputs at the limits are processed accurately without causing any system instability or vulnerabilities. Expected format of evidence: 1) Logs detailing the invalid or malformed input data provided to the O-CU via O1/E2 interfaces, alongside system logs capturing the O-CU's error messages or indications in response to the invalid input. 2) Logs documenting the malicious input data sent to the O-CU and the targeted vulnerabilities, complemented by system logs highlighting the O-CU's detection and mitigation actions upon receiving the malicious input. 3) Logs of the boundary input data values provided to the O-CU, paired with system logs capturing the O-CU's messages or behaviours in response to the boundary inputs.
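The boundary case of this procedure can be exercised with a simple value generator. The following informative Python sketch assumes a numeric configuration parameter with a documented minimum and maximum; the function send_o1_edit() is a placeholder for whatever O1/E2 test driver is actually used and is not defined in any O-RAN specification.

# Informative sketch: derive boundary-case values for a numeric parameter and
# submit each one, recording whether the DUT accepts or rejects it.
def boundary_values(minimum: int, maximum: int):
    # Values just inside and just outside the allowed range.
    return [minimum - 1, minimum, minimum + 1, maximum - 1, maximum, maximum + 1]

def send_o1_edit(parameter: str, value: int) -> bool:
    """Placeholder: returns True if the DUT accepted the value."""
    raise NotImplementedError("replace with the O1/E2 test-driver call")

def run_boundary_case(parameter: str, minimum: int, maximum: int):
    for value in boundary_values(minimum, maximum):
        in_range = minimum <= value <= maximum
        accepted = send_o1_edit(parameter, value)
        verdict = "PASS" if accepted == in_range else "FAIL"
        print(f"{verdict}: {parameter}={value} accepted={accepted} expected={in_range}")

The same sketch applies unchanged to the O-DU and Near-RT RIC variants of this test case.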
7.6.2 O-DU input validation and error handling
Requirement Name: Input validation and error handling on data provided through O1/E2/FH interfaces. Requirement Reference & Description: 'REQ-SEC-ODU-1' clause 5.1.5.1 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-05' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-DU Test Name: TC_INPUT_VALIDATION_ERR_HANDL_ODU Test description and applicability Purpose: The purpose of this test is to verify that the O-DU performs proper input validation on provided data via E2/O1/FH interfaces and rejects invalid or malicious inputs. It verifies that the O-DU correctly handles errors and responds appropriately. Test setup and configuration • The O-DU is powered on and operational. • Test environment is set up with E2/O1/FH interfaces configured. • Input validation mechanisms are implemented on O-DU. • Error handling mechanisms (e.g. error codes, error messages) are implemented by O-DU. Test procedure 1) Case of malformed input data: a) The tester provides invalid or malformed input data to the O-DU via E2/O1/FH interfaces, violating the specified format or containing unexpected values. b) The tester captures and analyses the response from the E2/O1/FH interfaces. c) The tester verifies that the O-DU detects the invalid input and rejects it appropriately, returning an error message or taking necessary actions to mitigate the impact. EXAMPLE: Actions could be rejecting the message, sending an error indication, etc. 2) Case of malicious input data: a) The tester provides malicious input data to the O-DU, aiming to exploit known vulnerabilities (e.g. CVE database, OWASP Top Ten, NIST National Vulnerability Database (NVD), vendor-specific vulnerability database) or perform unauthorized actions. b) The tester verifies that the O-DU identifies the malicious input and implements security measures to prevent exploitation, such as input sanitization, access controls, or anomaly detection. 3) Boundary case: a) Provide input data at the boundaries of the allowed range or limits defined for specific inputs. b) Verify that the O-DU handles the boundary cases correctly, without encountering any unexpected behaviour or errors due to boundary conditions. Expected Results 1) For case 'malformed input data', the O-DU properly validates incoming inputs from O1/E2/FH interfaces and rejects those with invalid or malformed data, returning an appropriate error response and preventing any potential security risks or system failures. 2) For case 'malicious input data', the O-DU detects and mitigates the malicious input, preventing any potential security breaches or unauthorized operations. 3) For case 'boundary', the O-DU properly handles the boundary cases, ensuring that inputs at the limits are processed accurately without causing any system instability or vulnerabilities. Expected format of evidence: 1) Logs detailing the invalid or malformed input data provided to the O-DU via E2/O1/FH interfaces, alongside system logs capturing the O-DU's error messages or indications in response to the invalid input. 2) Logs documenting the malicious input data sent to the O-DU and the targeted vulnerabilities, complemented by system logs highlighting the O-DU's detection and mitigation actions upon receiving the malicious input. 3) Logs of the boundary input data values provided to the O-DU, paired with system logs capturing the O-DU's messages or behaviours in response to the boundary inputs.
7.6.3 Near-RT RIC input validation and error handling
Requirement Name: Error handling by Near-RT RIC Requirement Reference & Description: 'REQ-SEC-NEAR-RT-6, REQ-SEC-NEAR-RT-7' clause 5.1.3.1 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-NEAR-RT-03, T-NEAR-RT-04' clause 7.4.1.4 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: NEAR-RT RIC Test Name: TC_INPUT_VALIDATION_ERR_HANDL_NEAR_RT_RIC Test description and applicability Purpose: The purpose of this test is to verify that the Near-RT RIC performs proper input validation on provided data via O1/E2/A1/Y1 interfaces and rejects invalid or malicious inputs. It verifies that the Near-RT RIC correctly handles errors and responds appropriately. Test setup and configuration 1) Near-RT RIC is powered on and operational. 2) Test environment is set up with O1/E2/A1/Y1 interfaces configured. 3) Input validation mechanisms are implemented on Near-RT RIC. 4) Error handling mechanisms (e.g. error codes, error messages) are implemented by Near-RT RIC. Test procedure 1) Case of malformed input data: a) The tester provides invalid or malformed input data to the Near-RT RIC via O1/E2/A1/Y1 interfaces, violating the specified format or containing unexpected values. b) The tester captures and analyses the response from the O1/E2/A1/Y1 interfaces. c) The tester verifies that the Near-RT RIC detects the invalid input and rejects it appropriately, returning an error message or taking necessary actions to mitigate the impact. EXAMPLE: Actions could be rejecting the message, sending an error indication, etc. 2) Case of malicious input data: a) The tester provides malicious input data to the Near-RT RIC, aiming to exploit known vulnerabilities (e.g. CVE database, OWASP Top Ten, NIST National Vulnerability Database (NVD), vendor-specific vulnerability database) or perform unauthorized actions. b) The tester verifies that the Near-RT RIC identifies the malicious input and implements security measures to prevent exploitation, such as input sanitization, access controls, or anomaly detection. 3) Boundary case: a) Provide input data at the boundaries of the allowed range or limits defined for specific inputs. b) Verify that the Near-RT RIC handles the boundary cases correctly, without encountering any unexpected behaviour or errors due to boundary conditions. Expected Results 1) For case 'malformed input data', the Near-RT RIC properly validates incoming inputs from O1/E2/A1/Y1 interfaces and rejects those with invalid or malformed data, returning an appropriate error response and preventing any potential security risks or system failures. 2) For case 'malicious input data', the Near-RT RIC detects and mitigates the malicious input, preventing any potential security breaches or unauthorized operations. 3) For case 'boundary', the Near-RT RIC properly handles the boundary cases, ensuring that inputs at the limits are processed accurately without causing any system instability or vulnerabilities. Expected format of evidence: 1) Logs detailing the invalid or malformed input data provided to the Near-RT RIC via O1/E2/A1/Y1 interfaces, alongside system logs capturing the Near-RT RIC's error messages or indications in response to the invalid input. 2) Logs documenting the malicious input data sent to the Near-RT RIC and the targeted vulnerabilities, complemented by system logs highlighting the Near-RT RIC's detection and mitigation actions upon receiving the malicious input. 3) Logs of the boundary input data values provided to the Near-RT RIC, paired with system logs capturing the Near-RT RIC's messages or behaviours in response to the boundary inputs.
7.7 Secure configuration verification
7.7.0 Overview
The tests outlined in this clause aim to verify the resilience of the configuration of the O-RAN NFs against unauthorized access and modifications, emphasizing the importance of stringent security measures in the face of potential threats.
7.7.1 O-CU secure configuration verification
Requirement Name: Secure configuration verification by O-CU Requirement Reference & Description: 'REQ-SEC-OCU-1' clause 5.1.4 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-02' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-CU Test Name: TC_CONF_VER_OCU Test description and applicability Purpose: The purpose of this test is to verify that the O-CU enforces secure configuration settings and protects against unauthorized configuration changes. Test setup and configuration 1) The O-CU is powered on and operational. 2) Secure configuration settings are defined and applied on the O-CU. Test procedure 1) Access the O-CU configuration settings a) Attempt to access the O-CU configuration settings without proper authorization or credentials. b) Verify that the O-CU denies access to the configuration settings and prompts for valid credentials. c) Ensure that only authorized users or devices with appropriate credentials can access and modify the configuration settings. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 67 2) Modification or tampering with the secure configuration settings on the O-CU a) Attempt unauthorized access: Try to access and modify the secure configuration settings on the O-CU without proper authorization. This includes sending unauthorized access messages or commands to the O-CU. b) Tamper with settings: If access is granted, attempt to modify, delete, or add new configuration settings that deviate from the secure baseline. c) Verify that the O-CU detects any unauthorized modification or tampering attempts and rejects the modified configuration. d) Ensure that the O-CU maintains the integrity and validity of the configuration settings, reverting any unauthorized changes. Expected Results 1) The O-CU denies unauthorized access to the configuration settings and requests valid credentials. 2) The O-CU detects any unauthorized modification or tampering attempts and rejects the modified configuration, maintaining its secure configuration. Expected format of evidence: 1) Document the access denial and verify the system logs or audit logs capturing the unauthorized access attempt. 2) Document the configuration rejection and verify the system logs or audit logs indicating the detection of unauthorized modification.
7.7.2 O-DU secure configuration verification
Requirement Name: Secure configuration verification by O-DU Requirement Reference & Description: 'REQ-SEC-ODU-1' clause 5.1.5 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-02' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-DU Test Name: TC_CONF_VER_ODU Test description and applicability Purpose: The purpose of this test is to verify that the O-DU enforces secure configuration settings and protects against unauthorized configuration changes. Test setup and configuration 1) The O-DU is powered on and operational. 2) Secure configuration settings are defined and applied on the O-DU. Test procedure 1) Access the O-DU configuration settings a) Attempt to access the O-DU configuration settings without proper authorization or credentials. b) Verify that the O-DU denies access to the configuration settings and prompts for valid credentials. c) Ensure that only authorized users or devices with appropriate credentials can access and modify the configuration settings. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 68 2) Modification or tampering with the secure configuration settings on the O-DU a) Attempt unauthorized access: Try to access and modify the secure configuration settings on the O-DU without proper authorization. This includes sending unauthorized access messages or commands to the O-DU. b) Tamper with settings: If access is granted, attempt to modify, delete, or add new configuration settings that deviate from the secure baseline. c) Verify that the O-DU detects any unauthorized modification or tampering attempts and rejects the modified configuration. d) Ensure that the O-DU maintains the integrity and validity of the configuration settings, reverting any unauthorized changes. Expected Results 1) The O-DU denies unauthorized access to the configuration settings and requests valid credentials. 2) The O-DU detects any unauthorized modification or tampering attempts and rejects the modified configuration, maintaining its secure configuration. Expected format of evidence: 1) Document the access denial and verify the system logs or audit logs capturing the unauthorized access attempt. 2) Document the configuration rejection and verify the system logs or audit logs indicating the detection of unauthorized modification.
7.7.3 O-RU secure configuration verification
Requirement Name: Secure configuration verification by O-RU Requirement Reference & Description: 'REQ-SEC-ORU-1, REQ-SEC-ORU-2' clause 5.1.6 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-02' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU Test Name: TC_CONF_VER_ORU Test description and applicability Purpose: The purpose of this test is to verify that the O-RU enforces secure configuration settings and protects against unauthorized configuration changes. Test setup and configuration 1) The O-RU is powered on and operational. 2) Secure configuration settings are defined and applied on the O-RU. Test procedure 1) Access the O-RU configuration settings a) Attempt to access the O-RU configuration settings without proper authorization or credentials. b) Verify that the O-RU denies access to the configuration settings and prompts for valid credentials. c) Ensure that only authorized users or devices with appropriate credentials can access and modify the configuration settings. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 69 2) Modification or tampering with the secure configuration settings on the O-RU a) Attempt unauthorized access: Try to access and modify the secure configuration settings on the O-RU without proper authorization. This includes sending unauthorized access messages or commands to the O-RU. b) Tamper with settings: If access is granted, attempt to modify, delete, or add new configuration settings that deviate from the secure baseline. c) Verify that the O-RU detects any unauthorized modification or tampering attempts and rejects the modified configuration. d) Ensure that the O-RU maintains the integrity and validity of the configuration settings, reverting any unauthorized changes. Expected Results 1) The O-RU denies unauthorized access to the configuration settings and requests valid credentials. 2) The O-RU detects any unauthorized modification or tampering attempts and rejects the modified configuration, maintaining its secure configuration. Expected format of evidence: 1) Document the access denial and verify the system logs or audit logs capturing the unauthorized access attempt. 2) Document the configuration rejection and verify the system logs or audit logs indicating the detection of unauthorized modification.
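Procedure step 1 of this test can be illustrated with a management-plane access attempt. The following informative Python sketch assumes that the O-RU exposes a NETCONF M-plane interface on port 830 and that the ncclient library is available; the host name and credentials are placeholders and a rejected session is the expected outcome.

# Informative sketch: attempt a NETCONF session towards the O-RU M-plane with
# deliberately invalid credentials and confirm that access is refused.
from ncclient import manager
from ncclient.transport.errors import AuthenticationError

try:
    with manager.connect(host="o-ru.example.net", port=830,
                         username="invalid-user", password="invalid-pass",
                         hostkey_verify=False, timeout=10):
        print("FAIL: session established without valid credentials")
except AuthenticationError:
    print("PASS: O-RU rejected the unauthorized access attempt")

A comparable check, adapted to the configuration interface actually exposed, applies to the O-CU, O-DU and Near-RT RIC variants of this test case.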
7.7.4 Near-RT RIC secure configuration verification
Requirement Name: Secure configuration verification by Near-RT RIC Requirement Reference & Description: 'REQ-SEC-NEAR-RT-6, REQ-SEC-NEAR-RT-7' clause 5.1.3 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-02' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: NEAR-RT RIC Test Name: TC_CONF_VER_NEAR_RT_RIC Test description and applicability Purpose: The purpose of this test is to verify that the Near-RT RIC enforces secure configuration settings and protects against unauthorized configuration changes. Test setup and configuration 1) The Near-RT RIC is powered on and operational. 2) Secure configuration settings are defined and applied on the Near-RT RIC. Test procedure 1) Access the Near-RT RIC configuration settings a) Attempt to access the Near-RT RIC configuration settings without proper authorization or credentials. b) Verify that the Near-RT RIC denies access to the configuration settings and prompts for valid credentials. c) Ensure that only authorized users or devices with appropriate credentials can access and modify the configuration settings. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 70 2) Modification or tampering with the secure configuration settings on the Near-RT RIC a) Attempt unauthorized access: Try to access and modify the secure configuration settings on the Near- RT RIC without proper authorization. This includes sending unauthorized access messages or commands to the Near-RT RIC. b) Tamper with settings: If access is granted, attempt to modify, delete, or add new configuration settings that deviate from the secure baseline. c) Verify that the Near-RT RIC detects any unauthorized modification or tampering attempts and rejects the modified configuration. d) Ensure that the Near-RT RIC maintains the integrity and validity of the configuration settings, reverting any unauthorized changes. Expected Results 1) The Near-RT RIC denies unauthorized access to the configuration settings and requests valid credentials. 2) The Near-RT RIC detects any unauthorized modification or tampering attempts and rejects the modified configuration, maintaining its secure configuration. Expected format of evidence: 1) Document the access denial and verify the system logs or audit logs capturing the unauthorized access attempt. 2) Document the configuration rejection and verify the system logs or audit logs indicating the detection of unauthorized modification.
7.8 Logging and monitoring
7.8.0 Overview
The tests outlined here aim to scrutinize the logging and monitoring capabilities of various O-RAN components, ensuring they are up to the mark and can effectively detect, log, and alert any anomalies.
7.8.1 O-CU logging and monitoring
Requirement Name: O-CU logging and monitoring Requirement Reference & Description: 'REQ-SEC-OCU-1' clause 5.1.4 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-07' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-CU Test Name: TC_LOG_OCU Test description and applicability Purpose: The purpose of this test is to verify that the O-CU correctly logs and monitors security-related events effectively. Test setup and configuration • The O-CU is powered on and operational. • Logging and monitoring configurations are properly set up on the O-CU. Test procedure 1) Logging a) The tester triggers an error or failure condition in the O-CU, such as connection attempts with invalid credentials, unauthorized access and a dropped connection. b) The tester verifies that the O-CU logs the error by capturing the relevant log entry. 2) Monitoring a) The tester monitors the key performance indicators (KPIs) of the O-CU, such as throughput, latency, or signal quality. b) The tester verifies that the monitoring system accurately collects and displays the KPI values in real-time. c) The tester introduces a simulated degradation or overload scenario on the O-CU, such as increasing network traffic or reducing available resources. d) The tester monitors the O-CU performance under the simulated scenario. e) The tester verifies that the monitoring system detects and raises alerts for the degraded performance or overload condition. Expected Results 1) O-CU logs and generates alerts for security-related events, providing necessary information and timestamps for incident investigation and analysis. 2) The monitoring system provides accurate and real-time KPI values for the O-CU. The monitoring system detects and raises appropriate alerts for the degraded performance or overload condition. Expected format of evidence: 1) Capture and analyse the logged error in the O-CU logs or logging system and document the presence of the log entry. 2) Document the monitored KPI values and the raised alerts, validate them against the expected values, and ensure they are triggered accurately in the monitoring system.
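A minimal, informative verification of the logging part of this procedure is sketched below in Python: after the error condition has been triggered, the exported log file is scanned for a matching entry. The log path and the message pattern are placeholders; the actual wording of log messages is implementation specific.

# Informative sketch: confirm that the triggered failed-login event produced a
# matching entry in the collected security log.
import re

LOG_FILE = "/var/log/dut/security.log"          # assumed export of the DUT security log
PATTERN = re.compile(r"authentication (failed|failure)", re.IGNORECASE)

with open(LOG_FILE, encoding="utf-8") as handle:
    matches = [line for line in handle if PATTERN.search(line)]

if matches:
    print(f"PASS: {len(matches)} failed-authentication log entries found")
else:
    print("FAIL: no log entry recorded for the failed login attempt")

The same check applies to the O-DU, O-RU and Near-RT RIC variants of this test case, with the trigger and the pattern adapted to the event under test.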
7.8.2 O-DU logging and monitoring
Requirement Name: O-DU logging and monitoring Requirement Reference & Description: 'REQ-SEC-ODU-1' clause 5.1.5 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-07' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-DU Test Name: TC_LOG_ODU Test description and applicability Purpose: The purpose of this test is to ensure that the O-DU correctly logs and monitors security-related events effectively. Test setup and configuration • The O-DU is powered on and operational. • Logging and monitoring configurations are properly set up on the O-DU. Test procedure 1) Logging a) The tester triggers an error or failure condition in the O-DU, such as connection attempts with invalid credentials, unauthorized access and a dropped connection. b) The tester verifies that the O-DU logs the error by capturing the relevant log entry. 2) Monitoring a) The tester monitors the key performance indicators (KPIs) of the O-DU, such as throughput, latency, or signal quality. b) The tester verifies that the monitoring system accurately collects and displays the KPI values in real-time. c) The tester introduces a simulated degradation or overload scenario on the O-DU, such as increasing network traffic or reducing available resources. d) The tester monitors the O-DU performance under the simulated scenario. e) The tester verifies that the monitoring system detects and raises alerts for the degraded performance or overload condition. Expected Results 1) O-DU logs and generates alerts for security-related events, providing necessary information and timestamps for incident investigation and analysis. 2) The monitoring system provides accurate and real-time KPI values for the O-DU. The monitoring system detects and raises appropriate alerts for the degraded performance or overload condition. Expected format of evidence: 1) Capture and analyse the logged error in the O-DU logs or logging system and document the presence of the log entry. 2) Document the monitored KPI values and the raised alerts, validate them against the expected values, and ensure they are triggered accurately in the monitoring system.
7.8.3 O-RU logging and monitoring
Requirement Name: O-RU logging and monitoring Requirement Reference & Description: 'REQ-SEC-ORU-1, REQ-SEC-ORU-2' clause 5.1.6 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-07' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU Test Name: TC_LOG_ORU Test description and applicability Purpose: The purpose of this test is to ensure that the O-RU correctly logs and monitors security-related events effectively. Test setup and configuration • The O-RU is powered on and operational. • Logging and monitoring configurations are properly set up on the O-RU. Test procedure 1) Logging a) The tester triggers an error or failure condition in the O-RU, such as connection attempts with invalid credentials, unauthorized access and a dropped connection. b) The tester verifies that the O-RU logs the error by capturing the relevant log entry. 2) Monitoring a) The tester monitors the key performance indicators (KPIs) of the O-RU, such as throughput, latency, or signal quality. b) The tester verifies that the monitoring system accurately collects and displays the KPI values in real-time. c) The tester introduces a simulated degradation or overload scenario on the O-RU, such as increasing network traffic or reducing available resources. d) The tester monitors the O-RU performance under the simulated scenario. e) The tester verifies that the monitoring system detects and raises alerts for the degraded performance or overload condition. Expected Results 1) O-RU logs and generates alerts for security-related events, providing necessary information and timestamps for incident investigation and analysis. 2) The monitoring system provides accurate and real-time KPI values for the O-RU. The monitoring system detects and raises appropriate alerts for the degraded performance or overload condition. Expected format of evidence: 1) Capture and analyse the logged error in the O-RU logs or logging system and document the presence of the log entry. 2) Document the monitored KPI values and the raised alerts, validate them against the expected values, and ensure they are triggered accurately in the monitoring system.
7.8.4 Near-RT RIC logging and monitoring
Requirement Name: Near-RT RIC logging and monitoring Requirement Reference & Description: 'REQ-SEC-NEAR-RT-4' clause 5.1.3 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-04' clause 5.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: NEAR-RT RIC Test Name: TC_LOG_NEAR_RT_RIC Test description and applicability Purpose: The purpose of this test is to ensure that the Near-RT RIC correctly logs and monitors security-related events effectively. Test setup and configuration 1) The Near-RT RIC is powered on and operational. 2) Logging and monitoring configurations are properly set up on the Near-RT RIC. Test procedure 1) Logging a) The tester triggers an error or failure condition in the Near-RT RIC, such as connection attempts with invalid credentials, unauthorized access, or a dropped connection. b) The tester verifies that the Near-RT RIC logs the error by capturing the relevant log entry. 2) Monitoring a) The tester monitors the key performance indicators (KPIs) of the Near-RT RIC, such as throughput, latency, or signal quality. b) The tester verifies that the monitoring system accurately collects and displays the KPI values in real-time. c) The tester introduces a simulated degradation or overload scenario on the Near-RT RIC, such as increasing network traffic or reducing available resources. d) The tester monitors the Near-RT RIC performance under the simulated scenario. e) The tester verifies that the monitoring system detects and raises alerts for the degraded performance or overload condition. Expected Results 1) Near-RT RIC logs and generates alerts for security-related events, providing necessary information and timestamps for incident investigation and analysis. 2) The monitoring system provides accurate and real-time KPI values for the Near-RT RIC. The monitoring system detects and raises appropriate alerts for degraded performance or overload conditions. Expected format of evidence: 1) Capture and analyse the logged error in the Near-RT RIC logs or logging system and document the presence of the log entry. 2) Document the monitored KPI values and the raised alerts, validate them against the expected values, and ensure they are triggered accurately in the monitoring system.
8 System security evaluation for O-RAN component
8.1 Overview
This clause contains security evaluations to be performed at the system level of an O-RAN component, covering vulnerability scanning, data and information protection and system logging. The objects in scope of these system security evaluations are SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU and O-RU.
8.2 System Vulnerability Scanning
8.2.1 System Vulnerability Scanning
Requirement Name: Robustness of OS and Applications Requirement Reference: REQ-SEC-SYS-1 from clause 5.3.6, REQ-SEC-ALM-PKG-1, O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Operating System (OS) and applications vulnerability scan of O-RAN component Threat References: T-O-RAN-01 DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, xApp, rApp Test Name: TC_Vulnerability_Scanning Test description and applicability Purpose: To verify the O-RAN element under test does not contain known vulnerabilities in the OS and applications. Perform vulnerability scanning to ensure that there are no known vulnerabilities on the O-RAN component, both in the Operating System (OS) and the applications installed, that can be detected by means of automatic testing tools via the IP enabled network interfaces, or to identify the known vulnerabilities on the O-RAN component and have a clear mitigation plan for the ones of high severity. Known vulnerabilities are considered those which are publicly disclosed, found by users or reported by security researchers. Those vulnerabilities are widely detected by commercial or open-source tools designed for this purpose. Test setup and configuration DUT is the O-RAN component with IP enabled network interfaces. Test procedure 1) Run the vulnerability scanning tool and check for potential known vulnerabilities existing at the O-RAN component OS and application levels. 2) The severity level of the existing vulnerabilities is evaluated. Expected results The O-RAN component is free from known vulnerabilities or there are security controls in place to mitigate the exploits associated with the vulnerabilities of high severity. Expected format of evidence: Report files, log files and/or screenshots.
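Step 2 of the procedure, the severity evaluation, can be automated by post-processing the scanner output. The following informative Python sketch assumes the scanner can export a CSV report with "cve", "cvss" and "component" columns and uses a CVSS threshold of 7.0 as an illustrative definition of high severity; both are assumptions, not normative values.

# Informative sketch: list high-severity findings from a scanner export that
# still require a documented mitigation plan.
import csv

HIGH_SEVERITY = 7.0   # illustrative CVSS threshold for "high"

with open("scan_report.csv", newline="", encoding="utf-8") as handle:
    findings = list(csv.DictReader(handle))

high = [f for f in findings if float(f["cvss"]) >= HIGH_SEVERITY]
for finding in high:
    print(f"{finding['cve']} (CVSS {finding['cvss']}) on {finding['component']}")
print(f"{len(high)} high-severity finding(s) require a mitigation plan")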
8.3 Data and Information Protection
Void.
8.4 System logging
8.4.1 Introduction
This clause contains test cases related to security log management.
8.4.2 Security log format and related log fields
Requirement Name: Security logs check for date, time and location field IP address. Requirement Reference: SEC-CTL-SLM-FLD-1, SEC-CTL-SLM-FLD-2; [5], clause 5.3.8.8 Requirement Description: Support for security logs containing date, time and location field IP address. Threat References: T-O-RAN-07 DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud Test Name: TC_Logs_Datetime_Fields_Validation Test Description Purpose: To verify the log fields of security log data from an O-RAN component as per clause 5.3.8.8 of the O-RAN Security Requirements and Controls Specifications [5]. The security log should have the recommended date and time in ISO 8601 [24] format and mandatorily log the location field IP address (IP address of the host from which security events are generated). Test setup and configuration DUT is any O-RAN component that creates/generates security event logs and acts as the server. DUT also offers one or more services through which it can be accessed. Client is the test system equipped to communicate securely with the O-RAN component and able to perform security related operations on the DUT. Test procedure
Table 8.4.2-1: Scenarios to be executed
Scenario ID | Configuration
1 | Login to the DUT via the test system with authorized credentials.
2 | Execute valid operations on the DUT which trigger/generate the security logs.
Expected results
Table 8.4.2-2: Expected results
Scenario ID | Expected result | Reason
1 | Connection established. | Authentication successful.
2 | All the security logs generated by the DUT have: 1) date and time format as per ISO 8601 [24] as recommended by clause 5.3.8.8 of [5]; 2) location field IP address (IP address of the DUT) as mandated by clause 5.3.8.8 of [5]. | Security log generation is successful.
Expected format of evidence: Log files
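The field checks for scenario 2 can be automated once the security log has been exported. The following informative Python sketch assumes a JSON export in which each record carries "timestamp" and "host_ip" fields; the field names and file name are assumptions, as vendors may use other log encodings.

# Informative sketch: validate that each security log record carries an
# ISO 8601 timestamp and a syntactically valid location-field IP address.
import ipaddress
import json
from datetime import datetime

def record_is_valid(record: dict) -> bool:
    try:
        datetime.fromisoformat(record["timestamp"].replace("Z", "+00:00"))
        ipaddress.ip_address(record["host_ip"])
        return True
    except (KeyError, ValueError):
        return False

with open("security_log.json", encoding="utf-8") as handle:
    records = json.load(handle)

bad = [r for r in records if not record_is_valid(r)]
print("PASS" if not bad else f"FAIL: {len(bad)} record(s) missing/invalid fields")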
8.4.3 Authenticated Time Stamping
Requirement Name: Authenticated Time-Stamping Requirement Reference: Clause 5.3.8.9.2.1, O-RAN Security Requirements and Controls Specification [5] Requirement Description: Optional support of NTPv4 Threat References: T-O-RAN-07 DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, rApp, xApp, O-Cloud, Non-RT RIC Test Name: TC_Logs_Authenticated_Time_Stamping Test description and applicability Purpose: To verify that the element fulfills the optional requirement of supporting Network Time Protocol (NTP) version 4 as specified by IETF RFC 5905 [15] for authenticated time stamping in the client role only. Test setup and configuration 1) The element is powered on and operational. 2) The NTP server specified for testing is reachable and properly configured to support authenticated time stamping. Test procedure Verify NTP Client Version • Access the element's configuration settings related to NTP. • Confirm that the element specifies NTP version 4 as the selected protocol. Authentication Setup • Configure the element to use the necessary authentication methods and credentials (AES-CMAC/IETF RFC 4493 [19], certificates for Autokey/IETF RFC 5906 [16]) required by IETF RFC 5905 [15] for authenticated time stamping. • Provide valid authentication credentials (certificates) for NTP communication. Time Synchronization • Initiate an NTP time synchronization process from the element to the specified NTP server. • Monitor the communication between the element and the NTP server to ensure that the NTP packets are properly constructed with the required authentication parameters. • Verify that the element successfully receives the authenticated time stamps from the NTP server. Time Accuracy Check • After synchronization, record the element's internal clock time. • Obtain the time from the NTP server's authenticated time stamp. • Calculate the time difference between the element's internal clock time and the received authenticated time stamp. • Ensure that the time difference is within an acceptable tolerance, considering network latency and authentication processing. Expected results The element fulfills the requirement of supporting Network Time Protocol (NTP) version 4 for authenticated time stamping, as specified by IETF RFC 5905 [15]. The NTP communication successfully employs the configured authentication methods, and the time synchronization process ensures accurate timekeeping within the specified tolerance. An accuracy below 1 second should be measured to pass. Expected format of evidence: Log files, traffic captures and/or screenshots.
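The time accuracy check can be supported with a simple offset measurement from the test system. The following informative Python sketch covers only the 1-second accuracy criterion; it does not exercise the NTP authentication options, and the ntplib package and the server name are assumptions for illustration.

# Informative sketch: measure the offset between the local clock and the NTPv4
# server and compare it with the 1-second pass criterion.
import ntplib

client = ntplib.NTPClient()
response = client.request("ntp.example.net", version=4)

offset = abs(response.offset)   # seconds between local clock and server time
print(f"{'PASS' if offset < 1.0 else 'FAIL'}: measured offset {offset:.3f} s")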
8.4.4 Network Security and System Security Events
Requirement Name: Network Security Events to be Logged and System Security Events to be Logged. Requirement Reference: 'REQ-SEC-SLM-NET-EVT-1' clause 5.3.8.11.2, 'REQ-SEC-SLM-GEN-EVT-1', 'REQ-SEC-SLM-GEN-EVT-2', 'REQ-SEC-SLM-GEN-EVT-3' clause 5.3.8.11.3.1.1, 'REQ-SEC-SLM-HYP-EVT-1', 'REQ-SEC-SLM-HYP-EVT-2', 'REQ-SEC-SLM-HYP-EVT-3' clause 5.3.8.11.3.2, 'REQ-SEC-SLM-CON-EVT-1', 'REQ-SEC-SLM-CON-EVT-2', 'REQ-SEC-SLM-CON-EVT-3' clause 5.3.8.11.3.3 in O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Logging of network and system security events in O-Cloud Threat References: T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-09, T-VM-C-01, T-VM-C-02, T-VM-C-03, T-VM-C-04, T-VM-C-05, T-VM-C-06, T-IMG-01, T-IMG-02, T-ADMIN-02. DUT/s: O-Cloud Test Name: TC_Logs_Network_System_Security_Events Test Description The security log contains log messages pertaining to network and system events that have security utility. Purpose: The purpose of the test is to verify the logging of security events from O-Cloud as per the Security Requirements and Controls Specifications [5]. Test setup and configuration DUT is the O-Cloud. A tester will have access to testing equipment that can connect to the O-Cloud with administrative privileges to the operating system, hypervisor, and container engine. Test procedure 1. Login to the DUT via testing equipment with administrative credentials. 2. Execute the following operations on the DUT. 2.1. Create a new network configuration. 2.2. Modify an existing network configuration. 2.3. Disable a port. 2.4. Enable a port. 2.5. Generate packets that exceed configured firewall limits. 2.6. Generate at least one network connection. 2.7. Reboot a virtual machine and then reboot the host operating system. 2.8. Shut down a virtual machine and then shut down the host operating system. 2.9. Create a scheduled job within the host operating system, hypervisor, and container engine. 2.10. Make a configuration change to the host operating system and hypervisor. 2.11. Attach a virtual disk to a virtual machine and then detach it. 2.12. Create a virtual machine. 2.13. Start a virtual machine. 2.14. Stop a virtual machine. 2.15. Delete a virtual machine. 2.16. Add an image to the container repository. 2.17. Modify an image in the container repository. 2.18. Remove an image from the container repository. 2.19. Create a container. 2.20. Start a container. 2.21. Stop a container. 2.22. Restart a container. 2.23. Delete a container. 2.24. Create a container volume. 2.25. Mount a container volume. 2.26. Delete a container volume. Expected results All the security logs produced by O-Cloud contain log messages that describe the actions taken in the test procedure steps. • For test procedure step 2.1 the log message indicates the creation of a new network configuration. • For test procedure step 2.2 the log message indicates the modification of an existing network configuration. • For test procedure step 2.3 the log message indicates the disabling of a port. • For test procedure step 2.4 the log message indicates the enabling of a port. • For test procedure step 2.5 the log message indicates that packets have exceeded configured firewall limits. • For test procedure step 2.6 the log message indicates a network connection has been attempted along with details about that network connection including source and destination IP addresses.
• For test procedure step 2.7 the log message indicates that a virtual machine was rebooted, and a subsequent log message indicates that a host operating system has been rebooted. • For test procedure step 2.8 the log message indicates that a virtual machine has been shut down and a subsequent log message indicates that the host operating system has been shut down. • For test procedure step 2.9 the log message indicates that a scheduled job was created within the host operating system, a subsequent log message indicates that a scheduled job was created in the hypervisor, and a subsequent log message indicates that a scheduled job was created in the container engine. • For test procedure step 2.10 the log message indicates that a configuration change was made to the host operating system and a subsequent log message indicates that a configuration change was made to the hypervisor. • For test procedure step 2.11 the log message indicates that a virtual disk was attached to a virtual machine, and a subsequent log message indicates that a virtual disk was detached from a virtual machine. • For test procedure step 2.12 the log message indicates that a virtual machine was created. • For test procedure step 2.13 the log message indicates that a virtual machine was started. • For test procedure step 2.14 the log message indicates that a virtual machine was stopped. • For test procedure step 2.15 the log message indicates that a virtual machine was deleted. • For test procedure step 2.16 the log message indicates that an image was added to the container repository. • For test procedure step 2.17 the log message indicates that an image was modified in the container repository. • For test procedure step 2.18 the log message indicates that an image was removed from the container repository. • For test procedure step 2.19 the log message indicates a container was created. • For test procedure step 2.20 the log message indicates that a container was started. • For test procedure step 2.21 the log message indicates that a container was stopped. • For test procedure step 2.22 the log message indicates that a container was restarted. • For test procedure step 2.23 the log message indicates that a container was deleted. • For test procedure step 2.24 the log message indicates that a container volume was created. • For test procedure step 2.25 the log message indicates that a container volume was mounted. • For test procedure step 2.26 the log message indicates that a container volume was deleted. Expected format of evidence: Generated Log Files from DUT/s.
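The per-step checks above lend themselves to a simple mapping between procedure steps and expected log-message patterns. The following informative Python sketch shows this for a subset of the steps; the regular expressions and the log file name are illustrative assumptions, as the actual message wording is implementation specific.

# Informative sketch: map a subset of the test procedure steps to log-message
# patterns and check each one against the collected O-Cloud security log.
import re

EXPECTED = {
    "2.3 disable port":      r"port .* (disabled|admin[- ]down)",
    "2.12 create VM":        r"(instance|virtual machine) .* created",
    "2.19 create container": r"container .* created",
    "2.26 delete volume":    r"volume .* (deleted|removed)",
}

with open("ocloud_security.log", encoding="utf-8") as handle:
    log_text = handle.read()

for step, pattern in EXPECTED.items():
    found = re.search(pattern, log_text, re.IGNORECASE) is not None
    print(f"{'PASS' if found else 'FAIL'}: step {step}")

The same approach extends to the event tests in clauses 8.4.5 through 8.4.8, by swapping in the step-to-pattern mapping of the test case being executed.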
8.4.5 Application Security Events
Requirement Name: Application Security Events to be Logged. Requirement Reference: 'REQ-SEC-SLM-APP-EVT-1', 'REQ-SEC-SLM-APP-EVT-2' clause 5.3.8.11.4 in O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Support for the logging of security events in network functions Threat References: T-OPENSRC-01, T-xAPP-01, T-xAPP-02, T-xAPP-03, T-xAPP-04, T-rAPP-01, T-rAPP-02, T- rAPP-03, T-rAPP-04, T-rAPP-05, T-rAPP-06, T-rAPP-07, T-PNF-01. DUT/s: Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU. Test Name: TC_Logs_Application_Security_Events Test Description The security log contains log messages pertaining to application events that have security utility. Purpose: The purpose of the test is to verify the logging of security event data from O-RAN Network Functions as per the Security Requirements and Controls Specifications [5]. Test setup and configuration DUT is any O-RAN network function, i.e. Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU. A tester will have access to testing equipment that can connect to any O-RAN network function. Test procedure NOTE: Test procedure steps not applicable to the DUT may be skipped. 1) Login to the DUT via test equipment with authorized credentials. 2) Conduct an operation on the DUT that is known to generate an error. 3) Conduct an operation on the DUT that is known to load a dynamic library. Expected results All the security logs produced by O-RAN Network Functions contain log messages that pertain to the actions taken in the test procedure steps. • For test procedure step 2 the log message contains an error message. • For test procedure step 3 the log message contains a message indicating that a dynamic library loaded and details about that library. Expected format of evidence: Generated Log Files from DUT/s.
8.4.6 Data Access Security Events
Requirement Name: Data Access Security Events to be Logged. Requirement Reference: 'REQ-SEC-SLM-DAT-EVT-1', 'REQ-SEC-SLM-DAT-EVT-2', 'REQ-SEC-SLM-DAT-EVT-3', 'REQ-SEC-SLM-DAT-EVT-4', 'REQ-SEC-SLM-DAT-EVT-5', 'REQ-SEC-SLM-DAT-EVT-6', 'REQ-SEC-SLM-DAT-EVT-7', 'REQ-SEC-SLM-DAT-EVT-8' clause 5.3.8.11.5 in O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Logging of data access security events in O-RAN elements. Threat References: T-VM-C-01, T-NEAR-RT-03, T-O-RAN-07, T-O-RAN-08, T-GEN-05 DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud Test Name: TC_Logs_Data_Access_Security_Events Test Description The security log contains log messages pertaining to data access that have security utility. Purpose: The purpose of the test is to verify the logging of data access security events from O-RAN elements as per the Security Requirements and Controls Specifications [5]. Test setup and configuration A tester will have access to testing equipment that can communicate securely with the DUT and is able to perform security and administrative related operations. Test procedure NOTE: Test procedure steps not applicable to the DUT may be skipped. 1. Login to the DUT via testing equipment with authorized credentials. 2. Execute the following operations on the DUT. 2.1. Add a new file. 2.2. Delete an existing file. 2.3. Attempt to add a file in an unauthorized location. 2.4. Attempt to delete a file from an unauthorized location. 2.5. Read an existing file. 2.6. Write to an existing file. 2.7. Attempt to read a file in an unauthorized location. 2.8. Attempt to write to a file in an unauthorized location. 2.9. Create a new directory. 2.10. Delete an existing directory. 2.11. Attempt to create a directory in an unauthorized location. 2.12. Attempt to delete a directory from an unauthorized location. 2.13. Add data to a datastore or database. 2.14. Delete data from a datastore or database. 2.15. Attempt to add data to a datastore or database in an unauthorized location. 2.16. Attempt to delete data from a datastore or database from an unauthorized location. 2.17. Read data from a datastore or database. 2.18. Write data to a datastore or database. 2.19. Attempt to read data from a datastore or database from an unauthorized location. 2.20. Attempt to write data to a datastore or database in an unauthorized location. 2.21. Make a permissions change to a file. 2.22. Make a permissions change to a directory. 2.23. Make a permissions change to a datastore or database. Expected results All the security logs produced by O-RAN elements contain log messages that document appropriately the actions taken in the test procedure steps. • For test procedure step 2.1 the log message indicates that a new file was added. • For test procedure step 2.2 the log message indicates an existing file was deleted. • For test procedure step 2.3 the log message indicates an unauthorized attempt to add a file. • For test procedure step 2.4 the log message indicates an unauthorized attempt to delete a file. • For test procedure step 2.5 the log message indicates an existing file was read. • For test procedure step 2.6 the log message indicates an existing file was written. • For test procedure step 2.7 the log message indicates an unauthorized attempt to read a file. • For test procedure step 2.8 the log message indicates an unauthorized attempt to write to a file.
• For test procedure step 2.9 the log message indicates a new directory was created. • For test procedure step 2.10 the log message indicates an existing directory was deleted. • For test procedure step 2.11 the log message indicates an unauthorized attempt to create a directory. • For test procedure step 2.12 the log message indicates an unauthorized attempt to delete a directory. • For test procedure step 2.13 the log message indicates data was added to a datastore or database. • For test procedure step 2.14 the log message indicates data was deleted from a datastore or database. • For test procedure step 2.15 the log message indicates an unauthorized attempt to add data to a datastore or database. • For test procedure step 2.16 the log message indicates an unauthorized attempt to delete data from a datastore or database. • For test procedure step 2.17 the log message indicates that data was read from a datastore or database. • For test procedure step 2.18 the log message indicates that data was written to a datastore or database. • For test procedure step 2.19 the log message indicates an unauthorized attempt to read data from a datastore or database. • For test procedure step 2.20 the log message indicates an unauthorized attempt to write data to a datastore or database. • For test procedure step 2.21 the log message indicates a permissions change to a file. • For test procedure step 2.22 the log message indicates a permissions change to a directory. • For test procedure step 2.23 the log message indicates a permissions change to a datastore or database. Expected format of evidence: Generated Log Files from DUT.
8.4.7 Account and Identity Security Events
Requirement Name: Account and Identity Security Events to be Logged. Requirement Reference: 'REQ-SEC-SLM-AAI-EVT-1', 'REQ-SEC-SLM-AAI-EVT-2', 'REQ-SEC-SLM-AAI-EVT-3', 'REQ-SEC-SLM-AAI-EVT-4', 'REQ-SEC-SLM-AAI-EVT-5', 'REQ-SEC-SLM-AAI-EVT-6', 'REQ-SEC-SLM-AAI-EVT-7', 'REQ-SEC-SLM-AAI-EVT-8', 'REQ-SEC-SLM-AAI-EVT-9', 'REQ-SEC-SLM-AAI-EVT-10' clause 5.3.8.11.6 in O-RAN Security Requirements and Controls Specifications [5]. Requirement Description: Logging of account and identity security events in O-RAN elements. Threat References: T-GEN-02, T-O-RAN-02, T-O-RAN-06, T-O-RAN-07, T-ProtocolStack-02, T-SMO-02, T-SMO-05, T-SMO-08, T-SMO-25, T-SMO-30, T-NEAR-RT-03. DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud. Test Name: TC_Logs_Account_and_Identity_Security_Events. Test Description The security log contains log messages pertaining to account and identity events that have security utility. Purpose: The purpose of the test is to verify the logging of account and identity access security events from O-RAN elements as per the Security Requirements and Controls Specifications [5]. Test setup and configuration A tester will have access to testing equipment that can communicate securely with the DUT and is able to perform security and administrative related operations. Test procedure NOTE: Test procedure steps not applicable to the DUT may be skipped. 1. Login to the DUT via testing equipment with authorized credentials. 2. Execute the following operations on the DUT. 2.1. Create an account. 2.2. Modify an existing account. 2.3. Delete an existing account. 2.4. Attempt to create an account in an unauthorized location. 2.5. Change the privilege level of an existing account from a lower privilege to a higher privilege. 2.6. Attempt to change the privilege level of an existing account in an unauthorized location. 2.7. Change the group membership of an existing account. 2.8. Attempt to change the group membership of an existing account in an unauthorized location. 2.9. Use a function in the DUT that requires a specific assigned authorization. 2.10. Attempt to use a function in the DUT that requires a specific unassigned authorization. 2.11. Authenticate an account to the DUT that has been configured to access that DUT. 2.12. Attempt to authenticate an account to the DUT that has not been configured to access that DUT. 2.13. Change the privilege level of an existing account from a higher privilege to a lower privilege. 2.14. Access the DUT with an account that does not require authentication. 2.15. End a session with the DUT. Expected results All the security logs produced by O-RAN elements contain log messages that document appropriately the actions taken in the test procedure steps. • For test procedure step 2.1 the log message indicates that an account was created. • For test procedure step 2.2 the log message indicates that an existing account was modified. • For test procedure step 2.3 the log message indicates that an existing account was deleted. • For test procedure step 2.4 the log message indicates an unauthorized attempt to create an account. • For test procedure step 2.5 the log message indicates a privilege level change of an existing account from a lower privilege to a higher privilege. • For test procedure step 2.6 the log message indicates an unauthorized attempt to change the privilege level of an existing account.
• For test procedure step 2.7 the log message indicates that the group membership had changed for an existing account. • For test procedure step 2.8 the log message indicates an unauthorized attempt to change the group membership of an existing account. • For test procedure step 2.9 the log message indicates the use of a restricted function. • For test procedure step 2.10 the log message indicates an unauthorized attempt to use a restricted function. • For test procedure step 2.11 the log message indicates the successful authentication of an account. • For test procedure step 2.12 the log message indicates the unsuccessful attempt to authenticate an account. • For test procedure step 2.13 the log message indicates a privilege level change of an existing account from a higher privilege to a lower privilege. • For test procedure step 2.14 the log message indicates access with an account that does not require authentication. • For test procedure step 2.15 the log message indicates the end of a session. Expected format of evidence: Generated Log Files from DUT.
8.4.8 General Security Events
Requirement Name: General Security Events to be Logged. Requirement Reference: 'REQ-SEC-SLM-GSE-1', 'REQ-SEC-SLM-GSE-2', 'REQ-SEC-SLM-GSE-3', 'REQ-SEC-SLM-GSE-4', 'REQ-SEC-SLM-GSE-5', 'REQ-SEC-SLM-GSE-6' clause 5.3.8.11.7 in O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Logging of general security events in O-RAN elements. Threat References: T-O-RAN-01, T-O-RAN-02, T-O-RAN-03, T-O-RAN-08, T-GEN-02, T-VM-C-01, T-VM-C-04, T-VM-C-06, T-IMG-01, T-IMG-04, T-VL-01, T-VL-02, T-xAPP-01, T-rAPP-03, T-HW-02. DUT/s: SMO, Non-RT RIC, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, O-Cloud Test Name: TC_General_Security_Events_Logged Test Description The security log contains log messages pertaining to general security events. Purpose: The purpose of the test is to verify the logging of general security events from O-RAN elements as per the Security Requirements and Controls Specifications [5]. Test setup and configuration A tester will have access to testing equipment that can communicate securely with the DUT and is able to perform security and administrative related operations. Test procedure NOTE: Test procedure steps not applicable to the DUT may be skipped. 1. Login to the DUT via testing equipment with authorized credentials. 2. Execute the following operations on the DUT. 2.1. Enable security software such as firewalls, malware protection, data loss prevention or intrusion detection systems. 2.2. Disable security software such as firewalls, malware protection, data loss prevention or intrusion detection systems. 2.3. Log into the DUT using an account with administrative privileges and perform a function that requires those privileges. 2.4. Make a change to the security configuration of the DUT. 2.5. View a certificate or key on the DUT. 2.6. Export a certificate or key from the DUT. 2.7. Renew a certificate or key on the DUT. 2.8. Import a certificate or key to the DUT. 2.9. Modify a certificate or key on the DUT. 2.10. Delete a certificate or key from the DUT. 2.11. Perform a cryptographic operation on the DUT that involves signatures, encryption, hashing, key generation or key destruction. 2.12. Submit a security patch to the DUT but do not apply it. Expected results All the security logs produced by O-RAN elements contain log messages that document appropriately the actions taken in the test procedure steps. • For test procedure steps 2.1 and 2.2 the log message indicates that the security software has been enabled or disabled. • For test procedure step 2.3 the log message indicates the use of administrative privileges. • For test procedure step 2.4 the log message indicates that a change to the security configuration has occurred and the nature of the change. • For test procedure steps 2.5 through 2.11 the log message is absent of any sensitive information related to the certificate or key. • For test procedure step 2.12 the log message indicates that a security patch was submitted but not applied. Expected format of evidence: Generated Log Files from DUT.
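The check for steps 2.5 through 2.11, namely that no sensitive key or certificate material leaks into the logs, can be supported by a simple scan. The following informative Python sketch searches the collected log for common markers of private key material; the patterns and the log file name are illustrative assumptions and do not form an exhaustive list.

# Informative sketch: scan the collected log file for markers of key material
# that must not appear in security logs.
import re

SENSITIVE = [
    r"-----BEGIN (RSA |EC )?PRIVATE KEY-----",
    r"\bprivate[_ ]key\s*[:=]",
    r"\bpassphrase\s*[:=]",
]

with open("dut_security.log", encoding="utf-8") as handle:
    log_text = handle.read()

leaks = [p for p in SENSITIVE if re.search(p, log_text)]
print("PASS: no key material markers found in the logs" if not leaks
      else f"FAIL: sensitive markers present: {leaks}")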
8.4.9 Storage
Requirement Name: Secure storage of security log data
Requirement Reference: SEC-CTL-SLM-SST-1, SEC-CTL-SLM-SST-2, O-RAN.WG11.Security Requirements and Controls Specification [5], clause 5.3.8.5
Requirement Description: Support for secure storage of security log data
Threat References: T-O-RAN-07, T-O-RAN-08
DUT/s: Centralized log server
Test Name: TC_Logs_Secure_Storage
Test Description
Purpose: To verify whether the storage of security logs is tamper-proof in centralized log servers as per clause 5.3.8.5 of [5]. These storage systems can be centralized logging servers or cloud-based services.
Test setup and configuration
The DUT is the centralized log server where security log data from the O-RAN components are stored. The client is the test system equipped to communicate securely with the DUT.
Preconditions: The log storage system (centralized log server) is implemented and operational. User accounts with appropriate access levels have been provisioned on the DUT. A list of authorized personnel who should have access to the log system has been identified and documented.
Test procedure
Table 8.4.9-1: Scenarios to be executed
Scenario ID | Configuration
1 | Login to the DUT via the test system with authorized credentials.
2 | Create a test account with unauthorized/invalid credentials and attempt login access to the DUT.
3 | Create a test account with insufficient privileges and attempt login access to the DUT.
4 | Attempt login access to the DUT with a revoked account (if any earlier account was revoked on the DUT).
5 | Attempt login access to the DUT with authorized credentials after attempting with the revoked account.
Expected results
Table 8.4.9-2: Expected results
Scenario ID | Expected result | Reason
1 | Connection established. Success event is logged by the DUT, and the log fields are as per clause 5.3.8.8 of [5] | Authentication successful.
2 | Connection not established. Failure event is logged by the DUT, and the log fields are as per clause 5.3.8.8 of [5] | Authentication failure due to invalid credentials.
3 | Connection not established. Failure event is logged by the DUT, and the log fields are as per clause 5.3.8.8 of [5] | Authentication failure due to insufficient privileges.
4 | Connection not established. Failure event is logged by the DUT, and the log fields are as per clause 5.3.8.8 of [5] | Authentication failure due to invalid credentials.
5 | Connection established. Success event is logged by the DUT, and the log fields are as per clause 5.3.8.8 of [5] | Authentication successful.
Expected format of evidence: Log files
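EXAMPLE (informative): The five login scenarios of table 8.4.9-1 can be driven from the test system with an SSH client library, assuming the centralized log server exposes an SSH management interface. The host name, account names and passwords below are placeholders; the pass/fail mapping of table 8.4.9-2 and the log-field check per clause 5.3.8.8 of [5] still have to be confirmed on the DUT.
# Informative sketch: exercise the access-control scenarios of table 8.4.9-1 over SSH.
# Host, usernames and passwords are placeholders for lab-specific values.
import paramiko

DUT = "logserver.lab.example"
SCENARIOS = [
    ("1 authorized",              "audit_admin",  "CorrectPass!"),
    ("2 invalid credentials",     "audit_admin",  "WrongPass"),
    ("3 insufficient privileges", "lowpriv_user", "LowPrivPass!"),
    ("4 revoked account",         "revoked_user", "OldPass!"),
    ("5 authorized (re-test)",    "audit_admin",  "CorrectPass!"),
]

for label, user, password in SCENARIOS:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    try:
        client.connect(DUT, username=user, password=password,
                       look_for_keys=False, allow_agent=False, timeout=10)
        print(label, "-> connection established")
    except paramiko.AuthenticationException:
        print(label, "-> connection not established (authentication failure)")
    finally:
        client.close()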
9 Software security evaluation for O-RAN components
9.1 Overview
This clause contains a set of software security evaluations of an O-RAN component, covering Software Lifecycle Management. The objects in scope of these software security evaluations are SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU and O-RU.
9.2 Open-Source Software Component Analysis
Void.
9.3 Binary Static Analysis
Void.
9.4 Software Bill of Materials (SBOM)
9.4.1 SBOM Signature
Requirement Name: A digital signature is provided for the SBOM.
Requirement Reference: REQ-SBOM-007, REQ-SBOM-011, clause 6.3.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: The SBOM is protected for authenticity and integrity and provided in a standard format.
Threat References: T-O-RAN-09
DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, rApp, xApp
Test name: TC_SBOM_Signature
Test description and applicability
Open RAN software producers shall provide the SBOM for every O-RAN software delivery, including patches, to the network operator. The SBOM shall be digitally signed.
Purpose: To verify the SBOM is provided with a digital signature.
Test setup and configuration
SBOM is provided. Tools to verify the SBOM are available.
Test procedure
Ensure the SBOM is provided with a digital signature in the format described below. Verify that the SBOM digital signature is valid using the software provider's public key or certificate. Depending on the format of the SBOM, there are various ways to include and verify its digital signature; the digital signature methods are:
SPDX
- YAML, RDF and tag data: the signature is in a separate file from the SPDX file (example: foo.spdx has foo.spdx.sig containing its signature). The digital signature format shall be CMS/PKCS#7/CAdES.
- XML: XML Signature 2.0
- JSON: JSON Web Signature (JWS), and JSON Signature Format (JSF)
CycloneDX
- XML: XML Signature 2.0
- JSON: JSON Web Signature (JWS), and JSON Signature Format (JSF)
SWID
- XML: XML Signature 2.0
Expected results
Digital signature of the SBOM shall be valid.
Expected format of evidence: Log file, screenshot, or report file.
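EXAMPLE (informative): For the SPDX tag-value case (foo.spdx with a detached foo.spdx.sig in CMS/PKCS#7 format), the check can be scripted as below. This is a sketch only: the file names, the DER encoding of the signature and the CA bundle name are assumptions, and equivalent checks are needed for the XML Signature and JWS/JSF variants.
# Informative sketch: verify a detached CMS/PKCS#7 signature over an SPDX file
# using the openssl CLI. File names and CA bundle are placeholders.
import subprocess

def verify_spdx_signature(sbom="foo.spdx", sig="foo.spdx.sig", ca="supplier_ca.pem"):
    result = subprocess.run(
        ["openssl", "cms", "-verify",
         "-binary",                 # do not translate line endings
         "-inform", "DER",          # assumption: signature delivered as DER
         "-in", sig,
         "-content", sbom,          # detached content
         "-CAfile", ca,
         "-out", "/dev/null"],
        capture_output=True, text=True)
    print("signature valid" if result.returncode == 0 else result.stderr.strip())
    return result.returncode == 0

if __name__ == "__main__":
    verify_spdx_signature()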
9.4.2 SBOM Data Fields
Requirement Name: Data fields are according to NTIA guidance [13]
Requirement Reference: REQ-SBOM-002, REQ-SBOM-011, clause 6.3.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: A minimum set of data fields is included in the SBOM and it is in a standard format.
Threat References: T-O-RAN-09
DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, rApp, xApp
Test Name: TC_SBOM_Data_Fields
Test description and applicability
Open RAN software producers shall provide the SBOM for every O-RAN software delivery to the network operator, including patches. The minimum set of data fields shall be present.
Purpose: To verify that the minimum set of data fields is included in the SBOM.
Test setup and configuration
SBOM file is provided. Tools to verify the data fields are available.
Test procedure
Run the SBOM check tool and verify that the minimum set of data fields is present in the SBOM, depending on the SBOM format used.
Table 9.4.2-1: Minimum set of data fields for SPDX [12]
NTIA field | NTIA description | SPDX 2.2.1 field
Supplier Name | The name of an entity that creates, defines, and identifies components | PackageSupplier
Component Name | Designation assigned to a unit of software defined by the original supplier | PackageName
Version of the Component | Identifier used by the supplier to specify a change in software from a previously identified version | PackageVersion
Other Unique Identifiers | Other identifiers that are used to identify a component, or serve as a look-up key for relevant databases | SPDXID (Package SPDX Identifier)
Dependency Relationship | Characterizing the relationship that an upstream component X is included in software Y | Relationship: CONTAINS
Author of SBOM Data | The name of the entity that creates the SBOM data for this component | Creator
Timestamp | Record of the date and time of the SBOM data assembly | Created
Table 9.4.2-2: Minimum set of data fields for CycloneDX [13]
NTIA field | NTIA description | CycloneDX field
Supplier Name | The name of an entity that creates, defines, and identifies components | publisher
Component Name | Designation assigned to a unit of software defined by the original supplier | name
Version of the Component | Identifier used by the supplier to specify a change in software from a previously identified version | version
Other Unique Identifiers | Other identifiers that are used to identify a component, or serve as a look-up key for relevant databases | bom/serialNumber and component/bom-ref
Dependency Relationship | Characterizing the relationship that an upstream component X is included in software Y | (Nested assembly/subassembly and/or dependency graphs)
Author of SBOM Data | The name of the entity that creates the SBOM data for this component | bom-descriptor:metadata/manufacture/contact
Timestamp | Record of the date and time of the SBOM data assembly | timestamp
Table 9.4.2-3: Minimum set of data fields for SWID [13]
NTIA field | NTIA description | SWID tag
Supplier Name | The name of an entity that creates, defines, and identifies components | <Entity> @role (softwareCreator/publisher), @name
Component Name | Designation assigned to a unit of software defined by the original supplier | <softwareIdentity> @name
Version of the Component | Identifier used by the supplier to specify a change in software from a previously identified version | <softwareIdentity> @version
Other Unique Identifiers | Other identifiers that are used to identify a component, or serve as a look-up key for relevant databases | <softwareIdentity> @tagID
Dependency Relationship | Characterizing the relationship that an upstream component X is included in software Y | <Link> @rel, @href
Author of SBOM Data | The name of the entity that creates the SBOM data for this component | <Entity> @role (tagCreator), @name
Timestamp | Record of the date and time of the SBOM data assembly | -
This test is part of the O-RAN software producer's Software Development Lifecycle (SDLC).
Expected results
Minimum set of data fields is present.
Expected format of evidence: Log file, screenshot, or report file.
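EXAMPLE (informative): A simple field-presence check for the CycloneDX JSON case can be scripted as below; SPDX and SWID need analogous parsers. The file name sbom.json and the exact field paths are assumptions and should be aligned with the CycloneDX version in use and with table 9.4.2-2.
# Informative sketch: check NTIA minimum data fields in a CycloneDX JSON SBOM.
# File name and field paths are assumptions, not normative values.
import json

def check_cyclonedx(path="sbom.json"):
    bom = json.load(open(path, encoding="utf-8"))
    meta = bom.get("metadata", {})
    document_checks = {
        "Timestamp": bool(meta.get("timestamp")),
        "Author of SBOM Data": bool(meta.get("authors") or meta.get("manufacture")),
        "Unique identifier (serialNumber)": bool(bom.get("serialNumber")),
        "Dependency relationships": bool(bom.get("dependencies")),
    }
    for comp in bom.get("components", []):
        missing = [f for f in ("publisher", "name", "version", "bom-ref") if not comp.get(f)]
        if missing:
            print("component", comp.get("name", "?"), "missing:", ", ".join(missing))
    for field, present in document_checks.items():
        print(field, "present" if present else "MISSING")

if __name__ == "__main__":
    check_cyclonedx()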
9.4.3 SBOM Format
Requirement Name: SBOM is provided in one of the accepted formats: SPDX, CycloneDX, or SWID.
Requirement Reference: REQ-SBOM-11, clause 6.3.1, O-RAN Security Requirements and Controls Specification [5]
Requirement Description: SBOM is provided in a standard format.
Threat References: T-O-RAN-09
DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, rApp, xApp
Test Name: TC_SBOM_Format
Test description and applicability
Open RAN software producers shall provide the SBOM for every O-RAN software delivery in one of three accepted formats: Software Package Data eXchange (SPDX) [i.2], CycloneDX [i.3], or Software Identification (SWID) [i.4].
Purpose: To verify that the SBOM is provided in one of these formats.
Test setup and configuration
SBOM file provided for the O-RAN software delivery, and the SBOM check tool is available.
Test procedure
Run the SBOM check tool to verify the SBOM format.
Expected results
SBOM format is SPDX, CycloneDX, or SWID.
Expected format of evidence: Screenshots and/or report file.
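EXAMPLE (informative): A lightweight format classification can be sketched as below; it relies on well-known markers (spdxVersion, bomFormat, SoftwareIdentity, SPDXVersion:) and is a heuristic only, so a proper conformance tool should confirm the result. The input file name is a placeholder.
# Informative sketch: classify an SBOM file as SPDX, CycloneDX or SWID by simple markers.
import json, xml.etree.ElementTree as ET

def detect_format(path):
    data = open(path, encoding="utf-8", errors="replace").read()
    stripped = data.lstrip()
    if stripped.startswith("{"):
        doc = json.loads(data)
        if "spdxVersion" in doc:
            return "SPDX (JSON)"
        if doc.get("bomFormat") == "CycloneDX":
            return "CycloneDX (JSON)"
    elif stripped.startswith("<"):
        tag = ET.fromstring(data).tag.lower()
        if "bom" in tag:
            return "CycloneDX (XML)"
        if "softwareidentity" in tag:
            return "SWID (XML)"
    elif "SPDXVersion:" in data:
        return "SPDX (tag-value)"
    return "unknown"

print(detect_format("sbom_delivery_file"))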
9.4.4 SBOM Depth
Requirement Name: SBOM depth is at the required level.
Requirement Reference: REQ-SBOM-004, REQ-SBOM-005, REQ-SBOM-006, clause 6.3.1, O-RAN Security Requirements and Controls Specification [5]
Requirement Description: The SBOM depth is at the level required for the different types of software.
Threat References: T-O-RAN-09
DUT/s: SMO, Near-RT RIC, O-CU-CP, O-CU-UP, O-DU, O-RU, rApp, xApp
Test Name: TC_SBOM_Depth
Test description and applicability
Open RAN software producers provide the SBOM for every O-RAN software delivery, including patches, to the network operator. SBOM depth is provided at top level for every O-RAN software delivery, and SBOM depth is provided to a second level for any O-RAN Software Community or open-source software.
Purpose: To verify that the SBOM depth is provided to the level specified.
Test setup and configuration
SBOM file provided for the O-RAN software delivery, and the SBOM check tool is available.
Test procedure
Run the SBOM check tool to verify the SBOM depth provided. At a minimum, all top-level dependencies are listed.
Expected results
SBOM depth is as specified in the requirements:
• top level for every O-RAN software delivery;
• second level for any O-RAN Software Community or open-source software.
Expected format of evidence: Log file, screenshot, or report file.
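EXAMPLE (informative): For a CycloneDX JSON SBOM, the depth can be estimated from the dependencies graph as sketched below; a result of at least 1 indicates top-level dependencies are listed and at least 2 indicates second-level coverage. The file name and the presence of metadata.component and a dependencies array are assumptions about the delivered SBOM.
# Informative sketch: measure dependency depth in a CycloneDX JSON SBOM.
import json

def max_depth(path="sbom.json"):
    bom = json.load(open(path, encoding="utf-8"))
    graph = {d["ref"]: d.get("dependsOn", []) for d in bom.get("dependencies", [])}
    root = bom.get("metadata", {}).get("component", {}).get("bom-ref")

    def depth(ref, seen):
        if ref in seen:                      # guard against cyclic references
            return 0
        children = graph.get(ref, [])
        if not children:
            return 0
        return 1 + max(depth(c, seen | {ref}) for c in children)

    d = depth(root, set()) if root else 0
    print("dependency depth below the delivered component:", d)
    return d

if __name__ == "__main__":
    max_depth()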
9.4.5 SBOM completeness check
Requirement Name: The SBOM for each O-RAN NF shall comprehensively and accurately list all sub-components, libraries, and dependencies to ensure a complete representation of the software composition.
Requirement Reference & Description: 'REQ-SBOM-002' clause 6.3 in O-RAN Security Requirements and Controls Specifications [5]
Threat References: 'T-O-RAN-08, T-O-RAN-09' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_SBOM_COMPLETENESS_CHECK
Test Description and Applicability
Purpose: The purpose of this test is to validate that the SBOM for each component comprehensively lists all sub-components, libraries, and dependencies.
Test Setup and Configuration
• Tools: SBOM validation tools, manual review tools.
• Data: Access to each component's SBOM.
Test Procedure
1) Open and review each SBOM.
2) Ensure that all sub-components, libraries, and dependencies are listed.
3) Cross-reference the SBOM with the actual component to verify no elements are omitted.
4) Document any discrepancies or missing elements.
Expected Results:
• The SBOM for each component is complete, with no omissions.
Expected format of evidence:
• A report detailing:
- Each component and its SBOM.
- Any discrepancies or missing elements.
- Recommendations for SBOM completion.
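EXAMPLE (informative): Step 3 of the test procedure can be supported by diffing the SBOM component list against an inventory extracted from the deployed image (for instance a package-manager listing). The sketch below assumes a CycloneDX JSON SBOM and a plain-text inventory file, both placeholders.
# Informative sketch: cross-check SBOM component names against an inventory
# extracted from the deployed image.
import json

def completeness(sbom_path="sbom.json", inventory_path="image_packages.txt"):
    bom = json.load(open(sbom_path, encoding="utf-8"))
    sbom_names = {c.get("name", "").lower() for c in bom.get("components", [])}
    inventory = {line.strip().lower() for line in open(inventory_path) if line.strip()}
    missing_from_sbom = sorted(inventory - sbom_names)
    not_installed = sorted(sbom_names - inventory)
    print("present in image but missing from SBOM:", missing_from_sbom or "none")
    print("listed in SBOM but not found in image:", not_installed or "none")

if __name__ == "__main__":
    completeness()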
9.4.6 SBOM version verification
Requirement Name: The version in the SBOM shall accurately match the actual O-RAN NF version. Requirement Reference & Description: 'REQ-SBOM-002' clause 6.3 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-08, T-O-RAN-09' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud Test Name: TC_SBOM_VERSION_VERIFICATION Test Description and Applicability ETSI ETSI TS 104 105 V7.0.0 (2025-06) 92 Purpose: The purpose of this test is to ensure the SBOM reflects the current version of the component. Test Setup and Configuration • Tools: Version control systems, manual review tools. • Data: Inventory of all O-RAN components, their versions, and their SBOMs. Test Procedure: 1) For each component, compare the version listed in the SBOM with the actual component version. 2) Ensure that the SBOM's version matches the component's version. 3) Document any discrepancies. Expected Results: • The version specified in the SBOM aligns with the actual version of the component. Expected format of evidence: • A report detailing: - Each component, its version, and its SBOM version. - Any discrepancies between the two versions. - Recommendations for version alignment.
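EXAMPLE (informative): The version comparison of the test procedure can be scripted as below, assuming a CycloneDX JSON SBOM and that the NF version has been exported to a text file from the component's version command or management interface; both file names are placeholders.
# Informative sketch: compare the version recorded in a CycloneDX SBOM with the
# version reported by the O-RAN NF.
import json

def verify_version(sbom_path="sbom.json", reported_version_path="nf_version.txt"):
    bom = json.load(open(sbom_path, encoding="utf-8"))
    sbom_version = bom.get("metadata", {}).get("component", {}).get("version")
    nf_version = open(reported_version_path).read().strip()
    match = sbom_version == nf_version
    print(f"SBOM version: {sbom_version}  NF version: {nf_version}  match: {match}")
    return match

if __name__ == "__main__":
    verify_version()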
9.4.7 SBOM vulnerability cross check
Requirement Name: All components listed in the SBOM shall be checked against known vulnerability databases to identify and document any associated security risks. Requirement Reference & Description: 'REQ-SBOM-003' clause 6.3 in O-RAN Security Requirements and Controls Specifications [5] Threat References: 'T-O-RAN-08, T-O-RAN-09' clause 7.4.1.1 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud Test Name: TC_SBOM_VULN_CROSS_CHECK Test Description and Applicability Purpose: The purpose of this test is to cross-reference the components listed in the SBOM with known vulnerability databases. Test Setup and Configuration: • Tools: Vulnerability scanning tools like NVD, Snyk, or OWASP Dependency-Check. • Data: Access to each component's SBOM. Test Procedure: 1) Extract a list of components and their versions from the SBOM. 2) Use the vulnerability scanning tool to check for known vulnerabilities associated with each component/version. 3) Document any vulnerabilities found, noting their severity and potential impact. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 93 Expected Results: • A list of vulnerabilities, if any, associated with the components listed in the SBOM. Expected format of evidence: • A comprehensive report detailing: - Each component and its version from the SBOM. - Vulnerabilities found. - Severity and potential impact of each vulnerability. Recommendations for mitigation or patching.
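EXAMPLE (informative): One way to automate step 2 is to query a public vulnerability database for each component/version pair from the SBOM. The sketch below uses the OSV.dev query endpoint as one example; NVD, Snyk or OWASP Dependency-Check can be substituted. The endpoint URL, the absence of an ecosystem qualifier and the file name are assumptions to be adapted to the components under test.
# Informative sketch: query the OSV.dev vulnerability database for each
# component/version pair extracted from a CycloneDX JSON SBOM.
import json, requests

OSV_URL = "https://api.osv.dev/v1/query"

def cross_check(sbom_path="sbom.json"):
    bom = json.load(open(sbom_path, encoding="utf-8"))
    for comp in bom.get("components", []):
        name, version = comp.get("name"), comp.get("version")
        if not name or not version:
            continue
        reply = requests.post(OSV_URL, json={"version": version,
                                             "package": {"name": name}}, timeout=30)
        vulns = reply.json().get("vulns", [])
        if vulns:
            ids = ", ".join(v.get("id", "?") for v in vulns)
            print(f"{name} {version}: {len(vulns)} known vulnerabilities ({ids})")
        else:
            print(f"{name} {version}: none found")

if __name__ == "__main__":
    cross_check()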
9.4.8 SBOM Delivery
Requirement Name: SBOM provided with all O-RAN Software. Requirement Reference: REQ-SBOM-001, clause 6.3, O-RAN Security Requirements and Controls Specifications [5] Requirement Description: The O-RAN vendor shall provide the SBOM with every O-RAN software delivery package, including patches. Threat References: T-O-RAN-09 DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud Test Name: TC_SBOM_Delivery_with_O-RAN_Software Test description and applicability Purpose: The purpose of this test is to ensure that every O-RAN component is accompanied by an SBOM. This is applicable to all components within the O-RAN system. Test setup and configuration • Tools: File explorer, documentation access tools, or automated SBOM detection tools. • Environment: A repository or directory containing all O-RAN components and their associated documentation or metadata. • Data: Inventory of all O-RAN components. Test procedure 1) Navigate to the directory or repository of each O-RAN component. 2) Look for associated files or documentation indicating the presence of an SBOM. 3) Validate the SBOM's content to ensure it is not just a placeholder. 4) Document any components that lack a genuine SBOM. Expected Results • Every O-RAN component has a genuine SBOM associated with it. Expected Format of Evidence: • A spreadsheet or report detailing: - Each component. - Status of its SBOM (Present/Absent). ETSI ETSI TS 104 105 V7.0.0 (2025-06) 94 - Notes on any discrepancies or issues found.
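EXAMPLE (informative): Steps 1, 2 and 4 of the test procedure can be assisted by walking the delivery repository and flagging components without an SBOM file next to their artifacts. The directory layout and SBOM file-name patterns below are assumptions about the local delivery structure; step 3 (checking the SBOM is not a placeholder) still requires manual or tool-based review.
# Informative sketch: flag delivered components that have no SBOM file alongside them.
import os, fnmatch

SBOM_PATTERNS = ("*.spdx", "*.spdx.json", "*cyclonedx*.json", "*.cdx.json", "*swid*.xml")

def audit(delivery_root="deliveries"):
    for component in sorted(os.listdir(delivery_root)):
        comp_dir = os.path.join(delivery_root, component)
        if not os.path.isdir(comp_dir):
            continue
        files = os.listdir(comp_dir)
        has_sbom = any(fnmatch.fnmatch(f.lower(), p) for f in files for p in SBOM_PATTERNS)
        print(f"{component}: SBOM {'Present' if has_sbom else 'Absent'}")

if __name__ == "__main__":
    audit()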
9.4.9 SBOM Vulnerabilities Field
Requirement Name: Vulnerabilities field omission in SBOMs.
Requirement Reference: REQ-SBOM-003, clause 6.3, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Vulnerabilities shall not be included as an additional data field because it would represent a static view from a specific point in time, while vulnerabilities are constantly evolving.
Threat References: T-O-RAN-09
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_SBOM_Vulnerabilities_Fields
Test description and applicability
A vulnerabilities field in an SBOM represents a snapshot of the vulnerabilities known at a particular moment in time. Therefore, a vulnerabilities field in the SBOM for O-RAN software should not be relied upon by the operator to determine the vulnerabilities of the software. Operators should perform their own vulnerability assessment.
Purpose: Verify that a vulnerabilities field is not included as an additional data field in the SBOM.
Test setup and configuration
SBOM file shall be provided for the O-RAN software delivery, and the SBOM check tool shall be available.
Test procedure
1) Access the SBOM file provided by the Solution Provider.
2) Verify that no vulnerabilities field exists within the SBOM.
Expected Results
There is no vulnerabilities field present in the SBOM.
Expected Format of Evidence: screenshot(s)
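EXAMPLE (informative): Step 2 can be checked as sketched below for the CycloneDX JSON case (a top-level "vulnerabilities" array), with a crude text scan as a fallback for other formats. Both the field name and the fallback keyword are heuristics, not normative, and the fallback may over-report.
# Informative sketch: confirm that an SBOM does not carry a vulnerabilities field.
import json

def has_vulnerabilities_field(path):
    text = open(path, encoding="utf-8", errors="replace").read()
    try:
        doc = json.loads(text)
        return "vulnerabilities" in doc      # CycloneDX JSON top-level array
    except ValueError:
        return "vulnerabilit" in text.lower()  # coarse fallback for other formats

print("vulnerabilities field present:", has_vulnerabilities_field("sbom_delivery_file"))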
9.4.10 SBOM OSC Components
Requirement Name: Verify OSC components included in SBOM for commercial software which uses O-RAN OSC components.
Requirement Reference: REQ-SBOM-008, clause 6.3, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Commercial software vendors using software from the O-RAN Software Community (OSC) shall provide an SBOM that includes the components used from the OSC.
Threat References: T-O-RAN-09
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_SBOM_OSC_Components
Test description and applicability
Purpose: Verify that commercial software containing O-RAN OSC components is associated with an SBOM with the O-RAN OSC components listed.
Test setup and configuration
Commercial software with OSC component(s). SBOM file shall be provided for the O-RAN software delivery, and the SBOM check tool shall be available.
Test procedure
1) Access the SBOM file provided by the Solution Provider.
2) Verify the SBOM is for the commercial software.
3) Verify O-RAN Software Community component(s) are listed in the SBOM.
Expected Results
OSC components are present in the SBOM.
Expected Format of Evidence: screenshot(s)
9.5 Software Image Signing and Verification
9.5.1 Software Image/Application Package Signing
Requirement Name: Any software image(s) of O-RAN components and/or apps shall be digitally signed by its provider for distribution and by the Service Provider for internal publishing.
Requirement Reference: REQ-SEC-ALM-PKG-2, REQ-SEC-ALM-PKG-4, REQ-SEC-ALM-PKG-8, REQ-SEC-ALM-SU-1, clause 5.3.2, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Application package shall be signed and verified for integrity and authenticity protection.
Threat References: T-IMG-01, T-VM-C-02, T-Near-RT-01, T-Near-RT-02, T-xAPP-02, T-rAPP-05, clause 7.4 in O-RAN Security Threat Modeling and Risk Assessment [3]
DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud
Test Name: TC_SW_Img_Pkg_Signing
Test description and applicability
The Open RAN software producer/provider provides the digitally signed image or Application package for its delivery, including new versions and/or patches, to the Service Provider. The Service Provider digitally signs the verified image or Application package delivered by the software producer for publishing into its catalogue visible to the SMO.
Purpose: Ensure the O-RAN software image or application package is digitally signed.
Test setup and configuration
Software image or Application package ready for signing.
Test procedure
Manually or using a software signing service, sign the image or Application package for distribution by the software producer or for internal publishing by the Service Provider. The following steps are to be followed:
1) Generate a key pair: an ephemeral key pair (prime256v1) is preferred.
a) Delete the private key (and the public key if a certificate is used) as soon as possible after signing.
2) Request a signing certificate: optional; a short-lived certificate is preferred.
3) Hash and sign the image or Application package: SHA-256 or stronger.
4) Upload the image or Application package and its digital signature(s) for distribution or publishing.
Expected results
The provider's digital signature of the software image or Application package is present in the image repository for distribution from the software producer; and the Service Provider's digital signature of the software image or Application package is present in the catalogue published by the O-RAN operator.
Expected Format of Evidence: screenshot(s)
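EXAMPLE (informative): A minimal signing flow matching steps 1) to 4) is sketched below with an ephemeral prime256v1 key and SHA-256, using the Python cryptography library. Requesting a short-lived signing certificate (step 2) and uploading to the repository (step 4) are environment-specific and only indicated by comments; file names are placeholders.
# Informative sketch: sign a software image with an ephemeral prime256v1 key and
# SHA-256, then discard the private key as soon as possible.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# 1) Generate an ephemeral P-256 (prime256v1) key pair.
private_key = ec.generate_private_key(ec.SECP256R1())

# 2) (Optional) request a short-lived signing certificate for the public key
#    from the signing CA - omitted here.

# 3) Hash and sign the image (sign() hashes internally with SHA-256).
image = open("oru_image.tar.gz", "rb").read()
signature = private_key.sign(image, ec.ECDSA(hashes.SHA256()))
open("oru_image.tar.gz.sig", "wb").write(signature)

# Export the public key so verifiers can validate the signature, then delete the
# private key object as soon as possible.
open("signer_pub.pem", "wb").write(
    private_key.public_key().public_bytes(
        serialization.Encoding.PEM,
        serialization.PublicFormat.SubjectPublicKeyInfo))
del private_key

# 4) Upload image + signature (+ certificate) to the distribution repository - omitted.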
9.5.2 Software Signature Verification
Requirement Name: Any software image(s) of O-RAN component(s) and/or app(s) shall be verified for its signature(s) by the operator for onboarding and/or instantiation process. Requirement Reference: REQ-SEC-ALM-PKG-5, REQ-SEC-ALM-PKG-6, REQ-SEC-ALM-PKG-8, REQ-SEC- ALM-SU-1, clause 5.3.2, O-RAN Security Requirements and Controls Specifications [5] Requirement Description: Application package shall be signed and verified for integrity and authenticity protection. Threat References: T-IMG-01, T-VM-C-02, T-Near-RT-01, T-Near-RT-02, T-xAPP-02, clause 7.4 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU, O-DU, O-CU, Near-RT RIC, xApp, rApp, Non-RT RIC, SMO, O-Cloud Test Name: TC_SW_Img_Pkg_Verification Test description and applicability O-RAN software image(s) or Application package distributed by the software producer/provider is authenticated by the Service Provider during the onboarding process with its signature verified. Both provider and Service Provider signatures of the O-RAN software image(s) or Application package is verified during the instantiation process. Purpose: Ensure signatures on O-RAN software image or application package are verified. Test setup and configuration Digitally signed software image or Application package with shared necessary digital certificates or public key is validated. EXAMPLE: Root CA certificate, any intermediate or RA certificates. Test procedure The signature of the software image or Application package is verified manually or using a software signing service. The software used to verify the signature(s) could be provided by software producer or internally published by the Service Provider. For image or Application package instantiation, Service Provider signature of the software image or Application package verification is executed first, followed by provider signature verification. Expected results The provider signature verification for software image or Application package during onboarding is successful. The Service Provider and provider signatures verification for image or Application instantiation is successful. Expected Format of Evidence: screenshot(s)
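EXAMPLE (informative): The verification order described above (Service Provider signature first, then provider signature, for instantiation) can be sketched as below, assuming raw ECDSA/SHA-256 signatures and PEM public keys as produced in the signing sketch of clause 9.5.1; all file names are placeholders.
# Informative sketch: verify provider and Service Provider signatures over an image.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

def verify(image_path, sig_path, pubkey_path):
    image = open(image_path, "rb").read()
    signature = open(sig_path, "rb").read()
    pub = serialization.load_pem_public_key(open(pubkey_path, "rb").read())
    try:
        pub.verify(signature, image, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# Instantiation order per the test procedure: Service Provider signature first,
# then the software provider's signature.
print("Service Provider signature valid:",
      verify("oru_image.tar.gz", "oru_image.sp.sig", "sp_pub.pem"))
print("Provider signature valid:",
      verify("oru_image.tar.gz", "oru_image.provider.sig", "provider_pub.pem"))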
10 ML security validation for O-RAN system
10.1 Overview
AI/ML technologies and models are adopted in the O-RAN system at the Non-RT RIC and Near-RT RIC to enable O-RAN use cases: traffic steering, massive MIMO optimization, radio resource allocation for UAV applications, position accuracy enhancement, beam management, and enhanced CSI feedback. Other use cases are described in the O-RAN Use Cases Detailed Specification [22].
10.2 ML Data Poisoning
Void.
11 Security tests of O-RAN interfaces
11.1 FH
11.1.1 Overview
This clause contains security tests to validate the security protection mechanism of the O-RAN open fronthaul interface.
11.1.2 Open Fronthaul Point-to-Point LAN Segment
11.1.2.0 Overview
IEEE 802.1X-2020 Port-based Network Access Control [11] provides the means to control network access in point-to-point LAN segments within the Open Fronthaul network. Port-based network access control in the O-RAN Alliance Open Fronthaul comprises the supplicant, authenticator, and authentication server entities described in IEEE 802.1X-2020 [11]. The security test cases in this clause cover the validation of the authenticator and supplicant functionalities of 802.1X and apply to all elements acting as O-RAN Open Fronthaul network elements, including but not limited to O-DU, O-RU, switches, FHM, FHGW, TNE and PRTC/T-GM, as defined in clause 5.2.5.5 of the Security Requirements and Controls Specifications [5].
11.1.2.1 Authenticator Validation
Requirement Name: Authenticator function of O-RAN component
Requirement Reference: REQ-SEC-OFHPLS-1, REQ-SEC-OFHPLS-2 and REQ-SEC-OFHPLS-3 from clause 5.2.5.5.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Requirements of Authenticators in the open fronthaul network and its interface to an Authentication Server
Threat References: T-FRHAUL-02
DUT/s: O-RU, O-DU
Test Name: TC_Authenticator_Validation
Test description and applicability
Purpose: To verify and validate the authenticator requirements of the network component to serve the request from supplicant(s) using EAP-TLS authentication per 802.1X-2020 [11].
The open fronthaul network component can serve in the authenticator role of 802.1X for port-based network access control.
Test setup and configuration
The DUT shall be the O-RAN component with an IP-enabled network interface reachable to the authentication server and with 802.1X enabled for its open fronthaul interface.
First, set up an authentication RADIUS server (e.g. FreeRADIUS on Linux®) with root, server and client certificates configured with .cnf files and EAP configuration (eap.conf). Then start the authentication RADIUS server.
NOTE 1: Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.
NOTE 2: As RADIUS support is required over the interface between an authenticator and an authentication server in the security requirement specification, only a RADIUS authentication server is called for in this security test environment setup. A Diameter-based authentication server could be used as an alternative.
Test procedure
First, set up the 802.1X test tool host/device with EAP authentication for the 802.1X protocol. Run the 802.1X test tool emulating the request(s) from the supplicant(s) towards the DUT, which is the authenticator, and ensure the 802.1X authentication process runs to completion. The following test scenarios are executed:
Table 11.1.2.1-1: Scenarios to be executed
Scenario ID | Configuration
1 | Test tool (as supplicant) setting for 802.1X with EAPoL, correct Identity (Certificate DN) and Client Certificate (provisioned on the RADIUS server)
2 | Test tool (as supplicant) setting for 802.1X with EAPoL, correct Identity (Certificate DN) and incorrect Client Certificate (not provisioned on the RADIUS server)
3 | Test tool (as supplicant) setting for 802.1X with EAPoL and incorrect Identity (Certificate DN)
4 | Test tool (as supplicant) setting for 802.1X with EAP non-TLS (e.g. MD5) authentication
Expected results
The O-RAN component successfully completes the procedure for the emulated supplicant validation (being granted or denied), for each test scenario:
Table 11.1.2.1-2: Expected results
Scenario ID | Expected result | Reason
1 | Connection established | Authentication successful
2 | Connection not established | Authentication fails because the certificate is wrong
3 | Connection not established | Authentication fails because the Identity is wrong
4 | Connection not established | Authentication fails because the authentication type is wrong
Expected format of evidence: log files and/or traffic captures.
11.1.2.2 Supplicant Validation
Requirement Name: Supplicant function of O-RAN component
Requirement Reference: REQ-SEC-OFHPLS-1, REQ-SEC-OFHPLS-2 and REQ-SEC-OFHPLS-3 from clause 5.2.5.5.1, O-RAN Security Requirements and Controls Specifications [5]
Requirement Description: Requirements of the Supplicant in the open fronthaul network
Threat References: T-FRHAUL-02
DUT/s: All the functions with Open Fronthaul functionalities, including but not limited to, O-RU, O-DU, switches, FHM, FHGW, TNE and PRTC/T-GM.
Test Name: TC_Supplicant_Validation
Test description and applicability
Purpose: To verify the supplicant requirement of the network component for the port connection request using EAP-TLS authentication per 802.1X-2020 [11].
The open fronthaul network component shall support the supplicant role of 802.1X for port-based network access control.
Test setup and configuration
First, set up an authentication RADIUS server (e.g. FreeRADIUS on Linux®) with root, server and client certificates configured with .cnf files and EAP configuration (eap.conf), then start the authentication RADIUS server.
NOTE: As RADIUS support is required over the interface between an authenticator and an authentication server in the security requirement specification, only a RADIUS authentication server is called for in this security test environment setup. A Diameter-based authentication server could be used as an alternative.
Test procedure
First, set up the 802.1X test tool host/device as the authenticator with EAP-TLS authentication for the 802.1X protocol and configure the preset RADIUS server as its authentication server. Then start the test run as an emulated authenticator waiting for the supplicant request. Configure and enable the O-RAN component of the open fronthaul interface to start the port connection request as a supplicant towards the 802.1X test tool, which is the authenticator, and verify the 802.1X authentication process runs to completion. The following test scenarios are executed:
Table 11.1.2.2-1: Scenarios to be executed
Scenario ID | Configuration
1 | O-RAN component (as supplicant) setting for 802.1X with EAPoL, correct Identity (Certificate DN) and Client Certificate (provisioned on the RADIUS server)
2 | O-RAN component (as supplicant) setting for 802.1X with EAPoL, correct Identity (Certificate DN) and incorrect Client Certificate (not provisioned on the RADIUS server)
3 | O-RAN component (as supplicant) setting for 802.1X with EAPoL and incorrect Identity (Certificate DN)
4 | O-RAN component (as supplicant) setting for 802.1X with EAP non-TLS (e.g. MD5) authentication (optional)
Expected results
The O-RAN component successfully completes the procedure for the supplicant validation (being granted or denied), for each test scenario:
Table 11.1.2.2-2: Expected results
Scenario ID | Expected result | Reason
1 | Connection established | Authentication successful
2 | Connection not established | Authentication fails because the certificate is wrong
3 | Connection not established | Authentication fails because the Identity is wrong
4 | Connection not established | Authentication fails because the authentication type is wrong
Expected format of evidence: log files and/or traffic captures.
11.1.3 M-Plane
11.1.3.1 SSH-based M-Plane authentication, authorization and access control protection
11.1.3.1.0 Overview
The test cases outlined in this clause verify M-Plane authenticity, authorization, and access control protection over the FH interface using SSH. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 100 11.1.3.1.1 Secure Password-Based Authentication and Authorization in FH_MPLANE Using SSH Requirement Name: M-Plane authenticity protection over FH interface using SSH Requirement Reference & Description: clause 5.4 in O-RAN Fronthaul Working Group Management Plane Specification [21] Threat References: 'T-O-RAN-05' clause 5.4.1, 'T-FRHAUL-01, T-FRHAUL-02, T-MPLANE-01' clause 5.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3] DUT/s: O-RU Test Name: TC_FH_MPLANE_SSH-PASSWORD-BASED_AUTHENTICATION_AUTHORIZATION Test description and applicability Purpose: The purpose of this test is to verify the SSH password-based authentication and authorization mechanisms on the Front-Haul (FH) of the O-DU by the O-RU. Test setup and configuration 1) The O-RU is properly configured and operational. 2) Test equipment (potentially an O-DU or a dedicated SSH client simulator) is configured to establish SSH connections to the O-RU. 3) NACM with NETCONF is enabled and configured for authorization on the FH interface. 4) SSH is properly implemented and configured as defined in [2] clause 4.1. Test procedure 1) Execute the test on the SSH protocol as defined in clause 6.2. 2) Positive Case: Successful SSH password-based authentication and authorization. • Test the successful SSH password-based authentication and authorization of the test equipment by the O-RU. a) Establish an SSH connection from the test equipment (acting as a SSH client) to the O-RU (acting as SSH server) using the SSH password. EXAMPLE 1: "Command: ssh <username>@<O-RU_IP>" b) Verify that the O-RU successfully authenticates the test equipment using the SSH password. EXAMPLE 2: "Command: show ssh sessions " c) Validate that the test equipment is authorized to perform the requested operations on the FH interface after successful authentication. This operation should be within the scope of permitted actions for the authenticated entity. EXAMPLE of operations: "start up" installation, software management, configuration management, performance management, fault management and file management towards the O-RU • Monitor the responses from the O-RU to these operations. • Record whether each operation was successfully executed, partially executed, or rejected. • Verify the O-RU logs to confirm that the operations were authorized. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 101 3) Negative Case: Failed SSH password-based authentication. • Test the handling of failed SSH password-based authentication attempts of the test equipment by the O-RU in different scenarios. a) Attempt with incorrect password • Attempt to establish an SSH connection from the test equipment to the O-RU using an incorrect password. EXAMPLE 3: "Command: ssh <valid_username>@<O-RU_IP>", using an incorrect password • Verify that the O-RU rejects the SSH connection due to the authentication failure. b) Attempt with non-existent username • Attempt to establish an SSH connection using a username that does not exist in the O-RU's user database. EXAMPLE 4: Command: ssh <invalid_username>@<O-RU_IP> • Verify that the O-RU rejects the SSH connection, confirming that authentication does not proceed with non-existent usernames. Expected Results 1) For step 1): Expected results in clause 6.2.4 2) For step 2): - The SSH connection is successfully established using the SSH password. 
- The O-RU validates the test equipment's SSH password for authentication. - The O-RU grants the necessary authorization for the requested operations. 3) For step 3): - The SSH connection attempt fails due to the incorrect password. - The O-RU identifies the authentication failure and denies access. - The SSH connection attempt fails due to the invalid username. - The O-RU identifies the authentication failure and prevents access. Expected format of evidence 1) For step 1): Logs and screenshots showing adherence to SSH protocol specifications as defined in [2] clause 4.1. 2) For step 2): Logs showing successful SSH authentication and authorization events. 3) For step 3): Logs or error messages indicating failed SSH password-based authentication attempts for both incorrect password and invalid username scenarios.
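EXAMPLE (informative): The positive case of step 2 and the negative cases of step 3 can be driven from the test equipment with an SSH client library as sketched below. The O-RU address, account names and passwords are placeholders for lab-provisioned credentials, and the NETCONF/NACM authorization check of step 2c is only indicated by a comment.
# Informative sketch: exercise the password-based authentication cases against the O-RU.
import paramiko

O_RU = "o-ru.lab.example"

CASES = [
    ("2) valid username / valid password",  "oru_admin",  "CorrectPass!"),
    ("3a) valid username / wrong password", "oru_admin",  "WrongPass"),
    ("3b) non-existent username",           "ghost_user", "AnyPass"),
]

for label, user, password in CASES:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    try:
        client.connect(O_RU, username=user, password=password,
                       look_for_keys=False, allow_agent=False, timeout=10)
        print(label, "-> session established")
        # Within the session, a NETCONF/M-Plane operation permitted by NACM would be
        # attempted here to confirm authorization (step 2c).
    except paramiko.AuthenticationException:
        print(label, "-> rejected (authentication failure)")
    finally:
        client.close()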
11.1.3.1.2 FH M-Plane SSH-certificate-based authentication authorization
Requirement Name: M-Plane authenticity protection over FH interface using SSH Requirement Reference & Description: clause 5.4 in O-RAN Fronthaul Working Group Management Plane Specification [21] Threat References: 'T-O-RAN-05' clause 5.4.1, 'T-FRHAUL-01, T-FRHAUL-02, T-MPLANE-01' clause 5.4.1.2 in O-RAN Security Threat Modeling and Risk Assessment [3] ETSI ETSI TS 104 105 V7.0.0 (2025-06) 102 DUT/s: O-RU, O-DU Test Name: TC_FH_MPLANE_SSH-CERTIFICATE-BASED_AUTHENTICATION_AUTHORIZATION Test description and applicability Purpose: The purpose of this test is to verify the SSH-certificate-based authentication and authorization mechanisms on the front-haul (FH) interface between O-RU and O-DU, using test equipment as needed to simulate either party. NOTE: Test equipment may simulate the role of O-DU or O-RU for the purpose of this test. Test setup and configuration 1) The O-RU and O-DU devices are properly configured and operational. 2) Test equipment capable of simulating SSH client/server functionality is prepared to represent either the O-DU or O-RU as required. 3) SSH keys and certificates are generated and installed on both the O-RU and O-DU devices. 4) NACM with NETCONF is enabled and configured for authorization on the FH interface. 5) SSH is properly implemented and configured as defined in [2] clause 4.1. Test procedure • Execute the test on the SSH protocol as defined in clause 6.2. Part A: Authentication and authorization of O-DU by O-RU (or test equipment simulating O-DU) • Positive Case: Successful SSH-certificate-based authentication and authorization. a) Establish an SSH connection from the O-RU to the O-DU using the SSH key and certificate. b) Verify that the O-RU successfully authenticates the O-DU using the SSH certificate. EXAMPLE 1: "Command: show ssh sessions" c) Validate that the O-DU is authorized to perform the requested operations on the FH interface. • Perform an operation on the FH interface that requires authorization. This operation should be within the scope of permitted actions for the authenticated O-DU. EXAMPLE of operations: "start up" installation, software management, configuration management, performance management, fault management and file management towards the O-RU • Monitor the responses from the O-RU to these operations. • Record whether each operation was successfully executed, partially executed, or rejected. • Verify the O-RU logs to confirm that the operations were authorized. • Negative Case: Failed SSH-certificate-based authentication. • Test the handling of failed SSH-certificate-based authentication attempts by the O-RU in different scenarios. a) Attempt with invalid key or certificate • Attempt to establish an SSH connection using an incorrect or invalid SSH key or certificate. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 103 EXAMPLE 2: "Command: ssh -i <path_to_invalid_private_key> -o CertificateFile=<path_to_invalid_certificate> <valid_username>@<O-RU_IP>" • Verify that the O-RU rejects the SSH connection due to the authentication failure. b) Attempt with invalid username • Attempt to establish an SSH connection using a valid SSH key and certificate, but with a username that does not exist in the O-RU's system. EXAMPLE 3: "Command: ssh -i <path_to_valid_private_key> -o CertificateFile=<path_to_valid_certificate> <invalid_username>@<O-RU_IP>"Verify that the O-RU rejects the SSH connection, confirming that the system does not authenticate usernames that are not registered or recognized. 
Part B: Authentication of O-RU by O-DU (or test equipment simulating O-RU) • Positive Case: Successful SSH-certificate-based authentication: a) Establish an SSH connection using the SSH key and certificate. b) Verify that the O-DU successfully authenticates the O-RU using the SSH certificate. EXAMPLE 4: "Command: show ssh sessions" • Negative Case: Failed SSH-certificate-based authentication. • Test the handling of failed SSH-certificate-based authentication attempts by the O-DU in different scenarios. a) Attempt with invalid key or certificate • Attempt to establish an SSH connection using an incorrect or invalid SSH key or certificate. EXAMPLE 5: "Command: ssh -i <path_to_invalid_private_key> -o CertificateFile=<path_to_invalid_certificate> <valid_username>@<O-DU_IP>" • Verify that the O-RU rejects the SSH connection due to the authentication failure. b) Attempt with invalid username • Attempt to establish an SSH connection using a valid SSH key and certificate, but with a username that does not exist in the O-RU's system. EXAMPLE 6: "Command: ssh -i <path_to_valid_private_key> -o CertificateFile=<path_to_valid_certificate> <invalid_username>@<O-DU_IP>" • Verify that the O-DU rejects the SSH connection, confirming that the system does not authenticate usernames that are not registered or recognized. Expected Results 1) For step 1): Expected results in clause 6.2.4 2) For Parts A and B – Positive Case: - The SSH connection is successfully established using the correct SSH key and certificate. - The DUT (O-RU or O-DU) validates the test equipment's SSH certificate for authentication. - The O-RU grants the necessary authorization to the O-DU for the requested operations. - The SSH connection attempt fails due to the incorrect or invalid SSH key or certificate. - The DUT identifies the authentication failure and denies access accordingly. ETSI ETSI TS 104 105 V7.0.0 (2025-06) 104 Expected format of evidence 1) For step 1): Logs and screenshots showing adherence to SSH protocol specifications as defined in [2] clause 4.1. 2) For Parts A and B – Positive Case: Logs showing successful SSH authentication and authorization events. 3) For Parts A and B – Negative Case: Logs or error messages indicating failed SSH-certificate-based authentication attempts for both invalid key/certificate and non-existent username scenarios.
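EXAMPLE (informative): The certificate-based scenarios of Parts A and B can be wrapped so that each EXAMPLE command of this clause is executed and its outcome recorded, as sketched below. The key/certificate paths, usernames and the O-RU address are placeholders, and the exit code of the ssh client is used only as a coarse established/rejected indicator.
# Informative sketch: run the certificate-based SSH scenarios and record outcomes.
import subprocess

O_RU = "o-ru.lab.example"

SCENARIOS = [
    ("positive: valid key/cert, valid user",
     "keys/valid_id", "keys/valid_id-cert.pub", "oru_mplane"),
    ("negative a: invalid key/cert",
     "keys/invalid_id", "keys/invalid_id-cert.pub", "oru_mplane"),
    ("negative b: valid key/cert, unknown user",
     "keys/valid_id", "keys/valid_id-cert.pub", "ghost_user"),
]

for label, key, cert, user in SCENARIOS:
    cmd = ["ssh", "-i", key, "-o", f"CertificateFile={cert}",
           "-o", "BatchMode=yes", "-o", "StrictHostKeyChecking=no",
           f"{user}@{O_RU}", "exit"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    outcome = "connection established" if result.returncode == 0 else "connection rejected"
    print(f"{label}: {outcome}")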