§4.1 Program Foundation
1 - §4.1.1 Policy
1. Clause Overview
§4.1.1 corresponds to ISO/IEC 5230 §3.1.1 (License Compliance Policy), but the focus shifts to Security Assurance. A policy that systematically manages known and newly discovered vulnerabilities in open source components of the supplied software must be documented and communicated internally. The key difference from ISO/IEC 5230 §3.1.1 is the requirement for a periodic review process for the policy itself and its communication method. It is not enough to establish a policy; a review system must be in place to ensure the policy always remains valid and up to date.
2. What to Do
- Document and formalize a policy for managing security vulnerabilities in open source components included in the supplied software.
- Include vulnerability detection, assessment, response, and notification principles, as well as a Coordinated Vulnerability Disclosure (CVD) policy.
- Establish and document a procedure for communicating the policy to Program Participants (developers, security personnel, legal, IT, etc.).
- Specify in the policy a review process that periodically reviews the policy and its communication method to keep them current and valid.
- Record the review completion date, reviewer, and change history in the document.
3. Requirements and Verification Materials
| Clause | Requirement | Verification Material(s) |
|---|---|---|
| §4.1.1 | A written open source software security assurance policy shall exist that governs open source software security assurance of the supplied software. The policy shall be internally communicated. The policy and its method of communication shall have a review process to ensure they remain current and valid. | 4.1.1.1 A documented open source software security assurance policy<br>4.1.1.2 A documented procedure that makes Program Participants aware of the security assurance policy |
§4.1.1 Policy A written open source software security assurance policy shall exist that governs open source software security assurance of the supplied software. The policy shall be internally communicated. The policy and its method of communication shall have a review process to ensure they remain current and valid.
Verification Material(s): 4.1.1.1 A documented open source software security assurance policy. 4.1.1.2 A documented procedure that makes program participants aware of the security assurance policy.
4. How to Comply and Samples per Verification Material
4.1.1.1 Documented Security Assurance Policy
How to Comply
If an Open Source Policy for ISO/IEC 5230 is already in place, you can either add a security assurance section to that policy or create a separate security assurance policy document. Both approaches satisfy Verification Material 4.1.1.1.
The policy should include: ① principles for identifying, tracking, and responding to security vulnerabilities; ② risk assessment criteria based on CVSS and remediation timelines; ③ Coordinated Vulnerability Disclosure (CVD) policy; ④ post-deployment monitoring principles; and ⑤ periodic review cycle and reviewer. The key difference from ISO/IEC 5230 §3.1.1.1 is that the periodic review process must be explicitly stated within the policy document.
Considerations
- Integration with 5230 Policy: It can be managed in an integrated manner by expanding the §5 Security Vulnerability Response section of the existing ISO/IEC 5230 policy.
- Specify Review Cycle: Include a clause stating that a minimum of one annual periodic review is conducted, with an immediate review triggered by changes in the threat environment or legal requirements.
- Adopt CVSS Criteria: Use CVSS (Common Vulnerability Scoring System) for vulnerability severity assessment and specify remediation timelines in the policy (e.g., Critical: 7 days, High: 30 days).
- Include CVD Policy: Include a CVD procedure in the policy for collaborating confidentially to resolve externally reported vulnerabilities before public disclosure.
- Version Control: Record the policy version number, change history, and review completion date.
Sample
The following is a sample of the security assurance section of an open source policy.
```
## §5 Open Source Security Assurance

### 5.1 Purpose
The company systematically identifies and responds to security vulnerabilities
in open source components included in the supplied software to minimize security risks.

### 5.2 Vulnerability Response Principles
Remediation timeline criteria for known vulnerabilities (CVEs) are as follows:
- Critical (CVSS 9.0–10.0): Patch or mitigation within 7 days
- High (CVSS 7.0–8.9): Patch or mitigation within 30 days
- Medium (CVSS 4.0–6.9): Establish patch plan within 90 days
- Low (CVSS 0.1–3.9): Address in next scheduled update

### 5.3 Coordinated Vulnerability Disclosure (CVD) Policy
When a vulnerability is reported externally, it will be resolved in cooperation
with the reporter before public disclosure.
Vulnerability reporting channel: security@company.com

### 5.4 Policy Review
This policy and its communication method shall be reviewed at least annually
to remain current and valid.
The review completion date and reviewer shall be recorded in the document.
```
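The remediation bands in §5.2 of the sample are mechanical enough to automate. A minimal Python sketch of the mapping, assuming a tracking script wants a severity label and a due date per finding (the `triage` helper and its band table are illustrative, not part of the standard):

```python
from datetime import date, timedelta

# CVSS v3.x bands and remediation windows taken from the sample policy (§5.2).
# "Low" carries no fixed window ("next scheduled update"), so it yields None.
BANDS = [
    (9.0, "Critical", 7),
    (7.0, "High", 30),
    (4.0, "Medium", 90),
    (0.1, "Low", None),
]

def triage(cvss_score: float, detected: date):
    """Return (severity, remediation due date) for a detected vulnerability."""
    for floor, severity, days in BANDS:
        if cvss_score >= floor:
            due = detected + timedelta(days=days) if days is not None else None
            return severity, due
    return "None", None  # CVSS 0.0: informational only, no remediation window

severity, due = triage(9.8, date(2025, 1, 1))
# A Critical finding detected on 2025-01-01 is due by 2025-01-08
```

Encoding the bands once and reusing them in tooling keeps the policy document and the scan pipeline from drifting apart.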
4.1.1.2 Documented Procedure for Security Assurance Policy Awareness
How to Comply
A procedure for communicating the security assurance policy to Program Participants must be documented, in the same manner as the policy communication procedure for ISO/IEC 5230 (§3.1.1.2). The additional requirement of §4.1.1 is that the communication procedure itself must also be periodically reviewed to maintain its validity. You can either add security assurance policy content to the existing 5230 policy communication procedure, or establish a separate security policy communication procedure.
Considerations
- Reuse 5230 Procedure: Comply efficiently by adding the security assurance policy to existing open source policy communication channels (onboarding, internal wiki, email).
- Review the Communication Procedure: Specify the review cycle (annually) and the reviewer in the communication procedure document to manage the validity of the procedure itself.
- Retain Evidence: Keep notification history and training completion records for a minimum of 3 years.
Sample
```
Subject: [Security] Open Source Security Assurance Policy Notice and Acknowledgment Request
To: All employees in development/deployment/security-related roles
From: Open Source Program Manager

Dear all,

The company's open source security assurance policy has been established (or revised).
Please review and familiarize yourself with the policy document at the link below.

- Policy document: [Internal portal link]
- Key content: Vulnerability response principles, CVSS-based remediation timelines, CVD policy
- Policy version: v1.0 (Effective date: YYYY-MM-DD) / Next review scheduled: YYYY-MM-DD

Inquiries: Open Source Program Manager (oss@company.com)
```
5. References
- Corresponding ISO/IEC 5230 clause: §3.1.1 Policy
- Related guide: Enterprise Open Source Management Guide — 2. Policy
- Related template: Open Source Policy Template
2 - §4.1.2 Competence
1. Clause Overview
§4.1.2 has the same basic structure as ISO/IEC 5230 §3.1.2 (Competence) but requires three additional Verification Material items. While 5230 requires three items — a list of roles, a definition of competencies per role, and evidence of competency assessment — 18974 additionally requires a list of participants and their role mappings (4.1.2.3), evidence of periodic review and process changes (4.1.2.5), and verification of alignment with internal best practices (4.1.2.6). These three additional items require demonstrating that the competence framework is not merely formal but is actively kept up to date and aligned with industry standards.
2. What to Do
- Create a list of responsibilities per program role (same as 5230).
- Define and document the competencies required for each role (same as 5230).
- Create a separate list mapping participant names to their respective roles (added in 18974).
- Assess and record the competencies of each participant (same as 5230).
- Periodically review the competence framework and record process changes (added in 18974).
- Confirm that the competence framework aligns with the company’s internal best practices and assign a person responsible (added in 18974).
3. Requirements and Verification Materials
| Clause | Requirement | Verification Material(s) |
|---|---|---|
| §4.1.2 | The organization shall: identify the roles and responsibilities that impact the performance and effectiveness of the program; determine the necessary competence for each role; ensure that participants are competent; take actions where applicable to acquire the necessary competence; and retain documented evidence of competence. | 4.1.2.1 A documented list of roles with corresponding responsibilities for the different participants in the program<br>4.1.2.2 A document that identifies the competencies for each role<br>4.1.2.3 List of participants and their roles ★<br>4.1.2.4 Documented evidence of assessed competence for each program participant<br>4.1.2.5 Documented evidence of periodic reviews and changes to the process ★<br>4.1.2.6 Documented evidence that these processes align with and are up-to-date with company internal best practices, and that a person has been assigned to make sure they remain so ★ |
★ = Additional items compared to ISO/IEC 5230 §3.1.2
§4.1.2 Competence The organization shall:
- Identify the roles and responsibilities that impact the performance and effectiveness of the program;
- Determine the necessary competence of program participants fulfilling each role;
- Ensure that program participants are competent on the basis of appropriate education, training, and/or experience;
- Where applicable, take actions to acquire the necessary competence;
- Retain appropriate documented information as evidence of competence.
Verification Material(s): 4.1.2.1 A documented list of roles with corresponding responsibilities for the different participants in the program. 4.1.2.2 A document that identifies the competencies for each role. 4.1.2.3 List of participants and their roles. 4.1.2.4 Documented evidence of assessed competence for each program participant. 4.1.2.5 Documented evidence of periodic reviews and changes to the process. 4.1.2.6 Documented evidence that these processes align with and are up-to-date with company internal best practices, and that a person has been assigned to make sure they remain so.
4. How to Comply and Samples per Verification Material
4.1.2.1 Documented List of Roles and Responsibilities
This is the same as ISO/IEC 5230 §3.1.2.1. From a security assurance perspective, the roles of security personnel (DevSecOps, vulnerability analysis) must be clearly included. Refer to §3.1.2.1 Documented List of Roles and Responsibilities for how to prepare this.
4.1.2.2 Document Identifying Competencies per Role
This is the same as ISO/IEC 5230 §3.1.2.2. The security personnel role should include competencies in CVSS score interpretation, operation of vulnerability analysis tools (OSV-SCALIBR, Dependency-Track, etc.), and DevSecOps understanding. Refer to §3.1.2.2 Document Identifying Competencies per Role for how to prepare this.
4.1.2.3 List of Participants and Their Roles ★
How to Comply
Unlike the list of roles in 4.1.2.1, this item requires a list mapping actual individuals by name to the roles they are assigned. The purpose is to clearly identify who the actual personnel participating in the program are, not just the organizational roles. This list must be updated immediately when personnel changes occur.
Considerations
- Name or job title: Using a job title instead of a personal name is acceptable, but it must be specific enough to identify a particular individual.
- Multiple roles: If someone holds multiple roles, list all of them.
- Timely updates: Update the document and increment the version immediately when an assignee changes.
Sample
| Name | Role | Contact | Assigned Date |
|------|------|---------|---------------|
| Gil-dong Hong | Open Source Program Manager | oss@company.com | 2025-01-01 |
| Chul-su Kim | Security (DevSecOps) | security@company.com | 2025-01-01 |
| Young-hee Lee | Legal | legal@company.com | 2025-01-01 |
| Infra Park | IT | it@company.com | 2025-03-15 |
4.1.2.4 Evidence of Assessed Competence
This is the same as ISO/IEC 5230 §3.1.2.3. Refer to §3.1.2.3 Evidence of Assessed Competence for how to prepare this.
4.1.2.5 Evidence of Periodic Reviews and Process Changes ★
How to Comply
The competence framework (role definitions, competency criteria, assessment methods) must be periodically reviewed, and any changes resulting from the review process must be recorded. The key is to verify that new security tool adoption, organizational restructuring, and improvements to vulnerability response processes have been reflected in the competence framework. The review record itself serves as Verification Material 4.1.2.5.
Considerations
- Review cycle: Conduct a minimum annual periodic review and an immediate review when the organization or processes change.
- Record changes: Record the content of changes, reasons for changes, change dates, and who made the changes.
- Evidence format: Review meeting minutes, review completion confirmations, and change history logs can all serve as evidence.
Sample
[Competence Framework Periodic Review Record]
| Review Date | Review Content | Changes | Reviewer |
|-------------|---------------|---------|----------|
| 2025-01-10 | Full review of all roles and competencies | Added CVSS v4.0 interpretation item to security competency | Gil-dong Hong |
| 2026-01-08 | Full review of all roles and competencies | Added Dependency-Track operation competency to IT role | Gil-dong Hong |
4.1.2.6 Verification of Alignment with Internal Best Practices ★
How to Comply
It must be confirmed that the competency definitions and assessment processes are aligned with the company’s internal best practices (HR policy, technical training standards, etc.), and a person responsible for continuously managing this must be assigned. The purpose is to ensure that the competence framework does not deviate from industry standards or internal guidelines.
Considerations
- Assign a responsible person: Explicitly assign and record a person responsible for managing the currency of the competence framework and its alignment with internal best practices.
- Best practice criteria: Industry standards (NIST SSDF, ISO 27001, etc.), internal security policies, and DevSecOps guidelines can be used as best practice criteria.
Sample
```
[Internal Best Practice Alignment Confirmation]

Competence Framework Manager: Gil-dong Hong (Open Source Program Manager)
Last alignment review date: 2026-01-08
Reference best practice criteria: Internal Security Training Standard v3.0, NIST SSDF 1.1

Review results:
- Security competency criteria align with the internal security training curriculum ✓
- Vulnerability analysis tool competency reflects the latest tools (Dependency-Track v4.x) ✓
- Next alignment review scheduled: 2027-01-08
```
5. References
- Corresponding ISO/IEC 5230 clause: §3.1.2 Competence
- Related guide: Enterprise Open Source Management Guide — 1. Organization
- Related template: Open Source Policy Template — Appendix 1. Personnel Status
3 - §4.1.3 Awareness
1. Clause Overview
§4.1.3 has the same Verification Material structure as ISO/IEC 5230 §3.1.3 (Awareness). It requires that awareness of program objectives, how individuals contribute to the program, and the implications of non-compliance be assessed and recorded for Program Participants. The difference from 5230 is that the focus of the awareness assessment shifts to Security Assurance. Participants must be aware not only of license compliance but also of the vulnerability management process, CVD procedures, and CVSS-based response criteria.
2. What to Do
- Confirm that Program Participants understand the objectives of the open source security assurance program (vulnerability detection, assessment, response, and notification).
- Assess whether each participant is aware of how their role contributes to the security assurance framework.
- Assess awareness of the legal, business, and security implications of failing to comply with the vulnerability response process.
- Record and retain the results of awareness assessments as documentation.
- Provide additional training to participants with gaps and retain the results of re-assessments.
3. Requirements and Verification Materials
| Clause | Requirement | Verification Material(s) |
|---|---|---|
| §4.1.3 | The organization shall ensure that program participants are aware of: the existence and location of the security assurance policy / relevant security assurance objectives / their contribution to the effectiveness of the program / the implications of not following the program’s requirements | 4.1.3.1 Documented evidence of assessed awareness for the program participants — which shall include the program’s objectives, contributions within the program, and implications of failing to follow the program’s requirements |
§4.1.3 Awareness The organization shall ensure that the program participants are aware of:
- The open source software security assurance policy and where to find it;
- Relevant open source software security assurance objectives;
- Their contribution to the effectiveness of the program; and
- The implications of not following the program’s requirements.
Verification Material(s): 4.1.3.1 Documented evidence of assessed awareness for the program participants - which shall include the program’s objectives, contributions within the program, and implications of failing to follow the program’s requirements.
4. How to Comply and Samples per Verification Material
4.1.3.1 Evidence of Assessed Awareness for Program Participants
How to Comply
The basic approach is the same as ISO/IEC 5230 §3.1.3.1, but the awareness assessment questions must focus on Security Assurance. The three core awareness items are: ① the objectives of the security assurance program (vulnerability identification, assessment, response, and CVD); ② how one’s own role contributes to the security framework; and ③ the security, legal, and business risks of non-compliance with the process.
The method for recording and retaining assessment results as documentation is the same as for 5230. Conduct a minimum annual periodic assessment, and perform an initial assessment for new participants immediately upon joining.
Considerations
- Design security-specific questions: Include content specific to security assurance in the assessment questions, such as vulnerability response procedures, CVSS criteria, and CVD policy.
- Role-specific in-depth assessment: For security personnel, assess awareness up to technical vulnerability analysis; for developers, assess awareness of secure coding practices as well.
- Assessment cycle and evidence retention: Conduct assessments at a minimum annually and immediately upon a new participant joining, and retain results for a minimum of 3 years.
Sample
The following is a sample security assurance policy awareness acknowledgment form.
```
[Open Source Security Assurance Policy Awareness Acknowledgment]

I confirm that I have familiarized myself with the following:
1. The existence and location of the company's open source security assurance policy
2. The objectives of the open source component vulnerability detection, assessment,
   response, and CVD program
3. How my role contributes to the operation of the security assurance program
4. The security breaches, legal liabilities, and business risks that may arise
   from failing to comply with the vulnerability response process

Name: ________________  Role: ________________
Signature: ________________  Date: ________________
```
5. References
- Corresponding ISO/IEC 5230 clause: §3.1.3 Awareness
- Related guide: Enterprise Open Source Management Guide — 5. Training
- Related template: Open Source Policy Template — §6 Training and Awareness
4 - §4.1.4 Program Scope
1. Clause Overview
§4.1.4 extends ISO/IEC 5230 §3.1.4 (Program Scope) with two additional Verification Materials. While 5230 only requires a written statement that clearly defines the program’s scope, 18974 additionally requires a set of performance metrics the program seeks to improve upon (4.1.4.2) and documented evidence from each review, update, or audit demonstrating continuous improvement (4.1.4.3). These two items are intended to demonstrate that the security assurance program is not a static compliance exercise but a system with measurable goals that continuously improves.
2. What to Do
- Create a documented written statement that clearly defines the program scope (target software, organizational units, exclusions) (same as 5230).
- Define the performance metrics that the program seeks to improve upon (added in 18974).
- Maintain records demonstrating that continuous improvement is achieved through periodic reviews, updates, and audits (added in 18974).
- Periodically measure actual performance against metric targets and record the results.
- Identify areas for improvement and document the history of follow-up actions.
3. Requirements and Verification Materials
| Clause | Requirement | Verification Material(s) |
|---|---|---|
| §4.1.4 | The program scope must be clearly defined, and metrics for program improvement and evidence of continuous improvement must be maintained. | 4.1.4.1 A written statement that clearly defines the scope and limits of the program<br>4.1.4.2 A set of metrics the program seeks to improve upon ★<br>4.1.4.3 Documented evidence from each review, update, or audit to demonstrate continuous improvement ★ |
★ = Additional items compared to ISO/IEC 5230 §3.1.4
§4.1.4 Program Scope Different programs may be designed to address different scopes depending on the supplier’s needs and business model. The scope needs to be clear.
Verification Material(s): 4.1.4.1 A written statement that clearly defines the scope and limits of the program. 4.1.4.2 A set of metrics the program seeks to improve upon. 4.1.4.3 Documented evidence from each review, update, or audit to demonstrate continuous improvement.
4. How to Comply and Samples per Verification Material
4.1.4.1 Written Statement of Program Scope
This is the same as ISO/IEC 5230 §3.1.4.1. Refer to §3.1.4.1 Written Statement of Program Scope for how to prepare this. From a security assurance perspective, explicitly state that “response to known and newly discovered vulnerabilities” is included within the scope.
4.1.4.2 Set of Performance Metrics ★
How to Comply
The performance metrics that the security assurance program seeks to improve must be defined and documented. Metrics must be measurable and realistic, and connected to the program’s key objectives (vulnerability detection rate, response time, SBOM completeness, etc.). The metrics set itself is Verification Material 4.1.4.2.
Considerations
- Measurability: Set quantitative indicators rather than qualitative descriptions.
- Realistic targets: Set targets at an achievable level initially and progressively raise them.
- Periodic measurement: Measure metrics at a minimum quarterly and record the results.
Sample
[Security Assurance Program Performance Metrics]
| Metric | Measurement Method | Target | Measurement Cycle |
|--------|--------------------|--------|--------------------|
| SBOM Completeness | Proportion of distributed software with an SBOM | 100% | Quarterly |
| Critical Vulnerability Average Response Time | Detection date to patch application date | 7 days or less | Quarterly |
| High Vulnerability Average Response Time | Detection date to patch application date | 30 days or less | Quarterly |
| Vulnerability Recurrence Rate | Rate of re-vulnerability in the same component | 10% or less | Semi-annually |
| New Participant Awareness Assessment Completion Rate | Rate of assessment completion within 30 days of joining | 100% | Quarterly |
| External Vulnerability Inquiry Response Compliance Rate | Rate of response completion within 14 days | 95% or more | Quarterly |
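Several of the metrics above can be computed directly from release and vulnerability tracking records. A sketch under assumed record shapes (the field names `has_sbom`, `detected`, and `patched` are illustrative, not a standard schema):

```python
from datetime import date
from statistics import mean

def sbom_completeness(releases: list[dict]) -> float:
    """Percentage of distributed releases that ship with an SBOM."""
    if not releases:
        return 0.0
    return 100.0 * sum(1 for r in releases if r.get("has_sbom")) / len(releases)

def mean_response_days(vulns: list[dict], severity: str) -> float:
    """Average days from detection to patch for closed findings of a severity."""
    durations = [
        (v["patched"] - v["detected"]).days
        for v in vulns
        if v["severity"] == severity and v.get("patched")
    ]
    return float(mean(durations)) if durations else 0.0

vulns = [
    {"severity": "Critical", "detected": date(2025, 3, 1), "patched": date(2025, 3, 5)},
    {"severity": "Critical", "detected": date(2025, 4, 1), "patched": date(2025, 4, 9)},
    {"severity": "High", "detected": date(2025, 3, 1), "patched": None},  # still open
]
# mean_response_days(vulns, "Critical") averages the 4-day and 8-day fixes to 6 days
```

Computing the numbers from the tracking system of record, rather than by hand, makes the quarterly measurement cycle cheap enough to sustain.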
4.1.4.3 Evidence of Continuous Improvement ★
How to Comply
Records must be maintained showing that the security assurance program is actually improving through periodic reviews, process updates, internal audits, and similar activities. Document the issues found, the improvement actions taken, and their results at each review or audit. These records themselves constitute Verification Material 4.1.4.3.
Considerations
- Regular audit schedule: Conduct a full program audit at a minimum annually and record the results.
- Track improvement history: Track and record whether issues raised in previous audits were resolved in subsequent audits.
- Link to metrics: Use the performance trend of the metrics defined in 4.1.4.2 as evidence of improvement.
Sample
```
[Security Assurance Program Periodic Review Record]

Review date: 2026-01-10
Reviewers: Gil-dong Hong (Open Source Program Manager), Chul-su Kim (Security)

Metrics performance:
- SBOM Completeness: 97% → 100% (target achieved)
- Critical vulnerability average response time: 9 days → 6 days (target achieved)
- High vulnerability average response time: 35 days → 28 days (target achieved)

Improvements identified:
1. External vulnerability inquiry response compliance rate 88% → target of 95% not met
   Action: Additional inquiry monitoring assignee designated (completed 2026-02-01)
2. Delays in awareness assessment for new participants identified
   Action: Awareness assessment added as mandatory item to onboarding checklist (completed 2026-01-20)

Next review scheduled: 2027-01-09
```
5. References
- Corresponding ISO/IEC 5230 clause: §3.1.4 Program Scope
- Related guide: Enterprise Open Source Management Guide — 2. Policy
- Related template: Open Source Policy Template — §1.4 Scope
5 - §4.1.5 Standard Practice Implementation
1. Clause Overview
§4.1.5 is a new clause exclusive to 18974 that does not exist in ISO/IEC 5230. It requires establishing a documented procedure for each of the 8 standard methods for handling open source vulnerabilities. The clause goes beyond a simple declaration of “responding” to vulnerabilities: it requires formalizing the entire vulnerability lifecycle into procedures, from threat identification and detection, through follow-up, customer notification, post-deployment monitoring, continuous security testing, and risk verification, to the export of risk information. This clause, together with §4.3.2 Security Assurance, forms the core of ISO/IEC 18974.
2. What to Do
Establish documented procedures for each of the 8 methods:
- Identification of structural and technical threats: Define a method to identify threats affecting the supplied software.
- Detection of known vulnerabilities: Define a method to detect the existence of known vulnerabilities (CVEs) in open source components.
- Follow-up on vulnerabilities: Define a method to take follow-up actions such as patching, mitigation, or acceptance for identified vulnerabilities.
- Customer notification: Define a method to communicate identified vulnerabilities to customers, where applicable.
- Post-deployment analysis for newly disclosed vulnerabilities: Define a method to analyze already-deployed software for newly published CVEs.
- Continuous security testing: Define a method to apply continuous and iterative security testing to all supplied software before release.
- Verification of risk resolution: Define a method to verify that identified risks have been addressed before release.
- Export of risk information: Define a method to export risk information to third parties, where appropriate.
3. Requirements and Verification Materials
| Clause | Requirement | Verification Material(s) |
|---|---|---|
| §4.1.5 | A program shall demonstrate defined and implemented processes for sound and robust handling of known vulnerabilities and security software development, specifically by defining and implementing the following 8 methods: threat identification / vulnerability detection / follow-up / customer notification / post-deployment newly disclosed vulnerability analysis / continuous security testing / risk resolution verification / risk information export | 4.1.5.1 A documented procedure exists for each of the methods identified above |
§4.1.5 Standard Practice Implementation A program shall demonstrate defined and implemented processes for sound and robust handling of known vulnerabilities and security software development, specifically by defining and implementing the following:
- A method to identify structural and technical threats to the supplied software;
- A method to detect the existence of known vulnerabilities in the supplied software;
- A method to follow up on identified known vulnerabilities;
- A method to communicate identified known vulnerabilities to customers, where applicable;
- A method to analyze the supplied software for newly disclosed known vulnerabilities when they are published;
- A method to apply continuous and iterative security testing to all supplied software before release;
- A method to verify that identified risks have been addressed before release; and
- A method to export information about identified risks to third parties, where appropriate.
Verification Material(s): 4.1.5.1 A documented procedure exists for each of the methods identified above.
4. How to Comply and Samples per Verification Material
4.1.5.1 Documented Procedures for the 8 Vulnerability Handling Methods
How to Comply
A procedure explaining “how” each of the 8 methods is performed must be documented. These procedures together constitute Verification Material 4.1.5.1. You can create 8 separate documents, or organize all 8 as sections within a single integrated vulnerability management procedure document. The latter approach is recommended because it reduces management burden and makes consistency easier to maintain.
Method 1 — Identification of Structural and Technical Threats
Define a method to identify structural (architecture design, dependency structure) and technical (known vulnerable patterns, risky components) threats that may affect the supplied software. Using threat modeling (STRIDE, PASTA, etc.) or periodically analyzing the dependency tree to identify risky components is a common approach.
```
[Threat Identification Procedure Overview]
- Conduct threat modeling when designing a new software architecture.
- Analyze the dependency tree quarterly to identify EOL (End-of-Life) components,
  abandoned projects, and components with high dependency concentration.
- Register identified threats in the Risk Registry and assign a responsible party.
```
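The quarterly dependency review described above can be partly automated. A hedged sketch that flags EOL or apparently abandoned components for the Risk Registry, where the inventory fields and the two-year staleness threshold are illustrative assumptions rather than requirements of the standard:

```python
from datetime import date, timedelta

# Assumption: no upstream release for 2 years suggests an abandoned project.
STALE_AFTER = timedelta(days=365 * 2)

def flag_risky(components: list[dict], today: date) -> list[dict]:
    """Return components to register in the Risk Registry, each with a reason."""
    flagged = []
    for c in components:
        if c.get("eol_date") and c["eol_date"] <= today:
            flagged.append({"name": c["name"], "reason": "EOL"})
        elif today - c["last_release"] > STALE_AFTER:
            flagged.append({"name": c["name"], "reason": "possibly abandoned"})
    return flagged

inventory = [
    {"name": "libfoo", "eol_date": date(2024, 6, 30), "last_release": date(2024, 1, 1)},
    {"name": "libbar", "eol_date": None, "last_release": date(2020, 5, 1)},
    {"name": "libbaz", "eol_date": None, "last_release": date(2025, 2, 1)},
]
# flag_risky(inventory, date(2025, 6, 1)) flags libfoo (EOL) and libbar (stale)
```

A filter like this only surfaces candidates; the decision to register a threat and assign an owner remains a human step in the procedure.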
Method 2 — Detection of Known Vulnerabilities
Define a method to detect the existence of CVEs (Common Vulnerabilities and Exposures) in open source components based on the SBOM. Integrating automated tools (OSV-SCALIBR, Dependency-Track, Grype, etc.) into the CI/CD pipeline to scan for vulnerabilities at every build is an effective approach.
```
[Vulnerability Detection Procedure Overview]
- Integrate SCA (Software Composition Analysis) tools into the CI/CD pipeline.
- Automatically run SBOM-based vulnerability scans on every build.
- Reference multiple vulnerability databases such as OSV (Open Source Vulnerabilities),
  NVD (National Vulnerability Database), and GitHub Advisory Database.
- Automatically notify the security team of detected vulnerabilities along with their severity.
```
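To make the SBOM-based scan concrete: the sketch below extracts package URLs (purls) from a CycloneDX JSON SBOM and assembles a batch query payload in the shape the OSV API accepts. The SBOM and payload field names should be confirmed against the CycloneDX and OSV documentation rather than taken from this guide:

```python
def osv_queries_from_cyclonedx(sbom: dict) -> dict:
    """Build an OSV batch-query payload from a CycloneDX JSON SBOM.

    Components without a purl are skipped here; a real pipeline should
    report them as coverage gaps rather than silently dropping them.
    """
    queries = [
        {"package": {"purl": c["purl"]}}
        for c in sbom.get("components", [])
        if "purl" in c
    ]
    return {"queries": queries}

sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "requests", "purl": "pkg:pypi/requests@2.31.0"},
        {"name": "internal-lib"},  # no purl: not queryable, track separately
    ],
}
payload = osv_queries_from_cyclonedx(sbom)
# In the scan step, POST this payload as JSON to OSV's querybatch endpoint
# (https://api.osv.dev/v1/querybatch) and fail the build on any hits.
```

Off-the-shelf SCA tools such as Grype or Dependency-Track perform this lookup for you; the sketch only illustrates what "SBOM-based scan" means mechanically.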
Method 3 — Follow-up on Vulnerabilities
Define a method to take follow-up actions on identified vulnerabilities, such as applying patches, implementing mitigations, replacing components, or accepting risk. Specify priority and remediation timelines based on CVSS scores in the procedure.
[Follow-up Procedure Overview]
- Determine remediation priority and timelines based on CVSS score:
Critical (9.0+): within 7 days / High (7.0-8.9): within 30 days
Medium (4.0-6.9): within 90 days / Low (0.1-3.9): at next release
- If no patch is available, implement mitigation measures (network isolation, WAF rule additions, etc.).
  Risk acceptance decisions require joint approval from the security team and the open source PM.
- Record the remediation outcome in the vulnerability tracking system and verify completion.
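The CVSS-to-timeline mapping above is mechanical enough to encode directly; a minimal sketch:

```python
def remediation_sla(cvss: float) -> str:
    """Map a CVSS base score to the remediation timeline from the procedure."""
    if cvss >= 9.0:
        return "within 7 days"       # Critical
    if cvss >= 7.0:
        return "within 30 days"      # High
    if cvss >= 4.0:
        return "within 90 days"      # Medium
    if cvss > 0.0:
        return "at next release"     # Low
    return "no action (informational)"
```

Encoding the policy as code keeps ticket SLAs consistent with the documented procedure and makes policy changes reviewable in version control.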
Method 4 — Customer Notification
Define a method to communicate vulnerabilities to customers when they are discovered in supplied software and may affect customers. Specify the notification criteria (severity, customer impact scope), notification channels, and notification timelines in the procedure.
[Customer Notification Procedure Overview]
- Notify customers for Critical/High vulnerabilities that affect distributed products.
- Notification channels: Product security notice (website), customer security contact email,
security advisory publication
- Notification timeline: Within 7 days of confirming a patch or mitigation measure
- Notification content: Affected components, CVE ID, severity, recommended actions, patch delivery schedule
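The notification criteria can be expressed as a simple decision check; the severity labels and the distribution flag below are assumptions for illustration:

```python
def should_notify_customers(severity: str, in_distributed_product: bool) -> bool:
    """Per the procedure: notify customers only for Critical/High
    vulnerabilities that affect distributed products."""
    return severity in {"Critical", "High"} and in_distributed_product
```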
Method 5 — Post-deployment Analysis for Newly Disclosed Vulnerabilities
Define a method to analyze whether newly published CVEs affect already-deployed software. A monitoring system is needed that retains the SBOM for deployed software and automatically cross-references newly published CVEs against those SBOMs.
[Post-deployment Newly Disclosed Vulnerability Analysis Procedure Overview]
- Retain SBOMs for deployed software by version.
- Use tools such as Dependency-Track to automatically cross-reference newly published CVEs
against deployed SBOMs and generate a list of affected software versions.
- When affected versions are confirmed, process them according to Method 3 (follow-up)
and Method 4 (customer notification) procedures.
- Monitoring is performed automatically at all times, with weekly summary reports
sent to the security team.
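A minimal sketch of the SBOM cross-reference, assuming retained SBOMs are stored as sets of purls keyed by (product, version); the data shapes are illustrative:

```python
# Retained SBOMs for deployed software, keyed by (product, version).
DEPLOYED_SBOMS = {
    ("acme-app", "1.2.0"): {"pkg:npm/lodash@4.17.20", "pkg:pypi/requests@2.25.0"},
    ("acme-app", "1.3.0"): {"pkg:npm/lodash@4.17.21"},
}

def affected_versions(deployed, vulnerable_purls):
    """List deployed (product, version) pairs that contain any purl
    named in a newly published vulnerability."""
    return sorted(
        key for key, purls in deployed.items() if purls & set(vulnerable_purls)
    )
```

Confirmed matches would then feed the Method 3 (follow-up) and Method 4 (customer notification) procedures.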
Method 6 — Continuous Security Testing
Define a method to apply continuous and iterative security testing to all supplied software before release. Integrating SAST (Static Application Security Testing), DAST (Dynamic Application Security Testing), and SCA into the CI/CD pipeline is the common approach.
[Continuous Security Testing Procedure Overview]
- Security testing by CI/CD pipeline stage:
· On code commit: SAST (static analysis), SCA (open source vulnerability scan)
  · On PR merge: Enforce the security gate (merge is blocked while Critical/High findings remain unresolved)
· On release candidate build: DAST (dynamic analysis), container image scan
- Automatically block release on security test failure and notify the security team.
- Continuously monitor test coverage and results on a dashboard.
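The blocking gate at the PR-merge stage can be sketched as a function over scan findings; the finding fields (`severity`, `resolved`) are illustrative assumptions:

```python
def security_gate(findings, blocking=frozenset({"Critical", "High"})):
    """Return (passed, blockers): the gate fails while any unresolved
    Critical/High finding remains."""
    blockers = [
        f for f in findings
        if f["severity"] in blocking and not f.get("resolved", False)
    ]
    return (not blockers, blockers)
```

A CI job would call this on the merged SAST/SCA results and fail the build (and notify the security team) when `passed` is false.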
Method 7 — Verification of Risk Resolution
Define a method to verify that identified risks have actually been resolved before release. After applying a patch, confirm via rescan that the vulnerability has been eliminated and record the result.
[Risk Resolution Verification Procedure Overview]
- Run a rescan with the same tool after the patch or mitigation is applied.
- Confirm that the vulnerability has been eliminated in the rescan result and record
it in the vulnerability tracking system.
- For Critical/High vulnerabilities, the security team approves the verification result.
- If a release ships with unresolved risks, document management approval and a mitigation plan.
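The rescan check reduces to a set comparison between the rescan's finding IDs and the IDs targeted for remediation; a minimal sketch:

```python
def verify_resolution(rescan_ids, target_ids):
    """Confirm each targeted vulnerability ID no longer appears in the
    rescan results; unresolved IDs are returned for escalation."""
    unresolved = sorted(set(target_ids) & set(rescan_ids))
    return {"verified": not unresolved, "unresolved": unresolved}
```

The returned record would be written to the vulnerability tracking system; for Critical/High items the security team signs off on it before release.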
Method 8 — Export of Risk Information
Define a method to export identified risk information to third parties (supply chain partners, customers, vulnerability databases, etc.) where appropriate. This includes using the VEX (Vulnerability Exploitability eXchange) format, or procedures for reporting vulnerability information to upstream projects through CVD channels.
[Risk Information Export Procedure Overview]
- When a new vulnerability is independently discovered, report it to the upstream project
or CERT following CVD procedures.
- Use VEX format when sharing vulnerability impact information with supply chain partners.
- Before sharing with third parties, conduct a legal review to ensure no trade secrets
or undisclosed information are included.
- Record and retain a log of information exports (recipient, date, content summary).
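Where VEX is used, a statement can be assembled as structured data before publication. The fields below are loosely modeled on OpenVEX and should be validated against the actual specification before sharing:

```python
VALID_STATUSES = {"not_affected", "affected", "fixed", "under_investigation"}

def vex_statement(cve_id, product_purl, status, justification=None):
    """Build a minimal VEX-style statement (field names loosely modeled
    on OpenVEX; validate against the real spec before publishing)."""
    if status not in VALID_STATUSES:
        raise ValueError(f"unknown VEX status: {status}")
    stmt = {
        "vulnerability": {"name": cve_id},
        "products": [{"@id": product_purl}],
        "status": status,
    }
    if justification:
        stmt["justification"] = justification
    return stmt
```

Statements like this let partners see at a glance whether a published CVE actually affects the supplied product, rather than re-triaging every SBOM match themselves.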
5. References
- No corresponding ISO/IEC 5230 clause (new clause exclusive to 18974)
- Related guide: Enterprise Open Source Management Guide — 3. Process
- Related tools: OSV-SCALIBR, Dependency-Track