4.4.1. SWE.1 Software Requirements Analysis
Process ID | SWE.1 |
Process name | Software Requirements Analysis |
Process purpose | The purpose of the Software Requirements Analysis Process is to transform the software related parts of the system requirements into a set of software requirements. |
Process outcomes | As a result of successful implementation of this process: 1) the software requirements to be allocated to the software elements of the system and their interfaces are defined; 2) software requirements are categorized and analyzed for correctness and verifiability; 3) the impact of software requirements on the operating environment is analyzed; 4) prioritization for implementing the software requirements is defined; 5) the software requirements are updated as needed; 6) consistency and bidirectional traceability are established between system requirements and software requirements; and consistency and bidirectional traceability are established between system architectural design and software requirements; 7) the software requirements are evaluated for cost, schedule and technical impact; and 8) the software requirements are agreed and communicated to all affected parties. |
Base practices | SWE.1.BP1: Specify software requirements. Use the system requirements and the system architecture and changes to system requirements and architecture to identify the required functions and capabilities of the software. Specify functional and non-functional software requirements in a software requirements specification. [OUTCOME 1, 5, 7] NOTE 1: Application parameters influencing functions and capabilities are part of the system requirements. NOTE 2: In case of software development only, the system requirements and the system architecture refer to a given operating environment (see also note 5). In that case, stakeholder requirements should be used as the basis for identifying the required functions and capabilities of the software as well as for identifying application parameters influencing software functions and capabilities. SWE.1.BP2: Structure software requirements. Structure the software requirements in the software requirements specification by, e.g., grouping them into project-relevant clusters, sorting them in a logical order for the project, categorizing them based on criteria relevant to the project, and prioritizing them according to stakeholder needs. [OUTCOME 2, 4] NOTE 3: Prioritizing typically includes the assignment of software content to planned releases. Refer to SPL.2.BP1. |
SWE.1.BP3: Analyze software requirements. Analyze the specified software requirements including their interdependencies to ensure correctness, technical feasibility and verifiability, and to support risk identification. Analyze the impact on cost and schedule, and the technical impact. [OUTCOME 2, 7] NOTE 4: The analysis of impact on cost and schedule supports the adjustment of project estimates. Refer to MAN.3.BP5. SWE.1.BP4: Analyze the impact on the operating environment. Analyze the impact that the software requirements will have on interfaces of system elements and the operating environment. [OUTCOME 3, 7] NOTE 5: The operating environment is defined as the system in which the software executes (e.g. hardware, operating system, etc.). SWE.1.BP5: Develop verification criteria. Develop the verification criteria for each software requirement that define the qualitative and quantitative measures for the verification of a requirement. [OUTCOME 2, 7] NOTE 6: Verification criteria demonstrate that a requirement can be verified within agreed constraints and are typically used as the input for the development of the software test cases or other verification measures that should demonstrate compliance with the software requirements. NOTE 7: Verification which cannot be covered by testing is covered by SUP.2. SWE.1.BP6: Establish bidirectional traceability. Establish bidirectional traceability between system requirements and software requirements. Establish bidirectional traceability between the system architecture and software requirements. [OUTCOME 6] NOTE 8: Redundancy should be avoided by establishing a combination of these approaches that covers the project and the organizational needs. NOTE 9: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.1.BP7: Ensure consistency. Ensure consistency between system requirements and software requirements. Ensure consistency between the system architecture and software requirements.
[OUTCOME 6] NOTE 10: Consistency is supported by bidirectional traceability and can be demonstrated by review records. NOTE 11: In case of software development only, the system requirements and system architecture refer to a given operating environment (see also note 2). In that case, consistency and bidirectional traceability have to be ensured between stakeholder requirements and software requirements. SWE.1.BP8: Communicate agreed software requirements. Communicate the agreed software requirements and updates to software requirements to all relevant parties. [OUTCOME 8] |
Output work products | 13-04 Communication record → [OUTCOME 8] 13-19 Review record → [OUTCOME 6] 13-21 Change control record → [OUTCOME 5, 7] 13-22 Traceability record → [OUTCOME 1, 6] 15-01 Analysis report → [OUTCOME 2, 3, 4, 7] |
17-08 Interface requirements specification → [OUTCOME 1, 3] 17-11 Software requirements specification → [OUTCOME 1] 17-50 Verification criteria → [OUTCOME 2] |
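SWE.1.BP6 requires bidirectional traceability between system requirements and software requirements, and NOTE 9 points out that such traceability supports coverage analysis. A minimal sketch of a coverage check over a traceability record follows; all requirement IDs and link data are hypothetical and not taken from the standard.

```python
# Minimal sketch of a bidirectional traceability coverage check (SWE.1.BP6).
# Requirement IDs and links are illustrative only.

def check_bidirectional_trace(sys_reqs, sw_reqs, links):
    """Return system requirements with no downstream software requirement
    and software requirements with no upstream system requirement."""
    traced_sys = {s for s, _ in links}
    traced_sw = {w for _, w in links}
    orphan_sys = sorted(set(sys_reqs) - traced_sys)
    orphan_sw = sorted(set(sw_reqs) - traced_sw)
    return orphan_sys, orphan_sw

sys_reqs = ["SYS-1", "SYS-2", "SYS-3"]
sw_reqs = ["SWR-10", "SWR-11", "SWR-12"]
links = [("SYS-1", "SWR-10"), ("SYS-2", "SWR-11"), ("SYS-2", "SWR-12")]
orphan_sys, orphan_sw = check_bidirectional_trace(sys_reqs, sw_reqs, links)
# SYS-3 has no downstream trace; every software requirement is covered.
```

A check like this is one way to produce evidence for the 13-22 Traceability record; the standard itself does not prescribe any tooling.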
4.4.2. SWE.2 Software Architectural Design
Process ID | SWE.2 |
Process name | Software Architectural Design |
Process purpose | The purpose of the Software Architectural Design Process is to establish an architectural design and to identify which software requirements are to be allocated to which elements of the software, and to evaluate the software architectural design against defined criteria. |
Process outcomes | As a result of successful implementation of this process: 1) a software architectural design is defined that identifies the elements of the software; 2) the software requirements are allocated to the elements of the software; 3) the interfaces of each software element are defined; 4) the dynamic behavior and resource consumption objectives of the software elements are defined; 5) consistency and bidirectional traceability are established between software requirements and software architectural design; and 6) the software architectural design is agreed and communicated to all affected parties. |
Base practices | SWE.2.BP1: Develop software architectural design. Develop and document the software architectural design that specifies the elements of the software with respect to functional and non-functional software requirements. [OUTCOME 1] NOTE 1: The software is decomposed into elements across appropriate hierarchical levels down to the software components (the lowest level elements of the software architectural design) that are described in the detailed design. SWE.2.BP2: Allocate software requirements. Allocate the software requirements to the elements of the software architectural design. [OUTCOME 2] SWE.2.BP3: Define interfaces of software elements. Identify, develop and document the interfaces of each software element. [OUTCOME 3] SWE.2.BP4: Describe dynamic behavior. Evaluate and document the timing and dynamic interaction of software elements to meet the required dynamic behavior of the system. [OUTCOME 4] NOTE 2: Dynamic behavior is determined by operating modes (e.g. start-up, shutdown, normal mode, calibration, diagnosis, etc.), processes and process intercommunication, tasks, threads, time slices, interrupts, etc. NOTE 3: During evaluation of the dynamic behavior the target platform and potential loads on the target should be considered. |
SWE.2.BP5: Define resource consumption objectives. Determine and document the resource consumption objectives for all relevant elements of the software architectural design on the appropriate hierarchical level. [OUTCOME 4] NOTE 4: Resource consumption is typically determined for resources like Memory (ROM, RAM, external / internal EEPROM or Data Flash), CPU load, etc. SWE.2.BP6: Evaluate alternative software architectures. Define evaluation criteria for the architecture. Evaluate alternative software architectures according to the defined criteria. Record the rationale for the chosen software architecture. [OUTCOME 1, 2, 3, 4, 5] NOTE 5: Evaluation criteria may include quality characteristics (modularity, maintainability, expandability, scalability, reliability, security realization and usability) and results of make-buy-reuse analysis. SWE.2.BP7: Establish bidirectional traceability. Establish bidirectional traceability between software requirements and elements of the software architectural design. [OUTCOME 5] NOTE 6: Bidirectional traceability covers allocation of software requirements to the elements of the software architectural design. NOTE 7: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.2.BP8: Ensure consistency. Ensure consistency between software requirements and the software architectural design. [OUTCOME 1, 2, 5, 6] NOTE 8: Consistency is supported by bidirectional traceability and can be demonstrated by review records. SWE.2.BP9: Communicate agreed software architectural design. Communicate the agreed software architectural design and updates to software architectural design to all relevant parties. [OUTCOME 6] | |
Output work products | 04-04 Software architectural design → [OUTCOME 1, 2, 3, 4, 5] 13-04 Communication record → [OUTCOME 6] 13-19 Review record → [OUTCOME 5] 13-22 Traceability record → [OUTCOME 5] 17-08 Interface requirement specification → [OUTCOME 3] |
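SWE.2.BP6 calls for defined evaluation criteria, evaluation of alternative architectures against them, and a recorded rationale. One common (but not prescribed) realization is a weighted scoring matrix; the criteria, weights and scores below are purely illustrative.

```python
# Hypothetical weighted-criteria scoring for alternative software
# architectures (SWE.2.BP6). Weights and scores are illustrative only.

weights = {"modularity": 3, "maintainability": 2, "scalability": 1}

def score(criterion_scores):
    """Weighted sum of per-criterion scores for one architecture."""
    return sum(weights[c] * s for c, s in criterion_scores.items())

alternatives = {
    "layered": {"modularity": 4, "maintainability": 5, "scalability": 3},
    "monolith": {"modularity": 2, "maintainability": 2, "scalability": 4},
}
# Rank alternatives; the scores and ranking form part of the rationale.
ranked = sorted(alternatives, key=lambda a: score(alternatives[a]),
                reverse=True)
# layered: 3*4 + 2*5 + 1*3 = 25; monolith: 3*2 + 2*2 + 1*4 = 14
```

NOTE 5's quality characteristics (modularity, maintainability, etc.) are natural candidates for the criteria; the weighting scheme itself is a project decision.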
4.4.3. SWE.3 Software Detailed Design and Unit Construction
Process ID | SWE.3 |
Process name | Software Detailed Design and Unit Construction |
Process purpose | The purpose of the Software Detailed Design and Unit Construction Process is to provide an evaluated detailed design for the software components and to specify and to produce the software units. |
Process outcomes | As a result of successful implementation of this process: 1) a detailed design is developed that describes software units; 2) interfaces of each software unit are defined; 3) the dynamic behavior of the software units is defined; |
4) consistency and bidirectional traceability are established between software requirements and software units; and consistency and bidirectional traceability are established between software architectural design and software detailed design; and consistency and bidirectional traceability are established between software detailed design and software units; 5) the software detailed design and the relationship to the software architectural design are agreed and communicated to all affected parties; and 6) software units defined by the software detailed design are produced. |
Base practices | SWE.3.BP1: Develop software detailed design. Develop a detailed design for each software component defined in the software architectural design that specifies all software units with respect to functional and non-functional software requirements. [OUTCOME 1] SWE.3.BP2: Define interfaces of software units. Identify, specify and document the interfaces of each software unit. [OUTCOME 2] SWE.3.BP3: Describe dynamic behavior. Evaluate and document the dynamic behavior of and the interaction between relevant software units. [OUTCOME 3] NOTE 1: Not all software units have dynamic behavior to be described. SWE.3.BP4: Evaluate software detailed design. Evaluate the software detailed design in terms of interoperability, interaction, criticality, technical complexity, risks and testability. [OUTCOME 1, 2, 3, 4] NOTE 2: The results of the evaluation can be used as input for software unit verification. SWE.3.BP5: Establish bidirectional traceability. Establish bidirectional traceability between software requirements and software units. Establish bidirectional traceability between the software architectural design and the software detailed design. Establish bidirectional traceability between the software detailed design and software units. [OUTCOME 4] NOTE 3: Redundancy should be avoided by establishing a combination of these approaches that covers the project and the organizational needs. NOTE 4: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.3.BP6: Ensure consistency. Ensure consistency between software requirements and software units. Ensure consistency between the software architectural design, the software detailed design and software units. [OUTCOME 4] NOTE 5: Consistency is supported by bidirectional traceability and can be demonstrated by review records. SWE.3.BP7: Communicate agreed software detailed design.
Communicate the agreed software detailed design and updates to the software detailed design to all relevant parties. [OUTCOME 5] SWE.3.BP8: Develop software units. Develop and document the executable representations of each software unit according to the software detailed design. [OUTCOME 6] |
Output work products | 04-05 Software detailed design → [OUTCOME 1, 2, 3] 11-05 Software unit → [OUTCOME 6] 13-04 Communication record → [OUTCOME 5] 13-19 Review record → [OUTCOME 4] 13-22 Traceability record → [OUTCOME 4] |
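SWE.3.BP1 and SWE.3.BP2 require each software unit and its interface to be specified in the detailed design before the unit is produced (SWE.3.BP8). A toy sketch of a unit whose interface (name, parameters, return values) is fixed by a hypothetical detailed-design entry; the debounce example and all names are invented for illustration.

```python
# Sketch of a software unit constructed to a hypothetical detailed-design
# interface (SWE.3.BP2 / SWE.3.BP8). The interface contract in the
# docstring would come from the 04-05 Software detailed design.

def debounce_counter(raw_active: bool, count: int,
                     threshold: int = 3) -> tuple:
    """Interface per detailed design (hypothetical): takes the raw input
    sample and the persistent counter (0..threshold); returns the new
    counter and the debounced state (True once threshold is reached)."""
    count = min(count + 1, threshold) if raw_active else 0
    return count, count >= threshold

# Dynamic behavior (SWE.3.BP3): state evolves across successive samples.
state = 0
outputs = []
for sample in (True, True, True, False):
    state, debounced = debounce_counter(sample, state)
    outputs.append(debounced)
# The debounced state asserts only on the third consecutive active sample.
```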
4.4.4. SWE.4 Software Unit Verification
Process ID | SWE.4 |
Process name | Software Unit Verification |
Process purpose | The purpose of the Software Unit Verification Process is to verify software units to provide evidence for compliance of the software units with the software detailed design and with the non-functional software requirements. |
Process outcomes | As a result of successful implementation of this process: 1) a software unit verification strategy including regression strategy is developed to verify the software units; 2) criteria for software unit verification are developed according to the software unit verification strategy that are suitable to provide evidence for compliance of the software units with the software detailed design and with the non-functional software requirements; 3) software units are verified according to the software unit verification strategy and the defined criteria for software unit verification and the results are recorded; 4) consistency and bidirectional traceability are established between software units, criteria for verification and verification results; and 5) results of the unit verification are summarized and communicated to all affected parties. |
Base practices | SWE.4.BP1: Develop software unit verification strategy including regression strategy. Develop a strategy for verification of the software units including regression strategy for re-verification if a software unit is changed. The verification strategy shall define how to provide evidence for compliance of the software units with the software detailed design and with the non-functional requirements. [OUTCOME 1] NOTE 1: Possible techniques for unit verification include static/dynamic analysis, code reviews, unit testing etc. SWE.4.BP2: Develop criteria for unit verification. Develop criteria for unit verification that are suitable to provide evidence for compliance of the software units, and their interactions within the component, with the software detailed design and with the non-functional requirements according to the verification strategy. For unit testing, criteria shall be defined in a unit test specification. [OUTCOME 2] NOTE 2: Possible criteria for unit verification include unit test cases, unit test data, static verification, coverage goals and coding standards such as the MISRA rules. NOTE 3: The unit test specification may be implemented e.g. as a script in an automated test bench. |
SWE.4.BP3: Perform static verification of software units. Verify software units for correctness using the defined criteria for verification. Record the results of the static verification. [OUTCOME 3] NOTE 4: Static verification may include static analysis, code reviews, checks against coding standards and guidelines, and other techniques. NOTE 5: See SUP.9 for handling of non-conformances. SWE.4.BP4: Test software units. Test software units using the unit test specification according to the software unit verification strategy. Record the test results and logs. [OUTCOME 3] NOTE 6: See SUP.9 for handling of non-conformances. SWE.4.BP5: Establish bidirectional traceability. Establish bidirectional traceability between software units and static verification results. Establish bidirectional traceability between the software detailed design and the unit test specification. Establish bidirectional traceability between the unit test specification and unit test results. [OUTCOME 4] NOTE 7: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.4.BP6: Ensure consistency. Ensure consistency between the software detailed design and the unit test specification. [OUTCOME 4] NOTE 8: Consistency is supported by bidirectional traceability and can be demonstrated by review records. SWE.4.BP7: Summarize and communicate results. Summarize the unit test results and static verification results and communicate them to all affected parties. [OUTCOME 5] NOTE 9: Providing all necessary information from the test case execution in a summary enables other parties to judge the consequences. |
Output work products | 08-50 Test specification → [OUTCOME 2] 08-52 Test plan → [OUTCOME 1] 13-04 Communication record → [OUTCOME 5] 13-19 Review record → [OUTCOME 3, 4] 13-22 Traceability record → [OUTCOME 4] 13-25 Verification results → [OUTCOME 3, 5] 13-50 Test result → [OUTCOME 3, 5] 15-01 Analysis report → [OUTCOME 3] |
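SWE.4.BP2 requires unit test criteria to be captured in a unit test specification, and NOTE 3 observes that such a specification may be implemented as a script in an automated test bench. A minimal table-driven sketch of that idea; the unit under test and the test-case IDs are invented for illustration.

```python
# Table-driven unit test sketch (SWE.4.BP2 / SWE.4.BP4): test cases taken
# from a hypothetical unit test specification, each with an ID that can be
# traced back to the detailed design (SWE.4.BP5).

def saturate(value, lo, hi):
    """Unit under test: clamp value into the range [lo, hi]."""
    return max(lo, min(value, hi))

# (test-case id, inputs, expected result) — IDs are illustrative only.
unit_test_spec = [
    ("UT-001", (5, 0, 10), 5),    # nominal value passes through
    ("UT-002", (-3, 0, 10), 0),   # lower saturation
    ("UT-003", (42, 0, 10), 10),  # upper saturation
]
# Execute and record results per test case (OUTCOME 3).
results = {tc_id: saturate(*args) == expected
           for tc_id, args, expected in unit_test_spec}
```

In practice the expected results, boundary choices and coverage goals would come from the verification criteria of SWE.4.BP2, not be invented alongside the code.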
4.4.5. SWE.5 Software Integration and Integration Test
Process ID | SWE.5 |
Process name | Software Integration and Integration Test |
Process purpose | The purpose of the Software Integration and Integration Test Process is to integrate the software units into larger software items up to a complete integrated software consistent with the software architectural design and to ensure that the software items are tested to provide evidence for compliance of the integrated software items with the software architectural design, including the interfaces between the software units and between the software items. |
Process outcomes | As a result of successful implementation of this process: 1) a software integration strategy consistent with the project plan, release plan and the software architectural design is developed to integrate the software items; 2) a software integration test strategy including the regression test strategy is developed to test the software unit and software item interactions; 3) a specification for software integration test according to the software integration test strategy is developed that is suitable to provide evidence for compliance of the integrated software items with the software architectural design, including the interfaces between the software units and between the software items; 4) software units and software items are integrated up to a complete integrated software according to the integration strategy; 5) test cases included in the software integration test specification are selected according to the software integration test strategy, and the release plan; 6) integrated software items are tested using the selected test cases and the results of software integration test are recorded; 7) consistency and bidirectional traceability are established between the elements of the software architectural design and the test cases included in the software integration test specification and between test cases and test results; and 8) results of the software integration test are summarized and communicated to all affected parties. |
Base practices | SWE.5.BP1: Develop software integration strategy. Develop a strategy for integrating software items consistent with the project plan and release plan. Identify software items based on the software architectural design and define a sequence for integrating them. [OUTCOME 1] SWE.5.BP2: Develop software integration test strategy including regression test strategy. Develop a strategy for testing the integrated software items following the integration strategy. This includes a regression test strategy for re-testing integrated software items if a software item is changed. [OUTCOME 2] SWE.5.BP3: Develop specification for software integration test. Develop the test specification for software integration test including the test cases according to the software integration test strategy for each integrated software item. The test specification shall be suitable to provide evidence |
for compliance of the integrated software items with the software architectural design. [OUTCOME 3] NOTE 1: Compliance to the architectural design means that the specified integration tests are suitable to prove that the interfaces between the software units and between the software items fulfill the specification given by the software architectural design. NOTE 2: The software integration test cases may focus on the correct dataflow between software items, the timeliness and timing dependencies of dataflow between software items, the correct interpretation of data by all software items using an interface, the dynamic interaction between software items, and the compliance to resource consumption objectives of interfaces. SWE.5.BP4: Integrate software units and software items. Integrate the software units to software items and software items to integrated software according to the software integration strategy. [OUTCOME 4] SWE.5.BP5: Select test cases. Select test cases from the software integration test specification. The selection of test cases shall have sufficient coverage according to the software integration test strategy and the release plan. [OUTCOME 5] SWE.5.BP6: Perform software integration test. Perform the software integration test using the selected test cases. Record the integration test results and logs. [OUTCOME 6] NOTE 4: See SUP.9 for handling of non-conformances. NOTE 5: The software integration test may be supported by using hardware debug interfaces or simulation environments (e.g. Software-in-the-Loop- Simulation). SWE.5.BP7: Establish bidirectional traceability. Establish bidirectional traceability between elements of the software architectural design and test cases included in the software integration test specification. Establish bidirectional traceability between test cases included in the software integration test specification and software integration test results.
[OUTCOME 7] NOTE 6: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.5.BP8: Ensure consistency. Ensure consistency between elements of the software architectural design and test cases included in the software integration test specification. [OUTCOME 7] NOTE 7: Consistency is supported by bidirectional traceability and can be demonstrated by review records. SWE.5.BP9: Summarize and communicate results. Summarize the software integration test results and communicate them to all affected parties. [OUTCOME 8] NOTE 8: Providing all necessary information from the test case execution in a summary enables other parties to judge the consequences. |
Output work products | 01-03 Software item → [OUTCOME 4] |
01-50 Integrated software → [OUTCOME 4] 08-50 Test specification → [OUTCOME 3, 5] 08-52 Test plan → [OUTCOME 1, 2] 13-04 Communication record → [OUTCOME 8] 13-19 Review record → [OUTCOME 7] 13-22 Traceability record → [OUTCOME 7] 13-50 Test result → [OUTCOME 6, 8] 17-02 Build list → [OUTCOME 4, 7] |
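SWE.5.BP5 requires selecting test cases from the software integration test specification with sufficient coverage according to the integration test strategy and the release plan. A sketch of one possible selection rule, where each test case is tagged with the interfaces it exercises and only cases fully covered by the current release content are selected; the tags and release content are hypothetical.

```python
# Sketch of release-driven test-case selection (SWE.5.BP5). Interface tags
# and release content are illustrative; the standard does not prescribe a
# selection mechanism.

# Interfaces delivered in the current release (per a hypothetical release plan).
release_interfaces = {"CAN_Rx", "DiagSession"}

# (test-case id, interfaces the case exercises) from the integration
# test specification.
integration_spec = [
    ("IT-01", {"CAN_Rx"}),
    ("IT-02", {"FlashWrite"}),           # interface not in this release
    ("IT-03", {"CAN_Rx", "DiagSession"}),
]
# Select cases whose exercised interfaces are all part of the release.
selected = [tc for tc, interfaces in integration_spec
            if interfaces <= release_interfaces]
```

Recording which cases were selected, and why, feeds the traceability and coverage argument required by OUTCOME 5 and OUTCOME 7.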
4.4.6. SWE.6 Software Qualification Test
Process ID | SWE.6 |
Process name | Software Qualification Test |
Process purpose | The purpose of the Software Qualification Test Process is to ensure that the integrated software is tested to provide evidence for compliance with the software requirements. |
Process outcomes | As a result of successful implementation of this process: 1) a software qualification test strategy including regression test strategy consistent with the project plan and release plan is developed to test the integrated software; 2) a specification for software qualification test of the integrated software according to the software qualification test strategy is developed that is suitable to provide evidence for compliance with the software requirements; 3) test cases included in the software qualification test specification are selected according to the software qualification test strategy and the release plan; 4) the integrated software is tested using the selected test cases and the results of software qualification test are recorded; 5) consistency and bidirectional traceability are established between software requirements and software qualification test specification including test cases and between test cases and test results; and 6) results of the software qualification test are summarized and communicated to all affected parties. |
Base practices | SWE.6.BP1: Develop software qualification test strategy including regression test strategy. Develop a strategy for software qualification testing consistent with the project plan and the release plan. This includes a regression test strategy for re-testing the integrated software if a software item is changed. [OUTCOME 1] SWE.6.BP2: Develop specification for software qualification test. Develop the specification for software qualification test including test cases based on the verification criteria, according to the software test strategy. The test specification shall be suitable to provide evidence for compliance of the integrated software with the software requirements. [OUTCOME 2] |
SWE.6.BP3: Select test cases. Select test cases from the software test specification. The selection of test cases shall have sufficient coverage according to the software test strategy and the release plan. [OUTCOME 3] SWE.6.BP4: Test integrated software. Test the integrated software using the selected test cases. Record the software test results and logs. [OUTCOME 4] NOTE 1: See SUP.9 for handling of non-conformances. SWE.6.BP5: Establish bidirectional traceability. Establish bidirectional traceability between software requirements and test cases included in the software qualification test specification. Establish bidirectional traceability between test cases included in the software qualification test specification and software qualification test results. [OUTCOME 5] NOTE 2: Bidirectional traceability supports coverage, consistency and impact analysis. SWE.6.BP6: Ensure consistency. Ensure consistency between software requirements and test cases included in the software qualification test specification. [OUTCOME 5] NOTE 3: Consistency is supported by bidirectional traceability and can be demonstrated by review records. SWE.6.BP7: Summarize and communicate results. Summarize the software qualification test results and communicate them to all affected parties. [OUTCOME 6] NOTE 4: Providing all necessary information from the test case execution in a summary enables other parties to judge the consequences. |
Output work products | 08-50 Test specification → [OUTCOME 2, 3] 08-52 Test plan → [OUTCOME 1] 13-04 Communication record → [OUTCOME 6] 13-19 Review record → [OUTCOME 5] 13-22 Traceability record → [OUTCOME 5] 13-50 Test result → [OUTCOME 4, 6] |
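SWE.6.BP7 asks for a summary of the qualification test results, and SWE.6.BP5 for traceability between test cases and software requirements. A sketch combining both, deriving per-requirement findings from the traceability links; all test and requirement IDs are hypothetical.

```python
# Sketch of a qualification test summary (SWE.6.BP7) with requirement-level
# findings derived from traceability links (SWE.6.BP5). IDs are invented.

# Recorded qualification test results (OUTCOME 4).
test_results = {"QT-1": "pass", "QT-2": "fail", "QT-3": "pass"}
# Traceability: test case -> software requirement it verifies (OUTCOME 5).
trace = {"QT-1": "SWR-10", "QT-2": "SWR-11", "QT-3": "SWR-11"}

summary = {
    "executed": len(test_results),
    "passed": sum(r == "pass" for r in test_results.values()),
    # Requirements with at least one failing test, for the communication
    # to affected parties (OUTCOME 6).
    "failed_requirements": sorted({trace[t]
                                   for t, r in test_results.items()
                                   if r == "fail"}),
}
```

As NOTE 4 indicates, the point of the summary is to give other parties enough information to judge the consequences, which is why the failing requirements, not just raw counts, are worth reporting.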