mirror of https://github.com/SAP/jenkins-library.git synced 2025-03-03 15:02:35 +02:00

Move docs of Cloud SDK Pipeline (#1167)

This commit is contained in:
Florian Wilhelm 2020-02-18 17:51:44 +01:00 committed by GitHub
parent e81f40f645
commit 1b6781c1e9
5 changed files with 390 additions and 3 deletions


@@ -0,0 +1,245 @@
# Build Tools
The SAP Cloud SDK supports multiple programming languages (Java and JavaScript) and can be used in the SAP Cloud Application Programming Model.
For each of these variants, project templates exist (as referenced in the project's main [Readme](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/README.md) file).
These templates introduce standard tooling, such as build tools, and a standard structure.
The SAP Cloud SDK Continuous Delivery Toolkit expects that the project follows this structure and depends on the build tools introduced by these templates.
The supported build tools are:
* [Maven](https://maven.apache.org/) for Java projects
* [npm](https://www.npmjs.com/) for JavaScript projects
* [MTA](https://sap.github.io/cloud-mta-build-tool) for Multi-Target Application Model projects
MTA itself makes use of other build tools, such as Maven and npm, depending on the types of modules your application contains.
*Note: The npm pipeline variant is in an early state. Some interfaces might change. We recommend consuming a fixed released version as described in the project [Readme](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/README.md#versioning).*
## Feature Matrix
Support for the different features of the pipeline may vary in each variant of the SDK pipeline build tool.
The following table gives an overview of the features available per build tool.
| Feature | Maven | npm | MTA Maven | MTA npm |
|----------------------------|-------|-----|-----------|---------|
| Automatic Versioning | x | | x | x |
| Build | x | x | x | x |
| Backend Integration Tests | x | x | x | x |
| Frontend Integration Tests | x | x | x | x |
| Backend Unit Tests | x | x | x | x |
| Frontend Unit Tests | x | x | x | x |
| NPM Dependency Audit | x | x | x | x |
| Linting | x | | x | x |
| Static Code Checks | x | | x | |
| End-To-End Tests | x | | x | x |
| Performance Tests | x | | x | |
| Resilience Checks | x | | x | |
| S4HANA Public APIs | x | | x | |
| Code Coverage Checks | x | x | x | x |
| Checkmarx Integration | x | | x | |
| Fortify Integration | x | | x | |
| SourceClear Integration | x | | | |
| Whitesource Integration | x | x | x | x |
| Deployment to Nexus | x | | x | x |
| Zero Downtime Deployment | x | x | x¹ | x¹ |
| Download Cache | x | x | x | x |
¹ MTA projects can only be deployed to the Cloud Foundry Environment
## Project Requirements
Each variant of the pipeline has different requirements regarding the project structure, location of reports, and tooling.
Stages not listed here do not have special requirements.
In any case, please also consult the [documentation of the pipeline configuration](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md), as some stages have to be activated by providing configuration values.
### Build Tool Independent Requirements
In order to run in the pipeline, your project has to include the following two files in its root folder: `Jenkinsfile` and `pipeline_config.yml`.
You can copy both files from this [GitHub repository](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources).
There are two variants of the configuration file.
Please pick the version corresponding to your deployment target and rename it accordingly.
#### Frontend Unit Tests
The command `npm run ci-frontend-unit-test` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/frontend-unit` in the root directory.
The required format of the test result report is the JUnit format as an `.xml` file.
The code coverage report can be published as an HTML report and in the Cobertura format.
The HTML report has to be stored in the directory `./s4hana_pipeline/reports/coverage-reports/frontend-unit/report-html/ut/` as an `index.html` file.
These coverage reports will then be published in Jenkins.
Furthermore, if configured in the `pipeline_config.yml`, the pipeline ensures the configured level of code coverage.
In MTA projects Frontend Unit Tests are executed for every module of type `html5`.
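Taken together, the report paths above imply the following directory layout (a sketch; only the paths named in this section are shown):

```
s4hana_pipeline/reports/
├── frontend-unit                 // JUnit test results as .xml files
└── coverage-reports
    └── frontend-unit
        └── report-html
            └── ut
                └── index.html    // HTML coverage report (Cobertura)
```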
#### Frontend Integration Tests
The command `npm run ci-it-frontend` will be executed in this stage and has to be defined in the `package.json` in the root.
In this stage, the frontend should be tested end-to-end without the backend.
To this end, a browser is started to simulate user interactions.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/frontend-integration` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
It is recommended to use the same tools as in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
#### Backend Unit Tests
##### Maven
In the Maven module called `unit-tests`, we run the command `mvn test`.
##### Java MTA modules
We run the command `mvn test` in each Java MTA module.
##### npm and Node.js MTA modules
For each `package.json` where the script `ci-backend-unit-test` is defined, the command `npm run ci-backend-unit-test` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/backend-unit/` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
For the code coverage, the results have to be stored in the folder `./s4hana_pipeline/reports/coverage-reports/backend-unit/` in the Cobertura format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
We recommend the tools used in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
If you have multiple npm packages with unit tests, the report files must have unique names.
#### Backend Integration Tests
##### Maven and Java MTA modules
If there is a Maven module called `integration-tests`, we run `mvn test` in this module.
##### npm and Node.js MTA modules
For each `package.json` where the script `ci-it-backend` is defined, the command `npm run ci-it-backend` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/backend-integration` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
For the code coverage, the results have to be stored in the folder `./s4hana_pipeline/reports/coverage-reports/backend-integration/` in the Cobertura format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
We recommend the tools used in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
If you have multiple npm packages with integration tests, the report files must have unique names.
#### End-to-End Tests
This stage is only executed if you configured it in the file `pipeline_config.yml`.
The command `npm run ci-e2e` will be executed in this stage.
The URL defined as `appUrl` in the file `pipeline_config.yml` will be passed as an argument named `launchUrl` to the tests.
This can be reproduced locally by executing:
```
npm run ci-e2e -- --launchUrl=https://path/to/your/running/application
```
The credentials defined in the file `pipeline_config.yml` will also be available during test execution as environment variables named `e2e_username` and `e2e_password`.
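A minimal sketch of reproducing this environment locally; the credential values below are placeholders for what the pipeline injects:

```shell
# Placeholders for the credentials configured in pipeline_config.yml;
# during the End-to-End Tests stage, the pipeline exports these automatically.
export e2e_username='jane.doe'
export e2e_password='top-secret'
# With the variables set, run the tests against a deployed application:
# npm run ci-e2e -- --launchUrl=https://path/to/your/running/application
```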
The test results have to be stored in the folder `./s4hana_pipeline/reports/e2e` in the root directory.
The required format of the test result report is the Cucumber format as a `.json` file, or the JUnit format as an `.xml` file.
Also, screenshots can be stored in this folder.
The screenshots and reports will then be published in Jenkins.
The user is responsible for using a proper reporter to generate the results.
#### Performance Tests
This stage is only executed if you configured it in the file `pipeline_config.yml`.
Performance tests can be executed using [JMeter](https://jmeter.apache.org/) or [Gatling](https://gatling.io/).
If only JMeter is used as a performance test tool, the test plans can be placed in the default location, which is the directory `{project_root}/performance-tests`.
However, if JMeter is used along with Gatling, the JMeter test plans have to be kept in a subdirectory of `performance-tests`, for example `./performance-tests/JMeter/`.
The Gatling test project, including its `pom.xml`, should be placed in the directory `{project_root}/performance-tests`.
Afterwards, Gatling has to be enabled in the configuration.
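A combined JMeter and Gatling setup might then look like this (file names below are hypothetical):

```
performance-tests/
├── JMeter
│   └── my-test-plan.jmx    // JMeter test plan
├── pom.xml                 // Gatling test project
└── src
    └── test
        └── scala           // Gatling simulations
```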
#### Deployments
For all deployments to Cloud Foundry (excluding MTA) there has to be a file called `manifest.yml`.
This file must contain exactly one application.
*Note: For JavaScript projects the path of the application should point to the folder `deployment`.*
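A minimal sketch of such a `manifest.yml`; the application name is a placeholder:

```yaml
applications:
  - name: my-app        # placeholder
    path: deployment    # for JavaScript projects, point to the deployment folder
```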
### Java / Maven
For Maven, the pipeline expects the following structure.
The project should have three Maven modules named:
- `application`
- `unit-tests`
- `integration-tests`
The module `application` should contain the application code.
The modules `unit-tests` and `integration-tests` should contain the corresponding tests.
Furthermore, the test modules have to include the following dependency:
```xml
<dependency>
<groupId>com.sap.cloud.s4hana.quality</groupId>
<artifactId>listeners-all</artifactId>
<scope>test</scope>
</dependency>
```
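Assuming this structure, the parent `pom.xml` aggregates the three modules; a sketch of its module section:

```xml
<!-- Sketch of the parent pom's aggregation of the three expected modules -->
<modules>
    <module>application</module>
    <module>unit-tests</module>
    <module>integration-tests</module>
</modules>
```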
### JavaScript / npm
The project has to use npm and include a `package.json` in the root directory.
In the pipeline stages, specific scripts in the `package.json` are called to build the project or run tests.
Furthermore, the pipeline expects reports, such as test results, to be written into certain folders.
These stage specific requirements are documented below.
#### Build
By default, `npm ci` will be executed.
After `npm ci`, the command `npm run ci-build` will be executed.
This script can be used to, for example, compile TypeScript resources or webpack the frontend.
In the build stage, development dependencies are also installed and tests should be compiled.
Afterwards, the command `npm run ci-package` will be executed.
This step should prepare the deployment by copying all deployment relevant files into the folder `deployment` located in the root of the project.
This folder should not contain any non-production-related resources, such as tests or development dependencies.
This directory has to be defined as the `path` in the `manifest.yml`.
*Note: This step runs isolated from the steps before. Thus, modifying `node_modules` with `npm prune --production`, for example, will not have an effect on later stages, such as the test execution.*
### SAP Cloud Application Programming Model / MTA
The project structure follows the standard structure for projects created via the _SAP Cloud Platform Business Application_ SAP Web IDE Template with some constraints.
Please leave the basic structure of the generated project intact.
Make sure to check the _Include support for continuous delivery pipeline of SAP Cloud SDK_ checkbox, which will automatically add the required files for continuous delivery to your project.
If you already created your project without this option, you'll need to copy and paste two files into the root directory of your project, and commit them to your git repository:
* [`Jenkinsfile`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile)
* [`pipeline_config.yml`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/cf-pipeline_config.yml)
* Note: The file must be named `pipeline_config.yml`, despite the different name of the file template
Further constraints apply to the project structure (all of them are fulfilled in projects generated from the _SAP Cloud Platform Business Application_ SAP Web IDE Template):
On the project root level, a `pom.xml` file is required.
Java services are Maven projects which include the application code and the unit tests.
A service is typically called `srv`, but the name can be chosen freely.
An `integration-test` module must exist on the root level.
This module is where integration between the services can be tested.
In summary, the project structure should look like this:
```
.
├── Jenkinsfile
├── app // web application, not required
├── db // only if database module exists
├── integration-tests
│   ├── pom.xml
│   └── src
│       └── test
├── mta.yaml
├── package.json
├── pipeline_config.yml
├── pom.xml
└── srv
    ├── pom.xml
    └── src
        ├── main
        └── test // Unit-Tests for this service
```


@@ -0,0 +1,110 @@
# Checked Qualities in the SAP Cloud SDK Pipeline
The goal of the SAP Cloud SDK Pipeline is to help you build high quality applications which run on SAP Cloud Platform.
To achieve this, the SAP Cloud SDK Pipeline checks qualities when building your application.
This document summarizes the qualities that are checked by the SAP Cloud SDK Pipeline.
## SAP Cloud SDK Specific Checks
### Required Dependencies
For the SAP Cloud SDK specific checks to work, a few dependencies are required in unit and integration tests.
The Cloud SDK pipeline checks whether the `odata-querylistener`, `rfc-querylistener`, and `httpclient-listener` dependencies are present in the `unit-tests` and `integration-tests` Maven modules.
If one of these dependencies is missing, the pipeline adds the `listeners-all` dependency to the pom on the fly before executing the respective tests.
This means that, as a user of the SDK, you do not have to add these dependencies manually.
However, declaring them can speed up the pipeline, since the `pom.xml` will not be changed if the dependencies are already available.
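Declaring one of the listener dependencies explicitly might look like this (a sketch; the `groupId` is assumed to match the `listeners-all` dependency, and the version is assumed to be managed centrally):

```xml
<dependency>
    <groupId>com.sap.cloud.s4hana.quality</groupId>
    <artifactId>odata-querylistener</artifactId>
    <scope>test</scope>
</dependency>
```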
### Only Depend on Official API
This quality checks for usage of unofficial RFC and OData services.
Only official APIs from the [SAP API Business Hub](https://api.sap.com/) should be used, since unofficial APIs don't provide any stable interfaces.
A list of official APIs can be found in [this blog post](https://blogs.sap.com/2017/09/22/quality-checks/).
### Resilient Network Calls
When building extension applications on SAP Cloud Platform, you always deal with a distributed system.
There are at least two applications in this scenario: your extension application and SAP S/4HANA.
In distributed systems, you may not assume that the network is reliable.
To mitigate unreliable networks, a pattern called _circuit breaker_ is commonly used.
The idea is that you define a fallback action in case the network fails too often in a short time span.
The fallback might use cached data, or default values, depending on what works best in your problem domain.
To implement this pattern, the SAP Cloud SDK integrates with the [Hystrix](https://github.com/Netflix/Hystrix) library.
Version 3 of the SAP Cloud SDK integrates with the [resilience4j](https://github.com/resilience4j/resilience4j) library.
This quality check tests that your remote calls are wrapped in a Hystrix command (v2) or in a `ResilienceDecorator` (v3).
The build will fail with an error message like `Your project accesses downstream systems in a non-resilient manner` if this is not the case.
More information on building resilient applications is available in [this blog post](https://blogs.sap.com/2017/06/23/step-5-resilience-with-hystrix/).
## Functional Tests
Ensuring the functional correctness of an application requires automated tests, which are part of the application code.
Those qualities depend on the test code written by the application developer.
### Unit Tests
The purpose of unit tests is to verify the correctness of a single _unit_ in isolation.
Other components than the _unit under test_ may be mocked for testing purposes.
Place your unit tests in the appropriate Maven module (`unit-tests`) in order to make the pipeline run them automatically.
### Integration Tests
Integration tests work on a higher level compared to unit tests.
They should ensure that independently tested units work together as they need to.
In the context of extension applications on SAP Cloud Platform, this means to ensure _interoperability of your application with S/4HANA_ and _interoperability between your application's backend and frontend component_.
Place your integration tests in the appropriate Maven module (`integration-tests`) in order to make the pipeline run them automatically.
For a more detailed description, refer to [this blog post](https://blogs.sap.com/2017/09/19/step-12-with-sap-s4hana-cloud-sdk-automated-testing/).
### End-to-End Tests
End-to-end tests use your application, like a human user would by clicking buttons, entering text into forms and waiting for the result.
Place your end-to-end tests in the `e2e-tests` directory and ensure the `ci-e2e` script in `package.json` runs the right command.
The output folder for the reports needs to be `s4hana_pipeline/reports/e2e`.
### Code Coverage
Code coverage refers to how much of your application code is tested.
The build fails if the test coverage of your code drops below a certain threshold.
To fix such a build failure, check which parts of your code are not tested yet and write missing tests.
The code coverage is tested using [JaCoCo Java Code Coverage Library](https://www.eclemma.org/jacoco/).
## Non-Functional Tests
### Performance
Performance relates to how quickly your application reacts under heavy load.
For implementing performance tests, you can choose between two open-source tools: [JMeter](https://jmeter.apache.org/) and [Gatling](https://gatling.io/).
If you're not familiar with both of them, we recommend using Gatling.
More information on testing the performance of your application is available in [this blog post](https://blogs.sap.com/2018/01/11/step-23-with-sap-s4hana-cloud-sdk-performance-tests/).
### Static Code Checks
Static code checks look for potential issues in code without running the program.
The SAP Cloud SDK Pipeline includes commonly used static checks using both [PMD](https://pmd.github.io/) and [SpotBugs](https://spotbugs.github.io/).
In addition to the default checks of those tools, it adds the following SAP Cloud SDK specific checks:
* To make post-mortem debugging possible:
    * Log the exception in the catch block or in a called handling method, or reference it in a newly thrown exception
    * Reference the exception when logging inside a catch block
* In order to allow a smooth transition from Neo to Cloud Foundry, use the platform-independent abstractions provided by the SAP S/4HANA Cloud SDK
### Lint
The pipeline automatically checks JavaScript and XML files in SAPUI5 components for the SAPUI5 recommended best practices.
[Custom linters](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/extensibility.md#custom-linters) can be implemented by development teams, if desired.
This makes it possible to enforce a common coding style within a team of developers, making it easier to focus on the application code rather than discussing minor style issues.
### Third-Party Tools
The SAP Cloud SDK Pipeline also integrates with commercial third party code analyzer services, if you wish to use them.
Currently, [Checkmarx](https://www.checkmarx.com/), [WhiteSource](https://www.whitesourcesoftware.com/), and [SourceClear](https://www.sourceclear.com/) are available.
For those scans to be enabled, they need to be configured in the [pipeline configuration file](../../configuration.md).


@@ -7,7 +7,7 @@ Thanks to highly streamlined components, setting up and delivering your first pr
## Qualities and Pipeline Features
The SAP Cloud SDK pipeline is based on project "piper" and offers unique features for assuring that your SAP Cloud SDK based application fulfills highest quality standards.
The SAP Cloud SDK pipeline is based on project "piper" and offers unique features for assuring that your SAP Cloud SDK based application fulfills the highest quality standards.
In conjunction with the SAP Cloud SDK libraries, the pipeline helps you to implement and automatically assure application qualities, for example:
* Functional correctness via:
@@ -23,6 +23,8 @@ In conjunction with the SAP Cloud SDK libraries, the pipeline helps you to imple
* Zero-downtime deployment
* Proper logging of application errors
For more details, see [Cloud Qualities](cloud-qualities).
![Screenshot of SAP Cloud SDK Pipeline](../../images/cloud-sdk-pipeline.png)
## Supported Project Types
@@ -34,7 +36,7 @@ The pipeline supports the following types of projects:
* TypeScript projects based on the [SAP Cloud SDK TypeScript Scaffolding](https://github.com/SAP/cloud-s4-sdk-examples/tree/scaffolding-ts).
* SAP Cloud Application Programming Model (CAP) projects based on the _SAP Cloud Platform Business Application_ WebIDE Template.
You can find more details about the supported project types and build tools in the [project documentation](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/build-tools.md).
You can find more details about the supported project types and build tools in [Build Tools](build-tools).
## Legal Notes


@@ -0,0 +1,26 @@
# Share Configuration Between Projects
SAP Cloud SDK Pipeline does not require any programming on the application developer's end, as the pipeline is centrally developed and maintained.
The necessary configuration happens in the `pipeline_config.yml` file in the root directory of the application's repository.
For projects that are composed of multiple repositories (microservices), it might be desirable to share common configuration.
To do that, create a YAML file which is accessible from your CI/CD environment and configure it in your project.
For example, the common configuration can be stored in a GitHub repository and accessed via the "raw" URL:
```yaml
general:
  sharedConfiguration: 'https://my.github.local/raw/someorg/shared-config/master/backend-service.yml'
```
It is important to ensure that the HTTP response body is proper YAML, as the pipeline will attempt to parse it.
Anonymous read access to the `shared-config` repository is required.
The shared config is merged with the project's `pipeline_config.yml`.
Note that the project's config takes precedence, so you can override the shared configuration in your project's local configuration.
This might be useful to provide a default value that needs to be changed only in some projects.
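As an illustration of this precedence (the `productiveBranch` key is hypothetical here), consider a shared file and a project file that both set the same value:

```yaml
# shared-config/backend-service.yml (shared default)
general:
  productiveBranch: 'master'

# pipeline_config.yml (project-local; its value wins during the merge)
general:
  sharedConfiguration: 'https://my.github.local/raw/someorg/shared-config/master/backend-service.yml'
  productiveBranch: 'release'
```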
If you have different types of projects, they might require different shared configuration.
For example, you might not require all projects to have a certain code check (like Checkmarx, SourceClear, WhiteSource) active.
This can be achieved by having multiple YAML files in the _shared-config_ repository.
Configure the URL to the respective configuration file in the projects as described above.


@@ -20,7 +20,11 @@ nav:
- 'Confirm Stage': stages/confirm.md
- 'Promote Stage': stages/promote.md
- 'Release Stage': stages/release.md
- 'SAP Cloud SDK pipeline': pipelines/cloud-sdk/introduction.md
- 'SAP Cloud SDK pipeline':
- 'Introduction': pipelines/cloud-sdk/introduction.md
- 'Build Tools': pipelines/cloud-sdk/build-tools.md
- 'Cloud Qualities': pipelines/cloud-sdk/cloud-qualities.md
- 'Shared Configuration': pipelines/cloud-sdk/shared-config-between-projects.md
- 'Scenarios':
- 'Build and Deploy Hybrid Applications with Jenkins and SAP Solution Manager': scenarios/changeManagement.md
- 'Build and Deploy SAP UI5 or SAP Fiori Applications on SAP Cloud Platform with Jenkins': scenarios/ui5-sap-cp/Readme.md