
Remove sdk docs (#2365)

* Remove sdk docs

* Apply suggestions from code review

Co-authored-by: Stephan Aßmus <stephan.assmus@sap.com>

* Add hint regarding Cloud SDK Pipeline

* Update documentation/docs/guidedtour.md

Co-authored-by: Stephan Aßmus <stephan.assmus@sap.com>

Co-authored-by: Stephan Aßmus <stephan.assmus@sap.com>
Daniel Kurzynski 2020-11-17 17:35:01 +01:00 committed by GitHub
parent cbd932a5eb
commit 720ba0c875
9 changed files with 11 additions and 934 deletions


@@ -28,7 +28,6 @@ To do so, create a file called `<StageName>.groovy` (for example, `Acceptance.gr
For this, you need to know the technical identifiers for stage names.
* For the general purpose pipeline, you can find them in [the pipeline source file](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy).
* For the SAP Cloud SDK pipeline, you can find them in [this GitHub search query](https://github.com/SAP/cloud-s4-sdk-pipeline-lib/search?q=%22def+stageName+%3D%22).
The centrally maintained pipeline checks if such a file exists and if it does, executes it.
A parameter of type `Map` that contains the following keys is passed to the extension:
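The full key list is elided in this diff hunk. As a minimal hedged sketch, assuming the extension interface documented for project "Piper" (`stageName`, `config`, and `originalStage` as a closure), an extension that wraps the original stage could look like this:

```groovy
// .pipeline/extensions/<StageName>.groovy — minimal extension sketch
def call(Map parameters) {
    echo "Extending stage ${parameters.stageName}"
    parameters.originalStage()   // run the original stage implementation
    // custom logic can read the stage configuration via parameters.config
}
return this
```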
@@ -73,11 +72,11 @@ return this
### Practical example
For a more practical example, you can use extensions in the SAP Cloud SDK pipeline to add custom linters to your pipeline.
For a more practical example, you can use extensions in the general purpose pipeline to add custom linters to your pipeline.
A linter is a tool that can check the source code for certain stylistic criteria. Many teams choose to use a linter to ensure a common programming style.
For example, if you want to use [Checkstyle](https://checkstyle.sourceforge.io/) in your codebase, you might use an extension similar to this one in a file called `.pipeline/extensions/build.groovy` in your project:
For example, if you want to use [Checkstyle](https://checkstyle.sourceforge.io/) in your codebase, you might use an extension similar to this one in a file called `.pipeline/extensions/Build.groovy` in your project:
```groovy
def call(Map parameters) {
@@ -104,7 +103,7 @@ return this
This code snippet has three components; let's see what is happening here:
Firstly, we run the original stage.
This builds the application and runs ESLint on JavaScript/TypeScript source files and static checks using PMD and SpotBugs tools as these are standard features of SAP Cloud SDK pipeline.
This builds the application and optionally runs ESLint on JavaScript/TypeScript source files and static checks using PMD and SpotBugs tools, as these are standard features of the General Purpose Pipeline.
Secondly, we run the Checkstyle Maven plugin using the `mavenExecute` Jenkins library step as provided by project "Piper".
This serves as an example of how flexibly you can re-use what project "Piper" already provides in your extension.
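The extension body itself is elided in this diff; a hedged sketch of such a `mavenExecute` call, assuming the step accepts `script` and `goals` parameters and that the extension's parameter map exposes `script`, might look like this:

```groovy
// Sketch: run the Checkstyle Maven plugin from inside a stage extension
mavenExecute(
    script: parameters.script,
    goals: ['checkstyle:checkstyle']
)
```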
@@ -146,12 +145,10 @@ void call(parameters) {
The actual pipeline code (the `call` method in the listing above) can be found here:
* [General purpose pipeline](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy)
* [SAP Cloud SDK pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline-lib/blob/master/vars/cloudSdkPipeline.groovy)
!!! note "Use the correct shared library definition"
Which shared library you need depends on the pipeline you're using.<br />
For the [general purpose pipeline](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy), you need `'piper-lib-os'`.<br />
For the [SAP Cloud SDK pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline-lib/blob/master/vars/cloudSdkPipeline.groovy), you need `'s4sdk-pipeline-library'`.
For the [general purpose pipeline](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy), you need `'piper-lib-os'`.
For the version identifier, please see the section _How to stay up-to-date_ in this document.
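Putting this together, a minimal `Jenkinsfile` for the general purpose pipeline might look like the following sketch; the version tag is a placeholder, not a recommendation:

```groovy
@Library('piper-lib-os@v1.77.0') _   // placeholder tag — pick a real release
piperPipeline script: this
```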
@@ -171,7 +168,7 @@ A minimal example of such a library could have the following directory structure
./README.md
```
`myCustomPipeline.groovy` contains the modified pipeline code of the [general purpose pipeline](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy) or [SAP Cloud SDK Pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline-lib/blob/master/vars/cloudSdkPipeline.groovy).
`myCustomPipeline.groovy` contains the modified pipeline code of the [general purpose pipeline](https://github.com/SAP/jenkins-library/blob/master/vars/piperPipeline.groovy).
!!! note
The name of your custom pipeline _must_ differ from the other pipelines provided by project "Piper" because Jenkins requires names across multiple libraries to be unique.
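As a hedged sketch, such a custom pipeline could start as a renamed copy of the original; the stage step shown is an illustrative excerpt, not the full pipeline:

```groovy
// vars/myCustomPipeline.groovy — skeleton of a modified pipeline copy
void call(Map parameters) {
    pipeline {
        agent none
        stages {
            stage('Init') {
                steps {
                    piperPipelineStageInit script: parameters.script
                }
            }
            // ... further stages, copied and adapted from piperPipeline.groovy
        }
    }
}
```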
@@ -208,7 +205,7 @@ Please be aware that stages may have dependencies on each other.
The downside is that in rare cases, breaking changes might happen.
Another potential issue is that your builds are not _repeatable_; that is, building the same version of your application twice _might_ have a different result.
For those reasons, you might want to consider to fix versions to a released version like in this example: `@Library('my-shared-library@v1.0') _`<br />
Find the most recent release for the [jenkins-library](https://github.com/SAP/jenkins-library/releases) and for the [SAP Cloud SDK Pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline/releases) on GitHub.
Find the most recent release for the [jenkins-library](https://github.com/SAP/jenkins-library/releases) on GitHub.
To stay up to date with the latest releases, you can ["watch" releases for those repositories on GitHub](https://help.github.com/en/github/receiving-notifications-about-activity-on-github/watching-and-unwatching-releases-for-a-repository).
!!! note "When to go with a modified ready-made pipeline"


@@ -199,14 +199,11 @@ Open the application name to get into the `Application Overview`. Open the **App
## What's Next
You are now familiar with the basics of using project "Piper". Through the concept of pipeline as code, project "Piper" and Jenkins pipelines are extremely powerful. While Jenkins pipelines offer a full set of common programming features, project "Piper" adds SAP-specific flavors. Have a look at the different **Scenarios** to understand how to easily integrate SAP systems with defaults.
Dive into the ready-made continuous delivery pipelines: the **General Purpose Pipeline**
and **SAP Cloud SDK Pipeline** help you quickly build and deliver your apps.
Dive into the ready-made continuous delivery pipeline: the **General Purpose Pipeline** helps you to quickly build and deliver your apps.
Browse the steadily increasing list of features you can implement through the project "Piper" **Steps**.
The **Configuration** pattern supports simple pipelines that can be reused by multiple applications. To understand the principles of inheritance and customization, have a look at the [configuration][resources-configuration] documentation.
Please also consult the blog post on setting up [Continuous Delivery for S/4HANA extensions][sap-blog-ci-cd] and get plenty of information about application development with the [S/4HANA Cloud SDK][sap-blog-s4-sdk-first-steps].
[guidedtour-my-own-jenkins]: myownjenkins.md
[guidedtour-sample.config]: samples/cloud-cf-helloworld-nodejs/pipeline/config.yml
[guidedtour-sample.jenkins]: samples/cloud-cf-helloworld-nodejs/Jenkinsfile
@@ -218,8 +215,6 @@ Please also consult the blog post on setting up [Continuous Delivery for S/4HANA
[sap]: https://www.sap.com
[sap-cp-trial]: https://account.hanatrial.ondemand.com
[sap-blog-s4-sdk-first-steps]: https://blogs.sap.com/2017/05/10/first-steps-with-sap-s4hana-cloud-sdk/
[sap-blog-ci-cd]: https://blogs.sap.com/2017/09/20/continuous-integration-and-delivery/
[devops-docker-images-cxs-guide]: https://github.com/SAP/devops-docker-cx-server/blob/master/docs/operations/cx-server-operations-guide.md


@@ -12,7 +12,6 @@ To get you started quickly, project "Piper" offers you the following artifacts:
* A set of ready-made Continuous Delivery pipelines for direct use in your project
* [ABAP Environment Pipeline](pipelines/abapEnvironment/introduction/)
* [General Purpose Pipeline](stages/introduction/)
* [SAP Cloud SDK Pipeline][cloud-sdk-pipeline]
* [A shared library][piper-library] that contains reusable step implementations, which enable you to customize our preconfigured pipelines, or to even build your own customized ones
* A standalone [command line utility](cli) for Linux and a [GitHub Action](https://github.com/SAP/project-piper-action)
* Note: This version is still in early development. Feel free to use it and [provide feedback](https://github.com/SAP/jenkins-library/issues), but don't expect all the features of the Jenkins library
@@ -23,11 +22,12 @@ In many cases, they should satisfy your requirements, and if this is the case, y
### The best-practice way: Ready-made pipelines
**Are you building a standalone SAP Cloud Platform application?<br>**
**Are you building a standalone SAP Cloud Platform application, an application with the SAP Cloud SDK, or using the SAP Cloud Application Programming Model?<br>**
Then continue reading about our [general purpose pipeline](stages/introduction/), which supports various technologies and programming languages.
**Are you building an application with the SAP Cloud SDK and/or SAP Cloud Application Programming Model?<br>**
Then we can offer you a [pipeline specifically tailored to SAP Cloud SDK and SAP Cloud Application Programming Model applications][cloud-sdk-pipeline].
Previously, project "Piper" also included the SAP Cloud SDK Pipeline, designed specifically for SAP Cloud SDK and SAP Cloud Application Programming Model (CAP) projects.
The SAP Cloud SDK pipeline and its features were merged into the General Purpose Pipeline as of November 2020.
The reasoning, as well as further information on how to adopt the General Purpose Pipeline, is described in our [guide](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/gpp-guide.md).
### The do-it-yourself way: Build with Library


@@ -115,8 +115,6 @@ If you use your own Jenkins installation, you need to care for the configuration
This option should only be considered if you know why you need it, otherwise using the Cx Server life-cycle management makes your life much easier.
If you choose to go this path, follow the [Custom Jenkins Setup guide][resources-custom-jenkins].
**Note:** This option is not supported for SAP Cloud SDK projects.
[devops-docker-images-cxs-guide]: https://github.com/SAP/devops-docker-cx-server/blob/master/docs/operations/cx-server-operations-guide.md
[docker-images]: https://hub.docker.com/u/ppiper
[resources-custom-jenkins]: customjenkins.md


@@ -1,280 +0,0 @@
# Build Tools
The SAP Cloud SDK supports multiple programming languages (Java and JavaScript) and can be used in the SAP Cloud Application Programming Model.
For each of these variants, project templates exist (as referenced in the project's main [Readme](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/README.md) file).
These templates introduce standard tooling, such as build tools, and a standard structure.
The SAP Cloud SDK Continuous Delivery Toolkit expects that the project follows this structure and depends on the build tools introduced by these templates.
The supported build tools are:
* [Maven](https://maven.apache.org/) for Java projects
* [npm](https://www.npmjs.com/) for JavaScript projects
* [MTA](https://sap.github.io/cloud-mta-build-tool) for Multi-Target Application Model projects
MTA itself makes use of other build tools, such as Maven and npm depending on what types of modules your application has.
*Note: The npm pipeline variant is in an early state. Some interfaces might change. We recommend consuming a fixed released version as described in the project [Readme](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/README.md#versioning).*
## Feature Matrix
Support for the different features of the pipeline may vary between the build tool variants.
The following table gives an overview of the features available per build tool.
| Feature | Maven | npm | MTA Maven | MTA npm |
|----------------------------|-------|-----|-----------|---------|
| Automatic Versioning | x | x | x | x |
| Build | x | x | x | x |
| Backend Integration Tests | x | x | x | x |
| Frontend Integration Tests | x | x | x | x |
| Backend Unit Tests | x | x | x | x |
| Frontend Unit Tests | x | x | x | x |
| NPM Dependency Audit | x | x | x | x |
| Linting | x | | x | x |
| Static Code Checks | x | | x | |
| End-To-End Tests | x | | x | x |
| Performance Tests | x | | x | |
| Resilience Checks | x | | x | |
| S4HANA Public APIs | x | | x | |
| Code Coverage Checks | x | x | x | x |
| Checkmarx Integration | x | | x | |
| Fortify Integration | x | | x | |
| SourceClear Integration | x | | | |
| Whitesource Integration | x | x | x | x |
| Deployment to Nexus | x | | x | x |
| Zero Downtime Deployment | x | x | x¹ | x¹ |
| Download Cache | x | x | x | x |
¹ MTA projects can only be deployed to the Cloud Foundry Environment
## Java/Node.js runtime versions
Runtime versions used in builds are determined by Docker images.
For Java, the default is still (as of August 2020) version 8.
For more details, please check the [documentation of the SAP Cloud SDK for Java](https://sap.github.io/cloud-sdk/docs/java/getting-started/).
In case you need to use a specific Java version to build your application, you may do so by setting another Docker image in your `.pipeline/config.yml` file.
See [documentation of the pipeline configuration](../configuration/) and look for the `dockerImage` key on where this option applies.
In most cases, it should be sufficient to configure an image for the `mavenExecute` step like so:
```yaml
steps:
mavenExecute:
dockerImage: 'maven:3.6.3-jdk-11'
```
## Projects Requirements
Each variant of the pipeline has different requirements regarding the project structure, location of reports and tooling.
Stages not listed here do not have a special requirement.
In any case, please also consult the [documentation of the pipeline configuration](../configuration/), as some stages have to be activated by providing configuration values.
### Build Tool Independent Requirements
In order to run in the pipeline, your project has to include the following two files in the root folder: `Jenkinsfile` and `.pipeline/config.yml`.
You can copy both files from this [GitHub repository](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources).
There are two variants of the configuration file.
Please pick the corresponding version for your deployment target and rename it properly.
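A minimal sketch of such a `Jenkinsfile`, assuming the shared library is registered under the name `s4sdk-pipeline-library` on your Jenkins:

```groovy
#!/usr/bin/env groovy
@Library('s4sdk-pipeline-library') _   // consider pinning a released version
cloudSdkPipeline(script: this)
```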
#### Frontend Unit Tests
For each `package.json` where the script `ci-frontend-unit-test` is defined, the command `npm run ci-frontend-unit-test` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/frontend-unit` in the root directory.
The required format of the test result report is the JUnit format as an `.xml` file.
The code coverage report can be published as an HTML report and in the Cobertura format.
The HTML report has to be stored in the directory `./s4hana_pipeline/reports/coverage-reports/frontend-unit/report-html/ut/` as an `index.html` file.
These coverage reports will then be published in Jenkins.
Furthermore, if configured in the `.pipeline/config.yml`, the pipeline ensures the configured level of code coverage.
#### Frontend Integration Tests
The command `npm run ci-it-frontend` will be executed in this stage and has to be defined in the `package.json` in the root.
In this stage, the frontend should be tested end-to-end without the backend.
To this end, a browser is started to simulate user interactions.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/frontend-integration` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
It is recommended to use the same tools as in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
#### Backend Unit Tests
##### Maven
Maven unit-tests are executed as part of the [`mavenBuild`](../../../steps/mavenBuild/) step.
They are supposed to be placed inside of `application/src/test`.
##### Java MTA modules
We run the command `mvn test` in each Java MTA module.
##### npm and Node.js MTA modules
For each `package.json` where the script `ci-backend-unit-test` is defined, the command `npm run ci-backend-unit-test` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/backend-unit/` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
For code coverage, the results have to be stored in the folder `./s4hana_pipeline/reports/coverage-reports/backend-unit/` in the Cobertura format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
We recommend the tools used in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
If you have multiple npm packages with unit tests, the report files must have unique names.
#### Backend Integration Tests
##### Maven and Java MTA modules
If there is a Maven module called `integration-tests`, we run `mvn test` in this module.
##### npm and Node.js MTA modules
For each `package.json` where the script `ci-it-backend` is defined, the command `npm run ci-it-backend` will be executed in this stage.
Furthermore, the test results have to be stored in the folder `./s4hana_pipeline/reports/backend-integration` in the root directory of the project.
The required format of the test result report is the JUnit format as an `.xml` file.
For code coverage, the results have to be stored in the folder `./s4hana_pipeline/reports/coverage-reports/backend-integration/` in the Cobertura format as an `.xml` file.
The user is responsible for using a proper reporter to generate the results.
We recommend the tools used in the `package.json` of this [example project](https://github.com/SAP/cloud-s4-sdk-examples/blob/scaffolding-js/package.json).
If you have multiple npm packages with integration tests, the report files must have unique names.
#### Lint
For each `package.json` where the script `ci-lint` is defined, the command `npm run ci-lint` will be executed as part of the `build` stage.
The required format of the linting results is the Checkstyle format as an `.xml` file.
The linting results have to be stored in a file named `*cilint.xml`, which may reside in any directory of the project.
The linting results will then be published in Jenkins.
If no script `ci-lint` is defined and JavaScript or TypeScript files are present in the project, the pipeline will automatically execute ESLint.
If no ESLint configuration files are present in the project directory, a general-purpose configuration is used to lint all JavaScript and/or TypeScript files of the project.
If, on the other hand, ESLint configuration files exist in the project, they will be used to lint the project's JavaScript files.
The execution happens according to ESLint's default behavior, i.e., each JS file is linted with the ESLint config found in its directory or one of the parent directories.
Note that in this case only those files for which an ESLint config exists will be linted.
More details on the execution behavior of ESLint and the usage of configuration files can be found in the [related documentation](https://eslint.org/docs/user-guide/configuring#configuration-cascading-and-hierarchy).
Note that if it is necessary to disable the default linting behavior, you can, for example, define a script `"ci-lint" : "exit 0"` in your `package.json`.
We recommend defining a custom `ci-lint` script in your `package.json` to address project-specific linting requirements.
#### End-to-End Tests
This stage is only executed if you configured it in the file `.pipeline/config.yml`.
The command `npm run ci-e2e` will be executed in this stage.
The URL which is defined as `appUrl` in the file `.pipeline/config.yml` will be passed as an argument named `launchUrl` to the tests.
This can be reproduced locally by executing:
```
npm run ci-e2e -- --launchUrl=https://path/to/your/running/application
```
The credentials also defined in the file `.pipeline/config.yml` will be available during the test execution as environment variables named `e2e_username` and `e2e_password`.
The test results have to be stored in the folder `./s4hana_pipeline/reports/e2e` in the root directory.
The required format of the test result report is the Cucumber format as a `.json` file, or the JUnit format as an `.xml` file.
Also, screenshots can be stored in this folder.
The screenshots and reports will then be published in Jenkins.
The user is responsible for using a proper reporter to generate the results.
#### Performance Tests
This stage is only executed if you configured it in the file `.pipeline/config.yml`.
Performance tests can be executed using [JMeter](https://jmeter.apache.org/) or [Gatling](https://gatling.io/).
If only JMeter is used as a performance testing tool, test plans can be placed in the default location, which is the directory `{project_root}/performance-tests`. However, if JMeter is used along with Gatling, JMeter test plans should be kept in a subdirectory of the `performance-tests` directory, for example `./performance-tests/JMeter/`.
The Gatling test project, including the `pom.xml`, should be placed in the directory `{project_root}/performance-tests`.
Afterwards, Gatling has to be enabled in the configuration.
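As an illustration of the layout described above (file names are placeholders), a project using both tools could look like this:

```
performance-tests/
├── JMeter/             # JMeter test plans go into a subdirectory when Gatling is also used
│   └── testplan.jmx
├── pom.xml             # Gatling test project
└── src/                # Gatling simulations
```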
#### Deployments
For all deployments to Cloud Foundry (excluding MTA) there has to be a file called `manifest.yml`.
This file may only contain exactly one application.
*Note: For JavaScript projects the path of the application should point to the folder `deployment`.*
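A minimal sketch of such a `manifest.yml` for a JavaScript project; the application name and memory setting are placeholders:

```yaml
applications:
  - name: my-app        # placeholder application name
    path: deployment    # for JavaScript projects, point to the 'deployment' folder
    memory: 256M
```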
### Java / Maven
For Maven, the pipeline expects the following structure.
The project should have two Maven modules named:
- `application`
- `integration-tests`
The module `application` should contain the application code and unit tests.
The module `integration-tests` should contain integration tests.
Furthermore, the test modules have to include the following dependency:
```xml
<dependency>
<groupId>com.sap.cloud.s4hana.quality</groupId>
<artifactId>listeners-all</artifactId>
<scope>test</scope>
</dependency>
```
### JavaScript / npm
The project has to use npm and include a `package.json` in the root directory.
In the pipeline stages, specific scripts in the `package.json` are called to build the project or run tests.
Furthermore, the pipeline expects reports, such as test results, to be written into certain folders.
These stage specific requirements are documented below.
#### Build
By default `npm ci` will be executed.
After `npm ci` the command `npm run ci-build` will be executed.
This script can be used to, for example, compile Typescript resources or webpack the frontend.
In the build stage, development dependencies are also installed, and tests should be compiled as well.
Afterwards the command `npm run ci-package` will be executed.
This step should prepare the deployment by copying all deployment relevant files into the folder `deployment` located in the root of the project.
This folder should not contain any non-production-related resources, such as tests or development dependencies.
This directory has to be defined as path in the `manifest.yml`.
*Note: This step runs isolated from the preceding steps. Thus, modifying `node_modules` with `npm prune --production`, for example, will not have an effect on later stages, such as the test execution.*
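Putting the npm script contract together, a hedged `package.json` sketch could look like the following. The script names are the ones the pipeline expects; the commands behind them are purely illustrative and depend on your tooling:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "scripts": {
    "ci-build": "tsc",
    "ci-package": "cp -r dist/ deployment/",
    "ci-backend-unit-test": "jest --ci",
    "ci-it-backend": "jest --ci --config jest.integration.config.js",
    "ci-frontend-unit-test": "karma start --single-run",
    "ci-lint": "eslint . -f checkstyle -o eslint.cilint.xml",
    "ci-e2e": "nightwatch"
  }
}
```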
### SAP Cloud Application Programming Model / MTA
The project structure follows the standard structure for projects created via the _SAP Cloud Platform Business Application_ SAP Web IDE Template with some constraints.
Please leave the basic structure of the generated project intact.
Make sure to check the _Include support for continuous delivery pipeline of SAP Cloud SDK_ checkbox, which will automatically add the required files for continuous delivery in your project.
If you already created your project without this option, you'll need to copy and paste two files into the root directory of your project, and commit them to your git repository:
* [`Jenkinsfile`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile)
* [`.pipeline/config.yml`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/cf-pipeline_config.yml)
* Note: The file must be named `.pipeline/config.yml`, despite the different name of the file template
Further constraints on the project structure (all of this is already satisfied in projects generated from the _SAP Cloud Platform Business Application_ SAP Web IDE Template):
On the project root level, a `pom.xml` file is required.
Java services are Maven projects which include the application- and the unit-test code.
A service is typically called `srv`, but the name can be chosen freely.
An `integration-test` module must exist on the root level.
This module is where integration between the services can be tested.
In summary, the project structure should look like this:
```
.
├── Jenkinsfile
├── .pipeline
│ └── config.yml
├── app // web application, not required
├── db // only if database module exists
├── integration-tests
│ ├── pom.xml
│ └── src
│ └── test
├── mta.yaml
├── package.json
├── pom.xml
└── srv
├── pom.xml
└── src
├── main
└── test // Unit-Tests for this service
```


@@ -1,110 +0,0 @@
# Checked Qualities in the SAP Cloud SDK Pipeline
The goal of the SAP Cloud SDK Pipeline is to help you build high quality applications which run on SAP Cloud Platform.
To achieve this, the SAP Cloud SDK Pipeline checks qualities when building your application.
This document summarizes the qualities that are checked by the SAP Cloud SDK Pipeline.
## SAP Cloud SDK Specific Checks
### Required Dependencies
For the SAP Cloud SDK specific checks to work, a few dependencies are required in unit and integration tests.
The Cloud SDK pipeline will check if the `odata-querylistener`, `rfc-querylistener`, and `httpclient-listener` dependencies are present in the unit- and integration-test Maven modules. If one of those dependencies is missing, the pipeline will add the `listeners-all` dependency to the pom on the fly before executing the respective tests. This means that, as a user of the SDK, you do not have to add those dependencies manually, but doing so can speed up the pipeline, since the `pom.xml` won't be changed if the dependencies are already available.
### Only Depend on Official API
This quality checks for usage of unofficial RFC and OData services.
Only official APIs from the [SAP API Business Hub](https://api.sap.com/) should be used, since unofficial APIs don't provide stable interfaces.
A list of official APIs can be found in [this blog post](https://blogs.sap.com/2017/09/22/quality-checks/).
### Resilient Network Calls
When building extension applications on SAP Cloud Platform, you always deal with a distributed system.
There are at least two applications in this scenario: your extension application and SAP S/4HANA.
In distributed systems, you may not assume that the network is reliable.
To mitigate unreliable networks, a pattern called _circuit breaker_ is commonly used.
The idea is that you define a fallback action in case the network fails too often in a short time span.
The fallback might use cached data, or default values, depending on what works best in your problem domain.
To implement this pattern, the SAP Cloud SDK integrates with the [Hystrix](https://github.com/Netflix/Hystrix) library.
The version 3 of the SAP Cloud SDK integrates with the [resilience4j](https://github.com/resilience4j/resilience4j) library.
This quality check tests that your remote calls are wrapped in a Hystrix command (v2) or in a `ResilienceDecorator` (v3).
The build will fail with an error message like `Your project accesses downstream systems in a non-resilient manner` if this is not the case.
More information on building resilient applications is available in [this blog post](https://blogs.sap.com/2017/06/23/step-5-resilience-with-hystrix/).
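For version 3 of the SDK, a hedged Java sketch of such a wrapped call might look like this; the class name, helper method, and fallback value are illustrative:

```java
import com.sap.cloud.sdk.cloudplatform.resilience.ResilienceConfiguration;
import com.sap.cloud.sdk.cloudplatform.resilience.ResilienceDecorator;

public class BusinessPartnerReader {

    public String readBusinessPartners() {
        // one configuration per logical remote endpoint
        final ResilienceConfiguration configuration =
                ResilienceConfiguration.of(BusinessPartnerReader.class);

        // wrap the remote call so it runs with circuit breaker, timeout, etc.
        return ResilienceDecorator.executeSupplier(
                this::callS4HanaOverNetwork,
                configuration,
                throwable -> "[]");   // fallback if the call fails or the circuit is open
    }

    private String callS4HanaOverNetwork() {
        // hypothetical remote call, e.g., via the SDK's OData virtual data model
        return "[]";
    }
}
```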
## Functional Tests
Ensuring the functional correctness of an application requires automated tests, which are part of the application code.
Those qualities depend on the test code written by the application developer.
### Unit Tests
The purpose of unit tests is to verify the correctness of a single _unit_ in isolation.
Other components than the _unit under test_ may be mocked for testing purposes.
Place your unit tests in the appropriate Maven module (`unit-tests`) in order to make the pipeline run them automatically.
### Integration Tests
Integration tests work on a higher level compared to unit tests.
They should ensure that independently tested units work together as they need to.
In the context of extension applications on SAP Cloud Platform, this means to ensure _interoperability of your application with S/4HANA_ and _interoperability between your application's backend and frontend component_.
Place your integration tests in the appropriate Maven module (`integration-tests`) in order to make the pipeline run them automatically.
For more detailed description, refer to [this blog post](https://blogs.sap.com/2017/09/19/step-12-with-sap-s4hana-cloud-sdk-automated-testing/).
### End-to-End Tests
End-to-end tests use your application like a human user would: clicking buttons, entering text into forms, and waiting for the result.
Place your end-to-end tests in the `e2e-tests` directory and ensure the `ci-e2e` script in `package.json` runs the right command.
The output folder for the reports needs to be `s4hana_pipeline/reports/e2e`.
### Code Coverage
Code coverage refers to how much of your application code is tested.
The build fails if the test coverage of your code drops below a certain threshold.
To fix such a build failure, check which parts of your code are not tested yet and write missing tests.
The code coverage is tested using [JaCoCo Java Code Coverage Library](https://www.eclemma.org/jacoco/).
## Non-Functional Tests
### Performance
Performance relates to how quickly your application reacts under heavy load.
For implementing performance tests, you can choose between two open-source tools: [JMeter](https://jmeter.apache.org/) and [Gatling](https://gatling.io/).
If you're not familiar with both of them, we recommend using Gatling.
More information on testing the performance of your application is available in [this blog post](https://blogs.sap.com/2018/01/11/step-23-with-sap-s4hana-cloud-sdk-performance-tests/).
### Static Code Checks
Static code checks look for potential issues in code without running the program.
The SAP Cloud SDK Pipeline includes commonly used static checks using both [PMD](https://pmd.github.io/) and [SpotBugs](https://spotbugs.github.io/).
In addition to the default checks of those tools, it adds the following SAP Cloud SDK specific checks:
* To make post-mortem debugging possible:
    * Log the exception in the catch block or in a called handling method, or reference it in a new thrown exception
    * Reference the exception when logging inside a catch block
* In order to allow a smooth transition from Neo to Cloud Foundry, you should use the platform-independent abstractions provided by the SAP S/4HANA Cloud SDK
### Lint
The pipeline automatically executes linting of JavaScript and Typescript files, either by running a user defined script, or by executing ESLint with a general purpose configuration.
[Custom linters](../../../extensibility/#practical-example) can be implemented by development teams, if desired.
This allows a team of developers to enforce a common coding style, making it easier to focus on the application code rather than discussing minor style issues.
### Third-Party Tools
The SAP Cloud SDK Pipeline also integrates with commercial third-party code analysis services, if you wish to use them.
Currently, [Checkmarx](https://www.checkmarx.com/), [WhiteSource](https://www.whitesourcesoftware.com/), and [SourceClear](https://www.sourceclear.com/) are available.
For those scans to be enabled, they need to be configured in the [pipeline configuration file](../../configuration.md).


@@ -1,466 +0,0 @@
# SAP Cloud SDK Pipeline Configuration
## General configuration
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `productiveBranch` | | `master` | The name of your default branch. This branch will be used for deploying your application. Other branches will skip deployment. |
| `projectName` | | `artifactId` from pom | Name of the project |
| `collectTelemetryData` | | `true` | No personal data is collected. For details, consult the [analytics documentation](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/operations/analytics.md). |
| `unsafeMode` | | `false` | Enable unsafe mode to skip checking environment variables for insecure elements. Only use this for demo purposes, **never for productive usage**. |
| `customDefaultsCredentialsId` | | | Credentials (username / password) used to download [custom defaults](#customDefaults). |
### features
This section allows you to enable or disable certain optional features.
This concept is known as *Feature Toggles*.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `parallelTestExecution` | | `off` | Run E2E Tests in parallel. This feature is disabled by default because it is [not supported](https://issues.jenkins-ci.org/browse/JENKINS-38442) in Blue Ocean. If this feature is enabled, we suggest not using the Blue Ocean interface and relying on the classic UI instead. |
Example:
```yaml
general:
productiveBranch: 'master'
projectName: 'example_project'
features:
parallelTestExecution: on
```
### jenkinsKubernetes
If Jenkins is running as a pod on a Kubernetes cluster, the pipeline can use its dynamic scaling feature. To enable this, the environment variable `ON_K8S` has to be set to `true` on the Jenkins instance.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `jnlpAgent` | | `jenkins/jnlp-slave:latest` | Docker image for `jnlp` agent to be used |
In the Jenkins configuration section under `Manage Jenkins` menu, set the value for your environment variable under `Global properties` section.
![Environment variable configuration](../../images/k8s-environment-config.jpg)
Jenkins spins up `jnlp` agent nodes on demand. By default, the `jenkins/jnlp-slave` Docker image is used. You can also use a custom `jnlp` agent by configuring it in the `.pipeline/config.yml` file as shown below.
```yaml
general:
jenkinsKubernetes:
jnlpAgent: jenkins/jnlp-slave:latest
```
## Stage configuration
### staticCodeChecks
The `staticCodeChecks` stage has been integrated into the `build` stage.
To configure static code checks, please configure the step `mavenExecuteStaticCodeChecks` as described [here](../../../steps/mavenExecuteStaticCodeChecks/).
### backendIntegrationTests
The `backendIntegrationTests` stage has been integrated into the project "Piper" stage `Integration`.
Thus, it is required to update the stage name in the stages section of your configuration to `integration`.
The configuration parameters available for the stage remain the same.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `retry` | | `1` | The number of times that integration tests will retry before aborting the build. **Note:** This will consume more time for the Jenkins build. |
| `forkCount` | | `1C` | The number of JVM processes that are spawned to run the tests in parallel in case of using a maven based project structure. For more details visit the [surefire documentation](https://maven.apache.org/surefire/maven-surefire-plugin/test-mojo.html#forkCount). |
| `credentials` | | | The list of system credentials to be injected during integration tests. The following example will provide the username and password for the systems with the aliases ERP and SFSF. For this, it will use the Jenkins credentials entries erp-credentials and successfactors-credentials. You have to ensure that corresponding credential entries exist in your Jenkins configuration |
Example:
```yaml
integration:
retry: 2
credentials:
- alias: 'ERP'
credentialId: 'erp-credentials'
- alias: 'SF'
credentialId: 'successfactors-credentials'
```
The integration tests stage also offers the option to run a sidecar container, e.g. for running a database or another downstream system.
To use this optional feature, the following configuration values have to be provided:
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
|`sidecarImage` | | | Name of the Docker image that should be used. |
|`sidecarName` | | | On Docker: Name of the container in local network. On Kubernetes: Name of the container. |
|`sidecarReadyCommand` | | | Command executed inside the container which returns exit code 0 when the container is ready to be used. |
|`sidecarEnvVars` | | | Environment variables to set in the container. |
*Note: To access the container from your tests, use the `sidecarName` as hostname on Docker or `localhost:portOfProcess` on Kubernetes.*
Example:
```yaml
integration:
retry: 2
credentials:
- alias: 'ERP'
credentialId: 'erp-credentials'
- alias: 'SF'
credentialId: 'successfactors-credentials'
sidecarName: 'postgres'
sidecarImage: 'postgres'
sidecarReadyCommand: 'pg_isready'
sidecarEnvVars:
PORT: 8234
```
### frontendIntegrationTests
The `frontendIntegrationTests` stage has been integrated into the project "Piper" stage `Integration`.
Thus, it is required to update the stage name in the stages section of your configuration to `integration`.
The configuration parameters available for the stage remain the same.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `dockerImage` | | | The Docker image to be used for running frontend integration tests. **Note:** This will only change the Docker image used for integration testing in the frontend. For switching all npm-based steps to a different npm or Chromium version, you should configure the dockerImage via the executeNpm step. |
### frontendUnitTests
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `dockerImage` | | | The Docker image to be used for running frontend unit tests. **Note:** This will only change the Docker image used for unit testing in the frontend. For switching all npm-based steps to a different npm or Chromium version, you should configure the dockerImage via the executeNpm step. |
### endToEndTests
For the stage `endToEndTests` the same configuration options are available as for the stage `productionDeployment`.
In addition to these options, the following ones are also available for end-to-end tests:
It is possible to activate zero downtime deployment in end-to-end tests with the option `enableZeroDowntimeDeployment`.
This will lead to a blue-green deployment on SCP Cloud Foundry, or a rolling update on SCP Neo, respectively.
By default this feature is turned off.
Running end-to-end tests can be restricted to the `productiveBranch` with the option `onlyRunInProductiveBranch`.
This might be useful when the end-to-end tests slow down development, and build failure on the `productiveBranch` is acceptable.
By default this feature is turned off.
Additional parameters can be passed for each end-to-end test deployment by specifying _optional_ `parameters` for an application URL.
These parameters are appended to the npm command during execution.
This could be used, for example, to split the entire end-to-end test scenario into multiple sub-scenarios and run these sub-scenarios on different deployments.
For example, when using nightwatch-api, these scenarios can be defined via annotations in the test descriptions and selected with the `--tag` parameter as shown in the example below. Another option is to execute the end-to-end tests with various web browsers, e.g. Chrome or Firefox.
Example:
```yaml
endToEndTests:
enableZeroDowntimeDeployment: true
onlyRunInProductiveBranch: true
appUrls:
- url: <application url>
credentialId: e2e-test-user-cf
parameters: '--tag scenario1 --NIGHTWATCH_ENV=chrome'
- url: <application url 2>
credentialId: e2e-test-user-cf
parameters: '--tag scenario2 --tag scenario3 --NIGHTWATCH_ENV=firefox'
```
### npmAudit
This stage has been removed in v43 of the Cloud SDK pipeline.
### performanceTests
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `cfTargets` | | | The list of Cloud Foundry deployment targets required for the performance test stage. |
| `neoTargets` | | | The list of Neo deployment targets required for the performance test stage. |
For details on the properties `cfTargets` and `neoTargets` see the stage `productionDeployment`.
### s4SdkQualityChecks
This stage has been removed in v43 of the Cloud SDK pipeline.
### checkmarxScan
The `checkmarxScan` stage has been merged into the project "Piper" stage `security`.
To configure Checkmarx please configure the step `checkmarxExecuteScan` as described [in the step documentation](../../../steps/checkmarxExecuteScan/).
### productionDeployment
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
|`cfCreateServices`| | | The list of services which should be created before deploying the application as defined below. |
| `cfTargets` | | | The list of productive Cloud Foundry deployment targets to be deployed when a build of your productive branch succeeds. |
| `neoTargets`| | | The list of productive Neo deployment targets to be deployed when a build of your productive branch succeeds. |
| `appUrls` | | | The URLs under which the app is available after deployment. Each appUrl can be a string with the URL or a map containing a property url and a property credentialId. An example is shown in the configuration for the stage endToEndTests. |
### cfCreateServices
The option `cfCreateServices` is especially useful if you don't use MTA and need a way to declaratively define which services should be created in Cloud Foundry.
The following properties can be defined for each element in the list.
For a detailed documentation of the individual properties, please consult the [step documentation](../../../steps/cloudFoundryCreateService/).
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `org` | X** | | The organization where you want to deploy your app. |
| `space` | X** | | The space where you want to deploy your app. |
| `serviceManifest`| X** | | Manifest file that defines the services to be created. |
| `manifestVariablesFiles`| X** | | Files containing variables to be replaced in the service manifest file. |
| `credentialsId` | X**| | ID to the credentials that will be used to connect to the Cloud Foundry account. |
| `apiEndpoint` | X** | | URL to the Cloud Foundry endpoint. |
** The parameters can either be specified here or for the step `cloudFoundryDeploy` or globally in the general section under the key `cloudFoundry`.
### cfTargets and neoTargets
You can either specify the property `cfTargets` or `neoTargets`.
For `cfTargets` the following properties can be defined:
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `org` | X** | | The organization where you want to deploy your app. |
| `space` | X** | | The space where you want to deploy your app. |
| `appName` | X** (not for MTA) | | Name of the application. |
| `manifest` | X** (not for MTA) | | Manifest file that needs to be used. |
| `credentialsId` | X**| | ID to the credentials that will be used to connect to the Cloud Foundry account. |
| `apiEndpoint` | X** | | URL to the Cloud Foundry endpoint. |
| `mtaExtensionDescriptor` | | | (**Only for MTA-projects**) Path to the mta extension description file. For more information on how to use those extension files please visit the [SAP HANA Developer Guide](https://help.sap.com/viewer/4505d0bdaf4948449b7f7379d24d0f0d/2.0.02/en-US/51ac525c78244282919029d8f5e2e35d.html). |
| `mtaExtensionCredentials` | | | (**Only for MTA-projects**) Map of credentials that need to be replaced in the `mtaExtensionDescriptor`. This map needs to be created as `value-to-be-replaced`:`id-of-a-credential-in-jenkins` |
** The parameters can either be specified here or for the step `cloudFoundryDeploy` or globally in the general section under the key `cloudFoundry`.
### Examples
```yaml
general:
cloudFoundry:
org: 'myorg'
space: 'Prod'
apiEndpoint: 'https://api.cf.sap.hana.ondemand.com'
credentialsId: 'CF-DEPLOY-DEFAULT'
manifestVariablesFiles: ['manifest-variables.yml']
stages:
productionDeployment:
appUrls:
- url: <application url>
credentialId: e2e-test-user-cf
cfCreateServices:
- serviceManifest: 'services-manifest.yml'
- serviceManifest: 'services-manifest.yml'
space: 'Prod2'
org: 'myorg2'
cfTargets:
- appName: 'exampleapp'
manifest: 'manifest.yml'
- space: 'Prod2'
org: 'myorg2'
appName: 'exampleapp'
manifest: 'manifest.yml'
credentialsId: 'CF-DEPLOY-PROD1'
```
MTA projects can make use of extension files, and you can use the Jenkins credential store to inject credentials at runtime instead of storing them as plain text in the extension file.
In order to use this feature, use a [JSP-style or GString-style](http://docs.groovy-lang.org/latest/html/api/groovy/text/GStringTemplateEngine.html) placeholder in the extension file and provide the respective credential id in the `.pipeline/config.yml` as shown below.
Please note that currently only the Jenkins [Secret text](https://jenkins.io/doc/book/using/using-credentials/) credential type is supported for runtime credential substitution.
```yaml
#.pipeline/config.yml
productionDeployment:
appUrls:
- url: <application url>
credentialId: e2e-test-user-cf
cfTargets:
- space: 'Prod'
org: 'myorg'
appName: 'exampleapp'
manifest: 'manifest.yml'
credentialsId: 'CF-DEPLOY'
apiEndpoint: '<Cloud Foundry API endpoint>'
mtaExtensionDescriptor: 'path to mta extension description file'
mtaExtensionCredentials:
brokerCredential: secretText-id-in-jenkins
```
```yaml
#extension_file.mtaext
_schema-version: "3.1"
version: 0.0.1
extends: myApplication
ID: my-application
parameters:
broker-credentials: <%= brokerCredential %>
```
For `neoTargets` the following properties can be defined:
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `host` | X | | Host of the region you want to deploy to, see [Regions](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/350356d1dc314d3199dca15bd2ab9b0e.html#loio350356d1dc314d3199dca15bd2ab9b0e)|
| `account` | X | | Identifier of the subaccount|
| `application` | X | | Name of the application in your account |
| `credentialsId` | | `CI_CREDENTIALS_ID` | ID of the credentials stored in Jenkins and used to deploy to SAP Cloud Platform |
| `environment` | | | Map of environment variables in the form of KEY: VALUE|
| `vmArguments` | | | String of VM arguments passed to the JVM|
| `size`| | `lite` | Size of the JVM, e.g. `lite`, `pro`, `prem`, `prem-plus` |
| `runtime` | X | | Name of the runtime: neo-java-web, neo-javaee6-wp, neo-javaee7-wp. See the [runtime](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/937db4fa204c456f9b7820f83bc87118.html) documentation for more information.|
| `runtimeVersion` | X | | Version of the runtime. See [runtime-version](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/937db4fa204c456f9b7820f83bc87118.html) for more information.|
Example:
```yaml
productionDeployment:
neoTargets:
- host: 'eu1.hana.ondemand.com'
account: 'myAccount'
application: 'exampleapp'
credentialsId: 'NEO-DEPLOY-PROD'
environment:
STAGE: Production
vmArguments: '-Dargument1=value1 -Dargument2=value2'
runtime: 'neo-javaee6-wp'
runtimeVersion: '2'
```
### artifactDeployment
#### nexus
The deployment of artifacts to Nexus can be configured with a map containing the following properties:
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `version` | | `nexus3` | Version of Nexus. Can be `nexus2` or `nexus3`. |
| `url` | X | | URL of the Nexus instance. The scheme part of the URL will not be considered, because only `http` is supported. |
| `mavenRepository` | | | Name of the nexus repository for Maven and MTA artifacts. Ignored if the project does not contain `pom.xml` or `mta.yml` in the project root. |
| `npmRepository` | | | Name of the nexus repository for NPM artifacts. Ignored if the project does not contain a `package.json` in the project root directory. |
| `groupId` | | | Common group ID for MTA build artifacts, ignored for Maven projects. |
| `credentialsId` | | | ID to the credentials which is used to connect to Nexus. Anonymous deployments do not require a `credentialsId`.|
##### Choosing what to deploy into the npm repository
The Pipeline performs an [npm publish](https://docs.npmjs.com/cli/publish) command to deploy npm modules.
This deployment might include files that you don't want to deploy.
See [here](https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) for npm documentation.
**WARNING:** The `.gitignore` file is not available in the pipeline during the artifact deployment.
To exclude files, create a `.npmignore` file, copy the contents of your `.gitignore` file, and add specific ignores, for example for `*.java` files.
Example:
```yaml
artifactDeployment:
nexus:
version: nexus2
url: nexus.mycorp:8080/nexus
mavenRepository: snapshots
npmRepository: npm-repo
credentialsId: 'NEXUS-DEPLOY'
```
### whitesourceScan
The `whitesourceScan` stage has been merged into the project "Piper" stage `security`.
To configure WhiteSource please configure the step `whitesourceExecuteScan` as described [in the step documentation](../../../steps/whitesourceExecuteScan/).
### fortifyScan
The `fortifyScan` stage has been merged into the project "Piper" stage `security`.
To configure Fortify please configure the step `fortifyExecuteScan` as described [in the step documentation](../../../steps/fortifyExecuteScan/).
### lint
The `lint` stage has been integrated into the `build` stage.
The options for the use of linting tools remain the same and are described in the [build tools section](../build-tools/#lint).
Note, the available configuration options can be found in the related [step documentation](../../../steps/npmExecuteLint/#parameters).
### compliance
The stage `compliance` executes [SonarQube](https://www.sonarqube.org/) scans, if the step [`sonarExecuteScan`](../../../steps/sonarExecuteScan/) is configured.
This is an optional feature for teams who prefer to use SonarQube.
Note that it does some scans that are already done by the pipeline by default.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `runInAllBranches` | | false | Define whether the scan should also happen in non-productive branches, provided your SonarQube instance supports that. |
**Note:** The stage is skipped by default if you're not on a productive branch (`master` by default).
You can change this by setting `runInAllBranches` to `true`, which requires the commercial version of SonarQube.
Example:
```yaml
compliance:
runInAllBranches: true
```
### postPipelineHook
This stage does nothing.
Its purpose is to be overridden if required.
See the documentation for [pipeline extensibility](../../../extensibility/) for details on how to extend a stage.
The name of an extension file must be `postPipelineHook.groovy`.
Also, the stage (and thus an extension) is only executed if a stage configuration exists, like in this example:
```yaml
postPipelineHook:
enabled: true
```
## Step configuration
This section describes the steps that are available only in the SAP Cloud SDK pipeline.
For common project "Piper" steps, please see the _Library steps_ section of the documentation.
### checkGatling
[Gatling](https://gatling.io/) is one of the supported performance testing tools.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `enabled` | | `false` | You can enable Gatling tests by setting the flag to `true`. |
Example:
```yaml
checkGatling:
enabled: true
```
### checkJMeter
[Apache JMeter](http://jmeter.apache.org/) is executed as part of the application's performance tests. You are free to choose between JMeter and Gatling, or both.
| Property | Mandatory | Default Value | Description |
| --- | --- | --- | --- |
| `options` | | | Options such as proxy. |
| `testPlan` | | `./performance-tests/*` | The directory where the test plans reside. They should reside in a subdirectory of the `performance-tests` directory if both JMeter and Gatling are enabled.|
| `dockerImage` | | `famiko/jmeter-base` | JMeter docker image. |
| `failThreshold` | | `100` | Marks build as `FAILURE` if the value exceeds the threshold. |
| `unstableThreshold` | | `90` | Marks build as `UNSTABLE` if the value exceeds the threshold. |
Example:
```yaml
checkJMeter:
options: '-H my.proxy.server -P 8000'
testPlan: './performance-tests/JMeter/*' # mandatory if both JMeter and Gatling are enabled
dockerImage: 'famiko/jmeter-base'
failThreshold : 80
unstableThreshold: 70
```
### executeNpm
This step has been removed in v43 of the Cloud SDK pipeline.
Please use the step [npmExecuteScripts](../../../steps/npmExecuteScripts/) instead.
### debugReportArchive
The documentation for the `debugReportArchive` step has been moved [here](../../../steps/debugReportArchive/).
## Post action configuration
### sendNotification
The `sendNotification` post-build action has been removed in version v43 of the Cloud SDK pipeline.


@@ -1,43 +0,0 @@
# SAP Cloud SDK Pipeline
<img src="https://help.sap.com/doc/6c02295dfa8f47cf9c08a19f2e172901/1.0/en-US/logo-for-cd.svg" alt="SAP Cloud SDK for Continuous Delivery Logo" height="122.92" width="226.773" align="right"/>
If you are building an application with [SAP Cloud SDK](https://community.sap.com/topics/cloud-sdk), the [SAP Cloud SDK pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline) helps you to quickly build and deliver your app in high quality.
Thanks to highly streamlined components, setting up and delivering your first project will just take minutes.
## Qualities and Pipeline Features
The SAP Cloud SDK pipeline is based on project "Piper" and offers unique features for assuring that your SAP Cloud SDK-based application fulfills the highest quality standards.
In conjunction with the SAP Cloud SDK libraries, the pipeline helps you to implement and automatically assure application qualities, for example:
* Functional correctness via:
* Backend and frontend unit tests
* Backend and frontend integration tests
* User acceptance testing via headless browser end-to-end tests
* Non-functional qualities via:
* Dynamic resilience checks
* Performance tests based on *Gatling* or *JMeter*
* Code Security scans based on *Checkmarx* and *Fortify*
* Dependency vulnerability scans based on *Whitesource*
* IP compliance scan based on *Whitesource*
* Zero-downtime deployment
* Proper logging of application errors
For more details, see [Cloud Qualities](../cloud-qualities).
![Screenshot of SAP Cloud SDK Pipeline](../../images/cloud-sdk-pipeline.png)
## Supported Project Types
The pipeline supports the following types of projects:
* Java projects based on the [SAP Cloud SDK Archetypes](https://mvnrepository.com/artifact/com.sap.cloud.sdk.archetypes).
* JavaScript projects based on the [SAP Cloud SDK JavaScript Scaffolding](https://github.com/SAP/cloud-s4-sdk-examples/tree/scaffolding-js).
* TypeScript projects based on the [SAP Cloud SDK TypeScript Scaffolding](https://github.com/SAP/cloud-s4-sdk-examples/tree/scaffolding-ts).
* SAP Cloud Application Programming Model (CAP) projects based on the _SAP Cloud Platform Business Application_ WebIDE Template.
You can find more details about the supported project types and build tools in [Build Tools](../build-tools).
## Legal Notes
Note: The license of this repository does not apply to the SAP Cloud SDK for Continuous Delivery logo referenced on this page.


@@ -27,20 +27,6 @@ nav:
- 'Confirm Stage': stages/confirm.md
- 'Promote Stage': stages/promote.md
- 'Release Stage': stages/release.md
- 'SAP Cloud SDK pipeline':
- 'Introduction': pipelines/cloud-sdk/introduction.md
- 'Configuration': pipelines/cloud-sdk/configuration.md
- 'Stages':
- 'Build and Test': stages/build.md
- 'Additional Unit Tests': stages/additionalunittests.md
- 'Integration Tests': stages/integration.md
- 'End to End Tests': stages/acceptance.md
- 'Security': stages/security.md
- 'Compliance': stages/compliance.md
- 'Performance': stages/performance.md
- 'Production Deployment': stages/release.md
- 'Build Tools': pipelines/cloud-sdk/build-tools.md
- 'Cloud Qualities': pipelines/cloud-sdk/cloud-qualities.md
- 'Scenarios':
- 'Build and Deploy Hybrid Applications with SAP Solution Manager': scenarios/changeManagement.md
- 'Build and Deploy SAPUI5/SAP Fiori Applications on SAP Cloud Platform': scenarios/ui5-sap-cp/Readme.md