
chore: fix markdownlint findings (#2385)

* activate MD022

* fix MD022 findings

* activate MD030

* fix MD030 findings

* activate MD038

* fix MD038 findings

* activate MD031

* fix MD031 findings

* activate MD042

* fix MD042 findings

* activate MD032

* fix MD032 findings

* activate MD039

* activate MD007

* fix MD007 findings

* activate MD026

* fix MD026 findings

* activate MD001

* fix MD001 findings

* acknowledge disabled rules

* fix code climate MD032 finding
Authored by Christopher Fenner on 2020-11-17 09:20:47 +01:00, committed by GitHub
parent 052a65d495
commit 1514be9857
20 changed files with 84 additions and 101 deletions

View File

@@ -35,55 +35,25 @@ plugins:
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md013---line-length
MD013:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/main/doc/Rules.md#md033---inline-html
MD033:
enabled: false
# TODO: fix in separate PR
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md001
MD001:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md004
MD004:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md007
MD007:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md022
MD022:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md024
MD024:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md026
MD026:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md030
MD030:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md031
MD031:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md032
MD032:
# https://github.com/DavidAnson/markdownlint/blob/main/doc/Rules.md#md033---inline-html
MD033:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md036
MD036:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md038
MD038:
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md041
MD041:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md039
MD039:
# TODO: fix in separate PR
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md004
MD004:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md040
MD040:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md042
MD042:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md041
MD041:
enabled: false
# https://github.com/DavidAnson/markdownlint/blob/master/doc/Rules.md#md046
MD046:
enabled: false

View File

@@ -2,22 +2,12 @@ default: true
# ignore
MD013: false #line-length
MD024: false #no-duplicate-heading/no-duplicate-header
MD033: false #no-inline-html
MD036: false #no-emphasis-as-heading/no-emphasis-as-header
MD041: false #first-line-heading/first-line-h1
# TODO: fix in separate PR
MD001: false #heading-increment/header-increment
MD004: false #ul-style
MD007: false #ul-indent
MD022: false #blanks-around-headings/blanks-around-headers
MD024: false #no-duplicate-heading/no-duplicate-header
MD026: false #no-trailing-punctuation
MD030: false #list-marker-space
MD031: false #blanks-around-fences
MD032: false #blanks-around-lists
MD036: false #no-emphasis-as-heading/no-emphasis-as-header
MD038: false #no-space-in-code
MD039: false #no-space-in-links
MD040: false #fenced-code-language
MD042: false #no-empty-links
MD041: false #first-line-heading/first-line-h1
MD046: false #code-block-style

View File

@@ -222,6 +222,7 @@ With writing a fatal error
```golang
log.Entry().WithError(err).Fatal("the error message")
```
the category will be written to the file `errorDetails.json` and can be used from there later in the pipeline flow.
Writing the file is handled by [`pkg/log/FatalHook`](pkg/log/fatalHook.go).
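For illustration (this is not the actual `pkg/log/FatalHook` implementation), a minimal sketch of how a logrus hook can persist the details of a fatal entry to such a file; the file name and field layout shown here are assumptions:

```golang
package main

import (
	"encoding/json"
	"errors"
	"os"

	"github.com/sirupsen/logrus"
)

// fileFatalHook is a simplified stand-in for a fatal-error hook: it fires on
// fatal entries only and dumps the entry's message and fields to a JSON file.
type fileFatalHook struct {
	path string
}

// Levels restricts the hook to fatal log entries.
func (h *fileFatalHook) Levels() []logrus.Level {
	return []logrus.Level{logrus.FatalLevel}
}

// Fire serializes the entry's message and fields and writes them to the file.
func (h *fileFatalHook) Fire(entry *logrus.Entry) error {
	details := map[string]interface{}{"message": entry.Message}
	for k, v := range entry.Data {
		details[k] = v
	}
	payload, err := json.Marshal(details)
	if err != nil {
		return err
	}
	return os.WriteFile(h.path, payload, 0o644)
}

func main() {
	logrus.AddHook(&fileFatalHook{path: "errorDetails.json"})
	// the error category would typically be attached as a log field
	logrus.WithError(errors.New("deployment failed")).
		WithField("category", "deployment").
		Fatal("the error message")
}
```

Since logrus fires hooks before the process exits on `Fatal`, the file is written even though the run aborts.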

View File

@@ -24,11 +24,11 @@ All this makes it important to make compatible deployments.
For this option, we only consider the goal `deploy:deploy-file`.
##### :+1:
:+1:
- Official maven plugin for deployment, which is perfect if you only care whether the artifacts are deployed correctly.
##### :-1:
:-1:
- Knowledge about which artifacts to deploy has to be obtained manually.
- A list of parameters has to be generated before using the plugin, including `artifactId` and `version`, which is also the case for `Uploading artifacts manually`. For maven projects, the parameters can be obtained using the `evaluate` goal of the `maven-help-plugin`, as shown in the sketch below. There is, however, a performance impact, since a maven command line has to be executed for each parameter, multiplied by the number of modules. This is not a problem for `Maven lifecycle phase : deploy`.
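To illustrate the performance impact mentioned above, every property requires its own maven invocation. A minimal sketch of such an evaluation call (assuming a `maven-help-plugin` version that supports `-DforceStdout`):

```golang
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// evaluateMavenProperty shells out to the maven-help-plugin to read a single
// property such as project.artifactId or project.version.
func evaluateMavenProperty(expression string) (string, error) {
	cmd := exec.Command("mvn", "--batch-mode", "-q",
		"help:evaluate", "-Dexpression="+expression, "-DforceStdout")
	out, err := cmd.Output()
	if err != nil {
		return "", fmt.Errorf("evaluating %s failed: %w", expression, err)
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	version, err := evaluateMavenProperty("project.version")
	if err != nil {
		panic(err)
	}
	fmt.Println("project version:", version)
}
```

Multiplying this JVM start-up cost by the number of parameters and modules is what makes the approach slow for large multi-module projects.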
@@ -38,17 +38,18 @@ For this option, we only consider the goal `deploy:deploy-file`.
By default, the maven lifecycle phase `deploy` binds to the goal `deploy:deploy` of the `Apache Maven Deploy Plugin`.
##### :+1:
:+1:
- Same as the `Apache Maven Deploy Plugin`
- You don't have to obtain and pass the parameters as for `Apache Maven Deploy Plugin`, because `package` phase is executed implicitly and makes the parameters ready before `deploy` phase.
- Supports multi-module Maven projects and any project structure.
##### :-1:
:-1:
- Same case as the `Apache Maven Deploy Plugin` for handling credentials.
- Cannot be used for non-Maven projects (i.e. MTA)
- As a maven phase, a list of phases is triggered implicitly before this phase, including `compile`, `test` and `package`.
To follow the build-once principle, all these phases have to be skipped.
However, it's not possible to skip some of the maven goals binding to certain phases.
For example, if the `<packaging>` tag of the `pom.xml` is set to `jar`, then the `jar:jar` goal of the [`Apache Maven JAR Plugin`](https://maven.apache.org/plugins/maven-jar-plugin/) is bound to the `package` phase.
@@ -59,12 +60,12 @@ Unfortunately, however, `Apache Maven JAR Plugin` does not provide an option to
Files can be uploaded to the Nexus by simple HTTP PUT requests, using basic authentication if necessary. Meta-data files have to be downloaded, updated and re-uploaded after successful upload of the artifacts.
##### :+1:
:+1:
- Without the pain of handling the credentials, which was mentioned above in `Apache Maven Deploy Plugin` section.
- Gives full control over the implementation.
##### :-1:
:-1:
- Same as the `Apache Maven Deploy Plugin`. Knowledge about which artifacts to deploy has to be obtained manually.
- Same as the `Apache Maven Deploy Plugin`. A list of parameters has to be prepared.
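For the plain HTTP PUT option described above, a minimal sketch of such an upload with basic authentication; the repository URL, credentials, and file name are placeholders, and the metadata download/update/re-upload step is omitted:

```golang
package main

import (
	"fmt"
	"net/http"
	"os"
)

// uploadArtifact PUTs a local file to the given repository URL using basic
// authentication and treats any non-2xx response as a failure.
func uploadArtifact(targetURL, user, password, file string) error {
	f, err := os.Open(file)
	if err != nil {
		return err
	}
	defer f.Close()

	req, err := http.NewRequest(http.MethodPut, targetURL, f)
	if err != nil {
		return err
	}
	req.SetBasicAuth(user, password)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("upload failed with status %s", resp.Status)
	}
	return nil
}

func main() {
	err := uploadArtifact(
		"https://nexus.example.com/repository/maven-releases/com/example/demo/1.0.0/demo-1.0.0.jar",
		"deployUser", "deployPassword", "demo-1.0.0.jar")
	if err != nil {
		panic(err)
	}
}
```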

View File

@@ -57,6 +57,7 @@ Copy the sources of the application into your own Git repository. While we will
1. Get your application repository in place.
1. Create a new file with the name `Jenkinsfile` in the root level of your repository and enter the following code:
```
@Library('piper-lib-os') _
node() {
@@ -66,6 +67,7 @@ Copy the sources of the application into your own Git repository. While we will
}
}
```
The "prepare" step synchronizes the repository and initializes the project specific settings. For more information about Jenkinsfiles and pipelines, see [Using a Jenkinsfile][jenkins-io-jenkinsfile].
1. Save your changes to your remote repository.
@@ -106,14 +108,17 @@ For additional information about multibranch pipelines, please refer to the [Jen
## Add a Build Step
1. In your `Jenkinsfile`, add the following code snippet:
```
stage('build') {
mtaBuild script: this
}
```
The `mtaBuild` step calls a build tool to build a multi-target application (MTA). The tool consumes an MTA descriptor that contains the metadata of all entities which comprise an application or are used by one during deployment or runtime, and the dependencies between them. For more information about MTAs, see [sap.com][sap].
1. Create the MTA descriptor file with the name `mta.yaml` in the root level of the repository. Insert the following code:
```
_schema-version: 2.1.0
ID: com.sap.piper.node.hello.world
@@ -127,12 +132,14 @@ For additional information about multibranch pipelines, please refer to the [Jen
```
1. Configure the step to build an MTA for the Cloud Foundry environment. Create the configuration file `.pipeline/config.yml` relative to the root level of the repository and insert the following content:
```
general:
steps:
mtaBuild:
buildTarget: 'CF'
```
For additional information about the configuration, have a look at the [Common Configuration Guide][resources-configuration] and the [MTA build step documentation][resources-step-mtabuild].
1. Save your changes to your remote repository.
@@ -141,15 +148,18 @@ For additional information about multibranch pipelines, please refer to the [Jen
## Add a Deploy Step
1. In your `Jenkinsfile`, add the following code snippet:
1. In your `Jenkinsfile`, add the following code snippet:
```
stage('deploy') {
cloudFoundryDeploy script: this
}
```
The `cloudFoundryDeploy` step calls the Cloud Foundry command line client to deploy the built MTA into SAP Cloud Platform.
1. To configure the step to deploy into the Cloud Foundry environment, in your repository, open the `.pipeline/config.yml` and add the following content:
```
cloudFoundryDeploy:
deployTool: 'mtaDeployPlugin'
@@ -159,6 +169,7 @@ For additional information about multibranch pipelines, please refer to the [Jen
space: '<your-space>'
credentialsId: 'CF_CREDENTIALSID'
```
**Note:** Pay attention to the indentation of the step within the YAML. Specify the `organisation` and `space` properties. For more information about the configuration, see the [Common Configuration Guide][resources-configuration] and [cloudFoundryDeploy][resources-step-cloudFoundryDeploy].
1. The key `CF_CREDENTIALSID` refers to a user-password credential you must create in Jenkins: In Jenkins, choose **Credentials** from the main menu and add a **Username with Password** entry.
<p align="center">

View File

@@ -10,12 +10,12 @@ The goal of project "Piper" is to substantially ease setting up continuous deliv
To get you started quickly, project "Piper" offers you the following artifacts:
* A set of ready-made Continuous Delivery pipelines for direct use in your project
* [ABAP Environment Pipeline](pipelines/abapEnvironment/introduction/)
* [General Purpose Pipeline](stages/introduction/)
* [SAP Cloud SDK Pipeline][cloud-sdk-pipeline]
* [ABAP Environment Pipeline](pipelines/abapEnvironment/introduction/)
* [General Purpose Pipeline](stages/introduction/)
* [SAP Cloud SDK Pipeline][cloud-sdk-pipeline]
* [A shared library][piper-library] that contains reusable step implementations, which enable you to customize our preconfigured pipelines, or to even build your own customized ones
* A standalone [command line utility](cli) for Linux and a [GitHub Action](https://github.com/SAP/project-piper-action)
* Note: This version is still in early development. Feel free to use it and [provide feedback](https://github.com/SAP/jenkins-library/issues), but don't expect all the features of the Jenkins library
* Note: This version is still in early development. Feel free to use it and [provide feedback](https://github.com/SAP/jenkins-library/issues), but don't expect all the features of the Jenkins library
* A set of [Docker images][devops-docker-images] to setup a CI/CD environment in minutes using sophisticated life-cycle management
To find out which offering is right for you, we recommend looking at the ready-made pipelines first.

View File

@@ -244,7 +244,7 @@ If you already created your project without this option, you'll need to copy and
* [`Jenkinsfile`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile)
* [`.pipeline/config.yml`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/cf-pipeline_config.yml)
* Note: The file must be named `.pipeline/config.yml`, despite the different name of the file template
* Note: The file must be named `.pipeline/config.yml`, despite the different name of the file template
Further constraints on the project structure (this is all correct in projects generated from the _SAP Cloud Platform Business Application_ SAP Web IDE Template):

View File

@@ -91,8 +91,8 @@ The SAP Cloud SDK Pipeline includes commonly used static checks using both [PMD]
In addition to the default checks of those tools, it adds the following SAP Cloud SDK specific checks:
* To make post-mortem debugging possible
* Log the exception in the catch block or in a called handling method or reference it in a new thrown exception
* Reference the exception when logging inside a catch block
* Log the exception in the catch block or in a called handling method or reference it in a new thrown exception
* Reference the exception when logging inside a catch block
* In order to allow a smooth transition from Neo to Cloud Foundry, you should use the platform independent abstractions provided by the SAP S4HANA Cloud SDK
### Lint

View File

@@ -436,8 +436,8 @@ checkGatling:
| `options` | | | Options such as proxy. |
| `testPlan` | | `./performance-tests/*` | The directory where the test plans reside. Should reside in a subdirectory under `performance-tests` directory if both JMeter and Gatling are enabled.|
| `dockerImage` | | `famiko/jmeter-base` | JMeter docker image. |
| `failThreshold ` | | `100` | Marks build as `FAILURE` if the value exceeds the threshold. |
| `unstableThreshold ` | | `90` | Marks build as `UNSTABLE` if the value exceeds the threshold. |
| `failThreshold` | | `100` | Marks build as `FAILURE` if the value exceeds the threshold. |
| `unstableThreshold` | | `90` | Marks build as `UNSTABLE` if the value exceeds the threshold. |
Example:

View File

@@ -11,17 +11,17 @@ The SAP Cloud SDK pipeline is based on project "Piper" and offers unique feature
In conjunction with the SAP Cloud SDK libraries, the pipeline helps you to implement and automatically assure application qualities, for example:
* Functional correctness via:
* Backend and frontend unit tests
* Backend and frontend integration tests
* User acceptance testing via headless browser end-to-end tests
* Backend and frontend unit tests
* Backend and frontend integration tests
* User acceptance testing via headless browser end-to-end tests
* Non-functional qualities via:
* Dynamic resilience checks
* Performance tests based on *Gatling* or *JMeter*
* Code Security scans based on *Checkmarx* and *Fortify*
* Dependency vulnerability scans based on *Whitesource*
* IP compliance scan based on *Whitesource*
* Zero-downtime deployment
* Proper logging of application errors
* Dynamic resilience checks
* Performance tests based on *Gatling* or *JMeter*
* Code Security scans based on *Checkmarx* and *Fortify*
* Dependency vulnerability scans based on *Whitesource*
* IP compliance scan based on *Whitesource*
* Zero-downtime deployment
* Proper logging of application errors
For more details, see [Cloud Qualities](../cloud-qualities).

View File

@@ -30,7 +30,7 @@ In case you already created your project without this option, you'll need to cop
* [`Jenkinsfile`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile)
* [`.pipeline/config.yml`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/cf-pipeline_config.yml)
* Note: The file must be named `.pipeline/config.yml`, despite the different name of the file template
* Note: The file must be named `.pipeline/config.yml`, despite the different name of the file template
!!! note "Using the right project structure"
This only applies to projects created based on the _SAP Cloud Platform Business Application_ template after September 6th 2019. They must comply with the structure which is described [here](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/build-tools.md#sap-cloud-application-programming-model--mta).
@@ -41,7 +41,7 @@ Now, you'll need to push the code to a git repository.
This is required because the pipeline gets your code via git.
This might be GitHub, or any other cloud or on-premise git solution you have in your company.
Be sure to configure the [`productionDeployment `](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#productiondeployment) stage so your changes are deployed to SAP Cloud Platform automatically.
Be sure to configure the [`productionDeployment`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#productiondeployment) stage so your changes are deployed to SAP Cloud Platform automatically.
## Legacy documentation

View File

@@ -50,9 +50,9 @@ As explained above, the shipment of a software takes place via software product
- Product release
- Product Support Package stack and Patch level
- A list of contained software component versions with
- Software component name
- Software component release
- Delivery Package, which delivers the versions
- Software component name
- Software component release
- Delivery Package, which delivers the versions
## Building the Add-on Product
@@ -65,19 +65,24 @@ The pipeline consists of different steps responsible for a single task. The step
Different services and systems are required for the add-on build process.
### Delivery Tools
With the following tools the add-on deliveries are created.
#### Assembly System
First, there is the ABAP system responsible for the add-on assembly. It is created during the pipeline and deleted at the end. All actions related to the ABAP source code are executed on this system, e.g. running checks with the ABAP Test Cockpit (ATC) or the physical build of the software components. There are two communication scenarios containing the different APIs of the ABAP Environment System: [Test Integration](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/b04a9ae412894725a2fc539bfb1ca055.html) and [Software Assembly Integration](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/b04a9ae412894725a2fc539bfb1ca055.html).
The assembly system should be of [service type abap](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/f0163565eb554f009f990652ca41d1c6.html) and be provisioned with parameter `is_development_allowed = false` to prevent local changes.
#### Add-on Assembly Kit as a Service (=AAKaaS)
The Add-on Assembly Kit as a Service is responsible for registering and publishing the software product. It is accessible via APIs with an S-User.
### Deployment Tools
With these SAP tools the assembled add-on deliveries are deployed to ABAP systems, for example into the [installation test system](#installation-test-system).
#### Installation Test System
In order to verify that the delivery packages included in the add-on product version being built are installable, a target vector is published in "test" scope. In the *Integration Tests* stage an ABAP system of service type abap-oem is created. This ABAP OEM service makes it possible to install a specific add-on product version into an ABAP system that is provisioned. The installation test system should be provisioned with parameter `is_development_allowed = false` to prevent local changes.
### Prerequisites

View File

@@ -39,7 +39,8 @@ The basic workflow is as follows:
1. As soon as the development process is completed, the change document in SAP Solution Manager can be set to status `to be tested` and all components can be transported to the test system.
![Hybrid Application Development Workflow](../images/Scenario_SolMan.png "Hybrid Application Development Workflow")
###### Hybrid Application Development Workflow
**Hybrid Application Development Workflow**
## Example

View File

@@ -24,7 +24,8 @@ This scenario combines various different steps to create a complete pipeline.
In this scenario, we want to show how to build an application based on SAPUI5 or SAP Fiori by using the multi-target application (MTA) concept, and how to deploy the build result into an SAP Cloud Platform account in the Cloud Foundry environment. This document comprises the [mtaBuild](../../../steps/mtaBuild/) and the [cloudFoundryDeploy](../../../steps/cloudFoundryDeploy/) steps.
![This pipeline in Jenkins Blue Ocean](images/pipeline.jpg)
###### Screenshot: Build and Deploy Process in Jenkins
**Screenshot: Build and Deploy Process in Jenkins**
## Example

View File

@@ -47,7 +47,8 @@ TransportRequest: <YOUR TRANSPORT REQUEST ID>
By default, the Git commits between the merge base with the base branch (default: `master`) and the current branch head are traversed.
![This pipeline in Jenkins Blue Ocean](images/pipeline.png)
###### Screenshot: Build and Deploy Process in Jenkins
**Screenshot: Build and Deploy Process in Jenkins**
## Examples

View File

@@ -21,7 +21,8 @@ This scenario combines various different steps to create a complete pipeline.
In this scenario, we want to show how to build a Multitarget Application (MTA) and deploy the build result into an on-prem SAP HANA XS advances system. This document comprises the [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/) and the [xsDeploy](https://sap.github.io/jenkins-library/steps/xsDeploy/) steps.
![This pipeline in Jenkins Blue Ocean](images/pipeline.jpg)
###### Screenshot: Build and Deploy Process in Jenkins
**Screenshot: Build and Deploy Process in Jenkins**
## Example

View File

@@ -13,6 +13,7 @@ piperPipeline script: this
## Pure Pull-Request Voting
.pipeline/config.yml:
``` YAML
general:
buildTool: 'npm'
@@ -30,4 +31,4 @@ In order to use a custom defaults only a simple extension to the `Jenkinsfile` i
piperPipeline script: this, customDefaults: ['myCustomDefaults.yml']
```
## more examples to come ...
## more examples to come

View File

@@ -2,11 +2,11 @@
The pipeline consists of a sequence of stages where each contains a number of individual steps.
### First step: Pull Request Pipeline
## First step: Pull Request Pipeline
In order to validate pull-requests to your GitHub repository you need to perform two simple steps:
#### 1. Create Pipeline configuration
### 1. Create Pipeline configuration
Create a file `.pipeline/config.yml` in your repository (typically in `master` branch) with the following content:
@@ -27,7 +27,7 @@ general:
If your build tool is not in the list you can still use further options as described for [Pull-Request Voting Stage](prvoting.md)
#### 2. Create Jenkinsfile
### 2. Create Jenkinsfile
Create a file called `Jenkinsfile` in the root of your repository (typically in `master` branch) with the following content:
@@ -48,7 +48,7 @@ piperPipeline script: this
You find more details about the custom defaults in the [configuration section](../configuration.md)
### Second step: Prepare pipeline for your main branch.
## Second step: Prepare pipeline for your main branch
Extend your configuration to also contain git ssh credentials information.
@@ -64,7 +64,7 @@ general:
The pointer to the Jenkins credentials containing your ssh private key is an important part of the pipeline run.
The credentials are for example required to push automatic versioning information to your GitHub repository.
### Subsequent steps: Configure individual stages
## Subsequent steps: Configure individual stages
The stages of the pipeline can be configured individually.
As a general rule of thumb, only stages with an existing configuration are executed.
@@ -75,7 +75,7 @@ If no dedicated configuration is required for a step, the precence of relevant f
The pipeline comprises following stages:
#### Init
### Init
This stage takes care that the pipeline is initialized correctly.
It will for example:
@@ -87,7 +87,7 @@ It will for example:
You find details about this stage on [**Init Stage** Details](init.md)
#### Pull-Request Voting
### Pull-Request Voting
This stage is responsible for validating pull-requests, see also above.

View File

@@ -27,15 +27,15 @@ none
## Exceptions
* `Exception`:
* If `source` is not provided.
* If `propertiesFile` is not provided (when using `'WAR_PROPERTIESFILE'` deployment mode).
* If `application` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtime` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtimeVersion` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `source` is not provided.
* If `propertiesFile` is not provided (when using `'WAR_PROPERTIESFILE'` deployment mode).
* If `application` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtime` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtimeVersion` is not provided (when using `'WAR_PARAMS'` deployment mode).
* `AbortException`:
* If neo-java-web-sdk is not properly installed.
* If neo-java-web-sdk is not properly installed.
* `CredentialNotFoundException`:
* If the credentials cannot be resolved.
* If the credentials cannot be resolved.
## Example

View File

@@ -69,7 +69,7 @@ withCredentials([usernamePassword(
}
```
In a Pipeline Template, a [Stage Exit](#) can be used to fetch the credentials and store them in the environment. As the environment is passed down to uiVeri5ExecuteTests, the variables will be present there. This is an example for the stage exit `.pipeline/extensions/Acceptance.groovy` where the `credentialsId` is read from the `config.yml`:
In a Pipeline Template, a [Stage Exit](../extensibility/#1-extend-individual-stages) can be used to fetch the credentials and store them in the environment. As the environment is passed down to uiVeri5ExecuteTests, the variables will be present there. This is an example for the stage exit `.pipeline/extensions/Acceptance.groovy` where the `credentialsId` is read from the `config.yml`:
```groovy
void call(Map params) {