mirror of https://github.com/SAP/jenkins-library.git synced 2024-12-14 11:03:09 +02:00

Merge branch 'master' into whitesource-step

This commit is contained in:
Sven Merk 2019-04-03 12:19:34 +02:00 committed by GitHub
commit 1fe05b8a56
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
22 changed files with 217 additions and 383 deletions

View File

@ -1,37 +1,10 @@
# durationMeasure
# ${docGenStepName}
## Description
## ${docGenDescription}
This step is used to measure the duration of a set of steps, e.g. a certain stage.
The duration is stored in a Map. The measurement data can then be written to an Influx database using step [influxWriteData](influxWriteData.md).
## ${docGenParameters}
!!! tip
Measuring, for example, the duration of pipeline stages helps to identify potential bottlenecks within the deployment pipeline.
This then helps to counter identified issues with respective optimization measures, e.g. parallelization of tests.
## Prerequisites
none
## Pipeline configuration
none
## Parameters
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
| script | yes | | |
| measurementName | no | test_duration | |
Details:
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for storing the measured duration.
* `measurementName` defines the name of the measurement which is written to the Influx database.
## Step configuration
none
## ${docGenConfiguration}
## Example
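A minimal usage sketch, assuming the step wraps the work whose duration should be measured and `this` is passed as `script` (the measurement name and inner step are illustrative):

```groovy
durationMeasure(script: this, measurementName: 'build_duration') {
    // all steps executed inside this closure contribute to the measured duration
    mavenExecute script: this, goals: 'clean install'
}
```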

View File

@ -1,55 +1,17 @@
# handlePipelineStepErrors
# ${docGenStepName}
## Description
Used by other steps to make error analysis easier. Lists parameters and other data available to the step in which the error occurs.
## ${docGenDescription}
## Prerequisites
none
## Parameters
| parameter | mandatory | default | possible values |
| -----------------|-----------|---------|-----------------|
| `stepParameters` | yes | | |
| `stepName` | yes | | |
| `echoDetails` | yes | true | true, false |
* `stepParameters` - The parameters from the step to be executed. The list of parameters is then shown in the console output.
* `stepName` - The name of the step executed to be shown in the console output.
* `echoDetails` - If set to true the following will be output to the console:
1. Step beginning: `--- Begin library step: ${stepName}.groovy ---`
2. Step end: `--- End library step: ${stepName}.groovy ---`
3. Step errors:
```log
----------------------------------------------------------
--- An error occurred in the library step: ${stepName}
----------------------------------------------------------
The following parameters were available to the step:
***
${stepParameters}
***
The error was:
***
${err}
***
Further information:
* Documentation of step ${stepName}: .../${stepName}/
* Pipeline documentation: https://...
* GitHub repository for pipeline steps: https://...
----------------------------------------------------------
```
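For illustration, a hedged sketch of how a custom library step might wrap its body with `handlePipelineStepErrors`; the step name and body are illustrative:

```groovy
void call(Map parameters = [:]) {
    handlePipelineStepErrors(stepName: 'myCustomStep', stepParameters: parameters) {
        // actual step logic; if it throws, the error block shown above is printed
        // (including the available parameters) before the error is rethrown
        echo "running myCustomStep with ${parameters}"
    }
}
```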
## ${docGenParameters}
## Step configuration
none
## Side effects
none
## Exceptions
none

View File

@ -1,19 +1,6 @@
# healthExecuteCheck
# ${docGenStepName}
## Description
Calls the health endpoint url of the application.
The intention of the check is to verify that a suitable health endpoint is available. Such a health endpoint is required for operation purposes.
This check is used as a real-life test for your productive health endpoints.
!!! note "Check Depth"
Typically, tools performing simple health checks are not too smart. Therefore it is important to choose an endpoint for checking wisely.
This check therefore only checks if the application/service url returns `HTTP 200`.
This is in line with health check capabilities of platforms which are used for example in load balancing scenarios. Here you can find an [example for Amazon AWS](http://docs.aws.amazon.com/elasticloadbalancing/latest/classic/elb-healthchecks.html).
## ${docGenDescription}
## Prerequisites
@ -25,6 +12,10 @@ Endpoint for health check is configured.
!!! tip
If using the Spring Boot framework, ideally the provided `/health` endpoint is used and extended by development. Further information can be found in the [Spring Boot documentation for Endpoints](http://docs.spring.io/spring-boot/docs/current/reference/html/production-ready-endpoints.html)
## ${docGenParameters}
## ${docGenConfiguration}
## Example
Pipeline step:
@ -32,29 +23,3 @@ Pipeline step:
```groovy
healthExecuteCheck testServerUrl: 'https://testserver.com'
```
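If the health check is not exposed under the application root, a dedicated endpoint can be passed in addition (hedged sketch; the endpoint name is illustrative):

```groovy
healthExecuteCheck testServerUrl: 'https://testserver.com', healthEndpoint: 'health'
```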
## Parameters
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
|script|yes|||
|healthEndpoint|no|``||
|testServerUrl|no|||
Details:
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for retrieving, for example, configuration parameters.
* The health check function is called with the fully qualified `testServerUrl` (and optionally with `healthEndpoint` if the endpoint is not the standard url).
* If the response of the call is different from `HTTP 200 OK`, the **health check fails and the pipeline stops**.
## Step configuration
We recommend defining the values of step parameters via the [config.yml file](../configuration.md).
Configuration is possible in the following sections:
| parameter | general | step | stage |
| ----------|-----------|---------|-----------------|
|script||||
|healthEndpoint|X|X|X|
|testServerUrl|X|X|X|

View File

@ -1,47 +1,10 @@
# mavenExecute
# ${docGenStepName}
## Description
## ${docGenDescription}
Executes a maven command inside a Docker container.
## ${docGenParameters}
## Parameters
| parameter | mandatory | default | example values |
| -------------------------------|-----------|-------------------|----------------------------|
| `script` | yes | | |
| `dockerImage` | no | 'maven:3.5-jdk-7' | |
| `globalSettingsFile` | no | | 'local_folder/settings.xml'|
| `projectSettingsFile` | no | | |
| `pomPath` | no | | 'local_folder/m2' |
| `flags` | no | | '-o' |
| `goals` | no | | 'clean install' |
| `m2Path` | no | | 'local_folder/m2' |
| `defines` | no | | '-Dmaven.tests.skip=true' |
| `logSuccessfulMavenTransfers` | no | `false` | 'true' |
* `script` defines the global script environment of the Jenkinsfile run.
Typically `this` is passed to this parameter. This allows the function
to access the commonPipelineEnvironment for retrieving, for example,
configuration parameters.
* `dockerImage` Name of the docker image that should be used.
* `globalSettingsFile` Path or url to the mvn settings file that should be used as global settings file.
* `projectSettingsFile` Path or url to the mvn settings file that should be used as project settings file.
* `pomPath` Path to the pom file that should be used.
* `flags` Flags to provide when running mvn.
* `goals` Maven goals that should be executed.
* `m2Path` Path to the location of the local repository that should be used.
* `defines` Additional properties.
* `logSuccessfulMavenTransfers` configures maven to log successful downloads. This is set to `false` by default to reduce the noise in build logs.
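A hedged usage sketch combining several of the parameters above (all values are illustrative):

```groovy
mavenExecute(
    script: this,
    dockerImage: 'maven:3.5-jdk-8-alpine',
    goals: 'clean install',
    defines: '-Dmaven.test.skip=true',
    logSuccessfulMavenTransfers: true
)
```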
## Step configuration
The following parameters can also be specified as step parameters using the global configuration file:
* `dockerImage`
* `globalSettingsFile`
* `projectSettingsFile`
* `pomPath`
* `m2Path`
## ${docGenConfiguration}
## Exceptions

View File

@ -1,44 +1,18 @@
# mtaBuild
# ${docGenStepName}
## Description
## ${docGenDescription}
Executes the SAP Multitarget Application Archive Builder to create an mtar archive of the application.
## Prerequisites
Before the build is executed, the step validates that the SAP Multitarget Application Archive Builder exists and that its version is compatible.
While using a custom docker file, ensure that the following tools are installed:
Note that a version is formed by `major.minor.patch`, and a version is compatible with another version if the minor and patch versions are higher, but the major version is not, e.g. if 3.39.10 is the expected version, 3.39.11 and 3.40.1 would be compatible versions, but 4.0.1 would not be a compatible version.
* **SAP MTA Archive Builder 1.0.6 or compatible version** - can be downloaded from [SAP Development Tools](https://tools.hana.ondemand.com/#cloud).
* **Java 8 or compatible version** - necessary to run the *MTA Archive Builder* itself and to build Java modules.
* **NodeJS installed** - the MTA Builder uses `npm` to download node module dependencies such as `grunt`.
## Parameters
## ${docGenParameters}
| parameter | mandatory | default | possible values |
| -----------------|-----------|--------------------------------------------------------|--------------------|
| `script` | yes | | |
| `dockerImage` | no | `ppiper/mta-archive-builder` | |
| `dockerOptions` | no | '' | |
| `buildTarget` | yes | `'NEO'` | 'CF', 'NEO', 'XSA' |
| `extension` | no | | |
| `mtaJarLocation` | no | `'/opt/sap/mta/lib/mta.jar'` | |
| `applicationName`| no | | |
* `script` - The common script environment of the Jenkinsfile running. Typically the reference to the script calling the pipeline step is provided with the `this` parameter, as in `script: this`. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for retrieving, for example, configuration parameters.
* `dockerImage` - The Docker image to execute the MTA build.
Note that you can provide your own image if required, but for most cases, the default should be fine.
* `dockerOptions` Docker options to be set when starting the container. It can be a list or a string.
* `buildTarget` - The target platform to which the mtar can be deployed.
* `extension` - The path to the extension descriptor file.
* `mtaJarLocation` - The location of the SAP Multitarget Application Archive Builder jar file, including file name and extension. First, the location is retrieved from the environment variables using the environment variable `MTA_JAR_LOCATION`. If no environment variable is provided, the location is retrieved from the parameters, or the step configuration using the key `mtaJarLocation`. If the SAP Multitarget Application Archive Builder is not found in any of these locations, an AbortException is thrown.
Note that the environment variable `MTA_JAR_LOCATION` has priority. In case the script runs on multiple nodes, the SAP Multitarget Application Archive Builder must be located on all of them; therefore, the environment variable must also be configured on all the nodes.
* `applicationName` - The name of the application which is being built. If the parameter has been provided and no `mta.yaml` exists, the `mta.yaml` will be automatically generated using this parameter and the information (`name` and `version`) from `package.json` before the actual build starts.
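A hedged usage sketch (the target platform and application name are illustrative):

```groovy
mtaBuild script: this, buildTarget: 'CF', applicationName: 'myApp'
```

If no `mta.yaml` exists, the given `applicationName` together with `name` and `version` from `package.json` would be used to generate one before the build starts, as described above.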
## Step configuration
The following parameters can also be specified as step parameters using the global configuration file:
* `dockerImage`
* `buildTarget`
* `extension`
* `mtaJarLocation`
* `applicationName`
## ${docGenConfiguration}
## Side effects
@ -47,7 +21,6 @@ The following parameters can also be specified as step parameters using the glob
## Exceptions
* `AbortException`:
* If SAP Multitarget Application Archive Builder is not found.
* If there is an invalid `buildTarget`.
* If there is no key `ID` inside the `mta.yaml` file.

View File

@ -1,34 +1,14 @@
# pipelineExecute
# ${docGenStepName}
## Description
Loads a pipeline from a git repository. The idea is to set up a pipeline job in Jenkins that loads a minimal pipeline, which in turn loads the shared library and then uses this step to load the actual pipeline.
A centrally maintained pipeline script (Jenkinsfile) can be re-used by
several projects using `pipelineExecute` as outlined in the example
below.
## ${docGenDescription}
## Prerequisites
none
## Parameters
## ${docGenParameters}
| parameter | mandatory | default | possible values |
| -------------------|-----------|-----------------|-----------------|
| `repoUrl` | yes | | |
| `branch` | no | 'master' | |
| `path` | no | 'Jenkinsfile' | |
| `credentialsId` | no | An empty String | |
* `repoUrl` The url to the git repository of the pipeline to be loaded.
* `branch` The branch of the git repository from which the pipeline should be checked out.
* `path` The path to the Jenkinsfile, inside the repository, to be loaded.
* `credentialsId` The Jenkins credentials containing user and password needed to access a private git repository.
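A hedged sketch of the minimal Jenkinsfile referred to above; the library identifier, repository URL, and credentials id are illustrative and depend on the Jenkins installation:

```groovy
@Library('piper-lib-os') _

// load the shared library, then delegate to the centrally maintained pipeline
pipelineExecute(
    repoUrl: 'https://github.com/example-org/central-pipeline.git',
    branch: 'master',
    path: 'Jenkinsfile',
    credentialsId: 'my-git-credentials'
)
```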
## Step configuration
none
## ${docGenConfiguration}
## Side effects

View File

@ -349,7 +349,6 @@ steps:
sendMail: true
timeoutInSeconds: 900
pipelineStashFilesAfterBuild:
runOpaTests: false
stashIncludes:
checkmarx: '**/*.js, **/*.scala, **/*.py, **/*.go, **/*.xml, **/*.html'
classFiles: '**/target/classes/**/*.class, **/target/test-classes/**/*.class'
@ -358,12 +357,12 @@ steps:
checkmarx: '**/*.mockserver.js, node_modules/**/*.js'
classFiles: ''
sonar: ''
noDefaultExludes: []
pipelineStashFilesBeforeBuild:
runCheckmarx: false
stashIncludes:
buildDescriptor: '**/pom.xml, **/.mvn/**, **/assembly.xml, **/.swagger-codegen-ignore, **/package.json, **/requirements.txt, **/setup.py, **/mta*.y*ml, **/.npmrc, Dockerfile, **/VERSION, **/version.txt, **/Gopkg.*, **/build.sbt, **/sbtDescriptor.json, **/project/*'
deployDescriptor: '**/manifest*.y*ml, **/*.mtaext.y*ml, **/*.mtaext, **/xs-app.json, helm/**, *.y*ml'
git: '**/gitmetadata/**'
git: '.git/**'
opa5: '**/*.*'
opensourceConfiguration: '**/srcclr.yml, **/vulas-custom.properties, **/.nsprc, **/.retireignore, **/.retireignore.json, **/.snyk, **/wss-unified-agent.config, **/vendor/**/*'
pipelineConfigAndTests: '.pipeline/**'
@ -378,6 +377,8 @@ steps:
pipelineConfigAndTests: ''
securityDescriptor: ''
tests: ''
noDefaultExludes:
- 'git'
seleniumExecuteTests:
buildTool: 'npm'
containerPortMappings:
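The new `noDefaultExludes` entries above (key name as spelled in the library) list stash names for which Jenkins' default excludes are not applied. A hedged sketch of setting this per pipeline as a step parameter:

```groovy
// keep the .git content in the 'git' stash despite Jenkins' default (Ant) excludes
pipelineStashFilesBeforeBuild script: this, noDefaultExludes: ['git']
```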

View File

@ -21,9 +21,19 @@ def getMandatoryParameter(Map map, paramName, defaultValue = null) {
}
def stash(name, include = '**/*.*', exclude = '') {
echo "Stash content: ${name} (include: ${include}, exclude: ${exclude})"
steps.stash name: name, includes: include, excludes: exclude
def stash(name, include = '**/*.*', exclude = '', useDefaultExcludes = true) {
echo "Stash content: ${name} (include: ${include}, exclude: ${exclude}, useDefaultExcludes: ${useDefaultExcludes})"
Map stashParams = [
name: name,
includes: include,
excludes: exclude
]
//only set the optional parameter if default excludes should not be applied
if (!useDefaultExcludes) {
stashParams.useDefaultExcludes = useDefaultExcludes
}
steps.stash stashParams
}
def stashList(script, List stashes) {
@ -46,9 +56,9 @@ def stashList(script, List stashes) {
}
}
def stashWithMessage(name, msg, include = '**/*.*', exclude = '') {
def stashWithMessage(name, msg, include = '**/*.*', exclude = '', useDefaultExcludes = true) {
try {
stash(name, include, exclude)
stash(name, include, exclude, useDefaultExcludes)
} catch (e) {
echo msg + name + " (${e.getMessage()})"
}
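A hedged sketch of how a caller could use the extended helper signatures, assuming a `Utils` instance; the stash name and patterns follow the defaults shown elsewhere in this commit:

```groovy
import com.sap.piper.Utils

def utils = new Utils()
// pass useDefaultExcludes = false so that .git content survives Jenkins' default (Ant) excludes
utils.stash('git', '.git/**', '', false)
// same include/exclude handling, but only an echo message is issued if stashing fails
utils.stashWithMessage('git', '[myStep] no git repo files detected: ', '.git/**', '', false)
```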

View File

@ -78,7 +78,7 @@ class ArtifactSetVersionTest extends BasePiperTest {
return closure()
})
shellRule.setReturnValue("date --universal +'%Y%m%d%H%M%S'", '20180101010203')
shellRule.setReturnValue("date --utc +'%Y%m%d%H%M%S'", '20180101010203')
shellRule.setReturnValue('git diff --quiet HEAD', 0)
helper.registerAllowedMethod('fileExists', [String.class], {true})

View File

@ -240,7 +240,7 @@ class DockerExecuteOnKubernetesTest extends BasePiperTest {
],
containerName: 'mavenexecute',
containerPortMappings: [
'selenium/standalone-chrome': [[containerPort: 4444, hostPort: 4444]]
'selenium/standalone-chrome': [[containerPort: 4444]]
],
containerWorkspaces: [
'selenium/standalone-chrome': ''
@ -263,8 +263,7 @@ class DockerExecuteOnKubernetesTest extends BasePiperTest {
hasItem('maven:3.5-jdk-8-alpine'),
hasItem('selenium/standalone-chrome'),
))
// assertThat(portList, is(null))
assertThat(portList, hasItem([[name: 'selenium0', containerPort: 4444, hostPort: 4444]]))
assertThat(portList, hasItem([[name: 'selenium0', containerPort: 4444]]))
assertThat(containerCommands.size(), is(1))
assertThat(envList, hasItem(hasItem(allOf(hasEntry('name', 'customEnvKey'), hasEntry ('value','customEnvValue')))))
}

View File

@ -22,22 +22,6 @@ class PipelineStashFilesAfterBuildTest extends BasePiperTest {
@Test
void testStashAfterBuild() {
helper.registerAllowedMethod("fileExists", [String.class], {
searchTerm ->
return false
})
stepRule.step.pipelineStashFilesAfterBuild(
script: nullScript,
juStabUtils: utils
)
// asserts
assertFalse(loggingRule.log.contains('Stash content: checkmarx'))
assertThat(loggingRule.log, containsString('Stash content: classFiles'))
assertThat(loggingRule.log, containsString('Stash content: sonar'))
}
@Test
void testStashAfterBuildWithCheckmarx() {
helper.registerAllowedMethod("fileExists", [String.class], {
searchTerm ->
return true
@ -52,21 +36,4 @@ class PipelineStashFilesAfterBuildTest extends BasePiperTest {
assertThat(loggingRule.log, containsString('Stash content: classFiles'))
assertThat(loggingRule.log, containsString('Stash content: sonar'))
}
@Test
void testStashAfterBuildWithCheckmarxConfig() {
helper.registerAllowedMethod("fileExists", [String.class], {
searchTerm ->
return true
})
stepRule.step.pipelineStashFilesAfterBuild(
script: [commonPipelineEnvironment: [configuration: [steps: [executeCheckmarxScan: [checkmarxProject: 'TestProject']]]]],
juStabUtils: utils,
)
// asserts
assertThat(loggingRule.log, containsString('Stash content: checkmarx'))
assertThat(loggingRule.log, containsString('Stash content: classFiles'))
assertThat(loggingRule.log, containsString('Stash content: sonar'))
}
}

View File

@ -22,27 +22,7 @@ class PipelineStashFilesBeforeBuildTest extends BasePiperTest {
.around(stepRule)
@Test
void testStashBeforeBuildNoOpa() {
stepRule.step.pipelineStashFilesBeforeBuild(script: nullScript, juStabUtils: utils)
// asserts
assertEquals('mkdir -p gitmetadata', shellRule.shell[0])
assertEquals('cp -rf .git/* gitmetadata', shellRule.shell[1])
assertEquals('chmod -R u+w gitmetadata', shellRule.shell[2])
assertThat(loggingRule.log, containsString('Stash content: buildDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: deployDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: git'))
assertFalse(loggingRule.log.contains('Stash content: opa5'))
assertThat(loggingRule.log, containsString('Stash content: opensourceConfiguration'))
assertThat(loggingRule.log, containsString('Stash content: pipelineConfigAndTests'))
assertThat(loggingRule.log, containsString('Stash content: securityDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: tests'))
}
@Test
void testStashBeforeBuildOpa() {
void testStashBeforeBuild() {
stepRule.step.pipelineStashFilesBeforeBuild(script: nullScript, juStabUtils: utils, runOpaTests: true)
@ -56,4 +36,21 @@ class PipelineStashFilesBeforeBuildTest extends BasePiperTest {
assertThat(loggingRule.log, containsString('Stash content: securityDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: tests'))
}
@Test
void testStashBeforeBuildCustomConfig() {
stepRule.step.pipelineStashFilesBeforeBuild(script: nullScript, juStabUtils: utils, runOpaTests: true, stashIncludes: ['myStash': '**.myTest'])
// asserts
assertThat(loggingRule.log, containsString('Stash content: buildDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: deployDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: git'))
assertThat(loggingRule.log, containsString('Stash content: opa5'))
assertThat(loggingRule.log, containsString('Stash content: opensourceConfiguration'))
assertThat(loggingRule.log, containsString('Stash content: pipelineConfigAndTests'))
assertThat(loggingRule.log, containsString('Stash content: securityDescriptor'))
assertThat(loggingRule.log, containsString('Stash content: tests'))
assertThat(loggingRule.log, containsString('Stash content: myStash'))
}
}

View File

@ -135,5 +135,5 @@ def isAppContainer(config){
}
def getTimestamp(pattern){
return sh(returnStdout: true, script: "date --universal +'${pattern}'").trim()
return sh(returnStdout: true, script: "date --utc +'${pattern}'").trim()
}

View File

@ -72,6 +72,12 @@ import hudson.AbortException
* Specifies a dedicated user home directory for the container which will be passed as value for environment variable `HOME`.
*/
'dockerWorkspace',
/**
* Kubernetes Security Context used for the pod.
* Can be used to specify uid and fsGroup.
* See: https://kubernetes.io/docs/tasks/configure-pod-container/security-context/
*/
'securityContext',
/**
* Specific stashes that should be considered for the step execution.
*/
@ -83,13 +89,7 @@ import hudson.AbortException
/**
*
*/
'stashIncludes',
/**
* Kubernetes Security Context used for the pod.
* Can be used to specify uid and fsGroup.
* See: https://kubernetes.io/docs/tasks/configure-pod-container/security-context/
*/
'securityContext'
'stashIncludes'
])
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS.minus([
'stashIncludes',
@ -277,19 +277,15 @@ private List getContainerList(config) {
}
if (config.containerPortMappings?.get(imageName)) {
def portMapping = { m ->
[
name: m.name,
containerPort: m.containerPort,
hostPort: m.hostPort
]
}
def ports = []
def portCounter = 0
config.containerPortMappings.get(imageName).each {mapping ->
mapping.name = "${containerName}${portCounter}".toString()
ports.add(portMapping(mapping))
def name = "${containerName}${portCounter}".toString()
if(mapping.containerPort != mapping.hostPort) {
echo ("[WARNING][${STEP_NAME}]: containerPort and hostPort are different for container '${containerName}'. "
+ "The hostPort will be ignored.")
}
ports.add([name: name, containerPort: mapping.containerPort])
portCounter ++
}
containerSpec.ports = ports
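For illustration, a hedged sketch of a call that exercises this port-mapping code; the parameter values follow the test shown earlier, and any `hostPort` passed here would now only trigger the warning and otherwise be ignored:

```groovy
dockerExecuteOnKubernetes(
    script: this,
    dockerImage: 'selenium/standalone-chrome',
    containerPortMappings: ['selenium/standalone-chrome': [[containerPort: 4444]]]
) {
    // inside the pod the Selenium container is reachable on containerPort 4444
    echo 'running tests against selenium/standalone-chrome'
}
```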

View File

@ -1,8 +1,23 @@
import com.sap.piper.GenerateDocumentation
import static com.sap.piper.Prerequisites.checkScript
import groovy.transform.Field
@Field STEP_NAME = getClass().getName()
@Field Set PARAMETER_KEYS = [
/** Defines the name of the measurement which is written to the Influx database.*/
'measurementName'
]
/**
* This step is used to measure the duration of a set of steps, e.g. a certain stage.
* The duration is stored in a Map. The measurement data can then be written to an Influx database using step [influxWriteData](influxWriteData.md).
*
* !!! tip
* Measuring, for example, the duration of pipeline stages helps to identify potential bottlenecks within the deployment pipeline.
* This then helps to counter identified issues with respective optimization measures, e.g. parallelization of tests.
*/
@GenerateDocumentation
def call(Map parameters = [:], body) {
def script = checkScript(this, parameters)

View File

@ -1,5 +1,6 @@
import com.cloudbees.groovy.cps.NonCPS
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import groovy.text.SimpleTemplateEngine
@ -10,14 +11,49 @@ import groovy.transform.Field
@Field Set GENERAL_CONFIG_KEYS = []
@Field Set STEP_CONFIG_KEYS = []
@Field Set PARAMETER_KEYS = [
/**
* If set to true the following will be output to the console:
* 1. Step beginning: `--- Begin library step: ${stepName}.groovy ---`
* 2. Step end: `--- End library step: ${stepName}.groovy ---`
* 3. Step errors:
*
* ```log
* ----------------------------------------------------------
* --- An error occurred in the library step: ${stepName}
* ----------------------------------------------------------
* The following parameters were available to the step:
* ***
* ${stepParameters}
* ***
* The error was:
* ***
* ${err}
* ***
* Further information:
* * Documentation of step ${stepName}: .../${stepName}/
* * Pipeline documentation: https://...
* * GitHub repository for pipeline steps: https://...
* ----------------------------------------------------------
* ```
* @possibleValues `true`, `false`
*/
'echoDetails',
/** Defines the url of the library's documentation that will be used to generate the corresponding links to the step documentation.*/
'libraryDocumentationUrl',
/** Defines the url of the library's repository that will be used to generate the corresponding links to the step implementation.*/
'libraryRepositoryUrl',
/** Defines the name of the step executed that will be shown in the console output.*/
'stepName',
/** */
'stepNameDoc',
/** Defines the parameters from the step to be executed. The list of parameters is then shown in the console output.*/
'stepParameters'
]
/**
* Used by other steps to make error analysis easier. Lists parameters and other data available to the step in which the error occurs.
*/
@GenerateDocumentation
void call(Map parameters = [:], body) {
// load default & individual configuration
Map config = ConfigurationHelper.newInstance(this)

View File

@ -1,5 +1,6 @@
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.Utils
import groovy.transform.Field
@ -9,12 +10,32 @@ import groovy.transform.Field
@Field Set GENERAL_CONFIG_KEYS = STEP_CONFIG_KEYS
@Field Set STEP_CONFIG_KEYS = [
/** Optionally with `healthEndpoint` the health check function is called if the endpoint is not the standard url.*/
'healthEndpoint',
/**
* Health check function is called providing the fully qualified `testServerUrl` to the health check.
*
*/
'testServerUrl'
]
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS
/**
* Calls the health endpoint url of the application.
*
* The intention of the check is to verify that a suitable health endpoint is available. Such a health endpoint is required for operation purposes.
*
* This check is used as a real-life test for your productive health endpoints.
*
* !!! note "Check Depth"
* Typically, tools performing simple health checks are not too smart. Therefore it is important to choose an endpoint for checking wisely.
*
* This check therefore only checks if the application/service url returns `HTTP 200`.
*
* This is in line with health check capabilities of platforms which are used for example in load balancing scenarios. Here you can find an [example for Amazon AWS](http://docs.aws.amazon.com/elasticloadbalancing/latest/classic/elb-healthchecks.html).
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
handlePipelineStepErrors (stepName: STEP_NAME, stepParameters: parameters) {
def script = checkScript(this, parameters) ?: this

View File

@ -1,5 +1,6 @@
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.Utils
@ -9,20 +10,37 @@ import groovy.transform.Field
@Field Set GENERAL_CONFIG_KEYS = []
@Field Set STEP_CONFIG_KEYS = [
/** @see dockerExecute */
'dockerImage',
/** Path or url to the mvn settings file that should be used as global settings file.*/
'globalSettingsFile',
/** Path or url to the mvn settings file that should be used as project settings file.*/
'projectSettingsFile',
/** Path to the pom file that should be used.*/
'pomPath',
/** Path to the location of the local repository that should be used.*/
'm2Path'
]
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS.plus([
/** @see dockerExecute */
'dockerOptions',
/** Flags to provide when running mvn.*/
'flags',
/** Maven goals that should be executed.*/
'goals',
/** Additional properties.*/
'defines',
/**
* Configures maven to log successful downloads. This is set to `false` by default to reduce the noise in build logs.
* @possibleValues `true`, `false`
*/
'logSuccessfulMavenTransfers'
])
/**
* Executes a maven command inside a Docker container.
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
handlePipelineStepErrors(stepName: STEP_NAME, stepParameters: parameters) {

View File

@ -1,5 +1,6 @@
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.MtaUtils
import com.sap.piper.Utils
@ -10,16 +11,32 @@ import groovy.transform.Field
@Field Set GENERAL_CONFIG_KEYS = []
@Field Set STEP_CONFIG_KEYS = [
/** The name of the application which is being built. If the parameter has been provided and no `mta.yaml` exists, the `mta.yaml` will be automatically generated using this parameter and the information (`name` and `version`) from `package.json` before the actual build starts.*/
'applicationName',
/**
* The target platform to which the mtar can be deployed.
* @possibleValues 'CF', 'NEO', 'XSA'
*/
'buildTarget',
/** @see dockerExecute */
'dockerImage',
/** The path to the extension descriptor file.*/
'extension',
/**
* The location of the SAP Multitarget Application Archive Builder jar file, including file name and extension.
* If it is not provided, the SAP Multitarget Application Archive Builder is expected on PATH.
*/
'mtaJarLocation'
]
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS.plus([
/** @see dockerExecute */
'dockerOptions'
])
/**
* Executes the SAP Multitarget Application Archive Builder to create an mtar archive of the application.
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
handlePipelineStepErrors(stepName: STEP_NAME, stepParameters: parameters) {

View File

@ -1,3 +1,4 @@
import com.sap.piper.GenerateDocumentation
import com.sap.piper.Utils
import groovy.transform.Field
@ -5,12 +6,26 @@ import groovy.transform.Field
@Field STEP_NAME = getClass().getName()
@Field Set PARAMETER_KEYS = [
/** The url to the git repository of the pipeline to be loaded.*/
'repoUrl',
/** The branch of the git repository from which the pipeline should be checked out.*/
'branch',
/** The path to the Jenkinsfile, inside the repository, to be loaded.*/
'path',
/** The Jenkins credentials containing user and password needed to access a private git repository.*/
'credentialsId'
]
/**
* pipelineExecute
* Load and executes a pipeline from another git repository.
* Loads and executes a pipeline from another git repository.
* The idea is to set up a pipeline job in Jenkins that loads a minimal pipeline, which
* in turn loads the shared library and then uses this step to load the actual pipeline.
*
* A centrally maintained pipeline script (Jenkinsfile) can be re-used by
* several projects using `pipelineExecute` as outlined in the example below.
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
node() {

View File

@ -5,7 +5,7 @@ import com.sap.piper.ConfigurationHelper
import groovy.transform.Field
@Field String STEP_NAME = getClass().getName()
@Field Set STEP_CONFIG_KEYS = ['runCheckmarx', 'stashIncludes', 'stashExcludes']
@Field Set STEP_CONFIG_KEYS = ['noDefaultExludes', 'stashIncludes', 'stashExcludes']
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS
void call(Map parameters = [:]) {
@ -28,9 +28,6 @@ void call(Map parameters = [:]) {
.mixinGeneralConfig(script.commonPipelineEnvironment, STEP_CONFIG_KEYS)
.mixinStepConfig(script.commonPipelineEnvironment, STEP_CONFIG_KEYS)
.mixinStageConfig(script.commonPipelineEnvironment, parameters.stageName?:env.STAGE_NAME, STEP_CONFIG_KEYS)
.mixin([
runCheckmarx: (script.commonPipelineEnvironment.configuration?.steps?.executeCheckmarxScan?.checkmarxProject != null && script.commonPipelineEnvironment.configuration.steps.executeCheckmarxScan.checkmarxProject.length()>0)
])
.mixin(parameters, PARAMETER_KEYS)
.use()
@ -40,27 +37,9 @@ void call(Map parameters = [:]) {
stepParam1: parameters?.script == null
], config)
// store files to be checked with checkmarx
if (config.runCheckmarx) {
utils.stash(
'checkmarx',
config.stashIncludes.checkmarx,
config.stashExcludes.checkmarx
)
config.stashIncludes.each {stashKey, stashIncludes ->
def useDefaultExcludes = !config.noDefaultExludes.contains(stashKey)
utils.stashWithMessage(stashKey, "[${STEP_NAME}] no files detected for stash '${stashKey}': ", stashIncludes, config.stashExcludes[stashKey]?:'', useDefaultExcludes)
}
utils.stashWithMessage(
'classFiles',
"[${STEP_NAME}] Failed to stash class files.",
config.stashIncludes.classFiles,
config.stashExcludes.classFiles
)
utils.stashWithMessage(
'sonar',
"[${STEP_NAME}] Failed to stash sonar files.",
config.stashIncludes.sonar,
config.stashExcludes.sonar
)
}
}

View File

@ -5,14 +5,14 @@ import com.sap.piper.ConfigurationHelper
import groovy.transform.Field
@Field String STEP_NAME = getClass().getName()
@Field Set STEP_CONFIG_KEYS = ['runOpaTests', 'stashIncludes', 'stashExcludes']
@Field Set STEP_CONFIG_KEYS = ['noDefaultExludes', 'stashIncludes', 'stashExcludes']
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS
void call(Map parameters = [:]) {
handlePipelineStepErrors (stepName: STEP_NAME, stepParameters: parameters, stepNameDoc: 'stashFiles') {
def utils = parameters.juStabUtils
Utils utils = parameters.juStabUtils
if (utils == null) {
utils = new Utils()
}
@ -21,9 +21,6 @@ void call(Map parameters = [:]) {
if (script == null)
script = this
//additional includes via passing e.g. stashIncludes: [opa5: '**/*.include']
//additional excludes via passing e.g. stashExcludes: [opa5: '**/*.exclude']
Map config = ConfigurationHelper.newInstance(this)
.loadStepDefaults()
.mixinGeneralConfig(script.commonPipelineEnvironment, STEP_CONFIG_KEYS)
@ -38,59 +35,9 @@ void call(Map parameters = [:]) {
stepParam1: parameters?.script == null
], config)
if (config.runOpaTests){
utils.stash('opa5', config.stashIncludes?.get('opa5')?config.stashIncludes.opa5:'**/*.*', config.stashExcludes?.get('opa5')?config.stashExcludes.opa5:'')
config.stashIncludes.each {stashKey, stashIncludes ->
def useDefaultExcludes = !config.noDefaultExludes.contains(stashKey)
utils.stashWithMessage(stashKey, "[${STEP_NAME}] no files detected for stash '${stashKey}': ", stashIncludes, config.stashExcludes[stashKey]?:'', useDefaultExcludes)
}
//store build descriptor files depending on technology, e.g. pom.xml, package.json
utils.stash(
'buildDescriptor',
config.stashIncludes.buildDescriptor,
config.stashExcludes.buildDescriptor
)
//store deployment descriptor files depending on technology, e.g. *.mtaext.yml
utils.stashWithMessage(
'deployDescriptor',
"[${STEP_NAME}] no deployment descriptor files provided: ",
config.stashIncludes.deployDescriptor,
config.stashExcludes.deployDescriptor
)
//store git metadata for SourceClear agent
sh "mkdir -p gitmetadata"
sh "cp -rf .git/* gitmetadata"
sh "chmod -R u+w gitmetadata"
utils.stashWithMessage(
'git',
"[${STEP_NAME}] no git repo files detected: ",
config.stashIncludes.git,
config.stashExcludes.git
)
//store nsp & retire exclusion file for future use
utils.stashWithMessage(
'opensourceConfiguration',
"[${STEP_NAME}] no opensourceConfiguration files provided: ",
config.stashIncludes.get('opensourceConfiguration'),
config.stashExcludes.get('opensourceConfiguration')
)
//store pipeline configuration including additional groovy test scripts for future use
utils.stashWithMessage(
'pipelineConfigAndTests',
"[${STEP_NAME}] no pipeline configuration and test files found: ",
config.stashIncludes.pipelineConfigAndTests,
config.stashExcludes.pipelineConfigAndTests
)
utils.stashWithMessage(
'securityDescriptor',
"[${STEP_NAME}] no security descriptor found: ",
config.stashIncludes.securityDescriptor,
config.stashExcludes.securityDescriptor
)
//store files required for tests, e.g. Gauge, SUT, ...
utils.stashWithMessage(
'tests',
"[${STEP_NAME}] no files for tests provided: ",
config.stashIncludes.tests,
config.stashExcludes.tests
)
}
}