
Merge branch 'master' into publishCheckResults3

This commit is contained in:
Christopher Fenner 2018-01-30 11:15:26 +01:00
commit 2d2cc3a893
38 changed files with 1679 additions and 592 deletions

View File

@ -1,7 +1,7 @@
# ConfigurationMerger
## Description
A helper script that can merge the configurations from multiple sources.
## Static Method Details
@ -10,8 +10,8 @@ A helper script that can merge the configurations from multiple sources.
#### Description
A step is usually configured by default values, configuration values from the configuration file and the parameters.
The method can merge these sources.
Default values are overwritten by configuration file values.
These are overwritten by parameters.
#### Parameters
@ -25,9 +25,9 @@ These are overwritten by parameters.
| `defaults` | yes | Map |
* `parameters` Parameters map given to the step
* `parameterKeys` List of parameter names (keys) that should be considered while merging.
* `configurationMap` Configuration map loaded from the configuration file.
* `configurationKeys` List of configuration keys that should be considered while merging.
* `defaults` Map of default values, e.g. loaded from the default value configuration file.
#### Side effects
@ -62,3 +62,62 @@ List stepConfigurationKeys = [
Map configuration = ConfigurationMerger.merge(parameters, parameterKeys, stepConfiguration, stepConfigurationKeys, stepDefaults)
```
### mergeWithPipelineData
#### Description
A step is usually configured by default values, configuration values from the configuration file and the parameters.
In certain cases also information previously generated in the pipeline should be mixed in, like for example an artifactVersion created earlier.
The method can merge these sources.
Default values are overwritten by configuration file values.
Those are overwritten by information previously generated in the pipeline (e.g. stored in [commonPipelineEnvironment](../steps/commonPipelineEnvironment.md)).
These are overwritten by parameters passed directly to the step.
#### Parameters
| parameter | mandatory | Class |
| -------------------|-----------|-----------------------------------|
| `parameters` | yes | Map |
| `parameterKeys` | yes | List |
| `pipelineDataMap` | yes | Map |
| `configurationMap` | yes | Map |
| `configurationKeys`| yes | List |
| `defaults` | yes | Map |
* `parameters` Parameters map given to the step
* `parameterKeys` List of parameter names (keys) that should be considered while merging.
* `configurationMap` Configuration map loaded from the configuration file.
* `pipelineDataMap` Values available to the step during pipeline run.
* `configurationKeys` List of configuration keys that should be considered while merging.
* `defaults` Map of default values, e.g. loaded from the default value configuration file.
#### Side effects
none
#### Example
```groovy
def stepName = 'influxWriteData'
prepareDefaultValues script: script
final Map stepDefaults = ConfigurationLoader.defaultStepConfiguration(script, stepName)
final Map stepConfiguration = ConfigurationLoader.stepConfiguration(script, stepName)
final Map generalConfiguration = ConfigurationLoader.generalConfiguration(script)
List parameterKeys = [
'artifactVersion',
'influxServer',
'influxPrefix'
]
Map pipelineDataMap = [
artifactVersion: commonPipelineEnvironment.getArtifactVersion()
]
List stepConfigurationKeys = [
'influxServer',
'influxPrefix'
]
Map configuration = ConfigurationMerger.mergeWithPipelineData(parameters, parameterKeys, pipelineDataMap, stepConfiguration, stepConfigurationKeys, stepDefaults)
```
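For illustration, a minimal sketch of the resulting precedence; the map literals below are made up for the example:
```groovy
Map stepDefaults      = [influxServer: 'defaultInflux', influxPrefix: null]
Map stepConfiguration = [influxServer: 'configuredInflux']
Map pipelineDataMap   = [artifactVersion: '1.2.3']
Map parameters        = [influxPrefix: 'myPrefix']

Map merged = ConfigurationMerger.mergeWithPipelineData(
    parameters, ['influxPrefix'],
    pipelineDataMap,
    stepConfiguration, ['influxServer'],
    stepDefaults)

// merged == [influxServer: 'configuredInflux', influxPrefix: 'myPrefix', artifactVersion: '1.2.3']
```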

View File

@ -0,0 +1,30 @@
# JsonUtils
## Description
Provides json related utility functions.
## Constructors
### JsonUtils()
Default no-argument constructor. Instances of the `JsonUtils` class do not hold any instance-specific state.
#### Example
```groovy
new JsonUtils()
```
## Method Details
### getPrettyJsonString(object)
#### Description
Creates a pretty-printed json string.
#### Parameters
* `object` - An object (e.g. a Map or List).
#### Return value
A pretty printed `String`.
#### Side effects
none
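#### Example
A short usage sketch; the map literal is made up for illustration:
```groovy
def prettyJson = new JsonUtils().getPrettyJsonString([name: 'myApp', version: '1.2.3'])
echo prettyJson
```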

View File

@ -24,7 +24,7 @@ Retrieves the parameter value for parameter `paramName` from parameter map `map`
#### Parameters
* `map` - A map containing configuration parameters.
* `paramName` - The key of the parameter which should be looked up.
* optional: `defaultValue` - The value which is returned in case there is no parameter with key `paramName` contained in `map`. If it is not provided the default is `null`.
#### Return value
The value of the parameter to be retrieved, or the default value if the former is `null` (either because there is no such key or because the key is associated with the value `null`). If the parameter is not defined or its value is `null` and no default value is provided, an exception is thrown.
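#### Example
A minimal sketch, assuming the method is called on a `com.sap.piper.Utils` instance; the map and keys below are made up:
```groovy
def utils = new com.sap.piper.Utils()
Map config = [dockerImage: 'maven:3.5-jdk-7']

def image = utils.getMandatoryParameter(config, 'dockerImage')            // returns 'maven:3.5-jdk-7'
def goals = utils.getMandatoryParameter(config, 'goals', 'clean install') // key missing -> returns the default
```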

View File

@ -6,9 +6,52 @@ Provides project specific settings.
## Prerequisites
none
## Method details
### getArtifactVersion()
#### Description
Returns the version of the artifact which is built in the pipeline.
#### Parameters
none
#### Return value
A `String` containing the version.
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
def myVersion = commonPipelineEnvironment.getArtifactVersion()
```
### setArtifactVersion(version)
#### Description
Sets the version of the artifact which is built in the pipeline.
#### Parameters
* `version` - The artifact version to be set.
#### Return value
none
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
commonPipelineEnvironment.setArtifactVersion('1.2.3')
```
### getConfigProperties()
#### Description
@ -102,6 +145,53 @@ none
commonPipelineEnvironment.setConfigProperty('DEPLOY_HOST', 'my-deploy-host.com')
```
### getInfluxCustomData()
#### Description
Returns the Influx custom data which can be collected during pipeline run.
#### Parameters
none
#### Return value
A `Map` containing the data collected.
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
def myInfluxData = commonPipelineEnvironment.getInfluxCustomData()
```
### getInfluxCustomDataMap()
#### Description
Returns the Influx custom data map which can be collected during pipeline run.
It is used for example by step [`influxWriteData`](../steps/influxWriteData.md).
The data map is a map of maps, like `[pipeline_data: [:], my_measurement: [:]]`
Each map inside the map represents a dedicated measurement in the InfluxDB.
#### Parameters
none
#### Return value
A `Map` of `Map`s containing the data collected.
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
def myInfluxDataMap = commonPipelineEnvironment.getInfluxCustomDataMap()
```
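For illustration, a sketch of how a custom measurement could be filled during the pipeline run, assuming the returned map can be modified in place; the measurement and field names are made up:
```groovy
def influxDataMap = commonPipelineEnvironment.getInfluxCustomDataMap()
// create the measurement map if it does not exist yet and add a field to it
influxDataMap.my_measurement = influxDataMap.my_measurement ?: [:]
influxDataMap.my_measurement.my_field = 42
```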
### getMtarFileName()
@ -143,3 +233,50 @@ none
```groovy
commonPipelineEnvironment.setMtarFileName('path/to/foo.mtar')
```
### getPipelineMeasurement(measurementName)
#### Description
Returns the value of a specific pipeline measurement.
The measurements are collected with step [`durationMeasure`](../steps/durationMeasure.md).
#### Parameters
* `measurementName` - Name of the measurement
#### Return value
Value of the measurement
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
def myMeasurementValue = commonPipelineEnvironment.getPipelineMeasurement('build_stage_duration')
```
### setPipelineMeasurement(measurementName, value)
#### Description
**This is an internal function!**
Sets the value of a specific pipeline measurement.
Please use the step [`durationMeasure`](../steps/durationMeasure.md) in a pipeline, instead.
#### Parameters
* `measurementName` - Name of the measurement
* `value` - Value of the measurement
#### Return value
none
#### Side effects
none
#### Exceptions
none
#### Example
```groovy
commonPipelineEnvironment.setPipelineMeasurement('build_stage_duration', 2345)
```

View File

@ -0,0 +1,37 @@
# durationMeasure
## Description
This step is used to measure the duration of a set of steps, e.g. a certain stage.
The duration is stored in a Map. The measurement data can then be written to an Influx database using step [influxWriteData](influxWriteData.md).
!!! tip
Measuring for example the duration of pipeline stages helps to identify potential bottlenecks within the deployment pipeline.
This then helps to counter identified issues with respective optimization measures, e.g. parallelization of tests.
## Prerequisites
none
## Pipeline configuration
none
## Explanation of pipeline step
Usage of pipeline step:
```groovy
durationMeasure (script: this, measurementName: 'build_duration') {
//execute your build
}
```
Available parameters:
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
| script | no | empty `commonPipelineEnvironment` | |
| measurementName | no | test_duration | |
Details:
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for storing the measured duration.
* `measurementName` defines the name of the measurement which is written to the Influx database.
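The collected value can later be read back from the [`commonPipelineEnvironment`](commonPipelineEnvironment.md), for example (measurement name as in the snippet above):
```groovy
durationMeasure (script: this, measurementName: 'build_duration') {
    //execute your build
}
def buildDuration = commonPipelineEnvironment.getPipelineMeasurement('build_duration')
echo "build_duration: ${buildDuration}"
```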

View File

@ -0,0 +1,195 @@
# influxWriteData
## Description
Since your Continuous Delivery Pipeline in Jenkins provides your productive development and delivery infrastructure, you should monitor the pipeline to ensure it runs as expected. How to set up this monitoring is described in the following.
You basically need three components:
- The [InfluxDB Jenkins plugin](https://wiki.jenkins-ci.org/display/JENKINS/InfluxDB+Plugin) which allows you to send build metrics to InfluxDB servers
- The [InfluxDB](https://www.influxdata.com/time-series-platform/influxdb/) to store this data (Docker available)
- A [Grafana](http://grafana.org/) dashboard to visualize the data stored in InfluxDB (Docker available)
!!! note "no InfluxDB available?"
If you don't have an InfluxDB available yet, this step will still provide you some benefit.
It will create the following files for you and archive them into your build:
* `jenkins_data.json`: This file gives you build-specific information, like e.g. build result, stage where the build failed
* `pipeline_data.json`: This file gives you detailed information about your pipeline, e.g. stage durations, steps executed, ...
## Prerequisites
### Setting up InfluxDB with Grafana
The easiest way to start is to use the official Docker images.
You can run these Docker containers either on the same host as your Jenkins or on individual VMs (hosts).
A very basic setup can be done like this (with user "admin" and password "adminPwd" for both InfluxDB and Grafana):
docker run -d -p 8083:8083 -p 8086:8086 --restart=always --name influxdb -v /var/influx_data:/var/lib/influxdb influxdb
docker run -d -p 3000:3000 --name grafana --restart=always --link influxdb:influxdb -e "GF_SECURITY_ADMIN_PASSWORD=adminPwd" grafana/grafana
For more advanced setup please reach out to the respective documentation:
- https://hub.docker.com/_/influxdb/ (and https://github.com/docker-library/docs/tree/master/influxdb)
- https://hub.docker.com/r/grafana/grafana/ (and https://github.com/grafana/grafana-docker)
After you have started your InfluxDB Docker container you need to create a database:
- in a web browser, open the InfluxDB Web-UI using the following URL: <host of your docker>:8083 (port 8083 is used for access via the Web-UI; Jenkins uses port 8086 to access the DB)
- create a new DB (you need to provide the name of this DB to Jenkins later)
- create an admin user (you need to provide this user to Jenkins later)
!!! hint "With InfluxDB version 1.1 the InfluxDB Web-UI is deprecated"
You can perform the above steps via the command line:
- The following command will create a database with name <databasename>
`curl -i -XPOST http://localhost:8086/query --data-urlencode "q=CREATE DATABASE \<databasename\>"`
- The admin user with the name &lt;adminusername&gt; and the password &lt;adminuserpwd&gt; can be created with
`curl -i -XPOST http://localhost:8086/query --data-urlencode "q=CREATE USER \<adminusername\> WITH PASSWORD '\<adminuserpwd\>' WITH ALL PRIVILEGES"`
Once both Docker containers are started and InfluxDB and Grafana are running, you need to configure the Jenkins plugin according to your settings.
## Pipeline configuration
To setup your Jenkins you need to do two configuration steps:
1. Configure Jenkins (via Manage Jenkins)
2. Adapt pipeline configuration
### Configure Jenkins
Once the plugin is available in your Jenkins:
* go to "Manage Jenkins" > "Configure System" > scroll down to section "influxdb target"
* maintain Influx data
!!! note "Jenkins as a Service"
For Jenkins as a Service instances this is already preset to the local InfluxDB with the name `jenkins`. In this case there is no need to do any additional configuration.
### Adapt pipeline configuration
You need to define the InfluxDB server in your pipeline as it is defined in the InfluxDB plugin configuration (see above).
```properties
influxDBServer=jenkins
```
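If you use the library's step configuration instead, the same value can be set there; a sketch mirroring the default configuration added with this commit:
```
steps:
  influxWriteData:
    influxServer: 'jenkins'
```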
## Explanation of pipeline step
Example usage of pipeline step:
```groovy
influxWriteData script: this
```
Available parameters:
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
| script | no | empty `commonPipelineEnvironment` | |
| artifactVersion | yes | commonPipelineEnvironment.getArtifactVersion() | |
| influxServer | no | `jenkins` | |
| influxPrefix | no | `null` | |
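A usage sketch with the parameters set explicitly; the server name and prefix are made up:
```groovy
influxWriteData script: this, influxServer: 'jenkins', influxPrefix: 'myProject'
```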
## Work with InfluxDB and Grafana
You can access your **Grafana** via Web-UI: &lt;host of your grafana(-docker)&gt;:3000
(or another port in case you have defined another one when starting your docker)
As a first step you need to add your InfluxDB as Data source to your Grafana:
- Login as user admin (PW as defined when starting your docker)
- in the navigation go to data sources -> add data source:
- name
- type: InfluxDB
- Url: \http://&lt;host of your InfluxDB server&gt;:&lt;port&gt;
- Access: direct (not via proxy)
- database: &lt;name of the DB as specified above&gt;
- User: &lt;name of the admin user as specified in step above&gt;
- Password: &lt;password of the admin user as specified in step above&gt;
!!! note "Jenkins as a Service"
For Jenkins as a Service the data source configuration is already available.
Therefore there is no need to go through the data source configuration step unless you want to add additional data sources.
## Data collected in InfluxDB
The Influx plugin collects the following data in the Piper context:
* All data as per default [InfluxDB plugin capabilities](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin)
* Additional data collected via `commonPipelineEnvironment.setInfluxCustomDataProperty()` and via `commonPipelineEnvironment.setPipelineMeasurement()`
!!! note "Add custom information to your InfluxDB"
You can simply add custom data collected during your pipeline runs via available data objects.
Example:
```groovy
//add data to measurement jenkins_custom_data - value can be a String or a Number
commonPipelineEnvironment.setInfluxCustomDataProperty('myProperty', 2018)
```
### Collected InfluxDB measurements
Measurements are potentially prefixed - see parameter `influxPrefix` above.
| Measurement name | data column | description |
| ---------------- | ----------- | ----------- |
| **All measurements** |<ul><li>build_number</li><li>project_name</li></ul>| All below measurements will have these columns. <br />Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin)|
| jenkins_data | <ul><li>build_result</li><li>build_time</li><li>last_successful_build</li><li>tests_failed</li><li>tests_skipped</li><li>tests_total</li><li>...</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin)|
| cobertura_data | <ul><li>cobertura_branch_coverage_rate</li><li>cobertura_class_coverage_rate</li><li>cobertura_line_coverage_rate</li><li>cobertura_package_coverage_rate</li><li>...</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin) |
| jacoco_data | <ul><li>jacoco_branch_coverage_rate</li><li>jacoco_class_coverage_rate</li><li>jacoco_instruction_coverage_rate</li><li>jacoco_line_coverage_rate</li><li>jacoco_method_coverage_rate</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin) |
| performance_data | <ul><li>90Percentile</li><li>average</li><li>max</li><li>median</li><li>min</li><li>error_count</li><li>error_percent</li><li>...</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin) |
| sonarqube_data | <ul><li>blocker_issues</li><li>critical_issues</li><li>info_issues</li><li>major_issues</li><li>minor_issues</li><li>lines_of_code</li><li>...</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin) |
| jenkins_custom_data | Piper fills the following columns by default: <br /><ul><li>build_result</li><li>build_result_key</li><li>build_step (->step in case of error)</li><li>build_error (->error message in case of error)</li></ul> | filled by `commonPipelineEnvironment.setInfluxCustomDataProperty()` |
| pipeline_data | Examples from the Piper templates:<br /><ul><li>build_duration</li><li>opa_duration</li><li>deploy_test_duration</li><li>deploy_test_duration</li><li>fortify_duration</li><li>release_duration</li><li>...</li></ul>| filled by step [`measureDuration`](durationMeasure.md) using parameter `measurementName`|
| step_data | Considered, e.g.:<br /><ul><li>build_quality (Milestone/Release)</li><li>build_url</li><li>bats</li><li>checkmarx</li><li>fortify</li><li>gauge</li><li>nsp</li><li>opa</li><li>opensourcedependency</li><li>ppms</li><li>jmeter</li><li>supa</li><li>snyk</li><li>sonar</li><li>sourceclear</li><li>uiveri5</li><li>vulas</li><li>whitesource</li><li>traceability</li><li>...</li><li>xmakestage</li><li>xmakepromote</li></ul>| filled by `commonPipelineEnvironment.setInfluxStepData()` |
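For illustration, a sketch of how `step_data` could be filled; the exact signature of `setInfluxStepData()` is an assumption here, mirroring `setInfluxCustomDataProperty()` above:
```groovy
//mark the sonar step as executed successfully - key and value are assumed for the example
commonPipelineEnvironment.setInfluxStepData('sonar', true)
```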
### Examples for InfluxDB queries which can be used in Grafana
!!! caution "Project Names containing dashes (-)"
The InfluxDB plugin replaces dashes (-) with underscores (\_).
Please keep this in mind when specifying your project_name for an InfluxDB query.
#### Example 1: Select last 10 successful builds
```
select top(build_number,10), build_result from jenkins_data WHERE build_result = 'SUCCESS'
```
#### Example 2: Select last 10 step names of failed builds
```
select top(build_number,10), build_result, build_step from jenkins_custom_data WHERE build_result = 'FAILURE'
```
#### Example 3: Select build duration of step for a specific project
```
select build_duration / 1000 from "pipeline_data" WHERE project_name='PiperTestOrg_piper_test_master'
```
#### Example 4: Get transparency about successful/failed steps for a specific project
```
select top(build_number,10) AS "Build", build_url, build_quality, fortify, gauge, vulas, opa from step_data WHERE project_name='PiperTestOrg_piper_test_master'
```
!!! note
With this query you can create transparency about which steps ran successfully / not successfully in your pipeline and which ones were not executed at all.
By specifying all the steps you consider relevant in your select statement it is very easy to create this transparency.

View File

@ -2,36 +2,71 @@
## Description
Deploys an Application to SAP Cloud Platform (SAP CP) using the SAP Cloud Platform Console Client (Neo Java Web SDK).
## Prerequisites
* **SAP CP account** - the account to where the application is deployed.
* **SAP CP user for deployment** - a user with deployment permissions in the given account.
* **Jenkins credentials for deployment** - must be configured in Jenkins credentials with a dedicated Id.
![Jenkins credentials configuration](../images/neo_credentials.png)
* **Neo Java Web SDK** - can be downloaded from [Maven Central](http://central.maven.org/maven2/com/sap/cloud/neo-java-web-sdk/). The Neo Java Web SDK
needs to be extracted into the folder provided by `neoHome`. In case this parameter is not provided and there is no `NEO_HOME` variable in the environment,
`<neoRoot>/tools` needs to be in the `PATH`. This step is also capable of triggering the neo deploy tool provided inside a docker image.
* **Java 8 or higher** - needed by the *Neo-Java-Web-SDK*
## Parameters
| parameter | mandatory | default | possible values |
| -------------------|-----------|----------------------------------------------------------------------------------------------|-----------------|
| `script` | yes | | |
| `archivePath` | yes | | |
| `deployHost` | no | `'DEPLOY_HOST'` from `commonPipelineEnvironment` | |
| `deployAccount` | no | `'CI_DEPLOY_ACCOUNT'` from `commonPipelineEnvironment` | |
| `neoCredentialsId` | no | `'CI_CREDENTIALS_ID'` | |
| `neoHome` | no | Environment is checked for `NEO_HOME`, <br>otherwise the neo toolset is expected in the path | |
## Parameters when using MTA deployment method (default - MTA)
| parameter | mandatory | default | possible values |
| -------------------|-----------|----------------------------------------------------------------------------------------------------------------------------------|-----------------|
| `deployMode` | yes | `'MTA'` | `'MTA'`, `'WAR_PARAMS'`, `'WAR_PROPERTIESFILE'` |
| `script` | yes | | |
| `archivePath` | yes | | |
| `deployHost`        | no        | `'host'` from step configuration `'neoDeploy'`, or property `'DEPLOY_HOST'` from `commonPipelineEnvironment` (deprecated)          | |
| `deployAccount`     | no        | `'account'` from step configuration `'neoDeploy'`, or property `'CI_DEPLOY_ACCOUNT'` from `commonPipelineEnvironment` (deprecated) | |
| `neoCredentialsId` | no | `'neoCredentialsId'` from step configuration `'neoDeploy'` or hard coded value `'CI_CREDENTIALS_ID'` | |
| `neoHome` | no | Environment is checked for `NEO_HOME`, <br>otherwise the neo toolset is expected in the path | |
## Parameters when using WAR file deployment method with .properties file (WAR_PROPERTIESFILE)
| parameter | mandatory | default | possible values |
| -------------------|-----------|----------------------------------------------------------------------------------------------|-------------------------------------------------|
| `deployMode` | yes | `'MTA'` | `'MTA'`, `'WAR_PARAMS'`, `'WAR_PROPERTIESFILE'` |
| `warAction` | yes | `'deploy'` | `'deploy'`, `'rolling-update'` |
| `script` | yes | | |
| `archivePath` | yes | | |
| `neoCredentialsId` | no | `'CI_CREDENTIALS_ID'` | |
| `neoHome` | no | Environment is checked for `NEO_HOME`, <br>otherwise the neo toolset is expected in the path | |
| `propertiesFile` | yes | | |
## Parameters when using WAR file deployment method without .properties file - with parameters (WAR_PARAMS)
| parameter | mandatory | default | possible values |
| -------------------|-----------|----------------------------------------------------------------------------------------------|-------------------------------------------------|
| `deployMode` | yes | `'MTA'` | `'MTA'`, `'WAR_PARAMS'`, `'WAR_PROPERTIESFILE'` |
| `warAction` | yes | `'deploy'` | `'deploy'`, `'rolling-update'` |
| `script` | yes | | |
| `archivePath` | yes | | |
| `deployHost` | no | `'DEPLOY_HOST'` from `commonPipelineEnvironment` | |
| `deployAccount` | no | `'CI_DEPLOY_ACCOUNT'` from `commonPipelineEnvironment` | |
| `neoCredentialsId` | no | `'CI_CREDENTIALS_ID'` | |
| `neoHome` | no | Environment is checked for `NEO_HOME`, <br>otherwise the neo toolset is expected in the path | |
| `applicationName` | yes | | |
| `runtime` | yes | | |
| `runtime-version` | yes | | |
| `size` | no | `'lite'` | `'lite'`, `'pro'`, `'prem'`, `'prem-plus'` |
* `deployMode` - The deployment mode which should be used. Available options are `'MTA'` (default), `'WAR_PARAMS'` (deploying WAR file and passing all the deployment parameters via the function call) and `'WAR_PROPERTIESFILE'` (deploying WAR file and putting all the deployment parameters in a .properties file)
* `script` - The common script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for retrieving e.g. configuration parameters.
* `archivePath` - The path to the archive for deployment to SAP CP.
* `deployHost` - The SAP Cloud Platform host to deploy to.
* `deployAccount` - The SAP Cloud Platform account to deploy to.
* `neoCredentialsId` - The Jenkins credentials containing user and password used for SAP CP deployment.
* `neoHome` - The path to the `neo-java-web-sdk` tool used for SAP CP deployment. If no parameter is provided, the path is retrieved from the Jenkins environment variables using `env.NEO_HOME`. If this Jenkins environment variable is not set it is assumed that the tool is available in the `PATH`.
* `propertiesFile` - The path to the .properties file in which all necessary deployment properties for the application are defined.
* `warAction` - Action mode when using WAR file mode. Available options are `deploy` (default) and `rolling-update`, which performs an update of an application without downtime.
* `applicationName` - Name of the application you want to manage, configure, or deploy.
* `runtime` - Name of the SAP Cloud Platform application runtime.
* `runtime-version` - Version of the SAP Cloud Platform application runtime.
* `size` - Compute unit (VM) size. Acceptable values: lite, pro, prem, prem-plus.
## Return value
none
@ -42,6 +77,10 @@ none
## Exceptions
* `Exception`:
* If `archivePath` is not provided.
* If `propertiesFile` is not provided (when using `'WAR_PROPERTIESFILE'` deployment mode).
* If `applicationName` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtime` is not provided (when using `'WAR_PARAMS'` deployment mode).
* If `runtime-version` is not provided (when using `'WAR_PARAMS'` deployment mode).
* `AbortException`:
* If neo-java-web-sdk is not installed, or `neoHome` is wrong.
* If `deployHost` is wrong.
@ -53,3 +92,14 @@ none
```groovy
neoDeploy script: this, archivePath: 'path/to/archiveFile.mtar', credentialsId: 'my-credentials-id'
```
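A sketch of a WAR file deployment passing the parameters directly; the values are made up and the parameter spelling (`deployMode: 'warParams'`, `runtimeVersion`, `vmSize`) follows the tests added with this commit:
```groovy
neoDeploy script: this,
          archivePath: 'path/to/archiveFile.war',
          deployMode: 'warParams',
          warAction: 'deploy',
          applicationName: 'testApp',
          runtime: 'neo-javaee6-wp',
          runtimeVersion: '2.125',
          vmSize: 'lite',
          neoCredentialsId: 'my-credentials-id'
```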
Example configuration:
```
steps:
<...>
neoDeploy:
account: <myDeployAccount>
host: hana.example.org
```

View File

@ -3,17 +3,20 @@ pages:
- Home: index.md
- 'Library steps':
- commonPipelineEnvironment: steps/commonPipelineEnvironment.md
- dockerExecute: steps/dockerExecute.md
- durationMeasure: steps/durationMeasure.md
- handlePipelineStepErrors: steps/handlePipelineStepErrors.md
- influxWriteData: steps/influxWriteData.md
- mavenExecute: steps/mavenExecute.md
- mtaBuild: steps/mtaBuild.md
- neoDeploy: steps/neoDeploy.md
- pipelineExecute: steps/pipelineExecute.md
- prepareDefaultValues: steps/prepareDefaultValues.md
- setupCommonPipelineEnvironment: steps/setupCommonPipelineEnvironment.md
- toolValidate: steps/toolValidate.md
- 'Library scripts':
- FileUtils: scripts/fileUtils.md
- JsonUtils: scripts/jsonUtils.md
- Utils: scripts/utils.md
- Version: scripts/version.md
- ConfigurationLoader: scripts/configurationLoader.md

View File

@ -10,7 +10,7 @@
<modelVersion>4.0.0</modelVersion>
<groupId>com.sap.cp.jenkins</groupId>
<artifactId>jenkins-library</artifactId>
<version>0.0.1</version>
<version>0.2</version>
<name>SAP CP Piper Library</name>
<description>Shared library containing steps and utilities to set up continuous deployment processes for SAP technologies.</description>

View File

@ -6,3 +6,10 @@ general:
steps:
mavenExecute:
dockerImage: 'maven:3.5-jdk-7'
influxWriteData:
influxServer: 'jenkins'
neoDeploy:
deployMode: 'mta'
warAction: 'deploy'
vmSize: 'lite'
neoCredentialsId: 'CI_CREDENTIALS_ID'

View File

@ -20,6 +20,21 @@ class ConfigurationMerger {
return merged
}
@NonCPS
def static mergeWithPipelineData(Map parameters, List parameterKeys,
Map pipelineDataMap,
Map configurationMap, List configurationKeys,
Map stepDefaults=[:]
){
Map merged = [:]
merged.putAll(stepDefaults)
merged.putAll(filterByKeyAndNull(configurationMap, configurationKeys))
merged.putAll(pipelineDataMap)
merged.putAll(filterByKeyAndNull(parameters, parameterKeys))
return merged
}
@NonCPS
private static filterByKeyAndNull(Map map, List keys) {
Map filteredMap = map.findAll {

View File

@ -0,0 +1,8 @@
package com.sap.piper
import com.cloudbees.groovy.cps.NonCPS
@NonCPS
def getPrettyJsonString(object) {
return groovy.json.JsonOutput.prettyPrint(groovy.json.JsonOutput.toJson(object))
}

View File

@ -3,7 +3,7 @@ package com.sap.piper
import com.cloudbees.groovy.cps.NonCPS
@NonCPS
def getMandatoryParameter(Map map, paramName, defaultValue) {
def getMandatoryParameter(Map map, paramName, defaultValue = null) {
def paramValue = map[paramName]

View File

@ -1,61 +1,97 @@
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.RuleChain
import com.lesfurets.jenkins.unit.BasePipelineTest
import util.JenkinsConfigRule
import util.JenkinsLoggingRule
import util.JenkinsSetupRule
import static org.junit.Assert.assertEquals
import static org.junit.Assert.assertTrue
import static org.junit.Assert.assertFalse
class DockerExecuteTest extends PiperTestBase {
class DockerExecuteTest extends BasePipelineTest {
private DockerMock docker
String echos
private JenkinsLoggingRule jlr = new JenkinsLoggingRule(this)
@Rule
public RuleChain ruleChain = RuleChain.outerRule(new JenkinsSetupRule(this))
.around(jlr)
.around(new JenkinsConfigRule(this))
int whichDockerReturnValue = 0
@Before
void setUp() {
super.setUp()
def bodyExecuted
def cpe
def dockerExecuteScript;
@Before
void init() {
bodyExecuted = false
docker = new DockerMock()
binding.setVariable('docker', docker)
binding.setVariable('Jenkins', [instance: [pluginManager: [plugins: [new PluginMock()]]]])
echos = ''
helper.registerAllowedMethod("echo", [String.class], { String s -> echos += " $s" })
helper.registerAllowedMethod('sh', [Map.class], {return whichDockerReturnValue})
cpe = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
dockerExecuteScript = loadScript('dockerExecute.groovy').dockerExecute
}
@Test
void testExecuteInsideDocker() throws Exception {
def script = loadScript("test/resources/pipelines/dockerExecuteTest/executeInsideDocker.groovy")
script.execute()
dockerExecuteScript.call(script: [commonPipelineEnvironment: cpe],
dockerImage: 'maven:3.5-jdk-8-alpine') {
bodyExecuted = true
}
assertEquals('maven:3.5-jdk-8-alpine', docker.getImageName())
assertTrue(docker.isImagePulled())
assertEquals(' --env http_proxy --env https_proxy --env no_proxy --env HTTP_PROXY --env HTTPS_PROXY --env NO_PROXY', docker.getParameters())
assertTrue(echos.contains('Inside Docker'))
assertTrue(bodyExecuted)
}
@Test
void testExecuteInsideDockerWithParameters() throws Exception {
def script = loadScript("test/resources/pipelines/dockerExecuteTest/executeInsideDockerWithParameters.groovy")
script.execute()
dockerExecuteScript.call(script: [commonPipelineEnvironment: cpe],
dockerImage: 'maven:3.5-jdk-8-alpine',
dockerOptions: '-it',
dockerVolumeBind: ['my_vol': '/my_vol'],
dockerEnvVars: ['http_proxy': 'http://proxy:8000']) {
bodyExecuted = true
}
assertTrue(docker.getParameters().contains(' --env https_proxy '))
assertTrue(docker.getParameters().contains(' --env http_proxy=http://proxy:8000'))
assertTrue(docker.getParameters().contains(' -it'))
assertTrue(docker.getParameters().contains(' --volume my_vol:/my_vol'))
assertTrue(bodyExecuted)
}
@Test
void testDockerNotInstalledResultsInLocalExecution() throws Exception {
whichDockerReturnValue = 1
def script = loadScript("test/resources/pipelines/dockerExecuteTest/executeInsideDockerWithParameters.groovy")
script.execute()
assertTrue(echos.contains('No docker environment found'))
assertTrue(echos.contains('Running on local environment'))
dockerExecuteScript.call(script: [commonPipelineEnvironment: cpe],
dockerImage: 'maven:3.5-jdk-8-alpine',
dockerOptions: '-it',
dockerVolumeBind: ['my_vol': '/my_vol'],
dockerEnvVars: ['http_proxy': 'http://proxy:8000']) {
bodyExecuted = true
}
assertTrue(jlr.log.contains('No docker environment found'))
assertTrue(jlr.log.contains('Running on local environment'))
assertTrue(bodyExecuted)
assertFalse(docker.isImagePulled())
}
@ -99,5 +135,4 @@ class DockerExecuteTest extends PiperTestBase {
return true
}
}
}

View File

@ -0,0 +1,25 @@
#!groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Rule
import org.junit.Test
import util.JenkinsSetupRule
import static org.junit.Assert.assertTrue
class DurationMeasureTest extends BasePipelineTest {
@Rule
public JenkinsSetupRule setupRule = new JenkinsSetupRule(this)
@Test
void testDurationMeasurement() throws Exception {
def cpe = loadScript("commonPipelineEnvironment.groovy").commonPipelineEnvironment
def script = loadScript("durationMeasure.groovy")
def bodyExecuted = false
script.call(script: [commonPipelineEnvironment: cpe], measurementName: 'test') {
bodyExecuted = true
}
assertTrue(cpe.getPipelineMeasurement('test') != null)
assertTrue(bodyExecuted)
assertJobStatusSuccess()
}
}

View File

@ -0,0 +1,107 @@
#!groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import com.sap.piper.DefaultValueCache
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.RuleChain
import util.JenkinsLoggingRule
import util.JenkinsSetupRule
import static org.junit.Assert.assertTrue
import static org.junit.Assert.assertEquals
class InfluxWriteDataTest extends BasePipelineTest {
Script influxWriteDataScript
Map fileMap = [:]
Map stepMap = [:]
String echoLog = ''
def cpe
public JenkinsSetupRule setupRule = new JenkinsSetupRule(this)
public JenkinsLoggingRule loggingRule = new JenkinsLoggingRule(this)
@Rule
public RuleChain ruleChain =
RuleChain.outerRule(setupRule)
.around(loggingRule)
@Before
void init() throws Exception {
//
// Currently we have dependencies between the tests since
// DefaultValueCache is a singleton which keeps its status
// for all the tests. Depending on the test order we fail.
// As long as this status remains we need:
DefaultValueCache.reset()
//reset stepMap
stepMap = [:]
//reset fileMap
fileMap = [:]
helper.registerAllowedMethod('readYaml', [Map.class], { map ->
return [
general: [productiveBranch: 'develop'],
steps : [influxWriteData: [influxServer: 'testInflux']]
]
})
helper.registerAllowedMethod('writeFile', [Map.class],{m -> fileMap[m.file] = m.text})
helper.registerAllowedMethod('step', [Map.class],{m -> stepMap = m})
cpe = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
influxWriteDataScript = loadScript("influxWriteData.groovy")
}
@Test
void testInfluxWriteDataWithDefault() throws Exception {
cpe.setArtifactVersion('1.2.3')
influxWriteDataScript.call(script: [commonPipelineEnvironment: cpe])
assertTrue(loggingRule.log.contains('Artifact version: 1.2.3'))
assertEquals('testInflux', stepMap.selectedTarget)
assertEquals(null, stepMap.customPrefix)
assertEquals([:], stepMap.customData)
assertEquals([pipeline_data:[:]], stepMap.customDataMap)
assertTrue(fileMap.containsKey('jenkins_data.json'))
assertTrue(fileMap.containsKey('pipeline_data.json'))
assertJobStatusSuccess()
}
@Test
void testInfluxWriteDataNoInflux() throws Exception {
cpe.setArtifactVersion('1.2.3')
influxWriteDataScript.call(script: [commonPipelineEnvironment: cpe], influxServer: '')
assertEquals(0, stepMap.size())
assertTrue(fileMap.containsKey('jenkins_data.json'))
assertTrue(fileMap.containsKey('pipeline_data.json'))
assertJobStatusSuccess()
}
@Test
void testInfluxWriteDataNoArtifactVersion() throws Exception {
influxWriteDataScript.call(script: [commonPipelineEnvironment: cpe])
assertEquals(0, stepMap.size())
assertEquals(0, fileMap.size())
assertTrue(loggingRule.log.contains('no artifact version available -> exiting writeInflux without writing data'))
assertJobStatusSuccess()
}
}

View File

@ -4,27 +4,43 @@ import org.jenkinsci.plugins.pipeline.utility.steps.shaded.org.yaml.snakeyaml.pa
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.ExpectedException
import org.junit.rules.RuleChain
import org.junit.rules.TemporaryFolder
public class MTABuildTest extends PiperTestBase {
import com.lesfurets.jenkins.unit.BasePipelineTest
import util.JenkinsConfigRule
import util.JenkinsLoggingRule
import util.JenkinsSetupRule
import util.JenkinsShellCallRule
public class MTABuildTest extends BasePipelineTest {
private ExpectedException thrown = new ExpectedException()
private TemporaryFolder tmp = new TemporaryFolder()
private JenkinsLoggingRule jlr = new JenkinsLoggingRule(this)
private JenkinsShellCallRule jscr = new JenkinsShellCallRule(this)
@Rule
public ExpectedException thrown = new ExpectedException()
@Rule
public TemporaryFolder tmp = new TemporaryFolder()
public RuleChain ruleChain =
RuleChain.outerRule(thrown)
.around(tmp)
.around(new JenkinsSetupRule(this))
.around(jlr)
.around(jscr)
.around(new JenkinsConfigRule(this))
def currentDir
def otherDir
def mtaBuildShEnv
def mtaBuildScript
def cpe
@Before
void setUp() {
void init() {
super.setUp()
currentDir = tmp.newFolder().toURI().getPath()[0..-2] //omit final '/'
otherDir = tmp.newFolder().toURI().getPath()[0..-2] //omit final '/'
@ -49,6 +65,8 @@ public class MTABuildTest extends PiperTestBase {
binding.setVariable('JAVA_HOME', '/opt/java')
binding.setVariable('env', [:])
mtaBuildScript = loadScript("mtaBuild.groovy").mtaBuild
cpe = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
}
@ -59,17 +77,18 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << defaultMtaYaml()
def mtarFilePath = withPipeline(defaultPipeline()).execute()
def mtarFilePath = mtaBuildScript.call(script: [commonPipelineEnvironment: cpe],
buildTarget: 'NEO')
assert shellCalls[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert jscr.shell[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert shellCalls[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert jscr.shell[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert shellCalls[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert jscr.shell[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert mtarFilePath == "${currentDir}/com.mycompany.northwind.mtar"
assert messages[1] == "[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment."
assert jlr.log.contains( "[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment.")
}
@ -80,17 +99,20 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << defaultMtaYaml()
def mtarFilePath = withPipeline(returnMtarFilePathFromCommonPipelineEnvironmentPipeline()).execute()
mtaBuildScript.call(script: [commonPipelineEnvironment: cpe],
buildTarget: 'NEO')
assert shellCalls[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
def mtarFilePath = cpe.getMtarFilePath()
assert shellCalls[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert jscr.shell[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert shellCalls[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert jscr.shell[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert jscr.shell[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert mtarFilePath == "${currentDir}/com.mycompany.northwind.mtar"
assert messages[1] == "[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment."
assert jlr.log.contains("[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment.")
}
@ -100,20 +122,24 @@ public class MTABuildTest extends PiperTestBase {
binding.getVariable('env')['MTA_JAR_LOCATION'] = '/opt/mta'
def newDirName = 'newDir'
new File("${currentDir}/${newDirName}").mkdirs()
new File("${currentDir}/${newDirName}/mta.yaml") << defaultMtaYaml()
def newDir = new File("${currentDir}/${newDirName}")
def mtarFilePath = withPipeline(withSurroundingDirPipeline()).execute(newDirName)
newDir.mkdirs()
new File(newDir, 'mta.yaml') << defaultMtaYaml()
assert shellCalls[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/newDir\/mta.yaml"$/
helper.registerAllowedMethod('pwd', [], { newDir } )
assert shellCalls[1].contains("PATH=./node_modules/.bin:/usr/bin")
def mtarFilePath = mtaBuildScript.call(script: [commonPipelineEnvironment: cpe], buildTarget: 'NEO')
assert shellCalls[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert jscr.shell[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/newDir\/mta.yaml"$/
assert mtarFilePath == "${currentDir}/com.mycompany.northwind.mtar"
assert jscr.shell[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert messages[1] == "[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment."
assert jscr.shell[1].contains(' -jar /opt/mta/mta.jar --mtar ')
assert mtarFilePath == "${currentDir}/${newDirName}/com.mycompany.northwind.mtar"
assert jlr.log.contains("[mtaBuild] MTA JAR \"/opt/mta/mta.jar\" retrieved from environment.")
}
@Test
@ -121,17 +147,17 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << defaultMtaYaml()
def mtarFilePath = withPipeline(defaultPipeline()).execute()
def mtarFilePath = mtaBuildScript.call(script: [commonPipelineEnvironment: cpe], buildTarget: 'NEO')
assert shellCalls[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert jscr.shell[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert shellCalls[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert jscr.shell[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert shellCalls[1].contains(' -jar mta.jar --mtar ')
assert jscr.shell[1].contains(' -jar mta.jar --mtar ')
assert mtarFilePath == "${currentDir}/com.mycompany.northwind.mtar"
assert messages[1] == "[mtaBuild] Using MTA JAR from current working directory."
assert jlr.log.contains( "[mtaBuild] Using MTA JAR from current working directory." )
}
@ -140,17 +166,17 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << defaultMtaYaml()
def mtarFilePath = withPipeline(mtaJarLocationAsParameterPipeline()).execute()
def mtarFilePath = mtaBuildScript.call(mtaJarLocation: '/mylocation/mta', buildTarget: 'NEO')
assert shellCalls[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert jscr.shell[0] =~ /sed -ie "s\/\\\$\{timestamp\}\/`date \+%Y%m%d%H%M%S`\/g" ".*\/mta.yaml"$/
assert shellCalls[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert jscr.shell[1].contains("PATH=./node_modules/.bin:/usr/bin")
assert shellCalls[1].contains(' -jar /etc/mta/mta.jar --mtar ')
assert jscr.shell[1].contains(' -jar /mylocation/mta/mta.jar --mtar ')
assert mtarFilePath == "${currentDir}/com.mycompany.northwind.mtar"
assert messages[1] == "[mtaBuild] MTA JAR \"/etc/mta/mta.jar\" retrieved from parameters."
assert jlr.log.contains("[mtaBuild] MTA JAR \"/mylocation/mta/mta.jar\" retrieved from parameters.".toString())
}
@ -158,7 +184,8 @@ public class MTABuildTest extends PiperTestBase {
public void noMtaPresentTest(){
thrown.expect(FileNotFoundException)
withPipeline(defaultPipeline()).execute()
mtaBuildScript.call(script: [commonPipelineEnvironment: cpe],
buildTarget: 'NEO')
}
@ -169,7 +196,8 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << badMtaYaml()
withPipeline(defaultPipeline()).execute()
mtaBuildScript.call(script: [commonPipelineEnvironment: cpe],
buildTarget: 'NEO')
}
@ -180,7 +208,8 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << noIdMtaYaml()
withPipeline(defaultPipeline()).execute()
mtaBuildScript.call(script: [commonPipelineEnvironment: cpe],
buildTarget: 'NEO')
}
@ -191,74 +220,9 @@ public class MTABuildTest extends PiperTestBase {
new File("${currentDir}/mta.yaml") << defaultMtaYaml()
withPipeline(noBuildTargetPipeline()).execute()
mtaBuildScript.call(script: [commonPipelineEnvironment: cpe])
}
private defaultPipeline(){
return '''
@Library('piper-library-os')
execute(){
mtaBuild buildTarget: 'NEO'
}
return this
'''
}
private returnMtarFilePathFromCommonPipelineEnvironmentPipeline(){
return '''
@Library('piper-library-os')
execute(){
mtaBuild buildTarget: 'NEO'
return commonPipelineEnvironment.getMtarFilePath()
}
return this
'''
}
private mtaJarLocationAsParameterPipeline(){
return '''
@Library('piper-library-os')
execute(){
mtaBuild mtaJarLocation: '/etc/mta', buildTarget: 'NEO'
}
return this
'''
}
private withSurroundingDirPipeline(){
return '''
@Library('piper-library-os')
execute(dirPath){
dir("${dirPath}"){
mtaBuild buildTarget: 'NEO'
}
}
return this
'''
}
private noBuildTargetPipeline(){
return '''
@Library('piper-library-os')
execute(){
mtaBuild()
}
return this
'''
}
private defaultMtaYaml(){
return '''
_schema-version: "2.0.0"

View File

@ -1,20 +1,35 @@
import junit.framework.TestCase
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.RuleChain
import com.lesfurets.jenkins.unit.BasePipelineTest
import static org.junit.Assert.assertEquals
import static org.junit.Assert.assertTrue
class MavenExecuteTest extends PiperTestBase {
import util.JenkinsConfigRule
import util.JenkinsSetupRule
import util.JenkinsShellCallRule
class MavenExecuteTest extends BasePipelineTest {
Map dockerParameters
List shellCalls
private JenkinsShellCallRule jscr = new JenkinsShellCallRule(this)
@Rule
public RuleChain ruleChain = RuleChain.outerRule(new JenkinsSetupRule(this))
.around(jscr)
.around(new JenkinsConfigRule(this))
def mavenExecuteScript
def cpe
@Before
void setUp() {
super.setUp()
void init() {
shellCalls = []
dockerParameters = [:]
helper.registerAllowedMethod("dockerExecute", [Map.class, Closure.class],
@ -22,23 +37,35 @@ class MavenExecuteTest extends PiperTestBase {
dockerParameters = parameters
closure()
})
helper.registerAllowedMethod('sh', [String], { s -> shellCalls.add(s) })
mavenExecuteScript = loadScript("mavenExecute.groovy").mavenExecute
cpe = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
}
@Test
void testExecuteBasicMavenCommand() throws Exception {
def script = loadScript("test/resources/pipelines/mavenExecuteTest/executeBasicMavenCommand.groovy")
script.execute()
mavenExecuteScript.call(script: [commonPipelineEnvironment: cpe], goals: 'clean install')
assertEquals('maven:3.5-jdk-7', dockerParameters.dockerImage)
assertTrue(shellCalls.contains('mvn clean install'))
assert jscr.shell[0] == 'mvn clean install'
}
@Test
void testExecuteMavenCommandWithParameter() throws Exception {
def script = loadScript("test/resources/pipelines/mavenExecuteTest/executeMavenCommandWithParameters.groovy")
script.execute()
mavenExecuteScript.call(
script: [commonPipelineEnvironment: cpe],
dockerImage: 'maven:3.5-jdk-8-alpine',
goals: 'clean install',
globalSettingsFile: 'globalSettingsFile.xml',
projectSettingsFile: 'projectSettingsFile.xml',
pomPath: 'pom.xml',
flags: '-o',
m2Path: 'm2Path',
defines: '-Dmaven.tests.skip=true')
assertEquals('maven:3.5-jdk-8-alpine', dockerParameters.dockerImage)
String mvnCommand = "mvn --global-settings 'globalSettingsFile.xml' -Dmaven.repo.local='m2Path' --settings 'projectSettingsFile.xml' --file 'pom.xml' -o clean install -Dmaven.tests.skip=true"
assertTrue(shellCalls.contains(mvnCommand))
assertTrue(jscr.shell.contains(mvnCommand))
}
}

View File

@ -1,33 +1,55 @@
import hudson.AbortException
import org.junit.rules.TemporaryFolder
import static com.lesfurets.jenkins.unit.global.lib.LibraryConfiguration.library
import static ProjectSource.projectSource
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.ExpectedException
import org.junit.rules.RuleChain
class NeoDeploymentTest extends PiperTestBase {
import util.JenkinsConfigRule
import util.JenkinsLoggingRule
import util.JenkinsSetupRule
import util.JenkinsShellCallRule
class NeoDeploymentTest extends BasePipelineTest {
private ExpectedException thrown = new ExpectedException().none()
private TemporaryFolder tmp = new TemporaryFolder()
private JenkinsLoggingRule jlr = new JenkinsLoggingRule(this)
private JenkinsShellCallRule jscr = new JenkinsShellCallRule(this)
@Rule
public ExpectedException thrown = new ExpectedException().none()
public RuleChain ruleChain = RuleChain.outerRule(thrown)
.around(tmp)
.around(new JenkinsSetupRule(this))
.around(jlr)
.around(jscr)
.around(new JenkinsConfigRule(this))
@Rule
public TemporaryFolder tmp = new TemporaryFolder()
def workspacePath
def warArchiveName
def propertiesFileName
def archiveName
def archivePath
def neoDeployScript
def cpe
@Before
void setUp() {
void init() {
super.setUp()
archivePath = "${tmp.newFolder("workspace").toURI().getPath()}archiveName.mtar"
workspacePath = "${tmp.newFolder("workspace").toURI().getPath()}"
warArchiveName = 'warArchive.war'
propertiesFileName = 'config.properties'
archiveName = "archive.mtar"
helper.registerAllowedMethod('dockerExecute', [Map, Closure], null)
helper.registerAllowedMethod('error', [String], { s -> throw new AbortException(s) })
helper.registerAllowedMethod('fileExists', [String], { s -> return new File(workspacePath, s).exists() })
helper.registerAllowedMethod('usernamePassword', [Map], { m -> return m })
helper.registerAllowedMethod('withCredentials', [List, Closure], { l, c ->
if(l[0].credentialsId == 'myCredentialsId') {
@ -48,21 +70,78 @@ class NeoDeploymentTest extends PiperTestBase {
binding.setVariable('env', [:])
neoDeployScript = loadScript("neoDeploy.groovy").neoDeploy
cpe = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
}
@Test
void straightForwardTest() {
void straightForwardTestConfigViaConfigProperties() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
withPipeline(defaultPipeline()).execute(archivePath, 'myCredentialsId')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
assert shellCalls[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --user 'anonymous' --host 'test\.deploy\.host\.com' --source ".*" --account 'trialuser123' --password '\*\*\*\*\*\*\*\*' --synchronous/
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName,
neoCredentialsId: 'myCredentialsId'
)
assert messages[1] == "[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment."
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'anonymous' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
@Test
void straightForwardTestConfigViaConfiguration() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, archiveName) << "dummy archive"
cpe.configuration.put('steps', [neoDeploy: [host: 'test.deploy.host.com',
account: 'trialuser123']])
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName,
neoCredentialsId: 'myCredentialsId'
)
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'anonymous' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
@Test
void straightForwardTestConfigViaConfigurationAndViaConfigProperties() {
// configuration via the configuration framework supersedes.
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, archiveName) << "dummy archive"
cpe.setConfigProperty('DEPLOY_HOST', 'configProperties.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'configPropsUser123')
cpe.configuration.put('steps', [neoDeploy: [host: 'configuration-frwk.deploy.host.com',
account: 'configurationFrwkUser123']])
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName,
neoCredentialsId: 'myCredentialsId'
)
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --host 'configuration-frwk\.deploy\.host\.com' --account 'configurationFrwkUser123' --synchronous --user 'anonymous' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
@ -72,12 +151,18 @@ class NeoDeploymentTest extends PiperTestBase {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
thrown.expect(MissingPropertyException)
thrown.expectMessage('No such property: username')
withPipeline(defaultPipeline()).execute(archivePath, 'badCredentialsId')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName,
neoCredentialsId: 'badCredentialsId'
)
}
@ -86,40 +171,54 @@ class NeoDeploymentTest extends PiperTestBase {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
withPipeline(noCredentialsIdPipeline()).execute(archivePath)
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
assert shellCalls[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --user 'defaultUser' --host 'test\.deploy\.host\.com' --source ".*" --account 'trialuser123' --password '\*\*\*\*\*\*\*\*' --synchronous/
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName
)
assert messages[1] == "[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment."
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
@Test
void neoHomeNotSetTest() {
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
withPipeline(noCredentialsIdPipeline()).execute(archivePath)
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
assert shellCalls[0] =~ /#!\/bin\/bash "neo" deploy-mta --user 'defaultUser' --host 'test\.deploy\.host\.com' --source ".*" --account 'trialuser123' --password '\*\*\*\*\*\*\*\*' --synchronous/
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName
)
assert messages[1] == "Using Neo executable from PATH."
assert jscr.shell[0] =~ /#!\/bin\/bash "neo.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("Using Neo executable from PATH.")
}
@Test
void neoHomeAsParameterTest() {
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
withPipeline(neoHomeParameterPipeline()).execute(archivePath, 'myCredentialsId')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
assert shellCalls[0] =~ /#!\/bin\/bash "\/etc\/neo\/tools\/neo\.sh" deploy-mta --user 'anonymous' --host 'test\.deploy\.host\.com' --source ".*" --account 'trialuser123' --password '\*\*\*\*\*\*\*\*' --synchronous.*/
assert messages[1] == "[neoDeploy] Neo executable \"/etc/neo/tools/neo.sh\" retrieved from parameters."
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName,
neoCredentialsId: 'myCredentialsId',
neoHome: '/etc/neo'
)
assert jscr.shell[0] =~ /#!\/bin\/bash "\/etc\/neo\/tools\/neo\.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'anonymous' --password '\*\*\*\*\*\*\*\*' --source ".*"/
}
@ -127,10 +226,12 @@ class NeoDeploymentTest extends PiperTestBase {
void archiveNotProvidedTest() {
thrown.expect(Exception)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR archivePath')
thrown.expectMessage('Archive path not configured (parameter "archivePath").')
withPipeline(noArchivePathPipeline()).execute()
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe])
}
@ -138,116 +239,233 @@ class NeoDeploymentTest extends PiperTestBase {
void wrongArchivePathProvidedTest() {
thrown.expect(AbortException)
thrown.expectMessage("Archive cannot be found with parameter archivePath: '")
thrown.expectMessage("Archive cannot be found")
withPipeline(defaultPipeline()).execute(archivePath, 'myCredentialsId')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: archiveName)
}
@Test
void scriptNotProvidedTest() {
new File(archivePath) << "dummy archive"
new File(workspacePath, archiveName) << "dummy archive"
thrown.expect(Exception)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR deployHost')
withPipeline(noScriptPipeline()).execute(archivePath)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR host')
neoDeployScript.call(archivePath: archiveName)
}
@Test
void mtaDeployModeTest() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, archiveName) << "dummy archive"
private defaultPipeline(){
return """
@Library('piper-library-os')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
execute(archivePath, neoCredentialsId) {
neoDeployScript.call(script: [commonPipelineEnvironment: cpe], archivePath: archiveName, deployMode: 'mta')
commonPipelineEnvironment.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
commonPipelineEnvironment.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
node() {
neoDeploy script: this, archivePath: archivePath, neoCredentialsId: neoCredentialsId
}
}
return this
"""
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy-mta --host 'test\.deploy\.host\.com' --account 'trialuser123' --synchronous --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
private noCredentialsIdPipeline(){
return """
@Library('piper-library-os')
@Test
void warFileParamsDeployModeTest() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, warArchiveName) << "dummy war archive"
execute(archivePath) {
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
commonPipelineEnvironment.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
commonPipelineEnvironment.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
deployMode: 'warParams',
vmSize: 'lite',
warAction: 'deploy',
archivePath: warArchiveName)
node() {
neoDeploy script: this, archivePath: archivePath
}
}
return this
"""
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy --host 'test\.deploy\.host\.com' --account 'trialuser123' --application 'testApp' --runtime 'neo-javaee6-wp' --runtime-version '2\.125' --size 'lite' --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*\.war"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
private neoHomeParameterPipeline(){
return """
@Library('piper-library-os')
@Test
void warFileParamsDeployModeRollingUpdateTest() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, warArchiveName) << "dummy war archive"
execute(archivePath, neoCredentialsId) {
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
commonPipelineEnvironment.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
commonPipelineEnvironment.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warParams',
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'rolling-update',
vmSize: 'lite')
node() {
neoDeploy script: this, archivePath: archivePath, neoCredentialsId: neoCredentialsId, neoHome: '/etc/neo'
}
}
return this
"""
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" rolling-update --host 'test\.deploy\.host\.com' --account 'trialuser123' --application 'testApp' --runtime 'neo-javaee6-wp' --runtime-version '2\.125' --size 'lite' --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*\.war"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
private noArchivePathPipeline(){
return """
@Library('piper-library-os')
@Test
void warPropertiesFileDeployModeTest() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, warArchiveName) << "dummy war archive"
new File(workspacePath, propertiesFileName) << "dummy properties file"
execute() {
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warPropertiesFile',
propertiesFile: propertiesFileName,
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'deploy',
vmSize: 'lite')
commonPipelineEnvironment.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
commonPipelineEnvironment.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
node() {
neoDeploy script: this
}
}
return this
"""
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" deploy .*\.properties --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*\.war"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
private noScriptPipeline(){
return """
@Library('piper-library-os')
@Test
void warPropertiesFileDeployModeRollingUpdateTest() {
binding.getVariable('env')['NEO_HOME'] = '/opt/neo'
new File(workspacePath, warArchiveName) << "dummy war archive"
new File(workspacePath, propertiesFileName) << "dummy properties file"
execute(archivePath) {
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warPropertiesFile',
propertiesFile: propertiesFileName,
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'rolling-update',
vmSize: 'lite')
node() {
neoDeploy archivePath: archivePath
}
}
return this
"""
assert jscr.shell[0] =~ /#!\/bin\/bash "\/opt\/neo\/tools\/neo\.sh" rolling-update .*\.properties --user 'defaultUser' --password '\*\*\*\*\*\*\*\*' --source ".*\.war"/
assert jlr.log.contains("[neoDeploy] Neo executable \"/opt/neo/tools/neo.sh\" retrieved from environment.")
}
@Test
void applicationNameNotProvidedTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR applicationName')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warParams',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125'
)
}
@Test
void runtimeNotProvidedTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR runtime')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
applicationName: 'testApp',
deployMode: 'warParams',
runtimeVersion: '2.125')
}
@Test
void runtimeVersionNotProvidedTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage('ERROR - NO VALUE AVAILABLE FOR runtimeVersion')
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
applicationName: 'testApp',
deployMode: 'warParams',
runtime: 'neo-javaee6-wp')
}
@Test
void illegalDeployModeTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage("[neoDeploy] Invalid deployMode = 'illegalMode'. Valid 'deployMode' values are: 'mta', 'warParams' and 'warPropertiesFile'")
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'illegalMode',
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'deploy',
vmSize: 'lite')
}
@Test
void illegalVMSizeTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage("[neoDeploy] Invalid vmSize = 'illegalVM'. Valid 'vmSize' values are: 'lite', 'pro', 'prem' and 'prem-plus'.")
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warParams',
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'deploy',
vmSize: 'illegalVM')
}
@Test
void illegalWARActionTest() {
new File(workspacePath, warArchiveName) << "dummy war archive"
thrown.expect(Exception)
thrown.expectMessage("[neoDeploy] Invalid warAction = 'illegalWARAction'. Valid 'warAction' values are: 'deploy' and 'rolling-update'.")
cpe.setConfigProperty('DEPLOY_HOST', 'test.deploy.host.com')
cpe.setConfigProperty('CI_DEPLOY_ACCOUNT', 'trialuser123')
neoDeployScript.call(script: [commonPipelineEnvironment: cpe],
archivePath: warArchiveName,
deployMode: 'warParams',
applicationName: 'testApp',
runtime: 'neo-javaee6-wp',
runtimeVersion: '2.125',
warAction: 'illegalWARAction',
vmSize: 'lite')
}
}
View File
@ -1,23 +1,34 @@
import hudson.AbortException
import util.JenkinsConfigRule
import util.JenkinsSetupRule
import org.junit.rules.TemporaryFolder
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.ExpectedException
import org.junit.rules.RuleChain
class PipelineExecuteTest extends PiperTestBase {
class PipelineExecuteTest extends BasePipelineTest {
private ExpectedException thrown = new ExpectedException().none()
@Rule
public ExpectedException thrown = new ExpectedException().none()
public RuleChain ruleChain = RuleChain.outerRule(thrown)
.around(new JenkinsSetupRule(this))
.around(new JenkinsConfigRule(this))
def pipelinePath
def checkoutParameters = [:]
def load
@Before
void setUp() {
def pipelineExecuteScript
super.setUp()
@Before
void init() {
pipelinePath = null
checkoutParameters.clear()
@ -32,13 +43,14 @@ class PipelineExecuteTest extends PiperTestBase {
})
helper.registerAllowedMethod('load', [String], { s -> load = s })
pipelineExecuteScript = loadScript("pipelineExecute.groovy").pipelineExecute
}
@Test
void straightForwardTest() {
withPipeline(defaultPipeline()).execute()
pipelineExecuteScript.call(repoUrl: "https://test.com/myRepo.git")
assert load == "Jenkinsfile"
assert checkoutParameters.branch == 'master'
assert checkoutParameters.repoUrl == "https://test.com/myRepo.git"
@ -50,7 +62,11 @@ class PipelineExecuteTest extends PiperTestBase {
@Test
void parameterizeTest() {
withPipeline(parameterizePipeline()).execute()
pipelineExecuteScript.call(repoUrl: "https://test.com/anotherRepo.git",
branch: 'feature',
path: 'path/to/Jenkinsfile',
credentialsId: 'abcd1234')
assert load == "path/to/Jenkinsfile"
assert checkoutParameters.branch == 'feature'
assert checkoutParameters.repoUrl == "https://test.com/anotherRepo.git"
@ -65,50 +81,6 @@ class PipelineExecuteTest extends PiperTestBase {
thrown.expect(Exception)
thrown.expectMessage("ERROR - NO VALUE AVAILABLE FOR repoUrl")
withPipeline(noRepoUrlPipeline()).execute()
}
private defaultPipeline() {
return """
@Library('piper-library-os')
execute() {
pipelineExecute repoUrl: "https://test.com/myRepo.git"
}
return this
"""
}
private parameterizePipeline() {
return """
@Library('piper-library-os')
execute() {
pipelineExecute repoUrl: "https://test.com/anotherRepo.git", branch: 'feature', path: 'path/to/Jenkinsfile', credentialsId: 'abcd1234'
}
return this
"""
}
private noRepoUrlPipeline() {
return """
@Library('piper-library-os')
execute() {
pipelineExecute()
}
return this
"""
pipelineExecuteScript.call()
}
}
View File
@ -1,60 +0,0 @@
import com.lesfurets.jenkins.unit.BasePipelineTest
import com.sap.piper.DefaultValueCache
import org.yaml.snakeyaml.Yaml
import static ProjectSource.projectSource
import static com.lesfurets.jenkins.unit.global.lib.LibraryConfiguration.library
import org.junit.Rule
import org.junit.rules.TemporaryFolder
public class PiperTestBase extends BasePipelineTest {
@Rule
public TemporaryFolder pipelineFolder = new TemporaryFolder()
private File pipeline
protected messages = [], shellCalls = []
void setUp() {
super.setUp()
messages.clear()
shellCalls.clear()
preparePiperLib()
helper.registerAllowedMethod('echo', [String], {s -> messages.add(s)} )
helper.registerAllowedMethod('sh', [String], { s ->
shellCalls.add(s.replaceAll(/\s+/, " ").trim())
})
helper.registerAllowedMethod("readYaml", [Map], { Map parameters ->
Yaml yamlParser = new Yaml()
return yamlParser.load(parameters.text)
})
pipeline = pipelineFolder.newFile()
DefaultValueCache.reset()
}
protected withPipeline(p) {
pipeline << p
loadScript(pipeline.toURI().getPath())
}
private preparePiperLib() {
def piperLib = library()
.name('piper-library-os')
.retriever(projectSource())
.targetPath('clonePath/is/not/necessary')
.defaultVersion('<irrelevant>')
.allowOverride(true)
.implicit(false)
.build()
helper.registerSharedLibrary(piperLib)
}
}
View File
@ -1,17 +1,28 @@
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.yaml.snakeyaml.Yaml
import com.lesfurets.jenkins.unit.BasePipelineTest
import util.JenkinsSetupRule
import static org.junit.Assert.assertEquals
import static org.junit.Assert.assertNotNull
class SetupCommonPipelineEnvironmentTest extends PiperTestBase {
class SetupCommonPipelineEnvironmentTest extends BasePipelineTest {
def usedConfigFile
def setupCommonPipelineEnvironmentScript
def commonPipelineEnvironment
@Rule
public JenkinsSetupRule jsr = new JenkinsSetupRule(this)
@Before
void setUp() {
super.setUp()
void init() {
def examplePipelineConfig = new File('test/resources/test_pipeline_config.yml').text
@ -28,16 +39,18 @@ class SetupCommonPipelineEnvironmentTest extends PiperTestBase {
helper.registerAllowedMethod("fileExists", [String], { String path ->
return path.endsWith('.pipeline/config.yml')
})
setupCommonPipelineEnvironmentScript = loadScript("setupCommonPipelineEnvironment.groovy").setupCommonPipelineEnvironment
commonPipelineEnvironment = loadScript('commonPipelineEnvironment.groovy').commonPipelineEnvironment
}
@Test
void testIsConfigurationAvailable() throws Exception {
def script = loadScript("test/resources/pipelines/setupCommonPipelineEnvironmentTest/loadConfiguration.groovy")
script.execute()
setupCommonPipelineEnvironmentScript.call(script: [commonPipelineEnvironment: commonPipelineEnvironment])
assertEquals('.pipeline/config.yml', usedConfigFile)
assertNotNull(script.commonPipelineEnvironment.configuration)
assertEquals('develop', script.commonPipelineEnvironment.configuration.general.productiveBranch)
assertEquals('my-maven-docker', script.commonPipelineEnvironment.configuration.steps.mavenExecute.dockerImage)
assertNotNull(commonPipelineEnvironment.configuration)
assertEquals('develop', commonPipelineEnvironment.configuration.general.productiveBranch)
assertEquals('my-maven-docker', commonPipelineEnvironment.configuration.steps.mavenExecute.dockerImage)
}
}
View File
@ -4,27 +4,36 @@ import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.ExpectedException
import org.junit.rules.RuleChain
import org.junit.rules.TemporaryFolder
class ToolValidateTest extends PiperTestBase {
import com.lesfurets.jenkins.unit.BasePipelineTest
import util.JenkinsConfigRule
import util.JenkinsLoggingRule
import util.JenkinsSetupRule
class ToolValidateTest extends BasePipelineTest {
private ExpectedException thrown = new ExpectedException().none()
private TemporaryFolder tmp = new TemporaryFolder()
private JenkinsLoggingRule jlr = new JenkinsLoggingRule(this)
private JenkinsConfigRule jcr = new JenkinsConfigRule(this)
@Rule
public ExpectedException thrown = new ExpectedException().none()
@Rule
public TemporaryFolder tmp = new TemporaryFolder()
public RuleChain ruleChain =
RuleChain.outerRule(tmp)
.around(thrown)
.around(new JenkinsSetupRule(this))
.around(jlr)
.around(jcr)
private notEmptyDir
private script
def toolValidateScript
@Before
void setUp() {
super.setUp()
script = withPipeline(defaultPipeline())
void init() {
notEmptyDir = tmp.newFolder('notEmptyDir')
def path = "${notEmptyDir.getAbsolutePath()}${File.separator}test.txt"
@ -33,8 +42,7 @@ class ToolValidateTest extends PiperTestBase {
binding.setVariable('JAVA_HOME', notEmptyDir.getAbsolutePath())
binding.setVariable('home', notEmptyDir.getAbsolutePath())
toolValidateScript = loadScript("toolValidate.groovy").toolValidate
}
@ -44,10 +52,7 @@ class ToolValidateTest extends PiperTestBase {
thrown.expect(IllegalArgumentException)
thrown.expectMessage("The parameter 'home' can not be null or empty.")
binding.setVariable('tool', 'java')
binding.setVariable('home', null)
script.execute()
toolValidateScript.call(tool: 'java', home: null)
}
@Test
@ -56,10 +61,7 @@ class ToolValidateTest extends PiperTestBase {
thrown.expect(IllegalArgumentException)
thrown.expectMessage("The parameter 'home' can not be null or empty.")
binding.setVariable('tool', 'java')
binding.setVariable('home', '')
script.execute()
toolValidateScript.call(tool: 'java', home: '')
}
@Test
@ -68,9 +70,7 @@ class ToolValidateTest extends PiperTestBase {
thrown.expect(IllegalArgumentException)
thrown.expectMessage("The parameter 'tool' can not be null or empty.")
binding.setVariable('tool', null)
script.execute()
toolValidateScript.call(tool: null)
}
@Test
@ -79,9 +79,7 @@ class ToolValidateTest extends PiperTestBase {
thrown.expect(IllegalArgumentException)
thrown.expectMessage("The parameter 'tool' can not be null or empty.")
binding.setVariable('tool', '')
script.execute()
toolValidateScript.call(tool: '')
}
@Test
@ -90,9 +88,7 @@ class ToolValidateTest extends PiperTestBase {
thrown.expect(AbortException)
thrown.expectMessage("The tool 'test' is not supported.")
binding.setVariable('tool', 'test')
script.execute()
toolValidateScript.call(tool: 'test', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -102,9 +98,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The validation of Java failed.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getNoVersion(m) })
binding.setVariable('tool', 'java')
script.execute()
toolValidateScript.call(tool: 'java', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -114,9 +109,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The validation of SAP Multitarget Application Archive Builder failed.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getNoVersion(m) })
binding.setVariable('tool', 'mta')
script.execute()
toolValidateScript.call(tool: 'mta', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -126,9 +120,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The validation of SAP Cloud Platform Console Client failed.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getNoVersion(m) })
binding.setVariable('tool', 'neo')
script.execute()
toolValidateScript.call(tool: 'neo', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -138,7 +131,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The validation of Change Management Command Line Interface failed.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getNoVersion(m) })
binding.setVariable('tool', 'cm')
toolValidateScript.call(tool: 'cm', home: notEmptyDir.getAbsolutePath())
script.execute()
}
@ -150,9 +144,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The installed version of Java is 1.7.0.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getIncompatibleVersion(m) })
binding.setVariable('tool', 'java')
script.execute()
toolValidateScript.call(tool: 'java', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -162,9 +155,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The installed version of SAP Multitarget Application Archive Builder is 1.0.5.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getIncompatibleVersion(m) })
binding.setVariable('tool', 'mta')
script.execute()
toolValidateScript.call(tool: 'mta', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -174,9 +166,8 @@ class ToolValidateTest extends PiperTestBase {
thrown.expectMessage('The installed version of SAP Cloud Platform Console Client is 1.126.51.')
helper.registerAllowedMethod('sh', [Map], { Map m -> getIncompatibleVersion(m) })
binding.setVariable('tool', 'neo')
script.execute()
toolValidateScript.call(tool: 'neo', home: notEmptyDir.getAbsolutePath())
}
@Test
@ -188,80 +179,59 @@ class ToolValidateTest extends PiperTestBase {
helper.registerAllowedMethod('sh', [Map], { Map m -> getIncompatibleVersion(m) })
binding.setVariable('tool', 'cm')
script.execute()
toolValidateScript.call(tool: 'cm', home: notEmptyDir.getAbsolutePath())
}
@Test
void validateJavaTest() {
helper.registerAllowedMethod('sh', [Map], { Map m -> getVersion(m) })
binding.setVariable('tool', 'java')
script.execute()
toolValidateScript.call(tool: 'java', home: notEmptyDir.getAbsolutePath())
assert messages[0].contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert messages[1].contains('[INFO] Validating Java version 1.8.0 or compatible version.')
assert messages[2].contains('[INFO] Java version 1.8.0 is installed.')
assert messages[3].contains('--- END LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('[INFO] Validating Java version 1.8.0 or compatible version.')
assert jlr.log.contains('[INFO] Java version 1.8.0 is installed.')
assert jlr.log.contains('--- END LIBRARY STEP: toolValidate.groovy ---')
}
@Test
void validateMtaTest() {
helper.registerAllowedMethod('sh', [Map], { Map m -> getVersion(m) })
binding.setVariable('tool', 'mta')
script.execute()
toolValidateScript.call(tool: 'mta', home: notEmptyDir.getAbsolutePath())
assert messages[0].contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert messages[1].contains('[INFO] Validating SAP Multitarget Application Archive Builder version 1.0.6 or compatible version.')
assert messages[2].contains('[INFO] SAP Multitarget Application Archive Builder version 1.0.6 is installed.')
assert messages[3].contains('--- END LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('[INFO] Validating SAP Multitarget Application Archive Builder version 1.0.6 or compatible version.')
assert jlr.log.contains('[INFO] SAP Multitarget Application Archive Builder version 1.0.6 is installed.')
assert jlr.log.contains('--- END LIBRARY STEP: toolValidate.groovy ---')
}
@Test
void validateNeoTest() {
helper.registerAllowedMethod('sh', [Map], { Map m -> getVersion(m) })
binding.setVariable('tool', 'neo')
script.execute()
toolValidateScript.call(tool: 'neo', home: notEmptyDir.getAbsolutePath())
assert messages[0].contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert messages[1].contains('[INFO] Validating SAP Cloud Platform Console Client version 3.39.10 or compatible version.')
assert messages[2].contains('[INFO] SAP Cloud Platform Console Client version 3.39.10 is installed.')
assert messages[3].contains('--- END LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('[INFO] Validating SAP Cloud Platform Console Client version 3.39.10 or compatible version.')
assert jlr.log.contains('[INFO] SAP Cloud Platform Console Client version 3.39.10 is installed.')
assert jlr.log.contains('--- END LIBRARY STEP: toolValidate.groovy ---')
}
@Test
void validateCmTest() {
helper.registerAllowedMethod('sh', [Map], { Map m -> getVersion(m) })
binding.setVariable('tool', 'cm')
script.execute()
toolValidateScript.call(tool: 'cm', home: notEmptyDir.getAbsolutePath())
assert messages[0].contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert messages[1].contains('[INFO] Validating Change Management Command Line Interface version 0.0.1 or compatible version.')
assert messages[2].contains('[INFO] Change Management Command Line Interface version 0.0.1 is installed.')
assert messages[3].contains('--- END LIBRARY STEP: toolValidate.groovy ---')
}
private defaultPipeline(){
return """
@Library('piper-library-os')
execute() {
node() {
toolValidate tool: tool, home: home
}
}
return this
"""
assert jlr.log.contains('--- BEGIN LIBRARY STEP: toolValidate.groovy ---')
assert jlr.log.contains('[INFO] Validating Change Management Command Line Interface version 0.0.1 or compatible version.')
assert jlr.log.contains('[INFO] Change Management Command Line Interface version 0.0.1 is installed.')
assert jlr.log.contains('--- END LIBRARY STEP: toolValidate.groovy ---')
}
private getNoVersion(Map m) {
View File
@ -26,4 +26,17 @@ class ConfigurationMergerTest {
Map merged = ConfigurationMerger.merge(parameters, parameterKeys, defaults)
Assert.assertEquals([], merged.nonErpDestinations)
}
@Test
void testMergeCustomPipelineValues(){
Map defaults = [dockerImage: 'mvn']
Map parameters = [goals: 'install', flags: '']
List parameterKeys = ['flags']
Map configuration = [flags: '-B']
List configurationKeys = ['flags']
Map pipelineDataMap = [artifactVersion: '1.2.3', flags: 'test']
Map merged = ConfigurationMerger.mergeWithPipelineData(parameters, parameterKeys, pipelineDataMap, configuration, configurationKeys, defaults)
Assert.assertEquals('', merged.flags)
Assert.assertEquals('1.2.3', merged.artifactVersion)
}
}
View File
@ -0,0 +1,38 @@
package util
import com.lesfurets.jenkins.unit.BasePipelineTest
import com.sap.piper.DefaultValueCache
import org.junit.rules.TestRule
import org.junit.runner.Description
import org.junit.runners.model.Statement
import org.yaml.snakeyaml.Yaml
class JenkinsConfigRule implements TestRule {
final BasePipelineTest testInstance
JenkinsConfigRule(BasePipelineTest testInstance) {
this.testInstance = testInstance
}
@Override
Statement apply(Statement base, Description description) {
return statement(base)
}
private Statement statement(final Statement base) {
return new Statement() {
@Override
void evaluate() throws Throwable {
testInstance.helper.registerAllowedMethod("readYaml", [Map], { Map parameters ->
Yaml yamlParser = new Yaml()
return yamlParser.load(parameters.text)
})
DefaultValueCache.reset()
base.evaluate()
}
}
}
}
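For orientation, a minimal test sketch (the class name `MyStepTest` is invented) showing how this rule is typically chained behind `JenkinsSetupRule` in a JUnit `RuleChain`, mirroring the pattern used in the tests above:

```groovy
import org.junit.Rule
import org.junit.rules.RuleChain

import com.lesfurets.jenkins.unit.BasePipelineTest

import util.JenkinsConfigRule
import util.JenkinsSetupRule

class MyStepTest extends BasePipelineTest {
    // JenkinsSetupRule prepares the pipeline test harness; JenkinsConfigRule then
    // registers readYaml and resets the DefaultValueCache before each test runs.
    @Rule
    public RuleChain ruleChain = RuleChain
        .outerRule(new JenkinsSetupRule(this))
        .around(new JenkinsConfigRule(this))
}
```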
View File
@ -44,7 +44,6 @@ class JenkinsSetupRule implements TestRule {
base.evaluate()
testInstance.printCallStack()
}
}
}
View File
@ -0,0 +1,37 @@
package util
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.rules.TestRule
import org.junit.runner.Description
import org.junit.runners.model.Statement
class JenkinsShellCallRule implements TestRule {
final BasePipelineTest testInstance
List shell = []
JenkinsShellCallRule(BasePipelineTest testInstance) {
this.testInstance = testInstance
}
@Override
Statement apply(Statement base, Description description) {
return statement(base)
}
private Statement statement(final Statement base) {
return new Statement() {
@Override
void evaluate() throws Throwable {
testInstance.helper.registerAllowedMethod("sh", [String.class], {
command ->
shell.add(command.replaceAll(/\s+/," ").trim())
})
base.evaluate()
}
}
}
}
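Analogously, a hedged sketch (class name invented) of wiring this rule so that a test can assert on the recorded, whitespace-normalized shell commands, as the neoDeploy tests above do with `jscr.shell[0]`:

```groovy
import org.junit.Rule
import org.junit.rules.RuleChain

import com.lesfurets.jenkins.unit.BasePipelineTest

import util.JenkinsSetupRule
import util.JenkinsShellCallRule

class MyShellStepTest extends BasePipelineTest {
    // Records every string handed to sh(); a test can then assert, for example:
    //   assert jscr.shell[0] =~ /neo\.sh deploy-mta/
    private JenkinsShellCallRule jscr = new JenkinsShellCallRule(this)

    @Rule
    public RuleChain ruleChain = RuleChain
        .outerRule(new JenkinsSetupRule(this))
        .around(jscr)
}
```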
View File
@ -4,21 +4,18 @@ import static com.lesfurets.jenkins.unit.global.lib.LibraryConfiguration.library
class SharedLibraryCreator {
static def lazyLoadedLibrary = library()
.name('piper-library')
.retriever(new ProjectSource())
.targetPath('is/not/necessary')
.defaultVersion("master")
.allowOverride(true)
.implicit(false)
.build()
static def lazyLoadedLibrary = getLibraryConfiguration(false)
static def implicitLoadedLibrary = library()
.name('piper-library')
static def implicitLoadedLibrary = getLibraryConfiguration(true)
private static def getLibraryConfiguration(def implicit) {
library()
.name('piper-library-os')
.retriever(new ProjectSource())
.targetPath('is/not/necessary')
.defaultVersion("master")
.allowOverride(true)
.implicit(true)
.implicit(implicit)
.build()
}
}
View File
@ -1,11 +0,0 @@
@Library('piper-library-os')
execute() {
node() {
dockerExecute(script: this, dockerImage: 'maven:3.5-jdk-8-alpine') {
echo 'Inside Docker'
}
}
}
return this
View File
@ -1,11 +0,0 @@
@Library('piper-library-os')
execute() {
node() {
dockerExecute(script: this, dockerImage: 'maven:3.5-jdk-8-alpine', dockerOptions: '-it', dockerVolumeBind: ['my_vol': '/my_vol'], dockerEnvVars: ['http_proxy': 'http://proxy:8000']) {
echo 'Inside Docker'
}
}
}
return this
View File
@ -1,9 +0,0 @@
@Library('piper-library-os')
execute() {
node() {
mavenExecute script: this, goals: 'clean install'
}
}
return this
View File
@ -1,21 +0,0 @@
@Library('piper-library-os')
execute() {
node() {
mavenExecute(
script: this,
dockerImage: 'maven:3.5-jdk-8-alpine',
goals: 'clean install',
globalSettingsFile: 'globalSettingsFile.xml',
projectSettingsFile: 'projectSettingsFile.xml',
pomPath: 'pom.xml',
flags: '-o',
m2Path: 'm2Path',
defines: '-Dmaven.tests.skip=true'
)
}
}
return this
View File
@ -1,11 +0,0 @@
@Library('piper-library-os')
execute() {
node() {
setupCommonPipelineEnvironment script:this
}
}
return this
View File
@ -1,11 +1,27 @@
class commonPipelineEnvironment implements Serializable {
private Map configProperties = [:]
Map defaultConfiguration = [:]
//stores the version of the artifact which is built during the pipeline run
def artifactVersion
Map configuration = [:]
Map defaultConfiguration = [:]
//each Map in influxCustomDataMap represents a measurement in Influx. Additional measurements can be added as a new Map entry of influxCustomDataMap
private Map influxCustomDataMap = [pipeline_data: [:]]
//influxCustomData represents measurement jenkins_custom_data in Influx. Metrics can be written into this map
private Map influxCustomData = [:]
private String mtarFilePath
def setArtifactVersion(version) {
artifactVersion = version
}
def getArtifactVersion() {
return artifactVersion
}
def setConfigProperties(map) {
configProperties = map
}
@ -25,6 +41,14 @@ class commonPipelineEnvironment implements Serializable {
return configProperties[property]
}
def getInfluxCustomData() {
return influxCustomData
}
def getInfluxCustomDataMap() {
return influxCustomDataMap
}
def getMtarFilePath() {
return mtarFilePath
}
@ -32,4 +56,12 @@ class commonPipelineEnvironment implements Serializable {
void setMtarFilePath(mtarFilePath) {
this.mtarFilePath = mtarFilePath
}
def setPipelineMeasurement (measurementName, value) {
influxCustomDataMap.pipeline_data[measurementName] = value
}
def getPipelineMeasurement (measurementName) {
return influxCustomDataMap.pipeline_data[measurementName]
}
}
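As a rough illustration of the new accessors (the measurement name and values are invented sample data):

```groovy
// Invented sample data, only to show how the accessors relate to each other.
def cpe = new commonPipelineEnvironment()

cpe.setArtifactVersion('1.2.3')
cpe.setPipelineMeasurement('build_duration', 4711)

assert cpe.getArtifactVersion() == '1.2.3'
assert cpe.getPipelineMeasurement('build_duration') == 4711
// The measurement ends up in the 'pipeline_data' map that influxWriteData publishes.
assert cpe.getInfluxCustomDataMap().pipeline_data.build_duration == 4711
```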
View File
@ -23,6 +23,12 @@ def call(Map parameters = [:], body) {
echo "[WARNING][${STEP_NAME}] No docker environment found (command 'which docker' did not return with '0'). Configured docker image '${dockerImage}' will not be used."
dockerImage = null
}
returnCode = sh script: 'docker ps -q > /dev/null', returnStatus: true
if(returnCode != 0) {
echo "[WARNING][$STEP_NAME] Cannot connect to docker daemon (command 'docker ps' did not return with '0'). Configured docker image '${dockerImage}' will not be used."
dockerImage = null
}
}
if(!dockerImage){
View File
@ -0,0 +1,19 @@
def call(Map parameters = [:], body) {
def script = parameters.script
def measurementName = parameters.get('measurementName', 'test_duration')
//start measurement
def start = System.currentTimeMillis()
body()
//record measurement
def duration = System.currentTimeMillis() - start
if (script != null)
script.commonPipelineEnvironment.setPipelineMeasurement(measurementName, duration)
return duration
}
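A minimal Jenkinsfile sketch of this step; the measurement name and the wrapped command are placeholders:

```groovy
// Wraps an arbitrary body, returns the elapsed milliseconds and, when a script is
// passed, stores them via commonPipelineEnvironment.setPipelineMeasurement().
def millis = durationMeasure(script: this, measurementName: 'build_duration') {
    sh 'mvn clean install'
}
echo "build took ${millis} ms"
```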
View File
@ -0,0 +1,62 @@
import com.sap.piper.ConfigurationLoader
import com.sap.piper.ConfigurationMerger
import com.sap.piper.JsonUtils
def call(Map parameters = [:]) {
def stepName = 'influxWriteData'
handlePipelineStepErrors (stepName: stepName, stepParameters: parameters, allowBuildFailure: true) {
def script = parameters.script
if (script == null)
script = [commonPipelineEnvironment: commonPipelineEnvironment]
prepareDefaultValues script: script
final Map stepDefaults = ConfigurationLoader.defaultStepConfiguration(script, stepName)
final Map stepConfiguration = ConfigurationLoader.stepConfiguration(script, stepName)
List parameterKeys = [
'artifactVersion',
'influxServer',
'influxPrefix'
]
Map pipelineDataMap = [
artifactVersion: commonPipelineEnvironment.getArtifactVersion()
]
List stepConfigurationKeys = [
'influxServer',
'influxPrefix'
]
Map configuration = ConfigurationMerger.mergeWithPipelineData(parameters, parameterKeys, pipelineDataMap, stepConfiguration, stepConfigurationKeys, stepDefaults)
def artifactVersion = configuration.artifactVersion
if (!artifactVersion) {
//this ensures that builds terminated due to milestone-locking do not cause an error
echo "[${stepName}] no artifact version available -> exiting writeInflux without writing data"
return
}
def influxServer = configuration.influxServer
def influxPrefix = configuration.influxPrefix
echo """[${stepName}]----------------------------------------------------------
Artifact version: ${artifactVersion}
Influx server: ${influxServer}
Influx prefix: ${influxPrefix}
InfluxDB data: ${script.commonPipelineEnvironment.getInfluxCustomData()}
InfluxDB data map: ${script.commonPipelineEnvironment.getInfluxCustomDataMap()}
[${stepName}]----------------------------------------------------------"""
if (influxServer)
step([$class: 'InfluxDbPublisher', selectedTarget: influxServer, customPrefix: influxPrefix, customData: script.commonPipelineEnvironment.getInfluxCustomData(), customDataMap: script.commonPipelineEnvironment.getInfluxCustomDataMap()])
//write results into JSON files for archiving - also beneficial when no InfluxDB is available yet
def jsonUtils = new JsonUtils()
writeFile file: 'jenkins_data.json', text: jsonUtils.getPrettyJsonString(script.commonPipelineEnvironment.getInfluxCustomData())
writeFile file: 'pipeline_data.json', text: jsonUtils.getPrettyJsonString(script.commonPipelineEnvironment.getInfluxCustomDataMap())
archiveArtifacts artifacts: '*data.json', allowEmptyArchive: true
}
}
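A hedged call example; the server name and prefix are placeholders, and `artifactVersion` can be omitted when it was already stored in `commonPipelineEnvironment` earlier in the pipeline:

```groovy
// 'jenkins' must match an InfluxDB target configured in the Jenkins instance.
influxWriteData script: this,
                artifactVersion: '1.2.3',
                influxServer: 'jenkins',
                influxPrefix: 'myProject'
```

Alternatively, `influxServer` and `influxPrefix` can be provided through the step configuration that `ConfigurationLoader.stepConfiguration` reads.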
View File
@ -1,62 +1,197 @@
import com.sap.piper.Utils
import com.sap.piper.ConfigurationLoader
import com.sap.piper.ConfigurationMerger
import com.sap.piper.ConfigurationType
def call(parameters = [:]) {
handlePipelineStepErrors (stepName: 'neoDeploy', stepParameters: parameters) {
def stepName = 'neoDeploy'
List parameterKeys = [
'applicationName',
'archivePath',
'account',
'deployMode',
'dockerEnvVars',
'dockerImage',
'dockerOptions',
'host',
'neoCredentialsId',
'neoHome',
'propertiesFile',
'runtime',
'runtimeVersion',
'vmSize',
'warAction'
]
List stepConfigurationKeys = [
'account',
'dockerEnvVars',
'dockerImage',
'dockerOptions',
'host',
'neoCredentialsId',
'neoHome'
]
handlePipelineStepErrors (stepName: stepName, stepParameters: parameters) {
def script = parameters?.script ?: [commonPipelineEnvironment: commonPipelineEnvironment]
def utils = new Utils()
def script = parameters.script
if (script == null){
script = [commonPipelineEnvironment: commonPipelineEnvironment]
}
def archivePath = new File(utils.getMandatoryParameter(parameters, 'archivePath', null))
if (!archivePath.isAbsolute()) {
archivePath = new File(pwd(), archivePath.getPath())
}
if (!archivePath.exists()){
error "Archive cannot be found with parameter archivePath: '${archivePath}'."
}
prepareDefaultValues script: script
final Map stepConfiguration = [:]
// Backward compatibility: ensure old configuration is taken into account
// The old configuration is not stage / step specific
def defaultDeployHost = script.commonPipelineEnvironment.getConfigProperty('DEPLOY_HOST')
def defaultDeployAccount = script.commonPipelineEnvironment.getConfigProperty('CI_DEPLOY_ACCOUNT')
def defaultCredentialsId = script.commonPipelineEnvironment.getConfigProperty('neoCredentialsId')
if (defaultCredentialsId == null) {
defaultCredentialsId = 'CI_CREDENTIALS_ID'
if(defaultDeployHost) {
echo "[WARNING][${stepName}] A deprecated configuration framework is used for configuring parameter 'DEPLOY_HOST'. This configuration framework will be removed in future versions."
stepConfiguration.put('host', defaultDeployHost)
}
def deployHost = utils.getMandatoryParameter(parameters, 'deployHost', defaultDeployHost)
def deployAccount = utils.getMandatoryParameter(parameters, 'deployAccount', defaultDeployAccount)
def credentialsId = parameters.get('neoCredentialsId', defaultCredentialsId)
def defaultDeployAccount = script.commonPipelineEnvironment.getConfigProperty('CI_DEPLOY_ACCOUNT')
if(defaultDeployAccount) {
echo "[WARNING][${stepName}] A deprecated configuration framework is used for configuring parameter 'DEPLOY_ACCOUNT'. This configuration framekwork will be removed in future versions."
stepConfiguration.put('account', defaultDeployAccount)
}
def neoExecutable = getNeoExecutable(parameters)
if(parameters.DEPLOY_HOST && !parameters.host) {
echo "[WARNING][${stepName}] Deprecated parameter 'DEPLOY_HOST' is used. This will not work anymore in future versions. Use parameter 'host' instead."
parameters.put('host', parameters.DEPLOY_HOST)
}
if(parameters.CI_DEPLOY_ACCOUNT && !parameters.account) {
echo "[WARNING][${stepName}] Deprecated parameter 'CI_DEPLOY_ACCOUNT' is used. This will not work anymore in future versions. Use parameter 'account' instead."
parameters.put('account', parameters.CI_DEPLOY_ACCOUNT)
}
// Backward compatibility end
stepConfiguration.putAll(ConfigurationLoader.stepConfiguration(script, stepName))
Map configuration = ConfigurationMerger.merge(parameters, parameterKeys,
stepConfiguration, stepConfigurationKeys,
ConfigurationLoader.defaultStepConfiguration(script, stepName))
def archivePath = configuration.archivePath
if(archivePath?.trim()) {
if (!fileExists(archivePath)) {
error "Archive cannot be found with parameter archivePath: '${archivePath}'."
}
} else {
error "Archive path not configured (parameter \"archivePath\")."
}
def deployHost
def deployAccount
def credentialsId = configuration.get('neoCredentialsId', '')
def deployMode = configuration.deployMode
def warAction
def propertiesFile
def applicationName
def runtime
def runtimeVersion
def vmSize
if (deployMode != 'mta' && deployMode != 'warParams' && deployMode != 'warPropertiesFile') {
throw new Exception("[neoDeploy] Invalid deployMode = '${deployMode}'. Valid 'deployMode' values are: 'mta', 'warParams' and 'warPropertiesFile'")
}
if (deployMode == 'warPropertiesFile' || deployMode == 'warParams') {
warAction = utils.getMandatoryParameter(configuration, 'warAction')
if (warAction != 'deploy' && warAction != 'rolling-update') {
throw new Exception("[neoDeploy] Invalid warAction = '${warAction}'. Valid 'warAction' values are: 'deploy' and 'rolling-update'.")
}
}
if (deployMode == 'warPropertiesFile') {
propertiesFile = utils.getMandatoryParameter(configuration, 'propertiesFile')
if (!fileExists(propertiesFile)){
error "Properties file cannot be found with parameter propertiesFile: '${propertiesFile}'."
}
}
if (deployMode == 'warParams') {
applicationName = utils.getMandatoryParameter(configuration, 'applicationName')
runtime = utils.getMandatoryParameter(configuration, 'runtime')
runtimeVersion = utils.getMandatoryParameter(configuration, 'runtimeVersion')
vmSize = configuration.vmSize
if (vmSize != 'lite' && vmSize !='pro' && vmSize != 'prem' && vmSize != 'prem-plus') {
throw new Exception("[neoDeploy] Invalid vmSize = '${vmSize}'. Valid 'vmSize' values are: 'lite', 'pro', 'prem' and 'prem-plus'.")
}
}
if (deployMode.equals('mta') || deployMode.equals('warParams')) {
deployHost = utils.getMandatoryParameter(configuration, 'host')
deployAccount = utils.getMandatoryParameter(configuration, 'account')
}
def neoExecutable = getNeoExecutable(configuration)
def neoDeployScript
if (deployMode == 'mta') {
neoDeployScript =
"""#!/bin/bash
"${neoExecutable}" deploy-mta \
--host '${deployHost}' \
--account '${deployAccount}' \
--synchronous"""
}
if (deployMode == 'warParams') {
neoDeployScript =
"""#!/bin/bash
"${neoExecutable}" ${warAction} \
--host '${deployHost}' \
--account '${deployAccount}' \
--application '${applicationName}' \
--runtime '${runtime}' \
--runtime-version '${runtimeVersion}' \
--size '${vmSize}'"""
}
if (deployMode == 'warPropertiesFile') {
neoDeployScript =
"""#!/bin/bash
"${neoExecutable}" ${warAction} \
${propertiesFile}"""
}
withCredentials([usernamePassword(
credentialsId: credentialsId,
passwordVariable: 'password',
usernameVariable: 'username'
)]) {
sh """#!/bin/bash
"${neoExecutable}" deploy-mta \
--user '${username}' \
--host '${deployHost}' \
--source "${archivePath.getAbsolutePath()}" \
--account '${deployAccount}' \
--password '${password}' \
--synchronous
"""
credentialsId: credentialsId,
passwordVariable: 'password',
usernameVariable: 'username')]) {
def commonDeployParams =
"""--user '${username}' \
--password '${password}' \
--source "${archivePath}" \
"""
dockerExecute(dockerImage: configuration.get('dockerImage'),
dockerEnvVars: configuration.get('dockerEnvVars'),
dockerOptions: configuration.get('dockerOptions')) {
sh """${neoDeployScript} \
${commonDeployParams}
"""
}
}
}
}
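For orientation, a hedged Jenkinsfile sketch of the reworked interface in `warParams` mode; host, account, credentials id and archive path are placeholders borrowed from the test fixtures above rather than a real landscape:

```groovy
neoDeploy script: this,
          archivePath: 'target/my-app.war',   // placeholder path
          deployMode: 'warParams',
          warAction: 'deploy',
          applicationName: 'testApp',
          runtime: 'neo-javaee6-wp',
          runtimeVersion: '2.125',
          vmSize: 'lite',
          host: 'test.deploy.host.com',
          account: 'trialuser123',
          neoCredentialsId: 'CI_CREDENTIALS_ID'
```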
private getNeoExecutable(parameters) {
private getNeoExecutable(configuration) {
def neoExecutable = 'neo' // default; if nothing below applies, the executable is expected on the PATH
def neoExecutable = 'neo.sh' // default; if nothing below applies, the executable is expected on the PATH
if (parameters?.neoHome) {
neoExecutable = "${parameters.neoHome}/tools/neo.sh"
echo "[neoDeploy] Neo executable \"${neoExecutable}\" retrieved from parameters."
if (configuration.neoHome) {
neoExecutable = "${configuration.neoHome}/tools/neo.sh"
echo "[neoDeploy] Neo executable \"${neoExecutable}\" retrieved from configuration."
return neoExecutable
}