mirror of https://github.com/SAP/jenkins-library.git synced 2024-12-12 10:55:20 +02:00

Merge branch 'master' into docGeneration

This commit is contained in:
Oliver Nocon 2019-04-12 13:58:40 +02:00 committed by GitHub
commit e296f5c5ed
22 changed files with 499 additions and 323 deletions

View File

@@ -1,79 +1,14 @@
# artifactSetVersion
# ${docGenStepName}
## Description
The continuous delivery process requires that each build is done with a unique version number.
The version generated using this step will contain:
* Version (major.minor.patch) from the descriptor file in the master repository is preserved. Developers should be able to autonomously decide on increasing either part of this version number.
* Timestamp
* CommitId (by default the long version of the hash)
Optionally, but enabled by default, the new version is pushed as a new tag into the source code repository (e.g. GitHub).
If this option is chosen, git credentials and the repository URL need to be provided.
Since you might not want to configure the git credentials in Jenkins, committing and pushing can be disabled using the `commitVersion` parameter as described below.
If you require strict reproducibility of your builds, this option should be used.
## ${docGenDescription}
## Prerequisites
none
## Parameters
## ${docGenParameters}
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
| script | yes | | |
| artifactType | no | | 'appContainer' |
| buildTool | no | maven | docker, dlang, golang, maven, mta, npm, pip, sbt |
| commitVersion | no | `true` | `true`, `false` |
| dockerVersionSource | no | `''` | FROM, (ENV name), appVersion |
| filePath | no | buildTool=`docker`: Dockerfile <br />buildTool=`dlang`: dub.json <br />buildTool=`golang`: VERSION <br />buildTool=`maven`: pom.xml <br />buildTool=`mta`: mta.yaml <br />buildTool=`npm`: package.json <br />buildTool=`pip`: version.txt <br />buildTool=`sbt`: sbtDescriptor.json| |
| gitCommitId | no | `GitUtils.getGitCommitId()` | |
| gitSshCredentialsId | If `commitVersion` is `true` | as defined in custom configuration | |
| gitUserEMail | no | | |
| gitUserName | no | | |
| gitSshUrl | If `commitVersion` is `true` | | |
| tagPrefix | no | 'build_' | |
| timestamp | no | current time in format according to `timestampTemplate` | |
| timestampTemplate | no | `%Y%m%d%H%M%S` | |
| versioningTemplate | no |buildTool=`docker`: `${version}-${timestamp}${commitId?"_"+commitId:""}`<br />buildTool=`dlang`: `${version}-${timestamp}${commitId?"+"+commitId:""}`<br />buildTool=`golang`: `${version}-${timestamp}${commitId?"+"+commitId:""}`<br />buildTool=`maven`: `${version}-${timestamp}${commitId?"_"+commitId:""}`<br />buildTool=`mta`: `${version}-${timestamp}${commitId?"+"+commitId:""}`<br />buildTool=`npm`: `${version}-${timestamp}${commitId?"+"+commitId:""}`<br />buildTool=`pip`: `${version}.${timestamp}${commitId?"."+commitId:""}`<br />buildTool=`sbt`: `${version}-${timestamp}${commitId?"+"+commitId:""}`| |
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for retrieving e.g. configuration parameters.
* `artifactType` defines the type of the artifact.
* `buildTool` defines the tool which is used for building the artifact.
* `commitVersion` controls if the changed version is committed and pushed to the git repository. If this is enabled (which is the default), you need to provide `gitCredentialsId` and `gitSshUrl`.
* `dockerVersionSource` specifies the source to be used for the main version which is used for generating the automatic version.
* This can either be the version of the base image - as retrieved from the `FROM` statement within the Dockerfile, e.g. `FROM jenkins:2.46.2`
* Alternatively, the name of an environment variable defined in the Docker image which contains the version number can be used, e.g. `ENV MY_VERSION 1.2.3`
* The third option `appVersion` applies only to the artifactType `appContainer`. Here the version of the app which is packaged into the container will be used as version for the container itself.
* Using `filePath` you could define a custom path to the descriptor file.
* `gitCommitId` defines the version prefix of the automatically generated version. By default it will take the long commitId hash. You could pass any other string (e.g. the short commitId hash) to be used. In case you don't want to have the gitCommitId added to the automatic versioning string you could set the value to an empty string: `''`.
* `gitSshCredentialsId` defines the SSH git credentials to be used for writing the tag.
* The parameters `gitUserName` and `gitUserEMail` allow you to overwrite the global git settings available on your Jenkins server.
* `gitSshUrl` defines the git ssh url to the source code repository.
* `tagPrefix` defines the prefix which is used for the git tag written during the versioning run.
* `timestamp` defines the timestamp to be used in the automatic version string. You could overwrite the default behavior by explicitly setting this string.
## Step configuration
The following parameters can also be specified as step parameters using the global configuration file:
* `artifactType`
* `buildTool`
* `commitVersion`
* `dockerVersionSource`
* `filePath`
* `gitCredentialsId`
* `gitUserEMail`
* `gitUserName`
* `gitSshUrl`
* `tagPrefix`
* `timestamp`
* `timestampTemplate`
* `versioningTemplate`
## ${docGenConfiguration}
## Example
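A minimal usage sketch, assuming the Maven build tool (which is also the default); `commitVersion: false` keeps the step from committing and pushing a tag in case no git credentials are configured:

```groovy
// automatic versioning for a Maven project, without pushing a git tag
artifactSetVersion script: this, buildTool: 'maven', commitVersion: false
```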

View File

@@ -1,18 +1,6 @@
# cloudFoundryDeploy
# ${docGenStepName}
## Description
The application will be deployed to a test or production space within Cloud Foundry.
Deployment can be done
* in a standard way
* in a zero downtime manner (using a [blue-green deployment approach](https://martinfowler.com/bliki/BlueGreenDeployment.html))
!!! note "Deployment supports multiple deployment tools"
Currently the following are supported:
* Standard `cf push` and [Bluemix blue-green plugin](https://github.com/bluemixgaragelondon/cf-blue-green-deploy#how-to-use)
* [MTA CF CLI Plugin](https://github.com/cloudfoundry-incubator/multiapps-cli-plugin)
## ${docGenDescription}
## Prerequisites
@@ -21,90 +9,9 @@ Deployment can be done
![Jenkins credentials configuration](../images/cf_credentials.png)
## Parameters
## ${docGenParameters}
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
| script | yes | | |
| cloudFoundry | yes | | |
| deployTool | no | cf_native | cf_native, mtaDeployPlugin |
| deployType | no | standard | standard, blue-green |
| keepOldInstance | no | false | true, false |
| dockerImage | no | s4sdk/docker-cf-cli | |
| dockerWorkspace | no | /home/piper | |
| mtaDeployParameters | no | for _deployType:standard_ `-f`<br />for _deployType:blue-green_ `-f --no-confirm` | |
| mtaExtensionDescriptor | no | '' | |
| mtaPath | no | '' | |
| smokeTestScript | no | blueGreenCheckScript.sh (provided by library). <br />Can be overwritten using config property 'smokeTestScript' | |
| smokeTestStatusCode | no | 200 | |
| stashContent | no | [] | |
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for retrieving e.g. configuration parameters.
* `cloudFoundry` defines a map containing following properties:
* `apiEndpoint`: Cloud Foundry API endpoint (default: `https://api.cf.eu10.hana.ondemand.com`)
* `appName`: App name of application to be deployed (optional)
* `credentialsId`: Credentials to be used for deployment (mandatory)
* `manifest`: Manifest to be used for deployment
* `org`: Cloud Foundry target organization (mandatory)
* `space`: Cloud Foundry target space (mandatory)
Example: `cloudFoundry: [apiEndpoint: 'https://test.server.com', appName:'cfAppName', credentialsId: 'cfCredentialsId', manifest: 'cfManifest', org: 'cfOrg', space: 'cfSpace']`
!!! note
It is also possible to use the following configuration parameters instead of the `cloudFoundry` map:
- cfApiEndpoint
- cfAppName
- cfCredentialsId
- cfManifest
- cfOrg
- cfSpace
!!! note
Due to [an incompatible change](https://github.com/cloudfoundry/cli/issues/1445) in the Cloud Foundry CLI, multiple buildpacks are not supported by this step.
If your `application` contains a list of `buildpacks` instead of a single `buildpack`, this will be automatically re-written by the step when blue-green deployment is used.
* `deployTool` defines the tool which should be used for deployment.
* `deployType` defines the type of deployment, either `standard` deployment which results in a system downtime or a zero-downtime `blue-green` deployment.
* `keepOldInstance`: in case of a `blue-green` deployment the old instance will be deleted by default. If this option is set to `true` the old instance will remain stopped in the Cloud Foundry space.
* `dockerImage` defines the Docker image containing the deployment tools (like the cf CLI) and `dockerWorkspace` defines the home directory of the default user of the `dockerImage`.
* `smokeTestScript` allows you to specify a script which performs a check during blue-green deployment. The script gets the FQDN as a parameter and returns `exit code 0` in case the check returned `smokeTestStatusCode`. More details can be found [here](https://github.com/bluemixgaragelondon/cf-blue-green-deploy#how-to-use). <br /> Currently this option is only considered for deployTool `cf_native`.
* `stashContent` defines the stash names which should be unstashed at the beginning of the step. This makes the files available in case the step is started on an empty node.
### Deployment with cf_native
* `appName` in `cloudFoundry` map (or `cfAppName`) defines the name of the application which will be deployed to the Cloud Foundry space.
* `manifest` in the `cloudFoundry` map (or `cfManifest`) defines the manifest to be used for the Cloud Foundry deployment.
!!! note
Cloud Foundry supports the deployment of multiple applications using a single manifest file.
This option is supported with Piper.
In this case define `appName: ''` since the app names for the individual applications have to be defined via the manifest.
You can find details in the [Cloud Foundry Documentation](https://docs.cloudfoundry.org/devguide/deploy-apps/manifest.html#multi-apps)
### Deployment with mtaDeployPlugin
* `mtaPath` defines the path to the `*.mtar` file for deployment.
* `mtaExtensionDescriptor` defines an additional extension descriptor file for deployment.
* `mtaDeployParameters` defines additional parameters passed to the mta deployment.
## Step configuration
The following parameters can also be specified as step/stage/general parameters using the [global configuration](../configuration.md):
* cloudFoundry
* deployUser
* deployTool
* deployType
* dockerImage
* dockerWorkspace
* mtaDeployParameters
* mtaExtensionDescriptor
* mtaPath
* smokeTestScript
* smokeTestStatusCode
* stashContent
## ${docGenConfiguration}
## Example
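A sketch of a possible invocation; all values (credentials id, app name, org, space, manifest name) are illustrative placeholders:

```groovy
cloudFoundryDeploy(
    script: this,
    deployTool: 'cf_native',
    deployType: 'blue-green',
    keepOldInstance: true,
    cloudFoundry: [
        apiEndpoint: 'https://api.cf.eu10.hana.ondemand.com',
        appName: 'myAppName',
        credentialsId: 'cfCredentialsId',
        manifest: 'manifest.yml',
        org: 'myOrg',
        space: 'mySpace'
    ]
)
```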

View File

@@ -1,18 +1,6 @@
# githubPublishRelease
# ${docGenStepName}
## Description
This step creates a tag in your GitHub repository together with a release.
The release can be filled with text plus additional information like:
* Closed pull requests since the last release
* Closed issues since the last release
* Link to delta information showing all commits since the last release
The result looks like this:
![Example release](../images/githubRelease.png)
## ${docGenDescription}
## Prerequisites
@@ -20,6 +8,10 @@ You need to create a personal access token within GitHub and add this to the Jen
Please see [GitHub documentation for details about creating the personal access token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line/).
## ${docGenParameters}
## ${docGenConfiguration}
## Example
Usage of pipeline step:
@@ -27,52 +19,3 @@ Usage of pipeline step:
```groovy
githubPublishRelease script: this, releaseBodyHeader: "**This is the latest success!**<br />"
```
## Parameters
| parameter | mandatory | default | possible values |
| ----------|-----------|---------|-----------------|
|script|yes|||
|addClosedIssues|no|`false`||
|addDeltaToLastRelease|no|`false`||
|customFilterExtension|no|``||
|excludeLabels|no|<ul><li>`duplicate`</li><li>`invalid`</li><li>`question`</li><li>`wontfix`</li></ul>||
|githubApiUrl|no|`https://api.github.com`||
|githubOrg|yes|`script.commonPipelineEnvironment.getGitFolder()`||
|githubRepo|yes|`script.commonPipelineEnvironment.getGitRepo()`||
|githubServerUrl|no|`https://github.com`||
|githubTokenCredentialsId|yes|||
|releaseBodyHeader|no|||
|version|yes|`script.commonPipelineEnvironment.getArtifactVersion()`||
### Details
* `script` defines the global script environment of the Jenkinsfile run. Typically `this` is passed to this parameter. This allows the function to access the [`commonPipelineEnvironment`](commonPipelineEnvironment.md) for storing the measured duration.
* All GitHub related properties allow you to overwrite the default behavior of identifying, e.g., the GitHub organization and the GitHub repository.
* `version` defines the version number which will be written as the tag as well as the release name.
* By defining the `releaseBodyHeader` you can specify the content which will appear for the release
* If you set `addClosedIssues` to `true`, a list of all closed issues and merged pull-requests since the last release will be added below the `releaseBodyHeader`.
* If you set `addDeltaToLastRelease` to `true`, a link will be added to the release information that brings up all commits since the last release.
* By passing the parameter `customFilterExtension` it is possible to pass additional filter criteria for retrieving closed issues since the last release. Additional criteria could be for example specific `label`, or `filter` according to [GitHub API documentation](https://developer.github.com/v3/issues/).
* It is possible to exclude issues with dedicated labels using parameter `excludeLabels`. Usage is like `excludeLabels: ['label1', 'label2']`
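Putting the options above together, a sketch of a call that enriches the release body (header text and label names are illustrative):

```groovy
githubPublishRelease(
    script: this,
    releaseBodyHeader: '**Changes in this release**<br />',
    addClosedIssues: true,
    addDeltaToLastRelease: true,
    excludeLabels: ['duplicate', 'wontfix']
)
```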
## Step configuration
We recommend defining the values of step parameters via the [config.yml file](../configuration.md).
The configuration is possible in the following sections:
| parameter | general | step | stage |
| ----------|-----------|---------|-----------------|
|script||||
|addClosedIssues||X|X|
|addDeltaToLastRelease||X|X|
|customFilterExtension||X|X|
|excludeLabels||X|X|
|githubApiUrl|X|X|X|
|githubOrg||X|X|
|githubRepo||X|X|
|githubServerUrl|X|X|X|
|githubTokenCredentialsId|X|X|X|
|releaseBodyHeader||X|X|
|version||X|X|

View File

@@ -85,10 +85,10 @@ influxDBServer=jenkins
| ----------|-----------|---------|-----------------|
|script|yes|||
|artifactVersion|no|`commonPipelineEnvironment.getArtifactVersion()`||
|customData|no|`commonPipelineEnvironment.getInfluxCustomData()`||
|customDataMap|no|`commonPipelineEnvironment.getInfluxCustomDataMap()`||
|customDataMapTags|no|`commonPipelineEnvironment.getInfluxCustomDataTags()`||
|customDataTags|no|`commonPipelineEnvironment.getInfluxCustomDataTags()`||
|customData|no|`InfluxData.getInstance().getFields().jenkins_custom_data`||
|customDataMap|no|`InfluxData.getInstance().getFields()`||
|customDataMapTags|no|`InfluxData.getInstance().getTags()`||
|customDataTags|no|`InfluxData.getInstance().getTags().jenkins_custom_data`||
|influxPrefix|no|||
|influxServer|no|`''`||
|wrapInNode|no|`false`||
@@ -144,7 +144,7 @@ As a first step you need to add your InfluxDB as Data source to your Grafana:
The Influx plugin collects following data in the Piper context:
- All data as per default [InfluxDB plugin capabilities](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin)
- Additional data collected via `commonPipelineEnvironment.setInfluxCustomDataProperty()` and via `commonPipelineEnvironment.setPipelineMeasurement()`
- Additional data collected via `InfluxData.addField(measurement, key, value)`
!!! note "Add custom information to your InfluxDB"
You can simply add custom data collected during your pipeline runs via available data objects.
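For example, a custom field and tag could be recorded via the new `InfluxData` facade shown above (key and value names are illustrative):

```groovy
import com.sap.piper.analytics.InfluxData

// ends up as a field/tag of the 'jenkins_custom_data' measurement
InfluxData.addField('jenkins_custom_data', 'myKey', 'myValue')
InfluxData.addTag('jenkins_custom_data', 'myTag', 'myTagValue')
```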
@@ -169,7 +169,7 @@ Measurements are potentially pre-fixed - see parameter `influxPrefix` above.
| sonarqube_data | <ul><li>blocker_issues</li><li>critical_issues</li><li>info_issues</li><li>major_issues</li><li>minor_issues</li><li>lines_of_code</li><li>...</li></ul> | Details see [InfluxDB plugin documentation](https://wiki.jenkins.io/display/JENKINS/InfluxDB+Plugin) |
| jenkins_custom_data | Piper fills the following columns by default: <br /><ul><li>build_result</li><li>build_result_key</li><li>build_step (->step in case of error)</li><li>build_error (->error message in case of error)</li></ul> | filled by `commonPipelineEnvironment.setInfluxCustomDataProperty()` |
| pipeline_data | Examples from the Piper templates:<br /><ul><li>build_duration</li><li>opa_duration</li><li>deploy_test_duration</li><li>deploy_test_duration</li><li>fortify_duration</li><li>release_duration</li><li>...</li></ul>| filled by step [`measureDuration`](durationMeasure.md) using parameter `measurementName`|
| step_data | Considered, e.g.:<br /><ul><li>build_url</li><li>bats</li><li>checkmarx</li><li>fortify</li><li>gauge</li><li>nsp</li><li>snyk</li><li>sonar</li><li>...</li></ul>| filled by `commonPipelineEnvironment.setInfluxStepData()` |
| step_data | Considered, e.g.:<br /><ul><li>build_url</li><li>bats</li><li>checkmarx</li><li>fortify</li><li>gauge</li><li>nsp</li><li>snyk</li><li>sonar</li><li>...</li></ul>| filled by `InfluxData.addField('step_data', key, value)` |
### Examples for InfluxDB queries which can be used in Grafana

View File

@@ -0,0 +1,41 @@
package com.sap.piper.analytics
import com.cloudbees.groovy.cps.NonCPS
class InfluxData implements Serializable{
// each Map in 'fields' represents a measurement in Influx.
// Additional measurements can be added as a new Map entry of 'fields'
protected Map fields = [jenkins_custom_data: [:], pipeline_data: [:], step_data: [:]]
// each Map in 'tags' represents the tags for a certain measurement in Influx.
// Tags are required in Influx for easier querying of data
protected Map tags = [jenkins_custom_data: [:], pipeline_data: [:], step_data: [:]]
public Map getFields(){ return fields }
public Map getTags(){ return tags }
protected static InfluxData instance
@NonCPS
public static InfluxData getInstance(){
if(!instance) instance = new InfluxData()
return instance
}
public static void addField(String measurement, String key, value) {
add(getInstance().getFields(), measurement, key, value)
}
public static void addTag(String measurement, String key, value) {
add(getInstance().getTags(), measurement, key, value)
}
protected static void add(Map dataMap, String measurement, String field, value) {
if (!dataMap[measurement]) dataMap[measurement] = [:]
dataMap[measurement][field] = value
}
public static void reset(){
instance = null
}
}
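The singleton is accessed through the static helpers; a short sketch of the intended use (measurement keys and values beyond the predefined ones are illustrative):

```groovy
import com.sap.piper.analytics.InfluxData

// record a field and a tag for the predefined 'step_data' measurement
InfluxData.addField('step_data', 'bats', true)
InfluxData.addTag('step_data', 'result', 'success')

assert InfluxData.getInstance().getFields().step_data.bats == true

// drop the instance, e.g. between test runs
InfluxData.reset()
```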

View File

@@ -96,7 +96,7 @@ class CloudFoundryDeployTest extends BasePiperTest {
stageName: 'acceptance',
])
// asserts
assertThat(loggingRule.log, containsString('[cloudFoundryDeploy] General parameters: deployTool=, deployType=standard, cfApiEndpoint=https://api.cf.eu10.hana.ondemand.com, cfOrg=testOrg, cfSpace=testSpace, cfCredentialsId=myCreds, deployUser=testUser'))
assertThat(loggingRule.log, containsString('[cloudFoundryDeploy] General parameters: deployTool=, deployType=standard, cfApiEndpoint=https://api.cf.eu10.hana.ondemand.com, cfOrg=testOrg, cfSpace=testSpace, cfCredentialsId=myCreds'))
}
@Test
@@ -125,7 +125,7 @@ class CloudFoundryDeployTest extends BasePiperTest {
stageName: 'acceptance'
])
// asserts
assertThat(loggingRule.log, containsString('[cloudFoundryDeploy] General parameters: deployTool=notAvailable, deployType=standard, cfApiEndpoint=https://api.cf.eu10.hana.ondemand.com, cfOrg=testOrg, cfSpace=testSpace, cfCredentialsId=myCreds, deployUser=testUser'))
assertThat(loggingRule.log, containsString('[cloudFoundryDeploy] General parameters: deployTool=notAvailable, deployType=standard, cfApiEndpoint=https://api.cf.eu10.hana.ondemand.com, cfOrg=testOrg, cfSpace=testSpace, cfCredentialsId=myCreds'))
}
@Test

View File

@@ -1,10 +1,16 @@
#!groovy
import com.sap.piper.analytics.InfluxData
import org.junit.Rule
import org.junit.Test
import util.BasePiperTest
import static org.junit.Assert.assertTrue
import static org.hamcrest.Matchers.hasKey
import static org.hamcrest.Matchers.is
import static org.hamcrest.Matchers.not
import static org.junit.Assert.assertThat
import org.junit.rules.RuleChain
import util.Rules
@@ -27,8 +33,12 @@ class DurationMeasureTest extends BasePiperTest {
stepRule.step.durationMeasure(script: nullScript, measurementName: 'test') {
bodyExecuted = true
}
assertTrue(nullScript.commonPipelineEnvironment.getPipelineMeasurement('test') != null)
assertTrue(bodyExecuted)
// doesn't work:
//assertThat(InfluxData.getInstance().getFields(), hasEntry('pipeline_data', hasEntry('test', is(anything()))))
assertThat(InfluxData.getInstance().getFields(), hasKey('pipeline_data'))
assertThat(InfluxData.getInstance().getFields().pipeline_data, hasKey('test'))
assertThat(InfluxData.getInstance().getFields().pipeline_data.test, is(not(null)))
assertThat(bodyExecuted, is(true))
assertJobStatusSuccess()
}
}

View File

@@ -1,5 +1,7 @@
#!groovy
import com.sap.piper.DefaultValueCache
import com.sap.piper.analytics.InfluxData
import org.junit.Before
import org.junit.Rule
import org.junit.Test
@@ -147,9 +149,9 @@ class InfluxWriteDataTest extends BasePiperTest {
void testInfluxCustomDataFromCPE() {
nullScript.commonPipelineEnvironment.reset()
nullScript.commonPipelineEnvironment.setArtifactVersion('1.2.3')
nullScript.commonPipelineEnvironment.setInfluxCustomDataTagsEntry('tag1', 'testTag1')
nullScript.commonPipelineEnvironment.setInfluxCustomDataMapEntry('test_data', 'key1', 'keyValue1')
nullScript.commonPipelineEnvironment.setInfluxCustomDataMapTagsEntry('test_data', 'tag1', 'tagValue1')
InfluxData.addTag('jenkins_custom_data', 'tag1', 'testTag1')
InfluxData.addField('test_data', 'key1', 'keyValue1')
InfluxData.addTag('test_data', 'tag1', 'tagValue1')
stepRule.step.influxWriteData(
//juStabUtils: utils,
script: nullScript,

View File

@@ -0,0 +1,104 @@
package com.sap.piper.analytics
import org.junit.Rule
import org.junit.Before
import org.junit.Test
import static org.junit.Assert.assertThat
import static org.junit.Assume.assumeThat
import org.junit.rules.ExpectedException
import org.junit.rules.RuleChain
import static org.hamcrest.Matchers.containsString
import static org.hamcrest.Matchers.hasItem
import static org.hamcrest.Matchers.is
import static org.hamcrest.Matchers.not
import static org.hamcrest.Matchers.empty
import static org.hamcrest.Matchers.hasKey
import static org.hamcrest.Matchers.allOf
import static org.hamcrest.Matchers.hasEntry
import util.JenkinsLoggingRule
import util.JenkinsShellCallRule
import util.BasePiperTest
import util.Rules
class InfluxDataTest extends BasePiperTest {
private ExpectedException thrown = ExpectedException.none()
private JenkinsLoggingRule jlr = new JenkinsLoggingRule(this)
private JenkinsShellCallRule jscr = new JenkinsShellCallRule(this)
@Rule
public RuleChain rules = Rules
.getCommonRules(this)
.around(thrown)
.around(jscr)
.around(jlr)
@Before
void setup() {
InfluxData.instance = null
}
@Test
void testCreateInstance() {
InfluxData.getInstance()
// asserts
assertThat(InfluxData.instance.fields, allOf(
is(not(null)),
hasKey('jenkins_custom_data'),
hasKey('pipeline_data'),
hasKey('step_data')
))
assertThat(InfluxData.instance.fields.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.fields.pipeline_data, is([:]))
assertThat(InfluxData.instance.fields.step_data, is([:]))
assertThat(InfluxData.instance.tags, allOf(
is(not(null)),
hasKey('jenkins_custom_data'),
hasKey('pipeline_data'),
hasKey('step_data')
))
assertThat(InfluxData.instance.tags.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.tags.pipeline_data, is([:]))
assertThat(InfluxData.instance.tags.step_data, is([:]))
}
@Test
void testAddToDefaultMeasurement() {
InfluxData.addField('step_data', 'anyKey', 'anyValue')
InfluxData.addTag('step_data', 'anyKey', 'anyTag')
// asserts
assertThat(InfluxData.instance.fields.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.fields.pipeline_data, is([:]))
assertThat(InfluxData.instance.fields.step_data, is(['anyKey': 'anyValue']))
assertThat(InfluxData.instance.tags.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.tags.pipeline_data, is([:]))
assertThat(InfluxData.instance.tags.step_data, is(['anyKey': 'anyTag']))
}
@Test
void testAddToNewMeasurement() {
InfluxData.addField('new_measurement_data', 'anyKey', 'anyValue')
InfluxData.addTag('new_measurement_data', 'anyKey', 'anyTag')
// asserts
assertThat(InfluxData.instance.fields.new_measurement_data, is(['anyKey': 'anyValue']))
assertThat(InfluxData.instance.fields.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.fields.pipeline_data, is([:]))
assertThat(InfluxData.instance.fields.step_data, is([:]))
assertThat(InfluxData.instance.tags.new_measurement_data, is(['anyKey': 'anyTag']))
assertThat(InfluxData.instance.tags.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.tags.pipeline_data, is([:]))
assertThat(InfluxData.instance.tags.step_data, is([:]))
}
@Test
void testResetInstance() {
InfluxData.addField('step_data', 'anyKey', 'anyValue')
assumeThat(InfluxData.instance.fields.step_data, is(['anyKey': 'anyValue']))
InfluxData.reset()
// asserts
assertThat(InfluxData.instance.fields.jenkins_custom_data, is([:]))
assertThat(InfluxData.instance.fields.pipeline_data, is([:]))
assertThat(InfluxData.instance.fields.step_data, is([:]))
}
}

View File

@@ -0,0 +1,28 @@
package util
import org.junit.rules.TestRule
import org.junit.runner.Description
import org.junit.runners.model.Statement
import com.lesfurets.jenkins.unit.BasePipelineTest
import com.sap.piper.analytics.InfluxData
class JenkinsInfluxDataRule implements TestRule {
JenkinsInfluxDataRule() { this(null) }
// Actually not needed. Only provided for the sake of consistency
// with our other rules, which come with a constructor taking the
// test case in the signature.
JenkinsInfluxDataRule(BasePipelineTest testInstance) {}
@Override
Statement apply(Statement base, Description description) {
return new Statement() {
@Override
void evaluate() throws Throwable {
InfluxData.reset()
base.evaluate()
}
}
}
}

View File

@@ -14,6 +14,7 @@ public class Rules {
public static RuleChain getCommonRules(BasePipelineTest testCase, LibraryConfiguration libConfig) {
return RuleChain.outerRule(new JenkinsSetupRule(testCase, libConfig))
.around(new JenkinsResetDefaultCacheRule())
.around(new JenkinsInfluxDataRule())
.around(new JenkinsErrorRule(testCase))
.around(new JenkinsEnvironmentRule(testCase))
}

View File

@@ -1,5 +1,6 @@
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.GitUtils
import com.sap.piper.Utils
@@ -14,23 +15,87 @@ import groovy.text.SimpleTemplateEngine
@Field Set GENERAL_CONFIG_KEYS = STEP_CONFIG_KEYS
@Field Set STEP_CONFIG_KEYS = [
/**
* Defines the type of the artifact.
* @possibleValues `appContainer`
*/
'artifactType',
/**
* Defines the tool which is used for building the artifact.
* @possibleValues docker, dlang, golang, maven, mta, npm, pip, sbt
*/
'buildTool',
/**
* Controls if the changed version is committed and pushed to the git repository.
* If this is enabled (which is the default), you need to provide `gitCredentialsId` and `gitSshUrl`.
* @possibleValues `true`, `false`
*/
'commitVersion',
/**
* Specifies the source to be used for the main version which is used for generating the automatic version.
* * This can either be the version of the base image - as retrieved from the `FROM` statement within the Dockerfile, e.g. `FROM jenkins:2.46.2`
* * Alternatively, the name of an environment variable defined in the Docker image which contains the version number can be used, e.g. `ENV MY_VERSION 1.2.3`
* * The third option `appVersion` applies only to the artifactType `appContainer`. Here the version of the app which is packaged into the container will be used as version for the container itself.
* @possibleValues FROM, (ENV name), appVersion
*/
'dockerVersionSource',
/**
* Defines a custom path to the descriptor file.
*/
'filePath',
/**
* Defines the ssh git credentials to be used for writing the tag.
*/
'gitSshKeyCredentialsId',
/**
* Allows to overwrite the global git setting 'user.email' available on your Jenkins server.
*/
'gitUserEMail',
/**
* Allows to overwrite the global git setting 'user.name' available on your Jenkins server.
*/
'gitUserName',
/**
* Defines the git ssh url to the source code repository.
*/
'gitSshUrl',
/**
* Defines the prefix which is used for the git tag which is written during the versioning run.
*/
'tagPrefix',
/**
* Defines the timestamp to be used in the automatic version string. You could overwrite the default behavior by explicitly setting this string.
*/
'timestamp',
/** Defines the template for the timestamp which will be part of the created version. */
'timestampTemplate',
/** Defines the template for the automatic version which will be created. */
'versioningTemplate'
]
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS.plus('gitCommitId')
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS.plus(
/**
* Defines the version prefix of the automatically generated version. By default it will take the long commitId hash.
* You could pass any other string (e.g. the short commitId hash) to be used. In case you don't want to have the gitCommitId added to the automatic versioning string you could set the value to an empty string: `''`.
*/
'gitCommitId'
)
/**
* The continuous delivery process requires that each build is done with a unique version number.
*
* The version generated using this step will contain:
*
* * Version (major.minor.patch) from the descriptor file in the master repository is preserved. Developers should be able to autonomously decide on increasing either part of this version number.
* * Timestamp
* * CommitId (by default the long version of the hash)
*
* Optionally, but enabled by default, the new version is pushed as a new tag into the source code repository (e.g. GitHub).
* If this option is chosen, git credentials and the repository URL need to be provided.
* Since you might not want to configure the git credentials in Jenkins, committing and pushing can be disabled using the `commitVersion` parameter as described below.
* If you require strict reproducibility of your builds, this option should be used.
*/
@GenerateDocumentation
void call(Map parameters = [:], Closure body = null) {
handlePipelineStepErrors (stepName: STEP_NAME, stepParameters: parameters) {

View File

@@ -4,6 +4,7 @@ import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.GitUtils
import com.sap.piper.Utils
import com.sap.piper.analytics.InfluxData
import groovy.text.SimpleTemplateEngine
import groovy.transform.Field
@@ -72,7 +73,7 @@ void call(Map parameters = [:]) {
stepParam1: parameters?.script == null
], config)
script.commonPipelineEnvironment.setInfluxStepData('bats', false)
InfluxData.addField('step_data', 'bats', false)
config.stashContent = config.testRepository
?[GitUtils.handleTestRepository(this, config)]
@@ -89,7 +90,7 @@ void call(Map parameters = [:]) {
sh "git clone ${config.repository}"
try {
sh "bats-core/bin/bats --recursive --tap ${config.testPath} > 'TEST-${config.testPackage}.tap'"
script.commonPipelineEnvironment.setInfluxStepData('bats', true)
InfluxData.addField('step_data', 'bats', true)
} catch (err) {
echo "[${STEP_NAME}] One or more tests failed"
if (config.failOnError) throw err

View File

@@ -2,6 +2,7 @@ import com.sap.piper.JenkinsUtils
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.Utils
import com.sap.piper.ConfigurationHelper
import com.sap.piper.CfManifestUtils
@@ -14,22 +15,109 @@ import groovy.transform.Field
@Field Set STEP_CONFIG_KEYS = [
'cloudFoundry',
'deployUser',
/**
* Cloud Foundry API endpoint.
* @parentConfigKey cloudFoundry
*/
'apiEndpoint',
/**
* Defines the name of the application to be deployed to the Cloud Foundry space.
* @parentConfigKey cloudFoundry
*/
'appName',
/**
* Credentials to be used for deployment.
* @parentConfigKey cloudFoundry
*/
'credentialsId',
/**
* Defines the manifest to be used for deployment to Cloud Foundry.
* @parentConfigKey cloudFoundry
*/
'manifest',
/**
* Cloud Foundry target organization.
* @parentConfigKey cloudFoundry
*/
'org',
/**
* Cloud Foundry target space.
* @parentConfigKey cloudFoundry
*/
'space',
/**
* Defines the tool which should be used for deployment.
* @possibleValues 'cf_native', 'mtaDeployPlugin'
*/
'deployTool',
/**
* Defines the type of deployment: either a `standard` deployment, which results in system downtime, or a zero-downtime `blue-green` deployment.
* @possibleValues 'standard', 'blue-green'
*/
'deployType',
/**
* In case of a `blue-green` deployment the old instance will be deleted by default. If this option is set to `true`, the old instance will remain stopped in the Cloud Foundry space.
* @possibleValues true, false
*/
'keepOldInstance',
/** @see dockerExecute */
'dockerImage',
/** @see dockerExecute */
'dockerWorkspace',
/** @see dockerExecute */
'stashContent',
/**
* Defines additional parameters passed to mta for deployment with the mtaDeployPlugin.
*/
'mtaDeployParameters',
/**
* Defines additional extension descriptor file for deployment with the mtaDeployPlugin.
*/
'mtaExtensionDescriptor',
/**
* Defines the path to *.mtar for deployment with the mtaDeployPlugin.
*/
'mtaPath',
/**
* Allows to specify a script which performs a check during blue-green deployment. The script gets the FQDN as parameter and returns `exit code 0` in case the check returned `smokeTestStatusCode`.
* More details can be found [here](https://github.com/bluemixgaragelondon/cf-blue-green-deploy#how-to-use). <br /> Currently this option is only considered for deployTool `cf_native`.
*/
'smokeTestScript',
'smokeTestStatusCode',
'stashContent']
/**
* Expected status code returned by the check.
*/
'smokeTestStatusCode'
]
@Field Map CONFIG_KEY_COMPATIBILITY = [cloudFoundry: [apiEndpoint: 'cfApiEndpoint', appName:'cfAppName', credentialsId: 'cfCredentialsId', manifest: 'cfManifest', org: 'cfOrg', space: 'cfSpace']]
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS
/**
* Deploys an application to a test or production space within Cloud Foundry.
* Deployment can be done
*
* * in a standard way
* * in a zero downtime manner (using a [blue-green deployment approach](https://martinfowler.com/bliki/BlueGreenDeployment.html))
*
* !!! note "Deployment supports multiple deployment tools"
* Currently the following are supported:
*
* * Standard `cf push` and [Bluemix blue-green plugin](https://github.com/bluemixgaragelondon/cf-blue-green-deploy#how-to-use)
* * [MTA CF CLI Plugin](https://github.com/cloudfoundry-incubator/multiapps-cli-plugin)
*
* !!! note
* Due to [an incompatible change](https://github.com/cloudfoundry/cli/issues/1445) in the Cloud Foundry CLI, multiple buildpacks are not supported by this step.
* If your `application` contains a list of `buildpacks` instead of a single `buildpack`, this will be automatically re-written by the step when blue-green deployment is used.
*
* !!! note
* Cloud Foundry supports the deployment of multiple applications using a single manifest file.
* This option is supported with Piper.
*
* In this case define `appName: ''` since the app names for the individual applications have to be defined via the manifest.
* You can find details in the [Cloud Foundry Documentation](https://docs.cloudfoundry.org/devguide/deploy-apps/manifest.html#multi-apps).
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
handlePipelineStepErrors (stepName: STEP_NAME, stepParameters: parameters) {
@@ -62,7 +150,7 @@ void call(Map parameters = [:]) {
stepParam3: parameters?.script == null
], config)
echo "[${STEP_NAME}] General parameters: deployTool=${config.deployTool}, deployType=${config.deployType}, cfApiEndpoint=${config.cloudFoundry.apiEndpoint}, cfOrg=${config.cloudFoundry.org}, cfSpace=${config.cloudFoundry.space}, cfCredentialsId=${config.cloudFoundry.credentialsId}, deployUser=${config.deployUser}"
echo "[${STEP_NAME}] General parameters: deployTool=${config.deployTool}, deployType=${config.deployType}, cfApiEndpoint=${config.cloudFoundry.apiEndpoint}, cfOrg=${config.cloudFoundry.org}, cfSpace=${config.cloudFoundry.space}, cfCredentialsId=${config.cloudFoundry.credentialsId}"
//make sure that all relevant descriptors are available in the workspace
utils.unstashAll(config.stashContent)
@@ -163,7 +251,7 @@ def deployCfNative (config) {
}
}
sh """#!/bin/bash
def returnCode = sh returnStatus: true, script: """#!/bin/bash
set +x
set -e
export HOME=${config.dockerWorkspace}
@@ -171,6 +259,9 @@ def deployCfNative (config) {
cf plugins
cf ${deployCommand} ${config.cloudFoundry.appName ?: ''} ${blueGreenDeployOptions} -f '${config.cloudFoundry.manifest}' ${config.smokeTest}
"""
if(returnCode != 0){
error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
stopOldAppIfRunning(config)
sh "cf logout"
}
@@ -228,7 +319,7 @@ def deployMta (config) {
usernameVariable: 'username'
)]) {
echo "[${STEP_NAME}] Deploying MTA (${config.mtaPath}) with following parameters: ${config.mtaExtensionDescriptor} ${config.mtaDeployParameters}"
sh """#!/bin/bash
def returnCode = sh returnStatus: true, script: """#!/bin/bash
export HOME=${config.dockerWorkspace}
set +x
set -e
@@ -236,6 +327,9 @@ def deployMta (config) {
cf login -u ${username} -p '${password}' -a ${config.cloudFoundry.apiEndpoint} -o \"${config.cloudFoundry.org}\" -s \"${config.cloudFoundry.space}\"
cf plugins
cf ${deployCommand} ${config.mtaPath} ${config.mtaDeployParameters} ${config.mtaExtensionDescriptor}"""
if(returnCode != 0){
error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
sh "cf logout"
}
}
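Taken together, the options documented above allow a zero-downtime deployment to be requested from a Jenkinsfile roughly as follows — the endpoint, org, space, and credentials id are placeholders:

```groovy
// Jenkinsfile sketch (placeholder values)
cloudFoundryDeploy(
    script: this,
    deployTool: 'cf_native',
    deployType: 'blue-green',
    keepOldInstance: true,          // keep the old instance stopped instead of deleting it
    cloudFoundry: [
        apiEndpoint: 'https://api.cf.example.com',
        org: 'myOrg',
        space: 'mySpace',
        credentialsId: 'CF_CREDENTIALS',
        manifest: 'manifest.yml'
    ]
)
```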

@@ -1,5 +1,6 @@
import com.sap.piper.ConfigurationLoader
import com.sap.piper.ConfigurationMerger
import com.sap.piper.analytics.InfluxData
class commonPipelineEnvironment implements Serializable {
@@ -25,15 +26,6 @@ class commonPipelineEnvironment implements Serializable {
Map configuration = [:]
Map defaultConfiguration = [:]
//each Map in influxCustomDataMap represents a measurement in Influx. Additional measurements can be added as a new Map entry of influxCustomDataMap
private Map influxCustomDataMap = [pipeline_data: [:], step_data: [:]]
//each Map in influxCustomDataMapTags represents tags for certain measurement in Influx. Tags are required in Influx for easier querying data
private Map influxCustomDataMapTags = [pipeline_data: [:]]
//influxCustomData represents measurement jenkins_custom_data in Influx. Metrics can be written into this map
private Map influxCustomData = [:]
//influxCustomDataTags represents tags in Influx. Tags are required in Influx for easier querying data
private Map influxCustomDataTags = [:]
String mtarFilePath
private Map valueMap = [:]
@@ -61,15 +53,12 @@ class commonPipelineEnvironment implements Serializable {
githubOrg = null
githubRepo = null
influxCustomData = [:]
influxCustomDataTags = [:]
influxCustomDataMap = [pipeline_data: [:], step_data: [:]]
influxCustomDataMapTags = [pipeline_data: [:]]
mtarFilePath = null
valueMap = [:]
changeDocumentId = null
InfluxData.reset()
}
def setAppContainerProperty(property, value) {
@@ -80,61 +69,62 @@ class commonPipelineEnvironment implements Serializable {
return appContainerProperties[property]
}
// goes into measurement jenkins_data
def setInfluxCustomDataEntry(field, value) {
influxCustomData[field] = value
// goes into measurement jenkins_custom_data
def setInfluxCustomDataEntry(key, value) {
InfluxData.addField('jenkins_custom_data', key, value)
}
// goes into measurement jenkins_data
// goes into measurement jenkins_custom_data
@Deprecated // not used in library
def getInfluxCustomData() {
return influxCustomData
return InfluxData.getInstance().getFields().jenkins_custom_data
}
// goes into measurement jenkins_data
def setInfluxCustomDataTagsEntry(tag, value) {
influxCustomDataTags[tag] = value
// goes into measurement jenkins_custom_data
def setInfluxCustomDataTagsEntry(key, value) {
InfluxData.addTag('jenkins_custom_data', key, value)
}
// goes into measurement jenkins_data
// goes into measurement jenkins_custom_data
@Deprecated // not used in library
def getInfluxCustomDataTags() {
return influxCustomDataTags
return InfluxData.getInstance().getTags().jenkins_custom_data
}
void setInfluxCustomDataMapEntry(measurement, field, value) {
if (!influxCustomDataMap[measurement]) {
influxCustomDataMap[measurement] = [:]
}
influxCustomDataMap[measurement][field] = value
InfluxData.addField(measurement, field, value)
}
@Deprecated // not used in library
def getInfluxCustomDataMap() {
return influxCustomDataMap
return InfluxData.getInstance().getFields()
}
def setInfluxCustomDataMapTagsEntry(measurement, tag, value) {
if (!influxCustomDataMapTags[measurement]) {
influxCustomDataMapTags[measurement] = [:]
}
influxCustomDataMapTags[measurement][tag] = value
InfluxData.addTag(measurement, tag, value)
}
@Deprecated // not used in library
def getInfluxCustomDataMapTags() {
return influxCustomDataMapTags
return InfluxData.getInstance().getTags()
}
@Deprecated // not used in library
def setInfluxStepData(key, value) {
setInfluxCustomDataMapEntry('step_data', key, value)
InfluxData.addField('step_data', key, value)
}
@Deprecated // not used in library
def getInfluxStepData(key) {
return influxCustomDataMap.step_data[key]
return InfluxData.getInstance().getFields()['step_data'][key]
}
@Deprecated // not used in library
def setInfluxPipelineData(key, value) {
setInfluxCustomDataMapEntry('pipeline_data', key, value)
InfluxData.addField('pipeline_data', key, value)
}
@Deprecated // not used in library
def setPipelineMeasurement(key, value){
setInfluxPipelineData(key, value)
}
@Deprecated // not used in library
def getPipelineMeasurement(key) {
return influxCustomDataMap.pipeline_data[key]
return InfluxData.getInstance().getFields()['pipeline_data'][key]
}
Map getStepConfiguration(stepName, stageName = env.STAGE_NAME, includeDefaults = true) {
@@ -149,5 +139,4 @@ class commonPipelineEnvironment implements Serializable {
config = ConfigurationMerger.merge(configuration.get('stages')?.get(stageName) ?: [:], null, config)
return config
}
}
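The net effect of this refactoring is that step code can write metrics directly through the `InfluxData` singleton while the deprecated `commonPipelineEnvironment` setters keep working by delegation. A minimal sketch — the field name is illustrative:

```groovy
import com.sap.piper.analytics.InfluxData

// deprecated route, still functional via delegation:
script.commonPipelineEnvironment.setInfluxStepData('myStep', true)

// equivalent direct route after this change:
InfluxData.addField('step_data', 'myStep', true)
```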

@@ -1,5 +1,7 @@
import com.sap.piper.GenerateDocumentation
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.analytics.InfluxData
import groovy.transform.Field
@Field STEP_NAME = getClass().getName()
@@ -36,8 +38,7 @@ def call(Map parameters = [:], body) {
//record measurement
def duration = System.currentTimeMillis() - start
if (script != null)
script.commonPipelineEnvironment.setPipelineMeasurement(measurementName, duration)
InfluxData.addField('pipeline_data', measurementName, duration)
return duration
}

@@ -4,6 +4,7 @@ import com.sap.piper.GenerateDocumentation
import com.sap.piper.Utils
import com.sap.piper.ConfigurationHelper
import com.sap.piper.GitUtils
import com.sap.piper.analytics.InfluxData
import groovy.text.SimpleTemplateEngine
import groovy.transform.Field
@@ -86,7 +87,7 @@ void call(Map parameters = [:]) {
def script = checkScript(this, parameters) ?: this
def utils = parameters.juStabUtils ?: new Utils()
script.commonPipelineEnvironment.setInfluxStepData('gauge', false)
InfluxData.addField('step_data', 'gauge', false)
// load default & individual configuration
Map config = ConfigurationHelper.newInstance(this)
@@ -146,7 +147,7 @@ void call(Map parameters = [:]) {
try {
sh "${gaugeScript} ${config.testOptions}"
script.commonPipelineEnvironment.setInfluxStepData('gauge', true)
InfluxData.addField('step_data', 'gauge', true)
} catch (err) {
echo "[${STEP_NAME}] One or more tests failed"
script.currentBuild.result = 'UNSTABLE'

@@ -1,27 +1,66 @@
import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.Utils
import com.sap.piper.ConfigurationHelper
import groovy.transform.Field
@Field String STEP_NAME = getClass().getName()
@Field Set GENERAL_CONFIG_KEYS = ['githubApiUrl', 'githubTokenCredentialsId', 'githubServerUrl']
@Field Set STEP_CONFIG_KEYS = [
'addClosedIssues',
'addDeltaToLastRelease',
'customFilterExtension',
'excludeLabels',
@Field Set GENERAL_CONFIG_KEYS = [
/** Allows to overwrite the GitHub API url.*/
'githubApiUrl',
/**
* Allows to overwrite the GitHub token credentials id.
* @possibleValues Jenkins credential id
*/
'githubTokenCredentialsId',
'githubOrg',
'githubRepo',
'githubServerUrl',
'releaseBodyHeader',
'version'
/** Allows to overwrite the GitHub url.*/
'githubServerUrl'
]
@Field Set STEP_CONFIG_KEYS = GENERAL_CONFIG_KEYS.plus([
/**
* If it is set to `true`, a list of all closed issues and merged pull requests since the last release will be added below the `releaseBodyHeader`.
* @possibleValues `true`, `false`
*/
'addClosedIssues',
/**
* If you set `addDeltaToLastRelease` to `true`, a link will be added to the release information that brings up all commits since the last release.
* @possibleValues `true`, `false`
*/
'addDeltaToLastRelease',
/** Allows to pass additional filter criteria for retrieving closed issues since the last release. Additional criteria could be for example specific `label`, or `filter` according to [GitHub API documentation](https://developer.github.com/v3/issues/).*/
'customFilterExtension',
/** Allows to exclude issues with dedicated labels. Usage is like `excludeLabels: ['label1', 'label2']`.*/
'excludeLabels',
/** Allows to overwrite the GitHub organization.*/
'githubOrg',
/** Allows to overwrite the GitHub repository.*/
'githubRepo',
/** Allows to specify the content which will appear for the release.*/
'releaseBodyHeader',
/** Defines the version number which will be written as tag as well as release name.*/
'version'
])
@Field Set PARAMETER_KEYS = STEP_CONFIG_KEYS
/**
* This step creates a tag in your GitHub repository together with a release.
*
* The release can be filled with text plus additional information like:
*
* * Closed pull requests since the last release
* * Closed issues since the last release
* * Link to delta information showing all commits since the last release
*
* The result looks like this:
*
* ![Example release](../images/githubRelease.png)
*/
@GenerateDocumentation
void call(Map parameters = [:]) {
handlePipelineStepErrors(stepName: STEP_NAME, stepParameters: parameters) {
def script = checkScript(this, parameters) ?: this
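A sketch of invoking the step with the options documented above — the version and label values are illustrative:

```groovy
// Jenkinsfile sketch (illustrative values)
githubPublishRelease(
    script: this,
    version: '1.2.3',
    addClosedIssues: true,
    addDeltaToLastRelease: true,
    excludeLabels: ['duplicate', 'wontfix']
)
```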

@@ -2,6 +2,7 @@ import com.cloudbees.groovy.cps.NonCPS
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.analytics.InfluxData
import groovy.text.SimpleTemplateEngine
import groovy.transform.Field
@@ -115,11 +116,9 @@ private String formatErrorMessage(Map config, error){
}
private void writeErrorToInfluxData(Map config, error){
def script = config?.stepParameters?.script
if(script && script.commonPipelineEnvironment?.getInfluxCustomDataMapTags().build_error_message == null){
script.commonPipelineEnvironment?.setInfluxCustomDataMapTagsEntry('pipeline_data', 'build_error_step', config.stepName)
script.commonPipelineEnvironment?.setInfluxCustomDataMapTagsEntry('pipeline_data', 'build_error_stage', script.env?.STAGE_NAME)
script.commonPipelineEnvironment?.setInfluxCustomDataMapEntry('pipeline_data', 'build_error_message', error.getMessage())
if(InfluxData.getInstance().getFields().pipeline_data?.build_error_message == null){
InfluxData.addTag('pipeline_data', 'build_error_step', config.stepName)
InfluxData.addTag('pipeline_data', 'build_error_stage', config.stepParameters.script?.env?.STAGE_NAME)
InfluxData.addField('pipeline_data', 'build_error_message', error.getMessage())
}
}

@@ -3,6 +3,7 @@ import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.ConfigurationHelper
import com.sap.piper.JsonUtils
import com.sap.piper.Utils
import com.sap.piper.analytics.InfluxData
import groovy.transform.Field
@@ -41,10 +42,10 @@ void call(Map parameters = [:]) {
: null
])
.mixin(parameters, PARAMETER_KEYS)
.addIfNull('customData', script.commonPipelineEnvironment.getInfluxCustomData())
.addIfNull('customDataTags', script.commonPipelineEnvironment.getInfluxCustomDataTags())
.addIfNull('customDataMap', script.commonPipelineEnvironment.getInfluxCustomDataMap())
.addIfNull('customDataMapTags', script.commonPipelineEnvironment.getInfluxCustomDataMapTags())
.addIfNull('customData', InfluxData.getInstance().getFields().jenkins_custom_data)
.addIfNull('customDataTags', InfluxData.getInstance().getTags().jenkins_custom_data)
.addIfNull('customDataMap', InfluxData.getInstance().getFields().findAll({ it.key != 'jenkins_custom_data' }))
.addIfNull('customDataMapTags', InfluxData.getInstance().getTags().findAll({ it.key != 'jenkins_custom_data' }))
.use()
new Utils().pushToSWA([

@@ -1,5 +1,6 @@
import com.sap.piper.ConfigurationHelper
import com.sap.piper.Utils
import com.sap.piper.StepAssertions
import com.sap.piper.tools.neo.DeployMode
import com.sap.piper.tools.neo.NeoCommandHelper
import com.sap.piper.tools.neo.WarAction
@@ -88,6 +89,9 @@ void call(parameters = [:]) {
dockerEnvVars: configuration.dockerEnvVars,
dockerOptions: configuration.dockerOptions
) {
StepAssertions.assertFileExists(this, configuration.source)
NeoCommandHelper neoCommandHelper = new NeoCommandHelper(
this,
deployMode,
@@ -124,15 +128,24 @@ private deploy(script, utils, Map configuration, NeoCommandHelper neoCommandHelp
echo "Link to the application dashboard: ${neoCommandHelper.cloudCockpitLink()}"
if (warAction == WarAction.ROLLING_UPDATE) {
sh neoCommandHelper.rollingUpdateCommand()
def returnCodeRollingUpdate = sh returnStatus: true, script: neoCommandHelper.rollingUpdateCommand()
if(returnCodeRollingUpdate != 0){
error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
} else {
sh neoCommandHelper.deployCommand()
def returnCodeDeploy = sh returnStatus: true, script: neoCommandHelper.deployCommand()
if(returnCodeDeploy != 0){
error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
sh neoCommandHelper.restartCommand()
}
} else if (deployMode == DeployMode.MTA) {
sh neoCommandHelper.deployMta()
def returnCodeMTA = sh returnStatus: true, script: neoCommandHelper.deployMta()
if(returnCodeMTA != 0){
error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
}
}
}
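The error-handling change applied here (and in `cloudFoundryDeploy` above) follows one pattern: run the command with `returnStatus: true` so the `sh` step does not abort on its own, then raise a descriptive `error` when the exit code is non-zero. In isolation — the command is illustrative:

```groovy
// pattern sketch (illustrative command)
def returnCode = sh returnStatus: true, script: 'neo.sh deploy ...'
if (returnCode != 0) {
    // fail with a readable message instead of the raw shell failure
    error "[ERROR][${STEP_NAME}] The execution of the deploy command failed, see the log for details."
}
```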

@@ -3,6 +3,8 @@ import static com.sap.piper.Prerequisites.checkScript
import com.sap.piper.GenerateDocumentation
import com.sap.piper.ConfigurationHelper
import com.sap.piper.Utils
import com.sap.piper.analytics.InfluxData
import groovy.transform.Field
@Field String STEP_NAME = getClass().getName()
@@ -50,8 +52,8 @@ void call(Map parameters = [:]) {
stepParam4: parameters.customDefaults?'true':'false'
], config)
script.commonPipelineEnvironment.setInfluxStepData('build_url', env.BUILD_URL)
script.commonPipelineEnvironment.setInfluxPipelineData('build_url', env.BUILD_URL)
InfluxData.addField('step_data', 'build_url', env.BUILD_URL)
InfluxData.addField('pipeline_data', 'build_url', env.BUILD_URL)
}
}