Mirror of https://github.com/SAP/jenkins-library.git (synced 2025-03-03 15:02:35 +02:00)

Commit 2568316c6e: Merge branch 'master' into pr/httpsPushArtifactSetVersion
@@ -1,3 +1,7 @@
version: "2"
checks:
  return-statements:
    enabled: false
plugins:
  codenarc:
    enabled: true
@@ -18,6 +22,12 @@ plugins:
      strings:
        - TODO
        - FIXME
  gofmt:
    enabled: true
  golint:
    enabled: true
  govet:
    enabled: true
  markdownlint:
    enabled: true
    checks:
@@ -22,3 +22,6 @@ indent_size = none
[cfg/id_rsa.enc]
indent_style = none
indent_size = none
[{go.mod,go.sum,*.go,*.golden}]
indent_style = tab
indent_size = 8
.github/CONTRIBUTING.md (vendored, 41 changed lines)
@@ -1,5 +1,13 @@
# Guidance on how to contribute

**Table of contents:**

1. [Using the issue tracker](#using-the-issue-tracker)
1. [Changing the code-base](#changing-the-code-base)
1. [Jenkins credential handling](#jenkins-credential-handling)
1. [Code Style](#code-style)
1. [References](#references)

There are two primary ways to help:

* Using the issue tracker, and
@@ -36,14 +44,6 @@ Implementation of a functionality and its documentation shall happen within the

Pipeline steps must not make use of return values. The pattern for sharing parameters between pipeline steps or between a pipeline step and a pipeline script is sharing values via the [`commonPipelineEnvironment`](../vars/commonPipelineEnvironment.groovy). Since a pipeline step has no return value, its return type is declared as `void` rather than `def`.

### Code Style

The code should follow any stylistic and architectural guidelines prescribed by the project. In the absence of guidelines, mimic the styles and patterns in the existing code-base.

Variables, methods, types and so on shall have meaningful, self-describing names. Doing so makes understanding code easier and requires less commenting. It helps people who did not write the code to understand it better.

Code shall contain comments to explain the intention of the code when it is unclear what the intention of the author was. In such cases, comments should describe the "why" and not the "what" (that is in the code already).

#### EditorConfig

To ensure a common file format, there is a `.editorConfig` file [in place](../.editorconfig). To respect this file, [check](http://editorconfig.org/#download) whether your editor supports it natively or whether you need to download a plugin.
@@ -54,8 +54,25 @@ Write [meaningful commit messages](http://who-t.blogspot.de/2009/12/on-commit-me

Good commit messages speed up the review process and help to keep this project maintainable in the long term.

## Jenkins credential handling

References to Jenkins credentials should have meaningful names.

We are using the following approach for naming Jenkins credentials:

For username/password credentials:
`<tool>CredentialsId`, e.g. `neoCredentialsId`

For other cases, we add further information to the name, for example:

* `gitSshCredentialsId` for ssh credentials
* `githubTokenCredentialsId` for token/string credentials
* `gcpFileCredentialsId` for file credentials

## Code Style

Generally, the code should follow any stylistic and architectural guidelines prescribed by the project. In the absence of guidelines, mimic the styles and patterns in the existing code-base.

The intention of this section is to describe the code style for this project. As a reference document, [Groovy's style guide](http://groovy-lang.org/style-guide.html) was taken. For further reading about Groovy's syntax and examples, please refer to this guide.

This project is intended to run in Jenkins [[2]](https://jenkins.io/doc/book/getting-started/) as part of a Jenkins Pipeline [[3]](https://jenkins.io/doc/book/pipeline/). It is composed of Jenkins Pipeline's syntax, Groovy's syntax, and Java's syntax.
@@ -64,6 +81,12 @@ Some Groovy's syntax is not yet supported by Jenkins. It is also the intention o

As Groovy supports 99% of Java’s syntax [[1]](http://groovy-lang.org/style-guide.html), many Java developers tend to write Groovy code using Java's syntax. Such a developer should also consider the following code style for this project.

### General remarks

Variables, methods, types and so on shall have meaningful, self-describing names. Doing so makes understanding code easier and requires less commenting. It helps people who did not write the code to understand it better.

Code shall contain comments to explain the intention of the code when it is unclear what the intention of the author was. In such cases, comments should describe the "why" and not the "what" (that is in the code already).

### Omit semicolons

### Use the return keyword
@@ -177,7 +200,7 @@ If the type of the exception thrown inside a try block is not important, catch a

To check parameters, return values, and more, use the assert statement.

## Reference
## References

[1] Groovy's syntax: [http://groovy-lang.org/style-guide.html](http://groovy-lang.org/style-guide.html)
.gitignore (vendored, 5 changed lines)
@@ -8,6 +8,7 @@ reports
.classpath
.project
*~
.vscode

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*
@@ -17,3 +18,7 @@ targets/
documentation/docs-gen

consumer-test/**/workspace

*.code-workspace
/piper
/piper.exe
.pipeline/config.yml (new file, 4 lines)
@@ -0,0 +1,4 @@
steps:
  githubPublishRelease:
    owner: SAP
    repository: jenkins-library
.travis.yml (21 changed lines)
@@ -3,6 +3,8 @@ branches:
  - master
  - /^it\/.*$/
language: groovy
jdk:
  - openjdk8
sudo: required
services:
  - docker
@@ -20,12 +22,27 @@ cache:
jobs:
  include:
    - stage: Tests
      name: Unit Tests
      name: Golang Build
      if: type = pull_request
      script:
        - docker build -t piper:${TRAVIS_BRANCH} .
    - name: Golang Build & Publish
      if: type != pull_request && repo = "SAP/jenkins-library" && branch = "master"
      script:
        - docker build -t piper:${TRAVIS_BRANCH} .
        - docker create --name piper_${TRAVIS_BRANCH} piper:${TRAVIS_BRANCH}
        - docker cp piper_${TRAVIS_BRANCH}:/build/piper .
        - docker rm piper_${TRAVIS_BRANCH}
        - cp ./piper ./piper_master
        - chmod +x ./piper
        - ./piper githubPublishRelease --token ${GITHUB_TOKEN} --version latest --updateAsset --assetPath ./piper_master
    - name: Groovy Unit Tests
      before_script:
        - curl -L --output cc-test-reporter https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64
        - chmod +x ./cc-test-reporter
        - ./cc-test-reporter before-build
      script: mvn package --batch-mode
      script:
        - mvn package --batch-mode
      after_script:
        - JACOCO_SOURCE_PATH="src vars test" ./cc-test-reporter format-coverage target/site/jacoco/jacoco.xml --input-type jacoco
        - ./cc-test-reporter upload-coverage
DEVELOPMENT.md (new file, 131 lines)
@@ -0,0 +1,131 @@
# Development

**Table of contents:**

1. [Getting started](#getting-started)
1. [Build the project](#build-the-project)
1. [Logging](#logging)
1. [Error handling](#error-handling)

## Getting started

1. [Ramp up your development environment](#ramp-up)
1. [Get familiar with Go language](#go-basics)
1. Create [a GitHub account](https://github.com/join)
1. Setup [GitHub access via SSH](https://help.github.com/articles/connecting-to-github-with-ssh/)
1. [Create and checkout a repo fork](#checkout-your-fork)
1. Optional: [Get Jenkins related environment](#jenkins-environment)
1. Optional: [Get familiar with Jenkins Pipelines as Code](#jenkins-pipelines)

### Ramp up

First you need to set up an appropriate development environment:

Install Go, see [GO Getting Started](https://golang.org/doc/install)

Install an IDE with Go plugins, see for example [Go in Visual Studio Code](https://code.visualstudio.com/docs/languages/go)

### Go basics

In order to get yourself started, there is a lot of useful information out there.

As a first step, we highly recommend the [Golang documentation](https://golang.org/doc/), especially [A Tour of Go](https://tour.golang.org/welcome/1).

We have a strong focus on high-quality software, and contributions without adequate tests will not be accepted.
There is an excellent resource which teaches Go using a test-driven approach: [Learn Go with Tests](https://github.com/quii/learn-go-with-tests)

### Checkout your fork

The project uses [Go modules](https://blog.golang.org/using-go-modules). Thus, please make sure **NOT** to check out the project into your [`GOPATH`](https://github.com/golang/go/wiki/SettingGOPATH).

To check out this repository:

1. Create your own
   [fork of this repo](https://help.github.com/articles/fork-a-repo/)
1. Clone it to your machine, for example like:

   ```shell
   mkdir -p ${HOME}/projects/jenkins-library
   cd ${HOME}/projects
   git clone git@github.com:${YOUR_GITHUB_USERNAME}/jenkins-library.git
   cd jenkins-library
   git remote add upstream git@github.com:sap/jenkins-library.git
   git remote set-url --push upstream no_push
   ```

### Jenkins environment

If you want to contribute also to the Jenkins-specific parts like

* Jenkins library step
* Jenkins pipeline integration

you need to do the following in addition:

* [Install Groovy](https://groovy-lang.org/install.html)
* [Install Maven](https://maven.apache.org/install.html)
* Get a local Jenkins installation: use, for example, [cx-server](toDo: add link)

### Jenkins pipelines

The Jenkins related parts depend on

* [Jenkins Pipelines as Code](https://jenkins.io/doc/book/pipeline-as-code/)
* [Jenkins Shared Libraries](https://jenkins.io/doc/book/pipeline/shared-libraries/)

You should get familiar with these concepts for contributing to the Jenkins-specific parts.

## Build the project

### Build the executable suitable for the CI/CD Linux target environments

Use Docker:

`docker build -t piper:latest .`

You can extract the binary to your local filesystem using Docker:

```shell
docker create --name piper piper:latest
docker cp piper:/piper .
docker rm piper
```

## Generating step framework

The steps are generated based on the yaml files in `resources/metadata/` with the following command:
`go run pkg/generator/step-metadata.go`.

The yaml format is kept pretty close to Tekton's [task format](https://github.com/tektoncd/pipeline/blob/master/docs/tasks.md).
Where the Tekton format was not sufficient, some extensions have been made.

Examples are:

* metadata - longDescription
* spec - inputs - secrets
* spec - containers
* spec - sidecars

## Logging

to be added

## Error handling

In order to better understand the root cause of errors that occur, we wrap errors, for example:

```golang
f, err := os.Open(path)
if err != nil {
    return errors.Wrapf(err, "open failed for %v", path)
}
defer f.Close()
```

We use [github.com/pkg/errors](https://github.com/pkg/errors) for that.
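As a hint for readers unfamiliar with `pkg/errors`: the wrapped error keeps the original error accessible and prefixes its message with the added context. The following minimal sketch is not part of the repository (file and function names are made up for illustration) and only shows how such a wrapped error can be inspected:

```golang
package main

import (
    "fmt"
    "os"

    "github.com/pkg/errors"
)

func open(path string) error {
    f, err := os.Open(path)
    if err != nil {
        // add context while keeping the original error available via errors.Cause
        return errors.Wrapf(err, "open failed for %v", path)
    }
    defer f.Close()
    return nil
}

func main() {
    if err := open("does-not-exist.txt"); err != nil {
        fmt.Printf("%v\n", err)        // wrapped message followed by the original error
        fmt.Println(errors.Cause(err)) // the underlying *os.PathError
        // "%+v" would additionally print the stack trace recorded by errors.Wrapf
    }
}
```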

## Testing

Unit tests are done using basic `golang` means.

Additionally, we encourage you to use [github.com/stretchr/testify/assert](https://github.com/stretchr/testify/assert) in order to have slimmer assertions if you like.
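For illustration, a minimal sketch of such a test; the function under test, `Add`, is a made-up example and not part of this repository:

```golang
package mypackage

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

// Add is a made-up function used only to illustrate the assertion style.
func Add(a, b int) int { return a + b }

func TestAdd(t *testing.T) {
    // assert.Equal prints a readable diff instead of a hand-written if/t.Errorf block
    assert.Equal(t, 5, Add(2, 3), "Add should sum both arguments")
    assert.NotEqual(t, 0, Add(2, 3))
}
```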
Dockerfile (new file, 20 lines)
@@ -0,0 +1,20 @@
FROM golang:1.13 AS build-env
COPY . /build
WORKDIR /build

# execute tests
RUN go test ./... -cover

## ONLY tests so far, building to be added later
# execute build
RUN export GIT_COMMIT=$(git rev-parse HEAD) && \
    export GIT_REPOSITORY=$(git config --get remote.origin.url) && \
    go build \
    -ldflags \
    "-X github.com/SAP/jenkins-library/cmd.GitCommit=${GIT_COMMIT} \
    -X github.com/SAP/jenkins-library/pkg/log.LibraryRepository=${GIT_REPOSITORY}" \
    -o piper

# FROM gcr.io/distroless/base:latest
# COPY --from=build-env /build/piper /piper
# ENTRYPOINT ["/piper"]
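The `-X` linker flags above overwrite package-level string variables at link time. The actual declarations live elsewhere in the repository and are not part of this diff; as a sketch of the pattern, such a variable is simply a plain string matching the path given to `-X`:

```golang
// Sketch only: mirrors the -X path used in the Dockerfile above.
package cmd

// GitCommit is empty in a plain `go build` and is set by the linker via
// -X github.com/SAP/jenkins-library/cmd.GitCommit=${GIT_COMMIT}.
var GitCommit string
```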
cmd/getConfig.go (new file, 122 lines)
@@ -0,0 +1,122 @@
package cmd

import (
    "fmt"
    "io"
    "os"

    "github.com/SAP/jenkins-library/pkg/config"
    "github.com/SAP/jenkins-library/pkg/piperutils"
    "github.com/pkg/errors"
    "github.com/spf13/cobra"
)

type configCommandOptions struct {
    output         string //output format, so far only JSON
    parametersJSON string //parameters to be considered in JSON format
    stepMetadata   string //metadata to be considered, can be filePath or ENV containing JSON in format 'ENV:MY_ENV_VAR'
    stepName       string
    contextConfig  bool
    openFile       func(s string) (io.ReadCloser, error)
}

var configOptions configCommandOptions

// ConfigCommand is the entry command for loading the configuration of a pipeline step
func ConfigCommand() *cobra.Command {

    configOptions.openFile = OpenPiperFile
    var createConfigCmd = &cobra.Command{
        Use:   "getConfig",
        Short: "Loads the project 'Piper' configuration respecting defaults and parameters.",
        RunE: func(cmd *cobra.Command, _ []string) error {
            return generateConfig()
        },
    }

    addConfigFlags(createConfigCmd)
    return createConfigCmd
}

func generateConfig() error {

    var myConfig config.Config
    var stepConfig config.StepConfig

    var metadata config.StepData
    metadataFile, err := configOptions.openFile(configOptions.stepMetadata)
    if err != nil {
        return errors.Wrap(err, "metadata: open failed")
    }

    err = metadata.ReadPipelineStepData(metadataFile)
    if err != nil {
        return errors.Wrap(err, "metadata: read failed")
    }

    var customConfig io.ReadCloser
    if piperutils.FileExists(GeneralConfig.CustomConfig) {
        customConfig, err = configOptions.openFile(GeneralConfig.CustomConfig)
        if err != nil {
            return errors.Wrap(err, "config: open failed")
        }
    }

    defaultConfig, paramFilter, err := defaultsAndFilters(&metadata)
    if err != nil {
        return errors.Wrap(err, "defaults: retrieving step defaults failed")
    }

    for _, f := range GeneralConfig.DefaultConfig {
        fc, err := configOptions.openFile(f)
        if err != nil {
            return errors.Wrapf(err, "config: getting defaults failed: '%v'", f)
        }
        defaultConfig = append(defaultConfig, fc)
    }

    var flags map[string]interface{}

    params := []config.StepParameters{}
    if !configOptions.contextConfig {
        params = metadata.Spec.Inputs.Parameters
    }

    stepConfig, err = myConfig.GetStepConfig(flags, GeneralConfig.ParametersJSON, customConfig, defaultConfig, paramFilter, params, GeneralConfig.StageName, configOptions.stepName)
    if err != nil {
        return errors.Wrap(err, "getting step config failed")
    }

    myConfigJSON, _ := config.GetJSON(stepConfig.Config)

    fmt.Println(myConfigJSON)

    return nil
}

func addConfigFlags(cmd *cobra.Command) {

    //ToDo: support more output options, like https://kubernetes.io/docs/reference/kubectl/overview/#formatting-output
    cmd.Flags().StringVar(&configOptions.output, "output", "json", "Defines the output format")

    cmd.Flags().StringVar(&configOptions.parametersJSON, "parametersJSON", os.Getenv("PIPER_parametersJSON"), "Parameters to be considered in JSON format")
    cmd.Flags().StringVar(&configOptions.stepMetadata, "stepMetadata", "", "Step metadata, passed as path to yaml")
    cmd.Flags().StringVar(&configOptions.stepName, "stepName", "", "Name of the step for which configuration should be included")
    cmd.Flags().BoolVar(&configOptions.contextConfig, "contextConfig", false, "Defines if step context configuration should be loaded instead of step config")

    cmd.MarkFlagRequired("stepMetadata")
    cmd.MarkFlagRequired("stepName")

}

func defaultsAndFilters(metadata *config.StepData) ([]io.ReadCloser, config.StepFilters, error) {
    if configOptions.contextConfig {
        defaults, err := metadata.GetContextDefaults(configOptions.stepName)
        if err != nil {
            return nil, config.StepFilters{}, errors.Wrap(err, "metadata: getting context defaults failed")
        }
        return []io.ReadCloser{defaults}, metadata.GetContextParameterFilters(), nil
    }
    //ToDo: retrieve default values from metadata
    return nil, metadata.GetParameterFilters(), nil
}
cmd/getConfig_test.go (new file, 90 lines)
@@ -0,0 +1,90 @@
package cmd

import (
    "io"
    "io/ioutil"
    "strings"
    "testing"

    "github.com/SAP/jenkins-library/pkg/config"
    "github.com/spf13/cobra"
    flag "github.com/spf13/pflag"
    "github.com/stretchr/testify/assert"
)

func configOpenFileMock(name string) (io.ReadCloser, error) {
    var r string
    switch name {
    case "TestAddCustomDefaults_default1":
        r = "default1"
    case "TestAddCustomDefaults_default2":
        r = "default3"
    default:
        r = ""
    }
    return ioutil.NopCloser(strings.NewReader(r)), nil
}

func TestConfigCommand(t *testing.T) {
    cmd := ConfigCommand()

    gotReq := []string{}
    gotOpt := []string{}

    cmd.Flags().VisitAll(func(pflag *flag.Flag) {
        annotations, found := pflag.Annotations[cobra.BashCompOneRequiredFlag]
        if found && annotations[0] == "true" {
            gotReq = append(gotReq, pflag.Name)
        } else {
            gotOpt = append(gotOpt, pflag.Name)
        }
    })

    t.Run("Required flags", func(t *testing.T) {
        exp := []string{"stepMetadata", "stepName"}
        assert.Equal(t, exp, gotReq, "required flags incorrect")
    })

    t.Run("Optional flags", func(t *testing.T) {
        exp := []string{"contextConfig", "output", "parametersJSON"}
        assert.Equal(t, exp, gotOpt, "optional flags incorrect")
    })

    t.Run("Run", func(t *testing.T) {
        t.Run("Success case", func(t *testing.T) {
            configOptions.openFile = configOpenFileMock
            err := cmd.RunE(cmd, []string{})
            assert.NoError(t, err, "error occurred but none expected")
        })
    })
}

func TestDefaultsAndFilters(t *testing.T) {
    metadata := config.StepData{
        Spec: config.StepSpec{
            Inputs: config.StepInputs{
                Parameters: []config.StepParameters{
                    {Name: "paramOne", Scope: []string{"GENERAL", "STEPS", "STAGES", "PARAMETERS", "ENV"}},
                },
            },
        },
    }

    t.Run("Context config", func(t *testing.T) {
        configOptions.contextConfig = true
        defer func() { configOptions.contextConfig = false }()
        defaults, filters, err := defaultsAndFilters(&metadata)

        assert.Equal(t, 1, len(defaults), "getting defaults failed")
        assert.Equal(t, 0, len(filters.All), "wrong number of filter values")
        assert.NoError(t, err, "error occurred but none expected")
    })

    t.Run("Step config", func(t *testing.T) {
        defaults, filters, err := defaultsAndFilters(&metadata)
        assert.Equal(t, 0, len(defaults), "getting defaults failed")
        assert.Equal(t, 1, len(filters.All), "wrong number of filter values")
        assert.NoError(t, err, "error occurred but none expected")
    })

}
cmd/githubPublishRelease.go (new file, 217 lines)
@@ -0,0 +1,217 @@
package cmd

import (
    "context"
    "fmt"
    "mime"
    "os"
    "path/filepath"
    "strings"

    "github.com/SAP/jenkins-library/pkg/log"
    "github.com/google/go-github/v28/github"
    "github.com/pkg/errors"

    piperGithub "github.com/SAP/jenkins-library/pkg/github"
)

type githubRepoClient interface {
    CreateRelease(ctx context.Context, owner string, repo string, release *github.RepositoryRelease) (*github.RepositoryRelease, *github.Response, error)
    DeleteReleaseAsset(ctx context.Context, owner string, repo string, id int64) (*github.Response, error)
    GetLatestRelease(ctx context.Context, owner string, repo string) (*github.RepositoryRelease, *github.Response, error)
    ListReleaseAssets(ctx context.Context, owner string, repo string, id int64, opt *github.ListOptions) ([]*github.ReleaseAsset, *github.Response, error)
    UploadReleaseAsset(ctx context.Context, owner string, repo string, id int64, opt *github.UploadOptions, file *os.File) (*github.ReleaseAsset, *github.Response, error)
}

type githubIssueClient interface {
    ListByRepo(ctx context.Context, owner string, repo string, opt *github.IssueListByRepoOptions) ([]*github.Issue, *github.Response, error)
}

func githubPublishRelease(myGithubPublishReleaseOptions githubPublishReleaseOptions) error {
    ctx, client, err := piperGithub.NewClient(myGithubPublishReleaseOptions.Token, myGithubPublishReleaseOptions.APIURL, myGithubPublishReleaseOptions.UploadURL)
    if err != nil {
        log.Entry().WithError(err).Fatal("Failed to get GitHub client.")
    }

    err = runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, client.Repositories, client.Issues)
    if err != nil {
        log.Entry().WithError(err).Fatal("Failed to publish GitHub release.")
    }

    return nil
}

func runGithubPublishRelease(ctx context.Context, myGithubPublishReleaseOptions *githubPublishReleaseOptions, ghRepoClient githubRepoClient, ghIssueClient githubIssueClient) error {

    var publishedAt github.Timestamp

    lastRelease, resp, err := ghRepoClient.GetLatestRelease(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository)
    if err != nil {
        if resp.StatusCode == 404 {
            //no previous release found -> first release
            myGithubPublishReleaseOptions.AddDeltaToLastRelease = false
            log.Entry().Debug("This is the first release.")
        } else {
            return errors.Wrap(err, "Error occurred when retrieving latest GitHub release.")
        }
    }
    publishedAt = lastRelease.GetPublishedAt()
    log.Entry().Debugf("Previous GitHub release published: '%v'", publishedAt)

    //updating assets only supported on latest release
    if myGithubPublishReleaseOptions.UpdateAsset && myGithubPublishReleaseOptions.Version == "latest" {
        return uploadReleaseAsset(ctx, lastRelease.GetID(), myGithubPublishReleaseOptions, ghRepoClient)
    }

    releaseBody := ""

    if len(myGithubPublishReleaseOptions.ReleaseBodyHeader) > 0 {
        releaseBody += myGithubPublishReleaseOptions.ReleaseBodyHeader + "\n"
    }

    if myGithubPublishReleaseOptions.AddClosedIssues {
        releaseBody += getClosedIssuesText(ctx, publishedAt, myGithubPublishReleaseOptions, ghIssueClient)
    }

    if myGithubPublishReleaseOptions.AddDeltaToLastRelease {
        releaseBody += getReleaseDeltaText(myGithubPublishReleaseOptions, lastRelease)
    }

    release := github.RepositoryRelease{
        TagName:         &myGithubPublishReleaseOptions.Version,
        TargetCommitish: &myGithubPublishReleaseOptions.Commitish,
        Name:            &myGithubPublishReleaseOptions.Version,
        Body:            &releaseBody,
    }

    createdRelease, _, err := ghRepoClient.CreateRelease(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository, &release)
    if err != nil {
        return errors.Wrapf(err, "Creation of release '%v' failed", *release.TagName)
    }
    log.Entry().Infof("Release %v created on %v/%v", *createdRelease.TagName, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository)

    if len(myGithubPublishReleaseOptions.AssetPath) > 0 {
        return uploadReleaseAsset(ctx, createdRelease.GetID(), myGithubPublishReleaseOptions, ghRepoClient)
    }

    return nil
}

func getClosedIssuesText(ctx context.Context, publishedAt github.Timestamp, myGithubPublishReleaseOptions *githubPublishReleaseOptions, ghIssueClient githubIssueClient) string {
    closedIssuesText := ""

    options := github.IssueListByRepoOptions{
        State:     "closed",
        Direction: "asc",
        Since:     publishedAt.Time,
    }
    if len(myGithubPublishReleaseOptions.Labels) > 0 {
        options.Labels = myGithubPublishReleaseOptions.Labels
    }
    ghIssues, _, err := ghIssueClient.ListByRepo(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository, &options)
    if err != nil {
        log.Entry().WithError(err).Error("Failed to get GitHub issues.")
    }

    prTexts := []string{"**List of closed pull-requests since last release**"}
    issueTexts := []string{"**List of closed issues since last release**"}

    for _, issue := range ghIssues {
        if issue.IsPullRequest() && !isExcluded(issue, myGithubPublishReleaseOptions.ExcludeLabels) {
            prTexts = append(prTexts, fmt.Sprintf("[#%v](%v): %v", issue.GetNumber(), issue.GetHTMLURL(), issue.GetTitle()))
            log.Entry().Debugf("Added PR #%v to release", issue.GetNumber())
        } else if !issue.IsPullRequest() && !isExcluded(issue, myGithubPublishReleaseOptions.ExcludeLabels) {
            issueTexts = append(issueTexts, fmt.Sprintf("[#%v](%v): %v", issue.GetNumber(), issue.GetHTMLURL(), issue.GetTitle()))
            log.Entry().Debugf("Added Issue #%v to release", issue.GetNumber())
        }
    }

    if len(prTexts) > 1 {
        closedIssuesText += "\n" + strings.Join(prTexts, "\n") + "\n"
    }

    if len(issueTexts) > 1 {
        closedIssuesText += "\n" + strings.Join(issueTexts, "\n") + "\n"
    }
    return closedIssuesText
}

func getReleaseDeltaText(myGithubPublishReleaseOptions *githubPublishReleaseOptions, lastRelease *github.RepositoryRelease) string {
    releaseDeltaText := ""

    //add delta link to previous release
    releaseDeltaText += "\n**Changes**\n"
    releaseDeltaText += fmt.Sprintf(
        "[%v...%v](%v/%v/%v/compare/%v...%v)\n",
        lastRelease.GetTagName(),
        myGithubPublishReleaseOptions.Version,
        myGithubPublishReleaseOptions.ServerURL,
        myGithubPublishReleaseOptions.Owner,
        myGithubPublishReleaseOptions.Repository,
        lastRelease.GetTagName(), myGithubPublishReleaseOptions.Version,
    )

    return releaseDeltaText
}

func uploadReleaseAsset(ctx context.Context, releaseID int64, myGithubPublishReleaseOptions *githubPublishReleaseOptions, ghRepoClient githubRepoClient) error {

    assets, _, err := ghRepoClient.ListReleaseAssets(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository, releaseID, &github.ListOptions{})
    if err != nil {
        return errors.Wrap(err, "Failed to get list of release assets.")
    }
    var assetID int64
    for _, a := range assets {
        if a.GetName() == filepath.Base(myGithubPublishReleaseOptions.AssetPath) {
            assetID = a.GetID()
            break
        }
    }
    if assetID != 0 {
        //asset needs to be deleted first since API does not allow for replacement
        _, err := ghRepoClient.DeleteReleaseAsset(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository, assetID)
        if err != nil {
            return errors.Wrap(err, "Failed to delete release asset.")
        }
    }

    mediaType := mime.TypeByExtension(filepath.Ext(myGithubPublishReleaseOptions.AssetPath))
    if mediaType == "" {
        mediaType = "application/octet-stream"
    }
    log.Entry().Debugf("Using mediaType '%v'", mediaType)

    name := filepath.Base(myGithubPublishReleaseOptions.AssetPath)
    log.Entry().Debugf("Using file name '%v'", name)

    opts := github.UploadOptions{
        Name:      name,
        MediaType: mediaType,
    }
    file, err := os.Open(myGithubPublishReleaseOptions.AssetPath)
    defer file.Close()
    if err != nil {
        return errors.Wrapf(err, "Failed to load release asset '%v'", myGithubPublishReleaseOptions.AssetPath)
    }

    log.Entry().Info("Starting to upload release asset.")
    asset, _, err := ghRepoClient.UploadReleaseAsset(ctx, myGithubPublishReleaseOptions.Owner, myGithubPublishReleaseOptions.Repository, releaseID, &opts, file)
    if err != nil {
        return errors.Wrap(err, "Failed to upload release asset.")
    }
    log.Entry().Infof("Done uploading asset '%v'.", asset.GetURL())

    return nil
}

func isExcluded(issue *github.Issue, excludeLabels []string) bool {
    //issue.Labels[0].GetName()
    for _, ex := range excludeLabels {
        for _, l := range issue.Labels {
            if ex == l.GetName() {
                return true
            }
        }
    }
    return false
}
cmd/githubPublishRelease_generated.go (new file, 189 lines)
@@ -0,0 +1,189 @@
package cmd

import (
    "os"

    "github.com/SAP/jenkins-library/pkg/config"
    "github.com/SAP/jenkins-library/pkg/log"
    "github.com/spf13/cobra"
)

type githubPublishReleaseOptions struct {
    AddClosedIssues       bool     `json:"addClosedIssues,omitempty"`
    AddDeltaToLastRelease bool     `json:"addDeltaToLastRelease,omitempty"`
    AssetPath             string   `json:"assetPath,omitempty"`
    Commitish             string   `json:"commitish,omitempty"`
    ExcludeLabels         []string `json:"excludeLabels,omitempty"`
    APIURL                string   `json:"apiUrl,omitempty"`
    Owner                 string   `json:"owner,omitempty"`
    Repository            string   `json:"repository,omitempty"`
    ServerURL             string   `json:"serverUrl,omitempty"`
    Token                 string   `json:"token,omitempty"`
    UploadURL             string   `json:"uploadUrl,omitempty"`
    Labels                []string `json:"labels,omitempty"`
    ReleaseBodyHeader     string   `json:"releaseBodyHeader,omitempty"`
    UpdateAsset           bool     `json:"updateAsset,omitempty"`
    Version               string   `json:"version,omitempty"`
}

var myGithubPublishReleaseOptions githubPublishReleaseOptions
var githubPublishReleaseStepConfigJSON string

// GithubPublishReleaseCommand Publish a release in GitHub
func GithubPublishReleaseCommand() *cobra.Command {
    metadata := githubPublishReleaseMetadata()
    var createGithubPublishReleaseCmd = &cobra.Command{
        Use:   "githubPublishRelease",
        Short: "Publish a release in GitHub",
        Long: `This step creates a tag in your GitHub repository together with a release.
The release can be filled with text plus additional information like:

* Closed pull request since last release
* Closed issues since last release
* Link to delta information showing all commits since last release

The result looks like

`,
        PreRunE: func(cmd *cobra.Command, args []string) error {
            log.SetStepName("githubPublishRelease")
            log.SetVerbose(GeneralConfig.Verbose)
            return PrepareConfig(cmd, &metadata, "githubPublishRelease", &myGithubPublishReleaseOptions, OpenPiperFile)
        },
        RunE: func(cmd *cobra.Command, args []string) error {
            return githubPublishRelease(myGithubPublishReleaseOptions)
        },
    }

    addGithubPublishReleaseFlags(createGithubPublishReleaseCmd)
    return createGithubPublishReleaseCmd
}

func addGithubPublishReleaseFlags(cmd *cobra.Command) {
    cmd.Flags().BoolVar(&myGithubPublishReleaseOptions.AddClosedIssues, "addClosedIssues", false, "If set to `true`, closed issues and merged pull-requests since the last release will be added below the `releaseBodyHeader`")
    cmd.Flags().BoolVar(&myGithubPublishReleaseOptions.AddDeltaToLastRelease, "addDeltaToLastRelease", false, "If set to `true`, a link will be added to the release information that brings up all commits since the last release.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.AssetPath, "assetPath", os.Getenv("PIPER_assetPath"), "Path to a release asset which should be uploaded to the list of release assets.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.Commitish, "commitish", "master", "Target git commitish for the release")
    cmd.Flags().StringSliceVar(&myGithubPublishReleaseOptions.ExcludeLabels, "excludeLabels", []string{}, "Allows to exclude issues with dedicated list of labels.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.APIURL, "apiUrl", "https://api.github.com", "Set the GitHub API url.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.Owner, "owner", os.Getenv("PIPER_owner"), "Set the GitHub organization.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.Repository, "repository", os.Getenv("PIPER_repository"), "Set the GitHub repository.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.ServerURL, "serverUrl", "https://github.com", "GitHub server url for end-user access.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.Token, "token", os.Getenv("PIPER_token"), "GitHub personal access token as per https://help.github.com/en/github/authenticating-to-github/creating-a-personal-access-token-for-the-command-line")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.UploadURL, "uploadUrl", "https://uploads.github.com", "Set the GitHub upload url.")
    cmd.Flags().StringSliceVar(&myGithubPublishReleaseOptions.Labels, "labels", []string{}, "Labels to include in issue search.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.ReleaseBodyHeader, "releaseBodyHeader", os.Getenv("PIPER_releaseBodyHeader"), "Content which will appear for the release.")
    cmd.Flags().BoolVar(&myGithubPublishReleaseOptions.UpdateAsset, "updateAsset", false, "Specify if a release asset should be updated only.")
    cmd.Flags().StringVar(&myGithubPublishReleaseOptions.Version, "version", os.Getenv("PIPER_version"), "Define the version number which will be written as tag as well as release name.")

    cmd.MarkFlagRequired("apiUrl")
    cmd.MarkFlagRequired("owner")
    cmd.MarkFlagRequired("repository")
    cmd.MarkFlagRequired("serverUrl")
    cmd.MarkFlagRequired("token")
    cmd.MarkFlagRequired("uploadUrl")
    cmd.MarkFlagRequired("version")
}

// retrieve step metadata
func githubPublishReleaseMetadata() config.StepData {
    var theMetaData = config.StepData{
        Spec: config.StepSpec{
            Inputs: config.StepInputs{
                Parameters: []config.StepParameters{
                    {
                        Name:      "addClosedIssues",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "bool",
                        Mandatory: false,
                    },
                    {
                        Name:      "addDeltaToLastRelease",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "bool",
                        Mandatory: false,
                    },
                    {
                        Name:      "assetPath",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: false,
                    },
                    {
                        Name:      "commitish",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: false,
                    },
                    {
                        Name:      "excludeLabels",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "[]string",
                        Mandatory: false,
                    },
                    {
                        Name:      "apiUrl",
                        Scope:     []string{"GENERAL", "PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "owner",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "repository",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "serverUrl",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "token",
                        Scope:     []string{"GENERAL", "PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "uploadUrl",
                        Scope:     []string{"GENERAL", "PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                    {
                        Name:      "labels",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "[]string",
                        Mandatory: false,
                    },
                    {
                        Name:      "releaseBodyHeader",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: false,
                    },
                    {
                        Name:      "updateAsset",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "bool",
                        Mandatory: false,
                    },
                    {
                        Name:      "version",
                        Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
                        Type:      "string",
                        Mandatory: true,
                    },
                },
            },
        },
    }
    return theMetaData
}
cmd/githubPublishRelease_generated_test.go (new file, 16 lines)
@@ -0,0 +1,16 @@
package cmd

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

func TestGithubPublishReleaseCommand(t *testing.T) {

    testCmd := GithubPublishReleaseCommand()

    // only high level testing performed - details are tested in step generation procedure
    assert.Equal(t, "githubPublishRelease", testCmd.Use, "command name incorrect")

}
cmd/githubPublishRelease_test.go (new file, 383 lines)
@@ -0,0 +1,383 @@
package cmd

import (
    "context"
    "fmt"
    "net/http"
    "os"
    "path/filepath"
    "testing"
    "time"

    "github.com/google/go-github/v28/github"
    "github.com/stretchr/testify/assert"
)

type ghRCMock struct {
    createErr         error
    latestRelease     *github.RepositoryRelease
    release           *github.RepositoryRelease
    delErr            error
    delID             int64
    delOwner          string
    delRepo           string
    listErr           error
    listID            int64
    listOwner         string
    listReleaseAssets []*github.ReleaseAsset
    listRepo          string
    listOpts          *github.ListOptions
    latestStatusCode  int
    latestErr         error
    uploadID          int64
    uploadOpts        *github.UploadOptions
    uploadOwner       string
    uploadRepo        string
}

func (g *ghRCMock) CreateRelease(ctx context.Context, owner string, repo string, release *github.RepositoryRelease) (*github.RepositoryRelease, *github.Response, error) {
    g.release = release
    return release, nil, g.createErr
}

func (g *ghRCMock) DeleteReleaseAsset(ctx context.Context, owner string, repo string, id int64) (*github.Response, error) {
    g.delOwner = owner
    g.delRepo = repo
    g.delID = id
    return nil, g.delErr
}

func (g *ghRCMock) GetLatestRelease(ctx context.Context, owner string, repo string) (*github.RepositoryRelease, *github.Response, error) {
    hc := http.Response{StatusCode: 200}
    if g.latestStatusCode != 0 {
        hc.StatusCode = g.latestStatusCode
    }
    ghResp := github.Response{Response: &hc}
    return g.latestRelease, &ghResp, g.latestErr
}

func (g *ghRCMock) ListReleaseAssets(ctx context.Context, owner string, repo string, id int64, opt *github.ListOptions) ([]*github.ReleaseAsset, *github.Response, error) {
    g.listID = id
    g.listOwner = owner
    g.listRepo = repo
    g.listOpts = opt
    return g.listReleaseAssets, nil, g.listErr
}

func (g *ghRCMock) UploadReleaseAsset(ctx context.Context, owner string, repo string, id int64, opt *github.UploadOptions, file *os.File) (*github.ReleaseAsset, *github.Response, error) {
    g.uploadID = id
    g.uploadOwner = owner
    g.uploadRepo = repo
    g.uploadOpts = opt
    return nil, nil, nil
}

type ghICMock struct {
    issues        []*github.Issue
    lastPublished time.Time
    owner         string
    repo          string
    options       *github.IssueListByRepoOptions
}

func (g *ghICMock) ListByRepo(ctx context.Context, owner string, repo string, opt *github.IssueListByRepoOptions) ([]*github.Issue, *github.Response, error) {
    g.owner = owner
    g.repo = repo
    g.options = opt
    g.lastPublished = opt.Since
    return g.issues, nil, nil
}

func TestRunGithubPublishRelease(t *testing.T) {
    ctx := context.Background()

    t.Run("Success - first release & no body", func(t *testing.T) {
        ghIssueClient := ghICMock{}
        ghRepoClient := ghRCMock{
            latestStatusCode: 404,
            latestErr:        fmt.Errorf("not found"),
        }

        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            AddDeltaToLastRelease: true,
            Commitish:             "master",
            Owner:                 "TEST",
            Repository:            "test",
            ServerURL:             "https://github.com",
            ReleaseBodyHeader:     "Header",
            Version:               "1.0",
        }
        err := runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, &ghRepoClient, &ghIssueClient)
        assert.NoError(t, err, "Error occurred but none expected.")

        assert.Equal(t, "Header\n", ghRepoClient.release.GetBody())
    })

    t.Run("Success - subsequent releases & with body", func(t *testing.T) {
        lastTag := "1.0"
        lastPublishedAt := github.Timestamp{Time: time.Date(2019, 01, 01, 0, 0, 0, 0, time.UTC)}
        ghRepoClient := ghRCMock{
            createErr: nil,
            latestRelease: &github.RepositoryRelease{
                TagName:     &lastTag,
                PublishedAt: &lastPublishedAt,
            },
        }
        prHTMLURL := "https://github.com/TEST/test/pull/1"
        prTitle := "Pull"
        prNo := 1

        issHTMLURL := "https://github.com/TEST/test/issues/2"
        issTitle := "Issue"
        issNo := 2

        ghIssueClient := ghICMock{
            issues: []*github.Issue{
                {Number: &prNo, Title: &prTitle, HTMLURL: &prHTMLURL, PullRequestLinks: &github.PullRequestLinks{URL: &prHTMLURL}},
                {Number: &issNo, Title: &issTitle, HTMLURL: &issHTMLURL},
            },
        }
        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            AddClosedIssues:       true,
            AddDeltaToLastRelease: true,
            Commitish:             "master",
            Owner:                 "TEST",
            Repository:            "test",
            ServerURL:             "https://github.com",
            ReleaseBodyHeader:     "Header",
            Version:               "1.1",
        }
        err := runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, &ghRepoClient, &ghIssueClient)

        assert.NoError(t, err, "Error occurred but none expected.")

        assert.Equal(t, "Header\n\n**List of closed pull-requests since last release**\n[#1](https://github.com/TEST/test/pull/1): Pull\n\n**List of closed issues since last release**\n[#2](https://github.com/TEST/test/issues/2): Issue\n\n**Changes**\n[1.0...1.1](https://github.com/TEST/test/compare/1.0...1.1)\n", ghRepoClient.release.GetBody())
        assert.Equal(t, "1.1", ghRepoClient.release.GetName())
        assert.Equal(t, "1.1", ghRepoClient.release.GetTagName())
        assert.Equal(t, "master", ghRepoClient.release.GetTargetCommitish())

        assert.Equal(t, lastPublishedAt.Time, ghIssueClient.lastPublished)
    })

    t.Run("Success - update asset", func(t *testing.T) {
        var releaseID int64 = 1
        ghIssueClient := ghICMock{}
        ghRepoClient := ghRCMock{
            latestRelease: &github.RepositoryRelease{
                ID: &releaseID,
            },
        }

        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            UpdateAsset: true,
            AssetPath:   filepath.Join("testdata", t.Name()+"_test.txt"),
            Version:     "latest",
        }

        err := runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, &ghRepoClient, &ghIssueClient)

        assert.NoError(t, err, "Error occurred but none expected.")

        assert.Nil(t, ghRepoClient.release)

        assert.Equal(t, releaseID, ghRepoClient.listID)
        assert.Equal(t, releaseID, ghRepoClient.uploadID)
    })

    t.Run("Error - get release", func(t *testing.T) {
        ghIssueClient := ghICMock{}
        ghRepoClient := ghRCMock{
            latestErr: fmt.Errorf("Latest release error"),
        }
        myGithubPublishReleaseOptions := githubPublishReleaseOptions{}
        err := runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, &ghRepoClient, &ghIssueClient)

        assert.Equal(t, "Error occurred when retrieving latest GitHub release.: Latest release error", fmt.Sprint(err))
    })

    t.Run("Error - create release", func(t *testing.T) {
        ghIssueClient := ghICMock{}
        ghRepoClient := ghRCMock{
            createErr: fmt.Errorf("Create release error"),
        }
        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            Version: "1.0",
        }
        err := runGithubPublishRelease(ctx, &myGithubPublishReleaseOptions, &ghRepoClient, &ghIssueClient)

        assert.Equal(t, "Creation of release '1.0' failed: Create release error", fmt.Sprint(err))
    })
}

func TestGetClosedIssuesText(t *testing.T) {
    ctx := context.Background()
    publishedAt := github.Timestamp{Time: time.Date(2019, 01, 01, 0, 0, 0, 0, time.UTC)}

    t.Run("No issues", func(t *testing.T) {
        ghIssueClient := ghICMock{}
        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            Version: "1.0",
        }

        res := getClosedIssuesText(ctx, publishedAt, &myGithubPublishReleaseOptions, &ghIssueClient)

        assert.Equal(t, "", res)
    })

    t.Run("All issues", func(t *testing.T) {
        ctx := context.Background()
        publishedAt := github.Timestamp{Time: time.Date(2019, 01, 01, 0, 0, 0, 0, time.UTC)}

        prHTMLURL := []string{"https://github.com/TEST/test/pull/1", "https://github.com/TEST/test/pull/2"}
        prTitle := []string{"Pull1", "Pull2"}
        prNo := []int{1, 2}

        issHTMLURL := []string{"https://github.com/TEST/test/issues/3", "https://github.com/TEST/test/issues/4"}
        issTitle := []string{"Issue3", "Issue4"}
        issNo := []int{3, 4}

        ghIssueClient := ghICMock{
            issues: []*github.Issue{
                {Number: &prNo[0], Title: &prTitle[0], HTMLURL: &prHTMLURL[0], PullRequestLinks: &github.PullRequestLinks{URL: &prHTMLURL[0]}},
                {Number: &prNo[1], Title: &prTitle[1], HTMLURL: &prHTMLURL[1], PullRequestLinks: &github.PullRequestLinks{URL: &prHTMLURL[1]}},
                {Number: &issNo[0], Title: &issTitle[0], HTMLURL: &issHTMLURL[0]},
                {Number: &issNo[1], Title: &issTitle[1], HTMLURL: &issHTMLURL[1]},
            },
        }

        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            Owner:      "TEST",
            Repository: "test",
        }

        res := getClosedIssuesText(ctx, publishedAt, &myGithubPublishReleaseOptions, &ghIssueClient)

        assert.Equal(t, "\n**List of closed pull-requests since last release**\n[#1](https://github.com/TEST/test/pull/1): Pull1\n[#2](https://github.com/TEST/test/pull/2): Pull2\n\n**List of closed issues since last release**\n[#3](https://github.com/TEST/test/issues/3): Issue3\n[#4](https://github.com/TEST/test/issues/4): Issue4\n", res)
        assert.Equal(t, "TEST", ghIssueClient.owner, "Owner not properly passed")
        assert.Equal(t, "test", ghIssueClient.repo, "Repo not properly passed")
        assert.Equal(t, "closed", ghIssueClient.options.State, "Issue state not properly passed")
        assert.Equal(t, "asc", ghIssueClient.options.Direction, "Sort direction not properly passed")
        assert.Equal(t, publishedAt.Time, ghIssueClient.options.Since, "PublishedAt not properly passed")
    })

}

func TestGetReleaseDeltaText(t *testing.T) {
    myGithubPublishReleaseOptions := githubPublishReleaseOptions{
        Owner:      "TEST",
        Repository: "test",
        ServerURL:  "https://github.com",
        Version:    "1.1",
    }
    lastTag := "1.0"
    lastRelease := github.RepositoryRelease{
        TagName: &lastTag,
    }

    res := getReleaseDeltaText(&myGithubPublishReleaseOptions, &lastRelease)

    assert.Equal(t, "\n**Changes**\n[1.0...1.1](https://github.com/TEST/test/compare/1.0...1.1)\n", res)
}

func TestUploadReleaseAsset(t *testing.T) {
    ctx := context.Background()

    t.Run("Success - existing asset", func(t *testing.T) {
        var releaseID int64 = 1
        assetName := "Success_-_existing_asset_test.txt"
        var assetID int64 = 11
        ghRepoClient := ghRCMock{
            latestRelease: &github.RepositoryRelease{
                ID: &releaseID,
            },
            listReleaseAssets: []*github.ReleaseAsset{
                {Name: &assetName, ID: &assetID},
            },
        }

        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            Owner:      "TEST",
            Repository: "test",
            AssetPath:  filepath.Join("testdata", t.Name()+"_test.txt"),
        }

        err := uploadReleaseAsset(ctx, releaseID, &myGithubPublishReleaseOptions, &ghRepoClient)

        assert.NoError(t, err, "Error occurred but none expected.")

        assert.Equal(t, "TEST", ghRepoClient.listOwner, "Owner not properly passed - list")
        assert.Equal(t, "test", ghRepoClient.listRepo, "Repo not properly passed - list")
        assert.Equal(t, releaseID, ghRepoClient.listID, "Release ID not properly passed - list")

        assert.Equal(t, "TEST", ghRepoClient.delOwner, "Owner not properly passed - del")
        assert.Equal(t, "test", ghRepoClient.delRepo, "Repo not properly passed - del")
        assert.Equal(t, assetID, ghRepoClient.delID, "Release ID not properly passed - del")

        assert.Equal(t, "TEST", ghRepoClient.uploadOwner, "Owner not properly passed - upload")
        assert.Equal(t, "test", ghRepoClient.uploadRepo, "Repo not properly passed - upload")
        assert.Equal(t, releaseID, ghRepoClient.uploadID, "Release ID not properly passed - upload")
        assert.Equal(t, "text/plain; charset=utf-8", ghRepoClient.uploadOpts.MediaType, "Wrong MediaType passed - upload")
    })

    t.Run("Success - no asset", func(t *testing.T) {
        var releaseID int64 = 1
        assetName := "notFound"
        var assetID int64 = 11
        ghRepoClient := ghRCMock{
            latestRelease: &github.RepositoryRelease{
                ID: &releaseID,
            },
            listReleaseAssets: []*github.ReleaseAsset{
                {Name: &assetName, ID: &assetID},
            },
        }

        myGithubPublishReleaseOptions := githubPublishReleaseOptions{
            Owner:      "TEST",
            Repository: "test",
            AssetPath:  filepath.Join("testdata", t.Name()+"_test.txt"),
        }

        err := uploadReleaseAsset(ctx, releaseID, &myGithubPublishReleaseOptions, &ghRepoClient)

        assert.NoError(t, err, "Error occurred but none expected.")

        assert.Equal(t, int64(0), ghRepoClient.delID, "Release ID should not be populated")
    })

    t.Run("Error - List Assets", func(t *testing.T) {
        var releaseID int64 = 1
        ghRepoClient := ghRCMock{
            listErr: fmt.Errorf("List Asset Error"),
        }
        myGithubPublishReleaseOptions := githubPublishReleaseOptions{}

        err := uploadReleaseAsset(ctx, releaseID, &myGithubPublishReleaseOptions, &ghRepoClient)
        assert.Equal(t, "Failed to get list of release assets.: List Asset Error", fmt.Sprint(err), "Wrong error received")
    })
}

func TestIsExcluded(t *testing.T) {

    l1 := "label1"
    l2 := "label2"

    tt := []struct {
        issue         *github.Issue
        excludeLabels []string
        expected      bool
    }{
        {issue: nil, excludeLabels: nil, expected: false},
        {issue: &github.Issue{}, excludeLabels: nil, expected: false},
        {issue: &github.Issue{Labels: []github.Label{{Name: &l1}}}, excludeLabels: nil, expected: false},
        {issue: &github.Issue{Labels: []github.Label{{Name: &l1}}}, excludeLabels: []string{"label0"}, expected: false},
        {issue: &github.Issue{Labels: []github.Label{{Name: &l1}}}, excludeLabels: []string{"label1"}, expected: true},
        {issue: &github.Issue{Labels: []github.Label{{Name: &l1}, {Name: &l2}}}, excludeLabels: []string{}, expected: false},
        {issue: &github.Issue{Labels: []github.Label{{Name: &l1}, {Name: &l2}}}, excludeLabels: []string{"label1"}, expected: true},
    }

    for k, v := range tt {
        assert.Equal(t, v.expected, isExcluded(v.issue, v.excludeLabels), fmt.Sprintf("Run %v failed", k))
    }

}
cmd/interfaces.go (new file, 11 lines)
@@ -0,0 +1,11 @@
package cmd

type execRunner interface {
    RunExecutable(e string, p ...string) error
    Dir(d string)
}

type shellRunner interface {
    RunShell(s string, c string) error
    Dir(d string)
}
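These small interfaces exist so that steps can run commands through an abstraction and tests can substitute a fake runner instead of spawning real processes. A minimal sketch of such a test double (hypothetical, not part of this diff) could look like this:

```golang
package cmd

import "strings"

// execRunnerMock records the calls it receives instead of executing anything.
type execRunnerMock struct {
    dir   string
    calls []string
}

// RunExecutable stores the command line for later assertions and never fails.
func (m *execRunnerMock) RunExecutable(e string, p ...string) error {
    m.calls = append(m.calls, e+" "+strings.Join(p, " "))
    return nil
}

// Dir records the working directory that was requested.
func (m *execRunnerMock) Dir(d string) {
    m.dir = d
}
```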
cmd/karmaExecuteTests.go (new file, 44 lines)
@@ -0,0 +1,44 @@
package cmd

import (
    "strings"

    "github.com/SAP/jenkins-library/pkg/command"
    "github.com/SAP/jenkins-library/pkg/log"
)

func karmaExecuteTests(myKarmaExecuteTestsOptions karmaExecuteTestsOptions) error {
    c := command.Command{}
    // reroute command output to logging framework
    // also log stdout as Karma reports into it
    c.Stdout = log.Entry().Writer()
    c.Stderr = log.Entry().Writer()
    runKarma(myKarmaExecuteTestsOptions, &c)
    return nil
}

func runKarma(myKarmaExecuteTestsOptions karmaExecuteTestsOptions, command execRunner) {
    installCommandTokens := tokenize(myKarmaExecuteTestsOptions.InstallCommand)
    command.Dir(myKarmaExecuteTestsOptions.ModulePath)
    err := command.RunExecutable(installCommandTokens[0], installCommandTokens[1:]...)
    if err != nil {
        log.Entry().
            WithError(err).
            WithField("command", myKarmaExecuteTestsOptions.InstallCommand).
            Fatal("failed to execute install command")
    }

    runCommandTokens := tokenize(myKarmaExecuteTestsOptions.RunCommand)
    command.Dir(myKarmaExecuteTestsOptions.ModulePath)
    err = command.RunExecutable(runCommandTokens[0], runCommandTokens[1:]...)
    if err != nil {
        log.Entry().
            WithError(err).
            WithField("command", myKarmaExecuteTestsOptions.RunCommand).
            Fatal("failed to execute run command")
    }
}

func tokenize(command string) []string {
    return strings.Split(command, " ")
}
88 cmd/karmaExecuteTests_generated.go Normal file
@@ -0,0 +1,88 @@
package cmd

import (
	"github.com/SAP/jenkins-library/pkg/config"
	"github.com/SAP/jenkins-library/pkg/log"
	"github.com/spf13/cobra"
)

type karmaExecuteTestsOptions struct {
	InstallCommand string `json:"installCommand,omitempty"`
	ModulePath     string `json:"modulePath,omitempty"`
	RunCommand     string `json:"runCommand,omitempty"`
}

var myKarmaExecuteTestsOptions karmaExecuteTestsOptions
var karmaExecuteTestsStepConfigJSON string

// KarmaExecuteTestsCommand Executes the Karma test runner
func KarmaExecuteTestsCommand() *cobra.Command {
	metadata := karmaExecuteTestsMetadata()
	var createKarmaExecuteTestsCmd = &cobra.Command{
		Use:   "karmaExecuteTests",
		Short: "Executes the Karma test runner",
		Long: `In this step the ([Karma test runner](http://karma-runner.github.io)) is executed.

The step is using the ` + "`" + `seleniumExecuteTest` + "`" + ` step to spin up two containers in a Docker network:

* a Selenium/Chrome container (` + "`" + `selenium/standalone-chrome` + "`" + `)
* a NodeJS container (` + "`" + `node:8-stretch` + "`" + `)

In the Docker network, the containers can be referenced by the values provided in ` + "`" + `dockerName` + "`" + ` and ` + "`" + `sidecarName` + "`" + `, the default values are ` + "`" + `karma` + "`" + ` and ` + "`" + `selenium` + "`" + `. These values must be used in the ` + "`" + `hostname` + "`" + ` properties of the test configuration ([Karma](https://karma-runner.github.io/1.0/config/configuration-file.html) and [WebDriver](https://github.com/karma-runner/karma-webdriver-launcher#usage)).

!!! note
    In a Kubernetes environment, the containers both need to be referenced with ` + "`" + `localhost` + "`" + `.`,
		PreRunE: func(cmd *cobra.Command, args []string) error {
			log.SetStepName("karmaExecuteTests")
			log.SetVerbose(GeneralConfig.Verbose)
			return PrepareConfig(cmd, &metadata, "karmaExecuteTests", &myKarmaExecuteTestsOptions, OpenPiperFile)
		},
		RunE: func(cmd *cobra.Command, args []string) error {
			return karmaExecuteTests(myKarmaExecuteTestsOptions)
		},
	}

	addKarmaExecuteTestsFlags(createKarmaExecuteTestsCmd)
	return createKarmaExecuteTestsCmd
}

func addKarmaExecuteTestsFlags(cmd *cobra.Command) {
	cmd.Flags().StringVar(&myKarmaExecuteTestsOptions.InstallCommand, "installCommand", "npm install --quiet", "The command that is executed to install the test tool.")
	cmd.Flags().StringVar(&myKarmaExecuteTestsOptions.ModulePath, "modulePath", ".", "Define the path of the module to execute tests on.")
	cmd.Flags().StringVar(&myKarmaExecuteTestsOptions.RunCommand, "runCommand", "npm run karma", "The command that is executed to start the tests.")

	cmd.MarkFlagRequired("installCommand")
	cmd.MarkFlagRequired("modulePath")
	cmd.MarkFlagRequired("runCommand")
}

// retrieve step metadata
func karmaExecuteTestsMetadata() config.StepData {
	var theMetaData = config.StepData{
		Spec: config.StepSpec{
			Inputs: config.StepInputs{
				Parameters: []config.StepParameters{
					{
						Name:      "installCommand",
						Scope:     []string{"GENERAL", "PARAMETERS", "STAGES", "STEPS"},
						Type:      "string",
						Mandatory: true,
					},
					{
						Name:      "modulePath",
						Scope:     []string{"PARAMETERS", "STAGES", "STEPS"},
						Type:      "string",
						Mandatory: true,
					},
					{
						Name:      "runCommand",
						Scope:     []string{"GENERAL", "PARAMETERS", "STAGES", "STEPS"},
						Type:      "string",
						Mandatory: true,
					},
				},
			},
		},
	}
	return theMetaData
}
16 cmd/karmaExecuteTests_generated_test.go Normal file
@@ -0,0 +1,16 @@
package cmd

import (
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestKarmaExecuteTestsCommand(t *testing.T) {

	testCmd := KarmaExecuteTestsCommand()

	// only high level testing performed - details are tested in step generation procedure
	assert.Equal(t, "karmaExecuteTests", testCmd.Use, "command name incorrect")

}
47 cmd/karmaExecuteTests_test.go Normal file
@@ -0,0 +1,47 @@
package cmd

import (
	"errors"
	"testing"

	"github.com/SAP/jenkins-library/pkg/log"
	"github.com/stretchr/testify/assert"
)

func TestRunKarma(t *testing.T) {
	t.Run("success case", func(t *testing.T) {
		opts := karmaExecuteTestsOptions{ModulePath: "./test", InstallCommand: "npm install test", RunCommand: "npm run test"}

		e := execMockRunner{}
		runKarma(opts, &e)

		assert.Equal(t, e.dir[0], "./test", "install command dir incorrect")
		assert.Equal(t, e.calls[0], execCall{exec: "npm", params: []string{"install", "test"}}, "install command/params incorrect")

		assert.Equal(t, e.dir[1], "./test", "run command dir incorrect")
		assert.Equal(t, e.calls[1], execCall{exec: "npm", params: []string{"run", "test"}}, "run command/params incorrect")

	})

	t.Run("error case install command", func(t *testing.T) {
		var hasFailed bool
		log.Entry().Logger.ExitFunc = func(int) { hasFailed = true }

		opts := karmaExecuteTestsOptions{ModulePath: "./test", InstallCommand: "fail install test", RunCommand: "npm run test"}

		e := execMockRunner{shouldFailWith: errors.New("error case")}
		runKarma(opts, &e)
		assert.True(t, hasFailed, "expected command to exit with fatal")
	})

	t.Run("error case run command", func(t *testing.T) {
		var hasFailed bool
		log.Entry().Logger.ExitFunc = func(int) { hasFailed = true }

		opts := karmaExecuteTestsOptions{ModulePath: "./test", InstallCommand: "npm install test", RunCommand: "npm run test"}

		e := execMockRunner{shouldFailWith: errors.New("error case")}
		runKarma(opts, &e)
		assert.True(t, hasFailed, "expected command to exit with fatal")
	})
}
123 cmd/piper.go Normal file
@@ -0,0 +1,123 @@
package cmd

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
	"strings"

	"github.com/SAP/jenkins-library/pkg/config"
	"github.com/SAP/jenkins-library/pkg/log"
	"github.com/SAP/jenkins-library/pkg/piperutils"
	"github.com/pkg/errors"
	"github.com/spf13/cobra"
)

// GeneralConfigOptions contains all global configuration options for piper binary
type GeneralConfigOptions struct {
	CustomConfig   string
	DefaultConfig  []string //ordered list of Piper default configurations. Can be filePath or ENV containing JSON in format 'ENV:MY_ENV_VAR'
	ParametersJSON string
	StageName      string
	StepConfigJSON string
	StepMetadata   string //metadata to be considered, can be filePath or ENV containing JSON in format 'ENV:MY_ENV_VAR'
	StepName       string
	Verbose        bool
}

var rootCmd = &cobra.Command{
	Use:   "piper",
	Short: "Executes CI/CD steps from project 'Piper'",
	Long: `
This project 'Piper' binary provides a CI/CD step library.
It contains many steps which can be used within CI/CD systems as well as directly on e.g. a developer's machine.
`,
	//ToDo: respect stageName to also come from parametersJSON -> first env.STAGE_NAME, second: parametersJSON, third: flag
}

// GeneralConfig contains global configuration flags for piper binary
var GeneralConfig GeneralConfigOptions

// Execute is the starting point of the piper command line tool
func Execute() {

	rootCmd.AddCommand(ConfigCommand())
	rootCmd.AddCommand(VersionCommand())
	rootCmd.AddCommand(KarmaExecuteTestsCommand())
	rootCmd.AddCommand(GithubPublishReleaseCommand())

	addRootFlags(rootCmd)
	if err := rootCmd.Execute(); err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
}

func addRootFlags(rootCmd *cobra.Command) {

	rootCmd.PersistentFlags().StringVar(&GeneralConfig.CustomConfig, "customConfig", ".pipeline/config.yml", "Path to the pipeline configuration file")
	rootCmd.PersistentFlags().StringSliceVar(&GeneralConfig.DefaultConfig, "defaultConfig", nil, "Default configurations, passed as path to yaml file")
	rootCmd.PersistentFlags().StringVar(&GeneralConfig.ParametersJSON, "parametersJSON", os.Getenv("PIPER_parametersJSON"), "Parameters to be considered in JSON format")
	rootCmd.PersistentFlags().StringVar(&GeneralConfig.StageName, "stageName", os.Getenv("STAGE_NAME"), "Name of the stage for which configuration should be included")
	rootCmd.PersistentFlags().StringVar(&GeneralConfig.StepConfigJSON, "stepConfigJSON", os.Getenv("PIPER_stepConfigJSON"), "Step configuration in JSON format")
	rootCmd.PersistentFlags().BoolVarP(&GeneralConfig.Verbose, "verbose", "v", false, "verbose output")

}

// PrepareConfig reads step configuration from various sources and merges it (defaults, config file, flags, ...)
func PrepareConfig(cmd *cobra.Command, metadata *config.StepData, stepName string, options interface{}, openFile func(s string) (io.ReadCloser, error)) error {

	filters := metadata.GetParameterFilters()

	flagValues := config.AvailableFlagValues(cmd, &filters)

	var myConfig config.Config
	var stepConfig config.StepConfig

	if len(GeneralConfig.StepConfigJSON) != 0 {
		// ignore config & defaults in favor of passed stepConfigJSON
		stepConfig = config.GetStepConfigWithJSON(flagValues, GeneralConfig.StepConfigJSON, filters)
	} else {
		// use config & defaults
		var customConfig io.ReadCloser
		var err error
		//accept that config file and defaults cannot be loaded since both are not mandatory here
		if piperutils.FileExists(GeneralConfig.CustomConfig) {
			if customConfig, err = openFile(GeneralConfig.CustomConfig); err != nil {
				errors.Wrapf(err, "Cannot read '%s'", GeneralConfig.CustomConfig)
			}
		} else {
			log.Entry().Infof("Project config file '%s' does not exist. No project configuration available.", GeneralConfig.CustomConfig)
			customConfig = nil
		}

		var defaultConfig []io.ReadCloser
		for _, f := range GeneralConfig.DefaultConfig {
			//ToDo: support also https as source
			fc, _ := openFile(f)
			defaultConfig = append(defaultConfig, fc)
		}

		stepConfig, err = myConfig.GetStepConfig(flagValues, GeneralConfig.ParametersJSON, customConfig, defaultConfig, filters, metadata.Spec.Inputs.Parameters, GeneralConfig.StageName, stepName)
		if err != nil {
			return errors.Wrap(err, "retrieving step configuration failed")
		}
	}

	confJSON, _ := json.Marshal(stepConfig.Config)
	json.Unmarshal(confJSON, &options)

	config.MarkFlagsWithValue(cmd, stepConfig)

	return nil
}

// OpenPiperFile provides functionality to retrieve configuration via file or http
func OpenPiperFile(name string) (io.ReadCloser, error) {
	//ToDo: support also https as source
	if !strings.HasPrefix(name, "http") {
		return os.Open(name)
	}
	return nil, fmt.Errorf("file location not yet supported for '%v'", name)
}
154 cmd/piper_test.go Normal file
@@ -0,0 +1,154 @@
package cmd

import (
	"io"
	"io/ioutil"
	"strings"
	"testing"

	"github.com/SAP/jenkins-library/pkg/config"
	"github.com/spf13/cobra"
	flag "github.com/spf13/pflag"
	"github.com/stretchr/testify/assert"
)

type execMockRunner struct {
	dir            []string
	calls          []execCall
	shouldFailWith error
}

type execCall struct {
	exec   string
	params []string
}

type shellMockRunner struct {
	dir            string
	calls          []string
	shouldFailWith error
}

func (m *execMockRunner) Dir(d string) {
	m.dir = append(m.dir, d)
}

func (m *execMockRunner) RunExecutable(e string, p ...string) error {
	if m.shouldFailWith != nil {
		return m.shouldFailWith
	}
	exec := execCall{exec: e, params: p}
	m.calls = append(m.calls, exec)
	return nil
}

func (m *shellMockRunner) Dir(d string) {
	m.dir = d
}

func (m *shellMockRunner) RunShell(s string, c string) error {

	if m.shouldFailWith != nil {
		return m.shouldFailWith
	}

	m.calls = append(m.calls, c)
	return nil
}

type stepOptions struct {
	TestParam string `json:"testParam,omitempty"`
}

func openFileMock(name string) (io.ReadCloser, error) {
	var r string
	switch name {
	case "testDefaults.yml":
		r = "general:\n testParam: testValue"
	case "testDefaultsInvalid.yml":
		r = "invalid yaml"
	default:
		r = ""
	}
	return ioutil.NopCloser(strings.NewReader(r)), nil
}

func TestAddRootFlags(t *testing.T) {
	var testRootCmd = &cobra.Command{Use: "test", Short: "This is just a test"}
	addRootFlags(testRootCmd)

	assert.NotNil(t, testRootCmd.Flag("customConfig"), "expected flag not available")
	assert.NotNil(t, testRootCmd.Flag("defaultConfig"), "expected flag not available")
	assert.NotNil(t, testRootCmd.Flag("parametersJSON"), "expected flag not available")
	assert.NotNil(t, testRootCmd.Flag("stageName"), "expected flag not available")
	assert.NotNil(t, testRootCmd.Flag("stepConfigJSON"), "expected flag not available")
	assert.NotNil(t, testRootCmd.Flag("verbose"), "expected flag not available")

}

func TestPrepareConfig(t *testing.T) {
	defaultsBak := GeneralConfig.DefaultConfig
	GeneralConfig.DefaultConfig = []string{"testDefaults.yml"}
	defer func() { GeneralConfig.DefaultConfig = defaultsBak }()

	t.Run("using stepConfigJSON", func(t *testing.T) {
		stepConfigJSONBak := GeneralConfig.StepConfigJSON
		GeneralConfig.StepConfigJSON = `{"testParam": "testValueJSON"}`
		defer func() { GeneralConfig.StepConfigJSON = stepConfigJSONBak }()
		testOptions := stepOptions{}
		var testCmd = &cobra.Command{Use: "test", Short: "This is just a test"}
		testCmd.Flags().StringVar(&testOptions.TestParam, "testParam", "", "test usage")
		metadata := config.StepData{
			Spec: config.StepSpec{
				Inputs: config.StepInputs{
					Parameters: []config.StepParameters{
						{Name: "testParam", Scope: []string{"GENERAL"}},
					},
				},
			},
		}

		PrepareConfig(testCmd, &metadata, "testStep", &testOptions, openFileMock)
		assert.Equal(t, "testValueJSON", testOptions.TestParam, "wrong value retrieved from config")
	})

	t.Run("using config files", func(t *testing.T) {
		t.Run("success case", func(t *testing.T) {
			testOptions := stepOptions{}
			var testCmd = &cobra.Command{Use: "test", Short: "This is just a test"}
			testCmd.Flags().StringVar(&testOptions.TestParam, "testParam", "", "test usage")
			metadata := config.StepData{
				Spec: config.StepSpec{
					Inputs: config.StepInputs{
						Parameters: []config.StepParameters{
							{Name: "testParam", Scope: []string{"GENERAL"}},
						},
					},
				},
			}

			err := PrepareConfig(testCmd, &metadata, "testStep", &testOptions, openFileMock)
			assert.NoError(t, err, "no error expected but error occurred")

			//assert config
			assert.Equal(t, "testValue", testOptions.TestParam, "wrong value retrieved from config")

			//assert that flag has been marked as changed
			testCmd.Flags().VisitAll(func(pflag *flag.Flag) {
				if pflag.Name == "testParam" {
					assert.True(t, pflag.Changed, "flag should be marked as changed")
				}
			})
		})

		t.Run("error case", func(t *testing.T) {
			GeneralConfig.DefaultConfig = []string{"testDefaultsInvalid.yml"}
			testOptions := stepOptions{}
			var testCmd = &cobra.Command{Use: "test", Short: "This is just a test"}
			metadata := config.StepData{}

			err := PrepareConfig(testCmd, &metadata, "testStep", &testOptions, openFileMock)
			assert.Error(t, err, "error expected but none occurred")
		})
	})
}
1 cmd/testdata/TestRunGithubPublishRelease/Success_-_update_asset_test.txt vendored Normal file
@@ -0,0 +1 @@
TEST
1 cmd/testdata/TestUploadReleaseAsset/Success_-_existing_asset_test.txt vendored Normal file
@@ -0,0 +1 @@
TEST
1 cmd/testdata/TestUploadReleaseAsset/Success_-_no_asset_test.txt vendored Normal file
@@ -0,0 +1 @@
TEST
28 cmd/version.go Normal file
@@ -0,0 +1,28 @@
package cmd

import (
	"fmt"
)

// GitCommit contains the commit hash, set at build time via -ldflags
var GitCommit string

// GitTag contains the git tag (if any), set at build time via -ldflags
var GitTag string

func version(myVersionOptions versionOptions) error {

	gitCommit, gitTag := "<n/a>", "<n/a>"

	if len(GitCommit) > 0 {
		gitCommit = GitCommit
	}

	if len(GitTag) > 0 {
		gitTag = GitTag
	}

	_, err := fmt.Printf("piper-version:\n commit: \"%s\"\n tag: \"%s\"\n", gitCommit, gitTag)

	return err
}
50 cmd/version_generated.go Normal file
@@ -0,0 +1,50 @@
package cmd

import (
	"github.com/SAP/jenkins-library/pkg/config"
	"github.com/SAP/jenkins-library/pkg/log"
	"github.com/spf13/cobra"
)

type versionOptions struct {
}

var myVersionOptions versionOptions
var versionStepConfigJSON string

// VersionCommand Returns the version of the piper binary
func VersionCommand() *cobra.Command {
	metadata := versionMetadata()
	var createVersionCmd = &cobra.Command{
		Use:   "version",
		Short: "Returns the version of the piper binary",
		Long:  `Writes the commit hash and the tag (if any) to stdout and exits with 0.`,
		PreRunE: func(cmd *cobra.Command, args []string) error {
			log.SetStepName("version")
			log.SetVerbose(GeneralConfig.Verbose)
			return PrepareConfig(cmd, &metadata, "version", &myVersionOptions, OpenPiperFile)
		},
		RunE: func(cmd *cobra.Command, args []string) error {
			return version(myVersionOptions)
		},
	}

	addVersionFlags(createVersionCmd)
	return createVersionCmd
}

func addVersionFlags(cmd *cobra.Command) {

}

// retrieve step metadata
func versionMetadata() config.StepData {
	var theMetaData = config.StepData{
		Spec: config.StepSpec{
			Inputs: config.StepInputs{
				Parameters: []config.StepParameters{},
			},
		},
	}
	return theMetaData
}
16 cmd/version_generated_test.go Normal file
@@ -0,0 +1,16 @@
package cmd

import (
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestVersionCommand(t *testing.T) {

	testCmd := VersionCommand()

	// only high level testing performed - details are tested in step generation procedure
	assert.Equal(t, "version", testCmd.Use, "command name incorrect")

}
64 cmd/version_test.go Normal file
@@ -0,0 +1,64 @@
package cmd

import (
	"bytes"
	"io"
	"os"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestVersion(t *testing.T) {

	t.Run("versionAndTagInitialValues", func(t *testing.T) {

		result := runVersionCommand(t, "", "")
		assert.Contains(t, result, "commit: \"<n/a>\"")
		assert.Contains(t, result, "tag: \"<n/a>\"")
	})

	t.Run("versionAndTagSet", func(t *testing.T) {

		result := runVersionCommand(t, "16bafe", "v1.2.3")
		assert.Contains(t, result, "commit: \"16bafe\"")
		assert.Contains(t, result, "tag: \"v1.2.3\"")
	})
}

func runVersionCommand(t *testing.T, commitID, tag string) string {

	orig := os.Stdout
	defer func() { os.Stdout = orig }()

	r, w, e := os.Pipe()
	if e != nil {
		t.Error("Cannot set up pipes.")
	}

	os.Stdout = w

	//
	// needs to be set in the wild by the build process:
	// go build -ldflags "-X github.com/SAP/jenkins-library/cmd.GitCommit=${GIT_COMMIT} -X github.com/SAP/jenkins-library/cmd.GitTag=${GIT_TAG}"
	if len(commitID) > 0 {
		GitCommit = commitID
	}
	if len(tag) > 0 {
		GitTag = tag
	}
	defer func() { GitCommit = ""; GitTag = "" }()
	//
	//

	var myVersionOptions versionOptions
	e = version(myVersionOptions)
	if e != nil {
		t.Error("Version command failed.")
	}

	w.Close()

	var buf bytes.Buffer
	io.Copy(&buf, r)
	return buf.String()
}
|
@ -22,6 +22,11 @@ class TestRunnerThread extends Thread {
|
||||
def testCaseConfig
|
||||
|
||||
TestRunnerThread(File testCaseFile) {
|
||||
this.testCaseConfig = new Yaml().load(testCaseFile.text)
|
||||
if (!System.getenv(testCaseConfig.deployCredentialEnv.username) ||
|
||||
!System.getenv(testCaseConfig.deployCredentialEnv.password)) {
|
||||
throw new RuntimeException("Environment variables '${testCaseConfig.deployCredentialEnv.username}' and '${testCaseConfig.deployCredentialEnv.password}' need to be set.")
|
||||
}
|
||||
// Regex pattern expects a folder structure such as '/rootDir/areaDir/testCase.extension'
|
||||
def testCaseMatches = (testCaseFile.toString() =~
|
||||
/^[\w\-]+\\/([\w\-]+)\\/([\w\-]+)\..*\u0024/)
|
||||
@ -34,7 +39,6 @@ class TestRunnerThread extends Thread {
|
||||
this.uniqueName = "${area}|${testCase}"
|
||||
this.testCaseRootDir = new File("${workspacesRootDir}/${area}/${testCase}")
|
||||
this.testCaseWorkspace = "${testCaseRootDir}/workspace"
|
||||
this.testCaseConfig = new Yaml().load(testCaseFile.text)
|
||||
}
|
||||
|
||||
void run() {
|
||||
@ -43,8 +47,10 @@ class TestRunnerThread extends Thread {
|
||||
if (testCaseRootDir.exists() || !testCaseRootDir.mkdirs()) {
|
||||
throw new RuntimeException("Creation of dir '${testCaseRootDir}' failed.")
|
||||
}
|
||||
executeShell("git clone -b ${testCase} ${testCaseConfig.referenceAppRepoUrl} " +
|
||||
"${testCaseWorkspace}")
|
||||
|
||||
executeShell("git clone -b ${testCaseConfig.referenceAppRepo.branch} " +
|
||||
"${testCaseConfig.referenceAppRepo.url} ${testCaseWorkspace}")
|
||||
|
||||
addJenkinsYmlToWorkspace()
|
||||
setLibraryVersionInJenkinsfile()
|
||||
|
||||
@ -53,10 +59,12 @@ class TestRunnerThread extends Thread {
|
||||
'--author="piper-testing-bot <piper-testing-bot@example.com>"',
|
||||
'--message="Set piper lib version for test"'])
|
||||
|
||||
executeShell("docker run -v /var/run/docker.sock:/var/run/docker.sock " +
|
||||
executeShell("docker run --rm -v /var/run/docker.sock:/var/run/docker.sock " +
|
||||
"-v ${System.getenv('PWD')}/${testCaseWorkspace}:/workspace -v /tmp " +
|
||||
"-e CASC_JENKINS_CONFIG=/workspace/jenkins.yml -e CX_INFRA_IT_CF_USERNAME " +
|
||||
"-e CX_INFRA_IT_CF_PASSWORD -e BRANCH_NAME=${testCase} ppiper/jenkinsfile-runner")
|
||||
"-e CASC_JENKINS_CONFIG=/workspace/jenkins.yml " +
|
||||
"-e ${testCaseConfig.deployCredentialEnv.username} " +
|
||||
"-e ${testCaseConfig.deployCredentialEnv.password} " +
|
||||
"-e BRANCH_NAME=${testCaseConfig.referenceAppRepo.branch} ppiper/jenkinsfile-runner")
|
||||
|
||||
println "*****[INFO] Test case '${uniqueName}' finished successfully.*****"
|
||||
printOutput()
|
||||
|
@ -68,10 +68,6 @@ if (!RUNNING_LOCALLY) {
|
||||
}
|
||||
}
|
||||
|
||||
if (!System.getenv('CX_INFRA_IT_CF_USERNAME') || !System.getenv('CX_INFRA_IT_CF_PASSWORD')) {
|
||||
exitPrematurely('Environment variables CX_INFRA_IT_CF_USERNAME and CX_INFRA_IT_CF_PASSWORD need to be set.')
|
||||
}
|
||||
|
||||
if (options.s) {
|
||||
def file = new File(options.s)
|
||||
if (!file.exists()) {
|
||||
|
@ -27,3 +27,15 @@ credentials:
|
||||
username: ${CX_INFRA_IT_CF_USERNAME}
|
||||
password: ${CX_INFRA_IT_CF_PASSWORD}
|
||||
description: "SAP CP Trial account for test deployment"
|
||||
- usernamePassword:
|
||||
scope: GLOBAL
|
||||
id: "neo_deploy"
|
||||
username: ${NEO_DEPLOY_USERNAME}
|
||||
password: ${NEO_DEPLOY_PASSWORD}
|
||||
description: "SAP CP NEO Trial account for test deployment"
|
||||
- usernamePassword:
|
||||
scope: GLOBAL
|
||||
id: "cf_deploy"
|
||||
username: ${CX_INFRA_IT_CF_USERNAME}
|
||||
password: ${CX_INFRA_IT_CF_PASSWORD}
|
||||
description: "SAP CP CF Trial account for test deployment"
|
||||
|
@ -1,2 +1,7 @@
|
||||
# Test case configuration
|
||||
referenceAppRepoUrl: "https://github.com/sap/cloud-s4-sdk-book"
|
||||
referenceAppRepo:
|
||||
url: "https://github.com/piper-validation/cloud-s4-sdk-book"
|
||||
branch: "consumer-test-neo"
|
||||
deployCredentialEnv:
|
||||
username: "CX_INFRA_IT_CF_USERNAME"
|
||||
password: "CX_INFRA_IT_CF_PASSWORD"
|
||||
|
@ -1,2 +1,7 @@
|
||||
# Test case configuration
|
||||
referenceAppRepoUrl: "https://github.com/sap/cloud-s4-sdk-book"
|
||||
referenceAppRepo:
|
||||
url: "https://github.com/piper-validation/cloud-s4-sdk-book"
|
||||
branch: "consumer-test"
|
||||
deployCredentialEnv:
|
||||
username: "CX_INFRA_IT_CF_USERNAME"
|
||||
password: "CX_INFRA_IT_CF_PASSWORD"
|
||||
|
7
consumer-test/testCases/scs/cap.yml
Normal file
7
consumer-test/testCases/scs/cap.yml
Normal file
@ -0,0 +1,7 @@
|
||||
# Test case configuration
|
||||
referenceAppRepo:
|
||||
url: "https://github.com/piper-validation/mta-sample-app.git"
|
||||
branch: "piper-test-cap"
|
||||
deployCredentialEnv:
|
||||
username: "CX_INFRA_IT_CF_USERNAME"
|
||||
password: "CX_INFRA_IT_CF_PASSWORD"
|
7
consumer-test/testCases/scs/ui5-neo.yml
Normal file
7
consumer-test/testCases/scs/ui5-neo.yml
Normal file
@ -0,0 +1,7 @@
|
||||
# Test case configuration
|
||||
referenceAppRepo:
|
||||
url: "https://github.com/piper-validation/openui5-sample-app.git"
|
||||
branch: "piper-test-ui5-neo"
|
||||
deployCredentialEnv:
|
||||
username: "NEO_DEPLOY_USERNAME"
|
||||
password: "NEO_DEPLOY_PASSWORD"
|
@ -364,7 +364,7 @@ class Helper {
|
||||
def param = retrieveParameterName(line)
|
||||
|
||||
if(!param) {
|
||||
throw new RuntimeException('Cannot retrieve parameter for a comment')
|
||||
throw new RuntimeException("Cannot retrieve parameter for a comment. Affected line was: '${line}'")
|
||||
}
|
||||
|
||||
def _docu = [], _value = [], _mandatory = [], _parentObject = []
|
||||
@ -489,7 +489,7 @@ class Helper {
|
||||
def params = [] as Set
|
||||
f.eachLine {
|
||||
line ->
|
||||
if (line ==~ /.*withMandatoryProperty.*/) {
|
||||
if (line ==~ /.*withMandatoryProperty\(.*/) {
|
||||
def param = (line =~ /.*withMandatoryProperty\('(.*)'/)[0][1]
|
||||
params << param
|
||||
}
|
||||
@ -666,13 +666,15 @@ Map stages = Helper.resolveDocuRelevantStages(gse, stepsDir)
|
||||
boolean exceptionCaught = false
|
||||
|
||||
def stepDescriptors = [:]
|
||||
DefaultValueCache.prepare(Helper.getDummyScript('noop'), customDefaults)
|
||||
DefaultValueCache.prepare(Helper.getDummyScript('noop'), [customDefaults: customDefaults])
|
||||
for (step in steps) {
|
||||
try {
|
||||
stepDescriptors."${step}" = handleStep(step, gse)
|
||||
} catch(Exception e) {
|
||||
exceptionCaught = true
|
||||
System.err << "${e.getClass().getName()} caught while handling step '${step}': ${e.getMessage()}.\n"
|
||||
def writer = new StringWriter()
|
||||
e.printStackTrace(new PrintWriter(writer))
|
||||
System.err << "${e.getClass().getName()} caught while handling step '${step}': ${e.getMessage()}.\n${writer.toString()}\n"
|
||||
}
|
||||
}
|
||||
|
||||
@ -837,7 +839,8 @@ def handleStep(stepName, gse) {
|
||||
File theStepDocu = new File(stepsDocuDir, "${stepName}.md")
|
||||
File theStepDeps = new File('documentation/jenkins_workspace/plugin_mapping.json')
|
||||
|
||||
if (!theStepDocu.exists() && stepName.indexOf('Stage') != -1) {
|
||||
def stageNameFields = stepName.split('Stage')
|
||||
if (!theStepDocu.exists() && stepName.indexOf('Stage') != -1 && stageNameFields.size() > 1) {
|
||||
//try to get a corresponding stage documentation
|
||||
def stageName = stepName.split('Stage')[1].toLowerCase()
|
||||
theStepDocu = new File(stagesDocuDir,"${stageName}.md" )
|
||||
|
@ -1,6 +1,9 @@
|
||||
# Configuration
|
||||
|
||||
Configuration is done via a yml-file, located at `.pipeline/config.yml` in the **master branch** of your source code repository.
|
||||
Configure your project through a yml-file, which is located at `.pipeline/config.yml` in the **master branch** of your source code repository.
|
||||
|
||||
!!! note "Cloud SDK Pipeline"
|
||||
Cloud SDK Pipelines are configured in a file called `pipeline_config.yml`. See [SAP Cloud SDK Pipeline Configuration Docs](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md).
|
||||
|
||||
Your configuration inherits from the default configuration located at [https://github.com/SAP/jenkins-library/blob/master/resources/default_pipeline_environment.yml](https://github.com/SAP/jenkins-library/blob/master/resources/default_pipeline_environment.yml).
|
||||
|
||||
|
@ -4,10 +4,13 @@ There are several possibilities for extensibility besides the **[very powerful c
|
||||
|
||||
## 1. Stage Exits
|
||||
|
||||
You have to create a file like `<StageName>.groovy` for example `Acceptance.groovy` and store it in folder `.pipeline/extensions/` in your source code repository.
|
||||
You have to create a file like `<StageName>.groovy` (for example, `Acceptance.groovy`) and store it in folder `.pipeline/extensions/` in your source code repository.
|
||||
|
||||
The pipeline template will check if such a file exists and executes it if present.
|
||||
A parameter is passed to the extension containing following keys:
|
||||
!!! note "Cloud SDK Pipeline"
|
||||
If you use the Cloud SDK Pipeline, the folder is named `pipeline/extensions/` (without the dot). For more information, please refer to [the Cloud SDK Pipeline documentation](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/extensibility.md).
|
||||
|
||||
The pipeline template checks if such a file exists and executes it, if present.
|
||||
A parameter that contains the following keys is passed to the extension:
|
||||
|
||||
* `script`: defines the global script environment of the Jenkinsfile run. This makes sure that the correct configuration environment can be passed to project "Piper" steps and also allows access to, for example, the `commonPipelineEnvironment`.
* `originalStage`: allows you to execute the "original" stage at any place in your script. If you omit the call to `originalStage()`, only your code is executed. A minimal sketch of such an extension follows below.
|
||||
|
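To make the mechanics concrete, here is a minimal sketch of such a stage exit. It assumes only the `script` and `originalStage` keys described above; the file name `Acceptance.groovy` is just the example used earlier, and the `echo` statements are placeholders for your own logic:

```groovy
// .pipeline/extensions/Acceptance.groovy
void call(Map params) {
    echo "Running custom logic before the Acceptance stage"

    // execute the original stage implementation provided by the pipeline template
    params.originalStage()

    echo "Running custom logic after the Acceptance stage"
}
return this
```

Omitting the call to `params.originalStage()` replaces the stage entirely with your custom code, as described above.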
@ -12,29 +12,46 @@ The stated instructions assume the use of this application.
|
||||
|
||||
* You have installed a Linux system with at least 4 GB memory. **Note:** We have tested our samples on Ubuntu 16.04. On Microsoft Windows, you might face some issues.
|
||||
* You have installed the newest version of Docker. See [Docker Community Edition](https://docs.docker.com/install/). **Note:** we have tested on Docker 18.09.6.
|
||||
* You have installed Jenkins 2.60.3 or higher. **Recommendation:** We recommend to use the `cx-server` toolkit. See **(Optional) Install the `cx-server` Toolkit for Jenkins**. **Note:** If you use your **own Jenkins installation** you need to care for "Piper" specific configuration. Follow [my own Jenkins installation][guidedtour-my-own-jenkins].
|
||||
* Your system has access to [GitHub.com][github].
|
||||
|
||||
## (Optional) Install the `cx-server` Toolkit for Jenkins
|
||||
## **Recommended:** Install the Cx Server Life-cycle Management for Jenkins
|
||||
|
||||
`cx-server`is a lifecycle management toolkit that provides Docker images with a preconfigured Jenkins and a Nexus-based cache to facilitate the configuration and usage of Jenkins.
|
||||
Cx Server is a life-cycle management tool to bootstrap a pre-configured Jenkins instance within minutes.
|
||||
All required plugins and shared libraries are included automatically.
|
||||
It is based on Docker images provided by project "Piper".
|
||||
|
||||
To use the toolkit, get the `cx-server` script and its configuration file `server.cfg` by using the following command:
|
||||
To get started, initialize Cx Server by using this `docker run` command:
|
||||
|
||||
```sh
|
||||
docker run -it --rm -u $(id -u):$(id -g) -v "${PWD}":/cx-server/mount/ ppiper/cx-server-companion:latest init-cx-server
|
||||
```
|
||||
|
||||
When the files are downloaded into the current directory, launch the Jenkins server by using the following command:
|
||||
This creates a few files in your current working directory.
|
||||
The shell script `cx-server` and the configuration file `server.cfg` are of special interest.
|
||||
|
||||
Now, you can start the Jenkins server by using the following command:
|
||||
|
||||
```sh
|
||||
chmod +x ./cx-server
|
||||
./cx-server start
|
||||
```
|
||||
|
||||
For more information on the Jenkins lifecycle management and how to customize your Jenkins, have a look at the [Operations Guide for Cx Server][devops-docker-images-cxs-guide].
|
||||
For more information on the Cx Server and how to customize your Jenkins, have a look at the [Operations Guide for Cx Server][devops-docker-images-cxs-guide].
|
||||
|
||||
### On your own: Custom Jenkins Setup
|
||||
|
||||
If you use your own Jenkins installation, you need to take care of the configuration that is specific to project "Piper".
Consider this option only if you know why you need it; otherwise, the Cx Server life-cycle management makes your life much easier.
If you choose to go down this path, follow [my own Jenkins installation][guidedtour-my-own-jenkins] for some hints.
|
||||
|
||||
**Note:** This option is not supported for SAP Cloud SDK projects.
|
||||
|
||||
## (Optional) Sample Application
|
||||
|
||||
!!! info "Choosing the best sample application"
|
||||
Depending on the type of project you're interested in, different sample applications might be interesting.
|
||||
For SAP Cloud SDK, please have a look at the [Address Manager](https://github.com/sap/cloud-s4-sdk-book) example application.
|
||||
|
||||
Copy the sources of the application into your own Git repository. While we will ask you to fork the application's repository into a **GitHub** space, you can use any version control system based on Git like **GitLab** or **plain git**. **Note:** A `public` GitHub repository is visible to the public. The configuration files may contain data you don't want to expose, so use a `private` repository.
|
||||
|
||||
1. Create an organization on GitHub, if you haven't any yet. See [Creating a new organization][github-create-org].
|
||||
@ -189,7 +206,7 @@ Please also consult the blog post on setting up [Continuous Delivery for S/4HANA
|
||||
[sap-blog-s4-sdk-first-steps]: https://blogs.sap.com/2017/05/10/first-steps-with-sap-s4hana-cloud-sdk/
|
||||
[sap-blog-ci-cd]: https://blogs.sap.com/2017/09/20/continuous-integration-and-delivery/
|
||||
|
||||
[devops-docker-images-cxs-guide]: https://github.com/SAP/devops-docker-images/blob/master/docs/operations/cx-server-operations-guide.md
|
||||
[devops-docker-images-cxs-guide]: https://github.com/SAP/devops-docker-cx-server/blob/master/docs/operations/cx-server-operations-guide.md
|
||||
|
||||
[cloud-cf-helloworld-nodejs]: https://github.com/SAP/cloud-cf-helloworld-nodejs
|
||||
[github]: https://github.com
|
||||
|
BIN
documentation/docs/images/Detailed_Process_TMS.png
Normal file
BIN
documentation/docs/images/Detailed_Process_TMS.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 36 KiB |
BIN
documentation/docs/images/Interplay_TMS.png
Normal file
BIN
documentation/docs/images/Interplay_TMS.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 62 KiB |
BIN
documentation/docs/images/cloud-sdk-pipeline.png
Normal file
BIN
documentation/docs/images/cloud-sdk-pipeline.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 79 KiB |
BIN
documentation/docs/images/webide-pipeline-template.png
Normal file
BIN
documentation/docs/images/webide-pipeline-template.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 127 KiB |
@ -1,32 +1,45 @@
|
||||
# Project "Piper" User Documentation
|
||||
|
||||
An efficient software development process is vital for success in building
|
||||
business applications on SAP Cloud Platform or SAP on-premise platforms. SAP
|
||||
addresses this need for efficiency with project "Piper". The goal of project
|
||||
"Piper" is to substantially ease setting up continuous delivery processes for
|
||||
the most important SAP technologies by means of Jenkins pipelines.
|
||||
Continuous delivery is a method to develop software with short feedback cycles.
|
||||
It is applicable to projects both for SAP Cloud Platform and SAP on-premise platforms.
|
||||
SAP implements tooling for continuous delivery in project "Piper".
|
||||
The goal of project "Piper" is to substantially ease setting up continuous delivery in your project using SAP technologies.
|
||||
|
||||
## What you get
|
||||
|
||||
Project "Piper" consists of two parts:
|
||||
To get you started quickly, project "Piper" offers you the following artifacts:
|
||||
|
||||
* [A shared library][piper-library] containing steps and utilities that are
|
||||
required by Jenkins pipelines.
|
||||
* A set of [Docker images][devops-docker-images] used in the piper library to implement best practices.
|
||||
* A set of ready-made Continuous Delivery pipelines for direct use in your project
|
||||
* [General Purpose Pipeline](stages/introduction/)
|
||||
* [SAP Cloud SDK Pipeline][cloud-sdk-pipeline]
|
||||
* [A shared library][piper-library] that contains reusable step implementations, which enable you to customize our preconfigured pipelines, or to even build your own customized ones
|
||||
* A set of [Docker images][devops-docker-images] to set up a CI/CD environment in minutes using sophisticated life-cycle management
|
||||
|
||||
The shared library contains all the necessary steps to run our best practice
|
||||
[Jenkins pipelines][piper-library-pages] described in the Scenarios section or
|
||||
to run a [pipeline as step][piper-library-scenario].
|
||||
To find out which offering is right for you, we recommend to look at the ready-made pipelines first.
|
||||
In many cases, they should satisfy your requirements, and if this is the case, you don't need to build your own pipeline.
|
||||
|
||||
The best practice pipelines are based on the general concepts of [Jenkins 2.0
|
||||
Pipelines as Code][jenkins-doc-pipelines]. With that you have the power of the
|
||||
Jenkins community at hand to optimize your pipelines.
|
||||
### The best-practice way: Ready-made pipelines
|
||||
|
||||
**Are you building a standalone SAP Cloud Platform application?<br>**
|
||||
Then continue reading about our [general purpose pipeline](stages/introduction/), which supports various technologies and programming languages.
|
||||
|
||||
**Are you building an application with the SAP Cloud SDK and/or SAP Cloud Application Programming Model?<br>**
|
||||
Then we can offer you a [pipeline specifically tailored to SAP Cloud SDK and SAP Cloud Application Programming Model applications][cloud-sdk-pipeline].
|
||||
|
||||
### The do-it-yourself way: Build with Library
|
||||
|
||||
The shared library contains building blocks for your own pipeline, following our best practice Jenkins pipelines described in the Scenarios section.
|
||||
|
||||
The best practice pipelines are based on the general concepts of [Pipelines as Code, as introduced in Jenkins 2][jenkins-doc-pipelines].
|
||||
With that you have the power of the Jenkins community at hand to optimize your pipelines.
|
||||
|
||||
You can run the best practice Jenkins pipelines out of the box, take them as a
|
||||
starting point for project-specific adaptations or implement your own pipelines
|
||||
from scratch using the shared library.
|
||||
|
||||
## Extensibility
|
||||
For an example, you might want to check out our ["Build and Deploy SAPUI5 or SAP Fiori Applications on SAP Cloud Platform with Jenkins" scenario][piper-library-scenario].
|
||||
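As a rough illustration of the do-it-yourself approach, a `Jenkinsfile` assembled from individual library steps could look like the following sketch. The step names `mtaBuild` and `cloudFoundryDeploy` are taken from the scenario documentation; the exact steps, stages, and parameters depend on your project and configuration:

```groovy
@Library('piper-library-os') _

node() {
    stage('Build') {
        checkout scm
        // initialize the shared configuration for all subsequent steps
        setupCommonPipelineEnvironment script: this
        mtaBuild script: this
    }
    stage('Deploy') {
        cloudFoundryDeploy script: this
    }
}
```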
|
||||
#### Extensibility
|
||||
|
||||
If you consider adding additional capabilities to your `Jenkinsfile`, consult
|
||||
the [Jenkins Pipeline Steps Reference][jenkins-doc-steps]. There, you get an
|
||||
@ -41,7 +54,7 @@ Custom library steps can be added using a custom library according to the
|
||||
groovy coding to the `Jenkinsfile`. Your custom library can coexist next to the
|
||||
provided pipeline library.
|
||||
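For instance, a custom shared library can be loaded alongside the provided one in the same `Jenkinsfile`. This is only a sketch: `my-custom-library` stands for whatever name you registered in your Jenkins configuration, and `mySpecialStep` is a hypothetical step from that library:

```groovy
// load project "Piper" and a custom shared library side by side
@Library(['piper-library-os', 'my-custom-library']) _

node() {
    // steps from both libraries can be mixed in the same pipeline
    mySpecialStep script: this   // hypothetical step from the custom library
    mtaBuild script: this        // step provided by project "Piper"
}
```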
|
||||
## API
|
||||
#### API
|
||||
|
||||
All steps (`vars` and `resources` directory) are intended to be used by Pipelines and are considered API.
|
||||
All the classes / groovy-scripts contained in the `src` folder are by default not part of
|
||||
@ -49,14 +62,15 @@ the API and are subjected to change without prior notice. Types and methods anno
|
||||
`@API` are considered to be API, used e.g. from other shared libraries. Changes to those
|
||||
methods/types needs to be announced, discussed and agreed.
|
||||
|
||||
|
||||
[github]: https://github.com
|
||||
[piper-library]: https://github.com/SAP/jenkins-library
|
||||
[cloud-sdk-pipeline]: pipelines/cloud-sdk/introduction/
|
||||
[devops-docker-images]: https://github.com/SAP/devops-docker-images
|
||||
[devops-docker-images-issues]: https://github.com/SAP/devops-docker-images/issues
|
||||
[devops-docker-images-cxs-guide]: https://github.com/SAP/devops-docker-images/blob/master/docs/operations/cx-server-operations-guide.md
|
||||
[piper-library-scenario]: https://sap.github.io/jenkins-library/scenarios/ui5-sap-cp/Readme/
|
||||
[piper-library-pages]: https://sap.github.io/jenkins-library
|
||||
[piper-library-pages-plugins]: https://sap.github.io/jenkins-library/jenkins/requiredPlugins
|
||||
[piper-library-scenario]: scenarios/ui5-sap-cp/Readme/
|
||||
[piper-library-pages-plugins]: requiredPlugins
|
||||
[piper-library-issues]: https://github.com/SAP/jenkins-library/issues
|
||||
[piper-library-license]: ./LICENSE
|
||||
[piper-library-contribution]: .github/CONTRIBUTING.md
|
||||
|
41
documentation/docs/pipelines/cloud-sdk/introduction.md
Normal file
41
documentation/docs/pipelines/cloud-sdk/introduction.md
Normal file
@ -0,0 +1,41 @@
|
||||
# SAP Cloud SDK Pipeline
|
||||
|
||||
<img src="https://help.sap.com/doc/6c02295dfa8f47cf9c08a19f2e172901/1.0/en-US/logo-for-cd.svg" alt="SAP Cloud SDK for Continuous Delivery Logo" height="122.92" width="226.773" align="right"/></a>
|
||||
|
||||
If you are building an application with [SAP Cloud SDK](https://community.sap.com/topics/cloud-sdk), the [SAP Cloud SDK pipeline](https://github.com/SAP/cloud-s4-sdk-pipeline) helps you to quickly build and deliver your app in high quality.
|
||||
Thanks to highly streamlined components, setting up and delivering your first project will just take minutes.
|
||||
|
||||
## Qualities and Pipeline Features
|
||||
|
||||
The SAP Cloud SDK pipeline is based on project "Piper" and offers unique features for ensuring that your SAP Cloud SDK-based application fulfills the highest quality standards.
In conjunction with the SAP Cloud SDK libraries, the pipeline helps you to implement and automatically assure application qualities, for example:
|
||||
|
||||
* Functional correctness via:
|
||||
* Backend and frontend unit tests
|
||||
* Backend and frontend integration tests
|
||||
* User acceptance testing via headless browser end-to-end tests
|
||||
* Non-functional qualities via:
|
||||
* Dynamic resilience checks
|
||||
* Performance tests based on *Gatling* or *JMeter*
|
||||
* Code Security scans based on *Checkmarx* and *Fortify*
|
||||
* Dependency vulnerability scans based on *Whitesource*
|
||||
* IP compliance scan based on *Whitesource*
|
||||
* Zero-downtime deployment
|
||||
* Proper logging of application errors
|
||||
|
||||

|
||||
|
||||
## Supported Project Types
|
||||
|
||||
The pipeline supports the following types of projects:
|
||||
|
||||
* Java projects based on the [SAP Cloud SDK Archetypes](https://mvnrepository.com/artifact/com.sap.cloud.sdk.archetypes).
|
||||
* JavaScript projects based on the [SAP Cloud SDK JavaScript Scaffolding](https://github.com/SAP/cloud-s4-sdk-examples/tree/scaffolding-js).
|
||||
* TypeScript projects based on the [SAP Cloud SDK TypeScript Scaffolding](https://github.com/SAP/cloud-s4-sdk-examples/tree/scaffolding-ts).
|
||||
* SAP Cloud Application Programming Model (CAP) projects based on the _SAP Cloud Platform Business Application_ WebIDE Template.
|
||||
|
||||
You can find more details about the supported project types and build tools in the [project documentation](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/build-tools.md).
|
||||
|
||||
## Legal Notes
|
||||
|
||||
Note: The license of this repository does not apply to the SAP Cloud SDK for Continuous Delivery logo referenced on this page.
|
@ -1,9 +1,49 @@
|
||||
# Build and Deploy Applications with Jenkins and the SAP Cloud Application Programming Model
|
||||
# Build and Deploy SAP Cloud Application Programming Model Applications
|
||||
|
||||
Set up a basic continuous delivery process for developing applications according to the SAP Cloud Application Programming Model.
|
||||
In this scenario, we will set up a CI/CD pipeline for an SAP Cloud Application Programming Model (CAP) project, which is based on the _SAP Cloud Platform Business Application_ WebIDE template.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* You have an account on SAP Cloud Platform in the Cloud Foundry environment. See [Accounts](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/8ed4a705efa0431b910056c0acdbf377.html).
|
||||
* You have set up a suitable Jenkins instance as described in the [Guided Tour](../guidedtour.md)
|
||||
|
||||
## Context
|
||||
|
||||
The Application Programming Model for SAP Cloud Platform is an end-to-end best practice guide for developing applications on SAP Cloud Platform and provides a supportive set of APIs, languages, and libraries.
|
||||
For more information about the SAP Cloud Application Programming Model, see [Working with the SAP Cloud Application Programming Model](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/00823f91779d4d42aa29a498e0535cdf.html).
|
||||
|
||||
## Getting started
|
||||
|
||||
To get started, generate a project in SAP Web IDE based on the _SAP Cloud Platform Business Application_ template.
|
||||
Make sure to check the **Include support for continuous delivery pipeline of SAP Cloud SDK** checkbox, as in this screenshot:
|
||||
|
||||

|
||||
|
||||
This will generate a project which already includes a `Jenkinsfile`, and a `pipeline_config.yml` file.
|
||||
|
||||
In case you already created your project without this option, you'll need to copy and paste two files into the root directory of your project, and commit them to your git repository:
|
||||
|
||||
* [`Jenkinsfile`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile)
|
||||
* [`pipeline_config.yml`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/cf-pipeline_config.yml)
|
||||
* Note: The file must be named `pipeline_config.yml`, despite the different name of the file template
|
||||
|
||||
!!! note "Using the right project structure"
|
||||
This only applies to projects created based on the _SAP Cloud Platform Business Application_ template after September 6th 2019. They must comply with the structure which is described [here](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/build-tools.md#sap-cloud-application-programming-model--mta).
|
||||
|
||||
If your project uses SAP HANA containers (HDI), you'll need to configure `createHdiContainer` and `cloudFoundry` in the `backendIntegrationTests` stage in your `pipeline_config.yml` file as documented [here](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#backendintegrationtests)
|
||||
|
||||
Now, you'll need to push the code to a git repository.
|
||||
This is required because the pipeline gets your code via git.
|
||||
This might be GitHub, or any other cloud or on-premise git solution you have in your company.
|
||||
|
||||
Be sure to configure the [`productionDeployment`](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#productiondeployment) stage so your changes are deployed to SAP Cloud Platform automatically.
|
||||
|
||||
## Legacy documentation
|
||||
|
||||
If your project is not based on the _SAP Cloud Platform Business Application_ WebIDE template, you could either migrate your code to comply with the structure which is described [here](https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/pipeline/build-tools.md#sap-cloud-application-programming-model--mta), or you can use a self built pipeline, as described in this section.
|
||||
|
||||
### Prerequisites
|
||||
|
||||
* You have an account on SAP Cloud Platform in the Cloud Foundry environment. See [Accounts](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/8ed4a705efa0431b910056c0acdbf377.html).
|
||||
* You have downloaded and installed the Cloud Foundry command line interface (CLI). See [Download and Install the Cloud Foundry Command Line Interface](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/afc3f643ec6942a283daad6cdf1b4936.html).
|
||||
* You have installed the multi-target application plug-in for the Cloud Foundry command line interface. See [Install the Multi-Target Application Plug-in in the Cloud Foundry Environment](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/27f3af39c2584d4ea8c15ba8c282fd75.html).
|
||||
@ -13,15 +53,15 @@ Set up a basic continuous delivery process for developing applications according
|
||||
* You have installed the Multi-Target Application (MTA) Archive Builder 1.0.6 or newer. See [SAP Development Tools](https://tools.hana.ondemand.com/#cloud).
|
||||
* You have installed Node.js including node and npm. See [Node.js](https://nodejs.org/en/download/).
|
||||
|
||||
## Context
|
||||
### Context
|
||||
|
||||
The Application Programming Model for SAP Cloud Platform is an end-to-end best practice guide for developing applications on SAP Cloud Platform and provides a supportive set of APIs, languages, and libraries. For more information about the SAP Cloud Application Programming Model, see [Working with the SAP Cloud Application Programming Model](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/00823f91779d4d42aa29a498e0535cdf.html).
|
||||
|
||||
In this scenario, we want to show how to implement a basic continuous delivery process for developing applications according to this programming model with the help of project "Piper" on Jenkins. This basic scenario can be adapted and enriched according to your specific needs.
|
||||
|
||||
## Example
|
||||
### Example
|
||||
|
||||
### Jenkinsfile
|
||||
#### Jenkinsfile
|
||||
|
||||
```groovy
|
||||
@Library('piper-library-os') _
|
||||
@ -43,7 +83,7 @@ node(){
|
||||
}
|
||||
```
|
||||
|
||||
### Configuration (`.pipeline/config.yml`)
|
||||
#### Configuration (`.pipeline/config.yml`)
|
||||
|
||||
```yaml
|
||||
steps:
|
||||
@ -57,9 +97,9 @@ steps:
|
||||
space: '<CF Space>'
|
||||
```
|
||||
|
||||
### Parameters
|
||||
#### Parameters
|
||||
|
||||
For the detailed description of the relevant parameters, see:
|
||||
|
||||
* [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/)
|
||||
* [cloudFoundryDeploy](https://sap.github.io/jenkins-library/steps/cloudFoundryDeploy/)
|
||||
* [mtaBuild](../../../steps/mtaBuild/)
|
||||
* [cloudFoundryDeploy](../../../steps/cloudFoundryDeploy/)
|
||||
|
77
documentation/docs/scenarios/TMS_Extension.md
Normal file
77
documentation/docs/scenarios/TMS_Extension.md
Normal file
@ -0,0 +1,77 @@
|
||||
# Integrate SAP Cloud Platform Transport Management Into Your CI/CD Pipeline
|
||||
|
||||
Extend your CI/CD pipeline with SAP Cloud Platform Transport Management to add an enterprise-ready change and release management process and enable the transport of cloud-based applications on SAP Cloud Platform between several stages.
|
||||
|
||||
## Context
|
||||
|
||||
This procedure explains how to upload a [multitarget application](https://www.sap.com/documents/2016/06/e2f618e4-757c-0010-82c7-eda71af511fa.html) from a CI/CD pipeline to SAP Cloud Platform Transport Management and then import it into its target environment.
|
||||
|
||||
SAP Cloud Platform Transport Management allows you to manage the transport of development artifacts and application-specific content between different SAP Cloud Platform accounts. It adds transparency to the audit trail of changes so that you get information about who performed which changes in your production accounts and when they did it. At the same time, the Transport Management service enables a separation of concerns: For example, a developer of an application or SAP Cloud Platform content artifacts can trigger the propagation of changes, while the resulting transport is handled by a central operations team. For more information, see [SAP Cloud Platform Transport Management](https://help.sap.com/viewer/product/TRANSPORT_MANAGEMENT_SERVICE/Cloud/en-US).
|
||||
|
||||
The following graphic provides an overview about the interplay between continuous integration and Transport Management:
|
||||
|
||||

|
||||
|
||||
## Prerequisites
|
||||
|
||||
* You have an existing CI pipeline, which you want to extend with SAP Cloud Platform Transport Management.
|
||||
* You have an MTA project and the folder structure of its sources corresponds to the standard MTA structure. For more information, see [The Multitarget Application Model](https://www.sap.com/documents/2016/06/e2f618e4-757c-0010-82c7-eda71af511fa.html).
|
||||
* You have access to SAP Cloud Platform Transport Management. See [Provide Access to SAP Cloud Platform Transport Management](https://help.sap.com/viewer/7f7160ec0d8546c6b3eab72fb5ad6fd8/Cloud/en-US/13894bed9e2d4b25aa34d03d002707f9.html).
|
||||
* You have set up SAP Cloud Platform Transport Management and created a service key. See [Set Up the Environment to Transport Content Archives directly in an Application](https://help.sap.com/viewer/7f7160ec0d8546c6b3eab72fb5ad6fd8/Cloud/en-US/8d9490792ed14f1bbf8a6ac08a6bca64.html).
|
||||
* You have configured your Transport Management landscape. See [Configuring the Landscape](https://help.sap.com/viewer/7f7160ec0d8546c6b3eab72fb5ad6fd8/Cloud/en-US/3e7b04236d804a4eb80e42c6360209f1.html).
|
||||
|
||||
## Procedure
|
||||
|
||||
You can use this scenario to extend any CI process that meets the prerequisites, for example, the one described in [Build and Deploy SAPUI5 or SAP Fiori Applications on SAP Cloud Platform with Jenkins](https://sap.github.io/jenkins-library/scenarios/ui5-sap-cp/Readme/).
|
||||
|
||||
The following graphic shows an example of the detailed procedure when combining continuous integration and SAP Cloud Platform Transport Management:
|
||||
|
||||

|
||||
|
||||
The process flow contains the following steps:
|
||||
|
||||
1. The CI server builds a multitarget application (MTA) archive.
|
||||
1. The MTA is uploaded into the import queue of the target node, which is specified in the CI pipeline (in this example, PRE-PROD).
|
||||
1. The release manager manually triggers or schedules the import, which results in the physical deployment of the MTA archive into the corresponding subaccount (in this example, PRE-PROD).
|
||||
1. As soon as the import is executed, a transport is triggered along the defined transport route so that the MTA archive reaches the import queue of the next node (in this example, PROD).
|
||||
1. There, the physical import into the corresponding subaccount can be either triggered manually by the release manager or automatically by using the scheduling mechanisms of SAP Cloud Platform Transport Management.
|
||||
|
||||
## Example
|
||||
|
||||
### Jenkinsfile
|
||||
|
||||
If you use the pipeline from the following code snippet, you only have to configure it in the `.pipeline/config.yml`.
|
||||
|
||||
Following the convention for pipeline definitions, use a `Jenkinsfile`, which resides in the root directory of your development sources.
|
||||
|
||||
```groovy
|
||||
@Library('piper-lib-os') _
|
||||
|
||||
piperPipeline script:this
|
||||
```
|
||||
|
||||
### Configuration (`.pipeline/config.yml`)
|
||||
|
||||
This is a basic configuration example, which is also located in the sources of the project.
|
||||
|
||||
```yaml
|
||||
steps:
|
||||
tmsUpload:
|
||||
credentialsId: tms-secret-key
|
||||
nodeName: tms_target_node
|
||||
mtaPath: com.piper.example.tms.mtar
|
||||
customDescription: Custom-Transport-Description
|
||||
```
|
||||
|
||||
#### Configuration for the Upload to Transport Management
|
||||
|
||||
| Parameter | Description |
|
||||
| -------------------|-------------|
|
||||
| `credentialsId` |Credentials that are used for the file and node uploads to the Transport Management Service.|
|
||||
| `nodeName`|Defines the name of the node to which the *.mtar file is uploaded.|
|
||||
| `mtaPath`|Defines the path to *.mtar for the upload to the Transport Management Service.|
|
||||
| `customDescription`| Can be used as the description of a transport request. Overwrites the default (the corresponding Git commit ID).|
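
For reference, the same values could also be passed directly to the step in a pipeline script instead of via `.pipeline/config.yml`. This is a minimal sketch; the values are the placeholders from the configuration example above:

```groovy
tmsUpload(
    script: this,
    credentialsId: 'tms-secret-key',
    nodeName: 'tms_target_node',
    mtaPath: 'com.piper.example.tms.mtar',
    customDescription: 'Custom-Transport-Description'
)
```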
|
||||
|
||||
### Parameters
|
||||
|
||||
For a detailed description of the relevant parameters, see [tmsUpload](../../../steps/tmsUpload/).
|
@ -8,8 +8,8 @@ Set up an agile development process with Jenkins CI, which automatically feeds c
|
||||
* You have installed Jenkins 2.60.3 or higher.
|
||||
* You have set up Project “Piper”. See [README](https://github.com/SAP/jenkins-library/blob/master/README.md).
|
||||
* You have installed SAP Solution Manager 7.2 SP6. See [README](https://github.com/SAP/devops-cm-client/blob/master/README.md).
|
||||
* You have installed the Multi-Target Application (MTA) Archive Builder 1.0.6 or newer. See [SAP Development Tools](https://tools.hana.ondemand.com/#cloud).
|
||||
* You have installed Node.js including node and npm. See [Node.js](https://nodejs.org/en/download/).
|
||||
* You have installed the Multi-Target Application (MTA) Archive Builder 1.0.6 or newer. See [SAP Development Tools](https://tools.hana.ondemand.com/#cloud). **Note:** This is only required if you don't use a Docker-based environment.
|
||||
* You have installed Node.js including node and npm. See [Node.js](https://nodejs.org/en/download/). **Note:** This is only required if you don't use a Docker-based environment.
|
||||
|
||||
## Context
|
||||
|
||||
@ -22,7 +22,7 @@ In this scenario, we want to show how an agile development process with Jenkins
|
||||
|
||||
The basic workflow is as follows:
|
||||
|
||||
1. The pipeline scans the Git commit messages for a line like `ChangeDocument : <changeDocumentId>`, and validates that the change is in the correct status `in development`. For more information, see [checkChangeInDevelopment](https://sap.github.io/jenkins-library/steps/checkChangeInDevelopment/). An example for the commit message looks as follows:
|
||||
1. The pipeline scans the Git commit messages for a line like `ChangeDocument : <changeDocumentId>`, and validates that the change is in the correct status `in development`. For more information, see [checkChangeInDevelopment](../../steps/checkChangeInDevelopment/). An example for the commit message looks as follows:
|
||||
|
||||
```
|
||||
Fix terminology in documentation
|
||||
@ -33,7 +33,7 @@ The basic workflow is as follows:
|
||||
|
||||
**Note:** The blank line between message header and message description is mandatory.
|
||||
|
||||
1. To communicate with SAP Solution Manager, the pipeline uses credentials that must be stored on Jenkins using the credential ID `CM`. For more information, see [checkChangeInDevelopment](https://sap.github.io/jenkins-library/steps/checkChangeInDevelopment/).
|
||||
1. To communicate with SAP Solution Manager, the pipeline uses credentials that must be stored on Jenkins using the credential ID `CM`. For more information, see [checkChangeInDevelopment](../../steps/checkChangeInDevelopment/).
|
||||
1. The required transport request is created on the fly. **Note:** The change document can contain various components (for example, UI and backend components).
|
||||
1. The changes of your development team trigger the Jenkins pipeline. It builds and validates the changes and attaches them to the respective transport request.
|
||||
1. As soon as the development process is completed, the change document in SAP Solution Manager can be set to status `to be tested` and all components can be transported to the test system.
|
||||
@ -91,8 +91,8 @@ steps:
|
||||
|
||||
For the detailed description of the relevant parameters, see:
|
||||
|
||||
* [checkChangeInDevelopment](https://sap.github.io/jenkins-library/steps/checkChangeInDevelopment/)
|
||||
* [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/)
|
||||
* [transportRequestCreate](https://sap.github.io/jenkins-library/steps/transportRequestCreate/)
|
||||
* [transportRequestUploadFile](https://sap.github.io/jenkins-library/steps/transportRequestUploadFile/)
|
||||
* [transportRequestRelease](https://sap.github.io/jenkins-library/steps/transportRequestRelease/)
|
||||
* [checkChangeInDevelopment](../../steps/checkChangeInDevelopment/)
|
||||
* [mtaBuild](../../steps/mtaBuild/)
|
||||
* [transportRequestCreate](../../steps/transportRequestCreate/)
|
||||
* [transportRequestUploadFile](../../steps/transportRequestUploadFile/)
|
||||
* [transportRequestRelease](../../steps/transportRequestRelease/)
|
||||
|
@ -28,7 +28,7 @@ On the project level, provide and adjust the following template:
|
||||
|
||||
This scenario combines various steps to create a complete pipeline.
|
||||
|
||||
In this scenario, we want to show how to build an application based on SAPUI5 or SAP Fiori by using the multi-target application (MTA) concept and how to deploy the build result into an SAP Cloud Platform account in the Neo environment. This document comprises the [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/) and the [neoDeploy](https://sap.github.io/jenkins-library/steps/neoDeploy/) steps.
|
||||
In this scenario, we want to show how to build an application based on SAPUI5 or SAP Fiori by using the multi-target application (MTA) concept and how to deploy the build result into an SAP Cloud Platform account in the Neo environment. This document comprises the [mtaBuild](../../../steps/mtaBuild/) and the [neoDeploy](../../../steps/neoDeploy/) steps.
|
||||
|
||||

|
||||
###### Screenshot: Build and Deploy Process in Jenkins
|
||||
@ -82,5 +82,5 @@ steps:
|
||||
|
||||
For the detailed description of the relevant parameters, see:
|
||||
|
||||
* [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/)
|
||||
* [neoDeploy](https://sap.github.io/jenkins-library/steps/neoDeploy/)
|
||||
* [mtaBuild](../../../steps/mtaBuild/)
|
||||
* [neoDeploy](../../../steps/neoDeploy/)
|
||||
|
101
documentation/docs/scenarios/xsa-deploy/Readme.md
Normal file
@ -0,0 +1,101 @@
|
||||
# Build and Deploy SAP Fiori Applications on SAP HANA XS Advanced
|
||||
|
||||
Build a Multitarget Application (MTA) with Jenkins and deploy the build result into an SAP HANA XS advanced (XSA) system.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* [Docker environment](https://docs.docker.com/get-started/)
|
||||
* All artifacts referenced during the build are available either on the SAP Service Marketplace or via public repositories
|
||||
* You have set up Project “Piper”. See [guided tour](https://sap.github.io/jenkins-library/guidedtour/).
|
||||
* A Docker image for XS deployment is available. For legal reasons, there is no pre-built Docker image; how to create the Docker image is explained [here](https://github.com/SAP/devops-docker-images/tree/master/xs-cli).
|
||||
|
||||
### Project Prerequisites
|
||||
|
||||
This scenario requires additional files in your project and in the execution environment on your Jenkins instance.
|
||||
For details see: [XSA developer quick start guide](https://help.sap.com/viewer/400066065a1b46cf91df0ab436404ddc/2.0.04/en-US/7f681c32c2a34735ad85e4ab403f8c26.html).
|
||||
|
||||
## Context
|
||||
|
||||
This scenario combines various steps to create a complete pipeline.
|
||||
|
||||
In this scenario, we want to show how to build a Multitarget Application (MTA) and deploy the build result into an on-premise SAP HANA XS advanced system. This document comprises the [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/) and the [xsDeploy](https://sap.github.io/jenkins-library/steps/xsDeploy/) steps.
|
||||
|
||||

|
||||
###### Screenshot: Build and Deploy Process in Jenkins
|
||||
|
||||
## Example
|
||||
|
||||
### Jenkinsfile
|
||||
|
||||
Following the convention for pipeline definitions, use a `Jenkinsfile`, which resides in the root directory of your development sources.
|
||||
|
||||
```groovy
|
||||
@Library('piper-lib-os') _
|
||||
|
||||
pipeline {
|
||||
|
||||
agent any
|
||||
|
||||
stages {
|
||||
stage("prepare") {
|
||||
steps {
|
||||
deleteDir()
|
||||
checkout scm
|
||||
setupCommonPipelineEnvironment script: this
|
||||
}
|
||||
}
|
||||
stage('build') {
|
||||
steps {
|
||||
mtaBuild script: this
|
||||
}
|
||||
}
|
||||
stage('deploy') {
|
||||
steps {
|
||||
xsDeploy script: this
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Configuration (`.pipeline/config.yml`)
|
||||
|
||||
This is a basic configuration example, which is also located in the sources of the project.
|
||||
|
||||
```yaml
|
||||
steps:
|
||||
mtaBuild:
|
||||
buildTarget: 'XSA'
|
||||
xsDeploy:
|
||||
apiUrl: '<API_URL>' # e.g. 'https://example.org:30030'
|
||||
# credentialsId: 'XS' omitted, 'XS' is the default
|
||||
docker:
|
||||
dockerImage: '<ID_OF_THE_DOCKER_IMAGE>' # for legal reasons, no pre-built Docker image is provided
|
||||
# dockerPullImage: true # default: 'false'. Needs to be set to 'true' in case the image is served from a docker registry
|
||||
loginOpts: '' # for non-productive builds, '--skip-ssl-validation' might be set here during setup
|
||||
org: '<ORG_NAME>'
|
||||
space: '<SPACE>'
|
||||
|
||||
```
|
||||
|
||||
#### Configuration for the MTA Build
|
||||
|
||||
| Parameter | Description |
|
||||
| -----------------|----------------|
|
||||
| `buildTarget` | The target platform to which the mtar can be deployed. In this case, the target platform is `XSA`. |
|
||||
|
||||
#### Configuration for the Deployment to XSA
|
||||
|
||||
| Parameter | Description |
|
||||
| -------------------|-------------|
|
||||
| `credentialsId` | The Jenkins credentials that contain user and password required for the deployment on SAP Cloud Platform.|
|
||||
| `mode` | DeployMode. See [stepDocu](../../../steps/xsDeploy) for more details. |
|
||||
| `org` | The org. See [stepDocu](../../../steps/xsDeploy) for more details. |
|
||||
| `space` | The space. See [stepDocu](../../../steps/xsDeploy) for more details. |
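
For illustration, these values could be set in the `.pipeline/config.yml` roughly as follows. This is a minimal sketch; `DEPLOY` is assumed here to be one of the `DeployMode` values described in the step documentation referenced above:

```yaml
steps:
  xsDeploy:
    credentialsId: 'XS'   # Jenkins credentials with user and password for the deployment
    mode: 'DEPLOY'        # assumption: one of the DeployMode values, see the xsDeploy step documentation
    org: '<ORG_NAME>'
    space: '<SPACE>'
```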
|
||||
|
||||
### Parameters
|
||||
|
||||
For the detailed description of the relevant parameters, see:
|
||||
|
||||
* [mtaBuild](https://sap.github.io/jenkins-library/steps/mtaBuild/)
|
||||
* [xsDeploy](https://sap.github.io/jenkins-library/steps/xsDeploy/)
|
BIN
documentation/docs/scenarios/xsa-deploy/images/pipeline.jpg
Normal file
Binary file not shown.
After Width: | Height: | Size: 18 KiB |
25
documentation/docs/steps/abapEnvironmentPullGitRepo.md
Normal file
@ -0,0 +1,25 @@
|
||||
# ${docGenStepName}
|
||||
|
||||
## ${docGenDescription}
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* A SAP Cloud Platform ABAP Environment system is available.
|
||||
* On this system, a [Communication User](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/0377adea0401467f939827242c1f4014.html), a [Communication System](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/1bfe32ae08074b7186e375ab425fb114.html) and a [Communication Arrangement](https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/a0771f6765f54e1c8193ad8582a32edb.html) are set up for the Communication Scenario "SAP Cloud Platform ABAP Environment - Software Component Test Integration (SAP_COM_0510)".
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
## ${docGenConfiguration}
|
||||
|
||||
## ${docJenkinsPluginDependencies}
|
||||
|
||||
## Example
|
||||
|
||||
```groovy
|
||||
abapEnvironmentPullGitRepo (
|
||||
host : '1234-abcd-5678-efgh-ijk.abap.eu10.hana.ondemand.com',
|
||||
repositoryName : '/DMO/GIT_REPOSITORY',
|
||||
credentialsId : "myCredentialsId",
|
||||
script : this
|
||||
)
|
||||
```
|
73
documentation/docs/steps/cfManifestSubstituteVariables.md
Normal file
@ -0,0 +1,73 @@
|
||||
# ${docGenStepName}
|
||||
|
||||
## ${docGenDescription}
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
## ${docGenConfiguration}
|
||||
|
||||
## ${docJenkinsPluginDependencies}
|
||||
|
||||
## Side effects
|
||||
|
||||
Unless configured otherwise, this step will *replace* the input `manifest.yml` with a version that has all variable references replaced. This alters the source tree in your Jenkins workspace.
|
||||
If you prefer to generate a separate output file, use the step's `outputManifestFile` parameter. Keep in mind, however, that your Cloud Foundry deployment step should then also reference this output file - otherwise CF deployment will fail with unresolved variable reference errors.
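
A minimal sketch of that setup could look as follows. The `outputManifestFile` parameter is taken from this step's documentation; how the deployment step references the generated manifest depends on the deploy step you use and is only indicated as a comment:

```groovy
cfManifestSubstituteVariables(
    script: this,
    manifestFile: "manifest.yml",
    manifestVariablesFiles: ["manifest-variables.yml"],
    outputManifestFile: "manifest-resolved.yml" // leaves the original manifest.yml untouched
)

// Make sure the subsequent Cloud Foundry deployment references "manifest-resolved.yml",
// e.g. via the manifest setting of your deploy step (assumption: the exact parameter name
// depends on the deploy step you use).
```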
|
||||
|
||||
## Exceptions
|
||||
|
||||
* `org.yaml.snakeyaml.scanner.ScannerException` - in case any of the loaded input files contains malformed YAML and cannot be parsed.
|
||||
|
||||
* `hudson.AbortException` - in case of internal errors and when not all variables could be replaced due to missing replacement values.
|
||||
|
||||
## Example
|
||||
|
||||
Usage of pipeline step:
|
||||
|
||||
```groovy
|
||||
cfManifestSubstituteVariables (
|
||||
script: this,
|
||||
manifestFile: "path/to/manifest.yml", //optional, default: manifest.yml
|
||||
manifestVariablesFiles: ["path/to/manifest-variables.yml"], //optional, default: ['manifest-variables.yml']
|
||||
manifestVariables: [[key : value], [key : value]] //optional, default: []
|
||||
)
|
||||
```
|
||||
|
||||
For example, you can refer to the parameters using relative paths (similar to `cf push --vars-file`):
|
||||
|
||||
```groovy
|
||||
cfManifestSubstituteVariables (
|
||||
script: this,
|
||||
manifestFile: "manifest.yml",
|
||||
manifestVariablesFiles: ["manifest-variables.yml"]
|
||||
)
|
||||
```
|
||||
|
||||
Furthermore, you can also specify variables and their values directly (similar to `cf push --var`):
|
||||
|
||||
```groovy
|
||||
cfManifestSubstituteVariables (
|
||||
script: this,
|
||||
manifestFile: "manifest.yml",
|
||||
manifestVariablesFiles: ["manifest-variables.yml"],
|
||||
manifestVariables: [[key1 : value1], [key2 : value2]]
|
||||
)
|
||||
```
|
||||
|
||||
If you are using the Cloud Foundry [Create-Service-Push](https://github.com/dawu415/CF-CLI-Create-Service-Push-Plugin) CLI plugin, you will most likely also have a `services-manifest.yml` file.
|
||||
In this file, too, you can specify variable references, which can be resolved from the same variables file, for example:
|
||||
|
||||
```groovy
|
||||
// resolve variables in manifest.yml
|
||||
cfManifestSubstituteVariables (
|
||||
script: this,
|
||||
manifestFile: "manifest.yml",
|
||||
manifestVariablesFiles: ["manifest-variables.yml"]
|
||||
)
|
||||
|
||||
// resolve variables in services-manifest.yml from same file.
|
||||
cfManifestSubstituteVariables (
|
||||
script: this,
|
||||
manifestFile: "services-manifest.yml",
|
||||
manifestVariablesFiles: ["manifest-variables.yml"]
|
||||
)
|
||||
```
|
@ -4,7 +4,7 @@
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central.
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central. **Note:** This is only required if you don't use a Docker-based environment.
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
|
37
documentation/docs/steps/cloudFoundryCreateService.md
Normal file
@ -0,0 +1,37 @@
|
||||
# ${docGenStepName}
|
||||
|
||||
## ${docGenDescription}
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
## ${docGenConfiguration}
|
||||
|
||||
## ${docJenkinsPluginDependencies}
|
||||
|
||||
## Example
|
||||
|
||||
The following example creates the services specified in the file `manifest-create-service.yml` in the Cloud Foundry org `cfOrg` and space `cfSpace` of the Cloud Foundry installation accessed via `https://test.server.com`, using the username and password stored in `cfCredentialsId`.
|
||||
|
||||
```groovy
|
||||
cloudFoundryCreateService(
|
||||
script: this,
|
||||
cloudFoundry: [apiEndpoint: 'https://test.server.com',
|
||||
credentialsId: 'cfCredentialsId',
|
||||
serviceManifest: 'manifest-create-service.yml',
|
||||
org: 'cfOrg',
|
||||
space: 'cfSpace'])
|
||||
```
|
||||
|
||||
In addition to the above, the following example also makes use of a variable substitution file `manifest-variable-substitution.yml`.
|
||||
|
||||
```groovy
|
||||
cloudFoundryCreateService(
|
||||
script: this,
|
||||
cloudFoundry: [apiEndpoint: 'https://test.server.com',
|
||||
credentialsId: 'cfCredentialsId',
|
||||
serviceManifest: 'manifest-create-service.yml',
|
||||
manifestVariablesFiles: ['manifest-variable-substitution.yml'],
|
||||
org: 'cfOrg',
|
||||
space: 'cfSpace'])
|
||||
|
||||
```
|
@ -55,7 +55,7 @@ dockerExecute(
|
||||
sidecarImage: 'selenium/standalone-chrome',
|
||||
sidecarName: 'selenium',
|
||||
) {
|
||||
git url: 'https://github.wdf.sap.corp/XXXXX/WebDriverIOTest.git'
|
||||
git url: 'https://github.com/XXXXX/WebDriverIOTest.git'
|
||||
sh '''npm install
|
||||
node index.js
|
||||
'''
|
||||
|
@ -46,7 +46,7 @@ export ON_K8S=true"
|
||||
```
|
||||
|
||||
```groovy
|
||||
dockerExecuteOnKubernetes(script: script, containerMap: ['maven:3.5-jdk-8-alpine': 'maven', 's4sdk/docker-cf-cli': 'cfcli']){
|
||||
dockerExecuteOnKubernetes(script: script, containerMap: ['maven:3.5-jdk-8-alpine': 'maven', 'ppiper/cf-cli': 'cfcli']){
|
||||
container('maven'){
|
||||
sh "mvn clean install"
|
||||
}
|
||||
|
@ -4,7 +4,7 @@
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* **SAP CP account** - the account to where the application is deployed.
|
||||
* **SAP CP account** - the account to which the application is deployed. To deploy an MTA (`deployMode: mta`) over an existing _Java_ application, a free _Java Quota_ of at least 1 is required, which means that this will not work on trial accounts.
|
||||
* **SAP CP user for deployment** - a user with deployment permissions in the given account.
|
||||
* **Jenkins credentials for deployment** - must be configured in Jenkins credentials with a dedicated Id.
|
||||
|
||||
|
@ -19,7 +19,6 @@ The step is stashing files before and after the build. This is due to the fact,
|
||||
|classFiles|no| |includes: `**/target/classes/**/*.class, **/target/test-classes/**/*.class` <br />excludes: `''`|
|
||||
|deployDescriptor|no| |includes: `**/manifest*.y*ml, **/*.mtaext.y*ml, **/*.mtaext, **/xs-app.json, helm/**, *.y*ml`<br />excludes: `''`|
|
||||
|git|no| |includes: `**/gitmetadata/**`<br />excludes: `''`|
|
||||
|opa5|no|OPA5 is enabled|includes: `**/*.*`<br />excludes: `''`|
|
||||
|opensourceConfiguration|no| |includes: `**/srcclr.yml, **/vulas-custom.properties, **/.nsprc, **/.retireignore, **/.retireignore.json, **/.snyk`<br />excludes: `''`|
|
||||
|pipelineConfigAndTests|no| |includes: `.pipeline/*.*`<br />excludes: `''`|
|
||||
|securityDescriptor|no| |includes: `**/xs-security.json`<br />excludes: `''`|
|
||||
|
@ -10,7 +10,7 @@ none
|
||||
|
||||
```groovy
|
||||
seleniumExecuteTests (script: this) {
|
||||
git url: 'https://github.wdf.sap.corp/xxxxx/WebDriverIOTest.git'
|
||||
git url: 'https://github.com/xxxxx/WebDriverIOTest.git'
|
||||
sh '''npm install
|
||||
node index.js'''
|
||||
}
|
||||
|
9
documentation/docs/steps/tmsUpload.md
Normal file
@ -0,0 +1,9 @@
|
||||
# ${docGenStepName}
|
||||
|
||||
## ${docGenDescription}
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
## ${docGenConfiguration}
|
||||
|
||||
## ${docJenkinsPluginDependencies}
|
@ -4,7 +4,7 @@
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central.
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central. **Note:** This is only required if you don't use a Docker-based environment.
|
||||
* Solution Manager version `ST720 SP08` or newer.
|
||||
|
||||
## ${docGenParameters}
|
||||
|
@ -4,7 +4,7 @@
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central.
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central. **Note:** This is only required if you don't use a Docker-based environment.
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
|
@ -4,7 +4,7 @@
|
||||
|
||||
## Prerequisites
|
||||
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central.
|
||||
* **[Change Management Client 2.0.0 or compatible version](http://central.maven.org/maven2/com/sap/devops/cmclient/dist.cli/)** - available for download on Maven Central. **Note:** This is only required if you don't use a Docker-based environment.
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
|
40
documentation/docs/steps/xsDeploy.md
Normal file
@ -0,0 +1,40 @@
|
||||
# ${docGenStepName}
|
||||
|
||||
## ${docGenDescription}
|
||||
|
||||
## ${docGenParameters}
|
||||
|
||||
## ${docGenConfiguration}
|
||||
|
||||
## ${docJenkinsPluginDependencies}
|
||||
|
||||
## Side effects
|
||||
|
||||
none
|
||||
|
||||
## Example
|
||||
|
||||
```groovy
|
||||
xsDeploy(
|
||||
script: this,
|
||||
mtaPath: 'path/to/archiveFile.mtar',
|
||||
credentialsId: 'my-credentials-id',
|
||||
apiUrl: 'https://example.org/xs',
|
||||
space: 'mySpace',
|
||||
org: 'myOrg'
)
|
||||
```
|
||||
|
||||
Example configuration:
|
||||
|
||||
```yaml
|
||||
steps:
|
||||
<...>
|
||||
xsDeploy:
|
||||
mtaPath: path/to/archiveFile.mtar
|
||||
credentialsId: my-credentials-id
|
||||
apiUrl: https://example.org/xs
|
||||
space: mySpace
|
||||
org: myOrg
|
||||
```
|
||||
|
||||
[dockerExecute]: ../dockerExecute
|
@ -1,17 +1,44 @@
|
||||
site_name: Jenkins 2.0 Pipelines
|
||||
site_name: 'Project "Piper": Continuous Delivery for the SAP Ecosystem'
|
||||
nav:
|
||||
- Home: index.md
|
||||
- 'Guided Tour' : guidedtour.md
|
||||
- Configuration: configuration.md
|
||||
- 'Pipelines':
|
||||
- 'General purpose pipeline':
|
||||
- 'Introduction': stages/introduction.md
|
||||
- 'Examples': stages/examples.md
|
||||
- 'Stages':
|
||||
- 'Init Stage': stages/init.md
|
||||
- 'Pull-Request Voting Stage': stages/prvoting.md
|
||||
- 'Build Stage': stages/build.md
|
||||
- 'Additional Unit Test Stage': stages/additionalunittests.md
|
||||
- 'Integration Stage': stages/integration.md
|
||||
- 'Acceptance Stage': stages/acceptance.md
|
||||
- 'Security Stage': stages/security.md
|
||||
- 'Performance Stage': stages/performance.md
|
||||
- 'Compliance': stages/compliance.md
|
||||
- 'Confirm Stage': stages/confirm.md
|
||||
- 'Promote Stage': stages/promote.md
|
||||
- 'Release Stage': stages/release.md
|
||||
- 'SAP Cloud SDK pipeline': pipelines/cloud-sdk/introduction.md
|
||||
- 'Scenarios':
|
||||
- 'Build and Deploy Hybrid Applications with Jenkins and SAP Solution Manager': scenarios/changeManagement.md
|
||||
- 'Build and Deploy SAP UI5 or SAP Fiori Applications on SAP Cloud Platform with Jenkins': scenarios/ui5-sap-cp/Readme.md
|
||||
- 'Build and Deploy Applications with Jenkins and the SAP Cloud Application Programming Model': scenarios/CAP_Scenario.md
|
||||
- 'Integrate SAP Cloud Platform Transport Management Into Your CI/CD Pipeline': scenarios/TMS_Extension.md
|
||||
- Extensibility: extensibility.md
|
||||
- 'Library steps':
|
||||
- abapEnvironmentPullGitRepo: steps/abapEnvironmentPullGitRepo.md
|
||||
- artifactSetVersion: steps/artifactSetVersion.md
|
||||
- batsExecuteTests: steps/batsExecuteTests.md
|
||||
- buildExecute: steps/buildExecute.md
|
||||
- checkChangeInDevelopment: steps/checkChangeInDevelopment.md
|
||||
- checksPublishResults: steps/checksPublishResults.md
|
||||
- cfManifestSubstituteVariables: steps/cfManifestSubstituteVariables.md
|
||||
- cloudFoundryDeploy: steps/cloudFoundryDeploy.md
|
||||
- commonPipelineEnvironment: steps/commonPipelineEnvironment.md
|
||||
- containerExecuteStructureTests: steps/containerExecuteStructureTests.md
|
||||
- containerPushToRegistry: steps/containerPushToRegistry.md
|
||||
- detectExecuteScan: steps/detectExecuteScan.md
|
||||
- dockerExecute: steps/dockerExecute.md
|
||||
- dockerExecuteOnKubernetes: steps/dockerExecuteOnKubernetes.md
|
||||
@ -37,6 +64,7 @@ nav:
|
||||
- pipelineStashFiles: steps/pipelineStashFiles.md
|
||||
- pipelineStashFilesAfterBuild: steps/pipelineStashFilesAfterBuild.md
|
||||
- pipelineStashFilesBeforeBuild: steps/pipelineStashFilesBeforeBuild.md
|
||||
- piperPublishWarnings: steps/piperPublishWarnings.md
|
||||
- prepareDefaultValues: steps/prepareDefaultValues.md
|
||||
- seleniumExecuteTests: steps/seleniumExecuteTests.md
|
||||
- setupCommonPipelineEnvironment: steps/setupCommonPipelineEnvironment.md
|
||||
@ -44,32 +72,13 @@ nav:
|
||||
- snykExecute: steps/snykExecute.md
|
||||
- sonarExecuteScan: steps/sonarExecuteScan.md
|
||||
- testsPublishResults: steps/testsPublishResults.md
|
||||
- tmsUpload: steps/tmsUpload.md
|
||||
- transportRequestCreate: steps/transportRequestCreate.md
|
||||
- transportRequestRelease: steps/transportRequestRelease.md
|
||||
- transportRequestUploadFile: steps/transportRequestUploadFile.md
|
||||
- uiVeri5ExecuteTests: steps/uiVeri5ExecuteTests.md
|
||||
- whitesourceExecuteScan: steps/whitesourceExecuteScan.md
|
||||
- 'Pipelines':
|
||||
- 'General purpose pipeline':
|
||||
- 'Introduction': stages/introduction.md
|
||||
- 'Examples': stages/examples.md
|
||||
- 'Stages':
|
||||
- 'Init Stage': stages/init.md
|
||||
- 'Pull-Request Voting Stage': stages/prvoting.md
|
||||
- 'Build Stage': stages/build.md
|
||||
- 'Additional Unit Test Stage': stages/additionalunittests.md
|
||||
- 'Integration Stage': stages/integration.md
|
||||
- 'Acceptance Stage': stages/acceptance.md
|
||||
- 'Security Stage': stages/security.md
|
||||
- 'Performance Stage': stages/performance.md
|
||||
- 'Compliance': stages/compliance.md
|
||||
- 'Confirm Stage': stages/confirm.md
|
||||
- 'Promote Stage': stages/promote.md
|
||||
- 'Release Stage': stages/release.md
|
||||
- 'Scenarios':
|
||||
- 'Build and Deploy Hybrid Applications with Jenkins and SAP Solution Manager': scenarios/changeManagement.md
|
||||
- 'Build and Deploy SAP UI5 or SAP Fiori Applications on SAP Cloud Platform with Jenkins': scenarios/ui5-sap-cp/Readme.md
|
||||
- 'Build and Deploy Applications with Jenkins and the SAP Cloud Application Programming Model': scenarios/CAP_Scenario.md
|
||||
- xsDeploy: steps/xsDeploy.md
|
||||
- Resources:
|
||||
- 'Required Plugins': jenkins/requiredPlugins.md
|
||||
|
||||
|
15
go.mod
Normal file
@ -0,0 +1,15 @@
|
||||
module github.com/SAP/jenkins-library
|
||||
|
||||
go 1.13
|
||||
|
||||
require (
|
||||
github.com/ghodss/yaml v1.0.0
|
||||
github.com/google/go-cmp v0.3.1
|
||||
github.com/google/go-github/v28 v28.1.1
|
||||
github.com/pkg/errors v0.8.1
|
||||
github.com/sirupsen/logrus v1.4.2
|
||||
github.com/spf13/cobra v0.0.5
|
||||
github.com/spf13/pflag v1.0.5
|
||||
github.com/stretchr/testify v1.2.2
|
||||
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45
|
||||
)
|
72
go.sum
Normal file
@ -0,0 +1,72 @@
|
||||
cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
|
||||
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
|
||||
github.com/armon/consul-api v0.0.0-20180202201655-eb2c6b5be1b6/go.mod h1:grANhF5doyWs3UAsr3K4I6qtAmlQcZDesFNEHPZAzj8=
|
||||
github.com/coreos/etcd v3.3.10+incompatible/go.mod h1:uF7uidLiAD3TWHmW31ZFd/JWoc32PjwdhPthX9715RE=
|
||||
github.com/coreos/go-etcd v2.0.0+incompatible/go.mod h1:Jez6KQU2B/sWsbdaef3ED8NzMklzPG4d5KIOhIy30Tk=
|
||||
github.com/coreos/go-semver v0.2.0/go.mod h1:nnelYz7RCh+5ahJtPPxZlU+153eP4D4r3EedlOD2RNk=
|
||||
github.com/cpuguy83/go-md2man v1.0.10/go.mod h1:SmD6nW6nTyfqj6ABTjUi3V3JVMnlJmwcJI5acqYI6dE=
|
||||
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
|
||||
github.com/ghodss/yaml v1.0.0 h1:wQHKEahhL6wmXdzwWG11gIVCkOv05bNOh+Rxn0yngAk=
|
||||
github.com/ghodss/yaml v1.0.0/go.mod h1:4dBDuWmgqj2HViK6kFavaiC9ZROes6MMH2rRYeMEF04=
|
||||
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
|
||||
github.com/google/go-github v17.0.0+incompatible h1:N0LgJ1j65A7kfXrZnUDaYCs/Sf4rEjNlfyDHW9dolSY=
|
||||
github.com/google/go-github/v28 v28.1.1 h1:kORf5ekX5qwXO2mGzXXOjMe/g6ap8ahVe0sBEulhSxo=
|
||||
github.com/google/go-github/v28 v28.1.1/go.mod h1:bsqJWQX05omyWVmc00nEUql9mhQyv38lDZ8kPZcQVoM=
|
||||
github.com/google/go-querystring v1.0.0 h1:Xkwi/a1rcvNg1PPYe5vI8GbeBY/jrVuDX5ASuANWTrk=
|
||||
github.com/google/go-querystring v1.0.0/go.mod h1:odCYkC5MyYFN7vkCjXpyrEuKhc/BUO6wN/zVPAxq5ck=
|
||||
github.com/google/go-cmp v0.3.1 h1:Xye71clBPdm5HgqGwUkwhbynsUJZhDbS20FvLhQ2izg=
|
||||
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
|
||||
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
|
||||
github.com/inconshreveable/mousetrap v1.0.0 h1:Z8tu5sraLXCXIcARxBp/8cbvlwVa7Z1NHg9XEKhtSvM=
|
||||
github.com/inconshreveable/mousetrap v1.0.0/go.mod h1:PxqpIevigyE2G7u3NXJIT2ANytuPF1OarO4DADm73n8=
|
||||
github.com/konsorten/go-windows-terminal-sequences v1.0.1 h1:mweAR1A6xJ3oS2pRaGiHgQ4OO8tzTaLawm8vnODuwDk=
|
||||
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
|
||||
github.com/magiconair/properties v1.8.0/go.mod h1:PppfXfuXeibc/6YijjN8zIbojt8czPbwD3XqdrwzmxQ=
|
||||
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
|
||||
github.com/mitchellh/mapstructure v1.1.2/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
|
||||
github.com/pelletier/go-toml v1.2.0/go.mod h1:5z9KED0ma1S8pY6P1sdut58dfprrGBbd/94hg7ilaic=
|
||||
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
|
||||
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||
github.com/russross/blackfriday v1.5.2/go.mod h1:JO/DiYxRf+HjHt06OyowR9PTA263kcR/rfWxYHBV53g=
|
||||
github.com/sirupsen/logrus v1.4.2 h1:SPIRibHv4MatM3XXNO2BJeFLZwZ2LvZgfQ5+UNI2im4=
|
||||
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
|
||||
github.com/spf13/afero v1.1.2/go.mod h1:j4pytiNVoe2o6bmDsKpLACNPDBIoEAkihy7loJ1B0CQ=
|
||||
github.com/spf13/cast v1.3.0/go.mod h1:Qx5cxh0v+4UWYiBimWS+eyWzqEqokIECu5etghLkUJE=
|
||||
github.com/spf13/cobra v0.0.5 h1:f0B+LkLX6DtmRH1isoNA9VTtNUK9K8xYd28JNNfOv/s=
|
||||
github.com/spf13/cobra v0.0.5/go.mod h1:3K3wKZymM7VvHMDS9+Akkh4K60UwM26emMESw8tLCHU=
|
||||
github.com/spf13/jwalterweatherman v1.0.0/go.mod h1:cQK4TGJAtQXfYWX+Ddv3mKDzgVb68N+wFjFa4jdeBTo=
|
||||
github.com/spf13/pflag v1.0.3/go.mod h1:DYY7MBk1bdzusC3SYhjObp+wFpr4gzcvqqNjLnInEg4=
|
||||
github.com/spf13/pflag v1.0.5 h1:iy+VFUOCP1a+8yFto/drg2CJ5u0yRoB7fZw3DKv/JXA=
|
||||
github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
|
||||
github.com/spf13/viper v1.3.2/go.mod h1:ZiWeW+zYFKm7srdB9IoDzzZXaJaI5eL9QjNiN/DMA2s=
|
||||
github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
||||
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
|
||||
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||
github.com/ugorji/go/codec v0.0.0-20181204163529-d75b2dcb6bc8/go.mod h1:VFNgLljTbGfSG7qAOspJ7OScBnGdDN/yBr0sguwnwf0=
|
||||
github.com/xordataexchange/crypt v0.0.3-0.20170626215501-b2862e3d0a77/go.mod h1:aYKd//L2LvnjZzWKhF00oedf4jCCReLcmhLdhm1A27Q=
|
||||
golang.org/x/crypto v0.0.0-20181203042331-505ab145d0a9/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
|
||||
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2 h1:VklqNMn3ovrHsnt90PveolxSbWFaJdECFbxSq0Mqo2M=
|
||||
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
|
||||
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
|
||||
golang.org/x/net v0.0.0-20190108225652-1e06a53dbb7e/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
|
||||
golang.org/x/net v0.0.0-20190311183353-d8887717615a h1:oWX7TPOiFAMXLq8o0ikBYfCJVlRHBcsciT5bXOrH628=
|
||||
golang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
|
||||
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
|
||||
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45 h1:SVwTIAaPC2U/AvvLNZ2a7OVsmBpC8L5BlwK1whH3hm0=
|
||||
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
|
||||
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
|
||||
golang.org/x/sync v0.0.0-20190227155943-e225da77a7e6/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
|
||||
golang.org/x/sys v0.0.0-20181205085412-a5c9d58dba9a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
||||
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
||||
golang.org/x/sys v0.0.0-20190422165155-953cdadca894 h1:Cz4ceDQGXuKRnVBDTS23GTn/pU5OE2C0WrNTOYK1Uuc=
|
||||
golang.org/x/sys v0.0.0-20190422165155-953cdadca894/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
|
||||
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
|
||||
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/yaml.v2 v2.2.2 h1:ZCJp+EgiOT7lHqUV2J862kp8Qj64Jo6az82+3Td9dZw=
|
||||
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
|
9
main.go
Normal file
@ -0,0 +1,9 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"github.com/SAP/jenkins-library/cmd"
|
||||
)
|
||||
|
||||
func main() {
|
||||
cmd.Execute()
|
||||
}
|
134
pkg/command/command.go
Normal file
@ -0,0 +1,134 @@
|
||||
package command
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"io"
|
||||
"os"
|
||||
"os/exec"
|
||||
"sync"
|
||||
|
||||
"github.com/pkg/errors"
|
||||
)
|
||||
|
||||
// Command defines the information required for executing a call to any executable
|
||||
type Command struct {
|
||||
dir string
|
||||
Stdout io.Writer
|
||||
Stderr io.Writer
|
||||
}
|
||||
|
||||
// Dir sets the working directory for the execution
|
||||
func (c *Command) Dir(d string) {
|
||||
c.dir = d
|
||||
}
|
||||
|
||||
// ExecCommand defines how to execute os commands
|
||||
var ExecCommand = exec.Command
|
||||
|
||||
// RunShell runs the specified command on the shell
|
||||
func (c *Command) RunShell(shell, script string) error {
|
||||
|
||||
_out, _err := prepareOut(c.Stdout, c.Stderr)
|
||||
|
||||
cmd := ExecCommand(shell)
|
||||
|
||||
cmd.Dir = c.dir
|
||||
in := bytes.Buffer{}
|
||||
in.Write([]byte(script))
|
||||
cmd.Stdin = &in
|
||||
|
||||
if err := runCmd(cmd, _out, _err); err != nil {
|
||||
return errors.Wrapf(err, "running shell script failed with %v", shell)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// RunExecutable runs the specified executable with parameters
|
||||
func (c *Command) RunExecutable(executable string, params ...string) error {
|
||||
|
||||
_out, _err := prepareOut(c.Stdout, c.Stderr)
|
||||
|
||||
cmd := ExecCommand(executable, params...)
|
||||
|
||||
if len(c.dir) > 0 {
|
||||
cmd.Dir = c.dir
|
||||
}
|
||||
|
||||
if err := runCmd(cmd, _out, _err); err != nil {
|
||||
return errors.Wrapf(err, "running command '%v' failed", executable)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func runCmd(cmd *exec.Cmd, _out, _err io.Writer) error {
|
||||
|
||||
stdout, stderr, err := cmdPipes(cmd)
|
||||
|
||||
if err != nil {
|
||||
return errors.Wrap(err, "getting commmand pipes failed")
|
||||
}
|
||||
|
||||
err = cmd.Start()
|
||||
if err != nil {
|
||||
return errors.Wrap(err, "starting command failed")
|
||||
}
|
||||
|
||||
var wg sync.WaitGroup
|
||||
wg.Add(2)
|
||||
|
||||
var errStdout, errStderr error
|
||||
|
||||
go func() {
|
||||
_, errStdout = io.Copy(_out, stdout)
|
||||
wg.Done()
|
||||
}()
|
||||
|
||||
go func() {
|
||||
_, errStderr = io.Copy(_err, stderr)
|
||||
wg.Done()
|
||||
}()
|
||||
|
||||
wg.Wait()
|
||||
|
||||
err = cmd.Wait()
|
||||
|
||||
if err != nil {
|
||||
return errors.Wrap(err, "cmd.Run() failed")
|
||||
}
|
||||
|
||||
if errStdout != nil || errStderr != nil {
|
||||
return fmt.Errorf("failed to capture stdout/stderr: '%v'/'%v'", errStdout, errStderr)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func prepareOut(stdout, stderr io.Writer) (io.Writer, io.Writer) {
|
||||
|
||||
//ToDo: check use of multiwriter instead to always write into os.Stdout and os.Stdin?
|
||||
//stdout := io.MultiWriter(os.Stdout, &stdoutBuf)
|
||||
//stderr := io.MultiWriter(os.Stderr, &stderrBuf)
|
||||
|
||||
if stdout == nil {
|
||||
stdout = os.Stdout
|
||||
}
|
||||
if stderr == nil {
|
||||
stderr = os.Stderr
|
||||
}
|
||||
|
||||
return stdout, stderr
|
||||
}
|
||||
|
||||
func cmdPipes(cmd *exec.Cmd) (io.ReadCloser, io.ReadCloser, error) {
|
||||
stdout, err := cmd.StdoutPipe()
|
||||
if err != nil {
|
||||
return nil, nil, errors.Wrap(err, "getting Stdout pipe failed")
|
||||
}
|
||||
|
||||
stderr, err := cmd.StderrPipe()
|
||||
if err != nil {
|
||||
return nil, nil, errors.Wrap(err, "getting Stderr pipe failed")
|
||||
}
|
||||
return stdout, stderr, nil
|
||||
}
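
For orientation, the `Command` type introduced above could be used from Go code roughly like this. This is a minimal sketch with simplified error handling; it only relies on the exported fields and methods shown in the file above:

```go
package main

import (
	"bytes"
	"fmt"
	"os"

	"github.com/SAP/jenkins-library/pkg/command"
)

func main() {
	// Without explicit writers, output goes to os.Stdout and os.Stderr (see prepareOut).
	c := command.Command{}
	if err := c.RunExecutable("echo", "hello", "world"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}

	// Capture the output of a shell script in a buffer instead.
	out := &bytes.Buffer{}
	s := command.Command{Stdout: out, Stderr: os.Stderr}
	if err := s.RunShell("/bin/bash", "echo from-a-script"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
	fmt.Print(out.String())
}
```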
|
181
pkg/command/command_test.go
Normal file
@ -0,0 +1,181 @@
|
||||
package command
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"os/exec"
|
||||
"testing"
|
||||
)
|
||||
|
||||
//based on https://golang.org/src/os/exec/exec_test.go
|
||||
func helperCommand(command string, s ...string) (cmd *exec.Cmd) {
|
||||
cs := []string{"-test.run=TestHelperProcess", "--", command}
|
||||
cs = append(cs, s...)
|
||||
cmd = exec.Command(os.Args[0], cs...)
|
||||
cmd.Env = []string{"GO_WANT_HELPER_PROCESS=1"}
|
||||
return cmd
|
||||
}
|
||||
|
||||
func TestShellRun(t *testing.T) {
|
||||
|
||||
t.Run("test shell", func(t *testing.T) {
|
||||
ExecCommand = helperCommand
|
||||
defer func() { ExecCommand = exec.Command }()
|
||||
o := new(bytes.Buffer)
|
||||
e := new(bytes.Buffer)
|
||||
|
||||
s := Command{Stdout: o, Stderr: e}
|
||||
s.RunShell("/bin/bash", "myScript")
|
||||
|
||||
t.Run("success case", func(t *testing.T) {
|
||||
t.Run("stdin-stdout", func(t *testing.T) {
|
||||
expectedOut := "Stdout: command /bin/bash - Stdin: myScript\n"
|
||||
if oStr := o.String(); oStr != expectedOut {
|
||||
t.Errorf("expected: %v got: %v", expectedOut, oStr)
|
||||
}
|
||||
})
|
||||
t.Run("stderr", func(t *testing.T) {
|
||||
expectedErr := "Stderr: command /bin/bash\n"
|
||||
if eStr := e.String(); eStr != expectedErr {
|
||||
t.Errorf("expected: %v got: %v", expectedErr, eStr)
|
||||
}
|
||||
})
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
func TestExecutableRun(t *testing.T) {
|
||||
|
||||
t.Run("test shell", func(t *testing.T) {
|
||||
ExecCommand = helperCommand
|
||||
defer func() { ExecCommand = exec.Command }()
|
||||
o := new(bytes.Buffer)
|
||||
e := new(bytes.Buffer)
|
||||
|
||||
ex := Command{Stdout: o, Stderr: e}
|
||||
ex.RunExecutable("echo", []string{"foo bar", "baz"}...)
|
||||
|
||||
t.Run("success case", func(t *testing.T) {
|
||||
t.Run("stdin", func(t *testing.T) {
|
||||
expectedOut := "foo bar baz\n"
|
||||
if oStr := o.String(); oStr != expectedOut {
|
||||
t.Errorf("expected: %v got: %v", expectedOut, oStr)
|
||||
}
|
||||
})
|
||||
t.Run("stderr", func(t *testing.T) {
|
||||
expectedErr := "Stderr: command echo\n"
|
||||
if eStr := e.String(); eStr != expectedErr {
|
||||
t.Errorf("expected: %v got: %v", expectedErr, eStr)
|
||||
}
|
||||
})
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
func TestPrepareOut(t *testing.T) {
|
||||
|
||||
t.Run("os", func(t *testing.T) {
|
||||
s := Command{}
|
||||
_out, _err := prepareOut(s.Stdout, s.Stderr)
|
||||
|
||||
if _out != os.Stdout {
|
||||
t.Errorf("expected out to be os.Stdout")
|
||||
}
|
||||
|
||||
if _err != os.Stderr {
|
||||
t.Errorf("expected err to be os.Stderr")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("custom", func(t *testing.T) {
|
||||
o := bytes.NewBufferString("")
|
||||
e := bytes.NewBufferString("")
|
||||
s := Command{Stdout: o, Stderr: e}
|
||||
_out, _err := prepareOut(s.Stdout, s.Stderr)
|
||||
|
||||
expectOut := "Test out"
|
||||
expectErr := "Test err"
|
||||
_out.Write([]byte(expectOut))
|
||||
_err.Write([]byte(expectErr))
|
||||
|
||||
t.Run("out", func(t *testing.T) {
|
||||
if o.String() != expectOut {
|
||||
t.Errorf("expected: %v got: %v", expectOut, o.String())
|
||||
}
|
||||
})
|
||||
t.Run("err", func(t *testing.T) {
|
||||
if e.String() != expectErr {
|
||||
t.Errorf("expected: %v got: %v", expectErr, e.String())
|
||||
}
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
func TestCmdPipes(t *testing.T) {
|
||||
cmd := helperCommand("echo", "foo bar", "baz")
|
||||
defer func() { ExecCommand = exec.Command }()
|
||||
|
||||
t.Run("success case", func(t *testing.T) {
|
||||
o, e, err := cmdPipes(cmd)
|
||||
t.Run("no error", func(t *testing.T) {
|
||||
if err != nil {
|
||||
t.Errorf("error occured but no error expected")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("out pipe", func(t *testing.T) {
|
||||
if o == nil {
|
||||
t.Errorf("no pipe received")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("err pipe", func(t *testing.T) {
|
||||
if e == nil {
|
||||
t.Errorf("no pipe received")
|
||||
}
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
//based on https://golang.org/src/os/exec/exec_test.go
|
||||
//this is not directly executed
|
||||
func TestHelperProcess(*testing.T) {
|
||||
if os.Getenv("GO_WANT_HELPER_PROCESS") != "1" {
|
||||
return
|
||||
}
|
||||
defer os.Exit(0)
|
||||
|
||||
args := os.Args
|
||||
for len(args) > 0 {
|
||||
if args[0] == "--" {
|
||||
args = args[1:]
|
||||
break
|
||||
}
|
||||
args = args[1:]
|
||||
}
|
||||
if len(args) == 0 {
|
||||
fmt.Fprintf(os.Stderr, "No command\n")
|
||||
os.Exit(2)
|
||||
}
|
||||
|
||||
cmd, args := args[0], args[1:]
|
||||
switch cmd {
|
||||
case "/bin/bash":
|
||||
o, _ := ioutil.ReadAll(os.Stdin)
|
||||
fmt.Fprintf(os.Stdout, "Stdout: command %v - Stdin: %v\n", cmd, string(o))
|
||||
fmt.Fprintf(os.Stderr, "Stderr: command %v\n", cmd)
|
||||
case "echo":
|
||||
iargs := []interface{}{}
|
||||
for _, s := range args {
|
||||
iargs = append(iargs, s)
|
||||
}
|
||||
fmt.Println(iargs...)
|
||||
fmt.Fprintf(os.Stderr, "Stderr: command %v\n", cmd)
|
||||
default:
|
||||
fmt.Fprintf(os.Stderr, "Unknown command %q\n", cmd)
|
||||
os.Exit(2)
|
||||
|
||||
}
|
||||
}
|
248
pkg/config/config.go
Normal file
@ -0,0 +1,248 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"strings"
|
||||
|
||||
"github.com/ghodss/yaml"
|
||||
"github.com/google/go-cmp/cmp"
|
||||
"github.com/pkg/errors"
|
||||
)
|
||||
|
||||
// Config defines the structure of the config files
|
||||
type Config struct {
|
||||
General map[string]interface{} `json:"general"`
|
||||
Stages map[string]map[string]interface{} `json:"stages"`
|
||||
Steps map[string]map[string]interface{} `json:"steps"`
|
||||
}
|
||||
|
||||
// StepConfig defines the structure for merged step configuration
|
||||
type StepConfig struct {
|
||||
Config map[string]interface{}
|
||||
}
|
||||
|
||||
// ReadConfig loads config and returns its content
|
||||
func (c *Config) ReadConfig(configuration io.ReadCloser) error {
|
||||
defer configuration.Close()
|
||||
|
||||
content, err := ioutil.ReadAll(configuration)
|
||||
if err != nil {
|
||||
return errors.Wrapf(err, "error reading %v", configuration)
|
||||
}
|
||||
|
||||
err = yaml.Unmarshal(content, &c)
|
||||
if err != nil {
|
||||
return NewParseError(fmt.Sprintf("error unmarshalling %q: %v", content, err))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// ApplyAliasConfig adds configuration values available on aliases to primary configuration parameters
|
||||
func (c *Config) ApplyAliasConfig(parameters []StepParameters, filters StepFilters, stageName, stepName string) {
|
||||
for _, p := range parameters {
|
||||
c.General = setParamValueFromAlias(c.General, filters.General, p)
|
||||
if c.Stages[stageName] != nil {
|
||||
c.Stages[stageName] = setParamValueFromAlias(c.Stages[stageName], filters.Stages, p)
|
||||
}
|
||||
if c.Steps[stepName] != nil {
|
||||
c.Steps[stepName] = setParamValueFromAlias(c.Steps[stepName], filters.Steps, p)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func setParamValueFromAlias(configMap map[string]interface{}, filter []string, p StepParameters) map[string]interface{} {
|
||||
if configMap[p.Name] == nil && sliceContains(filter, p.Name) {
|
||||
for _, a := range p.Aliases {
|
||||
configMap[p.Name] = getDeepAliasValue(configMap, a.Name)
|
||||
if configMap[p.Name] != nil {
|
||||
return configMap
|
||||
}
|
||||
}
|
||||
}
|
||||
return configMap
|
||||
}
|
||||
|
||||
func getDeepAliasValue(configMap map[string]interface{}, key string) interface{} {
|
||||
parts := strings.Split(key, "/")
|
||||
if len(parts) > 1 {
|
||||
if configMap[parts[0]] == nil {
|
||||
return nil
|
||||
}
|
||||
return getDeepAliasValue(configMap[parts[0]].(map[string]interface{}), strings.Join(parts[1:], "/"))
|
||||
}
|
||||
return configMap[key]
|
||||
}
|
||||
|
||||
// GetStepConfig provides merged step configuration using defaults, config, if available
|
||||
func (c *Config) GetStepConfig(flagValues map[string]interface{}, paramJSON string, configuration io.ReadCloser, defaults []io.ReadCloser, filters StepFilters, parameters []StepParameters, stageName, stepName string) (StepConfig, error) {
|
||||
var stepConfig StepConfig
|
||||
var d PipelineDefaults
|
||||
|
||||
if configuration != nil {
|
||||
if err := c.ReadConfig(configuration); err != nil {
|
||||
return StepConfig{}, errors.Wrap(err, "failed to parse custom pipeline configuration")
|
||||
}
|
||||
}
|
||||
c.ApplyAliasConfig(parameters, filters, stageName, stepName)
|
||||
|
||||
if err := d.ReadPipelineDefaults(defaults); err != nil {
|
||||
switch err.(type) {
|
||||
case *ParseError:
|
||||
return StepConfig{}, errors.Wrap(err, "failed to parse pipeline default configuration")
|
||||
default:
|
||||
//ignoring unavailability of defaults since considered optional
|
||||
}
|
||||
}
|
||||
|
||||
// first: read defaults & merge general -> steps (-> general -> steps ...)
|
||||
for _, def := range d.Defaults {
|
||||
def.ApplyAliasConfig(parameters, filters, stageName, stepName)
|
||||
stepConfig.mixIn(def.General, filters.General)
|
||||
stepConfig.mixIn(def.Steps[stepName], filters.Steps)
|
||||
}
|
||||
|
||||
// second: read config & merge - general -> steps -> stages
|
||||
stepConfig.mixIn(c.General, filters.General)
|
||||
stepConfig.mixIn(c.Steps[stepName], filters.Steps)
|
||||
stepConfig.mixIn(c.Stages[stageName], filters.Stages)
|
||||
|
||||
// third: merge parameters provided via env vars
|
||||
stepConfig.mixIn(envValues(filters.All), filters.All)
|
||||
|
||||
// fourth: if parameters are provided in JSON format merge them
|
||||
if len(paramJSON) != 0 {
|
||||
var params map[string]interface{}
|
||||
json.Unmarshal([]byte(paramJSON), ¶ms)
|
||||
|
||||
//apply aliases
|
||||
for _, p := range parameters {
|
||||
params = setParamValueFromAlias(params, filters.Parameters, p)
|
||||
}
|
||||
|
||||
stepConfig.mixIn(params, filters.Parameters)
|
||||
}
|
||||
|
||||
// fifth: merge command line flags
|
||||
if flagValues != nil {
|
||||
stepConfig.mixIn(flagValues, filters.Parameters)
|
||||
}
|
||||
|
||||
// finally do the condition evaluation post processing
|
||||
for _, p := range parameters {
|
||||
if len(p.Conditions) > 0 {
|
||||
cp := p.Conditions[0].Params[0]
|
||||
dependentValue := stepConfig.Config[cp.Name]
|
||||
if cmp.Equal(dependentValue, cp.Value) && stepConfig.Config[p.Name] == nil {
|
||||
subMapValue := stepConfig.Config[dependentValue.(string)].(map[string]interface{})[p.Name]
|
||||
if subMapValue != nil {
|
||||
stepConfig.Config[p.Name] = subMapValue
|
||||
} else {
|
||||
stepConfig.Config[p.Name] = p.Default
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return stepConfig, nil
|
||||
}
|
||||
|
||||
// GetStepConfigWithJSON provides merged step configuration using a provided stepConfigJSON with additional flags provided
|
||||
func GetStepConfigWithJSON(flagValues map[string]interface{}, stepConfigJSON string, filters StepFilters) StepConfig {
|
||||
var stepConfig StepConfig
|
||||
|
||||
stepConfigMap := map[string]interface{}{}
|
||||
|
||||
json.Unmarshal([]byte(stepConfigJSON), &stepConfigMap)
|
||||
|
||||
stepConfig.mixIn(stepConfigMap, filters.All)
|
||||
|
||||
// ToDo: mix in parametersJSON
|
||||
|
||||
if flagValues != nil {
|
||||
stepConfig.mixIn(flagValues, filters.Parameters)
|
||||
}
|
||||
return stepConfig
|
||||
}
|
||||
|
||||
// GetJSON returns JSON representation of an object
|
||||
func GetJSON(data interface{}) (string, error) {
|
||||
|
||||
result, err := json.Marshal(data)
|
||||
if err != nil {
|
||||
return "", errors.Wrapf(err, "error marshalling json: %v", err)
|
||||
}
|
||||
return string(result), nil
|
||||
}
|
||||
|
||||
func envValues(filter []string) map[string]interface{} {
|
||||
vals := map[string]interface{}{}
|
||||
for _, param := range filter {
|
||||
if envVal := os.Getenv("PIPER_" + param); len(envVal) != 0 {
|
||||
vals[param] = os.Getenv("PIPER_" + param)
|
||||
}
|
||||
}
|
||||
return vals
|
||||
}
|
||||
|
||||
func (s *StepConfig) mixIn(mergeData map[string]interface{}, filter []string) {
|
||||
|
||||
if s.Config == nil {
|
||||
s.Config = map[string]interface{}{}
|
||||
}
|
||||
|
||||
s.Config = merge(s.Config, filterMap(mergeData, filter))
|
||||
}
|
||||
|
||||
func filterMap(data map[string]interface{}, filter []string) map[string]interface{} {
|
||||
result := map[string]interface{}{}
|
||||
|
||||
if data == nil {
|
||||
data = map[string]interface{}{}
|
||||
}
|
||||
|
||||
for key, value := range data {
|
||||
if len(filter) == 0 || sliceContains(filter, key) {
|
||||
result[key] = value
|
||||
}
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func merge(base, overlay map[string]interface{}) map[string]interface{} {
|
||||
|
||||
result := map[string]interface{}{}
|
||||
|
||||
if base == nil {
|
||||
base = map[string]interface{}{}
|
||||
}
|
||||
|
||||
for key, value := range base {
|
||||
result[key] = value
|
||||
}
|
||||
|
||||
for key, value := range overlay {
|
||||
if val, ok := value.(map[string]interface{}); ok {
|
||||
if valBaseKey, ok := base[key].(map[string]interface{}); !ok {
|
||||
result[key] = merge(map[string]interface{}{}, val)
|
||||
} else {
|
||||
result[key] = merge(valBaseKey, val)
|
||||
}
|
||||
} else {
|
||||
result[key] = value
|
||||
}
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func sliceContains(slice []string, find string) bool {
|
||||
for _, elem := range slice {
|
||||
if elem == find {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
409
pkg/config/config_test.go
Normal file
@ -0,0 +1,409 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
type errReadCloser int
|
||||
|
||||
func (errReadCloser) Read(p []byte) (n int, err error) {
|
||||
return 0, errors.New("read error")
|
||||
}
|
||||
|
||||
func (errReadCloser) Close() error {
|
||||
return nil
|
||||
}
|
||||
|
||||
func TestReadConfig(t *testing.T) {
|
||||
|
||||
var c Config
|
||||
|
||||
t.Run("Success case", func(t *testing.T) {
|
||||
|
||||
myConfig := strings.NewReader("general:\n generalTestKey: generalTestValue\nsteps:\n testStep:\n testStepKey: testStepValue")
|
||||
|
||||
err := c.ReadConfig(ioutil.NopCloser(myConfig)) // NopCloser "no-ops" the closing interface since strings do not need to be closed
|
||||
if err != nil {
|
||||
t.Errorf("Got error although no error expected: %v", err)
|
||||
}
|
||||
|
||||
if c.General["generalTestKey"] != "generalTestValue" {
|
||||
t.Errorf("General config- got: %v, expected: %v", c.General["generalTestKey"], "generalTestValue")
|
||||
}
|
||||
|
||||
if c.Steps["testStep"]["testStepKey"] != "testStepValue" {
|
||||
t.Errorf("Step config - got: %v, expected: %v", c.Steps["testStep"]["testStepKey"], "testStepValue")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Read failure", func(t *testing.T) {
|
||||
var rc errReadCloser
|
||||
err := c.ReadConfig(rc)
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Unmarshalling failure", func(t *testing.T) {
|
||||
myConfig := strings.NewReader("general:\n generalTestKey: generalTestValue\nsteps:\n testStep:\n\ttestStepKey: testStepValue")
|
||||
err := c.ReadConfig(ioutil.NopCloser(myConfig))
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
|
||||
}
|
||||
|
||||
func TestGetStepConfig(t *testing.T) {
|
||||
|
||||
t.Run("Success case", func(t *testing.T) {
|
||||
|
||||
testConfig := `general:
|
||||
p3: p3_general
|
||||
px3: px3_general
|
||||
p4: p4_general
|
||||
steps:
|
||||
step1:
|
||||
p4: p4_step
|
||||
px4: px4_step
|
||||
p5: p5_step
|
||||
dependentParameter: dependentValue
|
||||
stages:
|
||||
stage1:
|
||||
p5: p5_stage
|
||||
px5: px5_stage
|
||||
p6: p6_stage
|
||||
`
|
||||
filters := StepFilters{
|
||||
General: []string{"p0", "p1", "p2", "p3", "p4"},
|
||||
Steps: []string{"p0", "p1", "p2", "p3", "p4", "p5", "dependentParameter", "pd1", "dependentValue", "pd2"},
|
||||
Stages: []string{"p0", "p1", "p2", "p3", "p4", "p5", "p6"},
|
||||
Parameters: []string{"p0", "p1", "p2", "p3", "p4", "p5", "p6", "p7"},
|
||||
Env: []string{"p0", "p1", "p2", "p3", "p4", "p5"},
|
||||
}
|
||||
|
||||
defaults1 := `general:
|
||||
p0: p0_general_default
|
||||
px0: px0_general_default
|
||||
p1: p1_general_default
|
||||
steps:
|
||||
step1:
|
||||
p1: p1_step_default
|
||||
px1: px1_step_default
|
||||
p2: p2_step_default
|
||||
dependentValue:
|
||||
pd1: pd1_dependent_default
|
||||
`
|
||||
|
||||
defaults2 := `general:
|
||||
p2: p2_general_default
|
||||
px2: px2_general_default
|
||||
p3: p3_general_default
|
||||
`
|
||||
paramJSON := `{"p6":"p6_param","p7":"p7_param"}`
|
||||
|
||||
flags := map[string]interface{}{"p7": "p7_flag"}
|
||||
|
||||
var c Config
|
||||
defaults := []io.ReadCloser{ioutil.NopCloser(strings.NewReader(defaults1)), ioutil.NopCloser(strings.NewReader(defaults2))}
|
||||
|
||||
myConfig := ioutil.NopCloser(strings.NewReader(testConfig))
|
||||
|
||||
parameterMetadata := []StepParameters{
|
||||
{
|
||||
Name: "pd1",
|
||||
Scope: []string{"STEPS"},
|
||||
Conditions: []Condition{
|
||||
{
|
||||
Params: []Param{
|
||||
{Name: "dependentParameter", Value: "dependentValue"},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "pd2",
|
||||
Default: "pd2_metadata_default",
|
||||
Scope: []string{"STEPS"},
|
||||
Conditions: []Condition{
|
||||
{
|
||||
Params: []Param{
|
||||
{Name: "dependentParameter", Value: "dependentValue"},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
stepConfig, err := c.GetStepConfig(flags, paramJSON, myConfig, defaults, filters, parameterMetadata, "stage1", "step1")
|
||||
|
||||
assert.Equal(t, nil, err, "error occurred but none expected")
|
||||
|
||||
t.Run("Config", func(t *testing.T) {
|
||||
expected := map[string]string{
|
||||
"p0": "p0_general_default",
|
||||
"p1": "p1_step_default",
|
||||
"p2": "p2_general_default",
|
||||
"p3": "p3_general",
|
||||
"p4": "p4_step",
|
||||
"p5": "p5_stage",
|
||||
"p6": "p6_param",
|
||||
"p7": "p7_flag",
|
||||
"pd1": "pd1_dependent_default",
|
||||
"pd2": "pd2_metadata_default",
|
||||
}
|
||||
for k, v := range expected {
|
||||
t.Run(k, func(t *testing.T) {
|
||||
if stepConfig.Config[k] != v {
|
||||
t.Errorf("got: %v, expected: %v", stepConfig.Config[k], v)
|
||||
}
|
||||
})
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Config not expected", func(t *testing.T) {
|
||||
notExpectedKeys := []string{"px0", "px1", "px2", "px3", "px4", "px5"}
|
||||
for _, p := range notExpectedKeys {
|
||||
t.Run(p, func(t *testing.T) {
|
||||
if stepConfig.Config[p] != nil {
|
||||
t.Errorf("unexpected: %v", p)
|
||||
}
|
||||
})
|
||||
}
|
||||
})
|
||||
})
|
||||
|
||||
t.Run("Failure case config", func(t *testing.T) {
|
||||
var c Config
|
||||
myConfig := ioutil.NopCloser(strings.NewReader("invalid config"))
|
||||
_, err := c.GetStepConfig(nil, "", myConfig, nil, StepFilters{}, []StepParameters{}, "stage1", "step1")
|
||||
assert.EqualError(t, err, "failed to parse custom pipeline configuration: error unmarshalling \"invalid config\": error unmarshaling JSON: json: cannot unmarshal string into Go value of type config.Config", "default error expected")
|
||||
})
|
||||
|
||||
t.Run("Failure case defaults", func(t *testing.T) {
|
||||
var c Config
|
||||
myConfig := ioutil.NopCloser(strings.NewReader(""))
|
||||
myDefaults := []io.ReadCloser{ioutil.NopCloser(strings.NewReader("invalid defaults"))}
|
||||
_, err := c.GetStepConfig(nil, "", myConfig, myDefaults, StepFilters{}, []StepParameters{}, "stage1", "step1")
|
||||
assert.EqualError(t, err, "failed to parse pipeline default configuration: error unmarshalling \"invalid defaults\": error unmarshaling JSON: json: cannot unmarshal string into Go value of type config.Config", "default error expected")
|
||||
})
|
||||
|
||||
//ToDo: test merging of env and parameters/flags
|
||||
}
|
||||
|
||||
func TestGetStepConfigWithJSON(t *testing.T) {
|
||||
|
||||
filters := StepFilters{All: []string{"key1"}}
|
||||
|
||||
t.Run("Without flags", func(t *testing.T) {
|
||||
sc := GetStepConfigWithJSON(nil, `"key1":"value1","key2":"value2"`, filters)
|
||||
|
||||
if sc.Config["key1"] != "value1" && sc.Config["key2"] == "value2" {
|
||||
t.Errorf("got: %v, expected: %v", sc.Config, StepConfig{Config: map[string]interface{}{"key1": "value1"}})
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("With flags", func(t *testing.T) {
|
||||
flags := map[string]interface{}{"key1": "flagVal1"}
|
||||
sc := GetStepConfigWithJSON(flags, `"key1":"value1","key2":"value2"`, filters)
|
||||
if sc.Config["key1"] != "flagVal1" {
|
||||
t.Errorf("got: %v, expected: %v", sc.Config["key1"], "flagVal1")
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
func TestApplyAliasConfig(t *testing.T) {
|
||||
p := []StepParameters{
|
||||
{
|
||||
Name: "p0",
|
||||
Aliases: []Alias{
|
||||
{Name: "p0_notused"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p1",
|
||||
Aliases: []Alias{
|
||||
{Name: "p1_alias"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p2",
|
||||
Aliases: []Alias{
|
||||
{Name: "p2_alias/deep/test"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p3",
|
||||
Aliases: []Alias{
|
||||
{Name: "p3_notused"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p4",
|
||||
Aliases: []Alias{
|
||||
{Name: "p4_alias"},
|
||||
{Name: "p4_2nd_alias"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p5",
|
||||
Aliases: []Alias{
|
||||
{Name: "p5_notused"},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "p6",
|
||||
Aliases: []Alias{
|
||||
{Name: "p6_1st_alias"},
|
||||
{Name: "p6_alias"},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
filters := StepFilters{
|
||||
General: []string{"p1", "p2"},
|
||||
Stages: []string{"p4"},
|
||||
Steps: []string{"p6"},
|
||||
}
|
||||
|
||||
c := Config{
|
||||
General: map[string]interface{}{
|
||||
"p0_notused": "p0_general",
|
||||
"p1_alias": "p1_general",
|
||||
"p2_alias": map[string]interface{}{
|
||||
"deep": map[string]interface{}{
|
||||
"test": "p2_general",
|
||||
},
|
||||
},
|
||||
},
|
||||
Stages: map[string]map[string]interface{}{
|
||||
"stage1": map[string]interface{}{
|
||||
"p3_notused": "p3_stage",
|
||||
"p4_alias": "p4_stage",
|
||||
},
|
||||
},
|
||||
Steps: map[string]map[string]interface{}{
|
||||
"step1": map[string]interface{}{
|
||||
"p5_notused": "p5_step",
|
||||
"p6_alias": "p6_step",
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
c.ApplyAliasConfig(p, filters, "stage1", "step1")
|
||||
|
||||
t.Run("Global", func(t *testing.T) {
|
||||
assert.Nil(t, c.General["p0"])
|
||||
assert.Equal(t, "p1_general", c.General["p1"])
|
||||
assert.Equal(t, "p2_general", c.General["p2"])
|
||||
})
|
||||
|
||||
t.Run("Stage", func(t *testing.T) {
|
||||
assert.Nil(t, c.General["p3"])
|
||||
assert.Equal(t, "p4_stage", c.Stages["stage1"]["p4"])
|
||||
})
|
||||
|
||||
t.Run("Stage", func(t *testing.T) {
|
||||
assert.Nil(t, c.General["p5"])
|
||||
assert.Equal(t, "p6_step", c.Steps["step1"]["p6"])
|
||||
})
|
||||
|
||||
}
|
||||
|
||||
func TestGetDeepAliasValue(t *testing.T) {
|
||||
c := map[string]interface{}{
|
||||
"p0": "p0_val",
|
||||
"p1": 11,
|
||||
"p2": map[string]interface{}{
|
||||
"p2_0": "p2_0_val",
|
||||
"p2_1": map[string]interface{}{
|
||||
"p2_1_0": "p2_1_0_val",
|
||||
},
|
||||
},
|
||||
}
|
||||
tt := []struct {
|
||||
key string
|
||||
expected interface{}
|
||||
}{
|
||||
{key: "p0", expected: "p0_val"},
|
||||
{key: "p1", expected: 11},
|
||||
{key: "p2/p2_0", expected: "p2_0_val"},
|
||||
{key: "p2/p2_1/p2_1_0", expected: "p2_1_0_val"},
|
||||
}
|
||||
|
||||
for k, v := range tt {
|
||||
assert.Equal(t, v.expected, getDeepAliasValue(c, v.key), fmt.Sprintf("wrong return value for run %v", k+1))
|
||||
}
|
||||
}
|
||||
|
||||
func TestGetJSON(t *testing.T) {
|
||||
|
||||
t.Run("Success case", func(t *testing.T) {
|
||||
custom := map[string]interface{}{"key1": "value1"}
|
||||
json, err := GetJSON(custom)
|
||||
if err != nil {
|
||||
t.Errorf("Got error although no error expected: %v", err)
|
||||
}
|
||||
|
||||
if json != `{"key1":"value1"}` {
|
||||
t.Errorf("got: %v, expected: %v", json, `{"key1":"value1"}`)
|
||||
}
|
||||
|
||||
})
|
||||
t.Run("Marshalling failure", func(t *testing.T) {
|
||||
_, err := GetJSON(make(chan int))
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected")
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
func TestMerge(t *testing.T) {
|
||||
|
||||
testTable := []struct {
|
||||
Source map[string]interface{}
|
||||
Filter []string
|
||||
MergeData map[string]interface{}
|
||||
ExpectedOutput map[string]interface{}
|
||||
}{
|
||||
{
|
||||
Source: map[string]interface{}{"key1": "baseValue"},
|
||||
Filter: []string{},
|
||||
MergeData: map[string]interface{}{"key1": "overwrittenValue"},
|
||||
ExpectedOutput: map[string]interface{}{"key1": "overwrittenValue"},
|
||||
},
|
||||
{
|
||||
Source: map[string]interface{}{"key1": "value1"},
|
||||
Filter: []string{},
|
||||
MergeData: map[string]interface{}{"key2": "value2"},
|
||||
ExpectedOutput: map[string]interface{}{"key1": "value1", "key2": "value2"},
|
||||
},
|
||||
{
|
||||
Source: map[string]interface{}{"key1": "value1"},
|
||||
Filter: []string{"key1"},
|
||||
MergeData: map[string]interface{}{"key2": "value2"},
|
||||
ExpectedOutput: map[string]interface{}{"key1": "value1"},
|
||||
},
|
||||
{
|
||||
Source: map[string]interface{}{"key1": map[string]interface{}{"key1_1": "value1"}},
|
||||
Filter: []string{},
|
||||
MergeData: map[string]interface{}{"key1": map[string]interface{}{"key1_2": "value2"}},
|
||||
ExpectedOutput: map[string]interface{}{"key1": map[string]interface{}{"key1_1": "value1", "key1_2": "value2"}},
|
||||
},
|
||||
}
|
||||
|
||||
for _, row := range testTable {
|
||||
t.Run(fmt.Sprintf("Merging %v into %v", row.MergeData, row.Source), func(t *testing.T) {
|
||||
stepConfig := StepConfig{Config: row.Source}
|
||||
stepConfig.mixIn(row.MergeData, row.Filter)
|
||||
assert.Equal(t, row.ExpectedOutput, stepConfig.Config, "Mixin was incorrect")
|
||||
})
|
||||
}
|
||||
}
|
40
pkg/config/defaults.go
Normal file
@ -0,0 +1,40 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
|
||||
"github.com/ghodss/yaml"
|
||||
"github.com/pkg/errors"
|
||||
)
|
||||
|
||||
// PipelineDefaults defines the structure of the pipeline defaults
|
||||
type PipelineDefaults struct {
|
||||
Defaults []Config `json:"defaults"`
|
||||
}
|
||||
|
||||
// ReadPipelineDefaults loads the given default configurations and appends their content to the pipeline defaults
|
||||
func (d *PipelineDefaults) ReadPipelineDefaults(defaultSources []io.ReadCloser) error {
|
||||
|
||||
for _, def := range defaultSources {
|
||||
|
||||
defer def.Close()
|
||||
|
||||
var c Config
|
||||
var err error
|
||||
|
||||
content, err := ioutil.ReadAll(def)
|
||||
if err != nil {
|
||||
return errors.Wrapf(err, "error reading %v", def)
|
||||
}
|
||||
|
||||
err = yaml.Unmarshal(content, &c)
|
||||
if err != nil {
|
||||
return NewParseError(fmt.Sprintf("error unmarshalling %q: %v", content, err))
|
||||
}
|
||||
|
||||
d.Defaults = append(d.Defaults, c)
|
||||
}
|
||||
return nil
|
||||
}
|
53
pkg/config/defaults_test.go
Normal file
@ -0,0 +1,53 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"strings"
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestReadPipelineDefaults(t *testing.T) {
|
||||
|
||||
var d PipelineDefaults
|
||||
|
||||
t.Run("Success case", func(t *testing.T) {
|
||||
d0 := strings.NewReader("general:\n testStepKey1: testStepValue1")
|
||||
d1 := strings.NewReader("general:\n testStepKey2: testStepValue2")
|
||||
err := d.ReadPipelineDefaults([]io.ReadCloser{ioutil.NopCloser(d0), ioutil.NopCloser(d1)})
|
||||
|
||||
if err != nil {
|
||||
t.Errorf("Got error although no error expected: %v", err)
|
||||
}
|
||||
|
||||
t.Run("Defaults 0", func(t *testing.T) {
|
||||
expected := "testStepValue1"
|
||||
if d.Defaults[0].General["testStepKey1"] != expected {
|
||||
t.Errorf("got: %v, expected: %v", d.Defaults[0].General["testStepKey1"], expected)
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Defaults 1", func(t *testing.T) {
|
||||
expected := "testStepValue2"
|
||||
if d.Defaults[1].General["testStepKey2"] != expected {
|
||||
t.Errorf("got: %v, expected: %v", d.Defaults[1].General["testStepKey2"], expected)
|
||||
}
|
||||
})
|
||||
})
|
||||
|
||||
t.Run("Read failure", func(t *testing.T) {
|
||||
var rc errReadCloser
|
||||
err := d.ReadPipelineDefaults([]io.ReadCloser{rc})
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Unmarshalling failure", func(t *testing.T) {
|
||||
myConfig := strings.NewReader("general:\n\ttestStepKey: testStepValue")
|
||||
err := d.ReadPipelineDefaults([]io.ReadCloser{ioutil.NopCloser(myConfig)})
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
}
|
18
pkg/config/errors.go
Normal file
@ -0,0 +1,18 @@
|
||||
package config
|
||||
|
||||
// ParseError defines an error type for configuration parsing errors
|
||||
type ParseError struct {
|
||||
message string
|
||||
}
|
||||
|
||||
// NewParseError creates a new ParseError
|
||||
func NewParseError(message string) *ParseError {
|
||||
return &ParseError{
|
||||
message: message,
|
||||
}
|
||||
}
|
||||
|
||||
// Error returns the message of the ParseError
|
||||
func (e *ParseError) Error() string {
|
||||
return e.message
|
||||
}
|
13
pkg/config/errors_test.go
Normal file
@ -0,0 +1,13 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestParseError(t *testing.T) {
|
||||
err := NewParseError("Parsing failed")
|
||||
|
||||
assert.Equal(t, "Parsing failed", err.Error())
|
||||
}
|
43
pkg/config/flags.go
Normal file
@ -0,0 +1,43 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
|
||||
"github.com/spf13/cobra"
|
||||
flag "github.com/spf13/pflag"
|
||||
)
|
||||
|
||||
// AvailableFlagValues returns all flags incl. values that have been set on the command.
|
||||
func AvailableFlagValues(cmd *cobra.Command, filters *StepFilters) map[string]interface{} {
|
||||
flagValues := map[string]interface{}{}
|
||||
flags := cmd.Flags()
|
||||
//only check flags where value has been set
|
||||
flags.Visit(func(pflag *flag.Flag) {
|
||||
|
||||
switch pflag.Value.Type() {
|
||||
case "string":
|
||||
flagValues[pflag.Name] = pflag.Value.String()
|
||||
case "stringSlice":
|
||||
flagValues[pflag.Name], _ = flags.GetStringSlice(pflag.Name)
|
||||
case "bool":
|
||||
flagValues[pflag.Name], _ = flags.GetBool(pflag.Name)
|
||||
default:
|
||||
fmt.Printf("Meta data type not set or not known: '%v'\n", pflag.Value.Type())
|
||||
os.Exit(1)
|
||||
}
|
||||
filters.Parameters = append(filters.Parameters, pflag.Name)
|
||||
})
|
||||
return flagValues
|
||||
}
|
||||
|
||||
// MarkFlagsWithValue marks a flag as changed if a value is available for the flag through a default or the step configuration.
|
||||
func MarkFlagsWithValue(cmd *cobra.Command, stepConfig StepConfig) {
|
||||
flags := cmd.Flags()
|
||||
flags.VisitAll(func(pflag *flag.Flag) {
|
||||
//mark as available in case default is available or config is available
|
||||
if len(pflag.Value.String()) > 0 || stepConfig.Config[pflag.Name] != nil {
|
||||
pflag.Changed = true
|
||||
}
|
||||
})
|
||||
}
|
67
pkg/config/flags_test.go
Normal file
@ -0,0 +1,67 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/spf13/cobra"
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestAvailableFlagValues(t *testing.T) {
|
||||
var f StepFilters
|
||||
|
||||
var test0 string
|
||||
var test1 string
|
||||
var test2 []string
|
||||
var test3 bool
|
||||
|
||||
var c = &cobra.Command{
|
||||
Use: "test",
|
||||
Short: "..",
|
||||
}
|
||||
|
||||
c.Flags().StringVar(&test0, "test0", "val0", "Test 0")
|
||||
c.Flags().StringVar(&test1, "test1", "", "Test 1")
|
||||
c.Flags().StringSliceVar(&test2, "test2", []string{}, "Test 2")
|
||||
c.Flags().BoolVar(&test3, "test3", false, "Test 3")
|
||||
|
||||
c.Flags().Set("test1", "val1")
|
||||
c.Flags().Set("test2", "val3_1")
|
||||
c.Flags().Set("test3", "true")
|
||||
|
||||
v := AvailableFlagValues(c, &f)
|
||||
|
||||
if v["test0"] != nil {
|
||||
t.Errorf("expected: 'test0' to be empty but was %v", v["test0"])
|
||||
}
|
||||
|
||||
assert.Equal(t, "val1", v["test1"])
|
||||
assert.Equal(t, []string{"val3_1"}, v["test2"])
|
||||
assert.Equal(t, true, v["test3"])
|
||||
|
||||
}
|
||||
|
||||
func TestMarkFlagsWithValue(t *testing.T) {
|
||||
var test0 string
|
||||
var test1 string
|
||||
var test2 string
|
||||
var c = &cobra.Command{
|
||||
Use: "test",
|
||||
Short: "..",
|
||||
}
|
||||
c.Flags().StringVar(&test0, "test0", "val0", "Test 0")
|
||||
c.Flags().StringVar(&test1, "test1", "", "Test 1")
|
||||
c.Flags().StringVar(&test2, "test2", "", "Test 2")
|
||||
|
||||
s := StepConfig{
|
||||
Config: map[string]interface{}{
|
||||
"test2": "val2",
|
||||
},
|
||||
}
|
||||
|
||||
MarkFlagsWithValue(c, s)
|
||||
|
||||
assert.Equal(t, true, c.Flags().Changed("test0"), "default not considered")
|
||||
assert.Equal(t, false, c.Flags().Changed("test1"), "no value: considered as set")
|
||||
assert.Equal(t, true, c.Flags().Changed("test2"), "config not considered")
|
||||
}
|
312
pkg/config/stepmeta.go
Normal file
@ -0,0 +1,312 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
|
||||
"github.com/ghodss/yaml"
|
||||
"github.com/pkg/errors"
|
||||
)
|
||||
|
||||
// StepData defines the complete step definition, consisting of the step metadata and its spec
|
||||
type StepData struct {
|
||||
Metadata StepMetadata `json:"metadata"`
|
||||
Spec StepSpec `json:"spec"`
|
||||
}
|
||||
|
||||
// StepMetadata defines the metadata for a step, like step descriptions, parameters, ...
|
||||
type StepMetadata struct {
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description"`
|
||||
LongDescription string `json:"longDescription,omitempty"`
|
||||
}
|
||||
|
||||
// StepSpec defines the spec details for a step, like step inputs, containers, sidecars, ...
|
||||
type StepSpec struct {
|
||||
Inputs StepInputs `json:"inputs"`
|
||||
// Outputs string `json:"description,omitempty"`
|
||||
Containers []Container `json:"containers,omitempty"`
|
||||
Sidecars []Container `json:"sidecars,omitempty"`
|
||||
}
|
||||
|
||||
// StepInputs defines the step inputs, like parameters, resources and secrets
|
||||
type StepInputs struct {
|
||||
Parameters []StepParameters `json:"params"`
|
||||
Resources []StepResources `json:"resources,omitempty"`
|
||||
Secrets []StepSecrets `json:"secrets,omitempty"`
|
||||
}
|
||||
|
||||
// StepParameters defines the parameters for a step
|
||||
type StepParameters struct {
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description"`
|
||||
LongDescription string `json:"longDescription,omitempty"`
|
||||
Scope []string `json:"scope"`
|
||||
Type string `json:"type"`
|
||||
Mandatory bool `json:"mandatory,omitempty"`
|
||||
Default interface{} `json:"default,omitempty"`
|
||||
Aliases []Alias `json:"aliases,omitempty"`
|
||||
Conditions []Condition `json:"conditions,omitempty"`
|
||||
}
|
||||
|
||||
// Alias defines a step input parameter alias
|
||||
type Alias struct {
|
||||
Name string `json:"name,omitempty"`
|
||||
Deprecated bool `json:"deprecated,omitempty"`
|
||||
}
|
||||
|
||||
// StepResources defines the resources to be provided by the step context, e.g. Jenkins pipeline
|
||||
type StepResources struct {
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description,omitempty"`
|
||||
Type string `json:"type,omitempty"`
|
||||
Conditions []Condition `json:"conditions,omitempty"`
|
||||
}
|
||||
|
||||
// StepSecrets defines the secrets to be provided by the step context, e.g. Jenkins pipeline
|
||||
type StepSecrets struct {
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description,omitempty"`
|
||||
Type string `json:"type,omitempty"`
|
||||
}
|
||||
|
||||
// StepOutputs defines the outputs of a step
|
||||
//type StepOutputs struct {
|
||||
// Name string `json:"name"`
|
||||
//}
|
||||
|
||||
// Container defines an execution container
|
||||
type Container struct {
|
||||
//ToDo: check dockerOptions, dockerVolumeBind, containerPortMappings, sidecarOptions, sidecarVolumeBind
|
||||
Command []string `json:"command"`
|
||||
EnvVars []EnvVar `json:"env"`
|
||||
Image string `json:"image"`
|
||||
ImagePullPolicy string `json:"imagePullPolicy"`
|
||||
Name string `json:"name"`
|
||||
ReadyCommand string `json:"readyCommand"`
|
||||
Shell string `json:"shell"`
|
||||
WorkingDir string `json:"workingDir"`
|
||||
Conditions []Condition `json:"conditions,omitempty"`
|
||||
}
|
||||
|
||||
// EnvVar defines an environment variable
|
||||
type EnvVar struct {
|
||||
Name string `json:"name"`
|
||||
Value string `json:"value"`
|
||||
}
|
||||
|
||||
// Condition defines a condition which decides when the parameter, resource or container is valid
|
||||
type Condition struct {
|
||||
ConditionRef string `json:"conditionRef"`
|
||||
Params []Param `json:"params"`
|
||||
}
|
||||
|
||||
// Param defines the parameters serving as inputs to the condition
|
||||
type Param struct {
|
||||
Name string `json:"name"`
|
||||
Value string `json:"value"`
|
||||
}
|
||||
|
||||
// StepFilters defines the filter parameters for the different sections
|
||||
type StepFilters struct {
|
||||
All []string
|
||||
General []string
|
||||
Stages []string
|
||||
Steps []string
|
||||
Parameters []string
|
||||
Env []string
|
||||
}
|
||||
|
||||
// ReadPipelineStepData loads the step definition in YAML format
|
||||
func (m *StepData) ReadPipelineStepData(metadata io.ReadCloser) error {
|
||||
defer metadata.Close()
|
||||
content, err := ioutil.ReadAll(metadata)
|
||||
if err != nil {
|
||||
return errors.Wrapf(err, "error reading %v", metadata)
|
||||
}
|
||||
|
||||
err = yaml.Unmarshal(content, &m)
|
||||
if err != nil {
|
||||
return errors.Wrapf(err, "error unmarshalling: %v", err)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// GetParameterFilters retrieves all scope dependent parameter filters
|
||||
func (m *StepData) GetParameterFilters() StepFilters {
|
||||
var filters StepFilters
|
||||
for _, param := range m.Spec.Inputs.Parameters {
|
||||
parameterKeys := []string{param.Name}
|
||||
for _, condition := range param.Conditions {
|
||||
for _, dependentParam := range condition.Params {
|
||||
parameterKeys = append(parameterKeys, dependentParam.Value)
|
||||
}
|
||||
}
|
||||
filters.All = append(filters.All, parameterKeys...)
|
||||
for _, scope := range param.Scope {
|
||||
switch scope {
|
||||
case "GENERAL":
|
||||
filters.General = append(filters.General, parameterKeys...)
|
||||
case "STEPS":
|
||||
filters.Steps = append(filters.Steps, parameterKeys...)
|
||||
case "STAGES":
|
||||
filters.Stages = append(filters.Stages, parameterKeys...)
|
||||
case "PARAMETERS":
|
||||
filters.Parameters = append(filters.Parameters, parameterKeys...)
|
||||
case "ENV":
|
||||
filters.Env = append(filters.Env, parameterKeys...)
|
||||
}
|
||||
}
|
||||
}
|
||||
return filters
|
||||
}
|
||||
|
||||
// GetContextParameterFilters retrieves all scope dependent parameter filters for the step context, i.e. secrets, containers and sidecars
|
||||
func (m *StepData) GetContextParameterFilters() StepFilters {
|
||||
var filters StepFilters
|
||||
for _, secret := range m.Spec.Inputs.Secrets {
|
||||
filters.All = append(filters.All, secret.Name)
|
||||
filters.General = append(filters.General, secret.Name)
|
||||
filters.Steps = append(filters.Steps, secret.Name)
|
||||
filters.Stages = append(filters.Stages, secret.Name)
|
||||
filters.Parameters = append(filters.Parameters, secret.Name)
|
||||
filters.Env = append(filters.Env, secret.Name)
|
||||
}
|
||||
|
||||
containerFilters := []string{}
|
||||
if len(m.Spec.Containers) > 0 {
|
||||
parameterKeys := []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace"}
|
||||
for _, container := range m.Spec.Containers {
|
||||
for _, condition := range container.Conditions {
|
||||
for _, dependentParam := range condition.Params {
|
||||
parameterKeys = append(parameterKeys, dependentParam.Value)
|
||||
}
|
||||
}
|
||||
}
|
||||
containerFilters = append(containerFilters, parameterKeys...)
|
||||
}
|
||||
if len(m.Spec.Sidecars) > 0 {
|
||||
//ToDo: support fallback for "dockerName" configuration property -> via aliasing?
|
||||
containerFilters = append(containerFilters, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}...)
|
||||
}
|
||||
if len(containerFilters) > 0 {
|
||||
filters.All = append(filters.All, containerFilters...)
|
||||
filters.Steps = append(filters.Steps, containerFilters...)
|
||||
filters.Stages = append(filters.Stages, containerFilters...)
|
||||
filters.Parameters = append(filters.Parameters, containerFilters...)
|
||||
}
|
||||
return filters
|
||||
}
|
||||
|
||||
// GetContextDefaults retrieves context defaults like container image, name, env vars, resources, ...
|
||||
// It only supports scenarios with one container and optionally one sidecar
|
||||
func (m *StepData) GetContextDefaults(stepName string) (io.ReadCloser, error) {
|
||||
|
||||
//ToDo error handling empty Containers/Sidecars
|
||||
//ToDo handle empty Command
|
||||
root := map[string]interface{}{}
|
||||
if len(m.Spec.Containers) > 0 {
|
||||
for _, container := range m.Spec.Containers {
|
||||
key := ""
|
||||
if len(container.Conditions) > 0 {
|
||||
key = container.Conditions[0].Params[0].Value
|
||||
}
|
||||
p := map[string]interface{}{}
|
||||
if key != "" {
|
||||
root[key] = p
|
||||
} else {
|
||||
p = root
|
||||
}
|
||||
if len(container.Command) > 0 {
|
||||
p["containerCommand"] = container.Command[0]
|
||||
}
|
||||
p["containerName"] = container.Name
|
||||
p["containerShell"] = container.Shell
|
||||
p["dockerEnvVars"] = envVarsAsStringSlice(container.EnvVars)
|
||||
p["dockerImage"] = container.Image
|
||||
p["dockerName"] = container.Name
|
||||
p["dockerPullImage"] = container.ImagePullPolicy != "Never"
|
||||
p["dockerWorkspace"] = container.WorkingDir
|
||||
|
||||
// Ready command not relevant for main runtime container so far
|
||||
//p[] = container.ReadyCommand
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
if len(m.Spec.Sidecars) > 0 {
|
||||
if len(m.Spec.Sidecars[0].Command) > 0 {
|
||||
root["sidecarCommand"] = m.Spec.Sidecars[0].Command[0]
|
||||
}
|
||||
root["sidecarEnvVars"] = envVarsAsStringSlice(m.Spec.Sidecars[0].EnvVars)
|
||||
root["sidecarImage"] = m.Spec.Sidecars[0].Image
|
||||
root["sidecarName"] = m.Spec.Sidecars[0].Name
|
||||
root["sidecarPullImage"] = m.Spec.Sidecars[0].ImagePullPolicy != "Never"
|
||||
root["sidecarReadyCommand"] = m.Spec.Sidecars[0].ReadyCommand
|
||||
root["sidecarWorkspace"] = m.Spec.Sidecars[0].WorkingDir
|
||||
}
|
||||
|
||||
// not filled for now since this is not relevant in Kubernetes case
|
||||
//p["dockerOptions"] = container.
|
||||
//p["dockerVolumeBind"] = container.
|
||||
//root["containerPortMappings"] = m.Spec.Sidecars[0].
|
||||
//root["sidecarOptions"] = m.Spec.Sidecars[0].
|
||||
//root["sidecarVolumeBind"] = m.Spec.Sidecars[0].
|
||||
|
||||
if len(m.Spec.Inputs.Resources) > 0 {
|
||||
keys := []string{}
|
||||
resources := map[string][]string{}
|
||||
for _, resource := range m.Spec.Inputs.Resources {
|
||||
if resource.Type == "stash" {
|
||||
key := ""
|
||||
if len(resource.Conditions) > 0 {
|
||||
key = resource.Conditions[0].Params[0].Value
|
||||
}
|
||||
if resources[key] == nil {
|
||||
keys = append(keys, key)
|
||||
resources[key] = []string{}
|
||||
}
|
||||
resources[key] = append(resources[key], resource.Name)
|
||||
}
|
||||
}
|
||||
|
||||
for _, key := range keys {
|
||||
if key == "" {
|
||||
root["stashContent"] = resources[""]
|
||||
} else {
|
||||
if root[key] == nil {
|
||||
root[key] = map[string]interface{}{
|
||||
"stashContent": resources[key],
|
||||
}
|
||||
} else {
|
||||
p := root[key].(map[string]interface{})
|
||||
p["stashContent"] = resources[key]
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
c := Config{
|
||||
Steps: map[string]map[string]interface{}{
|
||||
stepName: root,
|
||||
},
|
||||
}
|
||||
|
||||
JSON, err := yaml.Marshal(c)
|
||||
if err != nil {
|
||||
return nil, errors.Wrap(err, "failed to create context defaults")
|
||||
}
|
||||
|
||||
r := ioutil.NopCloser(bytes.NewReader(JSON))
|
||||
return r, nil
|
||||
}
|
||||
|
||||
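// envVarsAsStringSlice converts environment variable definitions into "name=value" strings.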
func envVarsAsStringSlice(envVars []EnvVar) []string {
|
||||
e := []string{}
|
||||
for _, v := range envVars {
|
||||
e = append(e, fmt.Sprintf("%v=%v", v.Name, v.Value))
|
||||
}
|
||||
return e
|
||||
}
|
424
pkg/config/stepmeta_test.go
Normal file
@ -0,0 +1,424 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestReadPipelineStepData(t *testing.T) {
|
||||
var s StepData
|
||||
|
||||
t.Run("Success case", func(t *testing.T) {
|
||||
myMeta := strings.NewReader("metadata:\n name: testIt\nspec:\n inputs:\n params:\n - name: testParamName\n secrets:\n - name: testSecret")
|
||||
err := s.ReadPipelineStepData(ioutil.NopCloser(myMeta)) // NopCloser "no-ops" the closing interface since strings do not need to be closed
|
||||
|
||||
if err != nil {
|
||||
t.Errorf("Got error although no error expected: %v", err)
|
||||
}
|
||||
|
||||
t.Run("step name", func(t *testing.T) {
|
||||
if s.Metadata.Name != "testIt" {
|
||||
t.Errorf("Meta name - got: %v, expected: %v", s.Metadata.Name, "testIt")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("param name", func(t *testing.T) {
|
||||
if s.Spec.Inputs.Parameters[0].Name != "testParamName" {
|
||||
t.Errorf("Step name - got: %v, expected: %v", s.Spec.Inputs.Parameters[0].Name, "testParamName")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("secret name", func(t *testing.T) {
|
||||
if s.Spec.Inputs.Secrets[0].Name != "testSecret" {
|
||||
t.Errorf("Step name - got: %v, expected: %v", s.Spec.Inputs.Secrets[0].Name, "testSecret")
|
||||
}
|
||||
})
|
||||
})
|
||||
|
||||
t.Run("Read failure", func(t *testing.T) {
|
||||
var rc errReadCloser
|
||||
err := s.ReadPipelineStepData(rc)
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("Unmarshalling failure", func(t *testing.T) {
|
||||
myMeta := strings.NewReader("metadata:\n\tname: testIt")
|
||||
err := s.ReadPipelineStepData(ioutil.NopCloser(myMeta))
|
||||
if err == nil {
|
||||
t.Errorf("Got no error although error expected.")
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
func TestGetParameterFilters(t *testing.T) {
|
||||
metadata1 := StepData{
|
||||
Spec: StepSpec{
|
||||
Inputs: StepInputs{
|
||||
Parameters: []StepParameters{
|
||||
{Name: "paramOne", Scope: []string{"GENERAL", "STEPS", "STAGES", "PARAMETERS", "ENV"}},
|
||||
{Name: "paramTwo", Scope: []string{"STEPS", "STAGES", "PARAMETERS", "ENV"}},
|
||||
{Name: "paramThree", Scope: []string{"STAGES", "PARAMETERS", "ENV"}},
|
||||
{Name: "paramFour", Scope: []string{"PARAMETERS", "ENV"}},
|
||||
{Name: "paramFive", Scope: []string{"ENV"}},
|
||||
{Name: "paramSix"},
|
||||
{Name: "paramSeven", Scope: []string{"GENERAL", "STEPS", "STAGES", "PARAMETERS"}, Conditions: []Condition{{Params: []Param{{Name: "buildTool", Value: "mta"}}}}},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
metadata2 := StepData{
|
||||
Spec: StepSpec{
|
||||
Inputs: StepInputs{
|
||||
Parameters: []StepParameters{
|
||||
{Name: "paramOne", Scope: []string{"GENERAL"}},
|
||||
{Name: "paramTwo", Scope: []string{"STEPS"}},
|
||||
{Name: "paramThree", Scope: []string{"STAGES"}},
|
||||
{Name: "paramFour", Scope: []string{"PARAMETERS"}},
|
||||
{Name: "paramFive", Scope: []string{"ENV"}},
|
||||
{Name: "paramSix"},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
metadata3 := StepData{
|
||||
Spec: StepSpec{
|
||||
Inputs: StepInputs{
|
||||
Parameters: []StepParameters{},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
testTable := []struct {
|
||||
Metadata StepData
|
||||
ExpectedAll []string
|
||||
ExpectedGeneral []string
|
||||
ExpectedStages []string
|
||||
ExpectedSteps []string
|
||||
ExpectedParameters []string
|
||||
ExpectedEnv []string
|
||||
NotExpectedAll []string
|
||||
NotExpectedGeneral []string
|
||||
NotExpectedStages []string
|
||||
NotExpectedSteps []string
|
||||
NotExpectedParameters []string
|
||||
NotExpectedEnv []string
|
||||
}{
|
||||
{
|
||||
Metadata: metadata1,
|
||||
ExpectedGeneral: []string{"paramOne", "paramSeven", "mta"},
|
||||
ExpectedSteps: []string{"paramOne", "paramTwo", "paramSeven", "mta"},
|
||||
ExpectedStages: []string{"paramOne", "paramTwo", "paramThree", "paramSeven", "mta"},
|
||||
ExpectedParameters: []string{"paramOne", "paramTwo", "paramThree", "paramFour", "paramSeven", "mta"},
|
||||
ExpectedEnv: []string{"paramOne", "paramTwo", "paramThree", "paramFour", "paramFive", "paramSeven", "mta"},
|
||||
ExpectedAll: []string{"paramOne", "paramTwo", "paramThree", "paramFour", "paramFive", "paramSix", "paramSeven", "mta"},
|
||||
NotExpectedGeneral: []string{"paramTwo", "paramThree", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedSteps: []string{"paramThree", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedStages: []string{"paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedParameters: []string{"paramFive", "paramSix"},
|
||||
NotExpectedEnv: []string{"paramSix", "mta"},
|
||||
NotExpectedAll: []string{},
|
||||
},
|
||||
{
|
||||
Metadata: metadata2,
|
||||
ExpectedGeneral: []string{"paramOne"},
|
||||
ExpectedSteps: []string{"paramTwo"},
|
||||
ExpectedStages: []string{"paramThree"},
|
||||
ExpectedParameters: []string{"paramFour"},
|
||||
ExpectedEnv: []string{"paramFive"},
|
||||
ExpectedAll: []string{"paramOne", "paramTwo", "paramThree", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedGeneral: []string{"paramTwo", "paramThree", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedSteps: []string{"paramOne", "paramThree", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedStages: []string{"paramOne", "paramTwo", "paramFour", "paramFive", "paramSix"},
|
||||
NotExpectedParameters: []string{"paramOne", "paramTwo", "paramThree", "paramFive", "paramSix"},
|
||||
NotExpectedEnv: []string{"paramOne", "paramTwo", "paramThree", "paramFour", "paramSix"},
|
||||
NotExpectedAll: []string{},
|
||||
},
|
||||
{
|
||||
Metadata: metadata3,
|
||||
ExpectedGeneral: []string{},
|
||||
ExpectedStages: []string{},
|
||||
ExpectedSteps: []string{},
|
||||
ExpectedParameters: []string{},
|
||||
ExpectedEnv: []string{},
|
||||
},
|
||||
}
|
||||
|
||||
for key, row := range testTable {
|
||||
t.Run(fmt.Sprintf("Metadata%v", key), func(t *testing.T) {
|
||||
filters := row.Metadata.GetParameterFilters()
|
||||
t.Run("General", func(t *testing.T) {
|
||||
for _, val := range filters.General {
|
||||
if !sliceContains(row.ExpectedGeneral, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.General)
|
||||
}
|
||||
if sliceContains(row.NotExpectedGeneral, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.General)
|
||||
}
|
||||
}
|
||||
})
|
||||
t.Run("Steps", func(t *testing.T) {
|
||||
for _, val := range filters.Steps {
|
||||
if !sliceContains(row.ExpectedSteps, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.Steps)
|
||||
}
|
||||
if sliceContains(row.NotExpectedSteps, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.Steps)
|
||||
}
|
||||
}
|
||||
})
|
||||
t.Run("Stages", func(t *testing.T) {
|
||||
for _, val := range filters.Stages {
|
||||
if !sliceContains(row.ExpectedStages, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.Stages)
|
||||
}
|
||||
if sliceContains(row.NotExpectedStages, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.Stages)
|
||||
}
|
||||
}
|
||||
})
|
||||
t.Run("Parameters", func(t *testing.T) {
|
||||
for _, val := range filters.Parameters {
|
||||
if !sliceContains(row.ExpectedParameters, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.Parameters)
|
||||
}
|
||||
if sliceContains(row.NotExpectedParameters, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.Parameters)
|
||||
}
|
||||
}
|
||||
})
|
||||
t.Run("Env", func(t *testing.T) {
|
||||
for _, val := range filters.Env {
|
||||
if !sliceContains(row.ExpectedEnv, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.Env)
|
||||
}
|
||||
if sliceContains(row.NotExpectedEnv, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.Env)
|
||||
}
|
||||
}
|
||||
})
|
||||
t.Run("All", func(t *testing.T) {
|
||||
for _, val := range filters.All {
|
||||
if !sliceContains(row.ExpectedAll, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v to be contained in %v", val, filters.All)
|
||||
}
|
||||
if sliceContains(row.NotExpectedAll, val) {
|
||||
t.Errorf("Creation of parameter filter failed, expected: %v NOT to be contained in %v", val, filters.All)
|
||||
}
|
||||
}
|
||||
})
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
func TestGetContextParameterFilters(t *testing.T) {
|
||||
metadata1 := StepData{
|
||||
Spec: StepSpec{
|
||||
Inputs: StepInputs{
|
||||
Secrets: []StepSecrets{
|
||||
{Name: "testSecret1", Type: "jenkins"},
|
||||
{Name: "testSecret2", Type: "jenkins"},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
metadata2 := StepData{
|
||||
Spec: StepSpec{
|
||||
Containers: []Container{
|
||||
{Name: "testcontainer"},
|
||||
{Conditions: []Condition{
|
||||
{Params: []Param{
|
||||
{Name: "scanType", Value: "pip"},
|
||||
}},
|
||||
}},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
metadata3 := StepData{
|
||||
Spec: StepSpec{
|
||||
Sidecars: []Container{
|
||||
{Name: "testsidecar"},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
t.Run("Secrets", func(t *testing.T) {
|
||||
filters := metadata1.GetContextParameterFilters()
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.All, "incorrect filter All")
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.General, "incorrect filter General")
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.Steps, "incorrect filter Steps")
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.Stages, "incorrect filter Stages")
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.Parameters, "incorrect filter Parameters")
|
||||
assert.Equal(t, []string{"testSecret1", "testSecret2"}, filters.Env, "incorrect filter Env")
|
||||
})
|
||||
|
||||
t.Run("Containers", func(t *testing.T) {
|
||||
filters := metadata2.GetContextParameterFilters()
|
||||
assert.Equal(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.All, "incorrect filter All")
|
||||
assert.NotEqual(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.General, "incorrect filter General")
|
||||
assert.Equal(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.Steps, "incorrect filter Steps")
|
||||
assert.Equal(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.Stages, "incorrect filter Stages")
|
||||
assert.Equal(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.Parameters, "incorrect filter Parameters")
|
||||
assert.NotEqual(t, []string{"containerCommand", "containerShell", "dockerEnvVars", "dockerImage", "dockerOptions", "dockerPullImage", "dockerVolumeBind", "dockerWorkspace", "pip"}, filters.Env, "incorrect filter Env")
|
||||
})
|
||||
|
||||
t.Run("Sidecars", func(t *testing.T) {
|
||||
filters := metadata3.GetContextParameterFilters()
|
||||
assert.Equal(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.All, "incorrect filter All")
|
||||
assert.NotEqual(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.General, "incorrect filter General")
|
||||
assert.Equal(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.Steps, "incorrect filter Steps")
|
||||
assert.Equal(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.Stages, "incorrect filter Stages")
|
||||
assert.Equal(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.Parameters, "incorrect filter Parameters")
|
||||
assert.NotEqual(t, []string{"containerName", "containerPortMappings", "dockerName", "sidecarEnvVars", "sidecarImage", "sidecarName", "sidecarOptions", "sidecarPullImage", "sidecarReadyCommand", "sidecarVolumeBind", "sidecarWorkspace"}, filters.Env, "incorrect filter Env")
|
||||
})
|
||||
}
|
||||
|
||||
func TestGetContextDefaults(t *testing.T) {
|
||||
|
||||
t.Run("Positive case", func(t *testing.T) {
|
||||
metadata := StepData{
|
||||
Spec: StepSpec{
|
||||
Inputs: StepInputs{
|
||||
Resources: []StepResources{
|
||||
{
|
||||
Name: "buildDescriptor",
|
||||
Type: "stash",
|
||||
Conditions: []Condition{
|
||||
{Params: []Param{
|
||||
{Name: "scanType", Value: "abc"},
|
||||
}},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "source",
|
||||
Type: "stash",
|
||||
Conditions: []Condition{
|
||||
{Params: []Param{
|
||||
{Name: "scanType", Value: "abc"},
|
||||
}},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "test",
|
||||
Type: "nonce",
|
||||
},
|
||||
{
|
||||
Name: "test2",
|
||||
Type: "stash",
|
||||
Conditions: []Condition{
|
||||
{Params: []Param{
|
||||
{Name: "scanType", Value: "def"},
|
||||
}},
|
||||
},
|
||||
},
|
||||
{
|
||||
Name: "test3",
|
||||
Type: "stash",
|
||||
},
|
||||
},
|
||||
},
|
||||
Containers: []Container{
|
||||
{
|
||||
Command: []string{"test/command"},
|
||||
EnvVars: []EnvVar{
|
||||
{Name: "env1", Value: "val1"},
|
||||
{Name: "env2", Value: "val2"},
|
||||
},
|
||||
Name: "testcontainer",
|
||||
Image: "testImage:tag",
|
||||
Shell: "/bin/bash",
|
||||
WorkingDir: "/test/dir",
|
||||
},
|
||||
},
|
||||
Sidecars: []Container{
|
||||
{
|
||||
Command: []string{"/sidecar/command"},
|
||||
EnvVars: []EnvVar{
|
||||
{Name: "env3", Value: "val3"},
|
||||
{Name: "env4", Value: "val4"},
|
||||
},
|
||||
Name: "testsidecar",
|
||||
Image: "testSidecarImage:tag",
|
||||
ImagePullPolicy: "Never",
|
||||
ReadyCommand: "/sidecar/command",
|
||||
WorkingDir: "/sidecar/dir",
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
cd, err := metadata.GetContextDefaults("testStep")
|
||||
|
||||
t.Run("No error", func(t *testing.T) {
|
||||
if err != nil {
|
||||
t.Errorf("No error expected but got error '%v'", err)
|
||||
}
|
||||
})
|
||||
|
||||
var d PipelineDefaults
|
||||
d.ReadPipelineDefaults([]io.ReadCloser{cd})
|
||||
|
||||
assert.Equal(t, []interface{}{"buildDescriptor", "source"}, d.Defaults[0].Steps["testStep"]["abc"].(map[string]interface{})["stashContent"], "stashContent default not available")
|
||||
assert.Equal(t, []interface{}{"test2"}, d.Defaults[0].Steps["testStep"]["def"].(map[string]interface{})["stashContent"], "stashContent default not available")
|
||||
assert.Equal(t, []interface{}{"test3"}, d.Defaults[0].Steps["testStep"]["stashContent"], "stashContent default not available")
|
||||
assert.Equal(t, "test/command", d.Defaults[0].Steps["testStep"]["containerCommand"], "containerCommand default not available")
|
||||
assert.Equal(t, "testcontainer", d.Defaults[0].Steps["testStep"]["containerName"], "containerName default not available")
|
||||
assert.Equal(t, "/bin/bash", d.Defaults[0].Steps["testStep"]["containerShell"], "containerShell default not available")
|
||||
assert.Equal(t, []interface{}{"env1=val1", "env2=val2"}, d.Defaults[0].Steps["testStep"]["dockerEnvVars"], "dockerEnvVars default not available")
|
||||
assert.Equal(t, "testImage:tag", d.Defaults[0].Steps["testStep"]["dockerImage"], "dockerImage default not available")
|
||||
assert.Equal(t, "testcontainer", d.Defaults[0].Steps["testStep"]["dockerName"], "dockerName default not available")
|
||||
assert.Equal(t, true, d.Defaults[0].Steps["testStep"]["dockerPullImage"], "dockerPullImage default not available")
|
||||
assert.Equal(t, "/test/dir", d.Defaults[0].Steps["testStep"]["dockerWorkspace"], "dockerWorkspace default not available")
|
||||
|
||||
assert.Equal(t, "/sidecar/command", d.Defaults[0].Steps["testStep"]["sidecarCommand"], "sidecarCommand default not available")
|
||||
assert.Equal(t, []interface{}{"env3=val3", "env4=val4"}, d.Defaults[0].Steps["testStep"]["sidecarEnvVars"], "sidecarEnvVars default not available")
|
||||
assert.Equal(t, "testSidecarImage:tag", d.Defaults[0].Steps["testStep"]["sidecarImage"], "sidecarImage default not available")
|
||||
assert.Equal(t, "testsidecar", d.Defaults[0].Steps["testStep"]["sidecarName"], "sidecarName default not available")
|
||||
assert.Equal(t, false, d.Defaults[0].Steps["testStep"]["sidecarPullImage"], "sidecarPullImage default not available")
|
||||
assert.Equal(t, "/sidecar/command", d.Defaults[0].Steps["testStep"]["sidecarReadyCommand"], "sidecarReadyCommand default not available")
|
||||
assert.Equal(t, "/sidecar/dir", d.Defaults[0].Steps["testStep"]["sidecarWorkspace"], "sidecarWorkspace default not available")
|
||||
})
|
||||
|
||||
t.Run("Negative case", func(t *testing.T) {
|
||||
metadataErr := []StepData{
|
||||
StepData{},
|
||||
StepData{
|
||||
Spec: StepSpec{},
|
||||
},
|
||||
StepData{
|
||||
Spec: StepSpec{
|
||||
Containers: []Container{},
|
||||
Sidecars: []Container{},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
t.Run("No containers/sidecars", func(t *testing.T) {
|
||||
cd, _ := metadataErr[0].GetContextDefaults("testStep")
|
||||
|
||||
var d PipelineDefaults
|
||||
d.ReadPipelineDefaults([]io.ReadCloser{cd})
|
||||
|
||||
//no assert since we just want to make sure that no panic occurs
|
||||
})
|
||||
|
||||
t.Run("No command", func(t *testing.T) {
|
||||
cd, _ := metadataErr[1].GetContextDefaults("testStep")
|
||||
|
||||
var d PipelineDefaults
|
||||
d.ReadPipelineDefaults([]io.ReadCloser{cd})
|
||||
|
||||
//no assert since we just want to make sure that no panic occurs
|
||||
})
|
||||
})
|
||||
}
|
308
pkg/generator/helper/helper.go
Normal file
@ -0,0 +1,308 @@
|
||||
package helper
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"text/template"
|
||||
|
||||
"github.com/SAP/jenkins-library/pkg/config"
|
||||
)
|
||||
|
||||
type stepInfo struct {
|
||||
CobraCmdFuncName string
|
||||
CreateCmdVar string
|
||||
ExportPrefix string
|
||||
FlagsFunc string
|
||||
Long string
|
||||
Metadata []config.StepParameters
|
||||
OSImport bool
|
||||
Short string
|
||||
StepFunc string
|
||||
StepName string
|
||||
}
|
||||
|
||||
// stepGoTemplate is the text template for the generated step command implementation
|
||||
const stepGoTemplate = `package cmd
|
||||
|
||||
import (
|
||||
{{if .OSImport}}"os"{{end}}
|
||||
|
||||
{{if .ExportPrefix}}{{ .ExportPrefix }} "github.com/SAP/jenkins-library/cmd"{{end}}
|
||||
"github.com/SAP/jenkins-library/pkg/config"
|
||||
"github.com/SAP/jenkins-library/pkg/log"
|
||||
"github.com/spf13/cobra"
|
||||
)
|
||||
|
||||
type {{ .StepName }}Options struct {
|
||||
{{- range $key, $value := .Metadata }}
|
||||
{{ $value.Name | golangName }} {{ $value.Type }} ` + "`json:\"{{$value.Name}},omitempty\"`" + `{{end}}
|
||||
}
|
||||
|
||||
var my{{ .StepName | title}}Options {{.StepName}}Options
|
||||
var {{ .StepName }}StepConfigJSON string
|
||||
|
||||
// {{.CobraCmdFuncName}} {{.Short}}
|
||||
func {{.CobraCmdFuncName}}() *cobra.Command {
|
||||
metadata := {{ .StepName }}Metadata()
|
||||
var {{.CreateCmdVar}} = &cobra.Command{
|
||||
Use: "{{.StepName}}",
|
||||
Short: "{{.Short}}",
|
||||
Long: {{ $tick := "` + "`" + `" }}{{ $tick }}{{.Long | longName }}{{ $tick }},
|
||||
PreRunE: func(cmd *cobra.Command, args []string) error {
|
||||
log.SetStepName("{{ .StepName }}")
|
||||
log.SetVerbose({{if .ExportPrefix}}{{ .ExportPrefix }}.{{end}}GeneralConfig.Verbose)
|
||||
return {{if .ExportPrefix}}{{ .ExportPrefix }}.{{end}}PrepareConfig(cmd, &metadata, "{{ .StepName }}", &my{{ .StepName | title}}Options, {{if .ExportPrefix}}{{ .ExportPrefix }}.{{end}}OpenPiperFile)
|
||||
},
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
return {{.StepName}}(my{{ .StepName | title }}Options)
|
||||
},
|
||||
}
|
||||
|
||||
{{.FlagsFunc}}({{.CreateCmdVar}})
|
||||
return {{.CreateCmdVar}}
|
||||
}
|
||||
|
||||
func {{.FlagsFunc}}(cmd *cobra.Command) {
|
||||
{{- range $key, $value := .Metadata }}
|
||||
cmd.Flags().{{ $value.Type | flagType }}(&my{{ $.StepName | title }}Options.{{ $value.Name | golangName }}, "{{ $value.Name }}", {{ $value.Default }}, "{{ $value.Description }}"){{ end }}
|
||||
{{- printf "\n" }}
|
||||
{{- range $key, $value := .Metadata }}{{ if $value.Mandatory }}
|
||||
cmd.MarkFlagRequired("{{ $value.Name }}"){{ end }}{{ end }}
|
||||
}
|
||||
|
||||
// retrieve step metadata
|
||||
func {{ .StepName }}Metadata() config.StepData {
|
||||
var theMetaData = config.StepData{
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{{- range $key, $value := .Metadata }}
|
||||
{
|
||||
Name: "{{ $value.Name }}",
|
||||
Scope: []string{{ "{" }}{{ range $notused, $scope := $value.Scope }}"{{ $scope }}",{{ end }}{{ "}" }},
|
||||
Type: "{{ $value.Type }}",
|
||||
Mandatory: {{ $value.Mandatory }},
|
||||
},{{ end }}
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
return theMetaData
|
||||
}
|
||||
`
|
||||
|
||||
// stepTestGoTemplate is the text template for the generated step command test
|
||||
const stepTestGoTemplate = `package cmd
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func Test{{.CobraCmdFuncName}}(t *testing.T) {
|
||||
|
||||
testCmd := {{.CobraCmdFuncName}}()
|
||||
|
||||
// only high level testing performed - details are tested in step generation procedure
|
||||
assert.Equal(t, "{{.StepName}}", testCmd.Use, "command name incorrect")
|
||||
|
||||
}
|
||||
`
|
||||
|
||||
// ProcessMetaFiles generates step code based on the step configuration provided in YAML files
|
||||
func ProcessMetaFiles(metadataFiles []string, openFile func(s string) (io.ReadCloser, error), writeFile func(filename string, data []byte, perm os.FileMode) error, exportPrefix string) error {
|
||||
for key := range metadataFiles {
|
||||
|
||||
var stepData config.StepData
|
||||
|
||||
configFilePath := metadataFiles[key]
|
||||
|
||||
metadataFile, err := openFile(configFilePath)
|
||||
checkError(err)
|
||||
defer metadataFile.Close()
|
||||
|
||||
fmt.Printf("Reading file %v\n", configFilePath)
|
||||
|
||||
err = stepData.ReadPipelineStepData(metadataFile)
|
||||
checkError(err)
|
||||
|
||||
fmt.Printf("Step name: %v\n", stepData.Metadata.Name)
|
||||
|
||||
osImport := false
|
||||
osImport, err = setDefaultParameters(&stepData)
|
||||
checkError(err)
|
||||
|
||||
myStepInfo := getStepInfo(&stepData, osImport, exportPrefix)
|
||||
|
||||
step := stepTemplate(myStepInfo)
|
||||
err = writeFile(fmt.Sprintf("cmd/%v_generated.go", stepData.Metadata.Name), step, 0644)
|
||||
checkError(err)
|
||||
|
||||
test := stepTestTemplate(myStepInfo)
|
||||
err = writeFile(fmt.Sprintf("cmd/%v_generated_test.go", stepData.Metadata.Name), test, 0644)
|
||||
checkError(err)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
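// openMetaFile opens a step metadata file from the file system for reading.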
func openMetaFile(name string) (io.ReadCloser, error) {
|
||||
return os.Open(name)
|
||||
}
|
||||
|
||||
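// fileWriter writes the given data to a file with the provided permissions.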
func fileWriter(filename string, data []byte, perm os.FileMode) error {
|
||||
return ioutil.WriteFile(filename, data, perm)
|
||||
}
|
||||
|
||||
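// setDefaultParameters fills in Go code snippets as parameter defaults for the template and reports whether the "os" import is required.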
func setDefaultParameters(stepData *config.StepData) (bool, error) {
|
||||
//ToDo: custom function for default handling, support all relevant parameter types
|
||||
osImportRequired := false
|
||||
for k, param := range stepData.Spec.Inputs.Parameters {
|
||||
|
||||
if param.Default == nil {
|
||||
switch param.Type {
|
||||
case "string":
|
||||
param.Default = fmt.Sprintf("os.Getenv(\"PIPER_%v\")", param.Name)
|
||||
osImportRequired = true
|
||||
case "bool":
|
||||
// ToDo: Check if default should be read from env
|
||||
param.Default = "false"
|
||||
case "[]string":
|
||||
// ToDo: Check if default should be read from env
|
||||
param.Default = "[]string{}"
|
||||
default:
|
||||
return false, fmt.Errorf("Meta data type not set or not known: '%v'", param.Type)
|
||||
}
|
||||
} else {
|
||||
switch param.Type {
|
||||
case "string":
|
||||
param.Default = fmt.Sprintf("\"%v\"", param.Default)
|
||||
case "bool":
|
||||
boolVal := "false"
|
||||
if param.Default.(bool) {
|
||||
boolVal = "true"
|
||||
}
|
||||
param.Default = boolVal
|
||||
case "[]string":
|
||||
param.Default = fmt.Sprintf("[]string{\"%v\"}", strings.Join(param.Default.([]string), "\", \""))
|
||||
default:
|
||||
return false, fmt.Errorf("Meta data type not set or not known: '%v'", param.Type)
|
||||
}
|
||||
}
|
||||
|
||||
stepData.Spec.Inputs.Parameters[k] = param
|
||||
}
|
||||
return osImportRequired, nil
|
||||
}
|
||||
|
||||
func getStepInfo(stepData *config.StepData, osImport bool, exportPrefix string) stepInfo {
|
||||
return stepInfo{
|
||||
StepName: stepData.Metadata.Name,
|
||||
CobraCmdFuncName: fmt.Sprintf("%vCommand", strings.Title(stepData.Metadata.Name)),
|
||||
CreateCmdVar: fmt.Sprintf("create%vCmd", strings.Title(stepData.Metadata.Name)),
|
||||
Short: stepData.Metadata.Description,
|
||||
Long: stepData.Metadata.LongDescription,
|
||||
Metadata: stepData.Spec.Inputs.Parameters,
|
||||
FlagsFunc: fmt.Sprintf("add%vFlags", strings.Title(stepData.Metadata.Name)),
|
||||
OSImport: osImport,
|
||||
ExportPrefix: exportPrefix,
|
||||
}
|
||||
}
|
||||
|
||||
func checkError(err error) {
|
||||
if err != nil {
|
||||
fmt.Printf("Error occured: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
}
|
||||
|
||||
// MetadataFiles provides a list of all step metadata files
|
||||
func MetadataFiles(sourceDirectory string) ([]string, error) {
|
||||
|
||||
var metadataFiles []string
|
||||
|
||||
err := filepath.Walk(sourceDirectory, func(path string, info os.FileInfo, err error) error {
|
||||
if filepath.Ext(path) == ".yaml" {
|
||||
metadataFiles = append(metadataFiles, path)
|
||||
}
|
||||
return nil
|
||||
})
|
||||
if err != nil {
|
||||
return metadataFiles, err
|
||||
}
|
||||
return metadataFiles, nil
|
||||
}
|
||||
|
||||
func stepTemplate(myStepInfo stepInfo) []byte {
|
||||
|
||||
funcMap := template.FuncMap{
|
||||
"flagType": flagType,
|
||||
"golangName": golangName,
|
||||
"title": strings.Title,
|
||||
"longName": longName,
|
||||
}
|
||||
|
||||
tmpl, err := template.New("step").Funcs(funcMap).Parse(stepGoTemplate)
|
||||
checkError(err)
|
||||
|
||||
var generatedCode bytes.Buffer
|
||||
err = tmpl.Execute(&generatedCode, myStepInfo)
|
||||
checkError(err)
|
||||
|
||||
return generatedCode.Bytes()
|
||||
}
|
||||
|
||||
func stepTestTemplate(myStepInfo stepInfo) []byte {
|
||||
|
||||
funcMap := template.FuncMap{
|
||||
"flagType": flagType,
|
||||
"golangName": golangName,
|
||||
"title": strings.Title,
|
||||
}
|
||||
|
||||
tmpl, err := template.New("stepTest").Funcs(funcMap).Parse(stepTestGoTemplate)
|
||||
checkError(err)
|
||||
|
||||
var generatedCode bytes.Buffer
|
||||
err = tmpl.Execute(&generatedCode, myStepInfo)
|
||||
checkError(err)
|
||||
|
||||
return generatedCode.Bytes()
|
||||
}
|
||||
|
||||
func longName(long string) string {
|
||||
l := strings.ReplaceAll(long, "`", "` + \"`\" + `")
|
||||
l = strings.TrimSpace(l)
|
||||
return l
|
||||
}
|
||||
|
||||
func golangName(name string) string {
|
||||
properName := strings.Replace(name, "Api", "API", -1)
|
||||
properName = strings.Replace(properName, "api", "API", -1)
|
||||
properName = strings.Replace(properName, "Url", "URL", -1)
|
||||
properName = strings.Replace(properName, "Id", "ID", -1)
|
||||
properName = strings.Replace(properName, "Json", "JSON", -1)
|
||||
properName = strings.Replace(properName, "json", "JSON", -1)
|
||||
return strings.Title(properName)
|
||||
}
|
||||
|
||||
func flagType(paramType string) string {
|
||||
var theFlagType string
|
||||
switch paramType {
|
||||
case "bool":
|
||||
theFlagType = "BoolVar"
|
||||
case "string":
|
||||
theFlagType = "StringVar"
|
||||
case "[]string":
|
||||
theFlagType = "StringSliceVar"
|
||||
default:
|
||||
fmt.Printf("Meta data type not set or not known: '%v'\n", paramType)
|
||||
os.Exit(1)
|
||||
}
|
||||
return theFlagType
|
||||
}
|
230
pkg/generator/helper/helper_test.go
Normal file
@ -0,0 +1,230 @@
|
||||
package helper
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
"github.com/SAP/jenkins-library/pkg/config"
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func configOpenFileMock(name string) (io.ReadCloser, error) {
|
||||
meta1 := `metadata:
|
||||
name: testStep
|
||||
description: Test description
|
||||
longDescription: |
|
||||
Long Test description
|
||||
spec:
|
||||
inputs:
|
||||
params:
|
||||
- name: param0
|
||||
type: string
|
||||
description: param0 description
|
||||
default: val0
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
mandatory: true
|
||||
- name: param1
|
||||
type: string
|
||||
description: param1 description
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- name: param2
|
||||
type: string
|
||||
description: param1 description
|
||||
scope:
|
||||
- PARAMETERS
|
||||
mandatory: true
|
||||
`
|
||||
var r string
|
||||
switch name {
|
||||
case "test.yaml":
|
||||
r = meta1
|
||||
default:
|
||||
r = ""
|
||||
}
|
||||
return ioutil.NopCloser(strings.NewReader(r)), nil
|
||||
}
|
||||
|
||||
var files map[string][]byte
|
||||
|
||||
func writeFileMock(filename string, data []byte, perm os.FileMode) error {
|
||||
if files == nil {
|
||||
files = make(map[string][]byte)
|
||||
}
|
||||
files[filename] = data
|
||||
return nil
|
||||
}
|
||||
|
||||
func TestProcessMetaFiles(t *testing.T) {
|
||||
|
||||
ProcessMetaFiles([]string{"test.yaml"}, configOpenFileMock, writeFileMock, "")
|
||||
|
||||
t.Run("step code", func(t *testing.T) {
|
||||
goldenFilePath := filepath.Join("testdata", t.Name()+"_generated.golden")
|
||||
expected, err := ioutil.ReadFile(goldenFilePath)
|
||||
if err != nil {
|
||||
t.Fatalf("failed reading %v", goldenFilePath)
|
||||
}
|
||||
assert.Equal(t, expected, files["cmd/testStep_generated.go"])
|
||||
})
|
||||
|
||||
t.Run("test code", func(t *testing.T) {
|
||||
goldenFilePath := filepath.Join("testdata", t.Name()+"_generated.golden")
|
||||
expected, err := ioutil.ReadFile(goldenFilePath)
|
||||
if err != nil {
|
||||
t.Fatalf("failed reading %v", goldenFilePath)
|
||||
}
|
||||
assert.Equal(t, expected, files["cmd/testStep_generated_test.go"])
|
||||
})
|
||||
}
|
||||
|
||||
func TestSetDefaultParameters(t *testing.T) {
|
||||
t.Run("success case", func(t *testing.T) {
|
||||
stepData := config.StepData{
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{Name: "param0", Scope: []string{"GENERAL"}, Type: "string", Default: "val0"},
|
||||
{Name: "param1", Scope: []string{"STEPS"}, Type: "string"},
|
||||
{Name: "param2", Scope: []string{"STAGES"}, Type: "bool", Default: true},
|
||||
{Name: "param3", Scope: []string{"PARAMETERS"}, Type: "bool"},
|
||||
{Name: "param4", Scope: []string{"ENV"}, Type: "[]string", Default: []string{"val4_1", "val4_2"}},
|
||||
{Name: "param5", Scope: []string{"ENV"}, Type: "[]string"},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
expected := []string{
|
||||
"\"val0\"",
|
||||
"os.Getenv(\"PIPER_param1\")",
|
||||
"true",
|
||||
"false",
|
||||
"[]string{\"val4_1\", \"val4_2\"}",
|
||||
"[]string{}",
|
||||
}
|
||||
|
||||
osImport, err := setDefaultParameters(&stepData)
|
||||
|
||||
assert.NoError(t, err, "error occured but none expected")
|
||||
|
||||
assert.Equal(t, true, osImport, "import of os package required")
|
||||
|
||||
for k, v := range expected {
|
||||
assert.Equal(t, v, stepData.Spec.Inputs.Parameters[k].Default, fmt.Sprintf("default not correct for parameter %v", k))
|
||||
}
|
||||
})
|
||||
|
||||
t.Run("error case", func(t *testing.T) {
|
||||
stepData := []config.StepData{
|
||||
{
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{Name: "param0", Scope: []string{"GENERAL"}, Type: "int", Default: 10},
|
||||
{Name: "param1", Scope: []string{"GENERAL"}, Type: "int"},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{Name: "param1", Scope: []string{"GENERAL"}, Type: "int"},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
for k, v := range stepData {
|
||||
_, err := setDefaultParameters(&v)
|
||||
assert.Error(t, err, fmt.Sprintf("error expected but none occured for parameter %v", k))
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
func TestGetStepInfo(t *testing.T) {
|
||||
|
||||
stepData := config.StepData{
|
||||
Metadata: config.StepMetadata{
|
||||
Name: "testStep",
|
||||
Description: "Test description",
|
||||
LongDescription: "Long Test description",
|
||||
},
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{Name: "param0", Scope: []string{"GENERAL"}, Type: "string", Default: "test"},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
myStepInfo := getStepInfo(&stepData, true, "")
|
||||
|
||||
assert.Equal(t, "testStep", myStepInfo.StepName, "StepName incorrect")
|
||||
assert.Equal(t, "TestStepCommand", myStepInfo.CobraCmdFuncName, "CobraCmdFuncName incorrect")
|
||||
assert.Equal(t, "createTestStepCmd", myStepInfo.CreateCmdVar, "CreateCmdVar incorrect")
|
||||
assert.Equal(t, "Test description", myStepInfo.Short, "Short incorrect")
|
||||
assert.Equal(t, "Long Test description", myStepInfo.Long, "Long incorrect")
|
||||
assert.Equal(t, stepData.Spec.Inputs.Parameters, myStepInfo.Metadata, "Metadata incorrect")
|
||||
assert.Equal(t, "addTestStepFlags", myStepInfo.FlagsFunc, "FlagsFunc incorrect")
|
||||
assert.Equal(t, "addTestStepFlags", myStepInfo.FlagsFunc, "FlagsFunc incorrect")
|
||||
|
||||
}
|
||||
|
||||
func TestLongName(t *testing.T) {
|
||||
tt := []struct {
|
||||
input string
|
||||
expected string
|
||||
}{
|
||||
{input: "my long name with no ticks", expected: "my long name with no ticks"},
|
||||
{input: "my long name with `ticks`", expected: "my long name with ` + \"`\" + `ticks` + \"`\" + `"},
|
||||
}
|
||||
|
||||
for k, v := range tt {
|
||||
assert.Equal(t, v.expected, longName(v.input), fmt.Sprintf("wrong long name for run %v", k))
|
||||
}
|
||||
}
|
||||
|
||||
func TestGolangName(t *testing.T) {
|
||||
tt := []struct {
|
||||
input string
|
||||
expected string
|
||||
}{
|
||||
{input: "testApi", expected: "TestAPI"},
|
||||
{input: "apiTest", expected: "APITest"},
|
||||
{input: "testUrl", expected: "TestURL"},
|
||||
{input: "testId", expected: "TestID"},
|
||||
{input: "testJson", expected: "TestJSON"},
|
||||
{input: "jsonTest", expected: "JSONTest"},
|
||||
}
|
||||
|
||||
for k, v := range tt {
|
||||
assert.Equal(t, v.expected, golangName(v.input), fmt.Sprintf("wrong golang name for run %v", k))
|
||||
}
|
||||
}
|
||||
|
||||
func TestFlagType(t *testing.T) {
|
||||
tt := []struct {
|
||||
input string
|
||||
expected string
|
||||
}{
|
||||
{input: "bool", expected: "BoolVar"},
|
||||
{input: "string", expected: "StringVar"},
|
||||
{input: "[]string", expected: "StringSliceVar"},
|
||||
}
|
||||
|
||||
for k, v := range tt {
|
||||
assert.Equal(t, v.expected, flagType(v.input), fmt.Sprintf("wrong flag type for run %v", k))
|
||||
}
|
||||
}
|
80
pkg/generator/helper/testdata/TestProcessMetaFiles/step_code_generated.golden
vendored
Normal file
@ -0,0 +1,80 @@
|
||||
package cmd
|
||||
|
||||
import (
|
||||
"os"
|
||||
|
||||
|
||||
"github.com/SAP/jenkins-library/pkg/config"
|
||||
"github.com/SAP/jenkins-library/pkg/log"
|
||||
"github.com/spf13/cobra"
|
||||
)
|
||||
|
||||
type testStepOptions struct {
|
||||
Param0 string `json:"param0,omitempty"`
|
||||
Param1 string `json:"param1,omitempty"`
|
||||
Param2 string `json:"param2,omitempty"`
|
||||
}
|
||||
|
||||
var myTestStepOptions testStepOptions
|
||||
var testStepStepConfigJSON string
|
||||
|
||||
// TestStepCommand Test description
|
||||
func TestStepCommand() *cobra.Command {
|
||||
metadata := testStepMetadata()
|
||||
var createTestStepCmd = &cobra.Command{
|
||||
Use: "testStep",
|
||||
Short: "Test description",
|
||||
Long: `Long Test description`,
|
||||
PreRunE: func(cmd *cobra.Command, args []string) error {
|
||||
log.SetStepName("testStep")
|
||||
log.SetVerbose(GeneralConfig.Verbose)
|
||||
return PrepareConfig(cmd, &metadata, "testStep", &myTestStepOptions, OpenPiperFile)
|
||||
},
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
return testStep(myTestStepOptions)
|
||||
},
|
||||
}
|
||||
|
||||
addTestStepFlags(createTestStepCmd)
|
||||
return createTestStepCmd
|
||||
}
|
||||
|
||||
func addTestStepFlags(cmd *cobra.Command) {
|
||||
cmd.Flags().StringVar(&myTestStepOptions.Param0, "param0", "val0", "param0 description")
|
||||
cmd.Flags().StringVar(&myTestStepOptions.Param1, "param1", os.Getenv("PIPER_param1"), "param1 description")
|
||||
cmd.Flags().StringVar(&myTestStepOptions.Param2, "param2", os.Getenv("PIPER_param2"), "param1 description")
|
||||
|
||||
cmd.MarkFlagRequired("param0")
|
||||
cmd.MarkFlagRequired("param2")
|
||||
}
|
||||
|
||||
// retrieve step metadata
|
||||
func testStepMetadata() config.StepData {
|
||||
var theMetaData = config.StepData{
|
||||
Spec: config.StepSpec{
|
||||
Inputs: config.StepInputs{
|
||||
Parameters: []config.StepParameters{
|
||||
{
|
||||
Name: "param0",
|
||||
Scope: []string{"GENERAL","PARAMETERS",},
|
||||
Type: "string",
|
||||
Mandatory: true,
|
||||
},
|
||||
{
|
||||
Name: "param1",
|
||||
Scope: []string{"PARAMETERS",},
|
||||
Type: "string",
|
||||
Mandatory: false,
|
||||
},
|
||||
{
|
||||
Name: "param2",
|
||||
Scope: []string{"PARAMETERS",},
|
||||
Type: "string",
|
||||
Mandatory: true,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
return theMetaData
|
||||
}
|
16
pkg/generator/helper/testdata/TestProcessMetaFiles/test_code_generated.golden
vendored
Normal file
@ -0,0 +1,16 @@
|
||||
package cmd
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestTestStepCommand(t *testing.T) {
|
||||
|
||||
testCmd := TestStepCommand()
|
||||
|
||||
// only high level testing performed - details are tested in step generation procedure
|
||||
assert.Equal(t, "testStep", testCmd.Use, "command name incorrect")
|
||||
|
||||
}
|
41
pkg/generator/step-metadata.go
Normal file
@ -0,0 +1,41 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"os/exec"
|
||||
|
||||
"github.com/SAP/jenkins-library/pkg/generator/helper"
|
||||
)
|
||||
|
||||
func main() {
|
||||
|
||||
metadataPath := "./resources/metadata"
|
||||
|
||||
metadataFiles, err := helper.MetadataFiles(metadataPath)
|
||||
checkError(err)
|
||||
|
||||
err = helper.ProcessMetaFiles(metadataFiles, openMetaFile, fileWriter, "")
|
||||
checkError(err)
|
||||
|
||||
cmd := exec.Command("go", "fmt", "./cmd")
|
||||
err = cmd.Run()
|
||||
checkError(err)
|
||||
|
||||
}
|
||||
func openMetaFile(name string) (io.ReadCloser, error) {
|
||||
return os.Open(name)
|
||||
}
|
||||
|
||||
func fileWriter(filename string, data []byte, perm os.FileMode) error {
|
||||
return ioutil.WriteFile(filename, data, perm)
|
||||
}
|
||||
|
||||
func checkError(err error) {
|
||||
if err != nil {
|
||||
fmt.Printf("Error occured: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
}
|
23
pkg/github/github.go
Normal file
@ -0,0 +1,23 @@
|
||||
package github
|
||||
|
||||
import (
|
||||
"context"
|
||||
|
||||
"github.com/google/go-github/v28/github"
|
||||
"golang.org/x/oauth2"
|
||||
)
|
||||
|
||||
//NewClient creates a new GitHub client using an OAuth token for authentication
|
||||
func NewClient(token, apiURL, uploadURL string) (context.Context, *github.Client, error) {
|
||||
ctx := context.Background()
|
||||
ts := oauth2.StaticTokenSource(
|
||||
&oauth2.Token{AccessToken: token},
|
||||
)
|
||||
tc := oauth2.NewClient(ctx, ts)
|
||||
|
||||
client, err := github.NewEnterpriseClient(apiURL, uploadURL, tc)
|
||||
if err != nil {
|
||||
return ctx, nil, err
|
||||
}
|
||||
return ctx, client, nil
|
||||
}
|
30
pkg/log/log.go
Normal file
@ -0,0 +1,30 @@
|
||||
package log
|
||||
|
||||
import (
|
||||
"github.com/sirupsen/logrus"
|
||||
)
|
||||
|
||||
// LibraryRepository that is passed in via -ldflags
|
||||
var LibraryRepository string
|
||||
var logger *logrus.Entry
|
||||
|
||||
// Entry returns the logger entry or creates one if none is present.
|
||||
func Entry() *logrus.Entry {
|
||||
if logger == nil {
|
||||
logger = logrus.WithField("library", LibraryRepository)
|
||||
}
|
||||
return logger
|
||||
}
|
||||
|
||||
// SetVerbose sets the log level with respect to the verbose flag.
|
||||
func SetVerbose(verbose bool) {
|
||||
if verbose {
|
||||
//Logger().Debugf("logging set to level: %s", level)
|
||||
logrus.SetLevel(logrus.DebugLevel)
|
||||
}
|
||||
}
|
||||
|
||||
// SetStepName sets the stepName field.
|
||||
func SetStepName(stepName string) {
|
||||
logger = Entry().WithField("stepName", stepName)
|
||||
}
|
14
pkg/piperutils/FileUtils.go
Normal file
@ -0,0 +1,14 @@
|
||||
package piperutils
|
||||
|
||||
import (
|
||||
"os"
|
||||
)
|
||||
|
||||
// FileExists ...
|
||||
func FileExists(filename string) bool {
|
||||
info, err := os.Stat(filename)
|
||||
if os.IsNotExist(err) {
|
||||
return false
|
||||
}
|
||||
return !info.IsDir()
|
||||
}
|
11
pom.xml
@ -12,7 +12,7 @@
|
||||
<modelVersion>4.0.0</modelVersion>
|
||||
<groupId>com.sap.cp.jenkins</groupId>
|
||||
<artifactId>jenkins-library</artifactId>
|
||||
<version>0.11</version>
|
||||
<version>${revision}</version>
|
||||
|
||||
<name>SAP CP Piper Library</name>
|
||||
<description>Shared library containing steps and utilities to set up continuous deployment processes for SAP technologies.</description>
|
||||
@ -40,6 +40,7 @@
|
||||
</pluginRepositories>
|
||||
|
||||
<properties>
|
||||
<revision>0-SNAPSHOT</revision>
|
||||
<findbugs.skip>true</findbugs.skip>
|
||||
<jenkins.version>2.32.3</jenkins.version>
|
||||
<pipeline.version>2.5</pipeline.version>
|
||||
@ -47,7 +48,6 @@
|
||||
<java.level>8</java.level>
|
||||
</properties>
|
||||
|
||||
|
||||
<dependencies>
|
||||
|
||||
<dependency>
|
||||
@ -139,6 +139,13 @@
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
|
||||
<dependency>
|
||||
<groupId>fr.opensagres.js</groupId>
|
||||
<artifactId>minimatch.java</artifactId>
|
||||
<version>1.1.0</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
|
||||
</dependencies>
|
||||
<build>
|
||||
<plugins>
|
||||
|
@ -38,7 +38,6 @@ general:
|
||||
githubServerUrl: 'https://github.com'
|
||||
gitSshKeyCredentialsId: '' #needed to allow sshagent to run with local ssh key
|
||||
jenkinsKubernetes:
|
||||
jnlpAgent: 's4sdk/jenkins-agent-k8s:latest'
|
||||
securityContext:
|
||||
# Setting security context globally is currently not working with jaas
|
||||
# runAsUser: 1000
|
||||
@ -157,9 +156,12 @@ steps:
|
||||
cloudFoundryDeploy:
|
||||
cloudFoundry:
|
||||
apiEndpoint: 'https://api.cf.eu10.hana.ondemand.com'
|
||||
apiParameters: ''
|
||||
loginParameters: ''
|
||||
deployTool: 'cf_native'
|
||||
deployType: 'standard'
|
||||
keepOldInstance: false
|
||||
cfNativeDeployParameters: ''
|
||||
mtaDeployParameters: '-f'
|
||||
mtaExtensionDescriptor: ''
|
||||
mtaPath: ''
|
||||
@ -169,10 +171,10 @@ steps:
|
||||
- 'deployDescriptor'
|
||||
- 'pipelineConfigAndTests'
|
||||
cf_native:
|
||||
dockerImage: 's4sdk/docker-cf-cli'
|
||||
dockerImage: 'ppiper/cf-cli'
|
||||
dockerWorkspace: '/home/piper'
|
||||
mtaDeployPlugin:
|
||||
dockerImage: 's4sdk/docker-cf-cli'
|
||||
dockerImage: 'ppiper/cf-cli'
|
||||
dockerWorkspace: '/home/piper'
|
||||
containerExecuteStructureTests:
|
||||
containerCommand: '/busybox/tail -f /dev/null'
|
||||
@ -183,6 +185,14 @@ steps:
|
||||
stashContent:
|
||||
- 'tests'
|
||||
testReportFilePath: 'cst-report.json'
|
||||
cloudFoundryCreateService:
|
||||
cloudFoundry:
|
||||
apiEndpoint: 'https://api.cf.eu10.hana.ondemand.com'
|
||||
serviceManifest: 'service-manifest.yml'
|
||||
dockerImage: 'ppiper/cf-cli'
|
||||
dockerWorkspace: '/home/piper'
|
||||
stashContent:
|
||||
- 'deployDescriptor'
|
||||
detectExecuteScan:
|
||||
detect:
|
||||
projectVersion: '1'
|
||||
@ -324,7 +334,7 @@ steps:
|
||||
mtaJarLocation: '/opt/sap/mta/lib/mta.jar'
|
||||
dockerImage: 'ppiper/mta-archive-builder'
|
||||
neoDeploy:
|
||||
dockerImage: 's4sdk/docker-neo-cli'
|
||||
dockerImage: 'ppiper/neo-cli'
|
||||
deployMode: 'mta'
|
||||
warAction: 'deploy'
|
||||
extensions: []
|
||||
@ -399,6 +409,13 @@ steps:
|
||||
&& mkdir -p \$GOPATH/src/${config.whitesource.projectName.substring(0, config.whitesource.projectName.lastIndexOf('/'))}
|
||||
&& ln -s \$(pwd) \$GOPATH/src/${config.whitesource.projectName}
|
||||
&& cd \$GOPATH/src/${config.whitesource.projectName} && dep ensure
|
||||
dub:
|
||||
buildDescriptorFile: './dub.json'
|
||||
dockerImage: 'buildpack-deps:stretch-curl'
|
||||
dockerWorkspace: '/home/dub'
|
||||
stashContent:
|
||||
- 'buildDescriptor'
|
||||
- 'checkmarx'
|
||||
sbt:
|
||||
buildDescriptorFile: './build.sbt'
|
||||
dockerImage: 'hseeberger/scala-sbt:8u181_2.12.8_1.2.8'
|
||||
@ -429,25 +446,31 @@ steps:
|
||||
noDefaultExludes: []
|
||||
pipelineStashFilesBeforeBuild:
|
||||
stashIncludes:
|
||||
buildDescriptor: '**/pom.xml, **/.mvn/**, **/assembly.xml, **/.swagger-codegen-ignore, **/package.json, **/requirements.txt, **/setup.py, **/mta*.y*ml, **/.npmrc, Dockerfile, .hadolint.yaml, **/VERSION, **/version.txt, **/Gopkg.*, **/dub.json, **/dub.sdl, **/build.sbt, **/sbtDescriptor.json, **/project/*'
|
||||
buildDescriptor: '**/pom.xml, **/.mvn/**, **/assembly.xml, **/.swagger-codegen-ignore, **/package.json, **/requirements.txt, **/setup.py, **/mta*.y*ml, **/.npmrc, Dockerfile, .hadolint.yaml, **/VERSION, **/version.txt, **/Gopkg.*, **/dub.json, **/dub.sdl, **/build.sbt, **/sbtDescriptor.json, **/project/*, **/ui5.yaml, **/ui5.yml'
|
||||
deployDescriptor: '**/manifest*.y*ml, **/*.mtaext.y*ml, **/*.mtaext, **/xs-app.json, helm/**, *.y*ml'
|
||||
git: '.git/**'
|
||||
opa5: '**/*.*'
|
||||
opensourceConfiguration: '**/srcclr.yml, **/vulas-custom.properties, **/.nsprc, **/.retireignore, **/.retireignore.json, **/.snyk, **/wss-unified-agent.config, **/vendor/**/*'
|
||||
pipelineConfigAndTests: '.pipeline/**'
|
||||
securityDescriptor: '**/xs-security.json'
|
||||
tests: '**/pom.xml, **/*.json, **/*.xml, **/src/**, **/node_modules/**, **/specs/**, **/env/**, **/*.js, **/tests/**'
|
||||
tests: '**/pom.xml, **/*.json, **/*.xml, **/src/**, **/node_modules/**, **/specs/**, **/env/**, **/*.js, **/tests/**, **/*.html, **/*.css, **/*.properties'
|
||||
stashExcludes:
|
||||
buildDescriptor: '**/node_modules/**/package.json'
|
||||
deployDescriptor: ''
|
||||
git: ''
|
||||
opa5: ''
|
||||
opensourceConfiguration: ''
|
||||
pipelineConfigAndTests: ''
|
||||
securityDescriptor: ''
|
||||
tests: ''
|
||||
noDefaultExludes:
|
||||
- 'git'
|
||||
piperPublishWarnings:
|
||||
parserId: piper
|
||||
parserName: Piper
|
||||
parserPattern: '\[(INFO|WARNING|ERROR)\] (.*) \(([^) ]*)\/([^) ]*)\)'
|
||||
parserScript: 'return builder.guessSeverity(matcher.group(1)).setMessage(matcher.group(2)).setModuleName(matcher.group(3)).setType(matcher.group(4)).buildOptional()'
|
||||
recordIssuesSettings:
|
||||
blameDisabled: true
|
||||
enabledForFailure: true
|
||||
seleniumExecuteTests:
|
||||
buildTool: 'npm'
|
||||
containerPortMappings:
|
||||
@ -491,8 +514,12 @@ steps:
|
||||
dockerImage: 'maven:3.5-jdk-8'
|
||||
instance: 'SonarCloud'
|
||||
options: []
|
||||
pullRequestProvider: 'github'
|
||||
sonarScannerDownloadUrl: 'https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-3.3.0.1492-linux.zip'
|
||||
pullRequestProvider: 'GitHub'
|
||||
sonarScannerDownloadUrl: 'https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.2.0.1873-linux.zip'
|
||||
spinnakerTriggerPipeline:
|
||||
certFileCredentialsId: 'spinnaker-client-certificate'
|
||||
keyFileCredentialsId: 'spinnaker-client-key'
|
||||
timeout: 60
|
||||
testsPublishResults:
|
||||
failOnError: false
|
||||
junit:
|
||||
@ -507,7 +534,7 @@ steps:
|
||||
archive: false
|
||||
active: false
|
||||
cobertura:
|
||||
pattern: '**/target/coverage/cobertura-coverage.xml'
|
||||
pattern: '**/target/coverage/**/cobertura-coverage.xml'
|
||||
onlyStableBuilds: true
|
||||
allowEmptyResults: true
|
||||
archive: false
|
||||
@ -532,6 +559,10 @@ steps:
|
||||
active: false
|
||||
checkChangeInDevelopment:
|
||||
failIfStatusIsNotInDevelopment: true
|
||||
tmsUpload:
|
||||
namedUser: 'Piper-Pipeline'
|
||||
stashContent:
|
||||
- 'buildResult'
|
||||
transportRequestCreate:
|
||||
developmentSystemId: null
|
||||
verbose: false
|
||||
@ -561,3 +592,14 @@ steps:
|
||||
nodeLabel: ''
|
||||
stashContent:
|
||||
- 'pipelineConfigAndTests'
|
||||
xsDeploy:
|
||||
credentialsId: 'XS'
|
||||
deployIdLogPattern: '^.*"xs bg-deploy -i (.*) -a .*".*$'
|
||||
loginOpts: ''
|
||||
deployOpts: ''
|
||||
docker:
|
||||
dockerImage: ''
|
||||
dockerPullImage: false
|
||||
mode: 'DEPLOY'
|
||||
action: 'NONE'
|
||||
xsSessionFile: '.xsconfig'
|
||||
|
154
resources/metadata/githubrelease.yaml
Normal file
@ -0,0 +1,154 @@
|
||||
metadata:
|
||||
name: githubPublishRelease
|
||||
description: Publish a release in GitHub
|
||||
longDescription: |
|
||||
This step creates a tag in your GitHub repository together with a release.
|
||||
The release can be filled with text plus additional information like:
|
||||
|
||||
* Closed pull request since last release
|
||||
* Closed issues since last release
|
||||
* Link to delta information showing all commits since last release
|
||||
|
||||
The result looks like
|
||||
|
||||

|
||||
spec:
|
||||
inputs:
|
||||
secrets:
|
||||
- name: githubTokenCredentialsId
|
||||
description: Jenkins 'Secret text' credentials ID containing token to authenticate to GitHub.
|
||||
type: jenkins
|
||||
params:
|
||||
- name: addClosedIssues
|
||||
description: 'If set to `true`, closed issues and merged pull-requests since the last release will be added below the `releaseBodyHeader`'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: bool
|
||||
default: false
|
||||
- name: addDeltaToLastRelease
|
||||
description: 'If set to `true`, a link will be added to the release information that brings up all commits since the last release.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: bool
|
||||
default: false
|
||||
- name: assetPath
|
||||
description: Path to a release asset which should be uploaded to the list of release assets.
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
- name: commitish
|
||||
description: 'Target git commitish for the release'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
default: "master"
|
||||
- name: excludeLabels
|
||||
description: 'Allows excluding issues with a dedicated list of labels.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: '[]string'
|
||||
- name: apiUrl
|
||||
aliases:
|
||||
- name: githubApiUrl
|
||||
description: Set the GitHub API url.
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
default: https://api.github.com
|
||||
mandatory: true
|
||||
- name: owner
|
||||
aliases:
|
||||
- name: githubOrg
|
||||
description: 'Set the GitHub organization.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
mandatory: true
|
||||
- name: repository
|
||||
aliases:
|
||||
- name: githubRepo
|
||||
description: 'Set the GitHub repository.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
mandatory: true
|
||||
- name: serverUrl
|
||||
aliases:
|
||||
- name: githubServerUrl
|
||||
description: 'GitHub server url for end-user access.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
default: https://github.com
|
||||
mandatory: true
|
||||
- name: token
|
||||
aliases:
|
||||
- name: githubToken
|
||||
description: 'GitHub personal access token as per https://help.github.com/en/github/authenticating-to-github/creating-a-personal-access-token-for-the-command-line'
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
mandatory: true
|
||||
- name: uploadUrl
|
||||
aliases:
|
||||
- name: githubUploadUrl
|
||||
description: Set the GitHub upload URL.
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
default: https://uploads.github.com
|
||||
mandatory: true
|
||||
- name: labels
|
||||
description: 'Labels to include in issue search.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: '[]string'
|
||||
- name: releaseBodyHeader
|
||||
description: Content which will appear for the release.
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
- name: updateAsset
|
||||
description: Specify if a release asset should be updated only.
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: bool
|
||||
- name: version
|
||||
description: 'Define the version number which will be written as tag as well as release name.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
type: string
|
||||
mandatory: true
|
73
resources/metadata/karma.yaml
Normal file
@ -0,0 +1,73 @@
|
||||
metadata:
|
||||
name: karmaExecuteTests
|
||||
description: Executes the Karma test runner
|
||||
longDescription: |
|
||||
In this step the [Karma test runner](http://karma-runner.github.io) is executed.
|
||||
|
||||
The step is using the `seleniumExecuteTest` step to spin up two containers in a Docker network:
|
||||
|
||||
* a Selenium/Chrome container (`selenium/standalone-chrome`)
|
||||
* a NodeJS container (`node:8-stretch`)
|
||||
|
||||
In the Docker network, the containers can be referenced by the values provided in `dockerName` and `sidecarName`; the default values are `karma` and `selenium`. These values must be used in the `hostname` properties of the test configuration ([Karma](https://karma-runner.github.io/1.0/config/configuration-file.html) and [WebDriver](https://github.com/karma-runner/karma-webdriver-launcher#usage)).
|
||||
|
||||
!!! note
|
||||
In a Kubernetes environment, the containers both need to be referenced with `localhost`.
|
||||
spec:
|
||||
inputs:
|
||||
resources:
|
||||
- name: buildDescriptor
|
||||
type: stash
|
||||
- name: tests
|
||||
type: stash
|
||||
params:
|
||||
- name: installCommand
|
||||
type: string
|
||||
description: The command that is executed to install the test tool.
|
||||
default: npm install --quiet
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
mandatory: true
|
||||
- name: modulePath
|
||||
type: string
|
||||
description: Define the path of the module to execute tests on.
|
||||
default: '.'
|
||||
scope:
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
mandatory: true
|
||||
- name: runCommand
|
||||
type: string
|
||||
description: The command that is executed to start the tests.
|
||||
default: npm run karma
|
||||
scope:
|
||||
- GENERAL
|
||||
- PARAMETERS
|
||||
- STAGES
|
||||
- STEPS
|
||||
mandatory: true
|
||||
#outputs:
|
||||
containers:
|
||||
- name: karma
|
||||
image: node:8-stretch
|
||||
env:
|
||||
- name: no_proxy
|
||||
value: localhost,selenium,$no_proxy
|
||||
- name: NO_PROXY
|
||||
value: localhost,selenium,$NO_PROXY
|
||||
workingDir: /home/node
|
||||
volumeMounts:
|
||||
- mountPath: /dev/shm
|
||||
name: dev-shm
|
||||
sidecars:
|
||||
- image: selenium/standalone-chrome
|
||||
name: selenium
|
||||
securityContext:
|
||||
privileged: true
|
||||
volumeMounts:
|
||||
- mountPath: /dev/shm
|
||||
name: dev-shm
|
5
resources/metadata/version.yaml
Normal file
@ -0,0 +1,5 @@
|
||||
metadata:
|
||||
name: version
|
||||
description: Returns the version of the piper binary
|
||||
longDescription: |
|
||||
Writes the commit hash and the tag (if any) to stdout and exits with 0.
|
156
src/com/sap/piper/CommonPipelineEnvironment.groovy
Normal file
@ -0,0 +1,156 @@
|
||||
package com.sap.piper;
|
||||
|
||||
import com.sap.piper.analytics.InfluxData
|
||||
|
||||
public class CommonPipelineEnvironment {
|
||||
|
||||
private static CommonPipelineEnvironment INSTANCE = new CommonPipelineEnvironment()
|
||||
|
||||
static CommonPipelineEnvironment getInstance() {
|
||||
INSTANCE
|
||||
}
|
||||
|
||||
Map defaultConfiguration = [:]
|
||||
|
||||
// The project config
|
||||
Map configuration = [:]
|
||||
|
||||
private Map valueMap = [:]
|
||||
|
||||
//stores properties for a pipeline which builds an artifact and then bundles it into a container
|
||||
private Map appContainerProperties = [:]
|
||||
|
||||
//stores version of the artifact which is built during the pipeline run
|
||||
def artifactVersion
|
||||
|
||||
//Stores the current buildResult
|
||||
String buildResult = 'SUCCESS'
|
||||
|
||||
//stores the gitCommitId as well as additional git information for the build during pipeline run
|
||||
String gitCommitId
|
||||
String gitCommitMessage
|
||||
String gitSshUrl
|
||||
String gitHttpsUrl
|
||||
String gitBranch
|
||||
|
||||
//GitHub specific information
|
||||
String githubOrg
|
||||
String githubRepo
|
||||
|
||||
String mtarFilePath
|
||||
|
||||
String changeDocumentId
|
||||
|
||||
String xsDeploymentId
|
||||
|
||||
void setValue(String property, value) {
|
||||
valueMap[property] = value
|
||||
}
|
||||
|
||||
def getValue(String property) {
|
||||
return valueMap.get(property)
|
||||
}
|
||||
|
||||
def setAppContainerProperty(property, value) {
|
||||
appContainerProperties[property] = value
|
||||
}
|
||||
|
||||
def getAppContainerProperty(property) {
|
||||
return appContainerProperties[property]
|
||||
}
|
||||
|
||||
// goes into measurement jenkins_custom_data
|
||||
def setInfluxCustomDataEntry(key, value) {
|
||||
InfluxData.addField('jenkins_custom_data', key, value)
|
||||
}
|
||||
// goes into measurement jenkins_custom_data
|
||||
@Deprecated // not used in library
|
||||
def getInfluxCustomData() {
|
||||
return InfluxData.getInstance().getFields().jenkins_custom_data
|
||||
}
|
||||
|
||||
// goes into measurement jenkins_custom_data
|
||||
def setInfluxCustomDataTagsEntry(key, value) {
|
||||
InfluxData.addTag('jenkins_custom_data', key, value)
|
||||
}
|
||||
// goes into measurement jenkins_custom_data
|
||||
@Deprecated // not used in library
|
||||
def getInfluxCustomDataTags() {
|
||||
return InfluxData.getInstance().getTags().jenkins_custom_data
|
||||
}
|
||||
|
||||
void setInfluxCustomDataMapEntry(measurement, field, value) {
|
||||
InfluxData.addField(measurement, field, value)
|
||||
}
|
||||
@Deprecated // not used in library
|
||||
def getInfluxCustomDataMap() {
|
||||
return InfluxData.getInstance().getFields()
|
||||
}
|
||||
|
||||
def setInfluxCustomDataMapTagsEntry(measurement, tag, value) {
|
||||
InfluxData.addTag(measurement, tag, value)
|
||||
}
|
||||
@Deprecated // not used in library
|
||||
def getInfluxCustomDataMapTags() {
|
||||
return InfluxData.getInstance().getTags()
|
||||
}
|
||||
|
||||
@Deprecated // not used in library
|
||||
def setInfluxStepData(key, value) {
|
||||
InfluxData.addField('step_data', key, value)
|
||||
}
|
||||
@Deprecated // not used in library
|
||||
def getInfluxStepData(key) {
|
||||
return InfluxData.getInstance().getFields()['step_data'][key]
|
||||
}
|
||||
|
||||
@Deprecated // not used in library
|
||||
def setInfluxPipelineData(key, value) {
|
||||
InfluxData.addField('pipeline_data', key, value)
|
||||
}
|
||||
@Deprecated // not used in library
|
||||
def setPipelineMeasurement(key, value){
|
||||
setInfluxPipelineData(key, value)
|
||||
}
|
||||
@Deprecated // not used in library
|
||||
def getPipelineMeasurement(key) {
|
||||
return InfluxData.getInstance().getFields()['pipeline_data'][key]
|
||||
}
|
||||
|
||||
def reset() {
|
||||
appContainerProperties = [:]
|
||||
configuration = [:]
|
||||
artifactVersion = null
|
||||
|
||||
gitCommitId = null
|
||||
gitCommitMessage = null
|
||||
gitSshUrl = null
|
||||
gitHttpsUrl = null
|
||||
gitBranch = null
|
||||
|
||||
githubOrg = null
|
||||
githubRepo = null
|
||||
|
||||
xsDeploymentId = null
|
||||
|
||||
mtarFilePath = null
|
||||
valueMap = [:]
|
||||
|
||||
changeDocumentId = null
|
||||
|
||||
InfluxData.reset()
|
||||
}
|
||||
|
||||
Map getStepConfiguration(stepName, stageName = env.STAGE_NAME, includeDefaults = true) {
|
||||
Map defaults = [:]
|
||||
if (includeDefaults) {
|
||||
defaults = DefaultValueCache.getInstance()?.getDefaultValues()?.general ?: [:]
|
||||
defaults = ConfigurationMerger.merge(ConfigurationLoader.defaultStepConfiguration([commonPipelineEnvironment: this], stepName), null, defaults)
|
||||
defaults = ConfigurationMerger.merge(ConfigurationLoader.defaultStageConfiguration([commonPipelineEnvironment: this], stageName), null, defaults)
|
||||
}
|
||||
Map config = ConfigurationMerger.merge(configuration.get('general') ?: [:], null, defaults)
|
||||
config = ConfigurationMerger.merge(configuration.get('steps')?.get(stepName) ?: [:], null, config)
|
||||
config = ConfigurationMerger.merge(configuration.get('stages')?.get(stageName) ?: [:], null, config)
|
||||
return config
|
||||
}
|
||||
}
|
@ -1,7 +1,5 @@
|
||||
package com.sap.piper
|
||||
|
||||
import com.cloudbees.groovy.cps.NonCPS
|
||||
|
||||
@API
|
||||
class ConfigurationHelper implements Serializable {
|
||||
|
||||
@ -22,6 +20,7 @@ class ConfigurationHelper implements Serializable {
|
||||
private Script step
|
||||
private String name
|
||||
private Map validationResults = null
|
||||
private String dependingOn
|
||||
|
||||
private ConfigurationHelper(Script step, Map config){
|
||||
this.config = config ?: [:]
|
||||
@ -35,22 +34,35 @@ class ConfigurationHelper implements Serializable {
|
||||
return this
|
||||
}
|
||||
|
||||
ConfigurationHelper mixinGeneralConfig(Set filter = null, Map compatibleParameters = [:]){
|
||||
mixinGeneralConfig(null, filter, compatibleParameters)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use mixinGeneralConfig without commonPipelineEnvironment*/
|
||||
ConfigurationHelper mixinGeneralConfig(commonPipelineEnvironment, Set filter = null, Map compatibleParameters = [:]){
|
||||
Map generalConfiguration = ConfigurationLoader.generalConfiguration([commonPipelineEnvironment: commonPipelineEnvironment])
|
||||
Map generalConfiguration = ConfigurationLoader.generalConfiguration()
|
||||
return mixin(generalConfiguration, filter, compatibleParameters)
|
||||
}
|
||||
|
||||
ConfigurationHelper mixinStageConfig(stageName, Set filter = null, Map compatibleParameters = [:]){
|
||||
mixinStageConfig(null, stageName, filter, compatibleParameters)
|
||||
}
|
||||
@Deprecated
|
||||
ConfigurationHelper mixinStageConfig(commonPipelineEnvironment, stageName, Set filter = null, Map compatibleParameters = [:]){
|
||||
Map stageConfiguration = ConfigurationLoader.stageConfiguration([commonPipelineEnvironment: commonPipelineEnvironment], stageName)
|
||||
Map stageConfiguration = ConfigurationLoader.stageConfiguration(stageName)
|
||||
return mixin(stageConfiguration, filter, compatibleParameters)
|
||||
}
|
||||
|
||||
ConfigurationHelper mixinStepConfig(Set filter = null, Map compatibleParameters = [:]){
|
||||
mixinStepConfig(null, filter, compatibleParameters)
|
||||
}
|
||||
@Deprecated
|
||||
ConfigurationHelper mixinStepConfig(commonPipelineEnvironment, Set filter = null, Map compatibleParameters = [:]){
|
||||
Map stepConfiguration = ConfigurationLoader.stepConfiguration([commonPipelineEnvironment: commonPipelineEnvironment], name)
|
||||
Map stepConfiguration = ConfigurationLoader.stepConfiguration(name)
|
||||
return mixin(stepConfiguration, filter, compatibleParameters)
|
||||
}
|
||||
|
||||
final ConfigurationHelper mixin(Map parameters, Set filter = null, Map compatibleParameters = [:]){
|
||||
ConfigurationHelper mixin(Map parameters, Set filter = null, Map compatibleParameters = [:]){
|
||||
if (parameters.size() > 0 && compatibleParameters.size() > 0) {
|
||||
parameters = ConfigurationMerger.merge(handleCompatibility(compatibleParameters, parameters), null, parameters)
|
||||
}
|
||||
@ -87,22 +99,25 @@ class ConfigurationHelper implements Serializable {
|
||||
return newConfig
|
||||
}
|
||||
|
||||
Map dependingOn(dependentKey){
|
||||
return [
|
||||
mixin: {key ->
|
||||
def parts = tokenizeKey(key)
|
||||
def targetMap = config
|
||||
if(parts.size() > 1) {
|
||||
key = parts.last()
|
||||
parts.remove(key)
|
||||
targetMap = getConfigPropertyNested(config, (parts as Iterable).join(SEPARATOR))
|
||||
}
|
||||
def dependentValue = config[dependentKey]
|
||||
if(targetMap[key] == null && dependentValue && config[dependentValue])
|
||||
targetMap[key] = config[dependentValue][key]
|
||||
return this
|
||||
}
|
||||
]
|
||||
ConfigurationHelper mixin(String key){
|
||||
def parts = tokenizeKey(key)
|
||||
def targetMap = config
|
||||
if(parts.size() > 1) {
|
||||
key = parts.last()
|
||||
parts.remove(key)
|
||||
targetMap = getConfigPropertyNested(config, parts.join(SEPARATOR))
|
||||
}
|
||||
def dependentValue = config[dependingOn]
|
||||
if(targetMap[key] == null && dependentValue && config[dependentValue])
|
||||
targetMap[key] = config[dependentValue][key]
|
||||
|
||||
dependingOn = null
|
||||
return this
|
||||
}
|
||||
|
||||
ConfigurationHelper dependingOn(dependentKey){
|
||||
dependingOn = dependentKey
|
||||
return this
|
||||
}
|
||||
|
||||
ConfigurationHelper addIfEmpty(key, value){
|
||||
@ -121,8 +136,6 @@ class ConfigurationHelper implements Serializable {
|
||||
return this
|
||||
}
|
||||
|
||||
@NonCPS // required because we have a closure in the
|
||||
// method body that cannot be CPS transformed
|
||||
Map use(){
|
||||
handleValidationFailures()
|
||||
MapUtils.traverse(config, { v -> (v instanceof GString) ? v.toString() : v })
|
||||
@ -130,8 +143,6 @@ class ConfigurationHelper implements Serializable {
|
||||
return MapUtils.deepCopy(config)
|
||||
}
|
||||
|
||||
|
||||
|
||||
/* private */ def getConfigPropertyNested(key) {
|
||||
return getConfigPropertyNested(config, key)
|
||||
}
|
||||
@ -143,7 +154,7 @@ class ConfigurationHelper implements Serializable {
|
||||
if (config[parts.head()] != null) {
|
||||
|
||||
if (config[parts.head()] in Map && !parts.tail().isEmpty()) {
|
||||
return getConfigPropertyNested(config[parts.head()], (parts.tail() as Iterable).join(SEPARATOR))
|
||||
return getConfigPropertyNested(config[parts.head()], parts.tail().join(SEPARATOR))
|
||||
}
|
||||
|
||||
if (config[parts.head()].class == String) {
|
||||
@ -193,15 +204,12 @@ class ConfigurationHelper implements Serializable {
|
||||
return this
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
private handleValidationFailures() {
|
||||
if(! validationResults) return
|
||||
if(validationResults.size() == 1) throw validationResults.values().first()
|
||||
String msg = 'ERROR - NO VALUE AVAILABLE FOR: ' +
|
||||
(validationResults.keySet().stream().collect() as Iterable).join(', ')
|
||||
String msg = 'ERROR - NO VALUE AVAILABLE FOR: ' + validationResults.keySet().join(', ')
|
||||
IllegalArgumentException iae = new IllegalArgumentException(msg)
|
||||
validationResults.each { e -> iae.addSuppressed(e.value) }
|
||||
throw iae
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -1,54 +1,83 @@
|
||||
package com.sap.piper
|
||||
|
||||
import com.cloudbees.groovy.cps.NonCPS
|
||||
// script is present in the signatures in order to keep api compatibility.
|
||||
// The script referenced is not used inside the method bodies.
|
||||
|
||||
@API(deprecated = true)
|
||||
class ConfigurationLoader implements Serializable {
|
||||
@NonCPS
|
||||
|
||||
static Map stepConfiguration(String stepName) {
|
||||
return loadConfiguration('steps', stepName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use stepConfiguration(stepName) instead */
|
||||
static Map stepConfiguration(script, String stepName) {
|
||||
return loadConfiguration(script, 'steps', stepName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
return stepConfiguration(stepName)
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map stageConfiguration(String stageName) {
|
||||
return loadConfiguration('stages', stageName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use stageConfiguration(stageName) instead */
|
||||
static Map stageConfiguration(script, String stageName) {
|
||||
return loadConfiguration(script, 'stages', stageName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
return stageConfiguration(stageName)
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map defaultStepConfiguration(String stepName) {
|
||||
return loadConfiguration('steps', stepName, ConfigurationType.DEFAULT_CONFIGURATION)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use defaultStepConfiguration(stepName) instead */
|
||||
static Map defaultStepConfiguration(script, String stepName) {
|
||||
return loadConfiguration(script, 'steps', stepName, ConfigurationType.DEFAULT_CONFIGURATION)
|
||||
return defaultStepConfiguration(stepName)
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map defaultStageConfiguration(String stageName) {
|
||||
return loadConfiguration('stages', stageName, ConfigurationType.DEFAULT_CONFIGURATION)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use defaultStageConfiguration(stepName) instead */
|
||||
static Map defaultStageConfiguration(script, String stageName) {
|
||||
return loadConfiguration(script, 'stages', stageName, ConfigurationType.DEFAULT_CONFIGURATION)
|
||||
return defaultStageConfiguration(stageName)
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map generalConfiguration(script){
|
||||
static Map generalConfiguration(){
|
||||
try {
|
||||
return script?.commonPipelineEnvironment?.configuration?.general ?: [:]
|
||||
return CommonPipelineEnvironment.getInstance()?.configuration?.general ?: [:]
|
||||
} catch (groovy.lang.MissingPropertyException mpe) {
|
||||
return [:]
|
||||
}
|
||||
}
|
||||
@Deprecated
|
||||
/** Use generalConfiguration() instead */
|
||||
static Map generalConfiguration(script){
|
||||
return generalConfiguration()
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map defaultGeneralConfiguration(script){
|
||||
static Map defaultGeneralConfiguration(){
|
||||
return DefaultValueCache.getInstance()?.getDefaultValues()?.general ?: [:]
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map postActionConfiguration(script, String actionName){
|
||||
return loadConfiguration(script, 'postActions', actionName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
@Deprecated
|
||||
/** Use defaultGeneralConfiguration() instead */
|
||||
static Map defaultGeneralConfiguration(script){
|
||||
return defaultGeneralConfiguration()
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
private static Map loadConfiguration(script, String type, String entryName, ConfigurationType configType){
|
||||
static Map postActionConfiguration(String actionName){
|
||||
return loadConfiguration('postActions', actionName, ConfigurationType.CUSTOM_CONFIGURATION)
|
||||
}
|
||||
@Deprecated
|
||||
/** Use postActionConfiguration() instead */
|
||||
static Map postActionConfiguration(script, String actionName){
|
||||
return postActionConfiguration(actionName)
|
||||
}
|
||||
|
||||
private static Map loadConfiguration(String type, String entryName, ConfigurationType configType){
|
||||
switch (configType) {
|
||||
case ConfigurationType.CUSTOM_CONFIGURATION:
|
||||
try {
|
||||
return script?.commonPipelineEnvironment?.configuration?.get(type)?.get(entryName) ?: [:]
|
||||
return CommonPipelineEnvironment.getInstance()?.configuration?.get(type)?.get(entryName) ?: [:]
|
||||
} catch (groovy.lang.MissingPropertyException mpe) {
|
||||
return [:]
|
||||
}
|
||||
|
@ -1,10 +1,8 @@
|
||||
package com.sap.piper
|
||||
|
||||
import com.cloudbees.groovy.cps.NonCPS
|
||||
|
||||
@API(deprecated = true)
|
||||
class ConfigurationMerger {
|
||||
@NonCPS
|
||||
static Map merge(Map configs, Set configKeys, Map defaults) {
|
||||
Map filteredConfig = configKeys?configs.subMap(configKeys):configs
|
||||
|
||||
@ -12,7 +10,6 @@ class ConfigurationMerger {
|
||||
MapUtils.pruneNulls(filteredConfig))
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static Map merge(
|
||||
Map parameters, Set parameterKeys,
|
||||
Map configuration, Set configurationKeys,
|
||||
|
@ -2,8 +2,6 @@ package com.sap.piper
|
||||
|
||||
import com.sap.piper.MapUtils
|
||||
|
||||
import com.cloudbees.groovy.cps.NonCPS
|
||||
|
||||
@API
|
||||
class DefaultValueCache implements Serializable {
|
||||
private static DefaultValueCache instance
|
||||
@ -14,7 +12,6 @@ class DefaultValueCache implements Serializable {
|
||||
this.defaultValues = defaultValues
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
static getInstance(){
|
||||
return instance
|
||||
}
|
||||
@ -23,7 +20,6 @@ class DefaultValueCache implements Serializable {
|
||||
instance = new DefaultValueCache(defaultValues)
|
||||
}
|
||||
|
||||
@NonCPS
|
||||
Map getDefaultValues(){
|
||||
return defaultValues
|
||||
}
|
||||
|
Some files were not shown because too many files have changed in this diff