When working with complex CI/CD workflows, you'll often need to chain Jenkins jobs together while passing dynamic parameters between them. In a Jenkinsfile this is handled by the build step (provided by the Pipeline: Build Step plugin); the Parameterized Trigger Plugin offers the same capability for freestyle jobs, though neither side of this integration is especially well-documented.
Here's the basic structure for triggering a parameterized job from a Jenkinsfile:
build job: 'downstream-job-name',
    parameters: [
        string(name: 'PARAM1', value: "${env.BUILD_ID}"),
        booleanParam(name: 'DEPLOY_FLAG', value: true),
        text(name: 'RELEASE_NOTES', value: 'Sample release notes')
    ],
    propagate: false,
    wait: false
You can combine this with other Jenkins pipeline features for more sophisticated workflows:
stage('Trigger Downstream') {
steps {
script {
def dynamicValue = sh(returnStdout: true, script: 'git rev-parse --short HEAD').trim()
build job: 'deployment-job',
    parameters: [
        string(name: 'GIT_COMMIT', value: dynamicValue),
        booleanParam(name: 'RUN_TESTS', value: params.TEST_FLAG),
        // Choice parameters are supplied as plain strings when triggering:
        string(name: 'ENVIRONMENT', value: 'staging')
    ],
    propagate: true,
    wait: true
}
}
}
The build step supports several parameter symbols in pipelines:
- string for single-line text values
- booleanParam for true/false flags
- text for multiline content
- string again for choice parameters (you pass the chosen value, not the list of options)
Classic file-upload parameters are a notable gap: the build step cannot pass them to a downstream job, so a common workaround is to archive the file as an artifact and fetch it downstream.
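Taken together, a single build call exercising these types might look like the sketch below (the job name 'packaging-job' and all parameter names are illustrative, not part of any real setup):

```groovy
// Hypothetical downstream job; parameter names are placeholders.
build job: 'packaging-job',
    parameters: [
        string(name: 'VERSION', value: '1.4.2'),            // single-line text
        booleanParam(name: 'SIGN_ARTIFACTS', value: true),  // true/false flag
        text(name: 'CHANGELOG', value: "Fixed login bug\nUpdated docs"), // multiline content
        string(name: 'TARGET_ENV', value: 'staging')        // choice value, passed as a string
    ]
```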
When implementing parameterized triggers, consider these patterns:
try {
    build job: 'critical-deployment',
        parameters: [
            // trim() strips the trailing newline readFile typically returns
            string(name: 'VERSION', value: readFile('version.txt').trim())
        ],
        propagate: true,
        wait: true
} catch (err) {
    echo "Downstream job failed: ${err}"
    currentBuild.result = 'UNSTABLE'
}
If you need more control than the Parameterized Trigger Plugin offers, consider:
- Using the HTTP Request Plugin to call Jenkins REST API
- Writing custom shared libraries for complex logic
- Implementing webhooks with the Generic Webhook Trigger Plugin
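If you go the shared-library route, a thin wrapper around the build step keeps call sites tidy. A minimal sketch, assuming a library step named triggerDeploy (the step name, default job name, and parameters are all illustrative):

```groovy
// vars/triggerDeploy.groovy -- hypothetical shared-library step.
// Call sites then reduce to: triggerDeploy(env: 'staging')
def call(Map config = [:]) {
    def downstream = build(
        job: config.get('job', 'deployment-job'),  // placeholder default job name
        parameters: [
            string(name: 'ENVIRONMENT', value: config.get('env', 'dev')),
            booleanParam(name: 'RUN_TESTS', value: config.get('runTests', true))
        ],
        propagate: false,  // let the caller decide how to react to failures
        wait: true
    )
    return downstream.result  // e.g. 'SUCCESS', 'UNSTABLE', 'FAILURE'
}
```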
To put this together in a full declarative pipeline: the build step handles triggering downstream jobs with dynamic parameters, while the Parameterized Trigger Plugin (version 2.45+) provides equivalent chaining for freestyle jobs. The declarative syntax looks like this:
pipeline {
agent any
stages {
stage('Trigger Downstream') {
steps {
build job: 'downstream-job',
parameters: [
string(name: 'ENVIRONMENT', value: 'production'),
booleanParam(name: 'RUN_TESTS', value: true),
text(name: 'CONFIG_JSON', value: '{"timeout":300}')
],
propagate: false,
wait: false
}
}
}
}
The parameter types used here map to common cases:
- String parameters: For textual configuration values
- Boolean parameters: For toggle-based decisions
- Text parameters: For multi-line content like JSON/YAML
- Choice parameters: For pre-defined option sets
For complex scenarios with dynamic parameter generation:
script {
def dynamicParams = []
if (env.BRANCH_NAME == 'main') {
dynamicParams.add(string(name: 'DEPLOY_FLAG', value: 'true'))
}
build job: 'deployment-job',
parameters: dynamicParams,
quietPeriod: 5 // delays trigger by 5 seconds
}
Always implement proper error handling when triggering jobs:
steps {
script {
try {
build job: 'critical-job',
parameters: [...],
propagate: true
} catch (err) {
echo "Trigger failed: ${err}"
currentBuild.result = 'UNSTABLE'
}
}
}
When triggering multiple jobs:
- Use parallel steps for independent jobs
- Set wait: false for fire-and-forget triggers
- Limit parameter payload size (avoid large base64 blobs)
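For the parallel case, independent triggers can be wrapped in a scripted parallel block. A sketch with made-up region-specific job names:

```groovy
// Hypothetical jobs 'deploy-eu' and 'deploy-us', triggered concurrently.
parallel(
    'eu-deploy': {
        build job: 'deploy-eu',
            parameters: [string(name: 'REGION', value: 'eu-west-1')]
    },
    'us-deploy': {
        build job: 'deploy-us',
            parameters: [string(name: 'REGION', value: 'us-east-1')]
    }
)
```

Each branch waits for its downstream job by default; add wait: false to either call for a fire-and-forget trigger.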
If facing plugin limitations, consider:
// Using the HTTP API via curl
sh '''
curl -X POST "${JENKINS_URL}/job/downstream/buildWithParameters" \
    --user "user:apiToken" \
    --data "param1=value1&param2=value2"
'''