Passing JSON Variables in Azure Pipelines
Learn how to handle JSON variables in Azure DevOps pipelines, avoid escaping issues, and ensure seamless API integration with proper normalization techniques.
When working with Azure DevOps pipelines, there are situations where you need to use JSON as a variable, whether for dynamically passing configurations, triggering APIs, or embedding JSON data into scripts. A common use case is a pipeline that triggers an API requiring a JSON payload.
However, Azure DevOps treats all variables as plain strings, and when you attempt to pass JSON, it often results in malformed data due to improper escaping. This can break APIs or other components expecting valid JSON.
Why JSON Variables Fail in Azure Pipelines
- Azure DevOps pipeline variables are treated as plain strings.
- When attempting to pass JSON, double quotes (") and backslashes (\) in the JSON are often misinterpreted, leading to malformed data.
Example of the Issue
A pipeline variable is defined as:
{
  "source": {
    "type": "filesystem",
    "location": "test-location",
    "filePath": "data.csv"
  }
}
When used in a script, it might get transformed into:
{\\"source\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"test-location\\",\\"filePath\\":\\"data.csv\\"}}
This results in broken APIs or unexpected behavior.
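You can reproduce the symptom locally. This is a minimal sketch using a dummy value (not actual pipeline output) showing that jq rejects the escaped form:

# The escaped form is not valid JSON, so jq refuses to parse it
broken='{\\"source\\":{\\"type\\":\\"filesystem\\",\\"filePath\\":\\"data.csv\\"}}'
echo "$broken" | jq . || echo "jq could not parse the payload"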
Why It Doesn't Work
- Double-escaping: Azure DevOps escapes quotes and backslashes automatically when passing variables.
- API expectation mismatch: APIs expect JSON in a valid format, but the pipeline delivers an improperly escaped version.
To use JSON variables in Azure pipelines, the key is to normalize the JSON and escape it correctly. Here's how you can solve the problem step by step.
Step 1: Define a JSON Variable in the Azure Pipeline UI
In the pipeline UI, add a variable named json_data with the following value (dummy data):
{
  "source": {
    "type": "filesystem",
    "location": "test-location",
    "filePath": "data.csv"
  },
  "sink": {
    "type": "filesystem",
    "location": "output-location",
    "filePath": "processed-data.csv"
  }
}
When stored in Azure DevOps, this JSON will be automatically escaped, resulting in something like:
{\\"source\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"test-location\\",\\"filePath\\":\\"data.csv\\"},\\"sink\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"output-location\\",\\"filePath\\":\\"processed-data.csv\\"}}
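To confirm what your agent actually receives, you can add a throwaway debug step. This is a small sketch (not part of the final pipeline) that simply prints the substituted value:

steps:
  - script: |
      # Macro substitution happens before the script runs,
      # so this prints exactly what the agent injected
      echo 'Raw value: $(json_data)'
    displayName: 'Inspect raw json_data'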
Step 2: Process the Variable in Your YAML Script
Here’s how to normalize the escaped JSON in your YAML file:
trigger:
  branches:
    include:
      - main

jobs:
  - job: HandleJsonVariable
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - script: |
          # Step 1: Retrieve the raw JSON from the pipeline variable
          raw_json=$(echo "$(json_data)")

          # Step 2: Normalize the escaped JSON by replacing \\ with \
          normalized_json=$(echo "$raw_json" | sed 's/\\\\/\\/g')

          # Step 3: Debug normalized JSON
          echo "Normalized JSON:"
          echo "$normalized_json" | jq .

          # Step 4: Escape quotes for specific use cases (e.g., if embedding JSON in another JSON payload)
          escaped_json=$(echo "$normalized_json" | sed 's/"/\\"/g')

          # Debug escaped JSON
          echo "Escaped JSON:"
          echo "\"$escaped_json\""

          # Step 5: Construct a general JSON payload dynamically
          final_payload=$(cat <<EOF
          {
            "example_key": "example_value",
            "payload_data": "$escaped_json"
          }
          EOF
          )

          # Step 6: Debug the final payload
          echo "Final Payload:"
          echo "$final_payload" | jq .

          # Step 7: Simulate sending the payload to an external API (replace with actual API details if needed)
          echo "Simulating API Call..."
          response=$(curl -s -X POST https://example-api.com/endpoint \
            -H "Content-Type: application/json" \
            -d "$final_payload")

          # Step 8: Debug the API response
          echo "API Response:"
          echo "$response" | jq .
        displayName: 'Handle and Process JSON Variable'
Retrieve the Variable
Fetch the json_data variable using $(json_data) (a more defensive variant is sketched below).
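Macro substitution pastes the value directly into the script text, where bash may re-interpret quotes and backslashes. If that becomes a problem, one alternative (a sketch, not part of the pipeline above) is to map the variable into the step's environment so the shell never re-parses it:

- script: |
    # The value arrives via the environment, untouched by bash quoting
    raw_json="$JSON_DATA"
    echo "$raw_json"
  env:
    JSON_DATA: $(json_data)  # mapped once by the agent
  displayName: 'Read json_data via env mapping'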
Normalize the JSON
Replace \\ with \ to correct the double-escaping:
normalized_json=$(echo "$raw_json" | sed 's/\\\\/\\/g')
Escape Quotes (Optional)
If the API requires JSON as a string within another JSON, escape quotes:
escaped_json=$(echo "$normalized_json" | sed 's/"/\\"/g')
Build the Payload
Construct the JSON payload dynamically using a cat <<EOF block (a jq-based alternative is sketched below).
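If jq is available on the agent, the manual quote-escaping plus heredoc can be collapsed into one step. This sketch (an alternative to the original script, assuming normalized_json holds valid JSON) uses jq's --argjson flag and tostring filter to embed the document as a string safely:

# jq parses $normalized_json, then serializes it back into a JSON string,
# handling all quoting itself
final_payload=$(jq -n --argjson data "$normalized_json" \
  '{example_key: "example_value", payload_data: ($data | tostring)}')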
Debug Output
Use echo and jq to validate each transformation step.
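Beyond eyeballing the output, you can make the step fail fast on invalid JSON. This small sketch uses jq's empty filter, which produces no output and exits non-zero on a parse error:

# Abort the step early if normalization produced invalid JSON
if ! echo "$normalized_json" | jq empty > /dev/null 2>&1; then
  echo "ERROR: normalized_json is not valid JSON" >&2
  exit 1
fi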
Key Takeaways
- Azure variables are strings: JSON must be processed before use.
- Normalize and escape as needed: Properly escape quotes and backslashes.
- Debugging is essential: Use tools like jq to validate JSON at every step.
Passing JSON variables in Azure DevOps pipelines can be challenging, but with the right approach, it's manageable. By normalizing and escaping JSON, you can ensure your pipeline works seamlessly with APIs or other JSON-consuming components.