Infrastructure as Code Security (IaC Security)
IaC Security refers to security practices for Infrastructure as Code—Terraform, Pulumi, AWS CloudFormation, Bicep, and Ansible. Security flaws in IaC templates directly lead to insecure cloud infrastructure: public S3 buckets, overprivileged IAM roles, missing encryption, and exposed ports. IaC scanning tools such as Checkov, tfsec, and Trivy detect misconfigurations before deployment.
IaC Security closes a critical gap: When infrastructure is defined as code, security issues can be identified right in the code—before the insecure resource even exists. A Terraform template with acl = "public-read" for an S3 bucket can be detected and rejected in the CI/CD pipeline before a developer clicks "Apply."
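The detection idea can be sketched in a few lines of Python — a hypothetical toy gate, not a substitute for a real scanner such as Checkov or tfsec, which parse HCL properly instead of using a regex:

```python
import re

# Toy sketch: flag public S3 ACLs in raw Terraform source text so CI can
# reject the change before "terraform apply". Real scanners parse HCL.
PUBLIC_ACL = re.compile(r'acl\s*=\s*"(public-read|public-read-write)"')

def find_public_acls(tf_source: str) -> list:
    """Return offending lines so the pipeline can fail with context."""
    return [line.strip() for line in tf_source.splitlines() if PUBLIC_ACL.search(line)]

tf = '''
resource "aws_s3_bucket" "data" {
  bucket = "company-customer-data"
  acl    = "public-read"
}
'''
findings = find_public_acls(tf)
if findings:
    print("BLOCKED:", findings)
```

A CI job would run this over every .tf file and exit non-zero on findings; dedicated scanners do the same with hundreds of rules.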
Common IaC Security Mistakes
Most common misconfigurations in Terraform (AWS):
1. Public S3 bucket:
# WRONG:
resource "aws_s3_bucket" "data" {
  bucket = "company-customer-data"
  acl    = "public-read"  # CRITICAL: Anyone can read the data!
}

# CORRECT:
resource "aws_s3_bucket" "data" {
  bucket = "company-customer-data"
}

resource "aws_s3_bucket_public_access_block" "data" {
  bucket                  = aws_s3_bucket.data.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}
2. Overprivileged IAM role:
# WRONG:
resource "aws_iam_policy" "app_policy" {
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = "*"  # All actions!
      Resource = "*"  # All resources!
    }]
  })
}

# CORRECT (least privilege):
resource "aws_iam_policy" "app_policy" {
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["s3:GetObject", "s3:PutObject"]
      Resource = ["arn:aws:s3:::company-app-data/*"]
    }]
  })
}
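Wildcard detection is straightforward once the policy is rendered to JSON. A minimal sketch (mirroring what scanner rules for overprivileged IAM look for, not any tool's actual implementation):

```python
import json

# Sketch: flag IAM policy statements that allow all actions or all resources.
def wildcard_statements(policy_json: str) -> list:
    policy = json.loads(policy_json)
    flagged = []
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        resources = stmt.get("Resource", [])
        # AWS allows both a single string and a list; normalize to lists.
        if isinstance(actions, str):
            actions = [actions]
        if isinstance(resources, str):
            resources = [resources]
        if stmt.get("Effect") == "Allow" and ("*" in actions or "*" in resources):
            flagged.append(stmt)
    return flagged

bad = '{"Statement": [{"Effect": "Allow", "Action": "*", "Resource": "*"}]}'
good = ('{"Statement": [{"Effect": "Allow", "Action": ["s3:GetObject"], '
        '"Resource": ["arn:aws:s3:::company-app-data/*"]}]}')
print(len(wildcard_statements(bad)), len(wildcard_statements(good)))  # → 1 0
```

Note that a scoped ARN such as arn:aws:s3:::company-app-data/* contains an asterisk but is not the bare wildcard "*", so only the first policy is flagged.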
3. Missing Encryption (RDS):
# WRONG:
resource "aws_db_instance" "prod" {
  storage_encrypted = false  # Or omitted (default: false!)
}

# CORRECT:
resource "aws_db_instance" "prod" {
  storage_encrypted = true
  kms_key_id        = aws_kms_key.rds.arn
}
4. Security group too open:
# WRONG:
resource "aws_security_group_rule" "ssh" {
  type        = "ingress"
  from_port   = 22
  to_port     = 22
  protocol    = "tcp"
  cidr_blocks = ["0.0.0.0/0"]  # SSH open to the entire Internet!
}

# CORRECT:
resource "aws_security_group_rule" "ssh" {
  type                     = "ingress"
  from_port                = 22
  to_port                  = 22
  protocol                 = "tcp"
  source_security_group_id = aws_security_group.bastion.id  # Bastion only!
}
5. Missing VPC Flow Logs:
# WRONG: VPC without Flow Logs → no network audit trail

# CORRECT:
resource "aws_flow_log" "vpc" {
  iam_role_arn    = aws_iam_role.flow_logs.arn
  log_destination = aws_cloudwatch_log_group.vpc_logs.arn
  traffic_type    = "ALL"
  vpc_id          = aws_vpc.main.id
}
IaC Scanning Tools
Checkov (Bridgecrew / Palo Alto):
→ Open source, one of the most comprehensive rule sets
→ Supports: Terraform, CloudFormation, Kubernetes, Dockerfile, Bicep, ARM
→ 2500+ built-in checks
→ SARIF output for GitHub Security Tab
Installation:
pip install checkov
Terraform scan:
checkov -d ./terraform --framework terraform
# Or specific checks:
checkov -d ./terraform --check CKV_AWS_18,CKV_AWS_21
CI/CD Integration (GitHub Actions):
- name: Checkov IaC Scan
  uses: bridgecrewio/checkov-action@master
  with:
    directory: terraform/
    framework: terraform
    output_format: sarif
    output_file_path: results.sarif
    soft_fail: false  # Pipeline fails on any failed check!
Sample Output:
Check: CKV_AWS_18: "Ensure the S3 bucket has access logging enabled"
        FAILED for resource: aws_s3_bucket.data
        File: /terraform/s3.tf:1-5

Check: CKV_AWS_21: "Ensure all data stored in the S3 bucket have versioning enabled"
        PASSED for resource: aws_s3_bucket.backup
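The SARIF report that Checkov writes can also be consumed programmatically. A minimal gate sketch, assuming the standard SARIF 2.1.0 layout (runs → results, each with a ruleId and a level) — the report below is hand-made for illustration:

```python
import json

# Sketch: gate a pipeline on a SARIF report like the one Checkov emits.
def failed_checks(sarif_text: str, levels=("error",)) -> list:
    """Return ruleIds of results whose level should block the pipeline."""
    sarif = json.loads(sarif_text)
    rule_ids = []
    for run in sarif.get("runs", []):
        for result in run.get("results", []):
            if result.get("level", "warning") in levels:
                rule_ids.append(result.get("ruleId"))
    return rule_ids

# Minimal hand-made report (illustration only, not real Checkov output):
report = json.dumps({
    "version": "2.1.0",
    "runs": [{"results": [
        {"ruleId": "CKV_AWS_18", "level": "error"},
        {"ruleId": "CKV_AWS_21", "level": "note"},
    ]}]
})
blocking = failed_checks(report)
print(blocking)  # → ['CKV_AWS_18']
```

GitHub's Security tab does the same classification when the SARIF file is uploaded; a gate like this is useful for CI systems without native SARIF support.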
tfsec (Aqua Security):
→ Terraform-specific, very fast
→ SARIF + JUnit output
→ Custom checks defined in YAML/JSON
Scan:
tfsec ./terraform
tfsec ./terraform --format sarif --out tfsec.sarif
tfsec ./terraform --minimum-severity HIGH
GitHub Actions:
- name: tfsec
  uses: aquasecurity/tfsec-action@v1.0.0
  with:
    working_directory: terraform/
    github_token: ${{ secrets.GITHUB_TOKEN }}
Trivy (Aqua Security - comprehensive):
→ IaC + Container + SCA + Secrets
→ One tool for all scan types
Scan Terraform:
trivy config ./terraform
trivy config --severity HIGH,CRITICAL ./terraform
trivy config --format sarif --output trivy.sarif ./terraform
Kubernetes Manifests:
trivy config ./k8s/
# Checks: privileged containers, root user, capabilities, etc.
Dockerfile:
trivy config ./Dockerfile
KICS (Checkmarx - Open Source):
→ Supports 24+ IaC languages
→ Very broad coverage
kics scan -p ./infrastructure -o results.json
Secrets in IaC
CRITICAL: Never include secrets in IaC code!
Often accidentally checked in:
# NEVER:
resource "aws_db_instance" "prod" {
  password = "SuperSecret123!"  # In the Git repository!
}

resource "aws_secretsmanager_secret_version" "db" {
  secret_string = jsonencode({
    password = "MyPassword"  # In the state file!
  })
}
Terraform State File:
→ All resource attributes stored—including passwords!
→ terraform.tfstate is HIGHLY SENSITIVE
→ Local: .gitignore for *.tfstate and *.tfstate.backup
→ Remote Backend: S3 + DynamoDB (with encryption)
# Backend with encryption:
terraform {
  backend "s3" {
    bucket         = "company-terraform-state"
    key            = "prod/terraform.tfstate"
    region         = "eu-central-1"
    encrypt        = true  # Encrypt state at rest!
    kms_key_id     = "arn:aws:kms:..."
    dynamodb_table = "terraform-state-lock"
  }
}
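To see why the state file is so sensitive, it helps to look for secret-like attributes in it directly. A sketch (hypothetical heuristic, matching on key names only; the inline state document is a minimal illustration, not real Terraform output):

```python
import json

# Sketch: walk a terraform.tfstate document and report attribute paths whose
# key name suggests a secret. Key-name matching is a heuristic, nothing more.
SUSPECT_KEYS = ("password", "secret", "token", "private_key")

def find_secret_paths(obj, path=""):
    hits = []
    if isinstance(obj, dict):
        for key, value in obj.items():
            child = f"{path}.{key}" if path else key
            if any(s in key.lower() for s in SUSPECT_KEYS) and isinstance(value, str):
                hits.append(child)
            hits.extend(find_secret_paths(value, child))
    elif isinstance(obj, list):
        for i, item in enumerate(obj):
            hits.extend(find_secret_paths(item, f"{path}[{i}]"))
    return hits

state = json.loads('''{
  "resources": [{
    "type": "aws_db_instance",
    "instances": [{"attributes": {"password": "SuperSecret123!", "engine": "postgres"}}]
  }]
}''')
print(find_secret_paths(state))  # → ['resources[0].instances[0].attributes.password']
```

Anyone with read access to the state bucket can extract these values, which is why the backend must be encrypted and tightly access-controlled.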
Proper Secrets Management in IaC:
Option 1 - Reference to Secrets Manager:
# Terraform: Reference AWS Secrets Manager (do not hardcode the value!)
data "aws_secretsmanager_secret_version" "db_password" {
  secret_id = "prod/db/password"  # Name, not the value!
}

resource "aws_db_instance" "prod" {
  password = data.aws_secretsmanager_secret_version.db_password.secret_string
}
# Problem: the password still ends up in the state file!
Option 2 - Empty resource, set password externally:
resource "aws_db_instance" "prod" {
  # DO NOT set the password in Terraform - set it via API or console
  lifecycle {
    ignore_changes = [password]  # Terraform will not overwrite it
  }
}
Option 3 - External Data Source + Vault:
data "external" "vault_secret" {
  program = ["vault", "read", "-format=json", "secret/db/password"]
}
# Vault provides the secret at runtime - never in the code!
Secret scanning for IaC:
# TruffleHog:
trufflehog git file://./ # Scans Git history for secrets!
# TruffleHog also finds old commits!
# detect-secrets (Yelp):
detect-secrets scan --all-files > .secrets.baseline
detect-secrets audit .secrets.baseline
IaC Security in CI/CD
Complete DevSecOps pipeline for Terraform:
name: Terraform Security Pipeline
on: [push, pull_request]

jobs:
  security:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0  # For git-history secret scan

      # 1. Secret scanning (before anything else!)
      - name: TruffleHog Secret Scan
        uses: trufflesecurity/trufflehog@main
        with:
          path: ./
          extra_args: --only-verified

      # 2. IaC linting + formatting
      - name: Terraform Format Check
        run: terraform fmt -check -recursive ./terraform

      - name: Terraform Validate
        run: |
          cd terraform
          terraform init -backend=false
          terraform validate

      # 3. IaC security scanning
      - name: Checkov Scan
        uses: bridgecrewio/checkov-action@master
        with:
          directory: terraform/
          output_format: sarif
          output_file_path: checkov.sarif
          soft_fail: true  # Only warnings for now

      - name: tfsec Scan
        uses: aquasecurity/tfsec-action@v1.0.0
        with:
          working_directory: terraform/

      # 4. SARIF upload for GitHub Security Tab
      - name: Upload SARIF Results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: checkov.sarif

  # 5. Plan only after the security job succeeds (dependent!)
  plan:
    needs: security
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Terraform Plan
        run: terraform plan -out=tfplan
      - name: Checkov on Plan
        run: |
          terraform show -json tfplan > tfplan.json
          checkov -f tfplan.json --framework terraform_plan
Checkov Custom Policy (custom rules):
# custom_checks/S3_bucket_tags.py
from checkov.common.models.enums import CheckResult, CheckCategories
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck

class S3BucketRequiredTags(BaseResourceCheck):
    def __init__(self):
        name = "S3 bucket must have Owner and CostCenter tags"
        id = "FIRMA_S3_001"
        supported_resources = ["aws_s3_bucket"]
        categories = [CheckCategories.GENERAL_SECURITY]
        super().__init__(name=name, id=id,
                         categories=categories,
                         supported_resources=supported_resources)

    def scan_resource_conf(self, conf):
        tags = conf.get("tags", [{}])[0]
        if isinstance(tags, dict):
            if "Owner" in tags and "CostCenter" in tags:
                return CheckResult.PASSED
        return CheckResult.FAILED

scanner = S3BucketRequiredTags()
# Run:
checkov -d ./terraform --external-checks-dir ./custom_checks
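The conf dict that Checkov hands to scan_resource_conf wraps every attribute value in a list. A standalone replica of the tag logic (without the checkov dependency, so it can be tried in isolation) makes that structure visible:

```python
# Standalone replica of the custom check's tag logic, without checkov imports.
# Illustrates the conf structure Checkov passes: every attribute is list-wrapped.
def has_required_tags(conf: dict) -> bool:
    tags = conf.get("tags", [{}])[0]  # unwrap the single-element list
    return isinstance(tags, dict) and "Owner" in tags and "CostCenter" in tags

passing = {"bucket": ["company-app-data"],
           "tags": [{"Owner": "team-data", "CostCenter": "CC-1234"}]}
failing = {"bucket": ["company-app-data"]}  # no tags block at all
print(has_required_tags(passing), has_required_tags(failing))  # → True False
```

Keeping the decision logic in a plain function like this also makes the custom check easy to unit-test before wiring it into Checkov.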