Olivia Craft

Cursor Rules for Bash: 6 Rules That Stop AI From Generating Broken Shell Scripts

Cursor generates Bash scripts quickly. The problem? Those scripts break silently — unquoted variables that split on spaces, missing error handling that lets failures cascade, hardcoded paths that only work on your machine, and cd commands that leave you in the wrong directory when something fails.

Shell scripts run in production. They deploy your code, rotate your secrets, process your data, and manage your infrastructure. A broken script doesn't give you a red squiggly line — it gives you a 3am pager alert.

You can fix this by adding targeted rules to your .cursorrules or .cursor/rules/*.mdc files. Here are 6 rules I use on every Bash project, with bad vs. good examples showing exactly what changes.


Rule 1: Always Start With Strict Mode

Every Bash script must start with:
  set -euo pipefail
Immediately after the shebang line.
Never rely on default Bash behavior where errors are silently ignored.

Without strict mode, Bash keeps running after a command fails, treats unset variables as empty strings, and reports only the last command's status for a pipeline. A failed rm followed by a mkdir followed by a cp can destroy your deployment.

Without this rule, Cursor generates fragile scripts:

#!/bin/bash
# ❌ Bad: no strict mode, errors silently ignored

BACKUP_DIR=/tmp/backup
rm -rf $BACKUP_DIR
mkdir $BACKUP_DIR
cp -r /var/app/data $BACKUP_DIR
tar czf /var/backups/data.tar.gz $BACKUP_DIR
rm -rf $BACKUP_DIR

echo "Backup complete"

If cp fails halfway, you still get "Backup complete" and a corrupted archive that you won't discover until the restore.

With this rule, Cursor writes defensive scripts:

#!/usr/bin/env bash
set -euo pipefail
# ✅ Good: strict mode catches failures immediately

BACKUP_DIR="/tmp/backup"
rm -rf "${BACKUP_DIR}"
mkdir -p "${BACKUP_DIR}"
cp -r /var/app/data "${BACKUP_DIR}"
tar czf /var/backups/data.tar.gz -C "${BACKUP_DIR}" .
rm -rf "${BACKUP_DIR}"

echo "Backup complete"

If any command fails, the script stops immediately. No silent corruption.
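
Each flag in set -euo pipefail guards against a different failure mode. A minimal sketch (standalone demo, not from the article's backup script) that you can run to see each one:

```shell
#!/usr/bin/env bash

# -e: exit immediately when a command fails
set -e
false || true   # a failure you handle explicitly is fine
# false         # uncommented, this line would abort the script

# -u: referencing an unset variable is an error, not an empty string
set -u
name="demo"
printf '%s\n' "${name}"   # fine: the variable is set
# printf '%s\n' "${typo}" # uncommented, this would abort: "unbound variable"

# -o pipefail: a pipeline fails if ANY stage fails, not just the last
set -o pipefail
if ! false | cat; then
    printf 'pipefail caught the failure in the first stage\n'
fi
```

Without pipefail, `false | cat` reports success because cat (the last stage) succeeded; with it, the pipeline's status is the first failing stage's.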


Rule 2: Always Quote Variables — No Exceptions

Every variable expansion must be double-quoted: "${variable}"
Never use bare $variable outside of arithmetic contexts.
Use "${array[@]}" for array expansion.
The only exception is inside [[ ]] test expressions for the right side of ==.

Unquoted variables undergo word splitting and glob expansion. A filename with a space becomes two arguments. A variable containing * expands to every file in the directory.

Without this rule:

# ❌ Bad: unquoted variables cause word splitting
FILE_PATH=/home/user/my documents/report.pdf

if [ -f $FILE_PATH ]; then
    cp $FILE_PATH /tmp/backup/
fi

for f in $FILES; do
    rm $f
done

my documents splits into two words: my and documents. The [ test errors out with "binary operator expected", and if the unquoted path ever reached cp, it would fail with "cannot stat 'my'".

With this rule:

# ✅ Good: all variables quoted
FILE_PATH="/home/user/my documents/report.pdf"

if [[ -f "${FILE_PATH}" ]]; then
    cp "${FILE_PATH}" /tmp/backup/
fi

for f in "${FILES[@]}"; do
    rm "${f}"
done

Paths with spaces, glob characters, and empty strings all work correctly.
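
To get a properly quotable array of filenames in the first place, collect glob matches into an array. A small runnable sketch (the directory and filenames are made up for illustration):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical demo: create two files, one with a space in its name.
workdir=$(mktemp -d)
trap 'rm -rf "${workdir}"' EXIT
touch "${workdir}/plain.csv" "${workdir}/has space.csv"

# Collect matches into an array; nullglob makes the array empty when
# nothing matches, instead of leaving a literal "*.csv" element.
shopt -s nullglob
files=( "${workdir}"/*.csv )
shopt -u nullglob

# "${files[@]}" expands each element as ONE word, spaces intact.
for f in "${files[@]}"; do
    printf 'found: %s\n' "${f}"
done
printf 'count: %d\n' "${#files[@]}"
```

This is the safe counterpart to the bad `for f in $FILES` loop: the glob itself does the tokenizing, so filenames never pass through word splitting.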


Rule 3: Use Functions With local Variables

Extract repeated logic into functions.
Declare all function variables with local.
Functions must have a descriptive name in snake_case.
Never use global variables inside functions without explicit documentation.

Bash variables are global by default. A variable named i in one function silently clobbers i in the calling function.

Without this rule:

# ❌ Bad: global variables leak between functions
process_files() {
    count=0
    for file in /data/*.csv; do
        result=$(wc -l < "$file")
        count=$((count + result))
    done
    echo "$count"
}

generate_report() {
    count=0  # clobbers the outer count if called during process
    for section in header body footer; do
        count=$((count + 1))
    done
    echo "Sections: $count"
}

With this rule:

# ✅ Good: local variables, no leaking
process_files() {
    local count=0
    local file
    local result

    for file in /data/*.csv; do
        result=$(wc -l < "${file}")
        count=$((count + result))
    done
    echo "${count}"
}

generate_report() {
    local count=0
    local section

    for section in header body footer; do
        count=$((count + 1))
    done
    echo "Sections: ${count}"
}

Each function owns its variables. No surprises when composing functions together.
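
The clobbering is easy to demonstrate in isolation. A tiny runnable sketch (function names are hypothetical) showing the same variable name with and without local:

```shell
#!/usr/bin/env bash
set -euo pipefail

leaky_inner() { i=99; }               # no local: writes the caller's i
safe_inner()  { local i; i=99; }      # local: shadows it, caller untouched

i=1
leaky_inner
printf 'after leaky_inner: i=%s\n' "${i}"   # i is now 99

i=1
safe_inner
printf 'after safe_inner:  i=%s\n' "${i}"   # i is still 1
```

Note that local is dynamically scoped: the variable is also visible to functions the declaring function calls, which is why every function should declare its own locals rather than rely on a caller's.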


Rule 4: Validate Arguments and Show Usage

Every script must validate required arguments at the top.
Show a usage message with expected arguments when validation fails.
Use meaningful variable names instead of bare $1, $2, $3.
Never access positional parameters beyond $3 without shifting into named variables.

A script that silently accepts zero arguments and fails 200 lines later is a debugging nightmare.

Without this rule:

# ❌ Bad: no validation, mystery positional parameters
#!/bin/bash

tar czf $2 $1
scp $2 $3:$4
rm $2

What are $1 through $4? Run this with the wrong argument count and you're scp-ing nothing to nowhere.

With this rule:

#!/usr/bin/env bash
set -euo pipefail
# ✅ Good: validated arguments, clear names, usage help

usage() {
    cat <<EOF
Usage: $(basename "$0") <source_dir> <archive_name> <remote_host> <remote_path>

Create a compressed archive and upload it to a remote server.

Arguments:
  source_dir    Directory to archive
  archive_name  Name for the .tar.gz file
  remote_host   SSH hostname for upload
  remote_path   Destination path on remote host

Example:
  $(basename "$0") ./data backup.tar.gz prod-server /var/backups/
EOF
    exit 1
}

[[ $# -ne 4 ]] && usage

readonly SOURCE_DIR="$1"
readonly ARCHIVE_NAME="$2"
readonly REMOTE_HOST="$3"
readonly REMOTE_PATH="$4"

[[ ! -d "${SOURCE_DIR}" ]] && echo "Error: '${SOURCE_DIR}' is not a directory" && exit 1

tar czf "${ARCHIVE_NAME}" -C "${SOURCE_DIR}" .
scp "${ARCHIVE_NAME}" "${REMOTE_HOST}:${REMOTE_PATH}"
rm "${ARCHIVE_NAME}"

Anyone can run the script and understand what it expects without reading the source.
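
Once a script grows optional flags, positional validation alone isn't enough; getopts handles flags and still leaves positional arguments for the same checks. A hedged sketch (the -v and -o flags and all names here are hypothetical, not from the script above):

```shell
#!/usr/bin/env bash
set -euo pipefail

parse_and_report() {
    local verbose=0
    local output_dir="."
    local opt
    local OPTIND=1                    # reset so repeated calls parse cleanly

    # ':vo:' = accept -v (no argument) and -o <arg>; leading ':' means
    # we report bad options ourselves instead of letting getopts print.
    while getopts ':vo:' opt; do
        case "${opt}" in
            v) verbose=1 ;;
            o) output_dir="${OPTARG}" ;;
            *) printf 'Usage: %s [-v] [-o output_dir] <source_dir>\n' "$0" >&2
               return 1 ;;
        esac
    done
    shift $((OPTIND - 1))             # drop parsed flags, keep positionals

    [[ $# -eq 1 ]] || { printf 'error: exactly one source_dir required\n' >&2; return 1; }

    printf 'verbose=%s output=%s source=%s\n' "${verbose}" "${output_dir}" "$1"
}

# Demo invocation with made-up arguments:
parse_and_report -v -o /var/out ./data
```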


Rule 5: Use Trap for Cleanup — Never Leave Temp Files Behind

Always use trap to clean up temporary files on EXIT, ERR, and INT signals.
Use mktemp for temporary files and directories, never hardcoded /tmp paths.
Clean up resources in reverse order of creation.
Never leave orphaned temp files on failure.

Scripts that crash halfway leave temp files, lock files, and half-written state scattered across the filesystem.

Without this rule:

# ❌ Bad: no cleanup, hardcoded temp paths
TMPFILE=/tmp/myapp_data.tmp
LOCKFILE=/tmp/myapp.lock

touch "$LOCKFILE"
curl -s https://api.example.com/data > "$TMPFILE"
process_data < "$TMPFILE"
rm "$TMPFILE"
rm "$LOCKFILE"

If curl fails or process_data crashes, the lock file stays forever, blocking future runs.

With this rule:

# ✅ Good: trap cleanup, mktemp for safe temp files
WORKDIR=$(mktemp -d)   # avoid naming this TMPDIR: mktemp reads $TMPDIR
TMPFILE=$(mktemp)
LOCKFILE=$(mktemp)

cleanup() {
    # reverse order of creation
    rm -f "${LOCKFILE}" "${TMPFILE}"
    rm -rf "${WORKDIR}"
}
trap cleanup EXIT

touch "${LOCKFILE}"
curl -sf https://api.example.com/data > "${TMPFILE}"
process_data < "${TMPFILE}"

Whether the script succeeds, fails, or gets killed with Ctrl-C, cleanup always runs.
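
One refinement worth adding to the cleanup pattern: capture the exit status inside the trap. The trap's own commands set $?, so without capturing it first, a caller can see success even though the script failed. A minimal sketch:

```shell
#!/usr/bin/env bash
set -euo pipefail

TMPFILE=$(mktemp)

cleanup() {
    local rc=$?          # status of the command that triggered the exit
    rm -f "${TMPFILE}"
    exit "${rc}"         # re-raise it so callers still see the real failure
}
trap cleanup EXIT

printf 'working with %s\n' "${TMPFILE}"
```

With this pattern, a script that dies with status 3 still reports status 3 to whatever invoked it, even after cleanup runs successfully.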


Rule 6: Use Shellcheck-Clean Patterns Only

All generated Bash must pass shellcheck without warnings.
Use [[ ]] instead of [ ] for test expressions.
Use $(command) instead of backticks for command substitution.
Use printf instead of echo for portable output.
Never use eval or unescaped variable expansion in commands.

Shellcheck catches hundreds of common Bash pitfalls. If Cursor generates code that triggers warnings, it's generating code with bugs.

Without this rule:

# ❌ Bad: shellcheck warnings everywhere
if [ $status == "running" ]; then
    pid=`cat /var/run/app.pid`
    if [ $pid -gt 0 ]; then
        echo -e "App running with PID $pid\n"
    fi
fi

result=`eval "cat $user_provided_path"`

SC2086 (unquoted variables), SC2006 (backticks), non-portable echo -e, and eval with user input: a buffet of shell bugs.

With this rule:

# ✅ Good: shellcheck-clean code
if [[ "${status}" == "running" ]]; then
    pid=$(cat /var/run/app.pid)
    if [[ "${pid}" -gt 0 ]]; then
        printf "App running with PID %s\n" "${pid}"
    fi
fi

result=$(cat -- "${validated_path}")

Clean patterns that work correctly across Bash 4+.
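
Two of these preferences are easy to see in a standalone demo: printf prints data verbatim where echo's flag and escape handling varies between shells, and [[ ]] does no word splitting and supports pattern matching directly (which is also why the right side of == may stay unquoted):

```shell
#!/usr/bin/env bash
set -euo pipefail

# echo would treat a leading -n as a flag in some shells and swallow it;
# printf '%s\n' always prints the argument as data.
msg='-n looks like a flag to some echos'
printf '%s\n' "${msg}"

# Inside [[ ]], the left side is never word-split, and an unquoted
# right side of == is matched as a glob pattern.
name='app server 1'
if [[ ${name} == app* ]]; then
    printf 'matched the app* pattern\n'
fi
```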


Copy-Paste Ready: All 6 Rules

Drop this into your .cursorrules or .cursor/rules/bash.mdc:

# Bash Script Rules

## Strict Mode
- Every script: set -euo pipefail after shebang
- Use #!/usr/bin/env bash, not #!/bin/bash

## Variables
- Always double-quote variable expansions: "${var}"
- Use "${array[@]}" for array expansion
- No bare $variable outside arithmetic

## Functions
- Use local for all function variables
- snake_case function names
- Extract repeated logic into functions

## Arguments
- Validate required arguments at script top
- Show usage message on invalid args
- Assign positional params to named readonly variables

## Cleanup
- Use trap cleanup EXIT for temp files
- Use mktemp, never hardcoded /tmp paths
- Clean up resources in reverse creation order

## Code Quality
- All code must pass shellcheck
- Use [[ ]] not [ ] for tests
- Use $(cmd) not backticks
- Use printf not echo -e
- Never use eval with variable input

The ROI: 6 Rules, Hours Saved Every Week

A broken shell script in production costs 1-4 hours of debugging, because there's no stack trace and no type error, just silent corruption. If these rules prevent even one incident per week, that can save up to 4 hours. Over a month, that's up to two full workdays of firefighting eliminated.

At $27 for the complete rules pack, it pays for itself the first time it stops a bad script from reaching production.

Want 50+ Production-Tested Rules?

These 6 rules are a starting point. My Cursor Rules Pack v2 includes 50+ rules covering Bash, Python, TypeScript, Docker, CI/CD, and more — organized by tool and priority so Cursor applies them consistently.

Stop debugging silent shell failures. Give Cursor the rules it needs to write bulletproof scripts.
