AI Tools
AI tools like Cursor, Gemini, Claude, and GitHub Copilot have accelerated workflows for many development teams. This, however, comes with some risks. Unless your team has gone to great lengths to hide secrets from your AI tools, you’re probably sending secrets to their AI agents and potentially leaking them in the code they generate.
varlock exists to solve both of those problems. Because varlock never stores the secret values, you never have to worry about sending them to AI servers. And because of the new @env-spec format, you get better AX (agent experience) when dealing with environment variables in your generated code.
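As a quick illustration, an @env-spec schema contains decorated declarations and instructions for resolving values, never the values themselves. The sketch below uses the same decorators and 1Password path that appear later in this guide; APP_ENV is just an illustrative non-sensitive variable:

# @sensitive @required
OPENAI_API_KEY=exec('op read "op://api-local/openai/api-key"')

# non-sensitive values can live alongside, with plain defaults
APP_ENV=development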
Securely inject secrets into AI CLI tools
Many AI coding assistants offer CLI tools that require API keys and other secrets. Instead of storing these secrets in plain-text .env or .json files, or exposing them in your shell history, use varlock to inject them securely at runtime. This applies both to config that might be required to bootstrap the tool itself and to things like MCP servers that require API keys.
1. Install varlock
If you haven’t already, install varlock on your system.
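For example, if you use npm, installation might look like this (this assumes varlock is available as a global npm package; check the varlock installation docs for the recommended method on your platform):

npm install -g varlock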
2. Create an environment schema
Define your API keys and secrets in your .env.schema file. Mark sensitive values appropriately:
# @sensitive @required
OPENAI_API_KEY=exec('op read "op://api-local/openai/api-key"')

# @sensitive @required
ANTHROPIC_API_KEY=exec('op read "op://api-local/anthropic/api-key"')

# @sensitive @required
GOOGLE_API_KEY=exec('op read "op://api-local/google/api-key"')
Store the actual secret values in your preferred secrets provider like 1Password (as shown above), AWS Secrets Manager, or any other provider with a CLI to fetch individual secrets.
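For example, if your secrets live in AWS Secrets Manager, the same exec() pattern works with the AWS CLI. This is a sketch: the secret name openai-api-key is a placeholder, and it assumes the AWS CLI is installed and authenticated:

# @sensitive @required
OPENAI_API_KEY=exec('aws secretsmanager get-secret-value --secret-id openai-api-key --query SecretString --output text')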
3. Run your tool via varlock run
Execute your AI CLI tool through varlock to securely inject the environment variables:
varlock run -- <your-cli-command>
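To confirm values are being injected without printing them, you can run a quick shell check through varlock (a minimal sketch using standard shell utilities; OPENAI_API_KEY is whichever key your schema defines):

varlock run -- sh -c 'test -n "$OPENAI_API_KEY" && echo "OPENAI_API_KEY is set"'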
Popular AI CLI tool examples
Here’s how to configure and run popular AI coding CLI tools with varlock:
Aider is a popular AI pair programming tool that works in your terminal.
Environment variables:
OPENAI_API_KEY - For OpenAI models (GPT-4, etc.)
ANTHROPIC_API_KEY - For Claude models
GEMINI_API_KEY - For Google Gemini models
Add to .env.schema:
# @sensitive @required
OPENAI_API_KEY=exec('op read "op://api-local/openai/api-key"')

# @sensitive
ANTHROPIC_API_KEY=exec('op read "op://api-local/anthropic/api-key"')
Run with varlock:
varlock run -- aider
# or with specific options
varlock run -- aider --model gpt-4-turbo
See supported env variables here.
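If you run aider through varlock regularly, you can wrap the command in a shell alias so injection happens automatically. This is just a convenience sketch; the alias name is arbitrary:

alias aider-secure='varlock run -- aider'
aider-secure --model gpt-4-turbo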
Claude Code is Anthropic’s CLI tool for AI-assisted coding.
Environment variable:
ANTHROPIC_API_KEY - Your Anthropic API key
Add to .env.schema:
# @sensitive @required
ANTHROPIC_API_KEY=exec('op read "op://api-local/anthropic/api-key"')
Run with varlock:
varlock run -- claude
See supported env variables here.
Opencode is a provider-agnostic AI coding assistant that works in your terminal.
Environment variables:
ANTHROPIC_API_KEY - For Claude models
OPENAI_API_KEY - For OpenAI models
OPENCODE_CONFIG - Path to custom config file (optional)
Add to .env.schema:
# @sensitive @required
ANTHROPIC_API_KEY=exec('op read "op://api-local/anthropic/api-key"')

# @sensitive
OPENAI_API_KEY=exec('op read "op://api-local/openai/api-key"')
Add an auth configuration:
opencode auth login
It will ask you to paste your API key. Instead, paste in an env reference like this:
{env:ANTHROPIC_API_KEY}
Your config file (~/.local/share/opencode/auth.json) should now look like this:
{ "anthropic": { "type": "api", "key": "{env:ANTHROPIC_API_KEY}" }}
Run with varlock:
varlock run -- opencode
# or with specific model
varlock run -- opencode --model claude-3-5-sonnet
See the Opencode docs for more information.
Gemini CLI is Google’s open source AI agent.
Environment variable:
GOOGLE_API_KEY or GEMINI_API_KEY - Your Google AI API key
Add to .env.schema:
# @sensitive @required
GOOGLE_CLOUD_PROJECT=exec('op read "op://api-local/google/cloud-project"')

# @sensitive @required
GOOGLE_API_KEY=exec('op read "op://api-local/google/api-key"')
Run with varlock:
varlock run -- gemini
See the Gemini CLI auth docs for more information.
Allowing schema files for AI tools
Most AI tools ignore .env.* files by default. To ensure your AI tool can access your environment schema, add the following to your .gitignore:
!.env.schema
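If your .gitignore already excludes all env files with a broad pattern, place the negation after that rule, since later patterns win. For example (a typical layout; adapt it to your existing ignore rules):

.env*
!.env.schema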
If you use a tool with its own ignore file, check that tool’s documentation to see how it handles ignore files and make sure .env.schema is allowed.
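For example, Cursor reads a .cursorignore file that follows gitignore syntax, so the same negation pattern should carry over (an assumption about Cursor’s current behavior; verify against its docs):

# .cursorignore
!.env.schema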
Custom instructions and rules
To give your AI tool full context about varlock, you can provide it with the full Varlock llms.txt. In Cursor, this is accomplished via ‘Add New Custom Docs’.
If your tool supports custom rules, you can use our varlock Cursor rule file from this repo as a starting point for creating a rule file suited to your workflow.