MCP Security

The Model Context Protocol (MCP) enables AI agents to connect to external data sources and tools. When using MCP, you often need to handle sensitive configuration like API keys, database credentials, and authentication tokens. varlock provides a secure way to manage these secrets without exposing them in your configuration files or to AI agents.

This guide covers three scenarios:

  • Local MCP servers using stdio transport with varlock run
  • Remote MCP servers using varlock’s Node.js integration
  • Third-party MCP servers using varlock to load secrets and pass them to the server

For local development and testing, MCP servers often use stdio transport for communication with clients. This is perfect for using varlock run to securely load environment variables before starting your server.
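
Under the hood, a stdio MCP server exchanges newline-delimited JSON-RPC 2.0 messages over stdin/stdout. The sketch below shows what one request looks like on the wire; the `id` and payload are illustrative values, not output from a real server.

```typescript
// Sketch: MCP stdio transport sends one JSON-RPC object per line.
// A client writes `wire` to the server's stdin and parses each
// newline-terminated line of the server's stdout as a response.
const request = { jsonrpc: '2.0', id: 1, method: 'tools/list', params: {} };
const wire = JSON.stringify(request) + '\n';

const parsed = JSON.parse(wire.trim());
console.log(parsed.method); // "tools/list"
```

This framing is why a stdio server must never print logs to stdout — anything that is not a JSON-RPC message corrupts the stream.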

Create a .env.schema file for your MCP server:

.env.schema
# @defaultSensitive=true
# @defaultRequired=true
# ---
# Database connection for MCP server
# @type=url
DATABASE_URL=
# API key for external service
# @type=string(startsWith="sk_")
EXTERNAL_API_KEY=
# Authentication secret
# @type=string(minLength=32)
AUTH_SECRET=
# Server configuration
# @sensitive=false
# @type=number(min=1024, max=65535)
SERVER_PORT=3000
# @sensitive=false
# @type=enum(debug, info, warn, error)
LOG_LEVEL=info

Create your local .env file with values from your 1Password vault:

.env
DATABASE_URL=exec(`op read "op://devTest/myVault/database-url"`)
EXTERNAL_API_KEY=exec(`op read "op://devTest/myVault/external-api-key"`)
AUTH_SECRET=exec(`op read "op://devTest/myVault/auth-secret"`)
LOG_LEVEL=debug
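
The `exec()` resolver runs the given command at load time and uses its trimmed stdout as the value. Conceptually it works like the sketch below, which substitutes `echo` for the 1Password `op` CLI so it runs anywhere; `resolveExec` is an illustrative helper, not varlock's API.

```typescript
import { execSync } from 'node:child_process';

// Sketch of what an exec()-style resolver does: run the command,
// capture stdout, and strip the trailing newline.
function resolveExec(command: string): string {
  return execSync(command, { encoding: 'utf8' }).trim();
}

// Using echo as a stand-in for `op read ...` so the sketch is runnable:
const apiKey = resolveExec('echo "sk_example_1234567890"');
```

The secret itself never lives in the `.env` file — only the command that fetches it.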

Update your MCP server’s package.json to use varlock run:

package.json
{
  "name": "my-mcp-server",
  "scripts": {
    "start": "varlock run -- node server.js",
    "dev": "varlock run -- node --watch server.js"
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "^0.4.0"
  }
}

For containerized local development, create a Dockerfile that uses varlock:

Dockerfile
FROM node:22-alpine
# Install varlock
RUN npm install -g @varlock/cli
WORKDIR /app
# Copy package files
COPY package*.json ./
COPY pnpm-lock.yaml ./
# Install dependencies
RUN npm install -g pnpm && pnpm install
# Copy application files
COPY . .
# Build the application
RUN pnpm build
# Use varlock run to start the server
CMD ["varlock", "run", "--", "node", "dist/server.js"]

Build and run your Docker container:

Terminal window
# Build the image
docker build -t my-mcp-server:latest .
# Run the container (for testing)
docker run --rm -it my-mcp-server:latest

Create a Cursor configuration file to connect to your local MCP server:

~/.cursor/mcp-servers.json
{
  "mcpServers": {
    "my-local-server": {
      "command": "npm",
      "args": ["start"],
      "cwd": "/path/to/your/mcp-server",
      "env": {
        "NODE_ENV": "development"
      }
    }
  }
}

You can also run local MCP servers in Docker. In this example an off-the-shelf MCP server is used, so we wrap the docker command with varlock run to load the GITHUB_TOKEN environment variable and pass it to the server.

~/.cursor/mcp-servers.json
{
  "mcpServers": {
    "github": {
      "command": "varlock",
      "args": [
        "run",
        "--",
        "docker",
        "run",
        "--rm",
        "-i",
        "ghcr.io/github/github-mcp-server:latest"
      ],
      "env": {
        "GITHUB_TOKEN": "${GITHUB_TOKEN}"
      }
    }
  }
}

And the corresponding .env.schema file would look something like this:

.env.schema
# @defaultSensitive=true
# @defaultRequired=true
# ---
# GitHub token
# @type=string(startsWith="ghp_")
GITHUB_TOKEN=exec(`op read "op://devTest/myVault/github-token"`)

For production deployments, you’ll want to run MCP servers as standalone processes with varlock integrated directly into the server code.

server.ts
import 'varlock/auto-load';
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js';
import { ENV } from 'varlock/env';
async function main() {
  const server = new Server(
    {
      name: 'my-mcp-server',
      version: '1.0.0'
    },
    {
      capabilities: {
        tools: {}
      }
    }
  );

  // Register tools with access to secure configuration.
  // The low-level Server API takes a request schema, not a method string.
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    const { name, arguments: args } = request.params;
    switch (name) {
      case 'query-database':
        // Use secure database connection from config
        return await queryDatabase(ENV.DATABASE_URL, args);
      case 'call-external-api':
        // Use secure API key from config
        return await callExternalAPI(ENV.EXTERNAL_API_KEY, args);
      default:
        throw new Error(`Unknown tool: ${name}`);
    }
  });

  const transport = new StdioServerTransport(process.stdin, process.stdout);
  await server.connect(transport);
}

async function queryDatabase(databaseUrl: string, args: any) {
  // Implementation using secure database URL.
  // Log to stderr — stdout is reserved for the stdio transport.
  console.error('Querying database with secure connection');
  return { content: [{ type: 'text', text: 'database query result' }] };
}

async function callExternalAPI(apiKey: string, args: any) {
  // Implementation using secure API key
  console.error('Calling external API with secure key');
  return { content: [{ type: 'text', text: 'api call result' }] };
}

main().catch(console.error);
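
Because the schema marks these variables with @defaultRequired=true, varlock refuses to start the process if any of them are missing. For comparison, here is the equivalent manual guard written out by hand; `requireEnv` is an illustrative helper, not a varlock API.

```typescript
// Sketch: the fail-fast behavior @defaultRequired=true gives you for free.
// Throwing at startup is far better than a missing secret surfacing as a
// confusing runtime error deep inside a tool handler.
function requireEnv(name: string, value: string | undefined): string {
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

const databaseUrl = requireEnv('DATABASE_URL', 'postgresql://localhost:5432/dev_db');
```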

For production, create environment-specific schema files. See the environments guide for detailed information on managing multiple environments with varlock.

.env.schema
# @defaultSensitive=true
# @defaultRequired=true
# @envFlag=APP_ENV
# ---
# env flag is used to determine which environment to load
# default is development
# @type=enum(development, staging, test, production)
APP_ENV=development
# Database connection
# @type=url
DATABASE_URL=
# External API credentials
# @type=string(startsWith="sk_")
EXTERNAL_API_KEY=
# Authentication
# @type=string(minLength=32)
AUTH_SECRET=
# Server settings
# @sensitive=false
# @type=number(min=1024, max=65535)
SERVER_PORT=3000
# @sensitive=false
# @type=enum(debug, info, warn, error)
LOG_LEVEL=info
.env.production
DATABASE_URL=exec(`op read "op://prodTest/prodVault/prod-database-url"`)
EXTERNAL_API_KEY=exec(`op read "op://prodTest/prodVault/prod-external-api-key"`)
AUTH_SECRET=exec(`op read "op://prodTest/prodVault/prod-auth-secret"`)
SERVER_PORT=3000
LOG_LEVEL=warn

Then, in the command that starts the server, use varlock run with the envFlag variable (APP_ENV) set, so that the correct environment override is loaded.

Terminal window
APP_ENV=production varlock run -- node server.js
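
Conceptually, the env flag selects which environment-specific file is layered on top of the base schema and .env. A sketch of that selection follows; the ordering here is illustrative, and varlock's actual precedence rules are documented in the environments guide.

```typescript
// Sketch: how an env flag such as APP_ENV picks the environment-specific
// file to layer last (highest precedence). Illustrative only — consult
// the environments guide for varlock's authoritative rules.
function envFilesFor(appEnv: string): string[] {
  return ['.env.schema', '.env', `.env.${appEnv}`];
}

const files = envFilesFor('production');
```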

Always use external secret management such as 1Password or the built-in env var management in your deployment platform.

# ❌ Never do this
API_KEY=sk_live_1234567890abcdef
# ✅ Use external secret management
API_KEY=exec(`op read "op://devTest/myVault/api-key"`)

Create separate schema files for different environments. See the environments guide for detailed information on managing multiple environments with varlock.

.env.schema
# @defaultSensitive=true
# @envFlag=APP_ENV
# ---
# env flag is used to determine which environment to load
# default is development
# @type=enum(development, staging, test, production)
APP_ENV=development
# Common configuration
DATABASE_URL=
API_KEY=
.env.development
DATABASE_URL=postgresql://localhost:5432/dev_db
API_KEY=exec(`op read "op://devTest/myVault/dev-api-key"`)
.env.production
DATABASE_URL=exec(`op read "op://prodTest/prodVault/prod-database-url"`)
API_KEY=exec(`op read "op://prodTest/prodVault/prod-api-key"`)

Use varlock’s validation features to ensure data integrity:

.env.schema
# @type=string(startsWith="sk_", minLength=20)
API_KEY=
# @type=url
DATABASE_URL=

Use varlock’s redaction features to prevent sensitive data from appearing in logs:

import 'varlock/auto-load';
import { ENV } from 'varlock/env';
// Sensitive values are automatically redacted in logs
console.log('API Key:', ENV.API_KEY); // Shows: [xx▒▒▒▒▒]
console.log('Database URL:', ENV.DATABASE_URL); // Shows: [xx▒▒▒▒▒]
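
The masking shown above follows the same idea as this sketch: keep a short, non-identifying prefix and blank out the rest, so a leaked log line doesn't reveal the secret. varlock's exact masking format may differ from this illustration.

```typescript
// Sketch: redaction keeps a two-character prefix and masks the remainder.
// Illustrative only — not varlock's actual redaction implementation.
function redact(secret: string): string {
  return '[' + secret.slice(0, 2) + '▒'.repeat(5) + ']';
}

const masked = redact('sk_live_1234567890abcdef');
```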