# Check Command
> Note: The `check` command has been merged with the default command. Both `deeplint` and `deeplint check` now provide the same functionality. This documentation is kept for reference, but we recommend using the default command (`deeplint`) directly.
The `check` command runs DeepLint's LLM-powered analysis on your codebase, providing advanced linting, code review, and suggestions using AI.
## Usage

```shell
deeplint [options]
```

or (for backward compatibility):

```shell
deeplint check [options]
```

DeepLint analyzes staged changes by default.
## Options

| Option | Type | Description | Default |
| --- | --- | --- | --- |
| `--provider` | string | LLM provider to use | `"openai"` |
| `--model` | string | LLM model to use | `process.env.OPENAI_MODEL` or `"gpt-4o"` |
| `--api-key` | string | API key for the LLM provider | `process.env.OPENAI_API_KEY` |
| `--instructions` | string | Additional instructions for the LLM | none |
| `--json` | boolean | Output results in JSON format | `false` |
| `--context` | string | Context depth for analysis (`light` or `deep`) | `"light"` |
| `--unstaged` | boolean | Include unstaged changes in the analysis | `false` |
| `--debug` | boolean | Enable debug output | `false` |
| `--verbose` | boolean | Enable verbose output | `false` |
| `--temperature` | number | Temperature for the LLM (0-1) | `0` |
| `--max-tokens` | number | Maximum tokens for the LLM response | `16384` |
| `--dump` | string | Dump context to a file (specify filename) | none |
## Examples

### Basic LLM Analysis

```shell
deeplint
```

### Analyze with a Custom Model

```shell
deeplint --model=gpt-4
```

### Add Custom Instructions

```shell
deeplint --instructions="Focus on security issues."
```

### Output as JSON

```shell
deeplint --json
```

### Analyze Unstaged Changes

```shell
deeplint --unstaged
```

### Set Temperature for More Creative Suggestions

```shell
deeplint --temperature 0.7
```

### Dump Context to a File

```shell
deeplint --dump context.json
```
## Output

By default, DeepLint displays results in a formatted table, grouped by file, with severity coloring and detailed explanations.

Example output:

```
File: src/commands/default-command.ts
┌──────────┬──────┬──────────────────────────────────┐
│ Severity │ Line │ Message                          │
├──────────┼──────┼──────────────────────────────────┤
│ warning  │ 1    │ Unused import statement          │
│ info     │ 60   │ Hardcoded default values         │
└──────────┴──────┴──────────────────────────────────┘

Line 1: Unused import statement
  Explanation: The import statement for 'Argv' from 'yargs' is not used anywhere in the file.
  Suggestion: Remove the unused import statement to clean up the code.
```

If you use `--json`, the output will be a machine-readable JSON object matching the LLM result schema.
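For illustration, a JSON result might take a shape like the following. The field names here are assumptions inferred from the table output above (severity, line, message, explanation, suggestion); consult the actual schema for the authoritative structure.

```json
{
  "files": [
    {
      "file": "src/commands/default-command.ts",
      "issues": [
        {
          "severity": "warning",
          "line": 1,
          "message": "Unused import statement",
          "explanation": "The import statement for 'Argv' from 'yargs' is not used anywhere in the file.",
          "suggestion": "Remove the unused import statement to clean up the code."
        }
      ]
    }
  ]
}
```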
## LLM Options and Precedence

LLM options can be set via CLI flags, environment variables, or config file. Precedence is:

1. CLI arguments
2. Environment variables
3. Config file (`llm` section)
4. Built-in defaults

See the Configuration Guide for details.
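As a sketch of the lowest-precedence layer, an `llm` section in the config file might look like the following. The filename and exact key names are assumptions; the Configuration Guide is authoritative.

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4o",
    "temperature": 0,
    "maxTokens": 16384
  }
}
```

Any of these values is overridden by the matching environment variable (e.g. `OPENAI_MODEL`) or CLI flag (e.g. `--model`).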
Troubleshooting
Missing API Key: Set
OPENAI_API_KEY
or use--api-key
.Model Not Supported: Check your OpenAI account/model access.
Rate Limits/Quotas: See your OpenAI dashboard for usage.
Network Errors: Ensure you have an internet connection.
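A quick pre-flight check for the most common failure (missing API key) can be scripted in the shell before invoking DeepLint. This is a minimal sketch, not part of DeepLint itself:

```shell
# Hypothetical pre-flight check: confirm the API key is visible to the shell
check_api_key() {
  if [ -n "$OPENAI_API_KEY" ]; then
    echo "API key found"
    return 0
  else
    echo "OPENAI_API_KEY is not set; export it or pass --api-key" >&2
    return 1
  fi
}

check_api_key || true
```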