Coding standards for AI coding assistants. Auto-detect tech stack, generate project rules. Works with Claude Code, Cursor, Antigravity, GitHub Copilot.
Copy the install, test the workflow, then decide if it earns a permanent slot.
Still active enough to matter. Good candidate for a fast stack test instead of a long evaluation loop.
Not hard to test, not trivial to unwind. Worth trying if it closes a sharp gap.
GitHub health score: 42/100, with no security policy. Zero open issues make this testable, but not something to trust blindly.
AI Agent: Multiple
Model: Multiple
Build Time: Instant
Fastest way to find out if claude-rules belongs in your setup.
Copy the install command, run a real test, and back it out cleanly if it slows you down.
# Visit: https://github.com/lifedever/claude-rules
Run this first. You will know quickly if the workflow earns a permanent slot.
# No automated removal; visit https://github.com/lifedever/claude-rules
No messy cleanup loop. If it misses, remove it and keep moving.
Install Location
~/
└─ .claude/
   ├─ commands/
   ├─ agents/
   │  └─ claude-rules/   ← installs here
   └─ settings.json
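The clean back-out the page promises can be sketched as a single directory delete. This is a minimal illustration, assuming the skill lives entirely under `~/.claude/agents/claude-rules` as the layout above shows and leaves no other files behind; the project itself documents no automated removal, so verify the actual footprint against the repo before relying on this.

```shell
# Hedged sketch: remove the skill by deleting its install directory.
# Path is taken from the layout above; nothing here is an official
# uninstall command from the claude-rules project.
INSTALL_DIR="$HOME/.claude/agents/claude-rules"

# Check whether the skill is present before removing anything.
if [ -d "$INSTALL_DIR" ]; then
  rm -rf "$INSTALL_DIR"
  echo "removed $INSTALL_DIR"
else
  echo "nothing to remove"
fi
```

Because everything sits in one directory, reinstalling later is equally reversible: nothing in `commands/` or `settings.json` needs to be touched.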
Coding standards for AI coding assistants. Auto-detect tech stack, generate project rules. Works with Claude Code, Cursor, Antigravity, GitHub Copilot. An open-source skill for the AI coding ecosystem.