Installation and Configuration

Prerequisites

  • Node.js: Version 20 or higher
  • npm: Version 9 or higher
  • Claude Desktop or Cursor: For MCP integration (recommended)
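
You can verify the installed versions before continuing:

node --version
npm --version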

Installation via npm

Install BigRack globally on your machine:

npm install -g @bigrack/mcp

This installation includes both the MCP server and the CLI for managing your racks and projects.
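
To confirm the package was installed globally, you can ask npm to list it:

npm list -g @bigrack/mcp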

Installation from GitHub

You can install BigRack directly from the GitHub repository by cloning only the MCP package folder:

git clone --filter=blob:none --sparse https://github.com/baptiste-mnh/bigrack.dev.git
cd bigrack.dev
git sparse-checkout set packages/mcp
cd packages/mcp
npm install -g .

This installs the latest version from the repository's default branch; the final npm install -g . command installs the package globally from the current directory.
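
If you later want to update an installation made this way, pull the latest changes and reinstall from the package folder:

cd bigrack.dev/packages/mcp
git pull
npm install -g .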

Development Installation

If you want to contribute to the project or use the latest development version, you can clone and build the repository:

1. Clone the repository

git clone https://github.com/baptiste-mnh/bigrack.dev.git
cd bigrack.dev

2. Install dependencies

npm install

3. Build the project

npm run build

4. Link globally

To use the local version as a global command:

npm run link:mcp

Now you can use the bigrack command from anywhere. To unlink later, use npm run unlink:mcp.
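
To check that the link took effect, confirm where the command resolves (macOS/Linux) and run it:

which bigrack
bigrack --version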

Development Mode

For active development with auto-rebuild on changes:

npm run dev:mcp

This watches for file changes and automatically rebuilds the package.
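
A typical workflow is to keep the watcher running in one terminal and exercise the linked command from another:

# terminal 1: rebuild on every change
npm run dev:mcp

# terminal 2: test the freshly built CLI
bigrack --version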

Initial Configuration

1. Initialize BigRack globally

First, initialize BigRack globally on your machine (run once per machine):

bigrack init

This sets up the BigRack database, downloads the embedding model, and creates the configuration directory (~/.bigrack/).
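
To confirm initialization succeeded, check that the configuration directory was created (its exact contents depend on your BigRack version):

ls ~/.bigrack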

2. Create a Repo in your project

In your project directory, create a Repo using the MCP tool:

Ask your AI assistant:

"Create a new BigRack repo for this project"

This creates a bigrack.json file and registers the Repo in the local SQLite database (~/.bigrack/).
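
You can confirm the file exists at the project root (its exact fields depend on your BigRack version):

cat bigrack.json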

3. First-time setup (Vector Search)

During initialization, BigRack downloads the vector embedding model; the status output reports which model was fetched and its size:

🔍 Vector:      ✅ Ready
             Model: Xenova/all-MiniLM-L6-v2
             Dim: 384
             Size: ~22.6 MB

This model enables semantic search over your business context. It runs entirely locally with no external API calls.

Integration with Claude Desktop

Automatic Configuration

To integrate BigRack with Claude Desktop, use:

bigrack setup-claude

This command automatically configures Claude Desktop to use BigRack as an MCP server.

Manual Configuration

If you prefer to configure manually, add this to your Claude Desktop configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "bigrack": {
      "command": "npx",
      "args": ["-y", "@bigrack/mcp"]
    }
  }
}

Important: Restart Claude Desktop after configuration for changes to take effect.

Integration with Cursor

To integrate BigRack with Cursor:

bigrack setup-cursor

This configures Cursor to use BigRack as an MCP server. Restart Cursor after configuration.
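
If you prefer to configure Cursor manually, recent Cursor versions read MCP servers from a mcp.json file (~/.cursor/mcp.json for all projects, or .cursor/mcp.json inside a single project); the entry mirrors the Claude Desktop configuration above:

{
  "mcpServers": {
    "bigrack": {
      "command": "npx",
      "args": ["-y", "@bigrack/mcp"]
    }
  }
}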

Verification

Verify your installation:

bigrack --version

You should see the BigRack version and a confirmation that the MCP server is ready.
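
You can also launch the MCP server directly with the same command the editor configurations use; it should start and then wait for an MCP client on standard input, so stop it with Ctrl+C once you have confirmed it launches:

npx -y @bigrack/mcp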

Next Steps

After installation, you can:

  1. Initialize BigRack globally with bigrack init
  2. Create a Repo using the bigrack_create_repo MCP tool in your AI assistant
  3. Add business context with MCP tools in your AI assistant
  4. Create a Project with the bigrack_create_project MCP tool
  5. Decompose features using MCP tools in your AI assistant

See the Quick Start Guide for a detailed tutorial.