Introduction

Building the Sekora GitLab MCP server was an ambitious project: 71 specialized tools providing comprehensive GitLab integration for AI assistants. This post documents our development journey, the challenges we faced, and how Claude Code proved to be an invaluable development partner throughout the process.

Project Scope and Planning

The Challenge

Creating a comprehensive MCP server that covers GitLab’s extensive API surface area required:

  • 71 distinct tools across multiple functional areas
  • Type-safe TypeScript implementation
  • Robust error handling for network and API failures
  • Comprehensive testing for reliability
  • Clear documentation for adoption

Initial Architecture Decisions

We made several key architectural decisions early in the project:

// Core architecture pattern
interface MCPTool {
  name: string;
  description: string;
  inputSchema: JSONSchema;
  handler: (params: any) => Promise<any>;
}

// Tool categorization
const TOOL_CATEGORIES = {
  issues: ['create_issue', 'update_issue', 'list_issues'],
  pipelines: ['get_pipeline', 'retry_pipeline', 'cancel_pipeline'],
  merge_requests: ['create_mr', 'approve_mr', 'merge_mr'],
  // ... 62 more tools across the remaining categories
};

Development Process with Claude Code

Setting Up the Development Environment

Claude Code excelled at helping establish the initial project structure:

# Project initialization
npm init -y
npm install @modelcontextprotocol/sdk
npm install -D typescript @types/node jest

# TypeScript configuration
npx tsc --init

The AI assistant helped configure optimal TypeScript settings for MCP development:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": ["ES2020"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  }
}

Tool Implementation Strategy

We adopted a systematic approach to implementing the 71 tools:

1. Core Infrastructure First

// Base GitLab client
class GitLabClient {
  constructor(private token: string, private baseUrl: string = 'https://gitlab.com/api/v4') {}
  
  async request<T>(endpoint: string, options: RequestOptions = {}): Promise<T> {
    // Comprehensive error handling
    // Rate limiting
    // Authentication
  }
}
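
For illustration, here is a minimal sketch of what the request body might look like, assuming Node 18+ global fetch, a RequestOptions shape with optional method, headers, and body fields, and the GitLabAPIError type consumed by the error-handling layer described later; the production version layers rate limiting and caching on top:

// Minimal sketch of GitLabClient.request (assumptions noted above)
async request<T>(endpoint: string, options: RequestOptions = {}): Promise<T> {
  const response = await fetch(`${this.baseUrl}/${endpoint}`, {
    method: options.method ?? 'GET',
    headers: {
      'PRIVATE-TOKEN': this.token, // GitLab personal access token header
      'Content-Type': 'application/json',
      ...options.headers
    },
    body: options.body !== undefined ? JSON.stringify(options.body) : undefined
  });

  if (!response.ok) {
    // GitLabAPIError is the error type handled by handleToolError below
    throw new GitLabAPIError(await response.text(), response.status);
  }
  return (await response.json()) as T;
}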

2. Tool Factory Pattern

function createGitLabTool(config: ToolConfig): MCPTool {
  return {
    name: config.name,
    description: config.description,
    inputSchema: config.schema,
    handler: async (params) => {
      try {
        return await config.implementation(params);
      } catch (error) {
        return handleToolError(error, config.name);
      }
    }
  };
}
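
To make the pattern concrete, here is a hedged example of registering one of the issue tools through the factory; the schema and implementation details are illustrative rather than the production code:

// Illustrative use of the factory for the list_issues tool
const listIssuesTool = createGitLabTool({
  name: 'list_issues',
  description: 'List issues for a project',
  schema: {
    type: 'object',
    properties: {
      project_id: { type: 'string', description: 'Project ID or path' },
      state: { type: 'string', enum: ['opened', 'closed', 'all'] }
    },
    required: ['project_id']
  },
  implementation: async ({ project_id, state }) =>
    gitlabClient.request<GitLabIssue[]>(
      `projects/${encodeURIComponent(project_id)}/issues?state=${state ?? 'all'}`
    )
});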

3. Category-Based Development

We organized development by GitLab feature areas:

Issues Management (12 tools)

  • create_issue, update_issue, close_issue
  • list_issues, search_issues, get_issue
  • add_issue_comment, update_issue_comment
  • assign_issue, unassign_issue
  • add_issue_labels, remove_issue_labels

Pipeline Management (15 tools)

  • get_pipeline, list_pipelines, retry_pipeline
  • cancel_pipeline, get_pipeline_jobs
  • retry_job, cancel_job, get_job_log
  • And 7 more pipeline-related tools…

Technical Challenges and Solutions

Challenge 1: GitLab API Complexity

Problem: GitLab’s API has numerous endpoints with complex parameter structures.

Solution: We created a comprehensive type system:

interface GitLabIssue {
  id: number;
  title: string;
  description: string;
  state: 'opened' | 'closed';
  labels: string[];
  assignees: GitLabUser[];
  milestone?: GitLabMilestone;
  // ... 20+ more fields
}

// JSON Schema generation from zod schemas that mirror the TypeScript types
const issueSchema = zodToJsonSchema(GitLabIssueSchema);
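
The GitLabIssueSchema passed to zodToJsonSchema is a zod schema kept in step with the GitLabIssue interface. A minimal sketch of what it might look like, assuming the zod and zod-to-json-schema packages and an abbreviated field set:

import { z } from 'zod';
import { zodToJsonSchema } from 'zod-to-json-schema';

// Zod schema mirroring the GitLabIssue interface (fields abbreviated)
const GitLabIssueSchema = z.object({
  id: z.number(),
  title: z.string(),
  description: z.string(),
  state: z.enum(['opened', 'closed']),
  labels: z.array(z.string())
});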

Challenge 2: Error Handling Across 71 Tools

Problem: Each tool needed robust error handling for different failure modes.

Solution: Centralized error handling with context:

class MCPError extends Error {
  constructor(
    message: string,
    public code: string,
    public tool: string,
    public context?: any
  ) {
    super(message);
  }
}

function handleToolError(error: unknown, toolName: string) {
  if (error instanceof GitLabAPIError) {
    return new MCPError(
      `GitLab API error: ${error.message}`,
      'GITLAB_API_ERROR',
      toolName,
      { status: error.status, response: error.response }
    );
  }
  
  // Handle network, authentication, and other errors...
}

Challenge 3: Rate Limiting and Performance

Problem: GitLab has strict rate limits, and some operations require multiple API calls.

Solution: Intelligent request batching and caching:

class RateLimitedClient {
  private requestQueue: RequestQueue = new RequestQueue();
  private cache: Map<string, CacheEntry> = new Map();
  
  async request<T>(endpoint: string, options: RequestOptions = {}): Promise<T> {
    // Check cache first
    const cached = this.getCached(endpoint, options);
    if (cached) return cached;
    
    // Queue request with rate limiting
    return this.requestQueue.add(() => this.makeRequest(endpoint, options));
  }
}
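
The RequestQueue helper above serializes outgoing calls so requests are spaced out. A minimal sketch, assuming a fixed delay between requests rather than reading GitLab's rate-limit response headers:

// Minimal request queue: runs queued calls sequentially with a fixed delay.
// A production version would typically honour GitLab's rate-limit headers.
class RequestQueue {
  private queue: Promise<unknown> = Promise.resolve();

  constructor(private delayMs: number = 100) {}

  add<T>(task: () => Promise<T>): Promise<T> {
    const next = this.queue.then(async () => {
      const result = await task();
      await new Promise((resolve) => setTimeout(resolve, this.delayMs));
      return result;
    });
    // Keep the chain alive even if a task rejects
    this.queue = next.catch(() => undefined);
    return next;
  }
}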

Challenge 4: Testing 71 Different Tools

Problem: Covering all 71 tools thoroughly without the test suite becoming unmaintainable.

Solution: Test generation and mock strategies:

// Test factory for GitLab tools
describe.each(GITLAB_TOOLS)('Tool: %s', (toolName) => {
  it('should handle successful responses', async () => {
    const mockResponse = generateMockResponse(toolName);
    mockGitLabAPI.setup(toolName, mockResponse);
    
    const result = await executeTool(toolName, getMockParams(toolName));
    expect(result).toMatchSnapshot();
  });
  
  it('should handle API errors gracefully', async () => {
    mockGitLabAPI.setupError(toolName, new GitLabAPIError('Not found', 404));
    
    const result = await executeTool(toolName, getMockParams(toolName));
    expect(result.error).toBeDefined();
  });
});
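
The getMockParams and generateMockResponse helpers used by the generated tests are simple fixture lookups; a hypothetical sketch (fixture values are placeholders):

// Hypothetical fixtures backing the generated tests
const MOCK_PARAMS: Record<string, object> = {
  create_issue: { project_id: 'test/project', title: 'Test Issue' },
  get_pipeline: { project_id: 'test/project', pipeline_id: 42 }
  // ...one entry per tool
};

function getMockParams(toolName: string): object {
  return MOCK_PARAMS[toolName] ?? { project_id: 'test/project' };
}

function generateMockResponse(toolName: string): object {
  // A generic envelope is enough for snapshot tests; individual tools
  // can override this with richer fixtures where needed.
  return { id: 1, tool: toolName };
}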

Claude Code’s Role in Development

Code Generation and Scaffolding

Claude Code was particularly effective at:

  1. Generating tool implementations from GitLab API documentation
  2. Creating consistent error handling patterns across all tools
  3. Writing comprehensive tests with proper mocking
  4. Maintaining code style consistency across the large codebase

Example: Tool Generation Process

When implementing a new tool, Claude Code would:

// 1. Analyze GitLab API endpoint
const apiEndpoint = 'GET /projects/:id/issues/:issue_iid/notes';

// 2. Generate TypeScript interface
interface IssueNote {
  id: number;
  body: string;
  author: GitLabUser;
  created_at: string;
  updated_at: string;
  system: boolean;
}

// 3. Create tool implementation
export const getIssueComments: MCPTool = {
  name: 'get_issue_comments',
  description: 'Get all comments/notes for a specific issue',
  inputSchema: {
    type: 'object',
    properties: {
      project_id: { type: 'string', description: 'Project ID or path' },
      issue_iid: { type: 'number', description: 'Issue internal ID' }
    },
    required: ['project_id', 'issue_iid']
  },
  handler: async ({ project_id, issue_iid }) => {
    return await gitlabClient.request<IssueNote[]>(
      `projects/${encodeURIComponent(project_id)}/issues/${issue_iid}/notes`
    );
  }
};

Performance Optimizations

Lazy Loading Strategy

class MCPServer {
  private toolsCache: Map<string, MCPTool> = new Map();
  
  getTool(name: string): MCPTool {
    if (!this.toolsCache.has(name)) {
      this.toolsCache.set(name, this.loadTool(name));
    }
    return this.toolsCache.get(name)!;
  }
  
  private loadTool(name: string): MCPTool {
    // Dynamic tool loading to reduce memory footprint
    return require(`./tools/${name}`).default;
  }
}

Request Optimization

// Batch related requests
async function getIssueWithDetails(projectId: string, issueIid: number) {
  const [issue, comments, events] = await Promise.all([
    gitlabClient.getIssue(projectId, issueIid),
    gitlabClient.getIssueComments(projectId, issueIid),
    gitlabClient.getIssueEvents(projectId, issueIid)
  ]);
  
  return { issue, comments, events };
}

Lessons Learned

1. API Design Consistency is Crucial

Maintaining consistent parameter naming and response formats across 71 tools required careful planning:

// Consistent parameter patterns
interface BaseToolParams {
  project_id: string; // Always string, supports both ID and path
}

interface PaginatedToolParams extends BaseToolParams {
  page?: number;
  per_page?: number;
  order_by?: string;
  sort?: 'asc' | 'desc';
}
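
Individual tool parameter types then extend these bases; for example (illustrative, not the exact production interfaces):

// Illustrative parameter types built on the shared bases
interface ListIssuesParams extends PaginatedToolParams {
  state?: 'opened' | 'closed' | 'all';
  labels?: string[];
}

interface GetIssueParams extends BaseToolParams {
  issue_iid: number;
}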

2. Documentation as Code

We generated API documentation directly from TypeScript types:

// Tool metadata for documentation
const toolMetadata = {
  categories: extractCategories(GITLAB_TOOLS),
  schemas: generateSchemas(GITLAB_TOOLS),
  examples: generateExamples(GITLAB_TOOLS)
};
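
From that metadata, each tool can be rendered into a documentation section. A hedged sketch, where renderToolDoc and the ALL_TOOLS registry of MCPTool objects are illustrative names rather than the actual implementation:

// Sketch: render one documentation section per tool from its metadata.
// ALL_TOOLS is assumed to be an array of the MCPTool objects defined
// by the core interface; renderToolDoc is an illustrative helper name.
function renderToolDoc(tool: MCPTool): string {
  const schema = JSON.stringify(tool.inputSchema, null, 2);
  return `### ${tool.name}\n\n${tool.description}\n\nParameters:\n\n${schema}\n`;
}

const docsPage = ALL_TOOLS.map(renderToolDoc).join('\n');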

3. Error Context is Everything

Providing rich error context proved essential for debugging:

function enrichError(error: MCPError, context: ToolContext) {
  return {
    ...error,
    timestamp: new Date().toISOString(),
    tool: context.toolName,
    parameters: context.parameters,
    gitlabUrl: context.client.baseUrl,
    requestId: context.requestId
  };
}

Development Timeline

  • Week 1: Project setup, core infrastructure, first 10 tools
  • Week 2: Issues and merge requests management (25 tools total)
  • Week 3: Pipeline and jobs management (45 tools total)
  • Week 4: Repository and project administration (65 tools total)
  • Week 5: Security, environments, final 6 tools, testing
  • Week 6: Documentation, CI/CD setup, NPM publishing

Testing Strategy

Unit Tests

describe('GitLab MCP Tools', () => {
  beforeEach(() => {
    mockGitLabClient.reset();
  });
  
  describe('Issue Management', () => {
    it('creates issue with all parameters', async () => {
      const params = {
        project_id: 'test/project',
        title: 'Test Issue',
        description: 'Test description',
        labels: ['bug', 'urgent']
      };
      
      mockGitLabClient.post.mockResolvedValue({ id: 123, ...params });
      
      const result = await tools.create_issue.handler(params);
      expect(result.id).toBe(123);
    });
  });
});

Integration Tests

describe('GitLab Integration', () => {
  it('handles complete issue workflow', async () => {
    const issue = await createIssue({ title: 'Integration test' });
    await addComment(issue.project_id, issue.iid, 'Test comment');
    await updateIssue(issue.project_id, issue.iid, { state: 'closed' });
    
    const finalIssue = await getIssue(issue.project_id, issue.iid);
    expect(finalIssue.state).toBe('closed');
  });
});

Deployment and CI/CD

Automated Testing Pipeline

# .gitlab-ci.yml
test:
  image: node:18
  script:
    - npm ci
    - npm run test:unit
    - npm run test:integration
    - npm run lint
    - npm run type-check
  coverage: '/Lines\s*:\s*(\d+\.\d+)%/'

publish:
  stage: deploy
  script:
    - npm run build
    - npm publish --access public
  only:
    - tags

Conclusion

Building the Sekora GitLab MCP server was a complex undertaking that required careful architecture, systematic development, and robust testing. Claude Code proved invaluable throughout the process, helping with:

  • Code generation and scaffolding
  • Error handling patterns
  • Test creation and maintenance
  • Documentation generation
  • Performance optimization

The result is a comprehensive, production-ready MCP server that provides AI assistants with powerful GitLab integration capabilities. The modular architecture and systematic approach we developed will serve as a template for future MCP server projects.

Key Takeaways

  1. Start with solid architecture - The tool factory pattern and centralized error handling scaled beautifully
  2. Leverage AI for consistency - Claude Code helped maintain patterns across 71 different implementations
  3. Test early and often - Our test generation approach caught numerous edge cases
  4. Document as you build - TypeScript types doubled as API documentation
  5. Performance matters - Rate limiting and caching were essential for production use

The complete source code and detailed implementation guides are available in our GitLab repository.


This development journey showcases how AI-assisted development can tackle complex integration projects while maintaining high code quality and comprehensive testing.