
Integrations

HornetHive connects to popular services using the Model Context Protocol (MCP) for enterprise-grade AI agent access. When you connect an integration, your AI agents automatically receive tools to interact with that service in real time.

How It Works

All integrations follow the same four-step flow, sketched below:

  1. OAuth Authentication - Connect your account securely with one click
  2. MCP Provisioning - Automatic tool discovery and agent access (happens in the background)
  3. Document Ingestion - Optional RAG indexing for historical data and semantic search
  4. Real-Time Access - Agents query live data during workflow execution
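As a rough sketch of steps 1-3 from the client side, the flow can be driven over HTTP using the Slack endpoints documented below. The base URL, the workspace_id parameter, and response fields such as auth_url are assumptions, not the exact API:

```python
import requests

BASE = "https://app.hornethive.example"  # assumed base URL

# Step 1: request an OAuth authorization URL for the provider (Slack here)
# and send the user to it. The "auth_url" response field is an assumption.
start = requests.get(f"{BASE}/api/oauth/slack/start", params={"workspace_id": "ws_123"})
start.raise_for_status()
print("Authorize at:", start.json().get("auth_url"))

# After the user approves, the provider redirects back with a one-time code.
# Step 2: exchange the code for tokens; MCP tool provisioning then happens
# automatically in the background.
exchange = requests.post(
    f"{BASE}/api/oauth/slack/exchange",
    json={"workspace_id": "ws_123", "code": "code-from-redirect"},
)
exchange.raise_for_status()

# Step 3 (optional): kick off RAG ingestion of historical conversations.
requests.post(f"{BASE}/api/slack/ingest", json={"workspace_id": "ws_123"}).raise_for_status()
```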

This hybrid approach combines real-time access with historical context:

  • Real-time tools via MCP for current data (read email, create tasks, check calendar)
  • Semantic search via RAG for historical context (past conversations, archived docs, trends)

Example: When HivePilot generates a PRD, it uses MCP tools to check recent Slack discussions (live) and queries the RAG index for past product specs (historical).
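For a sense of how this looks inside an agent workflow, here is an illustrative sketch; the mcp_client and rag_index objects and their method names are hypothetical, not actual HornetHive APIs:

```python
# Hypothetical agent-side helper combining live MCP tools with RAG search.
def gather_prd_context(mcp_client, rag_index, topic: str) -> dict:
    # Live data: call an MCP-provisioned Slack tool for recent discussion.
    recent_slack = mcp_client.call_tool(
        "slack.fetch_messages", {"query": topic, "limit": 50}
    )

    # Historical context: semantic search over previously ingested documents.
    past_specs = rag_index.search(f"product spec {topic}", top_k=5)

    return {"recent_discussion": recent_slack, "related_specs": past_specs}
```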

Enterprise Architecture

Learn more about our MCP-based architecture in the MCP Integration Guide.


Available Integrations

Slack

  • Start OAuth with /api/oauth/slack/start and exchange the code via /api/oauth/slack/exchange.
  • /api/slack/post sends messages while /api/slack/messages fetches history.
  • /api/slack/ingest stores conversations so crews can search Slack threads.
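Once a Slack token is stored, the messaging endpoints can be called directly. A minimal sketch, assuming JSON request bodies and query parameters that may differ from the actual API:

```python
import requests

BASE = "https://app.hornethive.example"  # assumed base URL

# Post a message to a channel (payload fields are assumptions).
requests.post(
    f"{BASE}/api/slack/post",
    json={"workspace_id": "ws_123", "channel": "#product", "text": "PRD draft is ready"},
).raise_for_status()

# Fetch recent history for the same channel (query parameters are assumptions).
history = requests.get(
    f"{BASE}/api/slack/messages",
    params={"workspace_id": "ws_123", "channel": "#product", "limit": 100},
)
history.raise_for_status()
print(history.json())
```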

Google Drive

  • Begin auth with /api/oauth/drive/start.
  • Files are listed with list_files and downloaded with download_file.
  • ingest_files uploads file contents to the RAG index so documents appear in search results.
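A sketch of how an agent might chain these tools, assuming a generic mcp_client.call_tool interface; the argument names are illustrative only:

```python
# Hypothetical use of the Drive tools via an MCP client.
def index_drive_folder(mcp_client, folder_id: str) -> None:
    # List the files in a folder (argument names are assumptions).
    files = mcp_client.call_tool("list_files", {"folder_id": folder_id})

    # Download each file's contents.
    contents = [
        mcp_client.call_tool("download_file", {"file_id": f["id"]}) for f in files
    ]

    # Upload the contents to the RAG index so they appear in search results.
    mcp_client.call_tool("ingest_files", {"files": contents})
```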

Gmail

  • Start with /api/oauth/gmail/start to request read-only access.
  • list_messages retrieves message IDs while fetch_message loads each email body.
  • ingest_messages stores email text as chunks, letting agents reference past conversations.
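The same pattern applies to Gmail; the mcp_client interface and argument names below are assumptions:

```python
# Hypothetical use of the Gmail tools via an MCP client.
def ingest_recent_mail(mcp_client, query: str = "newer_than:30d") -> None:
    # Retrieve matching message IDs (argument names are assumptions).
    message_ids = mcp_client.call_tool("list_messages", {"query": query})

    # Load each email body by ID.
    messages = [
        mcp_client.call_tool("fetch_message", {"message_id": mid}) for mid in message_ids
    ]

    # Store the email text as chunks so agents can reference past conversations.
    mcp_client.call_tool("ingest_messages", {"messages": messages})
```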

Google Calendar

  • Use /api/oauth/google/start for calendar scopes.
  • list_events fetches upcoming events, and ingest_events indexes event descriptions and notes.
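A corresponding sketch for calendar data, with the same caveat that the mcp_client interface and argument names are assumptions:

```python
# Hypothetical use of the Calendar tools via an MCP client.
def ingest_upcoming_events(mcp_client, days_ahead: int = 30) -> None:
    # Fetch upcoming events (argument names are assumptions).
    events = mcp_client.call_tool("list_events", {"days_ahead": days_ahead})

    # Index event descriptions and notes for semantic search.
    mcp_client.call_tool("ingest_events", {"events": events})
```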

Tokens for each integration are stored per workspace, and crew-specific overrides can be managed via the /api/integrations endpoints. OAuth tokens are saved and retrieved through /api/integration-tokens so that automated tasks can call provider APIs without re-authenticating. All ingestion helpers ultimately call the core ingestion_service, ensuring consistent metadata and semantic chunking.
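As a sketch, an automated task might look up a stored token and call the provider directly; the query parameters and the access_token response field of /api/integration-tokens are assumptions:

```python
import requests

BASE = "https://app.hornethive.example"  # assumed base URL

def get_provider_token(workspace_id: str, provider: str) -> str:
    # Look up the stored OAuth token for this workspace and provider
    # (query parameters and the "access_token" field are assumptions).
    resp = requests.get(
        f"{BASE}/api/integration-tokens",
        params={"workspace_id": workspace_id, "provider": provider},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# An automated task can then call the provider API without re-authenticating.
token = get_provider_token("ws_123", "slack")
requests.post(
    "https://slack.com/api/chat.postMessage",
    headers={"Authorization": f"Bearer {token}"},
    json={"channel": "#general", "text": "Nightly report ready"},
).raise_for_status()
```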