9 changes: 9 additions & 0 deletions ai-agents/crew-ai-actions.mdx
@@ -0,0 +1,9 @@
---
title: "AI Agent Actions"
sidebarTitle: "Actions"
description: "Explore the various actions you can perform with your AI agent in CometChat."
---

import Actions from '/snippets/ai-agents/actions.mdx';

<Actions />
142 changes: 142 additions & 0 deletions ai-agents/crew-ai-knowledge-agent.mdx
@@ -0,0 +1,142 @@
---
title: "Build Your Knowledge Agent with CrewAI"
sidebarTitle: "Knowledge Agent"
description: "Create a CrewAI knowledge agent that answers from your docs, streams NDJSON to CometChat, and cites sources."
---

import { Steps, Step } from 'mintlify';

Based on the refreshed [`ai-agent-crew-ai-examples`](https://github.com/cometchat/ai-agent-crew-ai-examples) codebase, here’s how to run the `knowledge_agent` FastAPI service, ingest docs, and stream answers into CometChat.

***

## What you’ll build

- A **CrewAI** agent that grounds replies in your ingested docs (per namespace).
- A **FastAPI** service with ingest/search/generate endpoints plus a `/stream` SSE.
- **CometChat AI Agent** wiring that consumes newline-delimited JSON chunks (`text_start`, `text_delta`, `text_end`, `done`).

***

## Prerequisites

- Python 3.10+ with `pip` (or `uv`)
- `OPENAI_API_KEY` (optionally `OPENAI_BASE_URL`, `KNOWLEDGE_OPENAI_MODEL`, `KNOWLEDGE_EMBEDDING_MODEL`)
- A CometChat app + AI Agent entry

***

## Run the updated sample

<Steps>
<Step title="Install & start">
In <code>ai-agent-crew-ai-examples/</code>:
<pre><code className="language-bash">python3 -m venv .venv
source .venv/bin/activate
pip install -e .
uvicorn knowledge_agent.main:app --host 0.0.0.0 --port 8000 --reload</code></pre>
Environment variables load from a <code>.env</code> file at the repo root or in <code>knowledge_agent/.env</code>.
</Step>
<Step title="Set env">
<code>OPENAI_API_KEY</code> is required. Optional: <code>OPENAI_BASE_URL</code>, <code>KNOWLEDGE_OPENAI_MODEL</code> (default <code>gpt-4o-mini</code>), <code>KNOWLEDGE_EMBEDDING_MODEL</code> (default <code>text-embedding-3-small</code>).
</Step>
<Step title="Storage">
Ingested files land in <code>knowledge_agent/data/knowledge/&lt;namespace&gt;/</code> and embeddings persist to <code>knowledge_agent/data/chroma/&lt;namespace&gt;/</code>. Duplicate hashes are skipped automatically.
</Step>
</Steps>
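As a convenience, the variables above can be collected in a `.env` file. All values below are placeholders; substitute your own:

```bash
# .env — placeholder values
OPENAI_API_KEY=sk-your-key-here
# Optional overrides (defaults shown)
OPENAI_BASE_URL=https://api.openai.com/v1
KNOWLEDGE_OPENAI_MODEL=gpt-4o-mini
KNOWLEDGE_EMBEDDING_MODEL=text-embedding-3-small
```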

***

## API surface (FastAPI)

- `POST /api/tools/ingest` — accept JSON or `multipart/form-data` with `sources` (text/markdown/url) and optional file uploads; returns `saved`, `skipped`, `errors`.
- `POST /api/tools/searchDocs` — semantic search via Chroma; accepts `namespace`, `query`, `max_results`.
- `POST /api/agents/knowledge/generate` — single, non-streaming completion (requires at least one message).
- `POST /stream` — newline-delimited JSON over SSE (`text_start`, `text_delta`, `text_end`, `done`; `error` on failure) ready for CometChat BYOA.
- Validation/behavior: `/ingest` dedupes by content hash (skipping duplicates) and returns HTTP `207` when the response contains both `errors` and `saved` entries; `/stream` rejects empty `messages`.

### Ingest examples

```bash
curl -X POST http://localhost:8000/api/tools/ingest \
-H "Content-Type: application/json" \
-d '{
"namespace": "default",
"sources": [
{ "type": "url", "value": "https://docs.crewai.com/" },
{ "type": "markdown", "title": "Notes", "value": "# CrewAI Rocks" }
]
}'
```

Multipart uploads are also supported:

```bash
curl -X POST http://localhost:8000/api/tools/ingest \
-H "Accept: application/json" \
-F "namespace=default" \
-F "sources=[{\"type\":\"text\",\"value\":\"Hello\"}]" \
-F "files=@/path/to/file.pdf"
```

### Search + answer

```bash
curl -X POST http://localhost:8000/api/tools/searchDocs \
-H "Content-Type: application/json" \
-d '{"namespace":"default","query":"CrewAI agent lifecycle","max_results":4}'
```

```bash
curl -N http://localhost:8000/stream \
-H "Content-Type: application/json" \
-d '{
"thread_id": "thread_1",
"run_id": "run_001",
"messages": [
{ "role": "user", "content": "Summarize the CrewAI agent lifecycle." }
]
}'
```

Streaming payload shape:

```json
{"type":"text_start","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"text_delta","content":"...","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"text_end","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"done","thread_id":"...","run_id":"..."}
# errors (if thrown) look like:
{"type":"error","message":"...","message_id":"...","thread_id":"...","run_id":"..."}
```
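A minimal sketch of how a client might assemble these chunks into a reply; the parsing helper below is illustrative and not part of the sample repo:

```python
import json

def collect_stream_text(lines):
    """Concatenate text_delta chunks from an NDJSON stream into the final reply."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        chunk = json.loads(line)
        if chunk["type"] == "text_delta":
            parts.append(chunk["content"])
        elif chunk["type"] == "error":
            raise RuntimeError(chunk.get("message", "stream error"))
        elif chunk["type"] == "done":
            break
    return "".join(parts)

# Feeding it the payload shape shown above:
sample = [
    '{"type":"text_start","message_id":"m1","thread_id":"t1","run_id":"r1"}',
    '{"type":"text_delta","content":"Hello, ","message_id":"m1","thread_id":"t1","run_id":"r1"}',
    '{"type":"text_delta","content":"world.","message_id":"m1","thread_id":"t1","run_id":"r1"}',
    '{"type":"text_end","message_id":"m1","thread_id":"t1","run_id":"r1"}',
    '{"type":"done","thread_id":"t1","run_id":"r1"}',
]
print(collect_stream_text(sample))  # → Hello, world.
```

In a real client you would iterate over the HTTP response line by line instead of a fixed list.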

***

## Crew internals (for reference)

`knowledge_agent/knowledge_manager.py` builds a search tool per namespace, wired into a CrewAI agent:

```python
search_tool = self._create_search_tool(normalised)
agent = Agent(
role="Knowledge Librarian",
goal="Answer user questions with relevant citations from the knowledge base.",
tools=[search_tool],
llm=model,
)
task = Task(
description="Use search_knowledge_base before answering.\nConversation: {conversation}\nLatest: {question}",
expected_output="A concise, cited answer grounded in ingested docs.",
agent=agent,
)
crew = Crew(agents=[agent], tasks=[task], process=Process.sequential)
```

***

## Wire it to CometChat

- In the CometChat Dashboard, go to **AI Agent → BYO Agents**, then **Get Started / Integrate → Choose CrewAI**. Set **Agent ID** (e.g., `knowledge`) and **Deployment URL** to your public `/stream` endpoint.
- The SSE stream is newline-delimited JSON; CometChat AG-UI adapters parse `text_start`/`text_delta`/`text_end` and stop on `done`. Message IDs, thread IDs, and run IDs are included for threading.
- Use namespaces to keep customer/workspace data separate; pass `namespace` in the payload or inside `tool_params.namespace` (either works; defaults to `default` if omitted).
- Keep secrets server-side; add auth headers on the FastAPI route if needed.
103 changes: 103 additions & 0 deletions ai-agents/crew-ai-product-hunt-agent.mdx
@@ -0,0 +1,103 @@
---
title: "Build a Product Hunt Agent with CrewAI"
sidebarTitle: "Product Hunt Agent"
description: "Create a CrewAI agent that fetches Product Hunt posts, answers launch questions, and can trigger frontend actions like confetti."
---

import { Steps, Step } from 'mintlify';

Refreshed for the latest [`ai-agent-crew-ai-examples`](https://github.com/cometchat/ai-agent-crew-ai-examples) codebase: run the `product_hunt_agent` FastAPI service, wire it to CometChat, and stream SSE updates (with optional confetti actions).

***

## What you’ll build

- A CrewAI agent with tools to **get top posts**, **search launches**, **query timeframes**, and **trigger confetti**.
- A FastAPI `/stream` endpoint emitting newline-delimited JSON in the order `text_start` → `text_delta` chunks → `text_end` → `done` (errors emit `type: "error"`).
- CometChat AI Agent wiring that consumes those SSE chunks; your UI listens for the confetti payload.

***

## Prerequisites

- Python 3.10+ with `pip`
- `OPENAI_API_KEY` (optionally `OPENAI_BASE_URL`, `PRODUCT_OPENAI_MODEL`)
- Optional: `PRODUCTHUNT_API_TOKEN` for live GraphQL data (top/timeframe endpoints return empty lists without it)
- CometChat app + AI Agent entry

***

## Run the updated sample

<Steps>
<Step title="Install & start">
In <code>ai-agent-crew-ai-examples/</code>:
<pre><code className="language-bash">python3 -m venv .venv
source .venv/bin/activate
pip install -e .
uvicorn product_hunt_agent.main:app --host 0.0.0.0 --port 8001 --reload</code></pre>
</Step>
<Step title="Set env">
Required: <code>OPENAI_API_KEY</code>. Optional: <code>PRODUCTHUNT_API_TOKEN</code> (GraphQL), <code>PRODUCTHUNT_DEFAULT_TIMEZONE</code> (default <code>America/New_York</code>).
</Step>
</Steps>

***

## API surface (FastAPI)

- `GET /api/top` — top posts by votes (`limit` 1–10).
- `GET /api/top-week` — rolling window (default 7 days) with `limit` and `days`.
- `GET /api/top-range` — timeframe queries (`timeframe`, `tz`, `limit`); supports `"today"`, `"yesterday"`, `"last_week"`, `"last_month"`, or ISO dates.
- `GET /api/search` — Algolia search (`q`, `limit`).
- `POST /api/chat` — non-streaming CrewAI answer.
- `POST /stream` — SSE stream (`text_start`, `text_delta`, `text_end`, `done`) ready for CometChat.
- `POST /api/chat` payload: `{"message": "…", "messages": [{ "role": "user", "content": "…" }]}` (array is required if `message` is omitted).
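A small client-side helper for composing the timeframe GET request; the query-parameter names mirror the list above, while the function itself (and the base URL) is illustrative:

```python
from urllib.parse import urlencode

BASE = "http://localhost:8001"  # adjust for your deployment

def top_range_url(timeframe: str, tz: str = "America/New_York", limit: int = 5) -> str:
    """Build the /api/top-range URL with its documented query parameters."""
    query = urlencode({"timeframe": timeframe, "tz": tz, "limit": limit})
    return f"{BASE}/api/top-range?{query}"

print(top_range_url("last_week"))
# → http://localhost:8001/api/top-range?timeframe=last_week&tz=America%2FNew_York&limit=5
```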

### Streaming example

```bash
curl -N http://localhost:8001/stream \
-H "Content-Type: application/json" \
-d '{
"messages": [
{ "role": "user", "content": "What were the top launches last week?" }
]
}'
```

Streaming payload shape:

```json
{"type":"text_start","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"text_delta","content":"...","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"text_end","message_id":"...","thread_id":"...","run_id":"..."}
{"type":"done","thread_id":"...","run_id":"..."}
# errors (when thrown) look like:
{"type":"error","message":"...","message_id":"...","thread_id":"...","run_id":"..."}
```

***

## Crew internals (for reference)

Key tools in `product_hunt_agent/agent_builder.py`:

```python
@tool("getTopProducts") # votes-ranked, clamps limit 1-10
@tool("getTopProductsThisWeek") # rolling-week window, clamps days 1-31 and limit 1-10
@tool("getTopProductsByTimeframe") # "today", "yesterday", "last_week", ISO, ranges; clamps limit 1-10
@tool("searchProducts") # Algolia search (no token needed)
@tool("triggerConfetti") # returns payload: colors, particleCount, spread, startVelocity, origin, ticks, disableSound
```

All tools run server-side; if `PRODUCTHUNT_API_TOKEN` is missing, top/timeframe queries return empty arrays but still respond cleanly (search still works via Algolia defaults).
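The clamping noted in the tool comments above amounts to a one-liner; a sketch (the helper name is an assumption, not taken from the repo):

```python
def clamp(value: int, lo: int, hi: int) -> int:
    """Constrain a request parameter to the tool's accepted range."""
    return max(lo, min(hi, value))

print(clamp(25, 1, 10))  # → 10  (limit capped at the documented maximum)
print(clamp(0, 1, 10))   # → 1   (limit raised to the documented minimum)
```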

***

## Wire it to CometChat

- In the CometChat Dashboard, go to **AI Agent → BYO Agents**, then **Get Started / Integrate → Choose CrewAI**. Set **Agent ID** (e.g., `product_hunt`) and **Deployment URL** to your public `/stream` endpoint.
- Listen for `text_start`/`text_delta`/`text_end` to render streaming text; stop on `done`.
- When `triggerConfetti` returns, map the payload to your UI handler (Widget/React UI Kit). Keep API tokens server-side.
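For reference, a `triggerConfetti` payload carries the fields listed earlier. An illustrative shape is shown below; the specific values are placeholders, not the agent's actual defaults:

```python
# Hypothetical payload — field names match the triggerConfetti tool, values are made up
confetti_payload = {
    "colors": ["#ff595e", "#1982c4", "#8ac926"],
    "particleCount": 120,
    "spread": 70,
    "startVelocity": 35,
    "origin": {"x": 0.5, "y": 0.6},
    "ticks": 200,
    "disableSound": True,
}
# A UI handler would pass these straight through to its confetti renderer.
print(sorted(confetti_payload))
```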
9 changes: 9 additions & 0 deletions ai-agents/crew-ai-tools.mdx
@@ -0,0 +1,9 @@
---
title: "AI Agent Tools"
sidebarTitle: "Tools"
description: "Explore the various tools you can use with your AI agent in CometChat."
---

import Tools from '/snippets/ai-agents/tools.mdx';

<Tools />