@@ -67,6 +67,21 @@ See: `task_context_example.py`, `worker_example.py`
 
 ---
 
+### AI/LLM Workflows
+
+See [agentic_workflows/](agentic_workflows/) for the full set of AI agent examples.
+
+| File | Description | Run |
+|------|-------------|-----|
+| **agentic_workflows/llm_chat.py** | Automated multi-turn LLM chat | `python examples/agentic_workflows/llm_chat.py` |
+| **agentic_workflows/llm_chat_human_in_loop.py** | Interactive chat with WAIT task pauses | `python examples/agentic_workflows/llm_chat_human_in_loop.py` |
+| **agentic_workflows/multiagent_chat.py** | Multi-agent debate with moderator routing | `python examples/agentic_workflows/multiagent_chat.py` |
+| **agentic_workflows/function_calling_example.py** | LLM picks Python functions to call | `python examples/agentic_workflows/function_calling_example.py` |
+| **agentic_workflows/mcp_weather_agent.py** | AI agent with MCP tool calling | `python examples/agentic_workflows/mcp_weather_agent.py "What's the weather?"` |
+| **rag_workflow.py** | RAG pipeline: markitdown, pgvector, search, answer | `python examples/rag_workflow.py file.pdf "question"` |
+
+---
+
 ### Monitoring
 
 | File | Description | Run |
@@ -174,6 +189,65 @@ python examples/prompt_journey.py
 
 ---
 
+### RAG Pipeline Setup
+
+Complete RAG (Retrieval-Augmented Generation) pipeline example:
+
+```bash
+# 1. Install dependencies
+pip install conductor-python "markitdown[pdf]"
+
+# 2. Configure (requires Orkes Conductor with AI/LLM support)
+#    - Vector DB integration named "postgres-prod" (pgvector)
+#    - LLM provider named "openai" with a valid API key
+export CONDUCTOR_SERVER_URL="http://localhost:7001/api"
+
+# 3. Run RAG workflow
+python examples/rag_workflow.py examples/goog-20251231.pdf "What were Google's total revenues?"
+```
+
+**Pipeline:** `convert_to_markdown` → `LLM_INDEX_TEXT` → `WAIT` → `LLM_SEARCH_INDEX` → `LLM_CHAT_COMPLETE`
+
+**Features:**
+- Document conversion (PDF, Word, Excel → Markdown via [markitdown](https://github.com/microsoft/markitdown))
+- Vector database ingestion into pgvector with OpenAI `text-embedding-3-small` embeddings
+- Semantic search with configurable result count
+- Context-aware answer generation with `gpt-4o-mini`
+
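The pipeline stages above can be sketched end-to-end in plain Python. Every function below is a hypothetical stand-in for the corresponding Conductor task (`LLM_INDEX_TEXT`, `LLM_SEARCH_INDEX`, `LLM_CHAT_COMPLETE`), not the real SDK API — a real run embeds chunks and searches them in pgvector rather than ranking by word overlap:

```python
# Hypothetical sketch of the rag_workflow.py stages; every function here is an
# illustrative stand-in, NOT the conductor-python / Orkes task API.

def convert_to_markdown(path):
    # Stand-in for the markitdown conversion step.
    return f"# Document: {path}\nGoogle's total revenues were $350B."

def index_text(store, doc_id, text):
    # Stand-in for LLM_INDEX_TEXT: split into chunks and store them.
    for i, chunk in enumerate(text.split("\n")):
        store[f"{doc_id}-{i}"] = chunk

def search_index(store, query, top_k=1):
    # Stand-in for LLM_SEARCH_INDEX: rank chunks by word overlap
    # (a real run uses pgvector similarity over embeddings).
    words = set(query.lower().split())
    ranked = sorted(store.values(),
                    key=lambda c: len(words & set(c.lower().split())),
                    reverse=True)
    return ranked[:top_k]

def chat_complete(question, context_chunks):
    # Stand-in for LLM_CHAT_COMPLETE: answer from retrieved context.
    return f"Q: {question} A (from context): {context_chunks[0]}"

store = {}
markdown = convert_to_markdown("examples/goog-20251231.pdf")
index_text(store, "goog", markdown)
hits = search_index(store, "What were Google's total revenues?")
reply = chat_complete("What were Google's total revenues?", hits)
print(reply)
```

The `WAIT` task between indexing and search has no local analogue here; in the real workflow it gives the vector database time to finish ingesting before the search fires.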
+---
+
+### MCP Tool Integration Setup
+
+MCP (Model Context Protocol) agent example:
+
+```bash
+# 1. Install MCP weather server
+pip install mcp-weather-server
+
+# 2. Start MCP server
+python3 -m mcp_weather_server \
+    --mode streamable-http \
+    --host localhost \
+    --port 3001 \
+    --stateless
+
+# 3. Run AI agent
+export OPENAI_API_KEY="your-key"
+export ANTHROPIC_API_KEY="your-key"
+python examples/agentic_workflows/mcp_weather_agent.py "What's the weather in Tokyo?"
+
+# Or simple mode (direct tool call):
+python examples/agentic_workflows/mcp_weather_agent.py "Temperature in New York" --simple
+```
+
+**Features:**
+- MCP tool discovery
+- LLM-based planning (agent decides which tool to use)
+- Tool execution via HTTP/Streamable transport
+- Natural language response generation
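The discover → plan → execute → respond loop in that feature list can be sketched with local stand-ins. The `get_weather` function and the keyword-based planner below are hypothetical substitutes for the MCP server and the LLM; they only illustrate the control flow:

```python
# Hypothetical sketch of the mcp_weather_agent.py loop; the tool function and
# keyword planner are local stand-ins for the MCP server and the LLM.

def get_weather(city):
    # Stand-in for the MCP weather tool (normally invoked over
    # HTTP/Streamable transport).
    return {"city": city, "temp_c": 21, "conditions": "clear"}

TOOLS = {"get_weather": get_weather}  # stand-in for MCP tool discovery

def plan(question):
    # Stand-in for LLM planning: choose a tool and its arguments.
    q = question.lower()
    if "weather" in q or "temperature" in q:
        return "get_weather", {"city": question.rstrip("?").split()[-1]}
    return None, {}

def run_agent(question):
    tool, args = plan(question)
    if tool is None:
        return "No suitable tool found."
    result = TOOLS[tool](**args)  # tool execution step
    # Stand-in for LLM response generation from the tool result.
    return f"It is {result['temp_c']}°C and {result['conditions']} in {result['city']}."

reply = run_agent("What's the weather in Tokyo?")
print(reply)
```

In the real agent the planner is an LLM call and the tool list comes from MCP discovery, so new tools become usable without changing the loop.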
+
+---
+
 ## 🎓 Learning Path (60-Second Guide)
 
 ```bash
@@ -189,7 +263,11 @@ python examples/worker_configuration_example.py
 # 4. Workflows (10 min)
 python examples/dynamic_workflow.py
 
-# 5. Monitoring (5 min)
+# 5. AI/LLM Workflows (15 min)
+python examples/agentic_workflows/llm_chat.py
+python examples/rag_workflow.py examples/goog-20251231.pdf "What were Google's total revenues?"
+
+# 6. Monitoring (5 min)
 python examples/metrics_example.py
 curl http://localhost:8000/metrics
 ```
@@ -214,6 +292,15 @@ examples/
 │   ├── workflow_status_listner.py      # Workflow events
 │   └── test_workflows.py               # Unit tests
 │
+├── AI/LLM Workflows
+│   ├── rag_workflow.py                 # RAG pipeline (markitdown + pgvector)
+│   └── agentic_workflows/              # Agentic AI examples
+│       ├── llm_chat.py                 # Multi-turn LLM chat
+│       ├── llm_chat_human_in_loop.py   # Interactive chat with WAIT
+│       ├── multiagent_chat.py          # Multi-agent debate
+│       ├── function_calling_example.py # LLM function calling
+│       └── mcp_weather_agent.py        # MCP tool calling agent
+│
 ├── Monitoring
 │   ├── metrics_example.py              # Prometheus metrics
 │   ├── event_listener_examples.py      # Custom listeners
@@ -245,14 +332,11 @@ examples/
 │   └── other_workers/
 │
 └── orkes/                              # Orkes-specific features
-    ├── ai_orchestration/               # AI/LLM integration
-    │   ├── open_ai_chat_gpt.py
-    │   ├── open_ai_function_example.py
-    │   └── vector_db_helloworld.py
-    └── workers/                        # Advanced patterns
-        ├── http_poll.py
-        ├── sync_updates.py
-        └── wait_for_webhook.py
+    ├── vector_db_helloworld.py         # Vector DB operations
+    ├── agentic_workflow.py             # AI agent (AIOrchestrator)
+    ├── http_poll.py
+    ├── sync_updates.py
+    └── wait_for_webhook.py
 ```
 
 ---