backend-infra-engineer: Release v0.3.3 snapshot

This commit is contained in:
scawful
2025-11-21 21:35:50 -05:00
parent 3d71417f62
commit 476dd1cd1c
818 changed files with 65706 additions and 35514 deletions

@@ -0,0 +1,85 @@
# AI API & Agentic Workflow Enhancement - Phase 2 Handoff
**Date**: 2025-11-20
**Status**: Phase 2 Implementation Complete
**Previous Plan**: `docs/internal/AI_API_ENHANCEMENT_HANDOFF.md`
## Overview
This handoff covers the completion of Phase 2, which focused on unifying the UI for model selection and implementing the initial HTTP API server foundation. The codebase is now ready for building and verifying the API endpoints.
## Completed Work
### 1. UI Unification (`src/app/editor/agent/agent_chat_widget.cc`)
- **Unified Model List**: Replaced the separate Ollama/Gemini list logic with a single, unified list derived from `ModelRegistry`.
- **Provider Badges**: Models in the list now display their provider (e.g., `[ollama]`, `[gemini]`).
- **Contextual Configuration**:
  - If an **Ollama** model is selected, the "Ollama Host" input is displayed.
  - If a **Gemini** model is selected, the "API Key" input is displayed.
- **Favorites & Presets**: Updated to work with the unified `ModelInfo` structure.
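A short sketch of how the unified list and contextual inputs fit together is below. It is illustrative only: it assumes Dear ImGui widgets and uses a simplified `ModelEntry` stand-in for the real `ModelInfo`/`ModelRegistry` types, whose fields and names may differ.
```cpp
#include <string>
#include <vector>
#include "imgui.h"

// Sketch only: illustrative names, not the exact agent_chat_widget.cc API.
struct ModelEntry {
  std::string name;      // e.g. "llama3"
  std::string provider;  // "ollama" or "gemini"
};

void DrawModelPicker(const std::vector<ModelEntry>& models, int& selected) {
  // Single unified list; every entry carries a provider badge, e.g. "llama3 [ollama]".
  std::string preview = models.empty()
      ? "<no models>"
      : models[selected].name + " [" + models[selected].provider + "]";
  if (ImGui::BeginCombo("Model", preview.c_str())) {
    for (int i = 0; i < static_cast<int>(models.size()); ++i) {
      std::string label = models[i].name + " [" + models[i].provider + "]";
      if (ImGui::Selectable(label.c_str(), i == selected)) selected = i;
    }
    ImGui::EndCombo();
  }

  // Contextual configuration: only the input relevant to the selected provider is shown.
  static char ollama_host[256] = "http://localhost:11434";
  static char gemini_key[128] = "";
  if (!models.empty() && models[selected].provider == "ollama") {
    ImGui::InputText("Ollama Host", ollama_host, sizeof(ollama_host));
  } else if (!models.empty() && models[selected].provider == "gemini") {
    ImGui::InputText("API Key", gemini_key, sizeof(gemini_key),
                     ImGuiInputTextFlags_Password);
  }
}
```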
### 2. HTTP Server Implementation (`src/cli/service/api/`)
- **`HttpServer` Class** (see the sketch after this list):
  - Wraps `httplib::Server` running in a background `std::thread`.
  - Exposes `Start(port)` and `Stop()` methods.
  - Handles graceful shutdown.
- **API Handlers**:
  - `GET /api/v1/health`: Returns server status (JSON).
  - `GET /api/v1/models`: Returns list of available models from `ModelRegistry`.
- **Integration**:
  - Updated `src/cli/agent.cmake` to include `http_server.cc`, `api_handlers.cc`, and `model_registry.cc`.
  - Updated `src/app/main.cc` to accept `--enable_api` and `--api_port` flags.
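A minimal sketch of the server shape described above, assuming cpp-httplib's `httplib::Server`. The real handler bodies live in `api_handlers.cc` and build their payloads from `ModelRegistry`; the hard-coded JSON below is only a placeholder.
```cpp
#include <thread>
#include "httplib.h"  // cpp-httplib

class HttpServer {
 public:
  // Registers the handlers and starts listening on a background thread.
  void Start(int port) {
    server_.Get("/api/v1/health",
                [](const httplib::Request&, httplib::Response& res) {
      res.set_content(R"({"status":"ok","version":"1.0"})", "application/json");
    });
    server_.Get("/api/v1/models",
                [](const httplib::Request&, httplib::Response& res) {
      // The real handler builds this payload from ModelRegistry.
      res.set_content(R"({"models":[],"count":0})", "application/json");
    });
    thread_ = std::thread([this, port] { server_.listen("0.0.0.0", port); });
  }

  // Graceful shutdown: stop the listener, then join the background thread.
  void Stop() {
    server_.stop();
    if (thread_.joinable()) thread_.join();
  }

 private:
  httplib::Server server_;
  std::thread thread_;
};
```
In `main.cc`, the server is only constructed and started when `--enable_api` is passed, using the port given by `--api_port`.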
## Build & Test Instructions
### 1. Building
The project uses CMake. The new files are automatically included in the `yaze_agent` library via `src/cli/agent.cmake`.
```bash
# Generate build files (if not already done)
cmake -B build -G Ninja
# Build the main application
cmake --build build --target yaze_app
```
### 2. Testing the UI
1. Launch the editor:
```bash
./build/yaze_app --editor=Agent
```
2. Verify the **Model Configuration** panel:
- You should see a single list of models.
- Try searching for a model.
- Select an Ollama model -> Verify the "Ollama Host" input appears.
- Select a Gemini model -> Verify the "API Key" input appears.
### 3. Testing the API
1. Launch the editor with API enabled:
```bash
./build/yaze_app --enable_api --api_port=8080
```
*(Check logs for "Starting API server on port 8080")*
2. Test Health Endpoint:
```bash
curl -v http://localhost:8080/api/v1/health
# Expected: {"status":"ok", "version":"1.0", ...}
```
3. Test Models Endpoint:
```bash
curl -v http://localhost:8080/api/v1/models
# Expected: {"models": [{"name": "...", "provider": "..."}], "count": ...}
```
## Next Steps (Phase 3 & 4)
### Phase 3: Tool Expansion
- **FileSystemTool**: Implement safe file read/write operations (`src/cli/handlers/tools/filesystem_commands.h`); a read-side sketch follows this list.
- **BuildTool**: Implement cmake/ninja triggers.
- **Editor Integration**: Inject editor state (open files, errors) into the agent context.
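A possible shape for the read side of FileSystemTool, as a hypothetical sketch: the class, method names, and error handling below are placeholders, not the final `filesystem_commands.h` interface.
```cpp
#include <filesystem>
#include <fstream>
#include <sstream>
#include <string>

// Hypothetical sketch: confines reads to a sandbox root so the agent cannot
// escape the project directory via ".." segments or absolute paths.
class FileSystemTool {
 public:
  explicit FileSystemTool(std::filesystem::path root)
      : root_(std::filesystem::weakly_canonical(root)) {}

  // Returns file contents, or an empty string if the path resolves outside
  // the sandbox or cannot be read. A real implementation would return a
  // proper status instead of an empty string.
  std::string ReadFile(const std::string& relative_path) const {
    auto full = std::filesystem::weakly_canonical(root_ / relative_path);
    // Reject paths outside the sandbox. A production version should compare
    // path components rather than a raw string prefix.
    if (full.string().rfind(root_.string(), 0) != 0) return "";
    std::ifstream in(full);
    if (!in) return "";
    std::ostringstream contents;
    contents << in.rdbuf();
    return contents.str();
  }

 private:
  std::filesystem::path root_;
};
```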
### Phase 4: Structured Output
- Refactor `ToolDispatcher` to return JSON objects instead of capturing stdout strings.
- Update API to expose a `/api/v1/chat` endpoint that returns these structured responses.
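One possible shape for those structured results, assuming a JSON library such as nlohmann/json is available in the tree; the names are illustrative and the final `ToolDispatcher` types may differ.
```cpp
#include <string>
#include <nlohmann/json.hpp>

// Illustrative only: a structured result in place of captured stdout text.
struct ToolResult {
  bool ok = false;
  std::string tool_name;
  nlohmann::json payload;  // machine-readable tool output

  nlohmann::json ToJson() const {
    return {{"ok", ok}, {"tool", tool_name}, {"result", payload}};
  }
};
```
A `/api/v1/chat` response could then embed an array of these objects alongside the model's reply, e.g. `{"reply": "...", "tool_results": [...]}`.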