feat(analysis): Implement Configurable Analysis Template Engine
This commit introduces a comprehensive, template-based analysis orchestration system, refactoring the entire analysis generation workflow from the ground up.
Key Changes:
1. **Backend Architecture (`report-generator-service`):**
* Replaced the naive analysis workflow with a robust orchestrator based on a Directed Acyclic Graph (DAG) of module dependencies.
* Implemented a full topological sort (`petgraph`) to determine the correct execution order and detect circular dependencies.
2. **Data Models (`common-contracts`, `data-persistence-service`):**
* Introduced the concept of `AnalysisTemplateSets` to allow for multiple, independent, and configurable analysis workflows.
* Created a new `analysis_results` table to persist the output of each module for every analysis run, ensuring traceability.
* Implemented a file-free data seeding mechanism to populate default analysis templates on service startup.
3. **API Layer (`api-gateway`):**
* Added a new asynchronous endpoint (`POST /analysis-requests/{symbol}`) to trigger analysis workflows via NATS messages.
* Updated all configuration endpoints to support the new `AnalysisTemplateSets` model.
4. **Frontend UI (`/config`, `/query`):**
* Completely refactored the "Analysis Config" page into a two-level management UI for "Template Sets" and the "Modules" within them, supporting full CRUD operations.
* Updated the "Query" page to allow users to select which analysis template to use when generating a report.
This new architecture provides a powerful, flexible, and robust foundation for all future development of our intelligent analysis capabilities.
This commit is contained in: parent 60e6c8f61b, commit 427776b863
@@ -0,0 +1,245 @@
---
status: "Active"
date: "2025-11-17"
author: "AI Assistant"
---

# Design Document: Configurable Analysis Templates and Orchestrator

## 1. Overview and Goals

### 1.1. Problem Statement

Our current Rust-based backend lacks the core business logic needed to perform intelligent, multi-step financial analysis. The `report-generator-service`, intended to host that logic, is still incomplete internally. More importantly, the current system design offers no clear, extensible way to manage and reuse complete analysis pipelines, and configuration bootstrapping depends on local files, which conflicts with our principle of robust system design.

### 1.2. Goals

This task is to design and implement, within our Rust microservice architecture, a robust, configurable **analysis module orchestrator** built around **Analysis Template Sets**. The system will let us create, manage, and execute multiple independent analysis workflows with complex inter-module dependencies.

To achieve this, the following work is required:

1. **Introduce analysis template sets**: add the top-level concept of an "analysis template set", each containing an independent group of analysis modules and their configuration.
2. **Template management in the frontend**: implement full CRUD for analysis template sets in the frontend configuration center, including CRUD for the modules inside each set.
3. **Build a robust backend orchestrator**: implement an orchestrator in `report-generator-service` that executes a given template set, resolving inter-module dependencies via topological sorting.
4. **File-free data seeding**: embed the default configuration in the service binary and seed it on first startup, removing the dependency on local configuration files entirely.

## 2. New Data Models (`common-contracts`)

To support the "analysis template set" concept, we define new data structures.

```rust
// common-contracts/src/config_models.rs

use serde::{Deserialize, Serialize};
use std::collections::HashMap;

// Top-level analysis template configuration for the whole system,
// stored in the database as a single object.
// Key: template ID (e.g., "standard_fundamentals")
pub type AnalysisTemplateSets = HashMap<String, AnalysisTemplateSet>;

// A single analysis template set, representing one complete analysis pipeline.
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct AnalysisTemplateSet {
    pub name: String, // human-readable template name, e.g., "Standard Fundamental Analysis"
    // All analysis modules contained in this template set.
    // Key: module ID (e.g., "fundamental_analysis")
    pub modules: HashMap<String, AnalysisModuleConfig>,
}

// Configuration for a single analysis module (unchanged from the previous design).
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct AnalysisModuleConfig {
    pub name: String,
    pub provider_id: String,
    pub model_id: String,
    pub prompt_template: String,
    // Dependency list; each entry must be the ID of another module in the same template set.
    pub dependencies: Vec<String>,
}
```
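As a rough illustration of how these types compose, here is a std-only sketch (the serde derives are omitted, and the provider/model IDs are hypothetical placeholders, not values from the real configuration):

```rust
use std::collections::HashMap;

// Mirror of the contract types above, minus the serde derives, for illustration.
#[derive(Debug, Clone)]
struct AnalysisModuleConfig {
    name: String,
    provider_id: String,
    model_id: String,
    prompt_template: String,
    dependencies: Vec<String>,
}

#[derive(Debug, Clone)]
struct AnalysisTemplateSet {
    name: String,
    modules: HashMap<String, AnalysisModuleConfig>,
}

type AnalysisTemplateSets = HashMap<String, AnalysisTemplateSet>;

// Builds one template set with two modules, "summary" depending on
// "fundamental_analysis" -- the shape the orchestrator consumes.
fn default_template_sets() -> AnalysisTemplateSets {
    let mut modules = HashMap::new();
    modules.insert(
        "fundamental_analysis".to_string(),
        AnalysisModuleConfig {
            name: "Fundamental Analysis".to_string(),
            provider_id: "openai".to_string(), // hypothetical provider ID
            model_id: "gpt-4o".to_string(),    // hypothetical model ID
            prompt_template: "Analyze {{ symbol }}".to_string(),
            dependencies: vec![],
        },
    );
    modules.insert(
        "summary".to_string(),
        AnalysisModuleConfig {
            name: "Summary".to_string(),
            provider_id: "openai".to_string(),
            model_id: "gpt-4o".to_string(),
            prompt_template: "Summarize: {{ fundamental_analysis }}".to_string(),
            dependencies: vec!["fundamental_analysis".to_string()],
        },
    );
    let mut sets = HashMap::new();
    sets.insert(
        "default".to_string(),
        AnalysisTemplateSet { name: "Default Template".to_string(), modules },
    );
    sets
}

fn main() {
    let sets = default_template_sets();
    assert_eq!(sets["default"].modules.len(), 2);
    assert!(sets["default"].modules["summary"]
        .dependencies
        .contains(&"fundamental_analysis".to_string()));
    println!("ok");
}
```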

## 3. System Architecture and Data Flow

### 3.1. High-Level Data Flow

1. **Configuration flow**:
   * The **user** interacts with the configuration page in the **frontend** to create or modify an analysis template set.
   * The **frontend** sends a `PUT /api/v1/configs/analysis_template_sets` request to the **API gateway**.
   * The **API gateway** proxies the request to the **data persistence service**, which saves the serialized `AnalysisTemplateSets` object to the database in full.

2. **Execution flow**:
   * The **user** picks an **analysis template set** in the **frontend**, then triggers an analysis for a specific stock symbol.
   * The **frontend** sends a `POST /api/v1/analysis-requests/{symbol}` request to the **API gateway**, with the chosen `template_id` in the request body.
   * The **API gateway** validates the request and publishes a `GenerateReportCommand` message containing `symbol`, `template_id`, and `request_id` to the **NATS message bus**.
   * The **report generator service** consumes the message and starts the orchestration workflow specified by `template_id`.
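The command message carries the three fields named above. A dependency-free sketch of its shape (the exact serialization and field order are assumptions; the real contract lives in `common-contracts` and uses serde):

```rust
// Sketch of the command published to NATS. Field names come from the flow
// described above; the JSON layout here is only illustrative.
#[derive(Debug, Clone, PartialEq)]
struct GenerateReportCommand {
    request_id: String, // a UUID in the real contract
    symbol: String,
    template_id: String,
}

impl GenerateReportCommand {
    // Hand-rolled JSON keeps this sketch dependency-free;
    // the services would use serde_json instead.
    fn to_json(&self) -> String {
        format!(
            r#"{{"request_id":"{}","symbol":"{}","template_id":"{}"}}"#,
            self.request_id, self.symbol, self.template_id
        )
    }
}

fn main() {
    let cmd = GenerateReportCommand {
        request_id: "r-1".into(), // hypothetical ID
        symbol: "AAPL".into(),
        template_id: "default".into(),
    };
    let json = cmd.to_json();
    assert!(json.contains(r#""template_id":"default""#));
    println!("{json}");
}
```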

## 4. Frontend Implementation Plan (`/config` page)

The frontend configuration page will be restructured into two levels:

1. **Level one: template set management**
   * Show a list of all analysis template sets.
   * Provide "create new template set", "rename", and "delete template set" actions.
   * Selecting a template set takes the user to the level-two management view.

2. **Level two: analysis module management (inside the selected template set)**
   * **Main view**: after entering a template set, the main view lists all analysis modules in that set, each rendered as its own card.
   * **Create**:
     * A "New analysis module" button sits at the top or bottom of the module list.
     * Clicking it expands a form asking for the new module's **module ID** (a unique, machine-readable English identifier) and **module name** (a human-readable display name).
   * **Read**:
     * Each module card shows its **module name** and **module ID** by default.
     * A card can be expanded to show its full configuration.
   * **Update**:
     * Inside an expanded card, every configuration field is editable:
       * **LLM Provider**: a dropdown listing all LLM providers configured in the system.
       * **Model**: a cascading dropdown whose options load dynamically from the selected provider.
       * **Prompt template**: a multi-line text area for the module's core prompt.
       * **Dependencies**: a checkbox list showing **only the other modules in the current template set**, used to tick dependencies.
   * **Delete**:
     * Each module card has a "Delete" button in its top-right corner.
     * Clicking it opens a confirmation dialog to guard against accidental deletion.

## 5. Database and Data Structure Design

To support the features above, `data-persistence-service` needs two clearly defined storage models: one for **configuration** and one for **results**.

### 5.1. Configuration Storage: the `system_config` Table

We reuse the existing `system_config` table to store the entire analysis template configuration.

- **Purpose**: the single source of truth for all analysis template sets.
- **Storage**:
  - A single row in the table.
  - `config_key` (primary key): `analysis_template_sets`
  - `config_value` (type: `JSONB`): the serialized `AnalysisTemplateSets` (i.e. `HashMap<String, AnalysisTemplateSet>`) object.
- **Corresponding data structure (`common-contracts`)**: the `AnalysisTemplateSets` type defined in Section 2 maps directly onto this row.
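A sketch of the write path for this row, assuming standard PostgreSQL upsert semantics and that `config_key` is the primary key (the real query in `data-persistence-service` may differ):

```sql
-- Upsert the whole template-set object as one JSONB value.
INSERT INTO system_config (config_key, config_value)
VALUES ('analysis_template_sets', $1::jsonb)
ON CONFLICT (config_key)
DO UPDATE SET config_value = EXCLUDED.config_value;
```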

### 5.2. Result Storage: the `analysis_results` Table (new)

To store the content each module produces on every workflow run, we need a new table.

- **Table name**: `analysis_results`
- **Purpose**: durably store the output of every analysis run, for historical traceability and future queries.
- **SQL Schema**:

```sql
CREATE TABLE analysis_results (
    id          BIGSERIAL PRIMARY KEY,
    request_id  UUID NOT NULL,          -- ID tying together one full analysis request
    symbol      VARCHAR(32) NOT NULL,   -- the stock symbol analyzed
    template_id VARCHAR(64) NOT NULL,   -- the analysis template set used
    module_id   VARCHAR(64) NOT NULL,   -- the module that produced this result
    content     TEXT NOT NULL,          -- the LLM-generated analysis content
    meta_data   JSONB,                  -- extra metadata (e.g., model_name, tokens, elapsed_ms)
    created_at  TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- PostgreSQL does not support inline INDEX clauses in CREATE TABLE,
-- so the query-optimizing indexes are created separately:
CREATE INDEX idx_analysis_results_request_id ON analysis_results (request_id);
CREATE INDEX idx_analysis_results_symbol_template ON analysis_results (symbol, template_id);
```

- **Corresponding data structure (`common-contracts`)**:

```rust
// common-contracts/src/dtos.rs

use serde::{Deserialize, Serialize};
use uuid::Uuid;

#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct NewAnalysisResult {
    pub request_id: Uuid,
    pub symbol: String,
    pub template_id: String,
    pub module_id: String,
    pub content: String,
    pub meta_data: serde_json::Value,
}
```

## 6. Backend Implementation Plan

### 6.1. `data-persistence-service`

- **Data seeding (no file dependency)**: implement one-time, hardcoded startup logic.
  1. Embed the contents of `config/analysis-config.json` in the `data-persistence-service` code as a Rust string constant.
  2. On service startup, check whether `system_config` contains a row with key `analysis_template_sets`.
  3. If it does **not** exist:
     a. Parse the hardcoded string and build a default `AnalysisTemplateSet` (e.g., ID `default`, name "Default Analysis Template").
     b. Wrap this default template set in an `AnalysisTemplateSets` HashMap.
     c. Write the serialized `AnalysisTemplateSets` object to the database.
  4. This guarantees that on first deployment the system has a fully functional default analysis template without any external files.
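The seed-if-absent decision at the core of this startup logic can be sketched dependency-free (in the real service the default JSON would be embedded with `include_str!` and the read/write would go through the database layer; both are stand-ins here):

```rust
// Stand-in for the embedded default configuration; the real constant would be
// produced by include_str!("../config/analysis-config.json") or similar.
const DEFAULT_ANALYSIS_CONFIG: &str =
    r#"{"default":{"name":"Default Analysis Template","modules":{}}}"#;

/// Given the current value of the `analysis_template_sets` key (if any),
/// returns the value to seed, or None when seeding must be skipped.
fn seed_value(existing: Option<&str>) -> Option<&'static str> {
    match existing {
        Some(_) => None, // already configured: never overwrite user data
        None => Some(DEFAULT_ANALYSIS_CONFIG),
    }
}

fn main() {
    // First boot: the key is absent, so the default config is seeded.
    assert_eq!(seed_value(None), Some(DEFAULT_ANALYSIS_CONFIG));
    // Subsequent boots: an existing value is left untouched.
    assert_eq!(seed_value(Some("{}")), None);
    println!("ok");
}
```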

- **New responsibility**: implement the CRUD API for the `analysis_results` table.

### 6.2. `api-gateway`

- **Endpoint update**: `POST /api/v1/analysis-requests/{symbol}`.
- **Logic changes**:
  * The endpoint now parses `template_id` out of the request body.
  * The `GenerateReportCommand` message it builds must include the `template_id` field.

### 6.3. `report-generator-service` (core task)

The orchestration logic in `worker.rs` needs the following changes:

1. **Message consumer**: the subscribed `GenerateReportCommand` messages now carry a `template_id`.

2. **Orchestration logic (`run_report_generation_workflow`)**:
   * **Fetch configuration**: retrieve the full `AnalysisTemplateSets` object from `data-persistence-service`.
   * **Select the template**: use the incoming `template_id` to pick the `AnalysisTemplateSet` to execute; if it is missing, log an error and abort.
   * **Build the dependency graph**: construct a directed graph from the selected set's `modules`. The `petgraph` crate is strongly recommended.
   * **Topological sort**: topologically sort the graph, **with mandatory cycle detection**.
   * **Sequential execution**: walk the sorted module list; the subsequent context injection, LLM calls, and result persistence work as previously designed, but scoped to the modules of the current template set.
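To make the sort-then-execute behavior concrete, here is a std-only Kahn's-algorithm sketch of the topological sort with cycle detection the orchestrator relies on (the service itself would use petgraph's `toposort`; this hand-rolled version only illustrates the contract):

```rust
use std::collections::HashMap;

// Returns an execution order over module IDs, or Err on a circular dependency.
// `deps[m]` lists the modules m depends on (and therefore must run after).
fn topo_order(deps: &HashMap<&str, Vec<&str>>) -> Result<Vec<String>, String> {
    // in-degree = number of not-yet-executed dependencies of each module
    let mut indegree: HashMap<&str, usize> =
        deps.keys().map(|&m| (m, deps[m].len())).collect();
    let mut ready: Vec<&str> = indegree
        .iter()
        .filter(|(_, &d)| d == 0)
        .map(|(&m, _)| m)
        .collect();
    ready.sort(); // deterministic order among independent modules
    let mut order = Vec::new();
    while let Some(m) = ready.pop() {
        order.push(m.to_string());
        // a finished module lowers the in-degree of every module depending on it
        let mut newly_ready: Vec<&str> = Vec::new();
        for (&other, ds) in deps {
            if ds.contains(&m) {
                let d = indegree.get_mut(other).unwrap();
                *d -= 1;
                if *d == 0 {
                    newly_ready.push(other);
                }
            }
        }
        newly_ready.sort();
        ready.extend(newly_ready);
    }
    if order.len() == deps.len() {
        Ok(order)
    } else {
        Err("circular dependency detected".into())
    }
}

fn main() {
    let mut deps: HashMap<&str, Vec<&str>> = HashMap::new();
    deps.insert("fundamental_analysis", vec![]);
    deps.insert("news_analysis", vec![]);
    deps.insert("summary", vec!["fundamental_analysis", "news_analysis"]);
    let order = topo_order(&deps).unwrap();
    assert_eq!(order.len(), 3);
    assert_eq!(order.last().unwrap().as_str(), "summary");

    deps.insert("fundamental_analysis", vec!["summary"]); // introduce a cycle
    assert!(topo_order(&deps).is_err());
}
```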

3. **Fill in missing logic**:
   * **Implement result persistence**: call the `data-persistence-service` API to store each module's `NewAnalysisResult` in the `analysis_results` table.

## 7. Future Work

### 7.1. Evolving Toward a "Deep Research" Module

This design lays a solid foundation for the future "Deep Research" module. When that module is ready, we can create a new analysis template set in which certain modules (such as `news_analysis`) no longer call the LLM directly but invoke the Deep Research service instead. That service will perform complex data mining and hand highly distilled results back to the orchestrator, which injects them into subsequent LLM calls, shifting the analysis paradigm from prompt-driven to data-driven.

### 7.2. Introducing a Tool Calling Framework

To inject diverse context data into prompt templates in a more general and extensible way, we plan to introduce a "tool calling" framework.

- **Concept**: a "tool" is a standalone program that fetches one specific kind of data (e.g., financial statements, real-time quotes, the latest news).
- **Configuration**: in the frontend module editor, each module gets an "available tools" checkbox list alongside its dependencies; users can enable one or more tools per module.
- **Execution**:
  1. Before the orchestrator in `report-generator-service` runs a module, it checks which tools the module's configuration enables.
  2. The orchestrator runs those tools in order.
  3. Each tool's output (e.g., financial data formatted as a Markdown table) is injected into a single, unified context field.
- **First tool**: the first tool we envision is the **financial data injection tool**. It will fetch and format financial statements, following the "core logic refinement" section of an older revision of this document.

This framework decouples data-injection logic from the orchestrator core, making both easier to maintain and extend. **This is long-term planning and is out of scope for the current iteration.**
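Since this framework is only planned, the sketch below shows one possible shape for the abstraction; every name in it (`Tool`, `FinancialDataTool`, `build_tool_context`) is an assumption, not an existing API:

```rust
// Hypothetical tool abstraction: one implementation per kind of context data.
trait Tool {
    fn id(&self) -> &'static str;
    /// Fetches and formats one kind of context data for a symbol,
    /// e.g. a Markdown table of financial statements.
    fn run(&self, symbol: &str) -> String;
}

struct FinancialDataTool; // the envisioned first tool

impl Tool for FinancialDataTool {
    fn id(&self) -> &'static str {
        "financial_data"
    }
    fn run(&self, symbol: &str) -> String {
        // A real implementation would query data-persistence-service;
        // a placeholder table stands in here.
        format!("| metric | value |\n|---|---|\n| symbol | {symbol} |")
    }
}

// Before executing a module, the orchestrator would run each enabled tool
// and concatenate the outputs into one unified context field.
fn build_tool_context(tools: &[Box<dyn Tool>], symbol: &str) -> String {
    tools
        .iter()
        .map(|t| format!("## {}\n{}", t.id(), t.run(symbol)))
        .collect::<Vec<_>>()
        .join("\n\n")
}

fn main() {
    let tools: Vec<Box<dyn Tool>> = vec![Box::new(FinancialDataTool)];
    let ctx = build_tool_context(&tools, "AAPL");
    assert!(ctx.contains("financial_data"));
    assert!(ctx.contains("AAPL"));
    println!("ok");
}
```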

## 8. Implementation Checklist (Step-by-Step To-do List)

The development tasks below are listed in the order required to complete the project.

### Phase 1: Data Models and Persistence Layer

- [x] **Task 1.1**: In the `common-contracts` crate, create or update `src/config_models.rs` with the new `AnalysisTemplateSets`, `AnalysisTemplateSet`, and `AnalysisModuleConfig` data structures.
- [x] **Task 1.2**: In the `common-contracts` crate, create or update `src/dtos.rs` with the `NewAnalysisResult` data transfer object (DTO) used to write analysis results.
- [x] **Task 1.3**: In `data-persistence-service`, add a new database migration (`migrations/`) creating the `analysis_results` table, following the schema in the database design section of this document.
- [x] **Task 1.4**: In `data-persistence-service`, implement the CRUD API for the `analysis_results` table (at minimum a `create` method).
- [x] **Task 1.5**: In `data-persistence-service`, implement the seeding logic: on startup, write the hardcoded default analysis template set to the database if it does not already exist.

### Phase 2: Core Backend Logic (`report-generator-service`)

- [x] **Task 2.1**: Add the `petgraph` crate as a dependency of `report-generator-service` for building and processing the dependency graph.
- [x] **Task 2.2**: Refactor `run_report_generation_workflow` in `worker.rs` to accept messages carrying a `template_id`.
- [x] **Task 2.3**: In `worker.rs`, **implement a full topological sort** to replace the current naive loop. The algorithm must detect circular dependencies.
- [x] **Task 2.4**: Update the orchestrator to select the correct workflow from the fetched `AnalysisTemplateSets` by `template_id`.
- [x] **Task 2.5**: Implement the calls to `data-persistence-service` that persist each module's successfully generated `NewAnalysisResult` into the `analysis_results` table.

### Phase 3: Service Integration and End-to-End Wiring

- [x] **Task 3.1**: Add the `POST /api/v1/analysis-requests/{symbol}` endpoint to `api-gateway`.
- [x] **Task 3.2**: In the new endpoint, accept the frontend request (including `template_id`) and publish the `GenerateReportCommand` message to NATS.
- [x] **Task 3.3**: Update the NATS consumer in `report-generator-service` to subscribe to and parse the new `GenerateReportCommand` message.
- [x] **Task 3.4**: Run end-to-end integration tests verifying that a request triggered from the frontend correctly starts `report-generator-service` and executes the full analysis flow (the frontend UI itself can be ignored at this stage).

### Phase 4: Frontend UI

- [x] **Task 4.1**: Refactor the `frontend/src/app/config/page.tsx` page into the two-level structure, starting with "analysis template set" management.
- [x] **Task 4.2**: Implement create, rename, and delete for analysis template sets, wired to the corresponding backend APIs.
- [x] **Task 4.3**: Implement the module management view inside a template set, with create, update (all fields), and delete.
- [x] **Task 4.4**: Make sure the page that launches analysis requests (e.g., the main query page) lets the user choose which analysis template set to run.
- [x] **Task 4.5**: Update the frontend call to `api-gateway` so the analysis request body carries the selected `template_id`.

@@ -1,111 +0,0 @@
---
status: "Active"
date: "2025-11-17"
author: "AI Assistant"
---

# Design Document: Analysis Module Orchestrator

## 1. Overview and Goals

### 1.1. Problem Statement

Our current Rust-based backend lacks the core business logic needed to perform intelligent, multi-step financial analysis. Although the legacy Python system had a working analysis framework, that logic was not migrated during the initial rewrite. The `report-generator-service`, which should host it, currently contains only a non-functional placeholder. In addition, the frontend configuration page has no UI for creating or managing analysis modules from scratch, producing a chicken-and-egg problem: no default modules exist in the system to configure.

### 1.2. Goals

This task is to design and implement a robust, configurable **Analysis Module Orchestrator** within our Rust microservice architecture. The system will replicate and improve on the legacy Python system's logic, supporting complex, dependency-aware analysis workflows that are created, managed, and executed entirely through configuration (prompts and dependencies).

To achieve this, the following work is required:

1. Implement a complete CRUD (create, read, update, delete) interface for analysis module management in the frontend.
2. Implement a robust backend orchestrator in `report-generator-service` that executes analysis workflows over the directed acyclic graph (DAG) formed by module dependencies.
3. Integrate the frontend and backend services through the `api-gateway` and the NATS message bus for a seamless end-to-end experience.
4. Implement a data seeding mechanism so the system preloads a default set of analysis modules on first startup.

## 2. System Architecture and Data Flow

This implementation involves four key services and a message bus: the `frontend`, the `API gateway`, the `data persistence service`, and the `report generator service`.

### 2.1. High-Level Data Flow

1. **Configuration flow**:
   * The **user** interacts with the **frontend** configuration page to create or update analysis modules.
   * The **frontend** sends `PUT /api/v1/configs/analysis_modules` requests to the **API gateway**.
   * The **API gateway** proxies these requests to the **data persistence service**, which saves the configuration to the database's `system_config` table.

2. **Execution flow**:
   * The **user** triggers an analysis run for a specific stock symbol in the **frontend**.
   * The **frontend** sends a `POST /api/v1/analysis-requests/{symbol}` request to the **API gateway**.
   * The **API gateway** validates the request and publishes a `GenerateReportCommand` message to a new topic on the **NATS message bus**, then immediately returns a `202 Accepted` response with a request ID.
   * The **report generator service**, subscribed to the `GenerateReportCommand` topic, receives the message and starts the orchestration workflow.
   * The **report generator service** fetches the analysis module configuration it needs from the **data persistence service**.
   * The service runs the analysis, calling the LLM API for each module, and persists the results back to the database through the **data persistence service**.

## 3. Frontend Implementation Plan (`/config` page)

We will modify `frontend/src/app/config/page.tsx` to provide a full CRUD experience for analysis modules.

- **Create**: add a "New module" button. Clicking it shows a form for:
  - **Module ID**: a unique, machine-readable string (e.g., `fundamental_analysis`).
  - **Module name**: a human-readable display name (e.g., "Fundamental Analysis").
- **Read**: the page renders a card for each existing analysis module showing its current configuration.
- **Update**: each module card contains the following editable fields:
  - **LLM Provider**: a dropdown populated dynamically from the `llm_providers` configuration.
  - **Model**: a cascading dropdown listing the models available under the selected provider.
  - **Prompt template**: a large text area for writing the prompt.
  - **Dependencies**: a checkbox list of all other module IDs, letting the user define inter-module dependencies.
- **Delete**: each module card has a "Delete" button guarded by a confirmation dialog.

## 4. Backend Implementation Plan

### 4.1. `data-persistence-service`

- **Data seeding (critical task)**: implement one-time startup logic.
  1. On service startup, check whether `system_config` contains a row with key `analysis_modules`.
  2. If the row does **not** exist, read the legacy `config/analysis-config.json` file from disk.
  3. Parse the file contents and insert them into the database as the value of `analysis_modules`.
  4. This guarantees that on first deployment the system is preloaded with a fully functional default set of analysis modules.
- **API**: no changes required. The existing `GET /configs/analysis_modules` and `PUT /configs/analysis_modules` endpoints already suffice.

### 4.2. `api-gateway`

- **New endpoint**: create `POST /api/v1/analysis-requests/{symbol}`.
- **Logic**:
  1. The endpoint must not perform any heavy computation.
  2. It receives a stock `symbol` from the path.
  3. It generates a unique `request_id` (e.g., a UUID).
  4. It builds a `GenerateReportCommand` message containing the `symbol` and `request_id`.
  5. It publishes the message to a dedicated NATS topic (e.g., `analysis.commands.generate_report`).
  6. It immediately returns a `202 Accepted` status with the `request_id` in the response body.

### 4.3. `report-generator-service` (core task)

This service requires the bulk of the development work. All logic lives in `worker.rs`.

1. **Message consumer**: the service subscribes to the `analysis.commands.generate_report` NATS topic. Receiving a `GenerateReportCommand` message triggers the `run_report_generation_workflow` workflow.

2. **Orchestration logic (`run_report_generation_workflow`)**:
   * **Fetch configuration**: retrieve the full `AnalysisModulesConfig` from `data-persistence-service`.
   * **Build the dependency graph**: construct an in-memory directed graph from the module configuration. The `petgraph` crate is strongly recommended for this.
   * **Topological sort**: sort the graph to obtain a linear execution order. The algorithm **must** detect cycles so that misconfiguration is handled gracefully and logged.
   * **Sequential execution**: walk the sorted module list. For each module:
     * **Build context**: collect the text output of all of its direct dependencies (guaranteed to have run already).
     * **Render the prompt**: use the `Tera` template engine to inject the dependency outputs and other required data (company name, financial data) into the module's `prompt_template`.
     * **Call the LLM**: invoke the appropriate LLM API through `LlmClient`.
     * **Persist the result**: after content is generated, save the output through `data-persistence-service`, keyed by `symbol` and `module_id`, and also keep it locally for downstream modules in the workflow.
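The prompt-rendering step can be pictured with a dependency-free sketch: the real service uses Tera, but the core idea is substituting `{{ key }}` placeholders with the collected dependency outputs (plain string replacement stands in for the template engine here):

```rust
use std::collections::HashMap;

// Replaces every "{{ key }}" placeholder in the template with the
// corresponding context value -- the role Tera plays in the real service.
fn render_prompt(template: &str, context: &HashMap<&str, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in context {
        out = out.replace(&format!("{{{{ {key} }}}}"), value);
    }
    out
}

fn main() {
    let mut ctx = HashMap::new();
    ctx.insert("fundamental_analysis", "Revenue is growing.".to_string());
    ctx.insert("company_name", "Apple Inc.".to_string());
    let prompt = render_prompt(
        "Summarize for {{ company_name }}: {{ fundamental_analysis }}",
        &ctx,
    );
    assert_eq!(prompt, "Summarize for Apple Inc.: Revenue is growing.");
    println!("{prompt}");
}
```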

3. **Fill in missing logic**:
   * Implement the result-persistence part currently marked `// TODO`.
   * Replace the `financial_data` placeholder with real financial data fetched from `data-persistence-service` and formatted.

## 5. Future Work: Evolving Toward a "Deep Research" Module

As noted, the initial implementation relies on the LLM's internal knowledge for analyses such as "news" or "market sentiment". This is a deliberate short-term tradeoff to ship quickly.

The long-term vision replaces this pattern with a `Deep Research` module acting as an intelligent data preprocessor. Instead of injecting plain text, the orchestrator will trigger the Deep Research module, which will:

1. Understand the data needs of the target analysis module (e.g., `news_analysis`).
2. Query internal data sources (e.g., the `news` table in the database) for relevant information.
3. Perform multi-step reasoning or summarization over the retrieved data.
4. Deliver a high-quality, condensed data package to the final analysis module's prompt.

This evolution moves the system from "prompt-driven" to "data-driven", significantly improving the reliability, controllability, and accuracy of analysis results.

frontend/src/app/api/configs/analysis_template_sets/route.ts (new file, 45 lines)
@@ -0,0 +1,45 @@
const BACKEND_BASE = process.env.BACKEND_INTERNAL_URL || process.env.NEXT_PUBLIC_BACKEND_URL;

export async function GET() {
  if (!BACKEND_BASE) {
    return new Response('NEXT_PUBLIC_BACKEND_URL 未配置', { status: 500 });
  }
  try {
    const resp = await fetch(`${BACKEND_BASE}/configs/analysis_template_sets`, {
      headers: { 'Content-Type': 'application/json' },
      cache: 'no-store',
    });
    const text = await resp.text();
    return new Response(text, {
      status: resp.status,
      headers: { 'Content-Type': resp.headers.get('Content-Type') || 'application/json' },
    });
  } catch (e: any) {
    const errorBody = JSON.stringify({ message: e?.message || '连接后端失败' });
    return new Response(errorBody, { status: 502, headers: { 'Content-Type': 'application/json' } });
  }
}

export async function PUT(req: Request) {
  if (!BACKEND_BASE) {
    return new Response('NEXT_PUBLIC_BACKEND_URL 未配置', { status: 500 });
  }
  const body = await req.text();
  try {
    const resp = await fetch(`${BACKEND_BASE}/configs/analysis_template_sets`, {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body,
    });
    const text = await resp.text();
    return new Response(text, {
      status: resp.status,
      headers: { 'Content-Type': resp.headers.get('Content-Type') || 'application/json' },
    });
  } catch (e: any) {
    const errorBody = JSON.stringify({ message: e?.message || '连接后端失败' });
    return new Response(errorBody, { status: 502, headers: { 'Content-Type': 'application/json' } });
  }
}
@@ -20,8 +20,11 @@ import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@
import { ScrollArea } from "@/components/ui/scroll-area";
import { Spinner } from "@/components/ui/spinner";
// Types are imported from '@/types'
import type { AnalysisModulesConfig, DataSourcesConfig, DataSourceConfig, DataSourceProvider, LlmProvidersConfig, LlmModel } from '@/types';
import { useDataSourcesConfig, updateDataSourcesConfig } from '@/hooks/useApi';
import type {
  AnalysisModulesConfig, DataSourcesConfig, DataSourceConfig, DataSourceProvider, LlmProvidersConfig, LlmModel,
  AnalysisTemplateSets, AnalysisTemplateSet, AnalysisModuleConfig
} from '@/types';
import { useDataSourcesConfig, updateDataSourcesConfig, useAnalysisTemplateSets, updateAnalysisTemplateSets } from '@/hooks/useApi';

export default function ConfigPage() {
  // Global state from the Zustand store
@@ -31,6 +34,8 @@ export default function ConfigPage() {

  // Load analysis configuration (uses initialAnalysisModules uniformly)
  // const { data: analysisConfig, mutate: mutateAnalysisConfig } = useAnalysisModules();
  // LLM providers (for model lists and saving)
  const { data: llmProviders, mutate: mutateLlmProviders } = useLlmProviders();

  // Local form state
  // Local state for data sources
@@ -39,12 +44,6 @@ export default function ConfigPage() {

  // Local state for the analysis configuration
  const [localAnalysisModules, setLocalAnalysisModules] = useState<AnalysisModulesConfig>({});

  // -- New State for Creating Analysis Modules --
  const [isCreatingModule, setIsCreatingModule] = useState(false);
  const [newModuleId, setNewModuleId] = useState('');
  const [newModuleName, setNewModuleName] = useState('');

  // Save state for the analysis configuration (state is maintained below)

  // Test result state

@@ -55,11 +54,22 @@
  const [saveMessage, setSaveMessage] = useState('');

  // --- New State for Analysis Modules ---
  const { data: llmProviders, mutate: mutateLlmProviders } = useLlmProviders();
  const { data: initialAnalysisModules, mutate } = useAnalysisModules();
  const { data: initialAnalysisTemplateSets, mutate: mutateAnalysisTemplateSets } = useAnalysisTemplateSets();
  const [localTemplateSets, setLocalTemplateSets] = useState<AnalysisTemplateSets>({});
  const [selectedTemplateId, setSelectedTemplateId] = useState<string | null>(null);

  const [isSavingAnalysis, setIsSavingAnalysis] = useState(false);
  const [analysisSaveMessage, setAnalysisSaveMessage] = useState('');

  // State for creating/editing templates and modules
  const [newTemplateId, setNewTemplateId] = useState('');
  const [newTemplateName, setNewTemplateName] = useState('');
  const [isCreatingTemplate, setIsCreatingTemplate] = useState(false);

  const [isCreatingModule, setIsCreatingModule] = useState(false);
  const [newModuleId, setNewModuleId] = useState('');
  const [newModuleName, setNewModuleName] = useState('');

  // --- State for LLM Providers Management ---
  const [localLlmProviders, setLocalLlmProviders] = useState<LlmProvidersConfig>({});
  const [isSavingLlm, setIsSavingLlm] = useState(false);

@@ -160,10 +170,11 @@
  }, [localLlmProviders, pendingApiKeys, flushSaveLlmImmediate]);

  useEffect(() => {
    if (initialAnalysisModules) {
      setLocalAnalysisModules(initialAnalysisModules);
    }
  }, [initialAnalysisModules]);
    if (!initialAnalysisTemplateSets) return;
    setLocalTemplateSets(initialAnalysisTemplateSets);
    // Only pick the first template from backend data when nothing is selected
    // yet, to avoid overwriting a locally added selection and its state
    setSelectedTemplateId(prev => prev ?? (Object.keys(initialAnalysisTemplateSets)[0] || null));
  }, [initialAnalysisTemplateSets]);

  useEffect(() => {
    if (initialDataSources) {
@@ -180,10 +191,20 @@
  }
  }, [llmProviders, normalizeProviders]);

  const handleAnalysisChange = (moduleId: string, field: string, value: string) => {
    setLocalAnalysisModules(prev => ({
  const handleAnalysisChange = (moduleId: string, field: string, value: any) => {
    if (!selectedTemplateId) return;
    setLocalTemplateSets(prev => ({
      ...prev,
      [moduleId]: { ...prev[moduleId], [field]: value }
      [selectedTemplateId]: {
        ...prev[selectedTemplateId],
        modules: {
          ...prev[selectedTemplateId].modules,
          [moduleId]: {
            ...prev[selectedTemplateId].modules[moduleId],
            [field]: value,
          },
        },
      },
    }));
  };

@@ -191,8 +212,8 @@
    setIsSavingAnalysis(true);
    setAnalysisSaveMessage('保存中...');
    try {
      const updated = await updateAnalysisModules(localAnalysisModules);
      await mutate(updated, false);
      const updated = await updateAnalysisTemplateSets(localTemplateSets);
      await mutateAnalysisTemplateSets(updated, false);
      setAnalysisSaveMessage('分析配置保存成功!');
    } catch (e: any) {
      setAnalysisSaveMessage(`保存失败: ${e.message}`);

@@ -216,40 +237,99 @@
  };

  // Update an analysis module's dependencies
  const updateAnalysisDependencies = (type: string, dependency: string, checked: boolean) => {
    setLocalAnalysisModules(prev => {
      const currentConfig = prev[type];
      const currentDeps = currentConfig.dependencies || [];

  const updateAnalysisDependencies = (moduleId: string, dependency: string, checked: boolean) => {
    if (!selectedTemplateId) return;
    setLocalTemplateSets(prev => {
      const currentModule = prev[selectedTemplateId].modules[moduleId];
      const currentDeps = currentModule.dependencies || [];
      const newDeps = checked
        ? [...currentDeps, dependency]
        // remove the dependency
        : currentDeps.filter(d => d !== dependency);

      return {
        ...prev,
        [type]: {
          ...currentConfig,
          dependencies: [...new Set(newDeps)] // ensure uniqueness
        }
        [selectedTemplateId]: {
          ...prev[selectedTemplateId],
          modules: {
            ...prev[selectedTemplateId].modules,
            [moduleId]: {
              ...currentModule,
              dependencies: [...new Set(newDeps)],
            },
          },
        },
      };
    });
  };

  // --- New handlers for module creation/deletion ---
  // --- Handlers for templates and modules ---

  const handleAddTemplate = () => {
    if (!newTemplateId || !newTemplateName) {
      setAnalysisSaveMessage('模板 ID 和名称不能为空');
      return;
    }
    if (localTemplateSets[newTemplateId]) {
      setAnalysisSaveMessage('模板 ID 已存在');
      return;
    }
    const newSet: AnalysisTemplateSets = {
      ...localTemplateSets,
      [newTemplateId]: {
        name: newTemplateName,
        modules: {},
      },
    };
    setLocalTemplateSets(newSet);
    setSelectedTemplateId(newTemplateId);
    setNewTemplateId('');
    setNewTemplateName('');
    setIsCreatingTemplate(false);
    // Persist immediately after creation, so a refresh or tab switch cannot
    // overwrite the local addition with no network request on record
    (async () => {
      setIsSavingAnalysis(true);
      setAnalysisSaveMessage('保存中...');
      try {
        const updated = await updateAnalysisTemplateSets(newSet);
        await mutateAnalysisTemplateSets(updated, false);
        setAnalysisSaveMessage('分析配置保存成功!');
      } catch (e: any) {
        setAnalysisSaveMessage(`保存失败: ${e?.message || '未知错误'}`);
      } finally {
        setIsSavingAnalysis(false);
        setTimeout(() => setAnalysisSaveMessage(''), 5000);
      }
    })();
  };

  const handleDeleteTemplate = () => {
    if (!selectedTemplateId || !window.confirm(`确定要删除模板 "${localTemplateSets[selectedTemplateId].name}" 吗?`)) {
      return;
    }
    const newSets = { ...localTemplateSets };
    delete newSets[selectedTemplateId];
    setLocalTemplateSets(newSets);
    // Select the first available template or null
    const firstKey = Object.keys(newSets)[0] || null;
    setSelectedTemplateId(firstKey);
  };

  const handleAddNewModule = () => {
    if (!newModuleId || !newModuleName) {
    if (!selectedTemplateId || !newModuleId || !newModuleName) {
      setAnalysisSaveMessage('模块 ID 和名称不能为空');
      setTimeout(() => setAnalysisSaveMessage(''), 3000);
      return;
    }
    if (localAnalysisModules[newModuleId]) {
    if (localTemplateSets[selectedTemplateId].modules[newModuleId]) {
      setAnalysisSaveMessage('模块 ID 已存在');
      setTimeout(() => setAnalysisSaveMessage(''), 3000);
      return;
    }
    setLocalAnalysisModules(prev => ({
    setLocalTemplateSets(prev => ({
      ...prev,
      [selectedTemplateId]: {
        ...prev[selectedTemplateId],
        modules: {
          ...prev[selectedTemplateId].modules,
          [newModuleId]: {
            name: newModuleName,
            provider_id: '',
@@ -257,6 +337,8 @@
            prompt_template: '',
            dependencies: [],
          }
        }
      }
    }));
    setNewModuleId('');
    setNewModuleName('');

@@ -264,10 +346,17 @@
  };

  const handleDeleteModule = (moduleId: string) => {
    setLocalAnalysisModules(prev => {
      const next = { ...prev };
      delete next[moduleId];
      return next;
    if (!selectedTemplateId) return;
    setLocalTemplateSets(prev => {
      const newModules = { ...prev[selectedTemplateId].modules };
      delete newModules[moduleId];
      return {
        ...prev,
        [selectedTemplateId]: {
          ...prev[selectedTemplateId],
          modules: newModules,
        },
      };
    });
  };

@@ -993,12 +1082,64 @@
          <TabsContent value="analysis" className="space-y-4">
            <Card>
              <CardHeader>
                <CardTitle>分析模块配置</CardTitle>
                <CardDescription>配置各个分析模块的模型、提示词和依赖关系。模块ID将作为Prompt中注入上下文的占位符。</CardDescription>
                <CardTitle>分析模板与模块配置</CardTitle>
                <CardDescription>管理不同的分析模板集,并为每个模板集内的模块配置模型、提示词和依赖关系。</CardDescription>
              </CardHeader>
              <CardContent className="space-y-6">
                {Object.entries(localAnalysisModules).map(([moduleId, config]) => {

                {/* --- Level 1: Template Set Management --- */}
                <div className="p-4 border rounded-lg bg-slate-50 space-y-4">
                  <div className="flex items-center gap-4">
                    <Label className="font-semibold">当前分析模板:</Label>
                    <Select
                      value={selectedTemplateId || ''}
                      onValueChange={(id) => setSelectedTemplateId(id)}
                    >
                      <SelectTrigger className="w-[280px]">
                        <SelectValue placeholder="选择一个模板..." />
                      </SelectTrigger>
                      <SelectContent>
                        {Object.entries(localTemplateSets).map(([id, set]) => (
                          <SelectItem key={id} value={id}>{set.name} ({id})</SelectItem>
                        ))}
                      </SelectContent>
                    </Select>
                    <Button variant="outline" onClick={() => setIsCreatingTemplate(true)} disabled={isCreatingTemplate}>+ 新建模板</Button>
                    <Button variant="destructive" onClick={handleDeleteTemplate} disabled={!selectedTemplateId}>删除当前模板</Button>
                  </div>

                  {isCreatingTemplate && (
                    <div className="space-y-3 p-3 border rounded-md border-dashed">
                      <h4 className="font-semibold">新建分析模板集</h4>
                      <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
                        <Input
                          placeholder="模板 ID (e.g., standard_v2)"
                          value={newTemplateId}
                          onChange={(e) => setNewTemplateId(e.target.value.replace(/\s/g, ''))}
                        />
                        <Input
                          placeholder="模板名称 (e.g., 标准分析模板V2)"
                          value={newTemplateName}
                          onChange={(e) => setNewTemplateName(e.target.value)}
                        />
                      </div>
                      <div className="flex gap-2">
                        <Button onClick={handleAddTemplate}>确认创建</Button>
                        <Button variant="ghost" onClick={() => setIsCreatingTemplate(false)}>取消</Button>
                      </div>
                    </div>
                  )}
                </div>

                <Separator />

                {/* --- Level 2: Module Management (within selected template) --- */}
                {selectedTemplateId && localTemplateSets[selectedTemplateId] ? (
                  <div className="space-y-6">
                    {Object.entries(localTemplateSets[selectedTemplateId].modules).map(([moduleId, config]) => {
                      const availableModels = llmProviders?.[config.provider_id]?.models.filter(m => m.is_active) || [];
                      const allModulesInSet = localTemplateSets[selectedTemplateId].modules;

                      return (
                        <div key={moduleId} className="space-y-4 p-4 border rounded-lg">
                          <div className="flex justify-between items-start">

@@ -1006,7 +1147,7 @@
                            <h3 className="text-lg font-semibold">{config.name || moduleId}</h3>
                            <p className="text-sm text-muted-foreground">ID: <Badge variant="secondary">{moduleId}</Badge></p>
                          </div>
                          <Button variant="destructive" size="sm" onClick={() => handleDeleteModule(moduleId)}>删除</Button>
                          <Button variant="destructive" size="sm" onClick={() => handleDeleteModule(moduleId)}>删除模块</Button>
                        </div>

                        <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
@@ -1020,8 +1161,8 @@
                              <SelectValue placeholder="选择一个 Provider" />
                            </SelectTrigger>
                            <SelectContent>
                              {llmProviders && Object.entries(llmProviders).map(([pId, p]) => (
                                <SelectItem key={pId} value={pId}>{p.name}</SelectItem>
                              {Object.entries(llmProviders || {}).map(([pId, p]) => (
                                <SelectItem key={pId} value={pId}>{p.name || pId}</SelectItem>
                              ))}
                            </SelectContent>
                          </Select>
@@ -1054,13 +1195,36 @@
                            rows={10}
                          />
                        </div>

                        <div className="space-y-2">
                          <Label>依赖模块 (Dependencies)</Label>
                          <div className="grid grid-cols-2 md:grid-cols-4 gap-2 p-2 border rounded-md">
                            {Object.keys(allModulesInSet)
                              .filter(id => id !== moduleId)
                              .map(depId => (
                                <div key={depId} className="flex items-center space-x-2">
                                  <Checkbox
                                    id={`${moduleId}-${depId}`}
                                    checked={(config.dependencies || []).includes(depId)}
                                    onCheckedChange={(checked) => updateAnalysisDependencies(moduleId, depId, !!checked)}
                                  />
                                  <label
                                    htmlFor={`${moduleId}-${depId}`}
                                    className="text-sm font-medium"
                                  >
                                    {allModulesInSet[depId]?.name || depId}
                                  </label>
                                </div>
                              ))}
                          </div>
                        </div>
                      </div>
                    );
                  })}

                  {isCreatingModule && (
                    <div className="space-y-4 p-4 border rounded-lg border-dashed">
                      <h3 className="text-lg font-semibold">新增分析模块</h3>
                      <h3 className="text-lg font-semibold">在 "{localTemplateSets[selectedTemplateId].name}" 中新增分析模块</h3>
                      <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
                        <div className="space-y-2">
                          <Label htmlFor="new-module-id">模块 ID (英文, 无空格)</Label>
@@ -1088,12 +1252,12 @@
                      </div>
)}
|
||||
|
||||
<div className="flex items-center gap-4 pt-4">
|
||||
<div className="flex items-center gap-4 pt-4 border-t">
|
||||
<Button onClick={() => setIsCreatingModule(true)} variant="outline" disabled={isCreatingModule}>
|
||||
+ 新增分析模块
|
||||
</Button>
|
||||
<Button onClick={handleSaveAnalysis} disabled={isSavingAnalysis}>
|
||||
{isSavingAnalysis ? '保存中...' : '保存分析配置'}
|
||||
{isSavingAnalysis ? '保存中...' : '保存所有变更'}
|
||||
</Button>
|
||||
{analysisSaveMessage && (
|
||||
<span className={`text-sm ${analysisSaveMessage.includes('成功') ? 'text-green-600' : 'text-red-600'}`}>
|
||||
@ -1101,6 +1265,12 @@ export default function ConfigPage() {
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
) : (
|
||||
<div className="text-center text-muted-foreground py-10">
|
||||
<p>请先选择或创建一个分析模板集。</p>
|
||||
</div>
|
||||
)}
|
||||
</CardContent>
|
||||
</Card>
|
||||
</TabsContent>
|
||||
|
||||
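The checkbox grid above toggles a module's `dependencies` array through `updateAnalysisDependencies(moduleId, depId, checked)`, whose body lies outside this hunk. A minimal sketch of what such a helper plausibly does, assuming the state is a `Record` of module configs (the function name `toggleDependency` and the pared-down `ModuleConfig` type are illustrative, not from the diff):

```typescript
// Hypothetical reconstruction of the dependency toggle. The real
// updateAnalysisDependencies is not shown in this hunk; this sketch only
// illustrates an immutable add/remove, which is what React state updates need.
type ModuleConfig = { name: string; dependencies: string[] };

function toggleDependency(
  modules: Record<string, ModuleConfig>,
  moduleId: string,
  depId: string,
  checked: boolean,
): Record<string, ModuleConfig> {
  const current = modules[moduleId];
  if (!current) return modules;
  const deps = new Set(current.dependencies ?? []);
  if (checked) deps.add(depId);
  else deps.delete(depId);
  // Return fresh objects so React sees the state change.
  return { ...modules, [moduleId]: { ...current, dependencies: [...deps] } };
}
```

Keeping the toggle pure makes it trivial to unit-test independently of the component tree.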
@ -5,6 +5,10 @@ import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/com
import { Input } from "@/components/ui/input";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Label } from "@/components/ui/label";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { useAnalysisTemplateSets } from "@/hooks/useApi"; // Import the new hook
import type { AnalysisTemplateSets } from "@/types";

type ReportItem = {
report_id: string;
@ -15,42 +19,62 @@ type ReportItem = {

export default function QueryPage() {
const [market, setMarket] = useState<"cn" | "us" | "jp">("cn");
const [orgId, setOrgId] = useState("AAPL");
const [orgId, setOrgId] = useState("600519");
const [loading, setLoading] = useState(false);
const [reports, setReports] = useState<ReportItem[]>([]);
const [msg, setMsg] = useState<string | null>(null);

// --- New State for Template Selection ---
const { data: templateSets } = useAnalysisTemplateSets();
const [selectedTemplateId, setSelectedTemplateId] = useState<string>('');

// Auto-select first template when available
useEffect(() => {
if (templateSets && Object.keys(templateSets).length > 0 && !selectedTemplateId) {
setSelectedTemplateId(Object.keys(templateSets)[0]);
}
}, [templateSets, selectedTemplateId]);


async function loadReports() {
if (!market || !orgId) return;
setLoading(true);
try {
// This API seems deprecated, but we keep the logic for now.
// In a real scenario, this would query the new `analysis_results` table.
const res = await fetch(`/api/orgs/${market}/${orgId}/reports`);
const data = await res.json();
setReports(data.reports ?? []);
} catch (e) {
setMsg("加载失败");
setMsg("加载历史报告失败");
} finally {
setLoading(false);
}
}

async function triggerGenerate() {
if (!market || !orgId) return;
if (!orgId || !selectedTemplateId) {
setMsg("企业ID和分析模板不能为空");
return;
}
setMsg("已触发生成任务…");
try {
const res = await fetch(`/api/orgs/${market}/${orgId}/reports/generate`, {
const res = await fetch(`/api/analysis-requests/${orgId}`, {
method: "POST",
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ template_id: selectedTemplateId }),
});
const data = await res.json();
if (data.queued) {
setMsg("生成任务已入队,稍后自动出现在列表中");
// 简单轮询刷新
setTimeout(loadReports, 1500);
if (res.status === 202 && data.request_id) {
setMsg(`生成任务已入队 (Request ID: ${data.request_id}),请稍后查询结果。`);
// Simple polling to refresh history
setTimeout(loadReports, 3000);
} else {
setMsg("触发失败");
const errorMsg = data.error || "触发失败,未知错误。";
setMsg(`触发失败: ${errorMsg}`);
}
} catch {
setMsg("触发失败");
} catch (e: any) {
setMsg(`触发失败: ${e.message || "网络请求错误"}`);
}
}

@ -62,35 +86,57 @@ export default function QueryPage() {
return (
<div className="space-y-6">
<header className="space-y-2">
<h1 className="text-2xl font-semibold">统一查询</h1>
<p className="text-sm text-muted-foreground">输入企业ID与市场,查询历史报告并触发新报告生成。</p>
<h1 className="text-2xl font-semibold">统一查询与分析</h1>
<p className="text-sm text-muted-foreground">输入企业ID,选择分析模板,触发新报告生成或查询历史报告。</p>
</header>

<Card>
<CardHeader>
<CardTitle>查询条件</CardTitle>
<CardDescription>选择市场并输入企业ID(如 us:AAPL / cn:600519)</CardDescription>
<CardTitle>查询与分析</CardTitle>
<CardDescription>选择市场、企业ID和分析模板来启动一个新的分析任务。</CardDescription>
</CardHeader>
<CardContent className="space-y-3">
<div className="flex gap-2">
<select
className="border rounded px-2 py-1 bg-background"
value={market}
onChange={(e) => setMarket(e.target.value as "cn" | "us" | "jp")}
>
<option value="cn">中国(cn)</option>
<option value="us">美国(us)</option>
<option value="jp">日本(jp)</option>
</select>
<CardContent className="space-y-4">
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
<div className="space-y-2">
<Label>市场</Label>
<Select value={market} onValueChange={(v) => setMarket(v as any)}>
<SelectTrigger>
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="cn">中国 (cn)</SelectItem>
<SelectItem value="us">美国 (us)</SelectItem>
<SelectItem value="jp">日本 (jp)</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-2">
<Label>企业ID</Label>
<Input
value={orgId}
onChange={(e) => setOrgId(e.target.value)}
placeholder="输入企业ID,如 AAPL / 600519"
placeholder="输入企业ID,如 600519"
/>
<Button onClick={loadReports} disabled={loading}>查询</Button>
<Button onClick={triggerGenerate} variant="secondary">触发生成</Button>
</div>
{msg && <p className="text-xs text-muted-foreground">{msg}</p>}
<div className="space-y-2">
<Label>分析模板</Label>
<Select value={selectedTemplateId} onValueChange={setSelectedTemplateId}>
<SelectTrigger>
<SelectValue placeholder="选择一个分析模板..." />
</SelectTrigger>
<SelectContent>
{templateSets && Object.entries(templateSets).map(([id, set]) => (
<SelectItem key={id} value={id}>{set.name}</SelectItem>
))}
</SelectContent>
</Select>
</div>
</div>
<div className="flex gap-2">
<Button onClick={loadReports} disabled={loading}>查询历史</Button>
<Button onClick={triggerGenerate} variant="secondary" disabled={!selectedTemplateId}>触发生成</Button>
</div>
{msg && <p className="text-xs text-muted-foreground pt-2">{msg}</p>}
</CardContent>
</Card>

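`triggerGenerate` above encodes the new asynchronous contract: POST `/api/analysis-requests/{symbol}` with a JSON body carrying `template_id`, treating HTTP 202 plus a `request_id` as "accepted". A small sketch of the request the page builds (the helper `buildAnalysisRequest` and its return type are illustrative, not part of the diff):

```typescript
// Sketch of the request shape triggerGenerate sends. buildAnalysisRequest is
// a hypothetical helper used here only to make the contract testable.
type HttpRequestSpec = {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
};

function buildAnalysisRequest(symbol: string, templateId: string): HttpRequestSpec {
  return {
    url: `/api/analysis-requests/${encodeURIComponent(symbol)}`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ template_id: templateId }),
  };
}
```

Factoring the request construction out of the component would also let the success path (202 + `request_id`) be tested without a live gateway.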
@ -6,6 +6,7 @@ import {
AnalysisConfigResponse,
LlmProvidersConfig,
AnalysisModulesConfig,
AnalysisTemplateSets, // New type
FinancialConfigResponse,
DataSourcesConfig,
} from "@/types";
@ -345,7 +346,27 @@ export async function discoverProviderModelsPreview(apiBaseUrl: string, apiKey:
return res.json();
}

// --- Analysis Modules Config Hooks ---
// --- Analysis Template Sets Config Hooks (NEW) ---

export function useAnalysisTemplateSets() {
return useSWR<AnalysisTemplateSets>('/api/configs/analysis_template_sets', fetcher);
}

export async function updateAnalysisTemplateSets(payload: AnalysisTemplateSets) {
const res = await fetch('/api/configs/analysis_template_sets', {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload),
});
if (!res.ok) {
const text = await res.text().catch(() => '');
throw new Error(text || `HTTP ${res.status}`);
}
return res.json() as Promise<AnalysisTemplateSets>;
}


// --- Analysis Modules Config Hooks (OLD - DEPRECATED) ---

export function useAnalysisModules() {
return useSWR<AnalysisModulesConfig>('/api/configs/analysis_modules', fetcher);

@ -499,6 +499,32 @@ export interface AnalysisModuleConfig {
/** Collection of analysis module configs; keyed by module_id (e.g., bull_case) */
export type AnalysisModulesConfig = Record<string, AnalysisModuleConfig>;


// ============================================================================
// Analysis template set configuration types (NEW)
// ============================================================================

/**
 * A single analysis template set, representing one complete analysis workflow,
 * e.g., "Standard Fundamental Analysis"
 */
export interface AnalysisTemplateSet {
/** Human-readable template name, e.g., "标准基本面分析" */
name: string;
/**
 * All analysis modules contained in this template set.
 * Key: module ID (e.g., "fundamental_analysis")
 */
modules: Record<string, AnalysisModuleConfig>;
}

/**
 * The system-wide analysis template configuration, as the top-level object.
 * Key: template ID (e.g., "standard_fundamentals")
 */
export type AnalysisTemplateSets = Record<string, AnalysisTemplateSet>;


// ============================================================================
// Data source configuration types (kept structurally in sync with backend common-contracts)
// ============================================================================

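The `AnalysisTemplateSet` type above implies an invariant that the types alone cannot enforce: every entry in a module's `dependencies` must be a key of the same set's `modules` record. A sketch (not part of the commit) of a client-side check that could run before `PUT`ting the config, using pared-down local types:

```typescript
// Validates the dependency invariant of one template set: each dependency id
// must name another module in the same set. Returns the dangling edges.
// Local types mirror AnalysisTemplateSet / AnalysisModuleConfig in spirit.
type ModuleCfg = { name: string; dependencies: string[] };
type TemplateSet = { name: string; modules: Record<string, ModuleCfg> };

function findDanglingDependencies(set: TemplateSet): string[] {
  const errors: string[] = [];
  for (const [moduleId, cfg] of Object.entries(set.modules)) {
    for (const dep of cfg.dependencies ?? []) {
      if (!(dep in set.modules)) {
        errors.push(`${moduleId} -> ${dep}`);
      }
    }
  }
  return errors;
}
```

Surfacing dangling references in the config UI gives earlier feedback than a failed orchestrator run on the backend.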
@ -7,7 +7,7 @@ use axum::{
routing::{get, post},
Router,
};
use common_contracts::messages::FetchCompanyDataCommand;
use common_contracts::messages::{FetchCompanyDataCommand, GenerateReportCommand};
use common_contracts::observability::{HealthStatus, ServiceStatus, TaskProgress};
use futures_util::future::join_all;
use serde::{Deserialize, Serialize};
@ -16,6 +16,7 @@ use tracing::{info, warn};
use uuid::Uuid;

const DATA_FETCH_QUEUE: &str = "data_fetch_commands";
const ANALYSIS_COMMANDS_QUEUE: &str = "analysis.commands.generate_report";

// --- Request/Response Structs ---
#[derive(Deserialize)]
@ -29,6 +30,11 @@ pub struct RequestAcceptedResponse {
pub request_id: Uuid,
}

#[derive(Deserialize)]
pub struct AnalysisRequest {
pub template_id: String,
}

// --- Router Definition ---
pub fn create_router(app_state: AppState) -> Router {
Router::new()
@ -41,12 +47,25 @@ pub fn create_router(app_state: AppState) -> Router {
fn create_v1_router() -> Router<AppState> {
Router::new()
.route("/data-requests", post(trigger_data_fetch))
.route(
"/analysis-requests/{symbol}",
post(trigger_analysis_generation),
)
.route("/companies/{symbol}/profile", get(get_company_profile))
.route("/tasks/{request_id}", get(get_task_progress))
// --- New Config Routes ---
.route("/configs/llm_providers", get(get_llm_providers_config).put(update_llm_providers_config))
.route("/configs/analysis_modules", get(get_analysis_modules_config).put(update_analysis_modules_config))
.route("/configs/data_sources", get(get_data_sources_config).put(update_data_sources_config))
.route(
"/configs/llm_providers",
get(get_llm_providers_config).put(update_llm_providers_config),
)
.route(
"/configs/analysis_template_sets",
get(get_analysis_template_sets).put(update_analysis_template_sets),
)
.route(
"/configs/data_sources",
get(get_data_sources_config).put(update_data_sources_config),
)
// --- New Discover Routes ---
.route("/discover-models/{provider_id}", get(discover_models))
.route("/discover-models", post(discover_models_preview))
@ -102,6 +121,36 @@ async fn trigger_data_fetch(
))
}

/// [POST /v1/analysis-requests/:symbol]
/// Triggers the analysis report generation workflow by publishing a command.
async fn trigger_analysis_generation(
State(state): State<AppState>,
Path(symbol): Path<String>,
Json(payload): Json<AnalysisRequest>,
) -> Result<impl IntoResponse> {
let request_id = Uuid::new_v4();
let command = GenerateReportCommand {
request_id,
symbol,
template_id: payload.template_id,
};

info!(request_id = %request_id, "Publishing analysis generation command");

state
.nats_client
.publish(
ANALYSIS_COMMANDS_QUEUE.to_string(),
serde_json::to_vec(&command).unwrap().into(),
)
.await?;

Ok((
StatusCode::ACCEPTED,
Json(RequestAcceptedResponse { request_id }),
))
}

/// [GET /v1/companies/:symbol/profile]
/// Queries the persisted company profile from the data-persistence-service.
async fn get_company_profile(
@ -161,7 +210,9 @@ async fn get_task_progress(

// --- Config API Handlers (Proxy to data-persistence-service) ---

use common_contracts::config_models::{LlmProvidersConfig, AnalysisModulesConfig, DataSourcesConfig};
use common_contracts::config_models::{
AnalysisTemplateSets, DataSourcesConfig, LlmProvidersConfig,
};

/// [GET /v1/configs/llm_providers]
async fn get_llm_providers_config(
@ -180,20 +231,26 @@ async fn update_llm_providers_config(
Ok(Json(updated_config))
}

/// [GET /v1/configs/analysis_modules]
async fn get_analysis_modules_config(
/// [GET /v1/configs/analysis_template_sets]
async fn get_analysis_template_sets(
State(state): State<AppState>,
) -> Result<impl IntoResponse> {
let config = state.persistence_client.get_analysis_modules_config().await?;
let config = state
.persistence_client
.get_analysis_template_sets()
.await?;
Ok(Json(config))
}

/// [PUT /v1/configs/analysis_modules]
async fn update_analysis_modules_config(
/// [PUT /v1/configs/analysis_template_sets]
async fn update_analysis_template_sets(
State(state): State<AppState>,
Json(payload): Json<AnalysisModulesConfig>,
Json(payload): Json<AnalysisTemplateSets>,
) -> Result<impl IntoResponse> {
let updated_config = state.persistence_client.update_analysis_modules_config(&payload).await?;
let updated_config = state
.persistence_client
.update_analysis_template_sets(&payload)
.await?;
Ok(Json(updated_config))
}

@ -4,7 +4,7 @@

use crate::error::Result;
use common_contracts::dtos::CompanyProfileDto;
use common_contracts::config_models::{LlmProvidersConfig, AnalysisModulesConfig, DataSourcesConfig};
use common_contracts::config_models::{LlmProvidersConfig, DataSourcesConfig, AnalysisTemplateSets};

#[derive(Clone)]
pub struct PersistenceClient {
@ -48,7 +48,10 @@ impl PersistenceClient {
Ok(config)
}

pub async fn update_llm_providers_config(&self, payload: &LlmProvidersConfig) -> Result<LlmProvidersConfig> {
pub async fn update_llm_providers_config(
&self,
payload: &LlmProvidersConfig,
) -> Result<LlmProvidersConfig> {
let url = format!("{}/configs/llm_providers", self.base_url);
let updated_config = self
.client
@ -62,21 +65,24 @@ impl PersistenceClient {
Ok(updated_config)
}

pub async fn get_analysis_modules_config(&self) -> Result<AnalysisModulesConfig> {
let url = format!("{}/configs/analysis_modules", self.base_url);
pub async fn get_analysis_template_sets(&self) -> Result<AnalysisTemplateSets> {
let url = format!("{}/configs/analysis_template_sets", self.base_url);
let config = self
.client
.get(&url)
.send()
.await?
.error_for_status()?
.json::<AnalysisModulesConfig>()
.json::<AnalysisTemplateSets>()
.await?;
Ok(config)
}

pub async fn update_analysis_modules_config(&self, payload: &AnalysisModulesConfig) -> Result<AnalysisModulesConfig> {
let url = format!("{}/configs/analysis_modules", self.base_url);
pub async fn update_analysis_template_sets(
&self,
payload: &AnalysisTemplateSets,
) -> Result<AnalysisTemplateSets> {
let url = format!("{}/configs/analysis_template_sets", self.base_url);
let updated_config = self
.client
.put(&url)
@ -84,7 +90,7 @@ impl PersistenceClient {
.send()
.await?
.error_for_status()?
.json::<AnalysisModulesConfig>()
.json::<AnalysisTemplateSets>()
.await?;
Ok(updated_config)
}

@ -22,18 +22,50 @@ pub struct LlmProvider {
// Data structure for the entire LLM provider registry
pub type LlmProvidersConfig = HashMap<String, LlmProvider>; // Key: provider_id, e.g., "openai_official"

// Configuration for a single analysis module
#[derive(Serialize, Deserialize, Debug, Clone, ToSchema)]
// --- Analysis Module Config (NEW TEMPLATE-BASED STRUCTURE) ---

/// Top-level configuration object for all analysis templates.
/// Key: Template ID (e.g., "standard_fundamentals")
pub type AnalysisTemplateSets = HashMap<String, AnalysisTemplateSet>;

/// A single, self-contained set of analysis modules representing a complete workflow.
/// e.g., "Standard Fundamental Analysis"
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, ToSchema)]
pub struct AnalysisTemplateSet {
/// Human-readable name for the template set.
pub name: String,
/// All analysis modules contained within this template set.
/// Key: Module ID (e.g., "fundamental_analysis")
pub modules: HashMap<String, AnalysisModuleConfig>,
}

/// Configuration for a single analysis module.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, ToSchema)]
pub struct AnalysisModuleConfig {
pub name: String, // e.g., "看涨分析"
pub provider_id: String, // References a key in LlmProvidersConfig
pub model_id: String, // References the model_id of an LlmModel
pub name: String,
pub provider_id: String,
pub model_id: String,
pub prompt_template: String,
/// List of dependencies. Each string must be a key in the parent `modules` HashMap.
pub dependencies: Vec<String>,
}

// --- Analysis Module Config (OLD DEPRECATED STRUCTURE) ---

// This is the old, flat structure for analysis modules.
// It is DEPRECATED and will be removed once all services are migrated
// to the new AnalysisTemplateSets model.
pub type AnalysisModulesConfig = HashMap<String, OldAnalysisModuleConfig>;

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub struct OldAnalysisModuleConfig {
pub name: String,
pub provider_id: String,
pub model_id: String,
pub prompt_template: String,
pub dependencies: Vec<String>,
}

// Data structure for the full collection of analysis module configs
pub type AnalysisModulesConfig = HashMap<String, AnalysisModuleConfig>; // Key: module_id, e.g., "bull_case"

#[derive(Serialize, Deserialize, Debug, Clone, Default)]
pub struct SystemConfig {

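The `dependencies` lists above are what the report-generator's orchestrator turns into a DAG; per the commit description it uses petgraph's topological sort to find an execution order and to reject cycles. A language-agnostic sketch of the same idea (Kahn's algorithm) over a map of module id to dependency ids; the function name and shapes are illustrative, not the service's actual code:

```typescript
// Illustration only: the real orchestrator is Rust and uses petgraph. This is
// Kahn's algorithm over the `dependencies` edges of one template set. It
// returns a valid execution order, or null when a cycle prevents one.
function topologicalOrder(deps: Record<string, string[]>): string[] | null {
  const inDegree: Record<string, number> = {};
  const dependents: Record<string, string[]> = {};
  for (const id of Object.keys(deps)) inDegree[id] = 0;
  for (const [id, ds] of Object.entries(deps)) {
    for (const d of ds) {
      inDegree[id] += 1; // `id` waits on `d`
      if (!dependents[d]) dependents[d] = [];
      dependents[d].push(id); // finishing `d` unblocks `id`
    }
  }
  const queue = Object.keys(inDegree).filter((id) => inDegree[id] === 0);
  const order: string[] = [];
  while (queue.length > 0) {
    const id = queue.shift()!;
    order.push(id);
    for (const next of dependents[id] ?? []) {
      if (--inDegree[next] === 0) queue.push(next);
    }
  }
  // If some nodes never reached in-degree zero, the graph has a cycle
  // (or a dangling dependency), so no complete order exists.
  return order.length === Object.keys(deps).length ? order : null;
}
```

Note that a dangling dependency (an id absent from the map) is also reported as "no order" here; validating references separately gives a clearer error message.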
@ -48,27 +48,31 @@ pub struct DailyMarketDataBatchDto {
pub records: Vec<DailyMarketDataDto>,
}

// Analysis Results API DTOs
// Analysis Results API DTOs (NEW)
#[api_dto]
pub struct NewAnalysisResultDto {
pub struct NewAnalysisResult {
pub request_id: Uuid,
pub symbol: String,
pub template_id: String,
pub module_id: String,
pub model_name: Option<String>,
pub content: String,
pub meta_data: Option<JsonValue>,
pub meta_data: JsonValue,
}

/// Represents a persisted analysis result read from the database.
#[api_dto]
pub struct AnalysisResultDto {
pub id: Uuid,
pub id: i64,
pub request_id: Uuid,
pub symbol: String,
pub template_id: String,
pub module_id: String,
pub generated_at: chrono::DateTime<chrono::Utc>,
pub model_name: Option<String>,
pub content: String,
pub meta_data: Option<JsonValue>,
pub meta_data: JsonValue,
pub created_at: chrono::DateTime<chrono::Utc>,
}


// Realtime Quotes DTOs
#[api_dto]
pub struct RealtimeQuoteDto {

@ -2,13 +2,27 @@ use serde::{Serialize, Deserialize};
use uuid::Uuid;

// --- Commands ---
#[derive(Serialize, Deserialize, Debug, Clone)]
///
/// Published by: `api-gateway`
/// Consumed by: `*-provider-services`
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct FetchCompanyDataCommand {
pub request_id: Uuid,
pub symbol: String,
pub market: String,
}

/// Command to start a full report generation workflow.
///
/// Published by: `api-gateway`
/// Consumed by: `report-generator-service`
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct GenerateReportCommand {
pub request_id: Uuid,
pub symbol: String,
pub template_id: String,
}

// --- Events ---
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct CompanyProfilePersistedEvent {

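`GenerateReportCommand` above is what the gateway serializes with `serde_json` and publishes to the `analysis.commands.generate_report` subject. A TypeScript mirror of the wire shape, assuming serde's default field naming (which these plain `snake_case` fields already use) and a `Uuid` serialized as its hyphenated string; the concrete values below are illustrative:

```typescript
// Wire shape of GenerateReportCommand as serde_json would emit it for the
// Rust struct above. Field names match the struct; values are examples only.
interface GenerateReportCommand {
  request_id: string; // Uuid serializes to its hyphenated string form
  symbol: string;
  template_id: string;
}

const cmd: GenerateReportCommand = {
  request_id: "00000000-0000-0000-0000-000000000000",
  symbol: "600519",
  template_id: "standard_fundamentals",
};

// A JSON round-trip preserves exactly the field set the consumer expects.
const decoded = JSON.parse(JSON.stringify(cmd)) as GenerateReportCommand;
```

Keeping the frontend's request body (`{ template_id }`) and this command shape in one shared contract avoids drift between publisher and consumer.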
@ -74,7 +74,7 @@ wasm-cli = []
mcp = ["service_kit/mcp"]
# Optional: pass api-cli through to service_kit
# api-cli = ["service_kit/api-cli"]
full-data = []
# full-data = []

# --- For Local Development ---
# If you are developing `service_kit` locally, uncomment the following lines

@ -19,6 +19,8 @@ RUN cargo chef cook --release --recipe-path /app/services/data-persistence-servi
# Copy service sources for the actual build
COPY services/common-contracts /app/services/common-contracts
COPY services/data-persistence-service /app/services/data-persistence-service
## Copy root-level /config into /app/config so it can be embedded at compile time via include_str!
COPY config /app/config
RUN cargo build --release --bin data-persistence-service-server

FROM debian:bookworm-slim AS runtime
@ -30,5 +32,7 @@ COPY --from=builder /app/services/data-persistence-service/target/release/data-p
COPY services/data-persistence-service/migrations ./migrations
ENV HOST=0.0.0.0
ENV PORT=3000
## Allow startup to continue when migration versions diverge (container default only; can be overridden locally)
ENV SKIP_MIGRATIONS_ON_MISMATCH=1
EXPOSE 3000
ENTRYPOINT ["/usr/local/bin/data-persistence-service-server"]

@ -1,94 +1,121 @@
use crate::{
db,
dtos::{AnalysisResultDto, NewAnalysisResultDto},
AppState, ServerError,
};
use crate::models::AnalysisResult;
use crate::{AppState, ServerError};
use axum::{
extract::{Path, Query, State},
http::StatusCode,
response::IntoResponse,
Json,
};
use common_contracts::dtos::{AnalysisResultDto, NewAnalysisResult};
use serde::Deserialize;
use service_kit::api;
use uuid::Uuid;
use tracing::info;
use tracing::instrument;
use anyhow::Error as AnyhowError;

#[derive(Deserialize, utoipa::ToSchema)]
#[derive(Debug, Deserialize)]
pub struct AnalysisQuery {
pub symbol: String,
pub module_id: Option<String>,
}

#[api(POST, "/api/v1/analysis-results", output(detail = "AnalysisResultDto"))]
/// Creates a new analysis result and returns the created record.
#[instrument(skip(state, payload), fields(request_id = %payload.request_id, symbol = %payload.symbol, module_id = %payload.module_id))]
pub async fn create_analysis_result(
State(state): State<AppState>,
Json(payload): Json<NewAnalysisResultDto>,
) -> Result<Json<AnalysisResultDto>, ServerError> {
info!(target: "api", symbol = %payload.symbol, module_id = %payload.module_id, "POST /analysis-results → create_analysis_result called");
let new_result = db::create_analysis_result(&state.pool, &payload).await?;
Json(payload): Json<NewAnalysisResult>,
) -> Result<impl IntoResponse, ServerError> {
let result = sqlx::query_as::<_, AnalysisResult>(
r#"
INSERT INTO analysis_results (request_id, symbol, template_id, module_id, content, meta_data)
VALUES ($1, $2, $3, $4, $5, $6)
RETURNING *
"#
)
.bind(&payload.request_id)
.bind(&payload.symbol)
.bind(&payload.template_id)
.bind(&payload.module_id)
.bind(&payload.content)
.bind(&payload.meta_data)
.fetch_one(state.pool())
.await
.map_err(AnyhowError::from)?;

// Convert model to DTO
let dto = AnalysisResultDto {
id: new_result.id,
symbol: new_result.symbol,
module_id: new_result.module_id,
generated_at: new_result.generated_at,
model_name: new_result.model_name,
content: new_result.content,
meta_data: new_result.meta_data,
id: result.id,
request_id: result.request_id,
symbol: result.symbol,
template_id: result.template_id,
module_id: result.module_id,
content: result.content,
meta_data: result.meta_data,
created_at: result.created_at,
};

info!(target: "api", id = %dto.id, symbol = %dto.symbol, module_id = %dto.module_id, "create_analysis_result completed");
Ok(Json(dto))
Ok((StatusCode::CREATED, Json(dto)))
}

#[api(GET, "/api/v1/analysis-results", output(list = "AnalysisResultDto"))]
/// Retrieves all analysis results for a given symbol.
#[instrument(skip(state))]
pub async fn get_analysis_results(
State(state): State<AppState>,
Query(query): Query<AnalysisQuery>,
) -> Result<Json<Vec<AnalysisResultDto>>, ServerError> {
info!(target: "api", symbol = %query.symbol, module_id = ?query.module_id, "GET /analysis-results → get_analysis_results called");
let results = db::get_analysis_results(&state.pool, &query.symbol, query.module_id.as_deref()).await?;
let results = sqlx::query_as::<_, AnalysisResult>(
r#"
SELECT * FROM analysis_results
WHERE symbol = $1
ORDER BY created_at DESC
"#
)
.bind(&query.symbol)
.fetch_all(state.pool())
.await
.map_err(AnyhowError::from)?;

// Convert Vec<Model> to Vec<Dto>
let dtos: Vec<AnalysisResultDto> = results
let dtos = results
.into_iter()
.map(|r| AnalysisResultDto {
id: r.id,
request_id: r.request_id,
symbol: r.symbol,
template_id: r.template_id,
module_id: r.module_id,
generated_at: r.generated_at,
model_name: r.model_name,
content: r.content,
meta_data: r.meta_data,
created_at: r.created_at,
})
.collect();

info!(target: "api", count = dtos.len(), symbol = %query.symbol, "get_analysis_results completed");
Ok(Json(dtos))
}

#[api(GET, "/api/v1/analysis-results/{id}", output(detail = "AnalysisResultDto"))]
/// Retrieves a single analysis result by its primary ID.
#[instrument(skip(state))]
pub async fn get_analysis_result_by_id(
State(state): State<AppState>,
Path(id): Path<String>,
Path(id): Path<i64>,
) -> Result<Json<AnalysisResultDto>, ServerError> {
let parsed = Uuid::parse_str(&id).map_err(|e| ServerError::Anyhow(e.into()))?;
info!(target: "api", id = %id, "GET /analysis-results/{{id}} → get_analysis_result_by_id called");
let result = db::get_analysis_result_by_id(&state.pool, parsed)
.await?
.ok_or_else(|| ServerError::NotFound(format!("Analysis result with id '{}' not found", id)))?;
let result = sqlx::query_as::<_, AnalysisResult>(
r#"
SELECT * FROM analysis_results
WHERE id = $1
"#
)
.bind(&id)
.fetch_one(state.pool())
.await
.map_err(AnyhowError::from)?;

// Convert model to DTO
let dto = AnalysisResultDto {
id: result.id,
request_id: result.request_id,
symbol: result.symbol,
template_id: result.template_id,
module_id: result.module_id,
generated_at: result.generated_at,
model_name: result.model_name,
content: result.content,
meta_data: result.meta_data,
created_at: result.created_at,
};

info!(target: "api", id = %dto.id, symbol = %dto.symbol, module_id = %dto.module_id, "get_analysis_result_by_id completed");
Ok(Json(dto))
}

@ -9,6 +9,7 @@ use axum::{
};
use service_kit::api;
use tracing::info;
use anyhow::Error as AnyhowError;

#[api(PUT, "/api/v1/companies")]
pub async fn upsert_company(
@ -16,7 +17,7 @@ pub async fn upsert_company(
Json(payload): Json<CompanyProfileDto>,
) -> Result<(), ServerError> {
info!(target: "api", symbol = %payload.symbol, "PUT /companies → upsert_company called");
db::upsert_company(&state.pool, &payload).await?;
db::upsert_company(&state.pool, &payload).await.map_err(AnyhowError::from)?;
info!(target: "api", symbol = %payload.symbol, "upsert_company completed");
Ok(())
}
@ -28,7 +29,8 @@ pub async fn get_company_by_symbol(
) -> Result<Json<CompanyProfileDto>, ServerError> {
info!(target: "api", symbol = %symbol, "GET /companies/{{symbol}} → get_company_by_symbol called");
let company = db::get_company_by_symbol(&state.pool, &symbol)
.await?
.await
.map_err(AnyhowError::from)?
.ok_or_else(|| ServerError::NotFound(format!("Company with symbol '{}' not found", symbol)))?;

// Convert from model to DTO

@@ -1,10 +1,7 @@
 use axum::{extract::State, Json};
-use common_contracts::config_models::{
-    AnalysisModulesConfig, DataSourceConfig, LlmProvidersConfig,
-};
+use common_contracts::config_models::{AnalysisTemplateSets, DataSourcesConfig, LlmProvidersConfig};
 use service_kit::api;
-use std::collections::HashMap;
 use tracing::instrument;
 use crate::{db::system_config, AppState, ServerError};
 
 #[api(GET, "/api/v1/configs/llm_providers", output(detail = "LlmProvidersConfig"))]
@@ -26,27 +23,25 @@ pub async fn update_llm_providers_config(
     Ok(Json(updated_config))
 }
 
-#[api(GET, "/api/v1/configs/analysis_modules", output(detail = "AnalysisModulesConfig"))]
-pub async fn get_analysis_modules_config(
+#[api(GET, "/api/v1/configs/analysis_template_sets", output(detail = "AnalysisTemplateSets"))]
+pub async fn get_analysis_template_sets(
     State(state): State<AppState>,
-) -> Result<Json<AnalysisModulesConfig>, ServerError> {
+) -> Result<Json<AnalysisTemplateSets>, ServerError> {
     let pool = state.pool();
-    let config = system_config::get_config::<AnalysisModulesConfig>(pool, "analysis_modules").await?;
+    let config = system_config::get_config::<AnalysisTemplateSets>(pool, "analysis_template_sets").await?;
     Ok(Json(config))
 }
 
-#[api(PUT, "/api/v1/configs/analysis_modules", output(detail = "AnalysisModulesConfig"))]
-pub async fn update_analysis_modules_config(
+#[api(PUT, "/api/v1/configs/analysis_template_sets", output(detail = "AnalysisTemplateSets"))]
+pub async fn update_analysis_template_sets(
     State(state): State<AppState>,
-    Json(payload): Json<AnalysisModulesConfig>,
-) -> Result<Json<AnalysisModulesConfig>, ServerError> {
+    Json(payload): Json<AnalysisTemplateSets>,
+) -> Result<Json<AnalysisTemplateSets>, ServerError> {
     let pool = state.pool();
-    let updated_config = system_config::update_config(pool, "analysis_modules", &payload).await?;
-    Ok(Json(updated_config))
+    let updated = system_config::update_config(pool, "analysis_template_sets", &payload).await?;
+    Ok(Json(updated))
 }
 
-pub type DataSourcesConfig = HashMap<String, DataSourceConfig>;
 
 #[api(
     GET,
     "/api/v1/configs/data_sources",
@@ -11,6 +11,7 @@ use chrono::NaiveDate;
 use serde::Deserialize;
 use service_kit::api;
 use tracing::info;
+use anyhow::Error as AnyhowError;
 
 #[derive(Deserialize, utoipa::ToSchema)]
 pub struct FinancialsQuery {
@@ -23,7 +24,7 @@ pub async fn batch_insert_financials(
     Json(payload): Json<crate::dtos::TimeSeriesFinancialBatchDto>,
 ) -> Result<axum::http::StatusCode, ServerError> {
     info!(target: "api", count = payload.records.len(), "POST /market-data/financials/batch → batch_insert_financials called");
-    db::batch_insert_financials(&state.pool, &payload.records).await?;
+    db::batch_insert_financials(&state.pool, &payload.records).await.map_err(AnyhowError::from)?;
     info!(target: "api", count = payload.records.len(), "batch_insert_financials completed");
     Ok(axum::http::StatusCode::CREATED)
 }
@@ -36,7 +37,7 @@ pub async fn get_financials_by_symbol(
 ) -> Result<Json<Vec<TimeSeriesFinancialDto>>, ServerError> {
     info!(target: "api", symbol = %symbol, metrics = ?query.metrics, "GET /market-data/financials/{{symbol}} → get_financials_by_symbol called");
     let metrics = query.metrics.map(|s| s.split(',').map(String::from).collect());
-    let financials = db::get_financials_by_symbol(&state.pool, &symbol, metrics).await?;
+    let financials = db::get_financials_by_symbol(&state.pool, &symbol, metrics).await.map_err(AnyhowError::from)?;
 
     // Convert Vec<Model> to Vec<Dto>
     let dtos: Vec<TimeSeriesFinancialDto> = financials
@@ -70,7 +71,7 @@ pub async fn upsert_realtime_quote(
     Json(quote): Json<RealtimeQuoteDto>,
 ) -> Result<axum::http::StatusCode, ServerError> {
     info!(target: "api", symbol = %quote.symbol, market = %quote.market, "POST /market-data/quotes → upsert_realtime_quote called");
-    db::insert_realtime_quote(&state.pool, &quote).await?;
+    db::insert_realtime_quote(&state.pool, &quote).await.map_err(AnyhowError::from)?;
     Ok(axum::http::StatusCode::CREATED)
 }
@@ -82,7 +83,7 @@ pub async fn get_latest_realtime_quote(
 ) -> Result<Json<RealtimeQuoteDto>, ServerError> {
     let market = q.market.clone();
     info!(target: "api", symbol = %symbol, market = %market, "GET /market-data/quotes/{{market}}/{{symbol}} → get_latest_realtime_quote called");
-    if let Some(rec) = db::get_latest_realtime_quote(&state.pool, &market, &symbol).await? {
+    if let Some(rec) = db::get_latest_realtime_quote(&state.pool, &market, &symbol).await.map_err(AnyhowError::from)? {
         if let Some(max_age) = q.max_age_seconds {
             let cutoff = chrono::Utc::now() - chrono::Duration::seconds(max_age);
             if rec.ts < cutoff {
@@ -121,7 +122,7 @@ pub async fn batch_insert_daily_data(
     Json(payload): Json<crate::dtos::DailyMarketDataBatchDto>,
 ) -> Result<axum::http::StatusCode, ServerError> {
     info!(target: "api", count = payload.records.len(), "POST /market-data/daily/batch → batch_insert_daily_data called");
-    db::batch_insert_daily_data(&state.pool, &payload.records).await?;
+    db::batch_insert_daily_data(&state.pool, &payload.records).await.map_err(AnyhowError::from)?;
     info!(target: "api", count = payload.records.len(), "batch_insert_daily_data completed");
     Ok(axum::http::StatusCode::CREATED)
 }
@@ -135,7 +136,8 @@ pub async fn get_daily_data_by_symbol(
     info!(target: "api", symbol = %symbol, start = ?query.start_date, end = ?query.end_date, "GET /market-data/daily/{{symbol}} → get_daily_data_by_symbol called");
     let daily_data =
         db::get_daily_data_by_symbol(&state.pool, &symbol, query.start_date, query.end_date)
-            .await?;
+            .await
+            .map_err(AnyhowError::from)?;
 
     // Convert Vec<Model> to Vec<Dto>
     let dtos: Vec<DailyMarketDataDto> = daily_data
@@ -1,10 +1,47 @@
 // This module will contain all the API handler definitions
 // which are then collected by the `inventory` crate.
-#[cfg(feature = "full-data")]
-pub mod companies;
-#[cfg(feature = "full-data")]
-pub mod market_data;
-#[cfg(feature = "full-data")]
-pub mod analysis;
-pub mod system;
-pub mod configs;
+mod analysis;
+mod companies;
+mod configs;
+mod market_data;
+mod system;
+
+use crate::AppState;
+use axum::{
+    routing::{get, post},
+    Router,
+};
+
+pub fn create_router(_state: AppState) -> Router<AppState> {
+    let router: Router<AppState> = Router::new()
+        // System
+        .route("/health", get(system::get_health))
+        // Configs
+        .route(
+            "/configs/llm_providers",
+            get(configs::get_llm_providers_config).put(configs::update_llm_providers_config),
+        )
+        .route(
+            "/configs/analysis_template_sets",
+            get(configs::get_analysis_template_sets).put(configs::update_analysis_template_sets),
+        )
+        .route(
+            "/configs/data_sources",
+            get(configs::get_data_sources_config).put(configs::update_data_sources_config),
+        )
+        // Companies
+        .route("/companies/{symbol}", get(companies::get_company_by_symbol))
+        // Market Data
+        .route(
+            "/market-data/financial-statements/{symbol}",
+            get(market_data::get_financials_by_symbol),
+        )
+        // Analysis Results
+        .route(
+            "/analysis-results",
+            post(analysis::create_analysis_result).get(analysis::get_analysis_results),
+        )
+        .route(
+            "/analysis-results/{id}",
+            get(analysis::get_analysis_result_by_id),
+        );
+
+    router
+}
42
services/data-persistence-service/src/db/companies.rs
Normal file
@@ -0,0 +1,42 @@
use common_contracts::dtos::CompanyProfileDto;
use common_contracts::models::CompanyProfile;
use sqlx::PgPool;

pub async fn upsert_company(pool: &PgPool, payload: &CompanyProfileDto) -> Result<(), sqlx::Error> {
    sqlx::query(
        r#"
        INSERT INTO company_profiles (symbol, name, industry, list_date, additional_info, updated_at)
        VALUES ($1, $2, $3, $4, $5, NOW())
        ON CONFLICT (symbol) DO UPDATE SET
            name = EXCLUDED.name,
            industry = EXCLUDED.industry,
            list_date = EXCLUDED.list_date,
            additional_info = EXCLUDED.additional_info,
            updated_at = NOW()
        "#,
    )
    .bind(&payload.symbol)
    .bind(&payload.name)
    .bind(&payload.industry)
    .bind(&payload.list_date)
    .bind(&payload.additional_info)
    .execute(pool)
    .await?;
    Ok(())
}

pub async fn get_company_by_symbol(pool: &PgPool, symbol: &str) -> Result<Option<CompanyProfile>, sqlx::Error> {
    let rec = sqlx::query_as::<_, CompanyProfile>(
        r#"
        SELECT symbol, name, industry, list_date, additional_info, updated_at
        FROM company_profiles
        WHERE symbol = $1
        "#,
    )
    .bind(symbol)
    .fetch_optional(pool)
    .await?;
    Ok(rec)
}
237
services/data-persistence-service/src/db/market_data.rs
Normal file
@@ -0,0 +1,237 @@
use chrono::NaiveDate;
use common_contracts::dtos::{DailyMarketDataDto, RealtimeQuoteDto, TimeSeriesFinancialDto};
use common_contracts::models::{DailyMarketData, RealtimeQuote, TimeSeriesFinancial};
use sqlx::PgPool;

pub async fn batch_insert_financials(pool: &PgPool, records: &[TimeSeriesFinancialDto]) -> Result<(), sqlx::Error> {
    let mut tx = pool.begin().await?;
    for r in records {
        sqlx::query(
            r#"
            INSERT INTO time_series_financials (symbol, metric_name, period_date, value, source)
            VALUES ($1, $2, $3, $4, $5)
            ON CONFLICT (symbol, metric_name, period_date) DO UPDATE SET
                value = EXCLUDED.value,
                source = EXCLUDED.source
            "#,
        )
        .bind(&r.symbol)
        .bind(&r.metric_name)
        .bind(&r.period_date)
        .bind(&r.value)
        .bind(&r.source)
        .execute(&mut *tx)
        .await?;
    }
    tx.commit().await?;
    Ok(())
}

pub async fn get_financials_by_symbol(
    pool: &PgPool,
    symbol: &str,
    metrics: Option<Vec<String>>,
) -> Result<Vec<TimeSeriesFinancial>, sqlx::Error> {
    if let Some(metrics) = metrics {
        let recs = sqlx::query_as::<_, TimeSeriesFinancial>(
            r#"
            SELECT symbol, metric_name, period_date, value, source
            FROM time_series_financials
            WHERE symbol = $1 AND metric_name = ANY($2)
            ORDER BY period_date DESC
            "#,
        )
        .bind(symbol)
        .bind(&metrics)
        .fetch_all(pool)
        .await?;
        Ok(recs)
    } else {
        let recs = sqlx::query_as::<_, TimeSeriesFinancial>(
            r#"
            SELECT symbol, metric_name, period_date, value, source
            FROM time_series_financials
            WHERE symbol = $1
            ORDER BY period_date DESC
            "#,
        )
        .bind(symbol)
        .fetch_all(pool)
        .await?;
        Ok(recs)
    }
}

pub async fn insert_realtime_quote(pool: &PgPool, q: &RealtimeQuoteDto) -> Result<(), sqlx::Error> {
    sqlx::query(
        r#"
        INSERT INTO realtime_quotes (
            symbol, market, ts, price, open_price, high_price, low_price,
            prev_close, change, change_percent, volume, source, updated_at
        )
        VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, NOW())
        ON CONFLICT (symbol, market, ts) DO UPDATE SET
            price = EXCLUDED.price,
            open_price = EXCLUDED.open_price,
            high_price = EXCLUDED.high_price,
            low_price = EXCLUDED.low_price,
            prev_close = EXCLUDED.prev_close,
            change = EXCLUDED.change,
            change_percent = EXCLUDED.change_percent,
            volume = EXCLUDED.volume,
            source = EXCLUDED.source,
            updated_at = NOW()
        "#,
    )
    .bind(&q.symbol)
    .bind(&q.market)
    .bind(&q.ts)
    .bind(&q.price)
    .bind(&q.open_price)
    .bind(&q.high_price)
    .bind(&q.low_price)
    .bind(&q.prev_close)
    .bind(&q.change)
    .bind(&q.change_percent)
    .bind(&q.volume)
    .bind(&q.source)
    .execute(pool)
    .await?;
    Ok(())
}

pub async fn get_latest_realtime_quote(
    pool: &PgPool,
    market: &str,
    symbol: &str,
) -> Result<Option<RealtimeQuote>, sqlx::Error> {
    let rec = sqlx::query_as::<_, RealtimeQuote>(
        r#"
        SELECT symbol, market, ts, price, open_price, high_price, low_price,
               prev_close, change, change_percent, volume, source, updated_at
        FROM realtime_quotes
        WHERE market = $1 AND symbol = $2
        ORDER BY ts DESC
        LIMIT 1
        "#,
    )
    .bind(market)
    .bind(symbol)
    .fetch_optional(pool)
    .await?;
    Ok(rec)
}

pub async fn batch_insert_daily_data(pool: &PgPool, records: &[DailyMarketDataDto]) -> Result<(), sqlx::Error> {
    let mut tx = pool.begin().await?;
    for r in records {
        sqlx::query(
            r#"
            INSERT INTO daily_market_data (
                symbol, trade_date, open_price, high_price, low_price, close_price,
                volume, pe, pb, total_mv
            )
            VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
            ON CONFLICT (symbol, trade_date) DO UPDATE SET
                open_price = EXCLUDED.open_price,
                high_price = EXCLUDED.high_price,
                low_price = EXCLUDED.low_price,
                close_price = EXCLUDED.close_price,
                volume = EXCLUDED.volume,
                pe = EXCLUDED.pe,
                pb = EXCLUDED.pb,
                total_mv = EXCLUDED.total_mv
            "#,
        )
        .bind(&r.symbol)
        .bind(&r.trade_date)
        .bind(&r.open_price)
        .bind(&r.high_price)
        .bind(&r.low_price)
        .bind(&r.close_price)
        .bind(&r.volume)
        .bind(&r.pe)
        .bind(&r.pb)
        .bind(&r.total_mv)
        .execute(&mut *tx)
        .await?;
    }
    tx.commit().await?;
    Ok(())
}

pub async fn get_daily_data_by_symbol(
    pool: &PgPool,
    symbol: &str,
    start_date: Option<NaiveDate>,
    end_date: Option<NaiveDate>,
) -> Result<Vec<DailyMarketData>, sqlx::Error> {
    match (start_date, end_date) {
        (Some(start), Some(end)) => {
            let recs = sqlx::query_as::<_, DailyMarketData>(
                r#"
                SELECT symbol, trade_date, open_price, high_price, low_price, close_price,
                       volume, pe, pb, total_mv
                FROM daily_market_data
                WHERE symbol = $1 AND trade_date BETWEEN $2 AND $3
                ORDER BY trade_date DESC
                "#,
            )
            .bind(symbol)
            .bind(start)
            .bind(end)
            .fetch_all(pool)
            .await?;
            Ok(recs)
        }
        (Some(start), None) => {
            let recs = sqlx::query_as::<_, DailyMarketData>(
                r#"
                SELECT symbol, trade_date, open_price, high_price, low_price, close_price,
                       volume, pe, pb, total_mv
                FROM daily_market_data
                WHERE symbol = $1 AND trade_date >= $2
                ORDER BY trade_date DESC
                "#,
            )
            .bind(symbol)
            .bind(start)
            .fetch_all(pool)
            .await?;
            Ok(recs)
        }
        (None, Some(end)) => {
            let recs = sqlx::query_as::<_, DailyMarketData>(
                r#"
                SELECT symbol, trade_date, open_price, high_price, low_price, close_price,
                       volume, pe, pb, total_mv
                FROM daily_market_data
                WHERE symbol = $1 AND trade_date <= $2
                ORDER BY trade_date DESC
                "#,
            )
            .bind(symbol)
            .bind(end)
            .fetch_all(pool)
            .await?;
            Ok(recs)
        }
        (None, None) => {
            let recs = sqlx::query_as::<_, DailyMarketData>(
                r#"
                SELECT symbol, trade_date, open_price, high_price, low_price, close_price,
                       volume, pe, pb, total_mv
                FROM daily_market_data
                WHERE symbol = $1
                ORDER BY trade_date DESC
                "#,
            )
            .bind(symbol)
            .fetch_all(pool)
            .await?;
            Ok(recs)
        }
    }
}
@@ -5,3 +5,11 @@
 // to fetch or store data.
 
 pub mod system_config;
+pub mod companies;
+pub mod market_data;
+
+pub use companies::{get_company_by_symbol, upsert_company};
+pub use market_data::{
+    batch_insert_daily_data, batch_insert_financials, get_daily_data_by_symbol,
+    get_financials_by_symbol, get_latest_realtime_quote, insert_realtime_quote,
+};
@@ -1,7 +1,8 @@
+mod seeding;
 use data_persistence_service as app;
 use axum::Router;
 use sqlx::PgPool;
-use thiserror::Error;
+use sqlx::migrate::MigrateError;
 use tracing_subscriber::{EnvFilter, fmt::SubscriberBuilder};
 use tower_http::trace::TraceLayer;
 
@@ -19,6 +20,49 @@ pub async fn main() {
     let db_url = std::env::var("DATABASE_URL").expect("DATABASE_URL must be set");
     let pool = PgPool::connect(&db_url).await.expect("Failed to connect to database");
 
+    // Run database migrations (strict by default; can be skipped via env for dev)
+    let skip_migrations = std::env::var("SKIP_MIGRATIONS")
+        .map(|v| {
+            let v = v.to_ascii_lowercase();
+            v == "1" || v == "true" || v == "yes" || v == "on"
+        })
+        .unwrap_or(false);
+    if skip_migrations {
+        println!("⚠️ SKIP_MIGRATIONS=1 → skipping database migrations (recommended for local development/debugging only).");
+    } else {
+        let res = sqlx::migrate!("./migrations").run(&pool).await;
+        if let Err(e) = res {
+            let allow_on_mismatch = std::env::var("SKIP_MIGRATIONS_ON_MISMATCH")
+                .map(|v| {
+                    let v = v.to_ascii_lowercase();
+                    v == "1" || v == "true" || v == "yes" || v == "on"
+                })
+                .unwrap_or(false);
+            match &e {
+                MigrateError::VersionMismatch(ver) if allow_on_mismatch => {
+                    eprintln!(
+                        "❗ Migration version mismatch detected: VersionMismatch({}).\n\
+                         Skipping the migration run because SKIP_MIGRATIONS_ON_MISMATCH=1; continuing startup.\n\
+                         Suggested fixes:\n\
+                         1) Never edit an already-applied migration file; create a new migration for changes;\n\
+                         2) Or run `sqlx migrate repair` to fix the checksums;\n\
+                         3) Or drop/recreate the database and re-apply the migrations.",
+                        ver
+                    );
+                }
+                _ => {
+                    panic!("Failed to run database migrations: {}", e);
+                }
+            }
+        }
+    }
+
+    // Seed the database with default data if necessary
+    if let Err(e) = seeding::seed_data(&pool).await {
+        tracing::error!("Failed to seed database: {}", e);
+        // We don't exit here, as the app might still be functional.
+    }
+
     let state = app::AppState::new(pool);
 
     let openapi = app::build_openapi_spec();
@@ -1,12 +1,25 @@
-use common_contracts::config_models::{AnalysisModulesConfig, LlmProvidersConfig, DataSourceConfig};
+use common_contracts::config_models::{AnalysisTemplateSets, DataSourceConfig, LlmProvidersConfig};
 use serde::{Deserialize, Serialize};
 use std::collections::HashMap;
+use uuid::Uuid;
 
 pub use common_contracts::models::*;
 
 #[derive(Serialize, Deserialize, Debug, Clone, Default)]
 pub struct SystemConfig {
     pub llm_providers: LlmProvidersConfig,
-    pub analysis_modules: AnalysisModulesConfig,
+    pub analysis_template_sets: AnalysisTemplateSets,
     pub data_sources: HashMap<String, DataSourceConfig>,
 }
+
+#[derive(Debug, Clone, Serialize, sqlx::FromRow)]
+pub struct AnalysisResult {
+    pub id: i64,
+    pub request_id: Uuid,
+    pub symbol: String,
+    pub template_id: String,
+    pub module_id: String,
+    pub content: String,
+    pub meta_data: serde_json::Value,
+    pub created_at: chrono::DateTime<chrono::Utc>,
+}
91
services/data-persistence-service/src/seeding.rs
Normal file
@@ -0,0 +1,91 @@
//! One-time data seeding logic for initializing the database.

use data_persistence_service::models::SystemConfig;
use common_contracts::config_models::{AnalysisModuleConfig, AnalysisTemplateSet, AnalysisTemplateSets};
use sqlx::PgPool;
use std::collections::HashMap;
use tracing::info;

const DEFAULT_ANALYSIS_CONFIG_JSON: &str = include_str!("../../../config/analysis-config.json");
const CONFIG_KEY: &str = "analysis_template_sets";

#[derive(serde::Deserialize)]
struct RawAnalysisConfig {
    analysis_modules: HashMap<String, RawModule>,
}

#[derive(serde::Deserialize)]
struct RawModule {
    name: String,
    #[serde(default)]
    dependencies: Vec<String>,
    #[serde(rename = "model")]
    model_id: String,
    prompt_template: String,
}

/// Seeds the database with default configurations if they don't already exist.
pub async fn seed_data(pool: &PgPool) -> Result<(), sqlx::Error> {
    info!("Checking if default data seeding is required...");

    let mut tx = pool.begin().await?;

    // Check if the 'analysis_template_sets' config already exists.
    let count: i64 = sqlx::query_scalar("SELECT COUNT(*) FROM system_config WHERE config_key = $1")
        .bind(CONFIG_KEY)
        .fetch_one(&mut *tx)
        .await?;

    if count == 0 {
        info!("No 'analysis_template_sets' config found. Seeding default analysis templates...");

        // Parse the config file structure currently in the repository:
        // { "analysis_modules": { "<module_id>": { name, model, prompt_template, dependencies? } } }
        let raw: RawAnalysisConfig = serde_json::from_str(DEFAULT_ANALYSIS_CONFIG_JSON)
            .expect("Failed to parse embedded default analysis config JSON");

        let modules = raw
            .analysis_modules
            .into_iter()
            .map(|(k, v)| {
                (
                    k,
                    AnalysisModuleConfig {
                        name: v.name,
                        provider_id: "".to_string(), // to be configured by the user later
                        model_id: v.model_id,
                        prompt_template: v.prompt_template,
                        dependencies: v.dependencies,
                    },
                )
            })
            .collect();

        let default_template_set = AnalysisTemplateSet {
            name: "默认分析模板".to_string(),
            modules,
        };

        let mut template_sets = AnalysisTemplateSets::new();
        template_sets.insert("default".to_string(), default_template_set);

        // Store only the analysis_template_sets value itself under this key.
        let config_value = serde_json::to_value(&template_sets)
            .expect("Failed to serialize default analysis template sets");

        // Insert the new default config.
        sqlx::query("INSERT INTO system_config (config_key, config_value) VALUES ($1, $2)")
            .bind(CONFIG_KEY)
            .bind(config_value)
            .execute(&mut *tx)
            .await?;

        info!("Successfully seeded default analysis templates.");
    } else {
        info!("Database already seeded. Skipping.");
    }

    tx.commit().await?;

    Ok(())
}
17
services/report-generator-service/Cargo.lock
generated
@@ -728,6 +728,12 @@ version = "0.1.5"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "3a3076410a55c90011c298b04d0cfa770b00fa04e1e3c97d3f6c9de105a03844"
 
+[[package]]
+name = "fixedbitset"
+version = "0.4.2"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
+
 [[package]]
 name = "flume"
 version = "0.11.1"
@@ -1657,6 +1663,16 @@ dependencies = [
  "sha2",
 ]
 
+[[package]]
+name = "petgraph"
+version = "0.6.5"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "b4c5cc86750666a3ed20bdaf5ca2a0344f9c67674cae0515bec2da16fbaa47db"
+dependencies = [
+ "fixedbitset",
+ "indexmap",
+]
+
 [[package]]
 name = "phf"
 version = "0.11.3"
@@ -2037,6 +2053,7 @@ dependencies = [
  "config",
  "dashmap",
  "futures",
+ "petgraph",
  "reqwest",
  "secrecy",
  "serde",
@@ -41,3 +41,4 @@ thiserror = "2.0.17"
 anyhow = "1.0"
 chrono = "0.4.38"
 tera = "1.19"
+petgraph = "0.6.5"
@@ -38,7 +38,7 @@ async fn main() -> Result<()> {
         .map_err(|e| ProviderError::Internal(anyhow::anyhow!(e.to_string())))?;
     let state_clone = app_state.clone();
     tokio::spawn(async move {
-        if let Err(e) = message_consumer::subscribe_to_events(state_clone, nats_client).await {
+        if let Err(e) = message_consumer::subscribe_to_commands(state_clone, nats_client).await {
             tracing::error!("message consumer exited with error: {:?}", e);
         }
     });
@@ -1,36 +1,40 @@
 use std::sync::Arc;
 
-use common_contracts::messages::FinancialsPersistedEvent;
+use common_contracts::messages::GenerateReportCommand;
 use futures::StreamExt;
 use tracing::{error, info};
 
 use crate::{state::AppState, worker::run_report_generation_workflow};
 
-const SUBJECT_NAME: &str = "events.data.financials_persisted";
+const SUBJECT_NAME: &str = "analysis.commands.generate_report";
 
-pub async fn subscribe_to_events(
+pub async fn subscribe_to_commands(
     app_state: AppState,
     nats_client: async_nats::Client,
 ) -> Result<(), anyhow::Error> {
     let mut subscriber = nats_client.subscribe(SUBJECT_NAME.to_string()).await?;
     info!(
-        "Consumer started, waiting for messages on subject '{}'",
+        "Consumer started, waiting for commands on subject '{}'",
         SUBJECT_NAME
     );
 
     while let Some(message) = subscriber.next().await {
-        info!("Received NATS message for financials persisted event.");
+        info!("Received NATS command to generate report.");
         let state_clone = app_state.clone();
         tokio::spawn(async move {
-            match serde_json::from_slice::<FinancialsPersistedEvent>(&message.payload) {
-                Ok(event) => {
-                    info!("Deserialized event for symbol: {}", event.symbol);
-                    if let Err(e) = run_report_generation_workflow(Arc::new(state_clone), event).await {
+            match serde_json::from_slice::<GenerateReportCommand>(&message.payload) {
+                Ok(command) => {
+                    info!(
+                        "Deserialized command for symbol: {}, template: {}",
+                        command.symbol, command.template_id
+                    );
+                    if let Err(e) = run_report_generation_workflow(Arc::new(state_clone), command).await
+                    {
                         error!("Error running report generation workflow: {:?}", e);
                     }
                 }
                 Err(e) => {
-                    error!("Failed to deserialize message: {}", e);
+                    error!("Failed to deserialize GenerateReportCommand: {}", e);
                 }
             }
         });
@@ -6,8 +6,11 @@
 
 use crate::error::Result;
 use common_contracts::{
-    config_models::{AnalysisModulesConfig, LlmProvidersConfig},
-    dtos::{CompanyProfileDto, RealtimeQuoteDto, TimeSeriesFinancialBatchDto, TimeSeriesFinancialDto},
+    config_models::{AnalysisTemplateSets, LlmProvidersConfig},
+    dtos::{
+        CompanyProfileDto, NewAnalysisResult, RealtimeQuoteDto, TimeSeriesFinancialBatchDto,
+        TimeSeriesFinancialDto,
+    },
 };
 use tracing::info;
 
@@ -43,7 +46,7 @@ impl PersistenceClient {
         &self,
         symbol: &str,
     ) -> Result<Vec<TimeSeriesFinancialDto>> {
-        let url = format!("{}/market-data/financials/{}", self.base_url, symbol);
+        let url = format!("{}/market-data/financial-statements/{}", self.base_url, symbol);
         info!("Fetching financials for {} from {}", symbol, url);
         let dtos = self
             .client
@@ -72,20 +75,37 @@ impl PersistenceClient {
         Ok(config)
     }
 
-    pub async fn get_analysis_modules_config(&self) -> Result<AnalysisModulesConfig> {
-        let url = format!("{}/configs/analysis_modules", self.base_url);
-        info!("Fetching analysis modules config from {}", url);
+    pub async fn get_analysis_template_sets(&self) -> Result<AnalysisTemplateSets> {
+        let url = format!("{}/configs/analysis_template_sets", self.base_url);
+        info!("Fetching analysis template sets from {}", url);
         let config = self
             .client
             .get(&url)
             .send()
             .await?
             .error_for_status()?
-            .json::<AnalysisModulesConfig>()
+            .json::<AnalysisTemplateSets>()
             .await?;
         Ok(config)
     }
 
     // --- Data Writing Methods ---
 
+    pub async fn create_analysis_result(&self, result: NewAnalysisResult) -> Result<()> {
+        let url = format!("{}/analysis-results", self.base_url);
+        info!(
+            "Persisting analysis result for symbol '{}', module '{}' to {}",
+            result.symbol, result.module_id, url
+        );
+        self.client
+            .post(&url)
+            .json(&result)
+            .send()
+            .await?
+            .error_for_status()?;
+        Ok(())
+    }
+
     pub async fn upsert_company_profile(&self, profile: CompanyProfileDto) -> Result<()> {
         let url = format!("{}/companies", self.base_url);
         info!("Upserting company profile for {} to {}", profile.symbol, url);
@@ -116,7 +136,12 @@ impl PersistenceClient {
         }
         let url = format!("{}/market-data/financials/batch", self.base_url);
         let symbol = dtos[0].symbol.clone();
-        info!("Batch inserting {} financial statements for {} to {}", dtos.len(), symbol, url);
+        info!(
+            "Batch inserting {} financial statements for {} to {}",
+            dtos.len(),
+            symbol,
+            url
+        );
 
         let batch = TimeSeriesFinancialBatchDto { records: dtos };
@@ -1,21 +1,26 @@
 use std::collections::HashMap;
 use std::sync::Arc;
-use common_contracts::config_models::{AnalysisModuleConfig, AnalysisModulesConfig, LlmProvider, LlmProvidersConfig};
-use common_contracts::dtos::{CompanyProfileDto, TimeSeriesFinancialDto};
-use common_contracts::messages::FinancialsPersistedEvent;

+use common_contracts::config_models::{
+    AnalysisModuleConfig, AnalysisTemplateSets, LlmProvidersConfig,
+};
+use common_contracts::dtos::{CompanyProfileDto, NewAnalysisResult, TimeSeriesFinancialDto};
+use common_contracts::messages::GenerateReportCommand; // Assuming this command is defined
+use petgraph::algo::toposort;
+use petgraph::graph::DiGraph;
 use tera::{Context, Tera};
-use tracing::{info, warn, instrument};
+use tracing::{info, warn, instrument, error};
 use uuid::Uuid;

 use crate::error::{ProviderError, Result};
 use crate::llm_client::LlmClient;
 use crate::persistence::PersistenceClient;
 use crate::state::AppState;
 use crate::templates::render_prompt;

-#[instrument(skip_all, fields(symbol = %event.symbol))]
+#[instrument(skip_all, fields(request_id = %command.request_id, symbol = %command.symbol, template_id = %command.template_id))]
 pub async fn run_report_generation_workflow(
     state: Arc<AppState>,
-    event: FinancialsPersistedEvent,
+    command: GenerateReportCommand,
 ) -> Result<()> {
     info!("Starting report generation workflow.");
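The imports above pull `AnalysisTemplateSets` and `AnalysisModuleConfig` from `common_contracts`. As a minimal sketch of what those shapes might look like, with field names inferred from how the workflow uses them (`template_set.modules`, `module_config.dependencies`, and so on) — the real definitions live in `common_contracts::config_models` and may differ in detail, and the provider/model values here are hypothetical:

```rust
use std::collections::HashMap;

// Sketch only: field names are inferred from the workflow's usage,
// not copied from the real `common_contracts` definitions.
#[derive(Debug, Clone)]
struct AnalysisModuleConfig {
    name: String,
    provider_id: String,        // which LLM provider config to use
    model_id: String,           // which model at that provider
    prompt_template: String,    // Tera template for the module's prompt
    dependencies: Vec<String>,  // IDs of modules whose output this one consumes
}

#[derive(Debug, Clone)]
struct AnalysisTemplateSet {
    // Keyed by module ID; the orchestrator topologically sorts these.
    modules: HashMap<String, AnalysisModuleConfig>,
}

// One named, independently configurable workflow per template ID.
type AnalysisTemplateSets = HashMap<String, AnalysisTemplateSet>;

fn main() {
    let mut modules = HashMap::new();
    modules.insert(
        "swot".to_string(),
        AnalysisModuleConfig {
            name: "SWOT Analysis".to_string(),
            provider_id: "openai".to_string(), // hypothetical provider ID
            model_id: "gpt-4o".to_string(),    // hypothetical model ID
            prompt_template: "Analyze {{ company_name }}".to_string(),
            dependencies: vec![],
        },
    );
    let mut sets: AnalysisTemplateSets = HashMap::new();
    sets.insert("default".to_string(), AnalysisTemplateSet { modules });
    assert_eq!(sets["default"].modules["swot"].model_id, "gpt-4o");
}
```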
@@ -23,88 +28,167 @@ pub async fn run_report_generation_workflow(
         PersistenceClient::new(state.config.data_persistence_service_url.clone());

     // 1. Fetch all necessary data AND configurations in parallel
-    let (profile, financials, llm_providers, analysis_modules) =
-        fetch_data_and_configs(&persistence_client, &event.symbol).await?;
-
-    if financials.is_empty() {
-        warn!("No financial data found. Aborting report generation.");
-        return Ok(());
-    }
+    let (profile, financials, llm_providers, template_sets) =
+        match fetch_data_and_configs(&persistence_client, &command.symbol).await {
+            Ok(data) => data,
+            Err(e) => {
+                error!("Failed to fetch initial data and configs: {}", e);
+                return Err(e);
+            }
+        };
+
+    // --- New: Dynamic, Multi-Module Workflow ---
+    // 2. Select the correct template set
+    let template_set = match template_sets.get(&command.template_id) {
+        Some(ts) => ts,
+        None => {
+            let err_msg = format!("Analysis template set with ID '{}' not found.", command.template_id);
+            error!("{}", err_msg);
+            return Err(ProviderError::Configuration(err_msg));
+        }
+    };
+
+    // 3. Topologically sort modules to get execution order
+    let sorted_modules = match sort_modules_by_dependency(&template_set.modules) {
+        Ok(order) => order,
+        Err(e) => {
+            error!("Failed to sort analysis modules: {}", e);
+            return Err(e);
+        }
+    };
+
+    info!(execution_order = ?sorted_modules, "Successfully determined module execution order.");
+
+    // 4. Execute modules in order
     let mut generated_results: HashMap<String, String> = HashMap::new();

-    // Naive sequential execution based on dependencies. A proper topological sort would be better.
-    // For now, we just iterate multiple times to resolve dependencies.
-    for _ in 0..analysis_modules.len() {
-        for (module_id, module_config) in &analysis_modules {
-            if generated_results.contains_key(module_id.as_str()) {
-                continue; // Already generated
-            }
-
-            // Check if all dependencies are met
-            let deps_met = module_config.dependencies.iter().all(|dep| generated_results.contains_key(dep));
-            if !deps_met {
-                continue; // Will try again in the next iteration
-            }
-
-            info!(module_id = %module_id, "All dependencies met. Generating report for module.");
-
-            // 2. Dynamically create LLM client for this module
-            let llm_client = create_llm_client_for_module(&state, &llm_providers, module_config)?;
+    for module_id in sorted_modules {
+        let module_config = template_set.modules.get(&module_id).unwrap(); // Should not fail due to sorting logic
+
+        let llm_client = match create_llm_client_for_module(&llm_providers, module_config) {
+            Ok(client) => client,
+            Err(e) => {
+                error!(module_id = %module_id, "Failed to create LLM client: {}. Skipping module.", e);
+                generated_results.insert(module_id.clone(), format!("Error: Failed to create LLM client: {}", e));
+                continue;
+            }
+        };

         // 3. Create context and render the prompt
         let mut context = Context::new();
         context.insert("company_name", &profile.name);
-        context.insert("ts_code", &event.symbol);
-        // Inject dependencies into context
+        context.insert("ts_code", &command.symbol);

         for dep in &module_config.dependencies {
             if let Some(content) = generated_results.get(dep) {
                 context.insert(dep, content);
             }
         }
-        // A placeholder for financial data, can be expanded
+
+        // TODO: This is a placeholder. Implement proper financial data formatting and injection.
         context.insert("financial_data", "...");

-        let prompt = Tera::one_off(&module_config.prompt_template, &context, true)
-            .map_err(|e| ProviderError::Internal(anyhow::anyhow!("Prompt rendering failed for module '{}': {}", module_id, e)))?;
+        let prompt = match Tera::one_off(&module_config.prompt_template, &context, true) {
+            Ok(p) => p,
+            Err(e) => {
+                let err_msg = format!("Prompt rendering failed: {}", e);
+                error!(module_id = %module_id, "{}", err_msg);
+                generated_results.insert(module_id.clone(), format!("Error: {}", err_msg));
+                continue;
+            }
+        };

-        // 4. Call the LLM to generate the content for this module
-        let content = llm_client.generate_text(prompt).await?;
+        let content = match llm_client.generate_text(prompt).await {
+            Ok(c) => c,
+            Err(e) => {
+                let err_msg = format!("LLM generation failed: {}", e);
+                error!(module_id = %module_id, "{}", err_msg);
+                generated_results.insert(module_id.clone(), format!("Error: {}", err_msg));
+                continue;
+            }
+        };
+
         info!(module_id = %module_id, "Successfully generated content.");

-        // TODO: Persist the generated result via persistence_client
+        let result_to_persist = NewAnalysisResult {
+            request_id: command.request_id,
+            symbol: command.symbol.clone(),
+            template_id: command.template_id.clone(),
+            module_id: module_id.clone(),
+            content: content.clone(),
+            meta_data: serde_json::json!({ "model_id": module_config.model_id }),
+        };
+
+        if let Err(e) = persistence_client.create_analysis_result(result_to_persist).await {
+            error!(module_id = %module_id, "Failed to persist analysis result: {}", e);
+            // Decide if we should continue or fail the whole workflow
+        }
+
         generated_results.insert(module_id.clone(), content);
     }
-    }
-
-    if generated_results.len() != analysis_modules.len() {
-        warn!("Could not generate all modules due to missing dependencies or circular dependency.");
-    }

     info!("Report generation workflow finished.");
     Ok(())
 }
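The context-injection step in the loop above is what lets a downstream module's prompt reference the output of its dependencies by module ID (for example `{{ swot }}`). A toy, std-only stand-in for what `Tera::one_off` does with that context — illustrative only; the real Tera engine supports conditionals, loops, filters, and escaping, not just plain variable substitution:

```rust
use std::collections::HashMap;

// Toy substitute for `Tera::one_off`: replaces each `{{ key }}` occurrence
// with the corresponding value from the context map. Illustration only.
fn render_one_off(template: &str, context: &HashMap<&str, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in context {
        // `{{{{` / `}}}}` escape to literal `{{` / `}}` in format strings.
        out = out.replace(&format!("{{{{ {} }}}}", key), value);
    }
    out
}

fn main() {
    let mut context = HashMap::new();
    context.insert("company_name", "Acme Corp".to_string()); // hypothetical company
    context.insert("swot", "Strengths: ...".to_string());    // upstream module output

    let prompt = render_one_off(
        "Analyze {{ company_name }} using this SWOT:\n{{ swot }}",
        &context,
    );
    assert!(prompt.contains("Acme Corp"));
    assert!(prompt.contains("Strengths"));
}
```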
+fn sort_modules_by_dependency(
+    modules: &HashMap<String, AnalysisModuleConfig>,
+) -> Result<Vec<String>> {
+    let mut graph = DiGraph::<String, ()>::new();
+    let mut node_map = HashMap::new();
+
+    for module_id in modules.keys() {
+        let index = graph.add_node(module_id.clone());
+        node_map.insert(module_id.clone(), index);
+    }
+
+    for (module_id, module_config) in modules {
+        if let Some(&module_index) = node_map.get(module_id) {
+            for dep in &module_config.dependencies {
+                if let Some(&dep_index) = node_map.get(dep) {
+                    graph.add_edge(dep_index, module_index, ());
+                } else {
+                    return Err(ProviderError::Configuration(format!(
+                        "Module '{}' has a missing dependency: '{}'",
+                        module_id, dep
+                    )));
+                }
+            }
+        }
+    }
+
+    match toposort(&graph, None) {
+        Ok(sorted_nodes) => {
+            let sorted_ids = sorted_nodes
+                .into_iter()
+                .map(|node_index| graph[node_index].clone())
+                .collect();
+            Ok(sorted_ids)
+        }
+        Err(cycle) => {
+            let cycle_id = graph[cycle.node_id()].clone();
+            Err(ProviderError::Configuration(format!(
+                "Circular dependency detected in analysis modules. Cycle involves: '{}'",
+                cycle_id
+            )))
+        }
+    }
+}
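`sort_modules_by_dependency` delegates the ordering to `petgraph::algo::toposort`. As a dependency-free illustration of what that call computes, here is the same idea via Kahn's algorithm, including the missing-dependency and cycle errors — a sketch, not the production code; the map keys and error strings are simplified:

```rust
use std::collections::{BTreeMap, HashMap, VecDeque};

// Std-only sketch of a topological sort over module IDs.
// Keys are module IDs; values are the IDs each module depends on.
// (BTreeMap keeps iteration, and hence the output, deterministic.)
fn kahn_sort(modules: &BTreeMap<&str, Vec<&str>>) -> Result<Vec<String>, String> {
    let mut in_degree: HashMap<&str, usize> = modules.keys().map(|&m| (m, 0)).collect();
    let mut dependents: HashMap<&str, Vec<&str>> = HashMap::new();

    for (&module, deps) in modules {
        for &dep in deps {
            if !modules.contains_key(dep) {
                return Err(format!("Module '{}' has a missing dependency: '{}'", module, dep));
            }
            *in_degree.get_mut(module).unwrap() += 1;
            dependents.entry(dep).or_default().push(module);
        }
    }

    // Seed the queue with modules that have no unmet dependencies.
    let mut queue: VecDeque<&str> = modules
        .keys()
        .filter(|&&m| in_degree[m] == 0)
        .copied()
        .collect();

    let mut order = Vec::new();
    while let Some(m) = queue.pop_front() {
        order.push(m.to_string());
        for &next in dependents.get(m).map(|v| v.as_slice()).unwrap_or(&[]) {
            let d = in_degree.get_mut(next).unwrap();
            *d -= 1;
            if *d == 0 {
                queue.push_back(next);
            }
        }
    }

    // Any module never reaching in-degree zero is part of a cycle.
    if order.len() != modules.len() {
        return Err("Circular dependency detected in analysis modules".to_string());
    }
    Ok(order)
}

fn main() {
    let mut modules = BTreeMap::new();
    modules.insert("final_report", vec!["swot", "valuation"]);
    modules.insert("swot", vec![]);
    modules.insert("valuation", vec!["swot"]);

    let order = kahn_sort(&modules).unwrap();
    // Every module appears after all of its dependencies.
    assert_eq!(order.last().unwrap(), "final_report");
    assert!(order.iter().position(|m| m == "swot") < order.iter().position(|m| m == "valuation"));
}
```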
 fn create_llm_client_for_module(
-    state: &Arc<AppState>,
     llm_providers: &LlmProvidersConfig,
     module_config: &AnalysisModuleConfig,
 ) -> Result<LlmClient> {
     let provider = llm_providers.get(&module_config.provider_id).ok_or_else(|| {
         ProviderError::Configuration(format!(
-            "Provider '{}' not found in llm_providers config",
-            module_config.provider_id
+            "Provider '{}' not found for module '{}'",
+            module_config.provider_id, module_config.name
         ))
     })?;

-    // In the old design, the api key name was stored. In the new design, it's stored directly.
-    let api_key = provider.api_key.clone();
-
     Ok(LlmClient::new(
         provider.api_base_url.clone(),
-        api_key.into(), // Convert String to SecretString
+        provider.api_key.clone().into(),
         module_config.model_id.clone(),
     ))
 }
@@ -116,13 +200,13 @@ async fn fetch_data_and_configs(
     CompanyProfileDto,
     Vec<TimeSeriesFinancialDto>,
     LlmProvidersConfig,
-    AnalysisModulesConfig,
+    AnalysisTemplateSets,
 )> {
-    let (profile, financials, llm_providers, analysis_modules) = tokio::try_join!(
+    let (profile, financials, llm_providers, template_sets) = tokio::try_join!(
         client.get_company_profile(symbol),
         client.get_financial_statements(symbol),
         client.get_llm_providers_config(),
-        client.get_analysis_modules_config(),
+        client.get_analysis_template_sets(), // Changed from get_analysis_modules_config
     )?;
-    Ok((profile, financials, llm_providers, analysis_modules))
+    Ok((profile, financials, llm_providers, template_sets))
 }
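`tokio::try_join!` above awaits all four fetches concurrently and short-circuits on the first error. A synchronous stand-in showing the same all-or-nothing contract on plain `Result`s — illustrative only; the real macro polls the futures concurrently rather than evaluating arguments in sequence:

```rust
// Sketch of `try_join!`'s contract: all four sources must succeed,
// and any single failure aborts the combined fetch with that error.
fn try_join4<A, B, C, D, E>(
    a: Result<A, E>,
    b: Result<B, E>,
    c: Result<C, E>,
    d: Result<D, E>,
) -> Result<(A, B, C, D), E> {
    Ok((a?, b?, c?, d?))
}

fn main() {
    // All branches succeed: the tuple of results is returned.
    let ok = try_join4::<_, _, _, _, String>(Ok(1), Ok("profile"), Ok(3.0), Ok(true));
    assert!(ok.is_ok());

    // One branch fails: the whole join fails with that error.
    let err = try_join4::<i32, i32, f64, bool, &str>(
        Ok(1),
        Err("persistence unavailable"), // hypothetical error message
        Ok(3.0),
        Ok(true),
    );
    assert_eq!(err.unwrap_err(), "persistence unavailable");
}
```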