Compare commits
18 Commits
009-backup...master

Commits:
- 07ec2d9797
- e9d3f3c827
- 26ba015b75
- 49129d3e86
- d99a13d91f
- 203ce446f4
- c96d50a3f4
- 3bbe320949
- 2d2435642d
- ec8d67c956
- 76baeb1038
- 11c59fb420
- b2529973eb
- ae1d630ad6
- 9a9c5879e6
- 696aac32e7
- 7a9b1a190a
- a3dc1fb2b9
.gitignore (vendored), 11 lines changed
@@ -29,7 +29,7 @@ env/
backend/backups/*

# Node.js
node_modules/
frontend/node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
@@ -39,6 +39,7 @@ build/
dist/
.env*
config.json
package-lock.json

# Logs
*.log
@@ -58,7 +59,13 @@ Thumbs.db
*.ps1
keyring passwords.py
*github*
*git*

*tech_spec*
dashboards
backend/mappings.db


backend/tasks.db

# Git Integration repositories
backend/git_repos/
.kilocode/mcp.json (Executable file → Normal file), 15 lines changed
@@ -1,14 +1 @@
{
  "mcpServers": {
    "tavily": {
      "command": "npx",
      "args": [
        "-y",
        "tavily-mcp@0.2.3"
      ],
      "env": {
        "TAVILY_API_KEY": "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"
      }
    }
  }
}
{"mcpServers":{}}
@@ -18,6 +18,13 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
- Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS (009-backup-scheduler)
- SQLite (`tasks.db`), JSON (`config.json`) (009-backup-scheduler)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, `superset_tool` (internal lib) (010-refactor-cli-to-web)
- SQLite (for job history/results, connection configs), Filesystem (for temporary file uploads) (010-refactor-cli-to-web)
- Python 3.9+ + FastAPI, Pydantic, requests, pyyaml (migrated from superset_tool) (012-remove-superset-tool)
- SQLite (tasks.db, migrations.db), Filesystem (012-remove-superset-tool)
- Filesystem (local git repo), SQLite (for GitServerConfig, Environment) (011-git-integration-dashboard)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API (011-git-integration-dashboard)
- SQLite (for config/history), Filesystem (local Git repositories) (011-git-integration-dashboard)

- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)

@@ -38,9 +45,9 @@ cd src; pytest; ruff check .
Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions

## Recent Changes
- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
- 009-backup-scheduler: Added [if applicable, e.g., PostgreSQL, CoreData, files or N/A]
- 011-git-integration-dashboard: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API
- 011-git-integration-dashboard: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API
- 011-git-integration-dashboard: Added Python 3.9+, Node.js 18+

<!-- MANUAL ADDITIONS START -->
@@ -2,74 +2,44 @@ customModes:
  - slug: tester
    name: Tester
    description: QA and Plan Verification Specialist
    roleDefinition: >-
    roleDefinition: |-
      You are Kilo Code, acting as a QA and Verification Specialist. Your primary goal is to validate that the project implementation aligns strictly with the defined specifications and task plans.

      Your responsibilities include:
      - Reading and analyzing task plans and specifications (typically in the `specs/` directory).
      - Verifying that implemented code matches the requirements.
      - Executing tests and validating system behavior via CLI or Browser.
      - Updating the status of tasks in the plan files (e.g., marking checkboxes [x]) as they are verified.
      - Identifying and reporting missing features or bugs.
    whenToUse: >-
      Use this mode when you need to audit the progress of a project, verify completed tasks against the plan, run quality assurance checks, or update the status of task lists in specification documents.
      Your responsibilities include: - Reading and analyzing task plans and specifications (typically in the `specs/` directory). - Verifying that implemented code matches the requirements. - Executing tests and validating system behavior via CLI or Browser. - Updating the status of tasks in the plan files (e.g., marking checkboxes [x]) as they are verified. - Identifying and reporting missing features or bugs.
    whenToUse: Use this mode when you need to audit the progress of a project, verify completed tasks against the plan, run quality assurance checks, or update the status of task lists in specification documents.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    customInstructions: >-
      1. Always begin by loading the relevant plan or task list from the `specs/` directory.
      2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit.
      3. When updating task lists, ensure you only mark items as complete if you have verified them.
  - slug: semantic
    name: Semantic Agent
    description: Codebase semantic mapping and compliance expert
    roleDefinition: >-
      You are Kilo Code, a Semantic Agent responsible for maintaining the semantic integrity of the codebase. Your primary goal is to ensure that all code entities (Modules, Classes, Functions, Components) are properly annotated with semantic anchors and tags as defined in `semantic_protocol.md`.

      Your core responsibilities are:
      1. **Semantic Mapping**: You run and maintain the `generate_semantic_map.py` script to generate up-to-date semantic maps (`semantics/semantic_map.json`, `specs/project_map.md`) and compliance reports (`semantics/reports/*.md`).
      2. **Compliance Auditing**: You analyze the generated compliance reports to identify files with low semantic coverage or parsing errors.
      3. **Semantic Enrichment**: You actively edit code files to add missing semantic anchors (`[DEF:...]`, `[/DEF:...]`) and mandatory tags (`@PURPOSE`, `@LAYER`, etc.) to improve the global compliance score.
      4. **Protocol Enforcement**: You strictly adhere to the syntax and rules defined in `semantic_protocol.md` when modifying code.

      You have access to the full codebase and tools to read, write, and execute scripts. You should prioritize fixing "Critical Parsing Errors" (unclosed anchors) before addressing missing metadata.
    whenToUse: >-
      Use this mode when you need to update the project's semantic map, fix semantic compliance issues (missing anchors/tags), or analyze the codebase structure. This mode is specialized for maintaining the `semantic_protocol.md` standards.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    customInstructions: >-
      Always check `semantics/reports/` for the latest compliance status before starting work.
      When fixing a file, try to fix all semantic issues in that file at once.
      After making a batch of fixes, run `python3 generate_semantic_map.py` to verify improvements.
    customInstructions: 1. Always begin by loading the relevant plan or task list from the `specs/` directory. 2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit. 3. When updating task lists, ensure you only mark items as complete if you have verified them.
  - slug: product-manager
    name: Product Manager
    description: Executes SpecKit workflows for feature management
    roleDefinition: >-
    roleDefinition: |-
      You are Kilo Code, acting as a Product Manager. Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.

      You act as the orchestrator for:
      - Specification (`speckit.specify`, `speckit.clarify`)
      - Planning (`speckit.plan`)
      - Task Management (`speckit.tasks`, `speckit.taskstoissues`)
      - Quality Assurance (`speckit.analyze`, `speckit.checklist`)
      - Governance (`speckit.constitution`)
      - Implementation Oversight (`speckit.implement`)

      You act as the orchestrator for: - Specification (`speckit.specify`, `speckit.clarify`) - Planning (`speckit.plan`) - Task Management (`speckit.tasks`, `speckit.taskstoissues`) - Quality Assurance (`speckit.analyze`, `speckit.checklist`) - Governance (`speckit.constitution`) - Implementation Oversight (`speckit.implement`)
      For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
    whenToUse: >-
      Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
    whenToUse: Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
    groups:
      - read
      - edit
      - command
      - mcp
    customInstructions: >-
      1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command.
      2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
    customInstructions: 1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command. 2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
  - slug: semantic
    name: Semantic Agent
    roleDefinition: |-
      You are Kilo Code, a Semantic Agent responsible for maintaining the semantic integrity of the codebase. Your primary goal is to ensure that all code entities (Modules, Classes, Functions, Components) are properly annotated with semantic anchors and tags as defined in `semantic_protocol.md`.
      Your core responsibilities are: 1. **Semantic Mapping**: You run and maintain the `generate_semantic_map.py` script to generate up-to-date semantic maps (`semantics/semantic_map.json`, `specs/project_map.md`) and compliance reports (`semantics/reports/*.md`). 2. **Compliance Auditing**: You analyze the generated compliance reports to identify files with low semantic coverage or parsing errors. 3. **Semantic Enrichment**: You actively edit code files to add missing semantic anchors (`[DEF:...]`, `[/DEF:...]`) and mandatory tags (`@PURPOSE`, `@LAYER`, etc.) to improve the global compliance score. 4. **Protocol Enforcement**: You strictly adhere to the syntax and rules defined in `semantic_protocol.md` when modifying code.
      You have access to the full codebase and tools to read, write, and execute scripts. You should prioritize fixing "Critical Parsing Errors" (unclosed anchors) before addressing missing metadata.
    whenToUse: Use this mode when you need to update the project's semantic map, fix semantic compliance issues (missing anchors/tags/DbC), or analyze the codebase structure. This mode is specialized for maintaining the `semantic_protocol.md` standards.
    description: Codebase semantic mapping and compliance expert
    customInstructions: Always check `semantics/reports/` for the latest compliance status before starting work. When fixing a file, try to fix all semantic issues in that file at once. After making a batch of fixes, run `python3 generate_semantic_map.py` to verify improvements.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    source: project
@@ -1,99 +1,67 @@
<!--
SYNC IMPACT REPORT
Version: 1.5.0 (Fractal Complexity Limit)
Version: 1.7.1 (Simplified Workflow)
Changes:
- Added Section VI (Fractal Complexity Limit) to enforce strict module (~300 lines) and function (~30-50 lines) size limits.
- Aims to maintain semantic coherence and avoid "Attention Sink".
- Simplified Generation Workflow to a single phase: Code Generation from `tasks.md`.
- Removed multi-phase Architecture/Implementation split to streamline development.
Templates Status:
- .specify/templates/plan-template.md: ✅ Aligned.
- .specify/templates/plan-template.md: ✅ Aligned (Dynamic check).
- .specify/templates/spec-template.md: ✅ Aligned.
- .specify/templates/tasks-arch-template.md: ✅ Aligned (New role-based split).
- .specify/templates/tasks-dev-template.md: ✅ Aligned (New role-based split).
- .specify/templates/tasks-template.md: ✅ Aligned.
-->
# Semantic Code Generation Constitution

## Core Principles

### I. Causal Validity (Contracts First)
Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic. This ensures that the "what" and "why" govern the "how".
### I. Semantic Protocol Compliance
The file `semantic_protocol.md` is the **authoritative technical standard** for this project. All code generation, refactoring, and architecture must strictly adhere to the standards, syntax, and workflows defined therein.
- **Syntax**: `[DEF]` anchors, `@RELATION` tags, and metadata must match the Protocol specification.
- **Structure**: File layouts and headers must follow the "File Structure Standard".
- **Workflow**: The technical steps for generating code must align with the Protocol.

### II. Immutability of Architecture
Once defined, architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints for that module. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.
### II. Causal Validity (Contracts First)
As defined in the Protocol, Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic.

### III. Semantic Format Compliance
All output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax with specific Metadata Tags (`@KEY`) and Graph Relations (`@RELATION`). **Crucially, the closing anchor must strictly match the full content of the opening anchor (e.g., `[DEF:identifier:Type]` must close with `[/DEF:identifier:Type]`).**

**Standardized Graph Relations**
To ensure the integrity of the Semantic Graph, `@RELATION` must use a strict taxonomy:
- `DEPENDS_ON` (Structural dependency)
- `CALLS` (Flow control)
- `CREATES` (Instantiation)
- `INHERITS_FROM` / `IMPLEMENTS` (OOP hierarchy)
- `READS_STATE` / `WRITES_STATE` (Data flow)
- `DISPATCHES` / `HANDLES` (Event flow)

Ad-hoc relationships are forbidden. This structure is non-negotiable as it ensures the codebase remains machine-readable, fractal-structured, and optimized for Sparse Attention navigation by AI agents.
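For reference, a minimal illustration of the anchor and relation syntax as it is used elsewhere in this changeset; the identifier, tags, and relations below are invented for the example and are not taken from `semantic_protocol.md`:

```python
# [DEF:example_loader.load_config:Function]
# @PURPOSE: Illustrative only; shows an anchor, metadata tags, and typed relations.
# @RELATION: DEPENDS_ON -> json (stdlib)
# @RELATION: READS_STATE -> config.json
# @PRE: path points to an existing JSON file.
# @POST: Returns the parsed configuration as a dict.
import json

def load_config(path: str) -> dict:
    """Toy body; the contract above is what this principle governs."""
    with open(path, encoding="utf-8") as handle:
        return json.load(handle)
# [/DEF:example_loader.load_config:Function]
```

Note that the closing anchor repeats the full opening anchor, which is exactly the matching rule the principle requires.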
### III. Immutability of Architecture
Architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.

### IV. Design by Contract (DbC)
Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints (`@PRE`, `@POST`, `@THROW`) in the metadata block before implementation. Implementation must strictly satisfy these contracts.
Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints in the metadata block before implementation, strictly following the **Contracts (Section IV)** standard in `semantic_protocol.md`.

### V. Belief State Logging
Logs must define the agent's internal state for debugging and coherence checks. We use a strict format: `[{ANCHOR_ID}][{STATE}] {MESSAGE}`. For Python, a **Context Manager** pattern MUST be used to automatically handle `Entry`, `Exit`, and `Coherence` states, ensuring structural integrity and error capturing.
Agents must maintain belief state logs for debugging and coherence checks, strictly following the **Logging Standard (Section V)** defined in `semantic_protocol.md`.

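The `belief_scope` helper imported from `src.core.logger` throughout the routes in this changeset is not itself part of the diff. A minimal sketch of the context-manager pattern the older wording describes might look like the following; the state names and log wording are assumptions inferred from the log lines visible elsewhere in this comparison:

```python
import logging
from contextlib import contextmanager

logger = logging.getLogger("belief")

@contextmanager
def belief_scope(anchor_id: str, detail: str = ""):
    """Emit [ANCHOR_ID][STATE] MESSAGE logs for Entry, Coherence, and Exit."""
    logger.info(f"[{anchor_id}][Entry] {detail}")
    try:
        yield
        logger.info(f"[{anchor_id}][Coherence:OK] {detail}")
    except Exception as exc:
        logger.error(f"[{anchor_id}][Coherence:Failed] {exc}")
        raise
    finally:
        logger.info(f"[{anchor_id}][Exit] {detail}")
```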
### VI. Fractal Complexity Limit
To maintain semantic coherence and avoid "Attention Sink" issues:
- **Module Size**: If a Module body exceeds ~300 lines (or logical complexity), it MUST be refactored into sub-modules or a package structure.
- **Function Size**: Functions should fit within a standard attention "chunk" (approx. 30-50 lines). If larger, logic MUST be decomposed into helper functions with their own contracts.
To maintain semantic coherence, code must adhere to the complexity limits (Module/Function size) defined in the **Fractal Complexity Limit (Section VI)** of `semantic_protocol.md`.

This ensures every vector embedding remains sharp and focused.
### VII. Everything is a Plugin
All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`. Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`, consistent logging, and modularity.

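`PluginBase` and the `TaskManager` are internal and not shown in this diff. Purely as a sketch of the shape implied by the `GitPlugin().execute({...})` calls in `git.py` later in this comparison, a plugin might look like this; the base-class interface here is an assumption, not the project's actual definition:

```python
from typing import Any, Dict

class PluginBase:
    """Assumed interface: a named plugin with an async execute(payload) entry point."""
    name: str = "base"

    async def execute(self, payload: Dict[str, Any]) -> Dict[str, Any]:
        raise NotImplementedError

class EchoPlugin(PluginBase):
    """Trivial example plugin that simply echoes its payload back."""
    name = "echo"

    async def execute(self, payload: Dict[str, Any]) -> Dict[str, Any]:
        return {"status": "success", "echo": payload}
```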
## File Structure Standards

### Python Modules
Every `.py` file must start with a Module definition header (`[DEF:module_name:Module]`) containing:
- `@SEMANTICS`: Keywords for vector search.
- `@PURPOSE`: Primary responsibility of the module.
- `@LAYER`: Architecture layer (Domain/Infra/UI).
- `@RELATION`: Dependencies.
- `@INVARIANT` & `@CONSTRAINT`: Immutable rules.
- `@PUBLIC_API`: Exported symbols.

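Put together, a header that carries all of the mandatory tags looks roughly like the following; the module name and tag values are invented for illustration:

```python
# [DEF:backend.src.services.example_service:Module]
# @SEMANTICS: example, service, illustration
# @PURPOSE: Shows the mandatory module header tags side by side.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.core.database
# @INVARIANT: The module holds no mutable global state.
# @CONSTRAINT: Stays under the ~300 line limit from Section VI.
# @PUBLIC_API: example_function

def example_function() -> str:
    return "ok"

# [/DEF:backend.src.services.example_service:Module]
```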
### Svelte Components
Every `.svelte` file must start with a Component definition header (`[DEF:ComponentName:Component]`) wrapped in an HTML comment `<!-- ... -->` containing:
- `@SEMANTICS`: Keywords for vector search.
- `@PURPOSE`: Primary responsibility of the component.
- `@LAYER`: Architecture layer (UI/State/Layout).
- `@RELATION`: Child components, Stores used, API calls.
- `@PROPS`: Input properties.
- `@EVENTS`: Emitted events.
- `@INVARIANT`: Immutable UI/State rules.
Refer to **Section III (File Structure Standard)** in `semantic_protocol.md` for the authoritative definitions of:
- Python Module Headers (`.py`)
- Svelte Component Headers (`.svelte`)

## Generation Workflow
The development process follows a strict sequence enforced by Agent Roles:
The development process follows a streamlined single-phase workflow:

### 1. Architecture Phase (Mode: `tech-lead`)
**Input**: `tasks-arch.md`
### 1. Code Generation Phase (Mode: `code`)
**Input**: `tasks.md`
**Responsibility**:
- Analyze request and graph position.
- Generate `[DEF]` anchors, Headers, and Contracts (`@PRE`, `@POST`).
- **Output**: Scaffolding files with no implementation logic.

### 2. Implementation Phase (Mode: `code`)
**Input**: `tasks-dev.md` + Scaffolding files
**Responsibility**:
- Read contracts defined by Architect.
- Write implementation code that strictly satisfies contracts.
- Select task from `tasks.md`.
- Generate Scaffolding (`[DEF]` anchors, Headers, Contracts) AND Implementation in one pass.
- Ensure strict adherence to Protocol Section IV (Contracts) and Section VII (Generation Workflow).
- **Output**: Working code with passing tests.

### 3. Validation
### 2. Validation
If logic conflicts with Contract -> Stop -> Report Error.

## Governance
This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.

- **Automated Enforcement**: All code generation tools and agents must parse and validate adherence to the `[DEF]` syntax and Contract requirements.
- **Amendments**: Changes to the syntax or core principles require a formal amendment to this Constitution and a corresponding update to the constitution.
- **Review**: Code reviews must verify that implementation matches the preceding contracts and that no "naked code" exists outside of semantic anchors.
- **Compliance**: Failure to adhere to the `[DEF]` / `[/DEF]` structure (including matching closing tags) constitutes a build failure.
- **Authoritative Source**: `semantic_protocol.md` defines the specific implementation rules for these Principles.
- **Automated Enforcement**: Tools must validate adherence to the `semantic_protocol.md` syntax.
- **Amendments**: Changes to core principles require a Constitution amendment. Changes to technical syntax require a Protocol update.
- **Compliance**: Failure to adhere to the Protocol constitutes a build failure.

**Version**: 1.5.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2025-12-27
**Version**: 1.7.1 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-13
backend/backend/git_repos/12 (Submodule), 1 line changed
Submodule backend/backend/git_repos/12 added at d592fa7ed5

backend/delete_running_tasks.py (Normal file), 35 lines changed
@@ -0,0 +1,35 @@
#!/usr/bin/env python3
"""Script to delete tasks with RUNNING status from the database."""

from sqlalchemy.orm import Session
from src.core.database import TasksSessionLocal
from src.models.task import TaskRecord

def delete_running_tasks():
    """Delete all tasks with RUNNING status from the database."""
    session: Session = TasksSessionLocal()
    try:
        # Find all task records with RUNNING status
        running_tasks = session.query(TaskRecord).filter(TaskRecord.status == "RUNNING").all()

        if not running_tasks:
            print("No RUNNING tasks found.")
            return

        print(f"Found {len(running_tasks)} RUNNING tasks:")
        for task in running_tasks:
            print(f"- Task ID: {task.id}, Type: {task.type}")

        # Delete the found tasks
        session.query(TaskRecord).filter(TaskRecord.status == "RUNNING").delete(synchronize_session=False)
        session.commit()

        print(f"Successfully deleted {len(running_tasks)} RUNNING tasks.")
    except Exception as e:
        session.rollback()
        print(f"Error deleting tasks: {e}")
    finally:
        session.close()

if __name__ == "__main__":
    delete_running_tasks()
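Presumably the script is run from the `backend/` directory so that the `src.*` imports resolve, e.g. `python delete_running_tasks.py`; the exact invocation depends on how `PYTHONPATH` is configured for this project, which the diff does not show.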
Binary file not shown.

@@ -41,3 +41,7 @@ tzlocal==5.3.1
urllib3==2.6.2
uvicorn==0.38.0
websockets==15.0.1
pandas
psycopg2-binary
openpyxl
GitPython==3.1.44
@@ -31,6 +31,12 @@ oauth2_scheme = OAuth2AuthorizationCodeBearer(
    tokenUrl="https://your-adfs-server/adfs/oauth2/token",
)

# [DEF:get_current_user:Function]
# @PURPOSE: Dependency to get the current user from the ADFS token.
# @PARAM: token (str) - The OAuth2 bearer token.
# @PRE: token should be provided via Authorization header.
# @POST: Returns user details if authenticated, else raises 401.
# @RETURN: Dict[str, str] - User information.
async def get_current_user(token: str = Depends(oauth2_scheme)):
    """
    Dependency to get the current user from the ADFS token.
@@ -49,4 +55,5 @@ async def get_current_user(token: str = Depends(oauth2_scheme)):
    )
    # A real implementation would return a user object.
    return {"placeholder_user": "user@example.com"}
# [/DEF:get_current_user:Function]
# [/DEF:AuthModule:Module]
@@ -1 +1 @@
from . import plugins, tasks, settings
from . import plugins, tasks, settings, connections, environments, mappings, migration, git
backend/src/api/routes/connections.py (Normal file), 100 lines changed
@@ -0,0 +1,100 @@
# [DEF:ConnectionsRouter:Module]
# @SEMANTICS: api, router, connections, database
# @PURPOSE: Defines the FastAPI router for managing external database connections.
# @LAYER: UI (API)
# @RELATION: Depends on SQLAlchemy session.
# @CONSTRAINT: Must use belief_scope for logging.

# [SECTION: IMPORTS]
from typing import List, Optional
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from ...core.database import get_db
from ...models.connection import ConnectionConfig
from pydantic import BaseModel, Field
from datetime import datetime
from ...core.logger import logger, belief_scope
# [/SECTION]

router = APIRouter()

# [DEF:ConnectionSchema:Class]
# @PURPOSE: Pydantic model for connection response.
class ConnectionSchema(BaseModel):
    id: str
    name: str
    type: str
    host: Optional[str] = None
    port: Optional[int] = None
    database: Optional[str] = None
    username: Optional[str] = None
    created_at: datetime

    class Config:
        orm_mode = True
# [/DEF:ConnectionSchema:Class]

# [DEF:ConnectionCreate:Class]
# @PURPOSE: Pydantic model for creating a connection.
class ConnectionCreate(BaseModel):
    name: str
    type: str
    host: Optional[str] = None
    port: Optional[int] = None
    database: Optional[str] = None
    username: Optional[str] = None
    password: Optional[str] = None
# [/DEF:ConnectionCreate:Class]

# [DEF:list_connections:Function]
# @PURPOSE: Lists all saved connections.
# @PRE: Database session is active.
# @POST: Returns list of connection configs.
# @PARAM: db (Session) - Database session.
# @RETURN: List[ConnectionSchema] - List of connections.
@router.get("", response_model=List[ConnectionSchema])
async def list_connections(db: Session = Depends(get_db)):
    with belief_scope("ConnectionsRouter.list_connections"):
        connections = db.query(ConnectionConfig).all()
        return connections
# [/DEF:list_connections:Function]

# [DEF:create_connection:Function]
# @PURPOSE: Creates a new connection configuration.
# @PRE: Connection name is unique.
# @POST: Connection is saved to DB.
# @PARAM: connection (ConnectionCreate) - Config data.
# @PARAM: db (Session) - Database session.
# @RETURN: ConnectionSchema - Created connection.
@router.post("", response_model=ConnectionSchema, status_code=status.HTTP_201_CREATED)
async def create_connection(connection: ConnectionCreate, db: Session = Depends(get_db)):
    with belief_scope("ConnectionsRouter.create_connection", f"name={connection.name}"):
        db_connection = ConnectionConfig(**connection.dict())
        db.add(db_connection)
        db.commit()
        db.refresh(db_connection)
        logger.info(f"[ConnectionsRouter.create_connection][Success] Created connection {db_connection.id}")
        return db_connection
# [/DEF:create_connection:Function]

# [DEF:delete_connection:Function]
# @PURPOSE: Deletes a connection configuration.
# @PRE: Connection ID exists.
# @POST: Connection is removed from DB.
# @PARAM: connection_id (str) - ID to delete.
# @PARAM: db (Session) - Database session.
# @RETURN: None.
@router.delete("/{connection_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_connection(connection_id: str, db: Session = Depends(get_db)):
    with belief_scope("ConnectionsRouter.delete_connection", f"id={connection_id}"):
        db_connection = db.query(ConnectionConfig).filter(ConnectionConfig.id == connection_id).first()
        if not db_connection:
            logger.error(f"[ConnectionsRouter.delete_connection][State] Connection {connection_id} not found")
            raise HTTPException(status_code=404, detail="Connection not found")
        db.delete(db_connection)
        db.commit()
        logger.info(f"[ConnectionsRouter.delete_connection][Success] Deleted connection {connection_id}")
        return
# [/DEF:delete_connection:Function]

# [/DEF:ConnectionsRouter:Module]
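As a quick way to exercise these endpoints once the router is mounted; the `/api/connections` prefix below is an assumption, since the mount point is set where the router is included, outside this diff:

```python
import requests

BASE = "http://localhost:8000/api/connections"  # assumed mount point

# Create a connection, list all connections, then delete the new one.
created = requests.post(BASE, json={
    "name": "analytics-pg",
    "type": "postgres",
    "host": "db.internal",
    "port": 5432,
    "database": "analytics",
    "username": "reader",
    "password": "secret",
}).json()

print(requests.get(BASE).json())
requests.delete(f"{BASE}/{created['id']}")
```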
@@ -11,11 +11,11 @@
# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict, Optional
from backend.src.dependencies import get_config_manager, get_scheduler_service
from backend.src.core.superset_client import SupersetClient
from superset_tool.models import SupersetConfig
from ...dependencies import get_config_manager, get_scheduler_service
from ...core.superset_client import SupersetClient
from pydantic import BaseModel, Field
from backend.src.core.config_models import Environment as EnvModel
from ...core.config_models import Environment as EnvModel
from ...core.logger import belief_scope
# [/SECTION]

router = APIRouter()
@@ -43,10 +43,13 @@ class DatabaseResponse(BaseModel):

# [DEF:get_environments:Function]
# @PURPOSE: List all configured environments.
# @PRE: config_manager is injected via Depends.
# @POST: Returns a list of EnvironmentResponse objects.
# @RETURN: List[EnvironmentResponse]
@router.get("", response_model=List[EnvironmentResponse])
async def get_environments(config_manager=Depends(get_config_manager)):
    envs = config_manager.get_environments()
    with belief_scope("get_environments"):
        envs = config_manager.get_environments()
        # Ensure envs is a list
        if not isinstance(envs, list):
            envs = []
@@ -58,13 +61,15 @@ async def get_environments(config_manager=Depends(get_config_manager)):
            backup_schedule=ScheduleSchema(
                enabled=e.backup_schedule.enabled,
                cron_expression=e.backup_schedule.cron_expression
            ) if e.backup_schedule else None
            ) if getattr(e, 'backup_schedule', None) else None
        ) for e in envs
    ]
# [/DEF:get_environments:Function]

# [DEF:update_environment_schedule:Function]
# @PURPOSE: Update backup schedule for an environment.
# @PRE: Environment id exists, schedule is valid ScheduleSchema.
# @POST: Backup schedule updated and scheduler reloaded.
# @PARAM: id (str) - The environment ID.
# @PARAM: schedule (ScheduleSchema) - The new schedule.
@router.put("/{id}/schedule")
@@ -74,7 +79,8 @@ async def update_environment_schedule(
    config_manager=Depends(get_config_manager),
    scheduler_service=Depends(get_scheduler_service)
):
    envs = config_manager.get_environments()
    with belief_scope("update_environment_schedule", f"id={id}"):
        envs = config_manager.get_environments()
        env = next((e for e in envs if e.id == id), None)
        if not env:
            raise HTTPException(status_code=404, detail="Environment not found")
@@ -93,29 +99,21 @@ async def update_environment_schedule(

# [DEF:get_environment_databases:Function]
# @PURPOSE: Fetch the list of databases from a specific environment.
# @PRE: Environment id exists.
# @POST: Returns a list of database summaries from the environment.
# @PARAM: id (str) - The environment ID.
# @RETURN: List[Dict] - List of databases.
@router.get("/{id}/databases")
async def get_environment_databases(id: str, config_manager=Depends(get_config_manager)):
    envs = config_manager.get_environments()
    with belief_scope("get_environment_databases", f"id={id}"):
        envs = config_manager.get_environments()
        env = next((e for e in envs if e.id == id), None)
        if not env:
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Initialize SupersetClient from environment config
            # Note: We need to map Environment model to SupersetConfig
            superset_config = SupersetConfig(
                env=env.name,
                base_url=env.url,
                auth={
                    "provider": "db",  # Defaulting to db provider
                    "username": env.username,
                    "password": env.password,
                    "refresh": "false"
                }
            )
            client = SupersetClient(superset_config)
            client = SupersetClient(env)
            return client.get_databases_summary()
        except Exception as e:
            raise HTTPException(status_code=500, detail=f"Failed to fetch databases: {str(e)}")
backend/src/api/routes/git.py (Normal file), 303 lines changed
@@ -0,0 +1,303 @@
|
||||
# [DEF:backend.src.api.routes.git:Module]
|
||||
#
|
||||
# @SEMANTICS: git, routes, api, fastapi, repository, deployment
|
||||
# @PURPOSE: Provides FastAPI endpoints for Git integration operations.
|
||||
# @LAYER: API
|
||||
# @RELATION: USES -> src.services.git_service.GitService
|
||||
# @RELATION: USES -> src.api.routes.git_schemas
|
||||
# @RELATION: USES -> src.models.git
|
||||
#
|
||||
# @INVARIANT: All Git operations must be routed through GitService.
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from sqlalchemy.orm import Session
|
||||
from typing import List, Optional
|
||||
import typing
|
||||
from src.dependencies import get_config_manager
|
||||
from src.core.database import get_db
|
||||
from src.models.git import GitServerConfig, GitStatus, DeploymentEnvironment, GitRepository
|
||||
from src.api.routes.git_schemas import (
|
||||
GitServerConfigSchema, GitServerConfigCreate,
|
||||
GitRepositorySchema, BranchSchema, BranchCreate,
|
||||
BranchCheckout, CommitSchema, CommitCreate,
|
||||
DeploymentEnvironmentSchema, DeployRequest, RepoInitRequest
|
||||
)
|
||||
from src.services.git_service import GitService
|
||||
from src.core.logger import logger, belief_scope
|
||||
|
||||
router = APIRouter(prefix="/api/git", tags=["git"])
|
||||
git_service = GitService()
|
||||
|
||||
# [DEF:get_git_configs:Function]
|
||||
# @PURPOSE: List all configured Git servers.
|
||||
# @RETURN: List[GitServerConfigSchema]
|
||||
@router.get("/config", response_model=List[GitServerConfigSchema])
|
||||
async def get_git_configs(db: Session = Depends(get_db)):
|
||||
with belief_scope("get_git_configs"):
|
||||
return db.query(GitServerConfig).all()
|
||||
# [/DEF:get_git_configs:Function]
|
||||
|
||||
# [DEF:create_git_config:Function]
|
||||
# @PURPOSE: Register a new Git server configuration.
|
||||
# @PARAM: config (GitServerConfigCreate)
|
||||
# @RETURN: GitServerConfigSchema
|
||||
@router.post("/config", response_model=GitServerConfigSchema)
|
||||
async def create_git_config(config: GitServerConfigCreate, db: Session = Depends(get_db)):
|
||||
with belief_scope("create_git_config"):
|
||||
db_config = GitServerConfig(**config.dict())
|
||||
db.add(db_config)
|
||||
db.commit()
|
||||
db.refresh(db_config)
|
||||
return db_config
|
||||
# [/DEF:create_git_config:Function]
|
||||
|
||||
# [DEF:delete_git_config:Function]
|
||||
# @PURPOSE: Remove a Git server configuration.
|
||||
# @PARAM: config_id (str)
|
||||
@router.delete("/config/{config_id}")
|
||||
async def delete_git_config(config_id: str, db: Session = Depends(get_db)):
|
||||
with belief_scope("delete_git_config"):
|
||||
db_config = db.query(GitServerConfig).filter(GitServerConfig.id == config_id).first()
|
||||
if not db_config:
|
||||
raise HTTPException(status_code=404, detail="Configuration not found")
|
||||
|
||||
db.delete(db_config)
|
||||
db.commit()
|
||||
return {"status": "success", "message": "Configuration deleted"}
|
||||
# [/DEF:delete_git_config:Function]
|
||||
|
||||
# [DEF:test_git_config:Function]
|
||||
# @PURPOSE: Validate connection to a Git server using provided credentials.
|
||||
# @PARAM: config (GitServerConfigCreate)
|
||||
@router.post("/config/test")
|
||||
async def test_git_config(config: GitServerConfigCreate):
|
||||
with belief_scope("test_git_config"):
|
||||
success = await git_service.test_connection(config.provider, config.url, config.pat)
|
||||
if success:
|
||||
return {"status": "success", "message": "Connection successful"}
|
||||
else:
|
||||
raise HTTPException(status_code=400, detail="Connection failed")
|
||||
# [/DEF:test_git_config:Function]
|
||||
|
||||
# [DEF:init_repository:Function]
|
||||
# @PURPOSE: Link a dashboard to a Git repository and perform initial clone/init.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: init_data (RepoInitRequest)
|
||||
@router.post("/repositories/{dashboard_id}/init")
|
||||
async def init_repository(dashboard_id: int, init_data: RepoInitRequest, db: Session = Depends(get_db)):
|
||||
with belief_scope("init_repository"):
|
||||
# 1. Get config
|
||||
config = db.query(GitServerConfig).filter(GitServerConfig.id == init_data.config_id).first()
|
||||
if not config:
|
||||
raise HTTPException(status_code=404, detail="Git configuration not found")
|
||||
|
||||
try:
|
||||
# 2. Perform Git clone/init
|
||||
logger.info(f"[init_repository][Action] Initializing repo for dashboard {dashboard_id}")
|
||||
git_service.init_repo(dashboard_id, init_data.remote_url, config.pat)
|
||||
|
||||
# 3. Save to DB
|
||||
repo_path = git_service._get_repo_path(dashboard_id)
|
||||
db_repo = db.query(GitRepository).filter(GitRepository.dashboard_id == dashboard_id).first()
|
||||
if not db_repo:
|
||||
db_repo = GitRepository(
|
||||
dashboard_id=dashboard_id,
|
||||
config_id=config.id,
|
||||
remote_url=init_data.remote_url,
|
||||
local_path=repo_path
|
||||
)
|
||||
db.add(db_repo)
|
||||
else:
|
||||
db_repo.config_id = config.id
|
||||
db_repo.remote_url = init_data.remote_url
|
||||
db_repo.local_path = repo_path
|
||||
|
||||
db.commit()
|
||||
logger.info(f"[init_repository][Coherence:OK] Repository initialized for dashboard {dashboard_id}")
|
||||
return {"status": "success", "message": "Repository initialized"}
|
||||
except Exception as e:
|
||||
db.rollback()
|
||||
logger.error(f"[init_repository][Coherence:Failed] Failed to init repository: {e}")
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:init_repository:Function]
|
||||
|
||||
# [DEF:get_branches:Function]
|
||||
# @PURPOSE: List all branches for a dashboard's repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @RETURN: List[BranchSchema]
|
||||
@router.get("/repositories/{dashboard_id}/branches", response_model=List[BranchSchema])
|
||||
async def get_branches(dashboard_id: int):
|
||||
with belief_scope("get_branches"):
|
||||
try:
|
||||
return git_service.list_branches(dashboard_id)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=404, detail=str(e))
|
||||
# [/DEF:get_branches:Function]
|
||||
|
||||
# [DEF:create_branch:Function]
|
||||
# @PURPOSE: Create a new branch in the dashboard's repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: branch_data (BranchCreate)
|
||||
@router.post("/repositories/{dashboard_id}/branches")
|
||||
async def create_branch(dashboard_id: int, branch_data: BranchCreate):
|
||||
with belief_scope("create_branch"):
|
||||
try:
|
||||
git_service.create_branch(dashboard_id, branch_data.name, branch_data.from_branch)
|
||||
return {"status": "success"}
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:create_branch:Function]
|
||||
|
||||
# [DEF:checkout_branch:Function]
|
||||
# @PURPOSE: Switch the dashboard's repository to a specific branch.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: checkout_data (BranchCheckout)
|
||||
@router.post("/repositories/{dashboard_id}/checkout")
|
||||
async def checkout_branch(dashboard_id: int, checkout_data: BranchCheckout):
|
||||
with belief_scope("checkout_branch"):
|
||||
try:
|
||||
git_service.checkout_branch(dashboard_id, checkout_data.name)
|
||||
return {"status": "success"}
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:checkout_branch:Function]
|
||||
|
||||
# [DEF:commit_changes:Function]
|
||||
# @PURPOSE: Stage and commit changes in the dashboard's repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: commit_data (CommitCreate)
|
||||
@router.post("/repositories/{dashboard_id}/commit")
|
||||
async def commit_changes(dashboard_id: int, commit_data: CommitCreate):
|
||||
with belief_scope("commit_changes"):
|
||||
try:
|
||||
git_service.commit_changes(dashboard_id, commit_data.message, commit_data.files)
|
||||
return {"status": "success"}
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:commit_changes:Function]
|
||||
|
||||
# [DEF:push_changes:Function]
|
||||
# @PURPOSE: Push local commits to the remote repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
@router.post("/repositories/{dashboard_id}/push")
|
||||
async def push_changes(dashboard_id: int):
|
||||
with belief_scope("push_changes"):
|
||||
try:
|
||||
git_service.push_changes(dashboard_id)
|
||||
return {"status": "success"}
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:push_changes:Function]
|
||||
|
||||
# [DEF:pull_changes:Function]
|
||||
# @PURPOSE: Pull changes from the remote repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
@router.post("/repositories/{dashboard_id}/pull")
|
||||
async def pull_changes(dashboard_id: int):
|
||||
with belief_scope("pull_changes"):
|
||||
try:
|
||||
git_service.pull_changes(dashboard_id)
|
||||
return {"status": "success"}
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:pull_changes:Function]
|
||||
|
||||
# [DEF:sync_dashboard:Function]
|
||||
# @PURPOSE: Sync dashboard state from Superset to Git using the GitPlugin.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: source_env_id (Optional[str])
|
||||
@router.post("/repositories/{dashboard_id}/sync")
|
||||
async def sync_dashboard(dashboard_id: int, source_env_id: typing.Optional[str] = None):
|
||||
with belief_scope("sync_dashboard"):
|
||||
try:
|
||||
from src.plugins.git_plugin import GitPlugin
|
||||
plugin = GitPlugin()
|
||||
return await plugin.execute({
|
||||
"operation": "sync",
|
||||
"dashboard_id": dashboard_id,
|
||||
"source_env_id": source_env_id
|
||||
})
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:sync_dashboard:Function]
|
||||
|
||||
# [DEF:get_environments:Function]
|
||||
# @PURPOSE: List all deployment environments.
|
||||
# @RETURN: List[DeploymentEnvironmentSchema]
|
||||
@router.get("/environments", response_model=List[DeploymentEnvironmentSchema])
|
||||
async def get_environments(config_manager=Depends(get_config_manager)):
|
||||
with belief_scope("get_environments"):
|
||||
envs = config_manager.get_environments()
|
||||
return [
|
||||
DeploymentEnvironmentSchema(
|
||||
id=e.id,
|
||||
name=e.name,
|
||||
superset_url=e.url,
|
||||
is_active=True
|
||||
) for e in envs
|
||||
]
|
||||
# [/DEF:get_environments:Function]
|
||||
|
||||
# [DEF:deploy_dashboard:Function]
|
||||
# @PURPOSE: Deploy dashboard from Git to a target environment.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: deploy_data (DeployRequest)
|
||||
@router.post("/repositories/{dashboard_id}/deploy")
|
||||
async def deploy_dashboard(dashboard_id: int, deploy_data: DeployRequest):
|
||||
with belief_scope("deploy_dashboard"):
|
||||
try:
|
||||
from src.plugins.git_plugin import GitPlugin
|
||||
plugin = GitPlugin()
|
||||
return await plugin.execute({
|
||||
"operation": "deploy",
|
||||
"dashboard_id": dashboard_id,
|
||||
"environment_id": deploy_data.environment_id
|
||||
})
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:deploy_dashboard:Function]
|
||||
|
||||
# [DEF:get_history:Function]
|
||||
# @PURPOSE: View commit history for a dashboard's repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: limit (int)
|
||||
# @RETURN: List[CommitSchema]
|
||||
@router.get("/repositories/{dashboard_id}/history", response_model=List[CommitSchema])
|
||||
async def get_history(dashboard_id: int, limit: int = 50):
|
||||
with belief_scope("get_history"):
|
||||
try:
|
||||
return git_service.get_commit_history(dashboard_id, limit)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=404, detail=str(e))
|
||||
# [/DEF:get_history:Function]
|
||||
|
||||
# [DEF:get_repository_status:Function]
|
||||
# @PURPOSE: Get current Git status for a dashboard repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @RETURN: dict
|
||||
@router.get("/repositories/{dashboard_id}/status")
|
||||
async def get_repository_status(dashboard_id: int):
|
||||
with belief_scope("get_repository_status"):
|
||||
try:
|
||||
return git_service.get_status(dashboard_id)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:get_repository_status:Function]
|
||||
|
||||
# [DEF:get_repository_diff:Function]
|
||||
# @PURPOSE: Get Git diff for a dashboard repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: file_path (Optional[str])
|
||||
# @PARAM: staged (bool)
|
||||
# @RETURN: str
|
||||
@router.get("/repositories/{dashboard_id}/diff")
|
||||
async def get_repository_diff(dashboard_id: int, file_path: Optional[str] = None, staged: bool = False):
|
||||
with belief_scope("get_repository_diff"):
|
||||
try:
|
||||
diff_text = git_service.get_diff(dashboard_id, file_path, staged)
|
||||
return diff_text
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
# [/DEF:get_repository_diff:Function]
|
||||
|
||||
# [/DEF:backend.src.api.routes.git:Module]
|
||||
backend/src/api/routes/git_schemas.py (Normal file), 130 lines changed
@@ -0,0 +1,130 @@
|
||||
# [DEF:backend.src.api.routes.git_schemas:Module]
|
||||
#
|
||||
# @SEMANTICS: git, schemas, pydantic, api, contracts
|
||||
# @PURPOSE: Defines Pydantic models for the Git integration API layer.
|
||||
# @LAYER: API
|
||||
# @RELATION: DEPENDS_ON -> backend.src.models.git
|
||||
#
|
||||
# @INVARIANT: All schemas must be compatible with the FastAPI router.
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
from typing import List, Optional
|
||||
from datetime import datetime
|
||||
from uuid import UUID
|
||||
from src.models.git import GitProvider, GitStatus, SyncStatus
|
||||
|
||||
# [DEF:GitServerConfigBase:Class]
|
||||
class GitServerConfigBase(BaseModel):
|
||||
name: str = Field(..., description="Display name for the Git server")
|
||||
provider: GitProvider = Field(..., description="Git provider (GITHUB, GITLAB, GITEA)")
|
||||
url: str = Field(..., description="Server base URL")
|
||||
pat: str = Field(..., description="Personal Access Token")
|
||||
default_repository: Optional[str] = Field(None, description="Default repository path (org/repo)")
|
||||
# [/DEF:GitServerConfigBase:Class]
|
||||
|
||||
# [DEF:GitServerConfigCreate:Class]
|
||||
class GitServerConfigCreate(GitServerConfigBase):
|
||||
"""Schema for creating a new Git server configuration."""
|
||||
pass
|
||||
# [/DEF:GitServerConfigCreate:Class]
|
||||
|
||||
# [DEF:GitServerConfigSchema:Class]
|
||||
class GitServerConfigSchema(GitServerConfigBase):
|
||||
"""Schema for representing a Git server configuration with metadata."""
|
||||
id: str
|
||||
status: GitStatus
|
||||
last_validated: datetime
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
# [/DEF:GitServerConfigSchema:Class]
|
||||
|
||||
# [DEF:GitRepositorySchema:Class]
|
||||
class GitRepositorySchema(BaseModel):
|
||||
"""Schema for tracking a local Git repository linked to a dashboard."""
|
||||
id: str
|
||||
dashboard_id: int
|
||||
config_id: str
|
||||
remote_url: str
|
||||
local_path: str
|
||||
current_branch: str
|
||||
sync_status: SyncStatus
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
# [/DEF:GitRepositorySchema:Class]
|
||||
|
||||
# [DEF:BranchSchema:Class]
|
||||
class BranchSchema(BaseModel):
|
||||
"""Schema for representing a Git branch."""
|
||||
name: str
|
||||
commit_hash: str
|
||||
is_remote: bool
|
||||
last_updated: datetime
|
||||
# [/DEF:BranchSchema:Class]
|
||||
|
||||
# [DEF:CommitSchema:Class]
|
||||
class CommitSchema(BaseModel):
|
||||
"""Schema for representing a Git commit."""
|
||||
hash: str
|
||||
author: str
|
||||
email: str
|
||||
timestamp: datetime
|
||||
message: str
|
||||
files_changed: List[str]
|
||||
# [/DEF:CommitSchema:Class]
|
||||
|
||||
# [DEF:BranchCreate:Class]
|
||||
class BranchCreate(BaseModel):
|
||||
"""Schema for branch creation requests."""
|
||||
name: str
|
||||
from_branch: str
|
||||
# [/DEF:BranchCreate:Class]
|
||||
|
||||
# [DEF:BranchCheckout:Class]
|
||||
class BranchCheckout(BaseModel):
|
||||
"""Schema for branch checkout requests."""
|
||||
name: str
|
||||
# [/DEF:BranchCheckout:Class]
|
||||
|
||||
# [DEF:CommitCreate:Class]
|
||||
class CommitCreate(BaseModel):
|
||||
"""Schema for staging and committing changes."""
|
||||
message: str
|
||||
files: List[str]
|
||||
# [/DEF:CommitCreate:Class]
|
||||
|
||||
# [DEF:ConflictResolution:Class]
|
||||
class ConflictResolution(BaseModel):
|
||||
"""Schema for resolving merge conflicts."""
|
||||
file_path: str
|
||||
resolution: str = Field(pattern="^(mine|theirs|manual)$")
|
||||
content: Optional[str] = None
|
||||
# [/DEF:ConflictResolution:Class]
|
||||
|
||||
# [DEF:DeploymentEnvironmentSchema:Class]
|
||||
class DeploymentEnvironmentSchema(BaseModel):
|
||||
"""Schema for representing a target deployment environment."""
|
||||
id: str
|
||||
name: str
|
||||
superset_url: str
|
||||
is_active: bool
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
# [/DEF:DeploymentEnvironmentSchema:Class]
|
||||
|
||||
# [DEF:DeployRequest:Class]
|
||||
class DeployRequest(BaseModel):
|
||||
"""Schema for deployment requests."""
|
||||
environment_id: str
|
||||
# [/DEF:DeployRequest:Class]
|
||||
|
||||
# [DEF:RepoInitRequest:Class]
|
||||
class RepoInitRequest(BaseModel):
|
||||
"""Schema for repository initialization requests."""
|
||||
config_id: str
|
||||
remote_url: str
|
||||
# [/DEF:RepoInitRequest:Class]
|
||||
|
||||
# [/DEF:backend.src.api.routes.git_schemas:Module]
|
||||
@@ -13,9 +13,10 @@
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List, Optional
from backend.src.dependencies import get_config_manager
from backend.src.core.database import get_db
from backend.src.models.mapping import DatabaseMapping
from ...core.logger import belief_scope
from ...dependencies import get_config_manager
from ...core.database import get_db
from ...models.mapping import DatabaseMapping
from pydantic import BaseModel
# [/SECTION]

@@ -53,13 +54,16 @@ class SuggestRequest(BaseModel):

# [DEF:get_mappings:Function]
# @PURPOSE: List all saved database mappings.
# @PRE: db session is injected.
# @POST: Returns filtered list of DatabaseMapping records.
@router.get("", response_model=List[MappingResponse])
async def get_mappings(
    source_env_id: Optional[str] = None,
    target_env_id: Optional[str] = None,
    db: Session = Depends(get_db)
):
    query = db.query(DatabaseMapping)
    with belief_scope("get_mappings"):
        query = db.query(DatabaseMapping)
        if source_env_id:
            query = query.filter(DatabaseMapping.source_env_id == source_env_id)
        if target_env_id:
@@ -69,42 +73,48 @@ async def get_mappings(

# [DEF:create_mapping:Function]
# @PURPOSE: Create or update a database mapping.
# @PRE: mapping is valid MappingCreate, db session is injected.
# @POST: DatabaseMapping created or updated in database.
@router.post("", response_model=MappingResponse)
async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
    # Check if mapping already exists
    existing = db.query(DatabaseMapping).filter(
        DatabaseMapping.source_env_id == mapping.source_env_id,
        DatabaseMapping.target_env_id == mapping.target_env_id,
        DatabaseMapping.source_db_uuid == mapping.source_db_uuid
    ).first()

    if existing:
        existing.target_db_uuid = mapping.target_db_uuid
        existing.target_db_name = mapping.target_db_name
    with belief_scope("create_mapping"):
        # Check if mapping already exists
        existing = db.query(DatabaseMapping).filter(
            DatabaseMapping.source_env_id == mapping.source_env_id,
            DatabaseMapping.target_env_id == mapping.target_env_id,
            DatabaseMapping.source_db_uuid == mapping.source_db_uuid
        ).first()

        if existing:
            existing.target_db_uuid = mapping.target_db_uuid
            existing.target_db_name = mapping.target_db_name
            db.commit()
            db.refresh(existing)
            return existing

        new_mapping = DatabaseMapping(**mapping.dict())
        db.add(new_mapping)
        db.commit()
    db.refresh(existing)
    return existing

    new_mapping = DatabaseMapping(**mapping.dict())
    db.add(new_mapping)
    db.commit()
    db.refresh(new_mapping)
    return new_mapping
        db.refresh(new_mapping)
        return new_mapping
# [/DEF:create_mapping:Function]

# [DEF:suggest_mappings_api:Function]
# @PURPOSE: Get suggested mappings based on fuzzy matching.
# @PRE: request is valid SuggestRequest, config_manager is injected.
# @POST: Returns mapping suggestions.
@router.post("/suggest")
async def suggest_mappings_api(
    request: SuggestRequest,
    config_manager=Depends(get_config_manager)
):
    from backend.src.services.mapping_service import MappingService
    service = MappingService(config_manager)
    try:
        return await service.get_suggestions(request.source_env_id, request.target_env_id)
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
    with belief_scope("suggest_mappings_api"):
        from backend.src.services.mapping_service import MappingService
        service = MappingService(config_manager)
        try:
            return await service.get_suggestions(request.source_env_id, request.target_env_id)
        except Exception as e:
            raise HTTPException(status_code=500, detail=str(e))
# [/DEF:suggest_mappings_api:Function]

# [/DEF:backend.src.api.routes.mappings:Module]
@@ -7,10 +7,10 @@

from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict
from backend.src.dependencies import get_config_manager, get_task_manager
from backend.src.models.dashboard import DashboardMetadata, DashboardSelection
from backend.src.core.superset_client import SupersetClient
from superset_tool.models import SupersetConfig
from ...dependencies import get_config_manager, get_task_manager
from ...models.dashboard import DashboardMetadata, DashboardSelection
from ...core.superset_client import SupersetClient
from ...core.logger import belief_scope

router = APIRouter(prefix="/api", tags=["migration"])

@@ -22,19 +22,13 @@ router = APIRouter(prefix="/api", tags=["migration"])
# @RETURN: List[DashboardMetadata]
@router.get("/environments/{env_id}/dashboards", response_model=List[DashboardMetadata])
async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)):
    environments = config_manager.get_environments()
    with belief_scope("get_dashboards", f"env_id={env_id}"):
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            raise HTTPException(status_code=404, detail="Environment not found")

        config = SupersetConfig(
            env=env.name,
            base_url=env.url,
            auth={'provider': 'db', 'username': env.username, 'password': env.password, 'refresh': False},
            verify_ssl=True,
            timeout=30
        )
        client = SupersetClient(config)
        client = SupersetClient(env)
        dashboards = client.get_dashboards_summary()
        return dashboards
# [/DEF:get_dashboards:Function]
@@ -47,8 +41,9 @@ async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)
# @RETURN: Dict - {"task_id": str, "message": str}
@router.post("/migration/execute")
async def execute_migration(selection: DashboardSelection, config_manager=Depends(get_config_manager), task_manager=Depends(get_task_manager)):
    # Validate environments exist
    environments = config_manager.get_environments()
    with belief_scope("execute_migration"):
        # Validate environments exist
        environments = config_manager.get_environments()
        env_ids = {e.id for e in environments}
        if selection.source_env_id not in env_ids or selection.target_env_id not in env_ids:
            raise HTTPException(status_code=400, detail="Invalid source or target environment")

@@ -8,15 +8,23 @@ from fastapi import APIRouter, Depends
|
||||
|
||||
from ...core.plugin_base import PluginConfig
|
||||
from ...dependencies import get_plugin_loader
|
||||
from ...core.logger import belief_scope
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
@router.get("/", response_model=List[PluginConfig])
|
||||
# [DEF:list_plugins:Function]
|
||||
# @PURPOSE: Retrieve a list of all available plugins.
|
||||
# @PRE: plugin_loader is injected via Depends.
|
||||
# @POST: Returns a list of PluginConfig objects.
|
||||
# @RETURN: List[PluginConfig] - List of registered plugins.
|
||||
@router.get("", response_model=List[PluginConfig])
|
||||
async def list_plugins(
|
||||
plugin_loader = Depends(get_plugin_loader)
|
||||
):
|
||||
"""
|
||||
Retrieve a list of all available plugins.
|
||||
"""
|
||||
return plugin_loader.get_all_plugin_configs()
|
||||
with belief_scope("list_plugins"):
|
||||
"""
|
||||
Retrieve a list of all available plugins.
|
||||
"""
|
||||
return plugin_loader.get_all_plugin_configs()
|
||||
# [/DEF:list_plugins:Function]
|
||||
# [/DEF:PluginsRouter:Module]
|
||||
@@ -15,9 +15,8 @@ from typing import List
|
||||
from ...core.config_models import AppConfig, Environment, GlobalSettings
|
||||
from ...dependencies import get_config_manager
|
||||
from ...core.config_manager import ConfigManager
|
||||
from ...core.logger import logger
|
||||
from ...core.logger import logger, belief_scope
|
||||
from ...core.superset_client import SupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
import os
|
||||
# [/SECTION]
|
||||
|
||||
@@ -25,10 +24,13 @@ router = APIRouter()
|
||||
|
||||
# [DEF:get_settings:Function]
|
||||
# @PURPOSE: Retrieves all application settings.
|
||||
# @PRE: Config manager is available.
|
||||
# @POST: Returns masked AppConfig.
|
||||
# @RETURN: AppConfig - The current configuration.
|
||||
@router.get("/", response_model=AppConfig)
|
||||
@router.get("", response_model=AppConfig)
|
||||
async def get_settings(config_manager: ConfigManager = Depends(get_config_manager)):
|
||||
logger.info("[get_settings][Entry] Fetching all settings")
|
||||
with belief_scope("get_settings"):
|
||||
logger.info("[get_settings][Entry] Fetching all settings")
|
||||
config = config_manager.get_config().copy(deep=True)
|
||||
# Mask passwords
|
||||
for env in config.environments:
|
||||
@@ -39,29 +41,37 @@ async def get_settings(config_manager: ConfigManager = Depends(get_config_manage
|
||||
|
||||
# [DEF:update_global_settings:Function]
|
||||
# @PURPOSE: Updates global application settings.
|
||||
# @PRE: New settings are provided.
|
||||
# @POST: Global settings are updated.
|
||||
# @PARAM: settings (GlobalSettings) - The new global settings.
|
||||
# @RETURN: GlobalSettings - The updated settings.
|
||||
@router.patch("/global", response_model=GlobalSettings)
|
||||
async def update_global_settings(
|
||||
settings: GlobalSettings,
|
||||
settings: GlobalSettings,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
logger.info("[update_global_settings][Entry] Updating global settings")
|
||||
with belief_scope("update_global_settings"):
|
||||
logger.info("[update_global_settings][Entry] Updating global settings")
|
||||
config_manager.update_global_settings(settings)
|
||||
return settings
|
||||
# [/DEF:update_global_settings:Function]
|
||||
|
||||
# [DEF:get_environments:Function]
|
||||
# @PURPOSE: Lists all configured Superset environments.
|
||||
# @PRE: Config manager is available.
|
||||
# @POST: Returns list of environments.
|
||||
# @RETURN: List[Environment] - List of environments.
|
||||
@router.get("/environments", response_model=List[Environment])
|
||||
async def get_environments(config_manager: ConfigManager = Depends(get_config_manager)):
|
||||
logger.info("[get_environments][Entry] Fetching environments")
|
||||
with belief_scope("get_environments"):
|
||||
logger.info("[get_environments][Entry] Fetching environments")
|
||||
return config_manager.get_environments()
|
||||
# [/DEF:get_environments:Function]
|
||||
|
||||
# [DEF:add_environment:Function]
|
||||
# @PURPOSE: Adds a new Superset environment.
|
||||
# @PRE: Environment data is valid and reachable.
|
||||
# @POST: Environment is added to config.
|
||||
# @PARAM: env (Environment) - The environment to add.
|
||||
# @RETURN: Environment - The added environment.
|
||||
@router.post("/environments", response_model=Environment)
|
||||
@@ -69,21 +79,12 @@ async def add_environment(
|
||||
env: Environment,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
||||
with belief_scope("add_environment"):
|
||||
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
||||
|
||||
# Validate connection before adding
|
||||
try:
|
||||
superset_config = SupersetConfig(
|
||||
env=env.name,
|
||||
base_url=env.url,
|
||||
auth={
|
||||
"provider": "db",
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"refresh": "true"
|
||||
}
|
||||
)
|
||||
client = SupersetClient(config=superset_config)
|
||||
client = SupersetClient(env)
|
||||
client.get_dashboards(query={"page_size": 1})
|
||||
except Exception as e:
|
||||
logger.error(f"[add_environment][Coherence:Failed] Connection validation failed: {e}")
|
||||
@@ -95,16 +96,19 @@ async def add_environment(
|
||||
|
||||
# [DEF:update_environment:Function]
|
||||
# @PURPOSE: Updates an existing Superset environment.
|
||||
# @PRE: ID and valid environment data are provided.
|
||||
# @POST: Environment is updated in config.
|
||||
# @PARAM: id (str) - The ID of the environment to update.
|
||||
# @PARAM: env (Environment) - The updated environment data.
|
||||
# @RETURN: Environment - The updated environment.
|
||||
@router.put("/environments/{id}", response_model=Environment)
|
||||
async def update_environment(
|
||||
id: str,
|
||||
env: Environment,
|
||||
id: str,
|
||||
env: Environment,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
logger.info(f"[update_environment][Entry] Updating environment {id}")
|
||||
with belief_scope("update_environment"):
|
||||
logger.info(f"[update_environment][Entry] Updating environment {id}")
|
||||
|
||||
# If password is masked, we need the real one for validation
|
||||
env_to_validate = env.copy(deep=True)
|
||||
@@ -115,17 +119,7 @@ async def update_environment(
|
||||
|
||||
# Validate connection before updating
|
||||
try:
|
||||
superset_config = SupersetConfig(
|
||||
env=env_to_validate.name,
|
||||
base_url=env_to_validate.url,
|
||||
auth={
|
||||
"provider": "db",
|
||||
"username": env_to_validate.username,
|
||||
"password": env_to_validate.password,
|
||||
"refresh": "true"
|
||||
}
|
||||
)
|
||||
client = SupersetClient(config=superset_config)
|
||||
client = SupersetClient(env_to_validate)
|
||||
client.get_dashboards(query={"page_size": 1})
|
||||
except Exception as e:
|
||||
logger.error(f"[update_environment][Coherence:Failed] Connection validation failed: {e}")
|
||||
@@ -138,19 +132,24 @@ async def update_environment(
|
||||
|
||||
# [DEF:delete_environment:Function]
|
||||
# @PURPOSE: Deletes a Superset environment.
|
||||
# @PRE: ID is provided.
|
||||
# @POST: Environment is removed from config.
|
||||
# @PARAM: id (str) - The ID of the environment to delete.
|
||||
@router.delete("/environments/{id}")
|
||||
async def delete_environment(
|
||||
id: str,
|
||||
id: str,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
logger.info(f"[delete_environment][Entry] Deleting environment {id}")
|
||||
with belief_scope("delete_environment"):
|
||||
logger.info(f"[delete_environment][Entry] Deleting environment {id}")
|
||||
config_manager.delete_environment(id)
|
||||
return {"message": f"Environment {id} deleted"}
|
||||
# [/DEF:delete_environment:Function]
|
||||
|
||||
# [DEF:test_environment_connection:Function]
|
||||
# @PURPOSE: Tests the connection to a Superset environment.
|
||||
# @PRE: ID is provided.
|
||||
# @POST: Returns success or error status.
|
||||
# @PARAM: id (str) - The ID of the environment to test.
|
||||
# @RETURN: dict - Success message or error.
|
||||
@router.post("/environments/{id}/test")
|
||||
@@ -158,7 +157,8 @@ async def test_environment_connection(
|
||||
id: str,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
logger.info(f"[test_environment_connection][Entry] Testing environment {id}")
|
||||
with belief_scope("test_environment_connection"):
|
||||
logger.info(f"[test_environment_connection][Entry] Testing environment {id}")
|
||||
|
||||
# Find environment
|
||||
env = next((e for e in config_manager.get_environments() if e.id == id), None)
|
||||
@@ -166,21 +166,8 @@ async def test_environment_connection(
|
||||
raise HTTPException(status_code=404, detail=f"Environment {id} not found")
|
||||
|
||||
try:
|
||||
# Create SupersetConfig
|
||||
# Note: SupersetConfig expects 'auth' dict with specific keys
|
||||
superset_config = SupersetConfig(
|
||||
env=env.name,
|
||||
base_url=env.url,
|
||||
auth={
|
||||
"provider": "db", # Defaulting to db for now
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"refresh": "true"
|
||||
}
|
||||
)
|
||||
|
||||
# Initialize client (this will trigger authentication)
|
||||
client = SupersetClient(config=superset_config)
|
||||
client = SupersetClient(env)
|
||||
|
||||
# Try a simple request to verify
|
||||
client.get_dashboards(query={"page_size": 1})
|
||||
@@ -194,6 +181,8 @@ async def test_environment_connection(
|
||||
|
||||
# [DEF:validate_backup_path:Function]
|
||||
# @PURPOSE: Validates if a backup path exists and is writable.
|
||||
# @PRE: Path is provided in path_data.
|
||||
# @POST: Returns success or error status.
|
||||
# @PARAM: path (str) - The path to validate.
|
||||
# @RETURN: dict - Validation result.
|
||||
@router.post("/validate-path")
|
||||
@@ -201,11 +190,12 @@ async def validate_backup_path(
|
||||
path_data: dict,
|
||||
config_manager: ConfigManager = Depends(get_config_manager)
|
||||
):
|
||||
path = path_data.get("path")
|
||||
if not path:
|
||||
raise HTTPException(status_code=400, detail="Path is required")
|
||||
|
||||
logger.info(f"[validate_backup_path][Entry] Validating path: {path}")
|
||||
with belief_scope("validate_backup_path"):
|
||||
path = path_data.get("path")
|
||||
if not path:
|
||||
raise HTTPException(status_code=400, detail="Path is required")
|
||||
|
||||
logger.info(f"[validate_backup_path][Entry] Validating path: {path}")
|
||||
|
||||
valid, message = config_manager.validate_path(path)
|
||||
|
||||
|
||||
@@ -6,6 +6,7 @@
|
||||
from typing import List, Dict, Any, Optional
|
||||
from fastapi import APIRouter, Depends, HTTPException, status
|
||||
from pydantic import BaseModel
|
||||
from ...core.logger import belief_scope
|
||||
|
||||
from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
|
||||
from ...dependencies import get_task_manager
|
||||
@@ -23,6 +24,13 @@ class ResumeTaskRequest(BaseModel):
|
||||
passwords: Dict[str, str]
|
||||
|
||||
@router.post("", response_model=Task, status_code=status.HTTP_201_CREATED)
|
||||
# [DEF:create_task:Function]
|
||||
# @PURPOSE: Create and start a new task for a given plugin.
|
||||
# @PARAM: request (CreateTaskRequest) - The request body containing plugin_id and params.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: plugin_id must exist and params must be valid for that plugin.
|
||||
# @POST: A new task is created and started.
|
||||
# @RETURN: Task - The created task instance.
|
||||
async def create_task(
|
||||
request: CreateTaskRequest,
|
||||
task_manager: TaskManager = Depends(get_task_manager)
|
||||
@@ -30,16 +38,27 @@ async def create_task(
|
||||
"""
|
||||
Create and start a new task for a given plugin.
|
||||
"""
|
||||
try:
|
||||
task = await task_manager.create_task(
|
||||
plugin_id=request.plugin_id,
|
||||
params=request.params
|
||||
)
|
||||
return task
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))
|
||||
with belief_scope("create_task"):
|
||||
try:
|
||||
task = await task_manager.create_task(
|
||||
plugin_id=request.plugin_id,
|
||||
params=request.params
|
||||
)
|
||||
return task
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))
|
||||
# [/DEF:create_task:Function]
|
||||
|
||||
@router.get("", response_model=List[Task])
|
||||
# [DEF:list_tasks:Function]
|
||||
# @PURPOSE: Retrieve a list of tasks with pagination and optional status filter.
|
||||
# @PARAM: limit (int) - Maximum number of tasks to return.
|
||||
# @PARAM: offset (int) - Number of tasks to skip.
|
||||
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task_manager must be available.
|
||||
# @POST: Returns a list of tasks.
|
||||
# @RETURN: List[Task] - List of tasks.
|
||||
async def list_tasks(
|
||||
limit: int = 10,
|
||||
offset: int = 0,
|
||||
@@ -49,9 +68,18 @@ async def list_tasks(
|
||||
"""
|
||||
Retrieve a list of tasks with pagination and optional status filter.
|
||||
"""
|
||||
return task_manager.get_tasks(limit=limit, offset=offset, status=status)
|
||||
with belief_scope("list_tasks"):
|
||||
return task_manager.get_tasks(limit=limit, offset=offset, status=status)
|
||||
# [/DEF:list_tasks:Function]
|
||||
|
||||
@router.get("/{task_id}", response_model=Task)
|
||||
# [DEF:get_task:Function]
|
||||
# @PURPOSE: Retrieve the details of a specific task.
|
||||
# @PARAM: task_id (str) - The unique identifier of the task.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task_id must exist.
|
||||
# @POST: Returns task details or raises 404.
|
||||
# @RETURN: Task - The task details.
|
||||
async def get_task(
|
||||
task_id: str,
|
||||
task_manager: TaskManager = Depends(get_task_manager)
|
||||
@@ -59,12 +87,21 @@ async def get_task(
|
||||
"""
|
||||
Retrieve the details of a specific task.
|
||||
"""
|
||||
task = task_manager.get_task(task_id)
|
||||
if not task:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
|
||||
return task
|
||||
with belief_scope("get_task"):
|
||||
task = task_manager.get_task(task_id)
|
||||
if not task:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
|
||||
return task
|
||||
# [/DEF:get_task:Function]
|
||||
|
||||
@router.get("/{task_id}/logs", response_model=List[LogEntry])
|
||||
# [DEF:get_task_logs:Function]
|
||||
# @PURPOSE: Retrieve logs for a specific task.
|
||||
# @PARAM: task_id (str) - The unique identifier of the task.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task_id must exist.
|
||||
# @POST: Returns a list of log entries or raises 404.
|
||||
# @RETURN: List[LogEntry] - List of log entries.
|
||||
async def get_task_logs(
|
||||
task_id: str,
|
||||
task_manager: TaskManager = Depends(get_task_manager)
|
||||
@@ -72,12 +109,22 @@ async def get_task_logs(
|
||||
"""
|
||||
Retrieve logs for a specific task.
|
||||
"""
|
||||
task = task_manager.get_task(task_id)
|
||||
if not task:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
|
||||
return task_manager.get_task_logs(task_id)
|
||||
with belief_scope("get_task_logs"):
|
||||
task = task_manager.get_task(task_id)
|
||||
if not task:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
|
||||
return task_manager.get_task_logs(task_id)
|
||||
# [/DEF:get_task_logs:Function]
|
||||
|
||||
@router.post("/{task_id}/resolve", response_model=Task)
|
||||
# [DEF:resolve_task:Function]
|
||||
# @PURPOSE: Resolve a task that is awaiting mapping.
|
||||
# @PARAM: task_id (str) - The unique identifier of the task.
|
||||
# @PARAM: request (ResolveTaskRequest) - The resolution parameters.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task must be in AWAITING_MAPPING status.
|
||||
# @POST: Task is resolved and resumes execution.
|
||||
# @RETURN: Task - The updated task object.
|
||||
async def resolve_task(
|
||||
task_id: str,
|
||||
request: ResolveTaskRequest,
|
||||
@@ -86,13 +133,23 @@ async def resolve_task(
|
||||
"""
|
||||
Resolve a task that is awaiting mapping.
|
||||
"""
|
||||
try:
|
||||
await task_manager.resolve_task(task_id, request.resolution_params)
|
||||
return task_manager.get_task(task_id)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
with belief_scope("resolve_task"):
|
||||
try:
|
||||
await task_manager.resolve_task(task_id, request.resolution_params)
|
||||
return task_manager.get_task(task_id)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
# [/DEF:resolve_task:Function]
|
||||
|
||||
@router.post("/{task_id}/resume", response_model=Task)
|
||||
# [DEF:resume_task:Function]
|
||||
# @PURPOSE: Resume a task that is awaiting input (e.g., passwords).
|
||||
# @PARAM: task_id (str) - The unique identifier of the task.
|
||||
# @PARAM: request (ResumeTaskRequest) - The input (passwords).
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task must be in AWAITING_INPUT status.
|
||||
# @POST: Task resumes execution with provided input.
|
||||
# @RETURN: Task - The updated task object.
|
||||
async def resume_task(
|
||||
task_id: str,
|
||||
request: ResumeTaskRequest,
|
||||
@@ -101,13 +158,21 @@ async def resume_task(
|
||||
"""
|
||||
Resume a task that is awaiting input (e.g., passwords).
|
||||
"""
|
||||
try:
|
||||
task_manager.resume_task_with_password(task_id, request.passwords)
|
||||
return task_manager.get_task(task_id)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
with belief_scope("resume_task"):
|
||||
try:
|
||||
task_manager.resume_task_with_password(task_id, request.passwords)
|
||||
return task_manager.get_task(task_id)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
# [/DEF:resume_task:Function]
|
||||
|
||||
@router.delete("", status_code=status.HTTP_204_NO_CONTENT)
|
||||
# [DEF:clear_tasks:Function]
|
||||
# @PURPOSE: Clear tasks matching the status filter.
|
||||
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
||||
# @PARAM: task_manager (TaskManager) - The task manager instance.
|
||||
# @PRE: task_manager is available.
|
||||
# @POST: Tasks are removed from memory/persistence.
|
||||
async def clear_tasks(
|
||||
status: Optional[TaskStatus] = None,
|
||||
task_manager: TaskManager = Depends(get_task_manager)
|
||||
@@ -115,6 +180,8 @@ async def clear_tasks(
"""
Clear tasks matching the status filter. If no filter, clears all non-running tasks.
"""
task_manager.clear_tasks(status)
return
with belief_scope("clear_tasks", f"status={status}"):
task_manager.clear_tasks(status)
return
# [/DEF:clear_tasks:Function]
# [/DEF:TasksRouter:Module]
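
Taken together, these endpoints form the task lifecycle the frontend drives: create, poll, fetch logs, resolve or resume, and clear. A hedged client-side sketch, assuming a local server on port 8000 and an illustrative plugin id of `backup`; the `/api/tasks` prefix comes from `app.py` later in this comparison.

```python
# Minimal task-lifecycle walkthrough against this router (assumptions noted above).
import httpx

BASE = "http://localhost:8000/api/tasks"

resp = httpx.post(BASE, json={"plugin_id": "backup", "params": {}})  # plugin_id is illustrative
resp.raise_for_status()
task = resp.json()

# Poll the task and fetch its buffered logs (assuming the Task model exposes an id).
status = httpx.get(f"{BASE}/{task['id']}").json()
logs = httpx.get(f"{BASE}/{task['id']}/logs").json()
```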
|
||||
@@ -6,12 +6,10 @@
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
# Add project root to sys.path to allow importing superset_tool
|
||||
# Assuming app.py is in backend/src/
|
||||
# project_root is used for static files mounting
|
||||
project_root = Path(__file__).resolve().parent.parent.parent
|
||||
sys.path.append(str(project_root))
|
||||
|
||||
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request
|
||||
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request, HTTPException
|
||||
from fastapi.middleware.cors import CORSMiddleware
|
||||
from fastapi.staticfiles import StaticFiles
|
||||
from fastapi.responses import FileResponse
|
||||
@@ -19,8 +17,8 @@ import asyncio
|
||||
import os
|
||||
|
||||
from .dependencies import get_task_manager, get_scheduler_service
|
||||
from .core.logger import logger
|
||||
from .api.routes import plugins, tasks, settings, environments, mappings, migration
|
||||
from .core.logger import logger, belief_scope
|
||||
from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git
|
||||
from .core.database import init_db
|
||||
|
||||
# [DEF:App:Global]
|
||||
@@ -33,17 +31,29 @@ app = FastAPI(
|
||||
)
|
||||
# [/DEF:App:Global]
|
||||
|
||||
# [DEF:startup_event:Function]
|
||||
# @PURPOSE: Handles application startup tasks, such as starting the scheduler.
|
||||
# @PRE: None.
|
||||
# @POST: Scheduler is started.
|
||||
# Startup event
|
||||
@app.on_event("startup")
|
||||
async def startup_event():
|
||||
scheduler = get_scheduler_service()
|
||||
with belief_scope("startup_event"):
|
||||
scheduler = get_scheduler_service()
|
||||
scheduler.start()
|
||||
# [/DEF:startup_event:Function]
|
||||
|
||||
# [DEF:shutdown_event:Function]
|
||||
# @PURPOSE: Handles application shutdown tasks, such as stopping the scheduler.
|
||||
# @PRE: None.
|
||||
# @POST: Scheduler is stopped.
|
||||
# Shutdown event
|
||||
@app.on_event("shutdown")
|
||||
async def shutdown_event():
|
||||
scheduler = get_scheduler_service()
|
||||
with belief_scope("shutdown_event"):
|
||||
scheduler = get_scheduler_service()
|
||||
scheduler.stop()
|
||||
# [/DEF:shutdown_event:Function]
|
||||
|
||||
# Configure CORS
|
||||
app.add_middleware(
|
||||
@@ -55,27 +65,39 @@ app.add_middleware(
|
||||
)
|
||||
|
||||
|
||||
# [DEF:log_requests:Function]
|
||||
# @PURPOSE: Middleware to log incoming HTTP requests and their response status.
|
||||
# @PRE: request is a FastAPI Request object.
|
||||
# @POST: Logs request and response details.
|
||||
# @PARAM: request (Request) - The incoming request object.
|
||||
# @PARAM: call_next (Callable) - The next middleware or route handler.
|
||||
@app.middleware("http")
|
||||
async def log_requests(request: Request, call_next):
|
||||
logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
|
||||
response = await call_next(request)
|
||||
logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
|
||||
return response
|
||||
with belief_scope("log_requests", f"{request.method} {request.url.path}"):
|
||||
logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
|
||||
response = await call_next(request)
|
||||
logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
|
||||
return response
|
||||
# [/DEF:log_requests:Function]
|
||||
|
||||
# Include API routes
|
||||
app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
|
||||
app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
|
||||
app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
|
||||
app.include_router(connections.router, prefix="/api/settings/connections", tags=["Connections"])
|
||||
app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
|
||||
app.include_router(mappings.router)
|
||||
app.include_router(migration.router)
|
||||
app.include_router(git.router)
|
||||
|
||||
# [DEF:WebSocketEndpoint:Endpoint]
|
||||
# @SEMANTICS: websocket, logs, streaming, real-time
|
||||
# @PURPOSE: Provides a WebSocket endpoint for clients to connect to and receive real-time log entries for a specific task.
|
||||
# [DEF:websocket_endpoint:Function]
|
||||
# @PURPOSE: Provides a WebSocket endpoint for real-time log streaming of a task.
|
||||
# @PRE: task_id must be a valid task ID.
|
||||
# @POST: WebSocket connection is managed and logs are streamed until disconnect.
|
||||
@app.websocket("/ws/logs/{task_id}")
|
||||
async def websocket_endpoint(websocket: WebSocket, task_id: str):
|
||||
await websocket.accept()
|
||||
with belief_scope("websocket_endpoint", f"task_id={task_id}"):
|
||||
await websocket.accept()
|
||||
logger.info(f"WebSocket connection accepted for task {task_id}")
|
||||
task_manager = get_task_manager()
|
||||
queue = await task_manager.subscribe_logs(task_id)
|
||||
@@ -125,7 +147,7 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
|
||||
logger.error(f"WebSocket error for task {task_id}: {e}")
|
||||
finally:
|
||||
task_manager.unsubscribe_logs(task_id, queue)
|
||||
# [/DEF:WebSocketEndpoint:Endpoint]
|
||||
# [/DEF:websocket_endpoint:Function]
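
A minimal client sketch for this log stream, assuming the server pushes JSON-serialised `LogEntry` objects (the exact payload shape is not visible in these lines) and using the third-party `websockets` package:

```python
# Hedged websocket consumer for /ws/logs/{task_id}; field names are assumptions
# based on the LogEntry model referenced elsewhere in this comparison.
import asyncio
import json
import websockets

async def tail_task_logs(task_id: str):
    uri = f"ws://localhost:8000/ws/logs/{task_id}"
    async with websockets.connect(uri) as ws:
        async for raw in ws:           # each message is one log entry
            entry = json.loads(raw)
            print(entry.get("timestamp"), entry.get("message"))

asyncio.run(tail_task_logs("some-task-id"))
```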
|
||||
|
||||
# [DEF:StaticFiles:Mount]
|
||||
# @SEMANTICS: static, frontend, spa
|
||||
@@ -135,20 +157,33 @@ if frontend_path.exists():
|
||||
app.mount("/_app", StaticFiles(directory=str(frontend_path / "_app")), name="static")
|
||||
|
||||
# Serve other static files from the root of build directory
|
||||
# [DEF:serve_spa:Function]
|
||||
# @PURPOSE: Serves frontend static files or index.html for SPA routing.
|
||||
# @PRE: file_path is requested by the client.
|
||||
# @POST: Returns the requested file or index.html as a fallback.
|
||||
@app.get("/{file_path:path}")
|
||||
async def serve_spa(file_path: str):
|
||||
full_path = frontend_path / file_path
|
||||
if full_path.is_file():
|
||||
return FileResponse(str(full_path))
|
||||
# Fallback to index.html for SPA routing
|
||||
return FileResponse(str(frontend_path / "index.html"))
|
||||
with belief_scope("serve_spa", f"path={file_path}"):
|
||||
# Don't serve SPA for API routes that fell through
|
||||
if file_path.startswith("api/"):
|
||||
logger.info(f"[DEBUG] API route fell through to serve_spa: {file_path}")
|
||||
raise HTTPException(status_code=404, detail=f"API endpoint not found: {file_path}")
|
||||
|
||||
full_path = frontend_path / file_path
|
||||
if full_path.is_file():
|
||||
return FileResponse(str(full_path))
|
||||
# Fallback to index.html for SPA routing
|
||||
return FileResponse(str(frontend_path / "index.html"))
|
||||
# [/DEF:serve_spa:Function]
|
||||
else:
|
||||
# [DEF:RootEndpoint:Endpoint]
|
||||
# @SEMANTICS: root, healthcheck
|
||||
# @PURPOSE: A simple root endpoint to confirm that the API is running.
|
||||
# [DEF:read_root:Function]
|
||||
# @PURPOSE: A simple root endpoint to confirm that the API is running when frontend is missing.
|
||||
# @PRE: None.
|
||||
# @POST: Returns a JSON message indicating API status.
|
||||
@app.get("/")
|
||||
async def read_root():
|
||||
return {"message": "Superset Tools API is running (Frontend build not found)"}
|
||||
# [/DEF:RootEndpoint:Endpoint]
|
||||
with belief_scope("read_root"):
|
||||
return {"message": "Superset Tools API is running (Frontend build not found)"}
|
||||
# [/DEF:read_root:Function]
|
||||
# [/DEF:StaticFiles:Mount]
|
||||
# [/DEF:AppModule:Module]
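
The practical effect of the new `api/` guard in `serve_spa` is that an unmatched API call now surfaces as a JSON 404 from FastAPI instead of being swallowed by the SPA fallback, which previously returned `index.html` with a 200 status for any unknown path. A behavioural sketch, assuming a local dev server with the frontend build present:

```python
# Illustrative check of the catch-all route's new behaviour.
import httpx

assert httpx.get("http://localhost:8000/api/does-not-exist").status_code == 404  # JSON 404, not index.html
assert httpx.get("http://localhost:8000/some/spa/route").status_code == 200      # SPA fallback still works
```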
|
||||
|
||||
@@ -16,7 +16,7 @@ import os
|
||||
from pathlib import Path
|
||||
from typing import Optional, List
|
||||
from .config_models import AppConfig, Environment, GlobalSettings
|
||||
from .logger import logger, configure_logger
|
||||
from .logger import logger, configure_logger, belief_scope
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:ConfigManager:Class]
|
||||
@@ -30,30 +30,33 @@ class ConfigManager:
|
||||
# @POST: self.config is an instance of AppConfig
|
||||
# @PARAM: config_path (str) - Path to the configuration file.
|
||||
def __init__(self, config_path: str = "config.json"):
|
||||
# 1. Runtime check of @PRE
|
||||
assert isinstance(config_path, str) and config_path, "config_path must be a non-empty string"
|
||||
|
||||
logger.info(f"[ConfigManager][Entry] Initializing with {config_path}")
|
||||
|
||||
# 2. Logic implementation
|
||||
self.config_path = Path(config_path)
|
||||
self.config: AppConfig = self._load_config()
|
||||
with belief_scope("__init__"):
|
||||
# 1. Runtime check of @PRE
|
||||
assert isinstance(config_path, str) and config_path, "config_path must be a non-empty string"
|
||||
|
||||
logger.info(f"[ConfigManager][Entry] Initializing with {config_path}")
|
||||
|
||||
# 2. Logic implementation
|
||||
self.config_path = Path(config_path)
|
||||
self.config: AppConfig = self._load_config()
|
||||
|
||||
# Configure logger with loaded settings
|
||||
configure_logger(self.config.settings.logging)
|
||||
# Configure logger with loaded settings
|
||||
configure_logger(self.config.settings.logging)
|
||||
|
||||
# 3. Runtime check of @POST
|
||||
assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"
|
||||
# 3. Runtime check of @POST
|
||||
assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"
|
||||
|
||||
logger.info(f"[ConfigManager][Exit] Initialized")
|
||||
logger.info(f"[ConfigManager][Exit] Initialized")
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:_load_config:Function]
|
||||
# @PURPOSE: Loads the configuration from disk or creates a default one.
|
||||
# @PRE: self.config_path is set.
|
||||
# @POST: isinstance(return, AppConfig)
|
||||
# @RETURN: AppConfig - The loaded or default configuration.
|
||||
def _load_config(self) -> AppConfig:
|
||||
logger.debug(f"[_load_config][Entry] Loading from {self.config_path}")
|
||||
with belief_scope("_load_config"):
|
||||
logger.debug(f"[_load_config][Entry] Loading from {self.config_path}")
|
||||
|
||||
if not self.config_path.exists():
|
||||
logger.info(f"[_load_config][Action] Config file not found. Creating default.")
|
||||
@@ -83,9 +86,11 @@ class ConfigManager:
|
||||
# [DEF:_save_config_to_disk:Function]
|
||||
# @PURPOSE: Saves the provided configuration object to disk.
|
||||
# @PRE: isinstance(config, AppConfig)
|
||||
# @POST: Configuration saved to disk.
|
||||
# @PARAM: config (AppConfig) - The configuration to save.
|
||||
def _save_config_to_disk(self, config: AppConfig):
|
||||
logger.debug(f"[_save_config_to_disk][Entry] Saving to {self.config_path}")
|
||||
with belief_scope("_save_config_to_disk"):
|
||||
logger.debug(f"[_save_config_to_disk][Entry] Saving to {self.config_path}")
|
||||
|
||||
# 1. Runtime check of @PRE
|
||||
assert isinstance(config, AppConfig), "config must be an instance of AppConfig"
|
||||
@@ -101,23 +106,31 @@ class ConfigManager:
|
||||
|
||||
# [DEF:save:Function]
|
||||
# @PURPOSE: Saves the current configuration state to disk.
|
||||
# @PRE: self.config is set.
|
||||
# @POST: self._save_config_to_disk called.
|
||||
def save(self):
|
||||
self._save_config_to_disk(self.config)
|
||||
with belief_scope("save"):
|
||||
self._save_config_to_disk(self.config)
|
||||
# [/DEF:save:Function]
|
||||
|
||||
# [DEF:get_config:Function]
|
||||
# @PURPOSE: Returns the current configuration.
|
||||
# @PRE: self.config is set.
|
||||
# @POST: Returns self.config.
|
||||
# @RETURN: AppConfig - The current configuration.
|
||||
def get_config(self) -> AppConfig:
|
||||
return self.config
|
||||
with belief_scope("get_config"):
|
||||
return self.config
|
||||
# [/DEF:get_config:Function]
|
||||
|
||||
# [DEF:update_global_settings:Function]
|
||||
# @PURPOSE: Updates the global settings and persists the change.
|
||||
# @PRE: isinstance(settings, GlobalSettings)
|
||||
# @POST: self.config.settings updated and saved.
|
||||
# @PARAM: settings (GlobalSettings) - The new global settings.
|
||||
def update_global_settings(self, settings: GlobalSettings):
|
||||
logger.info(f"[update_global_settings][Entry] Updating settings")
|
||||
with belief_scope("update_global_settings"):
|
||||
logger.info(f"[update_global_settings][Entry] Updating settings")
|
||||
|
||||
# 1. Runtime check of @PRE
|
||||
assert isinstance(settings, GlobalSettings), "settings must be an instance of GlobalSettings"
|
||||
@@ -134,10 +147,13 @@ class ConfigManager:
|
||||
|
||||
# [DEF:validate_path:Function]
|
||||
# @PURPOSE: Validates if a path exists and is writable.
|
||||
# @PRE: path is a string.
|
||||
# @POST: Returns (bool, str) status.
|
||||
# @PARAM: path (str) - The path to validate.
|
||||
# @RETURN: tuple (bool, str) - (is_valid, message)
|
||||
def validate_path(self, path: str) -> tuple[bool, str]:
|
||||
p = os.path.abspath(path)
|
||||
with belief_scope("validate_path"):
|
||||
p = os.path.abspath(path)
|
||||
if not os.path.exists(p):
|
||||
try:
|
||||
os.makedirs(p, exist_ok=True)
|
||||
@@ -152,24 +168,46 @@ class ConfigManager:
|
||||
|
||||
# [DEF:get_environments:Function]
|
||||
# @PURPOSE: Returns the list of configured environments.
|
||||
# @PRE: self.config is set.
|
||||
# @POST: Returns list of environments.
|
||||
# @RETURN: List[Environment] - List of environments.
|
||||
def get_environments(self) -> List[Environment]:
|
||||
return self.config.environments
|
||||
with belief_scope("get_environments"):
|
||||
return self.config.environments
|
||||
# [/DEF:get_environments:Function]
|
||||
|
||||
# [DEF:has_environments:Function]
|
||||
# @PURPOSE: Checks if at least one environment is configured.
|
||||
# @PRE: self.config is set.
|
||||
# @POST: Returns boolean indicating if environments exist.
|
||||
# @RETURN: bool - True if at least one environment exists.
|
||||
def has_environments(self) -> bool:
|
||||
return len(self.config.environments) > 0
|
||||
with belief_scope("has_environments"):
|
||||
return len(self.config.environments) > 0
|
||||
# [/DEF:has_environments:Function]
|
||||
|
||||
# [DEF:get_environment:Function]
|
||||
# @PURPOSE: Returns a single environment by ID.
|
||||
# @PRE: self.config is set and isinstance(env_id, str) and len(env_id) > 0.
|
||||
# @POST: Returns Environment object if found, None otherwise.
|
||||
# @PARAM: env_id (str) - The ID of the environment to retrieve.
|
||||
# @RETURN: Optional[Environment] - The environment with the given ID, or None.
|
||||
def get_environment(self, env_id: str) -> Optional[Environment]:
|
||||
with belief_scope("get_environment"):
|
||||
for env in self.config.environments:
|
||||
if env.id == env_id:
|
||||
return env
|
||||
return None
|
||||
# [/DEF:get_environment:Function]
|
||||
|
||||
# [DEF:add_environment:Function]
|
||||
# @PURPOSE: Adds a new environment to the configuration.
|
||||
# @PRE: isinstance(env, Environment)
|
||||
# @POST: Environment added or updated in self.config.environments.
|
||||
# @PARAM: env (Environment) - The environment to add.
|
||||
def add_environment(self, env: Environment):
|
||||
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
||||
with belief_scope("add_environment"):
|
||||
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
||||
|
||||
# 1. Runtime check of @PRE
|
||||
assert isinstance(env, Environment), "env must be an instance of Environment"
|
||||
@@ -186,11 +224,13 @@ class ConfigManager:
|
||||
# [DEF:update_environment:Function]
|
||||
# @PURPOSE: Updates an existing environment.
|
||||
# @PRE: isinstance(env_id, str) and len(env_id) > 0 and isinstance(updated_env, Environment)
|
||||
# @POST: Returns True if environment was found and updated.
|
||||
# @PARAM: env_id (str) - The ID of the environment to update.
|
||||
# @PARAM: updated_env (Environment) - The updated environment data.
|
||||
# @RETURN: bool - True if updated, False otherwise.
|
||||
def update_environment(self, env_id: str, updated_env: Environment) -> bool:
|
||||
logger.info(f"[update_environment][Entry] Updating {env_id}")
|
||||
with belief_scope("update_environment"):
|
||||
logger.info(f"[update_environment][Entry] Updating {env_id}")
|
||||
|
||||
# 1. Runtime check of @PRE
|
||||
assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"
|
||||
@@ -215,9 +255,11 @@ class ConfigManager:
|
||||
# [DEF:delete_environment:Function]
|
||||
# @PURPOSE: Deletes an environment by ID.
|
||||
# @PRE: isinstance(env_id, str) and len(env_id) > 0
|
||||
# @POST: Environment removed from self.config.environments if it existed.
|
||||
# @PARAM: env_id (str) - The ID of the environment to delete.
|
||||
def delete_environment(self, env_id: str):
|
||||
logger.info(f"[delete_environment][Entry] Deleting {env_id}")
|
||||
with belief_scope("delete_environment"):
|
||||
logger.info(f"[delete_environment][Entry] Deleting {env_id}")
|
||||
|
||||
# 1. Runtime check of @PRE
|
||||
assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"
|
||||
|
||||
@@ -23,6 +23,8 @@ class Environment(BaseModel):
|
||||
url: str
|
||||
username: str
|
||||
password: str # Will be masked in UI
|
||||
verify_ssl: bool = True
|
||||
timeout: int = 30
|
||||
is_default: bool = False
|
||||
backup_schedule: Schedule = Field(default_factory=Schedule)
|
||||
# [/DEF:Environment:DataClass]
|
||||
|
||||
@@ -11,9 +11,12 @@
|
||||
# [SECTION: IMPORTS]
|
||||
from sqlalchemy import create_engine
|
||||
from sqlalchemy.orm import sessionmaker, Session
|
||||
from backend.src.models.mapping import Base
|
||||
# Import TaskRecord to ensure it's registered with Base
|
||||
from backend.src.models.task import TaskRecord
|
||||
from ..models.mapping import Base
|
||||
# Import models to ensure they're registered with Base
|
||||
from ..models.task import TaskRecord
|
||||
from ..models.connection import ConnectionConfig
|
||||
from ..models.git import GitServerConfig, GitRepository, DeploymentEnvironment
|
||||
from .logger import belief_scope
|
||||
import os
|
||||
# [/SECTION]
|
||||
|
||||
@@ -45,33 +48,40 @@ TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_e
|
||||
|
||||
# [DEF:init_db:Function]
|
||||
# @PURPOSE: Initializes the database by creating all tables.
|
||||
# @PRE: engine and tasks_engine are initialized.
|
||||
# @POST: Database tables created.
|
||||
def init_db():
|
||||
Base.metadata.create_all(bind=engine)
|
||||
Base.metadata.create_all(bind=tasks_engine)
|
||||
with belief_scope("init_db"):
|
||||
Base.metadata.create_all(bind=engine)
|
||||
Base.metadata.create_all(bind=tasks_engine)
|
||||
# [/DEF:init_db:Function]
|
||||
|
||||
# [DEF:get_db:Function]
|
||||
# @PURPOSE: Dependency for getting a database session.
|
||||
# @PRE: SessionLocal is initialized.
|
||||
# @POST: Session is closed after use.
|
||||
# @RETURN: Generator[Session, None, None]
|
||||
def get_db():
|
||||
db = SessionLocal()
|
||||
try:
|
||||
yield db
|
||||
finally:
|
||||
db.close()
|
||||
with belief_scope("get_db"):
|
||||
db = SessionLocal()
|
||||
try:
|
||||
yield db
|
||||
finally:
|
||||
db.close()
|
||||
# [/DEF:get_db:Function]
|
||||
|
||||
# [DEF:get_tasks_db:Function]
|
||||
# @PURPOSE: Dependency for getting a tasks database session.
|
||||
# @PRE: TasksSessionLocal is initialized.
|
||||
# @POST: Session is closed after use.
|
||||
# @RETURN: Generator[Session, None, None]
|
||||
def get_tasks_db():
|
||||
db = TasksSessionLocal()
|
||||
try:
|
||||
yield db
|
||||
finally:
|
||||
db.close()
|
||||
with belief_scope("get_tasks_db"):
|
||||
db = TasksSessionLocal()
|
||||
try:
|
||||
yield db
|
||||
finally:
|
||||
db.close()
|
||||
# [/DEF:get_tasks_db:Function]
|
||||
|
||||
# [/DEF:backend.src.core.database:Module]
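
For reference, the yield-style dependencies above follow FastAPI's standard session-per-request pattern: the route handler runs between `yield` and the `finally` block, so the session is always closed even if the handler raises. A minimal consumer sketch; the import paths and the `DatabaseMapping` model are assumptions based on other files in this comparison.

```python
# Hedged sketch of a route consuming the yield-based session dependency.
from fastapi import APIRouter, Depends
from sqlalchemy.orm import Session

from backend.src.core.database import get_db             # assumed module path
from backend.src.models.mapping import DatabaseMapping   # assumed model import

router = APIRouter()

@router.get("/count")
def count_mappings(db: Session = Depends(get_db)):
    # The dependency opens the session before this runs and closes it afterwards.
    return {"count": db.query(DatabaseMapping).count()}
```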
|
||||
|
||||
@@ -22,12 +22,18 @@ _enable_belief_state = True
# [DEF:BeliefFormatter:Class]
# @PURPOSE: Custom logging formatter that adds belief state prefixes to log messages.
class BeliefFormatter(logging.Formatter):
# [DEF:format:Function]
# @PURPOSE: Formats the log record, adding belief state context if available.
# @PRE: record is a logging.LogRecord.
# @POST: Returns formatted string.
# @PARAM: record (logging.LogRecord) - The log record to format.
# @RETURN: str - The formatted log message.
def format(self, record):
msg = super().format(record)
anchor_id = getattr(_belief_state, 'anchor_id', None)
if anchor_id:
msg = f"[{anchor_id}][Action] {msg}"
return msg
record.msg = f"[{anchor_id}][Action] {record.msg}"
return super().format(record)
# [/DEF:format:Function]
# [/DEF:BeliefFormatter:Class]
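
The change in `format` is small but visible in the output: the old version prefixed the already-formatted string, so the `[anchor][Action]` marker landed in front of the timestamp/level header, while the new version rewrites `record.msg` before delegating to `logging.Formatter.format`, keeping the marker inside the message field. Note that mutating `record.msg` also affects any other handler that formats the same record afterwards, which appears intentional here. Roughly, assuming the `'[%(asctime)s][%(levelname)s][%(name)s] %(message)s'` format configured below (timestamp and message text illustrative):

```
old:  [create_mapping][Action] [2025-12-19 10:00:00][INFO][superset_tools_app] Saving mapping
new:  [2025-12-19 10:00:00][INFO][superset_tools_app] [create_mapping][Action] Saving mapping
```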
|
||||
|
||||
# Re-using LogEntry from task_manager for consistency
@@ -42,8 +48,12 @@ class LogEntry(BaseModel):

# [/DEF:LogEntry:Class]

# [DEF:BeliefScope:Function]
# [DEF:belief_scope:Function]
# @PURPOSE: Context manager for structured Belief State logging.
# @PARAM: anchor_id (str) - The identifier for the current semantic block.
# @PARAM: message (str) - Optional entry message.
# @PRE: anchor_id must be provided.
# @POST: Thread-local belief state is updated and entry/exit logs are generated.
@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
# Log Entry if enabled
@@ -71,9 +81,9 @@ def belief_scope(anchor_id: str, message: str = ""):
# Restore old anchor
_belief_state.anchor_id = old_anchor

# [/DEF:BeliefScope:Function]
# [/DEF:belief_scope:Function]
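
Only fragments of `belief_scope` are visible in this hunk. For orientation, a reconstruction sketch of what the context manager appears to do, based on those fragments (thread-local anchor, optional entry/exit logging, restoration of the previous anchor); the exact log wording and the error handling are assumptions:

```python
# Reconstruction sketch, not the repository's implementation.
import logging
import threading
from contextlib import contextmanager

logger = logging.getLogger("superset_tools_app")
_belief_state = threading.local()
_enable_belief_state = True

@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
    old_anchor = getattr(_belief_state, "anchor_id", None)
    _belief_state.anchor_id = anchor_id
    if _enable_belief_state:
        logger.info(f"[{anchor_id}][Entry] {message}")
    try:
        yield
        if _enable_belief_state:
            logger.info(f"[{anchor_id}][Exit]")
    except Exception as e:
        logger.error(f"[{anchor_id}][Coherence:Failed] {e}")
        raise
    finally:
        # Restore old anchor
        _belief_state.anchor_id = old_anchor
```

Used as `with belief_scope("create_mapping"):`, this is the wrapping pattern applied throughout the route, config-manager, and service changes in this comparison.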
|
||||
|
||||
# [DEF:ConfigureLogger:Function]
|
||||
# [DEF:configure_logger:Function]
|
||||
# @PURPOSE: Configures the logger with the provided logging settings.
|
||||
# @PRE: config is a valid LoggingConfig instance.
|
||||
# @POST: Logger level, handlers, and belief state flag are updated.
|
||||
@@ -115,7 +125,7 @@ def configure_logger(config):
|
||||
handler.setFormatter(BeliefFormatter(
|
||||
'[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
|
||||
))
|
||||
# [/DEF:ConfigureLogger:Function]
|
||||
# [/DEF:configure_logger:Function]
|
||||
|
||||
# [DEF:WebSocketLogHandler:Class]
|
||||
# @SEMANTICS: logging, handler, websocket, buffer
|
||||
@@ -125,12 +135,23 @@ class WebSocketLogHandler(logging.Handler):
|
||||
A logging handler that stores log records and can be extended to send them
|
||||
over WebSockets.
|
||||
"""
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the handler with a fixed-capacity buffer.
|
||||
# @PRE: capacity is an integer.
|
||||
# @POST: Instance initialized with empty deque.
|
||||
# @PARAM: capacity (int) - Maximum number of logs to keep in memory.
|
||||
def __init__(self, capacity: int = 1000):
|
||||
super().__init__()
|
||||
self.log_buffer: deque[LogEntry] = deque(maxlen=capacity)
|
||||
# In a real implementation, you'd have a way to manage active WebSocket connections
|
||||
# e.g., self.active_connections: Set[WebSocket] = set()
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:emit:Function]
|
||||
# @PURPOSE: Captures a log record, formats it, and stores it in the buffer.
|
||||
# @PRE: record is a logging.LogRecord.
|
||||
# @POST: Log is added to the log_buffer.
|
||||
# @PARAM: record (logging.LogRecord) - The log record to emit.
|
||||
def emit(self, record: logging.LogRecord):
|
||||
try:
|
||||
log_entry = LogEntry(
|
||||
@@ -151,12 +172,19 @@ class WebSocketLogHandler(logging.Handler):
|
||||
# Example: for ws in self.active_connections: await ws.send_json(log_entry.dict())
|
||||
except Exception:
|
||||
self.handleError(record)
|
||||
# [/DEF:emit:Function]
|
||||
|
||||
# [DEF:get_recent_logs:Function]
|
||||
# @PURPOSE: Returns a list of recent log entries from the buffer.
|
||||
# @PRE: None.
|
||||
# @POST: Returns list of LogEntry objects.
|
||||
# @RETURN: List[LogEntry] - List of buffered log entries.
|
||||
def get_recent_logs(self) -> List[LogEntry]:
|
||||
"""
|
||||
Returns a list of recent log entries from the buffer.
|
||||
"""
|
||||
return list(self.log_buffer)
|
||||
# [/DEF:get_recent_logs:Function]
|
||||
|
||||
# [/DEF:WebSocketLogHandler:Class]
|
||||
|
||||
@@ -164,6 +192,18 @@ class WebSocketLogHandler(logging.Handler):
|
||||
# @SEMANTICS: logger, global, instance
|
||||
# @PURPOSE: The global logger instance for the application, configured with both a console handler and the custom WebSocket handler.
|
||||
logger = logging.getLogger("superset_tools_app")
|
||||
|
||||
# [DEF:believed:Function]
# @PURPOSE: A decorator that wraps a function in a belief scope.
# @PARAM: anchor_id (str) - The identifier for the semantic block.
def believed(anchor_id: str):
    def decorator(func):
        def wrapper(*args, **kwargs):
            with belief_scope(anchor_id):
                return func(*args, **kwargs)
        return wrapper
    return decorator
# [/DEF:believed:Function]
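
A usage sketch for the decorator; it is equivalent to wrapping the function body in `with belief_scope(...)`. Since `wrapper` is synchronous, it suits plain functions; an `async def` wrapped this way would leave the scope before the coroutine actually runs.

```python
# Illustrative use of the believed decorator defined above.
@believed("sync_helper")
def sync_helper(x: int) -> int:
    return x * 2

sync_helper(21)  # logs entry/exit under the "sync_helper" anchor
```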
|
||||
logger.setLevel(logging.INFO)
|
||||
|
||||
# Create a formatter
|
||||
|
||||
@@ -23,12 +23,14 @@ import yaml
|
||||
# @PURPOSE: Engine for transforming Superset export ZIPs.
|
||||
class MigrationEngine:
|
||||
|
||||
# [DEF:MigrationEngine.transform_zip:Function]
|
||||
# [DEF:transform_zip:Function]
|
||||
# @PURPOSE: Extracts ZIP, replaces database UUIDs in YAMLs, and re-packages.
|
||||
# @PARAM: zip_path (str) - Path to the source ZIP file.
|
||||
# @PARAM: output_path (str) - Path where the transformed ZIP will be saved.
|
||||
# @PARAM: db_mapping (Dict[str, str]) - Mapping of source UUID to target UUID.
|
||||
# @PARAM: strip_databases (bool) - Whether to remove the databases directory from the archive.
|
||||
# @PRE: zip_path must point to a valid Superset export archive.
|
||||
# @POST: Transformed archive is saved to output_path.
|
||||
# @RETURN: bool - True if successful.
|
||||
def transform_zip(self, zip_path: str, output_path: str, db_mapping: Dict[str, str], strip_databases: bool = True) -> bool:
|
||||
"""
|
||||
@@ -73,10 +75,14 @@ class MigrationEngine:
|
||||
except Exception as e:
|
||||
logger.error(f"[MigrationEngine.transform_zip][Coherence:Failed] Error transforming ZIP: {e}")
|
||||
return False
|
||||
# [/DEF:MigrationEngine.transform_zip:Function]
|
||||
# [/DEF:transform_zip:Function]
|
||||
|
||||
# [DEF:MigrationEngine._transform_yaml:Function]
|
||||
# [DEF:_transform_yaml:Function]
|
||||
# @PURPOSE: Replaces database_uuid in a single YAML file.
|
||||
# @PARAM: file_path (Path) - Path to the YAML file.
|
||||
# @PARAM: db_mapping (Dict[str, str]) - UUID mapping dictionary.
|
||||
# @PRE: file_path must exist and be readable.
|
||||
# @POST: File is modified in-place if source UUID matches mapping.
|
||||
def _transform_yaml(self, file_path: Path, db_mapping: Dict[str, str]):
|
||||
with open(file_path, 'r') as f:
|
||||
data = yaml.safe_load(f)
|
||||
@@ -91,7 +97,7 @@ class MigrationEngine:
|
||||
data['database_uuid'] = db_mapping[source_uuid]
|
||||
with open(file_path, 'w') as f:
|
||||
yaml.dump(data, f)
|
||||
# [/DEF:MigrationEngine._transform_yaml:Function]
|
||||
# [/DEF:_transform_yaml:Function]
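
To make the engine's contract concrete, a hedged usage sketch of `transform_zip`; the import path and the UUIDs are placeholders, and in the application the mapping would come from the `DatabaseMapping` records handled earlier in this comparison.

```python
# Hedged usage sketch only; module path is an assumption.
from backend.src.core.migration_engine import MigrationEngine

engine = MigrationEngine()
ok = engine.transform_zip(
    zip_path="exports/dashboard_source.zip",
    output_path="exports/dashboard_target.zip",
    db_mapping={"11111111-1111-1111-1111-111111111111": "22222222-2222-2222-2222-222222222222"},
    strip_databases=True,  # drop databases/ so the target keeps its own connections
)
```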
|
||||
|
||||
# [/DEF:MigrationEngine:Class]
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import Dict, Any
|
||||
from .logger import belief_scope
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
@@ -17,43 +18,86 @@ class PluginBase(ABC):
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
# [DEF:id:Function]
|
||||
# @PURPOSE: Returns the unique identifier for the plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string ID.
|
||||
# @RETURN: str - Plugin ID.
|
||||
def id(self) -> str:
|
||||
"""A unique identifier for the plugin."""
|
||||
pass
|
||||
with belief_scope("id"):
|
||||
pass
|
||||
# [/DEF:id:Function]
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
# [DEF:name:Function]
|
||||
# @PURPOSE: Returns the human-readable name of the plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string name.
|
||||
# @RETURN: str - Plugin name.
|
||||
def name(self) -> str:
|
||||
"""A human-readable name for the plugin."""
|
||||
pass
|
||||
with belief_scope("name"):
|
||||
pass
|
||||
# [/DEF:name:Function]
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
# [DEF:description:Function]
|
||||
# @PURPOSE: Returns a brief description of the plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string description.
|
||||
# @RETURN: str - Plugin description.
|
||||
def description(self) -> str:
|
||||
"""A brief description of what the plugin does."""
|
||||
pass
|
||||
with belief_scope("description"):
|
||||
pass
|
||||
# [/DEF:description:Function]
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
# [DEF:version:Function]
|
||||
# @PURPOSE: Returns the version of the plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string version.
|
||||
# @RETURN: str - Plugin version.
|
||||
def version(self) -> str:
|
||||
"""The version of the plugin."""
|
||||
pass
|
||||
with belief_scope("version"):
|
||||
pass
|
||||
# [/DEF:version:Function]
|
||||
|
||||
@abstractmethod
|
||||
# [DEF:get_schema:Function]
|
||||
# @PURPOSE: Returns the JSON schema for the plugin's input parameters.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns dict schema.
|
||||
# @RETURN: Dict[str, Any] - JSON schema.
|
||||
def get_schema(self) -> Dict[str, Any]:
|
||||
"""
|
||||
Returns the JSON schema for the plugin's input parameters.
|
||||
This schema will be used to generate the frontend form.
|
||||
"""
|
||||
pass
|
||||
with belief_scope("get_schema"):
|
||||
pass
|
||||
# [/DEF:get_schema:Function]
|
||||
|
||||
@abstractmethod
|
||||
# [DEF:execute:Function]
|
||||
# @PURPOSE: Executes the plugin's core logic.
|
||||
# @PARAM: params (Dict[str, Any]) - Validated input parameters.
|
||||
# @PRE: params must be a dictionary.
|
||||
# @POST: Plugin execution is completed.
|
||||
async def execute(self, params: Dict[str, Any]):
|
||||
with belief_scope("execute"):
|
||||
pass
|
||||
"""
|
||||
Executes the plugin's logic.
|
||||
The `params` argument will be validated against the schema returned by `get_schema()`.
|
||||
"""
|
||||
pass
|
||||
# [/DEF:execute:Function]
|
||||
# [/DEF:PluginBase:Class]
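
A minimal concrete plugin against this interface, to illustrate what the loader below expects to discover; the schema and behaviour are illustrative only, and the import paths assume the module layout used elsewhere in this comparison.

```python
# Illustrative plugin implementation, not from the repository.
from typing import Any, Dict

from backend.src.core.plugin_base import PluginBase   # assumed path
from backend.src.core.logger import belief_scope      # assumed path

class EchoPlugin(PluginBase):
    @property
    def id(self) -> str:
        return "echo"

    @property
    def name(self) -> str:
        return "Echo"

    @property
    def description(self) -> str:
        return "Logs the parameters it receives."

    @property
    def version(self) -> str:
        return "0.1.0"

    def get_schema(self) -> Dict[str, Any]:
        return {"type": "object", "properties": {"message": {"type": "string"}}}

    async def execute(self, params: Dict[str, Any]):
        with belief_scope("EchoPlugin.execute"):
            print(params.get("message", ""))
```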
|
||||
|
||||
# [DEF:PluginConfig:Class]
|
||||
|
||||
@@ -4,6 +4,7 @@ import sys # Added this line
|
||||
from typing import Dict, Type, List, Optional
|
||||
from .plugin_base import PluginBase, PluginConfig
|
||||
from jsonschema import validate
|
||||
from .logger import belief_scope
|
||||
|
||||
# [DEF:PluginLoader:Class]
|
||||
# @SEMANTICS: plugin, loader, dynamic, import
|
||||
@@ -16,22 +17,28 @@ class PluginLoader:
|
||||
that inherit from PluginBase.
|
||||
"""
|
||||
|
||||
# [DEF:PluginLoader.__init__:Function]
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the PluginLoader with a directory to scan.
|
||||
# @PRE: plugin_dir is a valid directory path.
|
||||
# @POST: Plugins are loaded and registered.
|
||||
# @PARAM: plugin_dir (str) - The directory containing plugin modules.
|
||||
def __init__(self, plugin_dir: str):
|
||||
self.plugin_dir = plugin_dir
|
||||
self._plugins: Dict[str, PluginBase] = {}
|
||||
self._plugin_configs: Dict[str, PluginConfig] = {}
|
||||
self._load_plugins()
|
||||
# [/DEF:PluginLoader.__init__:Function]
|
||||
with belief_scope("__init__"):
|
||||
self.plugin_dir = plugin_dir
|
||||
self._plugins: Dict[str, PluginBase] = {}
|
||||
self._plugin_configs: Dict[str, PluginConfig] = {}
|
||||
self._load_plugins()
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:PluginLoader._load_plugins:Function]
|
||||
# [DEF:_load_plugins:Function]
|
||||
# @PURPOSE: Scans the plugin directory and loads all valid plugins.
|
||||
# @PRE: plugin_dir exists or can be created.
|
||||
# @POST: _load_module is called for each .py file.
|
||||
def _load_plugins(self):
|
||||
"""
|
||||
Scans the plugin directory, imports modules, and registers valid plugins.
|
||||
"""
|
||||
with belief_scope("_load_plugins"):
|
||||
"""
|
||||
Scans the plugin directory, imports modules, and registers valid plugins.
|
||||
"""
|
||||
if not os.path.exists(self.plugin_dir):
|
||||
os.makedirs(self.plugin_dir)
|
||||
|
||||
@@ -47,16 +54,19 @@ class PluginLoader:
|
||||
module_name = filename[:-3]
|
||||
file_path = os.path.join(self.plugin_dir, filename)
|
||||
self._load_module(module_name, file_path)
|
||||
# [/DEF:PluginLoader._load_plugins:Function]
|
||||
# [/DEF:_load_plugins:Function]
|
||||
|
||||
# [DEF:PluginLoader._load_module:Function]
|
||||
# [DEF:_load_module:Function]
|
||||
# @PURPOSE: Loads a single Python module and discovers PluginBase implementations.
|
||||
# @PRE: module_name and file_path are valid.
|
||||
# @POST: Plugin classes are instantiated and registered.
|
||||
# @PARAM: module_name (str) - The name of the module.
|
||||
# @PARAM: file_path (str) - The path to the module file.
|
||||
def _load_module(self, module_name: str, file_path: str):
|
||||
"""
|
||||
Loads a single Python module and extracts PluginBase subclasses.
|
||||
"""
|
||||
with belief_scope("_load_module"):
|
||||
"""
|
||||
Loads a single Python module and extracts PluginBase subclasses.
|
||||
"""
|
||||
# Try to determine the correct package prefix based on how the app is running
|
||||
# For standalone execution, we need to handle the import differently
|
||||
if __name__ == "__main__" or "test" in __name__:
|
||||
@@ -94,15 +104,18 @@ class PluginLoader:
|
||||
self._register_plugin(plugin_instance)
|
||||
except Exception as e:
|
||||
print(f"Error instantiating plugin {attribute_name} in {module_name}: {e}") # Replace with proper logging
|
||||
# [/DEF:PluginLoader._load_module:Function]
|
||||
# [/DEF:_load_module:Function]
|
||||
|
||||
# [DEF:PluginLoader._register_plugin:Function]
|
||||
# [DEF:_register_plugin:Function]
|
||||
# @PURPOSE: Registers a PluginBase instance and its configuration.
|
||||
# @PRE: plugin_instance is a valid implementation of PluginBase.
|
||||
# @POST: Plugin is added to _plugins and _plugin_configs.
|
||||
# @PARAM: plugin_instance (PluginBase) - The plugin instance to register.
|
||||
def _register_plugin(self, plugin_instance: PluginBase):
|
||||
"""
|
||||
Registers a valid plugin instance.
|
||||
"""
|
||||
with belief_scope("_register_plugin"):
|
||||
"""
|
||||
Registers a valid plugin instance.
|
||||
"""
|
||||
plugin_id = plugin_instance.id
|
||||
if plugin_id in self._plugins:
|
||||
print(f"Warning: Duplicate plugin ID '{plugin_id}' found. Skipping.") # Replace with proper logging
|
||||
@@ -131,39 +144,48 @@ class PluginLoader:
|
||||
except Exception as e:
|
||||
from ..core.logger import logger
|
||||
logger.error(f"Error validating plugin '{plugin_instance.name}' (ID: {plugin_id}): {e}")
|
||||
# [/DEF:PluginLoader._register_plugin:Function]
|
||||
# [/DEF:_register_plugin:Function]
|
||||
|
||||
|
||||
# [DEF:PluginLoader.get_plugin:Function]
|
||||
# [DEF:get_plugin:Function]
|
||||
# @PURPOSE: Retrieves a loaded plugin instance by its ID.
|
||||
# @PRE: plugin_id is a string.
|
||||
# @POST: Returns plugin instance or None.
|
||||
# @PARAM: plugin_id (str) - The unique identifier of the plugin.
|
||||
# @RETURN: Optional[PluginBase] - The plugin instance if found, otherwise None.
|
||||
def get_plugin(self, plugin_id: str) -> Optional[PluginBase]:
|
||||
"""
|
||||
Returns a loaded plugin instance by its ID.
|
||||
"""
|
||||
with belief_scope("get_plugin"):
|
||||
"""
|
||||
Returns a loaded plugin instance by its ID.
|
||||
"""
|
||||
return self._plugins.get(plugin_id)
|
||||
# [/DEF:PluginLoader.get_plugin:Function]
|
||||
# [/DEF:get_plugin:Function]
|
||||
|
||||
# [DEF:PluginLoader.get_all_plugin_configs:Function]
|
||||
# [DEF:get_all_plugin_configs:Function]
|
||||
# @PURPOSE: Returns a list of all registered plugin configurations.
|
||||
# @PRE: None.
|
||||
# @POST: Returns list of all PluginConfig objects.
|
||||
# @RETURN: List[PluginConfig] - A list of plugin configurations.
|
||||
def get_all_plugin_configs(self) -> List[PluginConfig]:
|
||||
"""
|
||||
Returns a list of all loaded plugin configurations.
|
||||
"""
|
||||
with belief_scope("get_all_plugin_configs"):
|
||||
"""
|
||||
Returns a list of all loaded plugin configurations.
|
||||
"""
|
||||
return list(self._plugin_configs.values())
|
||||
# [/DEF:PluginLoader.get_all_plugin_configs:Function]
|
||||
# [/DEF:get_all_plugin_configs:Function]
|
||||
|
||||
# [DEF:PluginLoader.has_plugin:Function]
|
||||
# [DEF:has_plugin:Function]
|
||||
# @PURPOSE: Checks if a plugin with the given ID is registered.
|
||||
# @PRE: plugin_id is a string.
|
||||
# @POST: Returns True if plugin exists.
|
||||
# @PARAM: plugin_id (str) - The unique identifier of the plugin.
|
||||
# @RETURN: bool - True if the plugin is registered, False otherwise.
|
||||
def has_plugin(self, plugin_id: str) -> bool:
|
||||
"""
|
||||
Checks if a plugin with the given ID is loaded.
|
||||
"""
|
||||
with belief_scope("has_plugin"):
|
||||
"""
|
||||
Checks if a plugin with the given ID is loaded.
|
||||
"""
|
||||
return plugin_id in self._plugins
|
||||
# [/DEF:PluginLoader.has_plugin:Function]
|
||||
# [/DEF:has_plugin:Function]
|
||||
|
||||
# [/DEF:PluginLoader:Class]
|
||||
@@ -17,34 +17,45 @@ import asyncio
|
||||
# @SEMANTICS: scheduler, service, apscheduler
|
||||
# @PURPOSE: Provides a service to manage scheduled backup tasks.
|
||||
class SchedulerService:
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the scheduler service with task and config managers.
|
||||
# @PRE: task_manager and config_manager must be provided.
|
||||
# @POST: Scheduler instance is created but not started.
|
||||
def __init__(self, task_manager, config_manager: ConfigManager):
|
||||
with belief_scope("SchedulerService.__init__"):
|
||||
self.task_manager = task_manager
|
||||
self.config_manager = config_manager
|
||||
self.scheduler = BackgroundScheduler()
|
||||
self.loop = asyncio.get_event_loop()
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:SchedulerService.start:Function]
|
||||
# [DEF:start:Function]
|
||||
# @PURPOSE: Starts the background scheduler and loads initial schedules.
|
||||
# @PRE: Scheduler should be initialized.
|
||||
# @POST: Scheduler is running and schedules are loaded.
|
||||
def start(self):
|
||||
with belief_scope("SchedulerService.start"):
|
||||
if not self.scheduler.running:
|
||||
self.scheduler.start()
|
||||
logger.info("Scheduler started.")
|
||||
self.load_schedules()
|
||||
# [/DEF:SchedulerService.start:Function]
|
||||
# [/DEF:start:Function]
|
||||
|
||||
# [DEF:SchedulerService.stop:Function]
|
||||
# [DEF:stop:Function]
|
||||
# @PURPOSE: Stops the background scheduler.
|
||||
# @PRE: Scheduler should be running.
|
||||
# @POST: Scheduler is shut down.
|
||||
def stop(self):
|
||||
with belief_scope("SchedulerService.stop"):
|
||||
if self.scheduler.running:
|
||||
self.scheduler.shutdown()
|
||||
logger.info("Scheduler stopped.")
|
||||
# [/DEF:SchedulerService.stop:Function]
|
||||
# [/DEF:stop:Function]
|
||||
|
||||
# [DEF:SchedulerService.load_schedules:Function]
|
||||
# [DEF:load_schedules:Function]
|
||||
# @PURPOSE: Loads backup schedules from configuration and registers them.
|
||||
# @PRE: config_manager must have valid configuration.
|
||||
# @POST: All enabled backup jobs are added to the scheduler.
|
||||
def load_schedules(self):
|
||||
with belief_scope("SchedulerService.load_schedules"):
|
||||
# Clear existing jobs
|
||||
@@ -54,12 +65,14 @@ class SchedulerService:
|
||||
for env in config.environments:
|
||||
if env.backup_schedule and env.backup_schedule.enabled:
|
||||
self.add_backup_job(env.id, env.backup_schedule.cron_expression)
|
||||
# [/DEF:SchedulerService.load_schedules:Function]
|
||||
# [/DEF:load_schedules:Function]
|
||||
|
||||
# [DEF:SchedulerService.add_backup_job:Function]
|
||||
# [DEF:add_backup_job:Function]
|
||||
# @PURPOSE: Adds a scheduled backup job for an environment.
|
||||
# @PARAM: env_id (str) - The ID of the environment.
|
||||
# @PARAM: cron_expression (str) - The cron expression for the schedule.
|
||||
# @PRE: env_id and cron_expression must be valid strings.
|
||||
# @POST: A new job is added to the scheduler or replaced if it already exists.
|
||||
# @PARAM: env_id (str) - The ID of the environment.
|
||||
# @PARAM: cron_expression (str) - The cron expression for the schedule.
|
||||
def add_backup_job(self, env_id: str, cron_expression: str):
|
||||
with belief_scope("SchedulerService.add_backup_job", f"env_id={env_id}, cron={cron_expression}"):
|
||||
job_id = f"backup_{env_id}"
|
||||
@@ -74,11 +87,13 @@ class SchedulerService:
|
||||
logger.info(f"Scheduled backup job added for environment {env_id}: {cron_expression}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to add backup job for environment {env_id}: {e}")
|
||||
# [/DEF:SchedulerService.add_backup_job:Function]
|
||||
# [/DEF:add_backup_job:Function]
|
||||
|
||||
# [DEF:SchedulerService._trigger_backup:Function]
|
||||
# [DEF:_trigger_backup:Function]
|
||||
# @PURPOSE: Triggered by the scheduler to start a backup task.
|
||||
# @PARAM: env_id (str) - The ID of the environment.
|
||||
# @PRE: env_id must be a valid environment ID.
|
||||
# @POST: A new backup task is created in the task manager if not already running.
|
||||
# @PARAM: env_id (str) - The ID of the environment.
|
||||
def _trigger_backup(self, env_id: str):
|
||||
with belief_scope("SchedulerService._trigger_backup", f"env_id={env_id}"):
|
||||
logger.info(f"Triggering scheduled backup for environment {env_id}")
|
||||
@@ -98,7 +113,7 @@ class SchedulerService:
|
||||
self.task_manager.create_task("superset-backup", {"environment_id": env_id}),
|
||||
self.loop
|
||||
)
|
||||
# [/DEF:SchedulerService._trigger_backup:Function]
|
||||
# [/DEF:_trigger_backup:Function]
|
||||
|
||||
# [/DEF:SchedulerService:Class]
|
||||
# [/DEF:SchedulerModule:Module]
|
||||
@@ -1,82 +1,399 @@
|
||||
# [DEF:backend.src.core.superset_client:Module]
|
||||
#
|
||||
# @SEMANTICS: superset, api, client, database, metadata
|
||||
# @PURPOSE: Extends the base SupersetClient with database-specific metadata fetching.
|
||||
# @SEMANTICS: superset, api, client, rest, http, dashboard, dataset, import, export
|
||||
# @PURPOSE: Предоставляет высокоуровневый клиент для взаимодействия с Superset REST API, инкапсулируя логику запросов, обработку ошибок и пагинацию.
|
||||
# @LAYER: Core
|
||||
# @RELATION: INHERITS_FROM -> superset_tool.client.SupersetClient
|
||||
# @RELATION: USES -> backend.src.core.utils.network.APIClient
|
||||
# @RELATION: USES -> backend.src.core.config_models.Environment
|
||||
#
|
||||
# @INVARIANT: All database metadata requests must include UUID and name.
|
||||
# @INVARIANT: All network operations must use the internal APIClient instance.
|
||||
# @PUBLIC_API: SupersetClient
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import List, Dict, Optional, Tuple
|
||||
from superset_tool.client import SupersetClient as BaseSupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
import json
|
||||
import zipfile
|
||||
from pathlib import Path
|
||||
from typing import Any, Dict, List, Optional, Tuple, Union, cast
|
||||
from requests import Response
|
||||
from .logger import logger as app_logger, belief_scope
|
||||
from .utils.network import APIClient, SupersetAPIError, AuthenticationError, DashboardNotFoundError, NetworkError
|
||||
from .utils.fileio import get_filename_from_headers
|
||||
from .config_models import Environment
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:SupersetClient:Class]
|
||||
# @PURPOSE: Extended SupersetClient for migration-specific operations.
|
||||
class SupersetClient(BaseSupersetClient):
|
||||
|
||||
# [DEF:SupersetClient.get_databases_summary:Function]
|
||||
# @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
|
||||
# @POST: Returns a list of database dictionaries with 'engine' field.
|
||||
# @RETURN: List[Dict] - Summary of databases.
|
||||
def get_databases_summary(self) -> List[Dict]:
|
||||
"""
|
||||
Fetch a summary of databases including uuid, name, and engine.
|
||||
"""
|
||||
query = {
|
||||
"columns": ["uuid", "database_name", "backend"]
|
||||
# @PURPOSE: Класс-обёртка над Superset REST API, предоставляющий методы для работы с дашбордами и датасетами.
|
||||
class SupersetClient:
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Инициализирует клиент, проверяет конфигурацию и создает сетевой клиент.
|
||||
# @PRE: `env` должен быть валидным объектом Environment.
|
||||
# @POST: Атрибуты `env` и `network` созданы и готовы к работе.
|
||||
# @PARAM: env (Environment) - Конфигурация окружения.
|
||||
def __init__(self, env: Environment):
|
||||
with belief_scope("__init__"):
|
||||
app_logger.info("[SupersetClient.__init__][Enter] Initializing SupersetClient for env %s.", env.name)
|
||||
self.env = env
|
||||
# Construct auth payload expected by Superset API
|
||||
auth_payload = {
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"provider": "db",
|
||||
"refresh": "true"
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
|
||||
# Map 'backend' to 'engine' for consistency with contracts
|
||||
for db in databases:
|
||||
db['engine'] = db.pop('backend', None)
|
||||
self.network = APIClient(
|
||||
config={
|
||||
"base_url": env.url,
|
||||
"auth": auth_payload
|
||||
},
|
||||
verify_ssl=env.verify_ssl,
|
||||
timeout=env.timeout
|
||||
)
|
||||
self.delete_before_reimport: bool = False
|
||||
app_logger.info("[SupersetClient.__init__][Exit] SupersetClient initialized.")
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:authenticate:Function]
|
||||
# @PURPOSE: Authenticates the client using the configured credentials.
|
||||
# @PRE: self.network must be initialized with valid auth configuration.
|
||||
# @POST: Client is authenticated and tokens are stored.
|
||||
# @RETURN: Dict[str, str] - Authentication tokens.
|
||||
def authenticate(self) -> Dict[str, str]:
|
||||
with belief_scope("SupersetClient.authenticate"):
|
||||
return self.network.authenticate()
|
||||
# [/DEF:authenticate:Function]
|
||||
|
||||
@property
|
||||
# [DEF:headers:Function]
|
||||
# @PURPOSE: Возвращает базовые HTTP-заголовки, используемые сетевым клиентом.
|
||||
def headers(self) -> dict:
|
||||
with belief_scope("headers"):
|
||||
return self.network.headers
|
||||
# [/DEF:headers:Function]
|
||||
|
||||
# [SECTION: DASHBOARD OPERATIONS]
|
||||
|
||||
# [DEF:get_dashboards:Function]
|
||||
# @PURPOSE: Получает полный список дашбордов, автоматически обрабатывая пагинацию.
|
||||
# @PARAM: query (Optional[Dict]) - Дополнительные параметры запроса для API.
|
||||
# @RETURN: Tuple[int, List[Dict]] - Кортеж (общее количество, список дашбордов).
|
||||
def get_dashboards(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
|
||||
with belief_scope("get_dashboards"):
|
||||
app_logger.info("[get_dashboards][Enter] Fetching dashboards.")
|
||||
validated_query = self._validate_query_params(query or {})
|
||||
if 'columns' not in validated_query:
|
||||
validated_query['columns'] = ["slug", "id", "changed_on_utc", "dashboard_title", "published"]
|
||||
|
||||
return databases
|
||||
# [/DEF:SupersetClient.get_databases_summary:Function]
|
||||
total_count = self._fetch_total_object_count(endpoint="/dashboard/")
|
||||
paginated_data = self._fetch_all_pages(
|
||||
endpoint="/dashboard/",
|
||||
pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
|
||||
)
|
||||
app_logger.info("[get_dashboards][Exit] Found %d dashboards.", total_count)
|
||||
return total_count, paginated_data
|
||||
# [/DEF:get_dashboards:Function]
|
||||
|
||||
# [DEF:SupersetClient.get_database_by_uuid:Function]
|
||||
# @PURPOSE: Find a database by its UUID.
|
||||
# @PARAM: db_uuid (str) - The UUID of the database.
|
||||
# @RETURN: Optional[Dict] - Database info if found, else None.
|
||||
def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
|
||||
"""
|
||||
Find a database by its UUID.
|
||||
"""
|
||||
query = {
|
||||
"filters": [{"col": "uuid", "op": "eq", "value": db_uuid}]
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
return databases[0] if databases else None
|
||||
# [/DEF:SupersetClient.get_database_by_uuid:Function]
|
||||
|
||||
# [DEF:SupersetClient.get_dashboards_summary:Function]
|
||||
# [DEF:get_dashboards_summary:Function]
|
||||
# @PURPOSE: Fetches dashboard metadata optimized for the grid.
|
||||
# @POST: Returns a list of dashboard dictionaries.
|
||||
# @RETURN: List[Dict]
|
||||
# @RETURN: List[Dict]
|
||||
def get_dashboards_summary(self) -> List[Dict]:
|
||||
"""
|
||||
Fetches dashboard metadata optimized for the grid.
|
||||
Returns a list of dictionaries mapped to DashboardMetadata fields.
|
||||
"""
|
||||
query = {
|
||||
"columns": ["id", "dashboard_title", "changed_on_utc", "published"]
|
||||
}
|
||||
_, dashboards = self.get_dashboards(query=query)
|
||||
with belief_scope("SupersetClient.get_dashboards_summary"):
|
||||
query = {
|
||||
"columns": ["id", "dashboard_title", "changed_on_utc", "published"]
|
||||
}
|
||||
_, dashboards = self.get_dashboards(query=query)
|
||||
|
||||
# Map fields to DashboardMetadata schema
|
||||
result = []
|
||||
for dash in dashboards:
|
||||
result.append({
|
||||
"id": dash.get("id"),
|
||||
"title": dash.get("dashboard_title"),
|
||||
"last_modified": dash.get("changed_on_utc"),
|
||||
"status": "published" if dash.get("published") else "draft"
|
||||
})
|
||||
return result
|
||||
# [/DEF:SupersetClient.get_dashboards_summary:Function]
|
||||
# Map fields to DashboardMetadata schema
|
||||
result = []
|
||||
for dash in dashboards:
|
||||
result.append({
|
||||
"id": dash.get("id"),
|
||||
"title": dash.get("dashboard_title"),
|
||||
"last_modified": dash.get("changed_on_utc"),
|
||||
"status": "published" if dash.get("published") else "draft"
|
||||
})
|
||||
return result
|
||||
# [/DEF:get_dashboards_summary:Function]
|
||||
|
||||
# [DEF:export_dashboard:Function]
|
||||
# @PURPOSE: Экспортирует дашборд в виде ZIP-архива.
|
||||
# @PARAM: dashboard_id (int) - ID дашборда для экспорта.
|
||||
# @RETURN: Tuple[bytes, str] - Бинарное содержимое ZIP-архива и имя файла.
|
||||
def export_dashboard(self, dashboard_id: int) -> Tuple[bytes, str]:
|
||||
with belief_scope("export_dashboard"):
|
||||
app_logger.info("[export_dashboard][Enter] Exporting dashboard %s.", dashboard_id)
|
||||
response = self.network.request(
|
||||
method="GET",
|
||||
endpoint="/dashboard/export/",
|
||||
params={"q": json.dumps([dashboard_id])},
|
||||
stream=True,
|
||||
raw_response=True,
|
||||
)
|
||||
response = cast(Response, response)
|
||||
self._validate_export_response(response, dashboard_id)
|
||||
filename = self._resolve_export_filename(response, dashboard_id)
|
||||
app_logger.info("[export_dashboard][Exit] Exported dashboard %s to %s.", dashboard_id, filename)
|
||||
return response.content, filename
|
||||
# [/DEF:export_dashboard:Function]
|
||||
|
||||
# [DEF:import_dashboard:Function]
|
||||
# @PURPOSE: Импортирует дашборд из ZIP-файла.
|
||||
# @PARAM: file_name (Union[str, Path]) - Путь к ZIP-архиву.
|
||||
# @PARAM: dash_id (Optional[int]) - ID дашборда для удаления при сбое.
|
||||
# @PARAM: dash_slug (Optional[str]) - Slug дашборда для поиска ID.
|
||||
# @RETURN: Dict - Ответ API в случае успеха.
|
||||
def import_dashboard(self, file_name: Union[str, Path], dash_id: Optional[int] = None, dash_slug: Optional[str] = None) -> Dict:
|
||||
with belief_scope("import_dashboard"):
|
||||
file_path = str(file_name)
|
||||
self._validate_import_file(file_path)
|
||||
try:
|
||||
return self._do_import(file_path)
|
||||
except Exception as exc:
|
||||
app_logger.error("[import_dashboard][Failure] First import attempt failed: %s", exc, exc_info=True)
|
||||
if not self.delete_before_reimport:
|
||||
raise
|
||||
|
||||
target_id = self._resolve_target_id_for_delete(dash_id, dash_slug)
|
||||
if target_id is None:
|
||||
app_logger.error("[import_dashboard][Failure] No ID available for delete-retry.")
|
||||
raise
|
||||
|
||||
self.delete_dashboard(target_id)
|
||||
app_logger.info("[import_dashboard][State] Deleted dashboard ID %s, retrying import.", target_id)
|
||||
return self._do_import(file_path)
|
||||
# [/DEF:import_dashboard:Function]
|
||||
|
||||
# [DEF:delete_dashboard:Function]
|
||||
# @PURPOSE: Удаляет дашборд по его ID или slug.
|
||||
# @PARAM: dashboard_id (Union[int, str]) - ID или slug дашборда.
|
||||
def delete_dashboard(self, dashboard_id: Union[int, str]) -> None:
|
||||
with belief_scope("delete_dashboard"):
|
||||
app_logger.info("[delete_dashboard][Enter] Deleting dashboard %s.", dashboard_id)
|
||||
response = self.network.request(method="DELETE", endpoint=f"/dashboard/{dashboard_id}")
|
||||
response = cast(Dict, response)
|
||||
if response.get("result", True) is not False:
|
||||
app_logger.info("[delete_dashboard][Success] Dashboard %s deleted.", dashboard_id)
|
||||
else:
|
||||
app_logger.warning("[delete_dashboard][Warning] Unexpected response while deleting %s: %s", dashboard_id, response)
|
||||
# [/DEF:delete_dashboard:Function]
|
||||
|
||||
# [/SECTION]
|
||||
|
||||
# [SECTION: DATASET OPERATIONS]
|
||||
|
||||
# [DEF:get_datasets:Function]
|
||||
# @PURPOSE: Получает полный список датасетов, автоматически обрабатывая пагинацию.
|
||||
# @PARAM: query (Optional[Dict]) - Дополнительные параметры запроса.
|
||||
# @RETURN: Tuple[int, List[Dict]] - Кортеж (общее количество, список датасетов).
|
||||
def get_datasets(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
|
||||
with belief_scope("get_datasets"):
|
||||
app_logger.info("[get_datasets][Enter] Fetching datasets.")
|
||||
validated_query = self._validate_query_params(query)
|
||||
|
||||
total_count = self._fetch_total_object_count(endpoint="/dataset/")
|
||||
paginated_data = self._fetch_all_pages(
|
||||
endpoint="/dataset/",
|
||||
pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
|
||||
)
|
||||
app_logger.info("[get_datasets][Exit] Found %d datasets.", total_count)
|
||||
return total_count, paginated_data
|
||||
# [/DEF:get_datasets:Function]
|
||||
|
||||
# [DEF:get_dataset:Function]
|
||||
# @PURPOSE: Получает информацию о конкретном датасете по его ID.
|
||||
# @PARAM: dataset_id (int) - ID датасета.
|
||||
# @RETURN: Dict - Информация о датасете.
|
||||
def get_dataset(self, dataset_id: int) -> Dict:
|
||||
with belief_scope("SupersetClient.get_dataset", f"id={dataset_id}"):
|
||||
app_logger.info("[get_dataset][Enter] Fetching dataset %s.", dataset_id)
|
||||
response = self.network.request(method="GET", endpoint=f"/dataset/{dataset_id}")
|
||||
response = cast(Dict, response)
|
||||
app_logger.info("[get_dataset][Exit] Got dataset %s.", dataset_id)
|
||||
return response
|
||||
# [/DEF:get_dataset:Function]
|
||||
|
||||
# [DEF:update_dataset:Function]
|
||||
# @PURPOSE: Обновляет данные датасета по его ID.
|
||||
# @PARAM: dataset_id (int) - ID датасета.
|
||||
# @PARAM: data (Dict) - Данные для обновления.
|
||||
# @RETURN: Dict - Ответ API.
|
||||
def update_dataset(self, dataset_id: int, data: Dict) -> Dict:
|
||||
with belief_scope("SupersetClient.update_dataset", f"id={dataset_id}"):
|
||||
app_logger.info("[update_dataset][Enter] Updating dataset %s.", dataset_id)
|
||||
response = self.network.request(
|
||||
method="PUT",
|
||||
endpoint=f"/dataset/{dataset_id}",
|
||||
data=json.dumps(data),
|
||||
headers={'Content-Type': 'application/json'}
|
||||
)
|
||||
response = cast(Dict, response)
|
||||
app_logger.info("[update_dataset][Exit] Updated dataset %s.", dataset_id)
|
||||
return response
|
||||
# [/DEF:update_dataset:Function]
|
||||
|
||||
# [/SECTION]
|
||||
|
||||
# [SECTION: DATABASE OPERATIONS]
|
||||
|
||||
# [DEF:get_databases:Function]
|
||||
# @PURPOSE: Получает полный список баз данных.
|
||||
# @PARAM: query (Optional[Dict]) - Дополнительные параметры запроса.
|
||||
# @RETURN: Tuple[int, List[Dict]] - Кортеж (общее количество, список баз данных).
|
||||
def get_databases(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
|
||||
with belief_scope("get_databases"):
|
||||
app_logger.info("[get_databases][Enter] Fetching databases.")
|
||||
validated_query = self._validate_query_params(query or {})
|
||||
if 'columns' not in validated_query:
|
||||
validated_query['columns'] = []
|
||||
total_count = self._fetch_total_object_count(endpoint="/database/")
|
||||
paginated_data = self._fetch_all_pages(
|
||||
endpoint="/database/",
|
||||
pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
|
||||
)
|
||||
app_logger.info("[get_databases][Exit] Found %d databases.", total_count)
|
||||
return total_count, paginated_data
|
||||
# [/DEF:get_databases:Function]
|
||||
|
||||
# [DEF:get_database:Function]
|
||||
# @PURPOSE: Получает информацию о конкретной базе данных по её ID.
|
||||
# @PARAM: database_id (int) - ID базы данных.
|
||||
# @RETURN: Dict - Информация о базе данных.
|
||||
def get_database(self, database_id: int) -> Dict:
|
||||
with belief_scope("get_database"):
|
||||
app_logger.info("[get_database][Enter] Fetching database %s.", database_id)
|
||||
response = self.network.request(method="GET", endpoint=f"/database/{database_id}")
|
||||
response = cast(Dict, response)
|
||||
app_logger.info("[get_database][Exit] Got database %s.", database_id)
|
||||
return response
|
||||
# [/DEF:get_database:Function]
|
||||
|
||||
# [DEF:get_databases_summary:Function]
|
||||
# @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
|
||||
# @RETURN: List[Dict] - Summary of databases.
|
||||
def get_databases_summary(self) -> List[Dict]:
|
||||
with belief_scope("SupersetClient.get_databases_summary"):
|
||||
query = {
|
||||
"columns": ["uuid", "database_name", "backend"]
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
|
||||
# Map 'backend' to 'engine' for consistency with contracts
|
||||
for db in databases:
|
||||
db['engine'] = db.pop('backend', None)
|
||||
|
||||
return databases
|
||||
# [/DEF:get_databases_summary:Function]
|
||||
|
||||
# [DEF:get_database_by_uuid:Function]
|
||||
# @PURPOSE: Find a database by its UUID.
|
||||
# @PARAM: db_uuid (str) - The UUID of the database.
|
||||
# @RETURN: Optional[Dict] - Database info if found, else None.
|
||||
def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
|
||||
with belief_scope("SupersetClient.get_database_by_uuid", f"uuid={db_uuid}"):
|
||||
query = {
|
||||
"filters": [{"col": "uuid", "op": "eq", "value": db_uuid}]
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
return databases[0] if databases else None
|
||||
# [/DEF:get_database_by_uuid:Function]
|
||||
|
||||
# [/SECTION]
|
||||
|
||||
# [SECTION: HELPERS]
|
||||
|
||||
# [DEF:_resolve_target_id_for_delete:Function]
|
||||
def _resolve_target_id_for_delete(self, dash_id: Optional[int], dash_slug: Optional[str]) -> Optional[int]:
|
||||
with belief_scope("_resolve_target_id_for_delete"):
|
||||
if dash_id is not None:
|
||||
return dash_id
|
||||
if dash_slug is not None:
|
||||
app_logger.debug("[_resolve_target_id_for_delete][State] Resolving ID by slug '%s'.", dash_slug)
|
||||
try:
|
||||
_, candidates = self.get_dashboards(query={"filters": [{"col": "slug", "op": "eq", "value": dash_slug}]})
|
||||
if candidates:
|
||||
target_id = candidates[0]["id"]
|
||||
app_logger.debug("[_resolve_target_id_for_delete][Success] Resolved slug to ID %s.", target_id)
|
||||
return target_id
|
||||
except Exception as e:
|
||||
app_logger.warning("[_resolve_target_id_for_delete][Warning] Could not resolve slug '%s' to ID: %s", dash_slug, e)
|
||||
return None
|
||||
# [/DEF:_resolve_target_id_for_delete:Function]
|
||||
|
||||
# [DEF:_do_import:Function]
|
||||
def _do_import(self, file_name: Union[str, Path]) -> Dict:
|
||||
with belief_scope("_do_import"):
|
||||
app_logger.debug(f"[_do_import][State] Uploading file: {file_name}")
|
||||
file_path = Path(file_name)
|
||||
if not file_path.exists():
|
||||
app_logger.error(f"[_do_import][Failure] File does not exist: {file_name}")
|
||||
raise FileNotFoundError(f"File does not exist: {file_name}")
|
||||
|
||||
return self.network.upload_file(
|
||||
endpoint="/dashboard/import/",
|
||||
file_info={"file_obj": file_path, "file_name": file_path.name, "form_field": "formData"},
|
||||
extra_data={"overwrite": "true"},
|
||||
timeout=self.env.timeout * 2,
|
||||
)
|
||||
# [/DEF:_do_import:Function]
|
||||
|
||||
# [DEF:_validate_export_response:Function]
|
||||
def _validate_export_response(self, response: Response, dashboard_id: int) -> None:
|
||||
with belief_scope("_validate_export_response"):
|
||||
content_type = response.headers.get("Content-Type", "")
|
||||
if "application/zip" not in content_type:
|
||||
raise SupersetAPIError(f"Получен не ZIP-архив (Content-Type: {content_type})")
|
||||
if not response.content:
|
||||
raise SupersetAPIError("Получены пустые данные при экспорте")
|
||||
# [/DEF:_validate_export_response:Function]
|
||||
|
||||
# [DEF:_resolve_export_filename:Function]
|
||||
def _resolve_export_filename(self, response: Response, dashboard_id: int) -> str:
|
||||
with belief_scope("_resolve_export_filename"):
|
||||
filename = get_filename_from_headers(dict(response.headers))
|
||||
if not filename:
|
||||
from datetime import datetime
|
||||
timestamp = datetime.now().strftime("%Y%m%dT%H%M%S")
|
||||
filename = f"dashboard_export_{dashboard_id}_{timestamp}.zip"
|
||||
app_logger.warning("[_resolve_export_filename][Warning] Generated filename: %s", filename)
|
||||
return filename
|
||||
# [/DEF:_resolve_export_filename:Function]
|
||||
|
||||
# [DEF:_validate_query_params:Function]
|
||||
def _validate_query_params(self, query: Optional[Dict]) -> Dict:
|
||||
with belief_scope("_validate_query_params"):
|
||||
base_query = {"page": 0, "page_size": 1000}
|
||||
return {**base_query, **(query or {})}
|
||||
# [/DEF:_validate_query_params:Function]
|
||||
|
||||
# [DEF:_fetch_total_object_count:Function]
|
||||
def _fetch_total_object_count(self, endpoint: str) -> int:
|
||||
with belief_scope("_fetch_total_object_count"):
|
||||
return self.network.fetch_paginated_count(
|
||||
endpoint=endpoint,
|
||||
query_params={"page": 0, "page_size": 1},
|
||||
count_field="count",
|
||||
)
|
||||
# [/DEF:_fetch_total_object_count:Function]
|
||||
|
||||
# [DEF:_fetch_all_pages:Function]
|
||||
def _fetch_all_pages(self, endpoint: str, pagination_options: Dict) -> List[Dict]:
|
||||
with belief_scope("_fetch_all_pages"):
|
||||
return self.network.fetch_paginated_data(endpoint=endpoint, pagination_options=pagination_options)
|
||||
# [/DEF:_fetch_all_pages:Function]
|
||||
|
||||
# [DEF:_validate_import_file:Function]
|
||||
def _validate_import_file(self, zip_path: Union[str, Path]) -> None:
|
||||
with belief_scope("_validate_import_file"):
|
||||
path = Path(zip_path)
|
||||
if not path.exists():
|
||||
raise FileNotFoundError(f"Файл {zip_path} не существует")
|
||||
if not zipfile.is_zipfile(path):
|
||||
raise SupersetAPIError(f"Файл {zip_path} не является ZIP-архивом")
|
||||
with zipfile.ZipFile(path, "r") as zf:
|
||||
if not any(n.endswith("metadata.yaml") for n in zf.namelist()):
|
||||
raise SupersetAPIError(f"Архив {zip_path} не содержит 'metadata.yaml'")
|
||||
# [/DEF:_validate_import_file:Function]
|
||||
|
||||
# [/SECTION]
|
||||
|
||||
# [/DEF:SupersetClient:Class]
|
||||
|
||||
|
||||
@@ -12,12 +12,19 @@ from ..config_manager import ConfigManager
|
||||
# [DEF:TaskCleanupService:Class]
|
||||
# @PURPOSE: Provides methods to clean up old task records.
|
||||
class TaskCleanupService:
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the cleanup service with dependencies.
|
||||
# @PRE: persistence_service and config_manager are valid.
|
||||
# @POST: Cleanup service is ready.
|
||||
def __init__(self, persistence_service: TaskPersistenceService, config_manager: ConfigManager):
|
||||
self.persistence_service = persistence_service
|
||||
self.config_manager = config_manager
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:TaskCleanupService.run_cleanup:Function]
|
||||
# [DEF:run_cleanup:Function]
|
||||
# @PURPOSE: Deletes tasks older than the configured retention period.
|
||||
# @PRE: Config manager has valid settings.
|
||||
# @POST: Old tasks are deleted from persistence.
|
||||
def run_cleanup(self):
|
||||
with belief_scope("TaskCleanupService.run_cleanup"):
|
||||
settings = self.config_manager.get_config().settings
|
||||
@@ -34,7 +41,7 @@ class TaskCleanupService:
|
||||
to_delete = [t.id for t in tasks[settings.task_retention_limit:]]
|
||||
self.persistence_service.delete_tasks(to_delete)
|
||||
logger.info(f"Deleted {len(to_delete)} tasks exceeding limit of {settings.task_retention_limit}")
|
||||
# [/DEF:TaskCleanupService.run_cleanup:Function]
|
||||
# [/DEF:run_cleanup:Function]
|
||||
|
||||
# [/DEF:TaskCleanupService:Class]
|
||||
# [/DEF:TaskCleanupModule:Module]
|
||||
@@ -25,7 +25,7 @@ class TaskManager:
|
||||
Manages the lifecycle of tasks, including their creation, execution, and state tracking.
|
||||
"""
|
||||
|
||||
# [DEF:TaskManager.__init__:Function]
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initialize the TaskManager with dependencies.
|
||||
# @PRE: plugin_loader is initialized.
|
||||
# @POST: TaskManager is ready to accept tasks.
|
||||
@@ -46,9 +46,9 @@ class TaskManager:
|
||||
|
||||
# Load persisted tasks on startup
|
||||
self.load_persisted_tasks()
|
||||
# [/DEF:TaskManager.__init__:Function]
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:TaskManager.create_task:Function]
|
||||
# [DEF:create_task:Function]
|
||||
# @PURPOSE: Creates and queues a new task for execution.
|
||||
# @PRE: Plugin with plugin_id exists. Params are valid.
|
||||
# @POST: Task is created, added to registry, and scheduled for execution.
|
||||
@@ -75,9 +75,9 @@ class TaskManager:
|
||||
logger.info(f"Task {task.id} created and scheduled for execution")
|
||||
self.loop.create_task(self._run_task(task.id)) # Schedule task for execution
|
||||
return task
|
||||
# [/DEF:TaskManager.create_task:Function]
|
||||
# [/DEF:create_task:Function]
|
||||
|
||||
# [DEF:TaskManager._run_task:Function]
|
||||
# [DEF:_run_task:Function]
|
||||
# @PURPOSE: Internal method to execute a task.
|
||||
# @PRE: Task exists in registry.
|
||||
# @POST: Task is executed, status updated to SUCCESS or FAILED.
|
||||
@@ -98,9 +98,9 @@ class TaskManager:
|
||||
params = {**task.params, "_task_id": task_id}
|
||||
|
||||
if asyncio.iscoroutinefunction(plugin.execute):
|
||||
await plugin.execute(params)
|
||||
task.result = await plugin.execute(params)
|
||||
else:
|
||||
await self.loop.run_in_executor(
|
||||
task.result = await self.loop.run_in_executor(
|
||||
self.executor,
|
||||
plugin.execute,
|
||||
params
|
||||
@@ -117,9 +117,9 @@ class TaskManager:
|
||||
task.finished_at = datetime.utcnow()
|
||||
self.persistence_service.persist_task(task)
|
||||
logger.info(f"Task {task_id} execution finished with status: {task.status}")
|
||||
# [/DEF:TaskManager._run_task:Function]
|
||||
# [/DEF:_run_task:Function]
|
||||
|
||||
# [DEF:TaskManager.resolve_task:Function]
|
||||
# [DEF:resolve_task:Function]
|
||||
# @PURPOSE: Resumes a task that is awaiting mapping.
|
||||
# @PRE: Task exists and is in AWAITING_MAPPING state.
|
||||
# @POST: Task status updated to RUNNING, params updated, execution resumed.
|
||||
@@ -141,9 +141,9 @@ class TaskManager:
|
||||
# Signal the future to continue
|
||||
if task_id in self.task_futures:
|
||||
self.task_futures[task_id].set_result(True)
|
||||
# [/DEF:TaskManager.resolve_task:Function]
|
||||
# [/DEF:resolve_task:Function]
|
||||
|
||||
# [DEF:TaskManager.wait_for_resolution:Function]
|
||||
# [DEF:wait_for_resolution:Function]
|
||||
# @PURPOSE: Pauses execution and waits for a resolution signal.
|
||||
# @PRE: Task exists.
|
||||
# @POST: Execution pauses until future is set.
|
||||
@@ -162,9 +162,9 @@ class TaskManager:
|
||||
finally:
|
||||
if task_id in self.task_futures:
|
||||
del self.task_futures[task_id]
|
||||
# [/DEF:TaskManager.wait_for_resolution:Function]
|
||||
# [/DEF:wait_for_resolution:Function]
|
||||
|
||||
# [DEF:TaskManager.wait_for_input:Function]
|
||||
# [DEF:wait_for_input:Function]
|
||||
# @PURPOSE: Pauses execution and waits for user input.
|
||||
# @PRE: Task exists.
|
||||
# @POST: Execution pauses until future is set via resume_task_with_password.
|
||||
@@ -182,24 +182,30 @@ class TaskManager:
|
||||
finally:
|
||||
if task_id in self.task_futures:
|
||||
del self.task_futures[task_id]
|
||||
# [/DEF:TaskManager.wait_for_input:Function]
|
||||
# [/DEF:wait_for_input:Function]
|
||||
|
||||
# [DEF:TaskManager.get_task:Function]
|
||||
# [DEF:get_task:Function]
|
||||
# @PURPOSE: Retrieves a task by its ID.
|
||||
# @PRE: task_id is a string.
|
||||
# @POST: Returns Task object or None.
|
||||
# @PARAM: task_id (str) - ID of the task.
|
||||
# @RETURN: Optional[Task] - The task or None.
|
||||
def get_task(self, task_id: str) -> Optional[Task]:
|
||||
return self.tasks.get(task_id)
|
||||
# [/DEF:TaskManager.get_task:Function]
|
||||
with belief_scope("TaskManager.get_task", f"task_id={task_id}"):
|
||||
return self.tasks.get(task_id)
|
||||
# [/DEF:get_task:Function]
|
||||
|
||||
# [DEF:TaskManager.get_all_tasks:Function]
|
||||
# [DEF:get_all_tasks:Function]
|
||||
# @PURPOSE: Retrieves all registered tasks.
|
||||
# @PRE: None.
|
||||
# @POST: Returns list of all Task objects.
|
||||
# @RETURN: List[Task] - All tasks.
|
||||
def get_all_tasks(self) -> List[Task]:
|
||||
return list(self.tasks.values())
|
||||
# [/DEF:TaskManager.get_all_tasks:Function]
|
||||
with belief_scope("TaskManager.get_all_tasks"):
|
||||
return list(self.tasks.values())
|
||||
# [/DEF:get_all_tasks:Function]
|
||||
|
||||
# [DEF:TaskManager.get_tasks:Function]
|
||||
# [DEF:get_tasks:Function]
|
||||
# @PURPOSE: Retrieves tasks with pagination and optional status filter.
|
||||
# @PRE: limit and offset are non-negative integers.
|
||||
# @POST: Returns a list of tasks sorted by start_time descending.
|
||||
@@ -208,24 +214,28 @@ class TaskManager:
|
||||
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
||||
# @RETURN: List[Task] - List of tasks matching criteria.
|
||||
def get_tasks(self, limit: int = 10, offset: int = 0, status: Optional[TaskStatus] = None) -> List[Task]:
|
||||
tasks = list(self.tasks.values())
|
||||
with belief_scope("TaskManager.get_tasks"):
|
||||
tasks = list(self.tasks.values())
|
||||
if status:
|
||||
tasks = [t for t in tasks if t.status == status]
|
||||
# Sort by start_time descending (most recent first)
|
||||
tasks.sort(key=lambda t: t.started_at or datetime.min, reverse=True)
|
||||
return tasks[offset:offset + limit]
|
||||
# [/DEF:TaskManager.get_tasks:Function]
|
||||
# [/DEF:get_tasks:Function]
|
||||
|
||||
# [DEF:TaskManager.get_task_logs:Function]
|
||||
# [DEF:get_task_logs:Function]
|
||||
# @PURPOSE: Retrieves logs for a specific task.
|
||||
# @PRE: task_id is a string.
|
||||
# @POST: Returns list of LogEntry objects.
|
||||
# @PARAM: task_id (str) - ID of the task.
|
||||
# @RETURN: List[LogEntry] - List of log entries.
|
||||
def get_task_logs(self, task_id: str) -> List[LogEntry]:
|
||||
task = self.tasks.get(task_id)
|
||||
return task.logs if task else []
|
||||
# [/DEF:TaskManager.get_task_logs:Function]
|
||||
with belief_scope("TaskManager.get_task_logs", f"task_id={task_id}"):
|
||||
task = self.tasks.get(task_id)
|
||||
return task.logs if task else []
|
||||
# [/DEF:get_task_logs:Function]
|
||||
|
||||
# [DEF:TaskManager._add_log:Function]
|
||||
# [DEF:_add_log:Function]
|
||||
# @PURPOSE: Adds a log entry to a task and notifies subscribers.
|
||||
# @PRE: Task exists.
|
||||
# @POST: Log added to task and pushed to queues.
|
||||
@@ -234,54 +244,64 @@ class TaskManager:
|
||||
# @PARAM: message (str) - Log message.
|
||||
# @PARAM: context (Optional[Dict]) - Log context.
|
||||
def _add_log(self, task_id: str, level: str, message: str, context: Optional[Dict[str, Any]] = None):
|
||||
task = self.tasks.get(task_id)
|
||||
if not task:
|
||||
return
|
||||
with belief_scope("TaskManager._add_log", f"task_id={task_id}"):
|
||||
task = self.tasks.get(task_id)
|
||||
if not task:
|
||||
return
|
||||
|
||||
log_entry = LogEntry(level=level, message=message, context=context)
|
||||
task.logs.append(log_entry)
|
||||
self.persistence_service.persist_task(task)
|
||||
log_entry = LogEntry(level=level, message=message, context=context)
|
||||
task.logs.append(log_entry)
|
||||
self.persistence_service.persist_task(task)
|
||||
|
||||
# Notify subscribers
|
||||
if task_id in self.subscribers:
|
||||
for queue in self.subscribers[task_id]:
|
||||
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
|
||||
# [/DEF:TaskManager._add_log:Function]
|
||||
# Notify subscribers
|
||||
if task_id in self.subscribers:
|
||||
for queue in self.subscribers[task_id]:
|
||||
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
|
||||
# [/DEF:_add_log:Function]
|
||||
|
||||
# [DEF:TaskManager.subscribe_logs:Function]
|
||||
# [DEF:subscribe_logs:Function]
|
||||
# @PURPOSE: Subscribes to real-time logs for a task.
|
||||
# @PRE: task_id is a string.
|
||||
# @POST: Returns an asyncio.Queue for log entries.
|
||||
# @PARAM: task_id (str) - ID of the task.
|
||||
# @RETURN: asyncio.Queue - Queue for log entries.
|
||||
async def subscribe_logs(self, task_id: str) -> asyncio.Queue:
|
||||
queue = asyncio.Queue()
|
||||
if task_id not in self.subscribers:
|
||||
self.subscribers[task_id] = []
|
||||
self.subscribers[task_id].append(queue)
|
||||
return queue
|
||||
# [/DEF:TaskManager.subscribe_logs:Function]
|
||||
with belief_scope("TaskManager.subscribe_logs", f"task_id={task_id}"):
|
||||
queue = asyncio.Queue()
|
||||
if task_id not in self.subscribers:
|
||||
self.subscribers[task_id] = []
|
||||
self.subscribers[task_id].append(queue)
|
||||
return queue
|
||||
# [/DEF:subscribe_logs:Function]
|
||||
|
||||
# [DEF:TaskManager.unsubscribe_logs:Function]
|
||||
# [DEF:unsubscribe_logs:Function]
|
||||
# @PURPOSE: Unsubscribes from real-time logs for a task.
|
||||
# @PRE: task_id is a string, queue is asyncio.Queue.
|
||||
# @POST: Queue removed from subscribers.
|
||||
# @PARAM: task_id (str) - ID of the task.
|
||||
# @PARAM: queue (asyncio.Queue) - Queue to remove.
|
||||
def unsubscribe_logs(self, task_id: str, queue: asyncio.Queue):
|
||||
if task_id in self.subscribers:
|
||||
if queue in self.subscribers[task_id]:
|
||||
self.subscribers[task_id].remove(queue)
|
||||
if not self.subscribers[task_id]:
|
||||
del self.subscribers[task_id]
|
||||
# [/DEF:TaskManager.unsubscribe_logs:Function]
|
||||
with belief_scope("TaskManager.unsubscribe_logs", f"task_id={task_id}"):
|
||||
if task_id in self.subscribers:
|
||||
if queue in self.subscribers[task_id]:
|
||||
self.subscribers[task_id].remove(queue)
|
||||
if not self.subscribers[task_id]:
|
||||
del self.subscribers[task_id]
|
||||
# [/DEF:unsubscribe_logs:Function]
|
||||
|
||||
# [DEF:TaskManager.load_persisted_tasks:Function]
|
||||
# [DEF:load_persisted_tasks:Function]
|
||||
# @PURPOSE: Load persisted tasks using persistence service.
|
||||
# @PRE: None.
|
||||
# @POST: Persisted tasks loaded into self.tasks.
|
||||
def load_persisted_tasks(self) -> None:
|
||||
loaded_tasks = self.persistence_service.load_tasks(limit=100)
|
||||
for task in loaded_tasks:
|
||||
if task.id not in self.tasks:
|
||||
self.tasks[task.id] = task
|
||||
# [/DEF:TaskManager.load_persisted_tasks:Function]
|
||||
with belief_scope("TaskManager.load_persisted_tasks"):
|
||||
loaded_tasks = self.persistence_service.load_tasks(limit=100)
|
||||
for task in loaded_tasks:
|
||||
if task.id not in self.tasks:
|
||||
self.tasks[task.id] = task
|
||||
# [/DEF:load_persisted_tasks:Function]
|
||||
|
||||
# [DEF:TaskManager.await_input:Function]
|
||||
# [DEF:await_input:Function]
|
||||
# @PURPOSE: Transition a task to AWAITING_INPUT state with input request.
|
||||
# @PRE: Task exists and is in RUNNING state.
|
||||
# @POST: Task status changed to AWAITING_INPUT, input_request set, persisted.
|
||||
@@ -301,9 +321,9 @@ class TaskManager:
|
||||
task.input_request = input_request
|
||||
self.persistence_service.persist_task(task)
|
||||
self._add_log(task_id, "INFO", "Task paused for user input", {"input_request": input_request})
|
||||
# [/DEF:TaskManager.await_input:Function]
|
||||
# [/DEF:await_input:Function]
|
||||
|
||||
# [DEF:TaskManager.resume_task_with_password:Function]
|
||||
# [DEF:resume_task_with_password:Function]
|
||||
# @PURPOSE: Resume a task that is awaiting input with provided passwords.
|
||||
# @PRE: Task exists and is in AWAITING_INPUT state.
|
||||
# @POST: Task status changed to RUNNING, passwords injected, task resumed.
|
||||
@@ -330,10 +350,12 @@ class TaskManager:
|
||||
|
||||
if task_id in self.task_futures:
|
||||
self.task_futures[task_id].set_result(True)
|
||||
# [/DEF:TaskManager.resume_task_with_password:Function]
|
||||
# [/DEF:resume_task_with_password:Function]
|
||||
|
||||
# [DEF:TaskManager.clear_tasks:Function]
|
||||
# [DEF:clear_tasks:Function]
|
||||
# @PURPOSE: Clears tasks based on status filter.
|
||||
# @PRE: status is Optional[TaskStatus].
|
||||
# @POST: Tasks matching filter (or all non-active) cleared from registry and database.
|
||||
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
||||
# @RETURN: int - Number of tasks cleared.
|
||||
def clear_tasks(self, status: Optional[TaskStatus] = None) -> int:
|
||||
@@ -370,7 +392,7 @@ class TaskManager:
|
||||
|
||||
logger.info(f"Cleared {len(tasks_to_remove)} tasks.")
|
||||
return len(tasks_to_remove)
|
||||
# [/DEF:TaskManager.clear_tasks:Function]
|
||||
# [/DEF:clear_tasks:Function]
|
||||
|
||||
# [/DEF:TaskManager:Class]
|
||||
# [/DEF:TaskManagerModule:Module]
|
||||
@@ -51,8 +51,9 @@ class Task(BaseModel):
|
||||
params: Dict[str, Any] = Field(default_factory=dict)
|
||||
input_required: bool = False
|
||||
input_request: Optional[Dict[str, Any]] = None
|
||||
result: Optional[Dict[str, Any]] = None
|
||||
|
||||
# [DEF:Task.__init__:Function]
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the Task model and validates input_request for AWAITING_INPUT status.
|
||||
# @PRE: If status is AWAITING_INPUT, input_request must be provided.
|
||||
# @POST: Task instance is created or ValueError is raised.
|
||||
@@ -61,7 +62,7 @@ class Task(BaseModel):
|
||||
super().__init__(**data)
|
||||
if self.status == TaskStatus.AWAITING_INPUT and not self.input_request:
|
||||
raise ValueError("input_request is required when status is AWAITING_INPUT")
|
||||
# [/DEF:Task.__init__:Function]
|
||||
# [/DEF:__init__:Function]
|
||||
# [/DEF:Task:Class]
|
||||
|
||||
# [/DEF:TaskManagerModels:Module]
|
||||
@@ -11,8 +11,8 @@ from typing import List, Optional, Dict, Any
|
||||
import json
|
||||
|
||||
from sqlalchemy.orm import Session
|
||||
from backend.src.models.task import TaskRecord
|
||||
from backend.src.core.database import TasksSessionLocal
|
||||
from ...models.task import TaskRecord
|
||||
from ..database import TasksSessionLocal
|
||||
from .models import Task, TaskStatus, LogEntry
|
||||
from ..logger import logger, belief_scope
|
||||
# [/SECTION]
|
||||
@@ -21,12 +21,20 @@ from ..logger import logger, belief_scope
|
||||
# @SEMANTICS: persistence, service, database, sqlalchemy
|
||||
# @PURPOSE: Provides methods to save and load tasks from the tasks.db database using SQLAlchemy.
|
||||
class TaskPersistenceService:
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the persistence service.
|
||||
# @PRE: None.
|
||||
# @POST: Service is ready.
|
||||
def __init__(self):
|
||||
# We use TasksSessionLocal from database.py
|
||||
pass
|
||||
with belief_scope("TaskPersistenceService.__init__"):
|
||||
# We use TasksSessionLocal from database.py
|
||||
pass
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:TaskPersistenceService.persist_task:Function]
|
||||
# [DEF:persist_task:Function]
|
||||
# @PURPOSE: Persists or updates a single task in the database.
|
||||
# @PRE: isinstance(task, Task)
|
||||
# @POST: Task record created or updated in database.
|
||||
# @PARAM: task (Task) - The task object to persist.
|
||||
def persist_task(self, task: Task) -> None:
|
||||
with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
|
||||
@@ -43,6 +51,7 @@ class TaskPersistenceService:
|
||||
record.started_at = task.started_at
|
||||
record.finished_at = task.finished_at
|
||||
record.params = task.params
|
||||
record.result = task.result
|
||||
|
||||
# Store logs as JSON, converting datetime to string
|
||||
record.logs = []
|
||||
@@ -65,18 +74,23 @@ class TaskPersistenceService:
|
||||
logger.error(f"Failed to persist task {task.id}: {e}")
|
||||
finally:
|
||||
session.close()
|
||||
# [/DEF:TaskPersistenceService.persist_task:Function]
|
||||
# [/DEF:persist_task:Function]
|
||||
|
||||
# [DEF:TaskPersistenceService.persist_tasks:Function]
|
||||
# [DEF:persist_tasks:Function]
|
||||
# @PURPOSE: Persists multiple tasks.
|
||||
# @PRE: isinstance(tasks, list)
|
||||
# @POST: All tasks in list are persisted.
|
||||
# @PARAM: tasks (List[Task]) - The list of tasks to persist.
|
||||
def persist_tasks(self, tasks: List[Task]) -> None:
|
||||
for task in tasks:
|
||||
self.persist_task(task)
|
||||
# [/DEF:TaskPersistenceService.persist_tasks:Function]
|
||||
with belief_scope("TaskPersistenceService.persist_tasks"):
|
||||
for task in tasks:
|
||||
self.persist_task(task)
|
||||
# [/DEF:persist_tasks:Function]
|
||||
|
||||
# [DEF:TaskPersistenceService.load_tasks:Function]
|
||||
# [DEF:load_tasks:Function]
|
||||
# @PURPOSE: Loads tasks from the database.
|
||||
# @PRE: limit is an integer.
|
||||
# @POST: Returns list of Task objects.
|
||||
# @PARAM: limit (int) - Max tasks to load.
|
||||
# @PARAM: status (Optional[TaskStatus]) - Filter by status.
|
||||
# @RETURN: List[Task] - The loaded tasks.
|
||||
@@ -108,6 +122,7 @@ class TaskPersistenceService:
|
||||
started_at=record.started_at,
|
||||
finished_at=record.finished_at,
|
||||
params=record.params or {},
|
||||
result=record.result,
|
||||
logs=logs
|
||||
)
|
||||
loaded_tasks.append(task)
|
||||
@@ -117,10 +132,12 @@ class TaskPersistenceService:
|
||||
return loaded_tasks
|
||||
finally:
|
||||
session.close()
|
||||
# [/DEF:TaskPersistenceService.load_tasks:Function]
|
||||
# [/DEF:load_tasks:Function]
|
||||
|
||||
# [DEF:TaskPersistenceService.delete_tasks:Function]
|
||||
# [DEF:delete_tasks:Function]
|
||||
# @PURPOSE: Deletes specific tasks from the database.
|
||||
# @PRE: task_ids is a list of strings.
|
||||
# @POST: Specified task records deleted from database.
|
||||
# @PARAM: task_ids (List[str]) - List of task IDs to delete.
|
||||
def delete_tasks(self, task_ids: List[str]) -> None:
|
||||
if not task_ids:
|
||||
@@ -135,7 +152,7 @@ class TaskPersistenceService:
|
||||
logger.error(f"Failed to delete tasks: {e}")
|
||||
finally:
|
||||
session.close()
|
||||
# [/DEF:TaskPersistenceService.delete_tasks:Function]
|
||||
# [/DEF:delete_tasks:Function]
|
||||
|
||||
# [/DEF:TaskPersistenceService:Class]
|
||||
# [/DEF:TaskPersistenceModule:Module]
|
||||
237
backend/src/core/utils/dataset_mapper.py
Normal file
237
backend/src/core/utils/dataset_mapper.py
Normal file
@@ -0,0 +1,237 @@
|
||||
# [DEF:backend.core.utils.dataset_mapper:Module]
|
||||
#
|
||||
# @SEMANTICS: dataset, mapping, postgresql, xlsx, superset
|
||||
# @PURPOSE: Этот модуль отвечает за обновление метаданных (verbose_map) в датасетах Superset, извлекая их из PostgreSQL или XLSX-файлов.
|
||||
# @LAYER: Domain
|
||||
# @RELATION: DEPENDS_ON -> backend.core.superset_client
|
||||
# @RELATION: DEPENDS_ON -> pandas
|
||||
# @RELATION: DEPENDS_ON -> psycopg2
|
||||
# @PUBLIC_API: DatasetMapper
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
import pandas as pd # type: ignore
|
||||
import psycopg2 # type: ignore
|
||||
from typing import Dict, List, Optional, Any
|
||||
from ..logger import logger as app_logger, belief_scope
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:DatasetMapper:Class]
|
||||
# @PURPOSE: Класс для меппинга и обновления verbose_map в датасетах Superset.
|
||||
class DatasetMapper:
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the mapper.
|
||||
# @POST: Объект DatasetMapper инициализирован.
|
||||
def __init__(self):
|
||||
pass
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:get_postgres_comments:Function]
|
||||
# @PURPOSE: Извлекает комментарии к колонкам из системного каталога PostgreSQL.
|
||||
# @PRE: db_config должен содержать валидные параметры подключения (host, port, user, password, dbname).
|
||||
# @PRE: table_name и table_schema должны быть строками.
|
||||
# @POST: Возвращается словарь, где ключи - имена колонок, значения - комментарии из БД.
|
||||
# @THROW: Exception - При ошибках подключения или выполнения запроса к БД.
|
||||
# @PARAM: db_config (Dict) - Конфигурация для подключения к БД.
|
||||
# @PARAM: table_name (str) - Имя таблицы.
|
||||
# @PARAM: table_schema (str) - Схема таблицы.
|
||||
# @RETURN: Dict[str, str] - Словарь с комментариями к колонкам.
|
||||
def get_postgres_comments(self, db_config: Dict, table_name: str, table_schema: str) -> Dict[str, str]:
|
||||
with belief_scope("Fetch comments from PostgreSQL"):
|
||||
app_logger.info("[get_postgres_comments][Enter] Fetching comments from PostgreSQL for %s.%s.", table_schema, table_name)
|
||||
query = f"""
|
||||
SELECT
|
||||
cols.column_name,
|
||||
CASE
|
||||
WHEN pg_catalog.col_description(
|
||||
(SELECT c.oid
|
||||
FROM pg_catalog.pg_class c
|
||||
JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
|
||||
WHERE c.relname = cols.table_name
|
||||
AND n.nspname = cols.table_schema),
|
||||
cols.ordinal_position::int
|
||||
) LIKE '%|%' THEN
|
||||
split_part(
|
||||
pg_catalog.col_description(
|
||||
(SELECT c.oid
|
||||
FROM pg_catalog.pg_class c
|
||||
JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
|
||||
WHERE c.relname = cols.table_name
|
||||
AND n.nspname = cols.table_schema),
|
||||
cols.ordinal_position::int
|
||||
),
|
||||
'|',
|
||||
1
|
||||
)
|
||||
ELSE
|
||||
pg_catalog.col_description(
|
||||
(SELECT c.oid
|
||||
FROM pg_catalog.pg_class c
|
||||
JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
|
||||
WHERE c.relname = cols.table_name
|
||||
AND n.nspname = cols.table_schema),
|
||||
cols.ordinal_position::int
|
||||
)
|
||||
END AS column_comment
|
||||
FROM
|
||||
information_schema.columns cols
|
||||
WHERE cols.table_catalog = '{db_config.get('dbname')}' AND cols.table_name = '{table_name}' AND cols.table_schema = '{table_schema}';
|
||||
"""
|
||||
comments = {}
|
||||
try:
|
||||
with psycopg2.connect(**db_config) as conn, conn.cursor() as cursor:
|
||||
cursor.execute(query)
|
||||
for row in cursor.fetchall():
|
||||
if row[1]:
|
||||
comments[row[0]] = row[1]
|
||||
app_logger.info("[get_postgres_comments][Success] Fetched %d comments.", len(comments))
|
||||
except Exception as e:
|
||||
app_logger.error("[get_postgres_comments][Failure] %s", e, exc_info=True)
|
||||
raise
|
||||
return comments
|
||||
# [/DEF:get_postgres_comments:Function]
|
||||
|
||||
# [DEF:load_excel_mappings:Function]
|
||||
# @PURPOSE: Загружает меппинги 'column_name' -> 'column_comment' из XLSX файла.
|
||||
# @PRE: file_path должен указывать на существующий XLSX файл.
|
||||
# @POST: Возвращается словарь с меппингами из файла.
|
||||
# @THROW: Exception - При ошибках чтения файла или парсинга.
|
||||
# @PARAM: file_path (str) - Путь к XLSX файлу.
|
||||
# @RETURN: Dict[str, str] - Словарь с меппингами.
|
||||
def load_excel_mappings(self, file_path: str) -> Dict[str, str]:
|
||||
with belief_scope("Load mappings from Excel"):
|
||||
app_logger.info("[load_excel_mappings][Enter] Loading mappings from %s.", file_path)
|
||||
try:
|
||||
df = pd.read_excel(file_path)
|
||||
mappings = df.set_index('column_name')['verbose_name'].to_dict()
|
||||
app_logger.info("[load_excel_mappings][Success] Loaded %d mappings.", len(mappings))
|
||||
return mappings
|
||||
except Exception as e:
|
||||
app_logger.error("[load_excel_mappings][Failure] %s", e, exc_info=True)
|
||||
raise
|
||||
# [/DEF:load_excel_mappings:Function]
|
||||
|
||||
    # [DEF:run_mapping:Function]
    # @PURPOSE: Main entry point that performs the mapping and updates a dataset's verbose_map in Superset.
    # @PRE: superset_client must be authenticated.
    # @PRE: dataset_id must be an existing ID in Superset.
    # @POST: If changes are found, the dataset in Superset is updated via the API.
    # @RELATION: CALLS -> self.get_postgres_comments
    # @RELATION: CALLS -> self.load_excel_mappings
    # @RELATION: CALLS -> superset_client.get_dataset
    # @RELATION: CALLS -> superset_client.update_dataset
    # @PARAM: superset_client (Any) - Superset client.
    # @PARAM: dataset_id (int) - ID of the dataset to update.
    # @PARAM: source (str) - Data source ('postgres', 'excel', 'both').
    # @PARAM: postgres_config (Optional[Dict]) - PostgreSQL connection configuration.
    # @PARAM: excel_path (Optional[str]) - Path to the XLSX file.
    # @PARAM: table_name (Optional[str]) - Table name in PostgreSQL.
    # @PARAM: table_schema (Optional[str]) - Table schema in PostgreSQL.
    def run_mapping(self, superset_client: Any, dataset_id: int, source: str, postgres_config: Optional[Dict] = None, excel_path: Optional[str] = None, table_name: Optional[str] = None, table_schema: Optional[str] = None):
        with belief_scope(f"Run dataset mapping for ID {dataset_id}"):
            app_logger.info("[run_mapping][Enter] Starting dataset mapping for ID %d from source '%s'.", dataset_id, source)
            mappings: Dict[str, str] = {}

            try:
                if source in ['postgres', 'both']:
                    assert postgres_config and table_name and table_schema, "Postgres config is required."
                    mappings.update(self.get_postgres_comments(postgres_config, table_name, table_schema))
                if source in ['excel', 'both']:
                    assert excel_path, "Excel path is required."
                    mappings.update(self.load_excel_mappings(excel_path))
                if source not in ['postgres', 'excel', 'both']:
                    app_logger.error("[run_mapping][Failure] Invalid source: %s.", source)
                    return

                dataset_response = superset_client.get_dataset(dataset_id)
                dataset_data = dataset_response['result']

                original_columns = dataset_data.get('columns', [])
                updated_columns = []
                changes_made = False

                for column in original_columns:
                    col_name = column.get('column_name')

                    new_column = {
                        "column_name": col_name,
                        "id": column.get("id"),
                        "advanced_data_type": column.get("advanced_data_type"),
                        "description": column.get("description"),
                        "expression": column.get("expression"),
                        "extra": column.get("extra"),
                        "filterable": column.get("filterable"),
                        "groupby": column.get("groupby"),
                        "is_active": column.get("is_active"),
                        "is_dttm": column.get("is_dttm"),
                        "python_date_format": column.get("python_date_format"),
                        "type": column.get("type"),
                        "uuid": column.get("uuid"),
                        "verbose_name": column.get("verbose_name"),
                    }

                    new_column = {k: v for k, v in new_column.items() if v is not None}

                    if col_name in mappings:
                        mapping_value = mappings[col_name]
                        if isinstance(mapping_value, str) and new_column.get('verbose_name') != mapping_value:
                            new_column['verbose_name'] = mapping_value
                            changes_made = True

                    updated_columns.append(new_column)

                updated_metrics = []
                for metric in dataset_data.get("metrics", []):
                    new_metric = {
                        "id": metric.get("id"),
                        "metric_name": metric.get("metric_name"),
                        "expression": metric.get("expression"),
                        "verbose_name": metric.get("verbose_name"),
                        "description": metric.get("description"),
                        "d3format": metric.get("d3format"),
                        "currency": metric.get("currency"),
                        "extra": metric.get("extra"),
                        "warning_text": metric.get("warning_text"),
                        "metric_type": metric.get("metric_type"),
                        "uuid": metric.get("uuid"),
                    }
                    updated_metrics.append({k: v for k, v in new_metric.items() if v is not None})

                if changes_made:
                    payload_for_update = {
                        "database_id": dataset_data.get("database", {}).get("id"),
                        "table_name": dataset_data.get("table_name"),
                        "schema": dataset_data.get("schema"),
                        "columns": updated_columns,
                        "owners": [owner["id"] for owner in dataset_data.get("owners", [])],
                        "metrics": updated_metrics,
                        "extra": dataset_data.get("extra"),
                        "description": dataset_data.get("description"),
                        "sql": dataset_data.get("sql"),
                        "cache_timeout": dataset_data.get("cache_timeout"),
                        "catalog": dataset_data.get("catalog"),
                        "default_endpoint": dataset_data.get("default_endpoint"),
                        "external_url": dataset_data.get("external_url"),
                        "fetch_values_predicate": dataset_data.get("fetch_values_predicate"),
                        "filter_select_enabled": dataset_data.get("filter_select_enabled"),
                        "is_managed_externally": dataset_data.get("is_managed_externally"),
                        "is_sqllab_view": dataset_data.get("is_sqllab_view"),
                        "main_dttm_col": dataset_data.get("main_dttm_col"),
                        "normalize_columns": dataset_data.get("normalize_columns"),
                        "offset": dataset_data.get("offset"),
                        "template_params": dataset_data.get("template_params"),
                    }

                    payload_for_update = {k: v for k, v in payload_for_update.items() if v is not None}

                    superset_client.update_dataset(dataset_id, payload_for_update)
                    app_logger.info("[run_mapping][Success] Dataset %d columns' verbose_name updated.", dataset_id)
                else:
                    app_logger.info("[run_mapping][State] No changes in columns' verbose_name, skipping update.")

            except Exception as e:
                app_logger.error("[run_mapping][Failure] %s", e, exc_info=True)
                return
    # [/DEF:run_mapping:Function]
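    # Usage sketch (illustrative only; the client construction, IDs and connection
    # values below are assumptions, not part of this module):
    #
    #   mapper = DatasetMapper()
    #   mapper.run_mapping(
    #       superset_client=client,          # an authenticated Superset client
    #       dataset_id=42,
    #       source="both",
    #       postgres_config={"host": "db", "port": 5432, "dbname": "dwh", "user": "reader", "password": "..."},
    #       excel_path="mappings.xlsx",
    #       table_name="sales",
    #       table_schema="public",
    #   )
    #
    # With source="both", column comments from PostgreSQL are applied first and then
    # overridden by matching rows from the Excel file, since the Excel mappings are merged last.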
# [/DEF:DatasetMapper:Class]

# [/DEF:backend.core.utils.dataset_mapper:Module]
486
backend/src/core/utils/fileio.py
Normal file
@@ -0,0 +1,486 @@
# [DEF:backend.core.utils.fileio:Module]
#
# @SEMANTICS: file, io, zip, yaml, temp, archive, utility
# @PURPOSE: Provides a set of utilities for file operations, including temporary files, ZIP archives, YAML files, and directory cleanup.
# @LAYER: Infra
# @RELATION: DEPENDS_ON -> backend.src.core.logger
# @RELATION: DEPENDS_ON -> pyyaml
# @PUBLIC_API: create_temp_file, remove_empty_directories, read_dashboard_from_disk, calculate_crc32, RetentionPolicy, archive_exports, save_and_unpack_dashboard, update_yamls, create_dashboard_export, sanitize_filename, get_filename_from_headers, consolidate_archive_folders

# [SECTION: IMPORTS]
import os
import re
import zipfile
from pathlib import Path
from typing import Any, Optional, Tuple, Dict, List, Union, Generator
from contextlib import contextmanager
import tempfile
from datetime import date, datetime
import shutil
import zlib
from dataclasses import dataclass
import yaml
from ..logger import logger as app_logger, belief_scope
# [/SECTION]

# [DEF:InvalidZipFormatError:Class]
class InvalidZipFormatError(Exception):
    pass

# [DEF:create_temp_file:Function]
# @PURPOSE: Context manager that creates a temporary file or directory and guarantees its removal.
# @PRE: suffix must be a string identifying the resource type.
# @POST: The temporary resource is created and its path yielded; the resource is removed when the context exits.
# @PARAM: content (Optional[bytes]) - Binary content to write to the temporary file.
# @PARAM: suffix (str) - Resource suffix. If `.dir`, a directory is created.
# @PARAM: mode (str) - File write mode (e.g., 'wb').
# @YIELDS: Path - Path to the temporary resource.
# @THROW: IOError - On resource creation errors.
@contextmanager
def create_temp_file(content: Optional[bytes] = None, suffix: str = ".zip", mode: str = 'wb', dry_run: bool = False) -> Generator[Path, None, None]:
    with belief_scope("Create temporary resource"):
        resource_path = None
        is_dir = suffix.startswith('.dir')
        try:
            if is_dir:
                with tempfile.TemporaryDirectory(suffix=suffix) as temp_dir:
                    resource_path = Path(temp_dir)
                    app_logger.debug("[create_temp_file][State] Created temporary directory: %s", resource_path)
                    yield resource_path
            else:
                fd, temp_path_str = tempfile.mkstemp(suffix=suffix)
                resource_path = Path(temp_path_str)
                os.close(fd)
                if content:
                    resource_path.write_bytes(content)
                app_logger.debug("[create_temp_file][State] Created temporary file: %s", resource_path)
                yield resource_path
        finally:
            if resource_path and resource_path.exists() and not dry_run:
                try:
                    if resource_path.is_dir():
                        shutil.rmtree(resource_path)
                        app_logger.debug("[create_temp_file][Cleanup] Removed temporary directory: %s", resource_path)
                    else:
                        resource_path.unlink()
                        app_logger.debug("[create_temp_file][Cleanup] Removed temporary file: %s", resource_path)
                except OSError as e:
                    app_logger.error("[create_temp_file][Failure] Error during cleanup of %s: %s", resource_path, e)
# [/DEF:create_temp_file:Function]
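
# Usage sketch (illustrative; `zip_bytes` and `upload()` are hypothetical callers,
# not defined in this module):
#
#   with create_temp_file(content=zip_bytes, suffix=".zip") as tmp_zip:
#       upload(tmp_zip)                       # tmp_zip is a Path; the file exists here
#   # the file is deleted automatically when the block exits (unless dry_run=True)
#
#   with create_temp_file(suffix=".dir") as tmp_dir:
#       (tmp_dir / "notes.txt").write_text("scratch")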

# [DEF:remove_empty_directories:Function]
# @PURPOSE: Recursively removes all empty subdirectories, starting from the given path.
# @PRE: root_dir must be a path to an existing directory.
# @POST: All empty subdirectories are removed and their count is returned.
# @PARAM: root_dir (str) - Path to the root directory to clean.
# @RETURN: int - Number of removed directories.
def remove_empty_directories(root_dir: str) -> int:
    with belief_scope(f"Remove empty directories in {root_dir}"):
        app_logger.info("[remove_empty_directories][Enter] Starting cleanup of empty directories in %s", root_dir)
        removed_count = 0
        if not os.path.isdir(root_dir):
            app_logger.error("[remove_empty_directories][Failure] Directory not found: %s", root_dir)
            return 0
        for current_dir, _, _ in os.walk(root_dir, topdown=False):
            if not os.listdir(current_dir):
                try:
                    os.rmdir(current_dir)
                    removed_count += 1
                    app_logger.info("[remove_empty_directories][State] Removed empty directory: %s", current_dir)
                except OSError as e:
                    app_logger.error("[remove_empty_directories][Failure] Failed to remove %s: %s", current_dir, e)
        app_logger.info("[remove_empty_directories][Exit] Removed %d empty directories.", removed_count)
        return removed_count
# [/DEF:remove_empty_directories:Function]

# [DEF:read_dashboard_from_disk:Function]
# @PURPOSE: Reads a file's binary content from disk.
# @PRE: file_path must point to an existing file.
# @POST: Returns the content bytes and the file name.
# @PARAM: file_path (str) - Path to the file.
# @RETURN: Tuple[bytes, str] - Tuple of (content, file name).
# @THROW: FileNotFoundError - If the file is not found.
def read_dashboard_from_disk(file_path: str) -> Tuple[bytes, str]:
    with belief_scope(f"Read dashboard from {file_path}"):
        path = Path(file_path)
        assert path.is_file(), f"Dashboard file not found: {file_path}"
        app_logger.info("[read_dashboard_from_disk][Enter] Reading file: %s", file_path)
        content = path.read_bytes()
        if not content:
            app_logger.warning("[read_dashboard_from_disk][Warning] File is empty: %s", file_path)
        return content, path.name
# [/DEF:read_dashboard_from_disk:Function]

# [DEF:calculate_crc32:Function]
# @PURPOSE: Computes the CRC32 checksum of a file.
# @PRE: file_path must be a Path object pointing to an existing file.
# @POST: Returns the CRC32 as an 8-character hex string.
# @PARAM: file_path (Path) - Path to the file.
# @RETURN: str - 8-character hexadecimal representation of the CRC32.
# @THROW: IOError - On file read errors.
def calculate_crc32(file_path: Path) -> str:
    with belief_scope(f"Calculate CRC32 for {file_path}"):
        with open(file_path, 'rb') as f:
            crc32_value = zlib.crc32(f.read())
        return f"{crc32_value:08x}"
# [/DEF:calculate_crc32:Function]

# [SECTION: DATA_CLASSES]
# [DEF:RetentionPolicy:DataClass]
# @PURPOSE: Defines the retention policy for archives (daily, weekly, monthly).
@dataclass
class RetentionPolicy:
    daily: int = 7
    weekly: int = 4
    monthly: int = 12
# [/DEF:RetentionPolicy:DataClass]
# [/SECTION]

# [DEF:archive_exports:Function]
# @PURPOSE: Manages the archive of exported files, applying the retention policy and deduplication.
# @PRE: output_dir must be a path to an existing directory.
# @POST: Old or duplicate archives are removed according to the policy.
# @RELATION: CALLS -> apply_retention_policy
# @RELATION: CALLS -> calculate_crc32
# @PARAM: output_dir (str) - Directory containing the archives.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @PARAM: deduplicate (bool) - Flag enabling removal of duplicates by CRC32.
def archive_exports(output_dir: str, policy: RetentionPolicy, deduplicate: bool = False) -> None:
    with belief_scope(f"Archive exports in {output_dir}"):
        output_path = Path(output_dir)
        if not output_path.is_dir():
            app_logger.warning("[archive_exports][Skip] Archive directory not found: %s", output_dir)
            return

        app_logger.info("[archive_exports][Enter] Managing archive in %s", output_dir)

        # 1. Collect all zip files
        zip_files = list(output_path.glob("*.zip"))
        if not zip_files:
            app_logger.info("[archive_exports][State] No zip files found in %s", output_dir)
            return

        # 2. Deduplication
        if deduplicate:
            app_logger.info("[archive_exports][State] Starting deduplication...")
            checksums = {}
            files_to_remove = []

            # Sort by modification time (newest first) to keep the latest version
            zip_files.sort(key=lambda f: f.stat().st_mtime, reverse=True)

            for file_path in zip_files:
                try:
                    crc = calculate_crc32(file_path)
                    if crc in checksums:
                        files_to_remove.append(file_path)
                        app_logger.debug("[archive_exports][State] Duplicate found: %s (same as %s)", file_path.name, checksums[crc].name)
                    else:
                        checksums[crc] = file_path
                except Exception as e:
                    app_logger.error("[archive_exports][Failure] Failed to calculate CRC32 for %s: %s", file_path, e)

            for f in files_to_remove:
                try:
                    f.unlink()
                    zip_files.remove(f)
                    app_logger.info("[archive_exports][State] Removed duplicate: %s", f.name)
                except OSError as e:
                    app_logger.error("[archive_exports][Failure] Failed to remove duplicate %s: %s", f, e)

        # 3. Retention Policy
        files_with_dates = []
        for file_path in zip_files:
            # Try to extract date from filename
            # Pattern: ..._YYYYMMDD_HHMMSS.zip or ..._YYYYMMDD.zip
            match = re.search(r'_(\d{8})_', file_path.name)
            file_date = None
            if match:
                try:
                    date_str = match.group(1)
                    file_date = datetime.strptime(date_str, "%Y%m%d").date()
                except ValueError:
                    pass

            if not file_date:
                # Fallback to modification time
                file_date = datetime.fromtimestamp(file_path.stat().st_mtime).date()

            files_with_dates.append((file_path, file_date))

        files_to_keep = apply_retention_policy(files_with_dates, policy)

        for file_path, _ in files_with_dates:
            if file_path not in files_to_keep:
                try:
                    file_path.unlink()
                    app_logger.info("[archive_exports][State] Removed by retention policy: %s", file_path.name)
                except OSError as e:
                    app_logger.error("[archive_exports][Failure] Failed to remove %s: %s", file_path, e)
# [/DEF:archive_exports:Function]

# [DEF:apply_retention_policy:Function]
# @PURPOSE: (Helper) Applies the retention policy to a list of files, returning those that should be kept.
# @PRE: files_with_dates is a list of (Path, date) tuples.
# @POST: Returns a set of files to keep.
# @PARAM: files_with_dates (List[Tuple[Path, date]]) - List of files with their dates.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @RETURN: set - Set of file paths that must be kept.
def apply_retention_policy(files_with_dates: List[Tuple[Path, date]], policy: RetentionPolicy) -> set:
    with belief_scope("Apply retention policy"):
        # Sort by date (newest first)
        sorted_files = sorted(files_with_dates, key=lambda x: x[1], reverse=True)
        # Buckets for files by retention category
        daily_files = []
        weekly_files = []
        monthly_files = []
        today = date.today()
        for file_path, file_date in sorted_files:
            # Daily
            if (today - file_date).days < policy.daily:
                daily_files.append(file_path)
            # Weekly
            elif (today - file_date).days < policy.weekly * 7:
                weekly_files.append(file_path)
            # Monthly
            elif (today - file_date).days < policy.monthly * 30:
                monthly_files.append(file_path)
        # Return the set of files that must be kept
        files_to_keep = set()
        files_to_keep.update(daily_files)
        files_to_keep.update(weekly_files[:policy.weekly])
        files_to_keep.update(monthly_files[:policy.monthly])
        app_logger.debug("[apply_retention_policy][State] Keeping %d files according to retention policy", len(files_to_keep))
        return files_to_keep
# [/DEF:apply_retention_policy:Function]
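
# Worked example (assuming today is 2025-12-19 and the RetentionPolicy() defaults
# of daily=7, weekly=4, monthly=12; the file names are hypothetical):
#
#   files = [
#       (Path("export_20251218_090000.zip"), date(2025, 12, 18)),  # 1 day old    -> daily bucket
#       (Path("export_20251201_090000.zip"), date(2025, 12, 1)),   # 18 days old  -> weekly bucket
#       (Path("export_20250601_090000.zip"), date(2025, 6, 1)),    # ~200 days old -> monthly bucket
#       (Path("export_20230101_090000.zip"), date(2023, 1, 1)),    # beyond every window -> dropped
#   ]
#   keep = apply_retention_policy(files, RetentionPolicy())
#
# Everything inside the daily window is kept, while only the newest `weekly` and
# `monthly` entries of the older buckets survive.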

# [DEF:save_and_unpack_dashboard:Function]
# @PURPOSE: Saves the binary content of a ZIP archive to disk and optionally unpacks it.
# @PRE: zip_content must be the bytes of a valid ZIP archive.
# @POST: The ZIP file is saved and, if unpack=True, extracted into output_dir.
# @PARAM: zip_content (bytes) - ZIP archive content.
# @PARAM: output_dir (Union[str, Path]) - Directory to save into.
# @PARAM: unpack (bool) - Whether to unpack the archive.
# @PARAM: original_filename (Optional[str]) - Original file name to save under.
# @RETURN: Tuple[Path, Optional[Path]] - Path to the ZIP file and, if applicable, the path to the unpacked directory.
# @THROW: InvalidZipFormatError - On ZIP format errors.
def save_and_unpack_dashboard(zip_content: bytes, output_dir: Union[str, Path], unpack: bool = False, original_filename: Optional[str] = None) -> Tuple[Path, Optional[Path]]:
    with belief_scope("Save and unpack dashboard"):
        app_logger.info("[save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: %s", unpack)
        try:
            output_path = Path(output_dir)
            output_path.mkdir(parents=True, exist_ok=True)
            zip_name = sanitize_filename(original_filename) if original_filename else f"dashboard_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
            zip_path = output_path / zip_name
            zip_path.write_bytes(zip_content)
            app_logger.info("[save_and_unpack_dashboard][State] Dashboard saved to: %s", zip_path)
            if unpack:
                with zipfile.ZipFile(zip_path, 'r') as zip_ref:
                    zip_ref.extractall(output_path)
                app_logger.info("[save_and_unpack_dashboard][State] Dashboard unpacked to: %s", output_path)
                return zip_path, output_path
            return zip_path, None
        except zipfile.BadZipFile as e:
            app_logger.error("[save_and_unpack_dashboard][Failure] Invalid ZIP archive: %s", e)
            raise InvalidZipFormatError(f"Invalid ZIP file: {e}") from e
# [/DEF:save_and_unpack_dashboard:Function]
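
# Usage sketch (illustrative; `response` stands for a completed export request and
# the output directory is an arbitrary example path):
#
#   zip_path, unpacked_dir = save_and_unpack_dashboard(
#       zip_content=response.content,
#       output_dir="exports/dashboards",
#       unpack=True,
#       original_filename=get_filename_from_headers(response.headers),
#   )
#   # zip_path     -> the saved .zip file
#   # unpacked_dir -> the directory the archive was extracted into (None when unpack=False)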

# [DEF:update_yamls:Function]
# @PURPOSE: Updates configuration values in YAML files, replacing values or applying a regex.
# @PRE: path must be an existing directory.
# @POST: All YAML files in the directory are updated according to the given parameters.
# @RELATION: CALLS -> _update_yaml_file
# @THROW: FileNotFoundError - If `path` does not exist.
# @PARAM: db_configs (Optional[List[Dict]]) - List of configurations to substitute.
# @PARAM: path (str) - Path to the directory with YAML files.
# @PARAM: regexp_pattern (Optional[str]) - Search pattern.
# @PARAM: replace_string (Optional[str]) - Replacement string.
def update_yamls(db_configs: Optional[List[Dict[str, Any]]] = None, path: str = "dashboards", regexp_pattern: Optional[str] = None, replace_string: Optional[str] = None) -> None:
    with belief_scope("Update YAML configurations"):
        app_logger.info("[update_yamls][Enter] Starting YAML configuration update.")
        dir_path = Path(path)
        assert dir_path.is_dir(), f"Path {path} does not exist or is not a directory"

        configs: List[Dict[str, Any]] = db_configs or []

        for file_path in dir_path.rglob("*.yaml"):
            _update_yaml_file(file_path, configs, regexp_pattern, replace_string)
# [/DEF:update_yamls:Function]

# [DEF:_update_yaml_file:Function]
# @PURPOSE: (Helper) Updates a single YAML file.
# @PRE: file_path must be a Path object pointing to an existing YAML file.
# @POST: The file is updated according to the given configurations or regular expression.
# @PARAM: file_path (Path) - Path to the file.
# @PARAM: db_configs (List[Dict]) - Configurations.
# @PARAM: regexp_pattern (Optional[str]) - Pattern.
# @PARAM: replace_string (Optional[str]) - Replacement.
def _update_yaml_file(file_path: Path, db_configs: List[Dict[str, Any]], regexp_pattern: Optional[str], replace_string: Optional[str]) -> None:
    with belief_scope(f"Update YAML file: {file_path}"):
        # Read the file content
        try:
            with open(file_path, 'r', encoding='utf-8') as f:
                content = f.read()
        except Exception as e:
            app_logger.error("[_update_yaml_file][Failure] Failed to read %s: %s", file_path, e)
            return
        # If a pattern and replacement string are given, apply the regex substitution
        if regexp_pattern and replace_string:
            try:
                new_content = re.sub(regexp_pattern, replace_string, content)
                if new_content != content:
                    with open(file_path, 'w', encoding='utf-8') as f:
                        f.write(new_content)
                    app_logger.info("[_update_yaml_file][State] Updated %s using regex pattern", file_path)
            except Exception as e:
                app_logger.error("[_update_yaml_file][Failure] Error applying regex to %s: %s", file_path, e)
        # If configurations are given, replace values (old/new support)
        if db_configs:
            try:
                # Replace old/new values directly in the text to preserve the file structure
                modified_content = content
                for cfg in db_configs:
                    # Expected structure: {'old': {...}, 'new': {...}}
                    old_cfg = cfg.get('old', {})
                    new_cfg = cfg.get('new', {})
                    for key, old_val in old_cfg.items():
                        if key in new_cfg:
                            new_val = new_cfg[key]
                            # Replace only exact matches of the old value in the YAML text, using the key for context
                            if isinstance(old_val, str):
                                # Look for the pattern: key: "value" or key: value
                                key_pattern = re.escape(key)
                                val_pattern = re.escape(old_val)
                                # Groups: 1=key+separator, 2=opening quote (optional), 3=value, 4=closing quote (optional)
                                pattern = rf'({key_pattern}\s*:\s*)(["\']?)({val_pattern})(["\']?)'

                                # [DEF:replacer:Function]
                                # @PURPOSE: Replacement callback that preserves quotes if they were present.
                                # @PRE: match must be a regular expression match object.
                                # @POST: Returns a string with the new value, preserving the prefix and quotes.
                                def replacer(match):
                                    prefix = match.group(1)
                                    quote_open = match.group(2)
                                    quote_close = match.group(4)
                                    return f"{prefix}{quote_open}{new_val}{quote_close}"
                                # [/DEF:replacer:Function]

                                modified_content = re.sub(pattern, replacer, modified_content)
                                app_logger.info("[_update_yaml_file][State] Replaced '%s' with '%s' for key %s in %s", old_val, new_val, key, file_path)
                # Write the modified content back without parsing the YAML, preserving the original formatting
                with open(file_path, 'w', encoding='utf-8') as f:
                    f.write(modified_content)
            except Exception as e:
                app_logger.error("[_update_yaml_file][Failure] Error performing raw replacement in %s: %s", file_path, e)
# [/DEF:_update_yaml_file:Function]
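
# Example of the expected db_configs shape (hypothetical values): each entry pairs an
# 'old' and a 'new' dict, and matching key/value pairs are rewritten in the raw YAML text.
#
#   update_yamls(
#       db_configs=[{
#           "old": {"sqlalchemy_uri": "postgresql://dev-host/superset"},
#           "new": {"sqlalchemy_uri": "postgresql://prod-host/superset"},
#       }],
#       path="dashboards",
#   )
#
# A line such as `sqlalchemy_uri: "postgresql://dev-host/superset"` becomes
# `sqlalchemy_uri: "postgresql://prod-host/superset"`, with the original quoting preserved.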

# [DEF:create_dashboard_export:Function]
# @PURPOSE: Creates a ZIP archive from the given source paths.
# @PRE: source_paths must contain existing paths.
# @POST: A ZIP archive is created at zip_path.
# @PARAM: zip_path (Union[str, Path]) - Path where the ZIP archive is saved.
# @PARAM: source_paths (List[Union[str, Path]]) - List of source paths to archive.
# @PARAM: exclude_extensions (Optional[List[str]]) - List of extensions to exclude.
# @RETURN: bool - `True` on success, `False` on error.
def create_dashboard_export(zip_path: Union[str, Path], source_paths: List[Union[str, Path]], exclude_extensions: Optional[List[str]] = None) -> bool:
    with belief_scope(f"Create dashboard export: {zip_path}"):
        app_logger.info("[create_dashboard_export][Enter] Packing dashboard: %s -> %s", source_paths, zip_path)
        try:
            exclude_ext = [ext.lower() for ext in exclude_extensions or []]
            with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
                for src_path_str in source_paths:
                    src_path = Path(src_path_str)
                    assert src_path.exists(), f"Path not found: {src_path}"
                    for item in src_path.rglob('*'):
                        if item.is_file() and item.suffix.lower() not in exclude_ext:
                            arcname = item.relative_to(src_path.parent)
                            zipf.write(item, arcname)
            app_logger.info("[create_dashboard_export][Exit] Archive created: %s", zip_path)
            return True
        except (IOError, zipfile.BadZipFile, AssertionError) as e:
            app_logger.error("[create_dashboard_export][Failure] Error: %s", e, exc_info=True)
            return False
# [/DEF:create_dashboard_export:Function]
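
# Usage sketch (illustrative paths):
#
#   ok = create_dashboard_export(
#       zip_path="exports/dashboard_bundle.zip",
#       source_paths=["dashboards/my_dashboard"],
#       exclude_extensions=[".log", ".tmp"],
#   )
#   # ok is True when the archive was written; files with excluded extensions are skipped.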

# [DEF:sanitize_filename:Function]
# @PURPOSE: Strips characters that are not allowed in file names.
# @PRE: filename must be a string.
# @POST: Returns the string without special characters.
# @PARAM: filename (str) - Original file name.
# @RETURN: str - Sanitized string.
def sanitize_filename(filename: str) -> str:
    with belief_scope(f"Sanitize filename: {filename}"):
        return re.sub(r'[\\/*?:"<>|]', "_", filename).strip()
# [/DEF:sanitize_filename:Function]

# [DEF:get_filename_from_headers:Function]
# @PURPOSE: Extracts the file name from the HTTP 'Content-Disposition' header.
# @PRE: headers must be a dict of headers.
# @POST: Returns the file name, or None if the header is absent.
# @PARAM: headers (dict) - Dict of HTTP headers.
# @RETURN: Optional[str] - File name or `None`.
def get_filename_from_headers(headers: dict) -> Optional[str]:
    with belief_scope("Get filename from headers"):
        content_disposition = headers.get("Content-Disposition", "")
        if match := re.search(r'filename="?([^"]+)"?', content_disposition):
            return match.group(1).strip()
        return None
# [/DEF:get_filename_from_headers:Function]
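
# Example (hypothetical header value):
#
#   get_filename_from_headers({"Content-Disposition": 'attachment; filename="export_20251219.zip"'})
#   # -> 'export_20251219.zip'
#   get_filename_from_headers({})
#   # -> None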

# [DEF:consolidate_archive_folders:Function]
# @PURPOSE: Consolidates archive directories based on a shared slug in their names.
# @PRE: root_directory must be a Path object pointing to an existing directory.
# @POST: Directories sharing the same prefix are merged into one.
# @THROW: TypeError, ValueError - If `root_directory` is invalid.
# @PARAM: root_directory (Path) - Root directory to consolidate.
def consolidate_archive_folders(root_directory: Path) -> None:
    with belief_scope(f"Consolidate archives in {root_directory}"):
        assert isinstance(root_directory, Path), "root_directory must be a Path object."
        assert root_directory.is_dir(), "root_directory must be an existing directory."

        app_logger.info("[consolidate_archive_folders][Enter] Consolidating archives in %s", root_directory)
        # Collect all directories that contain archives
        archive_dirs = []
        for item in root_directory.iterdir():
            if item.is_dir():
                # Check whether the directory contains ZIP archives
                if any(item.glob("*.zip")):
                    archive_dirs.append(item)
        # Group by slug (the part of the name before the first '_')
        slug_groups = {}
        for dir_path in archive_dirs:
            dir_name = dir_path.name
            slug = dir_name.split('_')[0] if '_' in dir_name else dir_name
            if slug not in slug_groups:
                slug_groups[slug] = []
            slug_groups[slug].append(dir_path)
        # Consolidate each group
        for slug, dirs in slug_groups.items():
            if len(dirs) <= 1:
                continue
            # Create the target directory
            target_dir = root_directory / slug
            target_dir.mkdir(exist_ok=True)
            app_logger.info("[consolidate_archive_folders][State] Consolidating %d directories under %s", len(dirs), target_dir)
            # Move the contents
            for source_dir in dirs:
                if source_dir == target_dir:
                    continue
                for item in source_dir.iterdir():
                    dest_item = target_dir / item.name
                    try:
                        shutil.move(str(item), str(dest_item))
                    except Exception as e:
                        app_logger.error("[consolidate_archive_folders][Failure] Failed to move %s to %s: %s", item, dest_item, e)
                # Remove the source directory
                try:
                    source_dir.rmdir()
                    app_logger.info("[consolidate_archive_folders][State] Removed source directory: %s", source_dir)
                except Exception as e:
                    app_logger.error("[consolidate_archive_folders][Failure] Failed to remove source directory %s: %s", source_dir, e)
# [/DEF:consolidate_archive_folders:Function]
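
# Example layout (hypothetical): directories `sales_20251201/` and `sales_20251218/`,
# each containing *.zip exports, are merged into a single `sales/` directory, and the
# emptied source directories are then removed.
#
#   consolidate_archive_folders(Path("exports"))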

# [/DEF:backend.core.utils.fileio:Module]
518
superset_tool/utils/network.py → backend/src/core/utils/network.py
Executable file → Normal file
@@ -1,232 +1,286 @@
|
||||
# [DEF:superset_tool.utils.network:Module]
|
||||
#
|
||||
# @SEMANTICS: network, http, client, api, requests, session, authentication
|
||||
# @PURPOSE: Инкапсулирует низкоуровневую HTTP-логику для взаимодействия с Superset API, включая аутентификацию, управление сессией, retry-логику и обработку ошибок.
|
||||
# @LAYER: Infra
|
||||
# @RELATION: DEPENDS_ON -> superset_tool.exceptions
|
||||
# @RELATION: DEPENDS_ON -> superset_tool.utils.logger
|
||||
# @RELATION: DEPENDS_ON -> requests
|
||||
# @PUBLIC_API: APIClient
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import Optional, Dict, Any, List, Union, cast
|
||||
import json
|
||||
import io
|
||||
from pathlib import Path
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
import urllib3
|
||||
from urllib3.util.retry import Retry
|
||||
from superset_tool.exceptions import AuthenticationError, NetworkError, DashboardNotFoundError, SupersetAPIError, PermissionDeniedError
|
||||
from superset_tool.utils.logger import SupersetLogger
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:APIClient:Class]
|
||||
# @PURPOSE: Инкапсулирует HTTP-логику для работы с API, включая сессии, аутентификацию, и обработку запросов.
|
||||
class APIClient:
|
||||
DEFAULT_TIMEOUT = 30
|
||||
|
||||
# [DEF:APIClient.__init__:Function]
|
||||
# @PURPOSE: Инициализирует API клиент с конфигурацией, сессией и логгером.
|
||||
# @PARAM: config (Dict[str, Any]) - Конфигурация.
|
||||
# @PARAM: verify_ssl (bool) - Проверять ли SSL.
|
||||
# @PARAM: timeout (int) - Таймаут запросов.
|
||||
# @PARAM: logger (Optional[SupersetLogger]) - Логгер.
|
||||
def __init__(self, config: Dict[str, Any], verify_ssl: bool = True, timeout: int = DEFAULT_TIMEOUT, logger: Optional[SupersetLogger] = None):
|
||||
self.logger = logger or SupersetLogger(name="APIClient")
|
||||
self.logger.info("[APIClient.__init__][Entry] Initializing APIClient.")
|
||||
self.base_url: str = config.get("base_url", "")
|
||||
self.auth = config.get("auth")
|
||||
self.request_settings = {"verify_ssl": verify_ssl, "timeout": timeout}
|
||||
self.session = self._init_session()
|
||||
self._tokens: Dict[str, str] = {}
|
||||
self._authenticated = False
|
||||
self.logger.info("[APIClient.__init__][Exit] APIClient initialized.")
|
||||
# [/DEF:APIClient.__init__:Function]
|
||||
|
||||
# [DEF:APIClient._init_session:Function]
|
||||
# @PURPOSE: Создает и настраивает `requests.Session` с retry-логикой.
|
||||
# @RETURN: requests.Session - Настроенная сессия.
|
||||
def _init_session(self) -> requests.Session:
|
||||
session = requests.Session()
|
||||
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
|
||||
adapter = HTTPAdapter(max_retries=retries)
|
||||
session.mount('http://', adapter)
|
||||
session.mount('https://', adapter)
|
||||
if not self.request_settings["verify_ssl"]:
|
||||
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
|
||||
self.logger.warning("[_init_session][State] SSL verification disabled.")
|
||||
session.verify = self.request_settings["verify_ssl"]
|
||||
return session
|
||||
# [/DEF:APIClient._init_session:Function]
|
||||
|
||||
# [DEF:APIClient.authenticate:Function]
|
||||
# @PURPOSE: Выполняет аутентификацию в Superset API и получает access и CSRF токены.
|
||||
# @POST: `self._tokens` заполнен, `self._authenticated` установлен в `True`.
|
||||
# @RETURN: Dict[str, str] - Словарь с токенами.
|
||||
# @THROW: AuthenticationError, NetworkError - при ошибках.
|
||||
def authenticate(self) -> Dict[str, str]:
|
||||
self.logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
|
||||
try:
|
||||
login_url = f"{self.base_url}/security/login"
|
||||
response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
|
||||
response.raise_for_status()
|
||||
access_token = response.json()["access_token"]
|
||||
|
||||
csrf_url = f"{self.base_url}/security/csrf_token/"
|
||||
csrf_response = self.session.get(csrf_url, headers={"Authorization": f"Bearer {access_token}"}, timeout=self.request_settings["timeout"])
|
||||
csrf_response.raise_for_status()
|
||||
|
||||
self._tokens = {"access_token": access_token, "csrf_token": csrf_response.json()["result"]}
|
||||
self._authenticated = True
|
||||
self.logger.info("[authenticate][Exit] Authenticated successfully.")
|
||||
return self._tokens
|
||||
except requests.exceptions.HTTPError as e:
|
||||
raise AuthenticationError(f"Authentication failed: {e}") from e
|
||||
except (requests.exceptions.RequestException, KeyError) as e:
|
||||
raise NetworkError(f"Network or parsing error during authentication: {e}") from e
|
||||
# [/DEF:APIClient.authenticate:Function]
|
||||
|
||||
@property
|
||||
def headers(self) -> Dict[str, str]:
|
||||
# [DEF:APIClient.headers:Function]
|
||||
# @PURPOSE: Возвращает HTTP-заголовки для аутентифицированных запросов.
|
||||
if not self._authenticated: self.authenticate()
|
||||
return {
|
||||
"Authorization": f"Bearer {self._tokens['access_token']}",
|
||||
"X-CSRFToken": self._tokens.get("csrf_token", ""),
|
||||
"Referer": self.base_url,
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
# [/DEF:APIClient.headers:Function]
|
||||
|
||||
# [DEF:APIClient.request:Function]
|
||||
# @PURPOSE: Выполняет универсальный HTTP-запрос к API.
|
||||
# @RETURN: `requests.Response` если `raw_response=True`, иначе `dict`.
|
||||
# @THROW: SupersetAPIError, NetworkError и их подклассы.
|
||||
# @PARAM: method (str) - HTTP метод.
|
||||
# @PARAM: endpoint (str) - API эндпоинт.
|
||||
# @PARAM: headers (Optional[Dict]) - Дополнительные заголовки.
|
||||
# @PARAM: raw_response (bool) - Возвращать ли сырой ответ.
|
||||
def request(self, method: str, endpoint: str, headers: Optional[Dict] = None, raw_response: bool = False, **kwargs) -> Union[requests.Response, Dict[str, Any]]:
|
||||
full_url = f"{self.base_url}{endpoint}"
|
||||
_headers = self.headers.copy()
|
||||
if headers: _headers.update(headers)
|
||||
|
||||
try:
|
||||
response = self.session.request(method, full_url, headers=_headers, **kwargs)
|
||||
response.raise_for_status()
|
||||
return response if raw_response else response.json()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
self._handle_http_error(e, endpoint)
|
||||
except requests.exceptions.RequestException as e:
|
||||
self._handle_network_error(e, full_url)
|
||||
# [/DEF:APIClient.request:Function]
|
||||
|
||||
# [DEF:APIClient._handle_http_error:Function]
|
||||
# @PURPOSE: (Helper) Преобразует HTTP ошибки в кастомные исключения.
|
||||
# @PARAM: e (requests.exceptions.HTTPError) - Ошибка.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
|
||||
status_code = e.response.status_code
|
||||
if status_code == 404: raise DashboardNotFoundError(endpoint) from e
|
||||
if status_code == 403: raise PermissionDeniedError() from e
|
||||
if status_code == 401: raise AuthenticationError() from e
|
||||
raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
|
||||
# [/DEF:APIClient._handle_http_error:Function]
|
||||
|
||||
# [DEF:APIClient._handle_network_error:Function]
|
||||
# @PURPOSE: (Helper) Преобразует сетевые ошибки в `NetworkError`.
|
||||
# @PARAM: e (requests.exceptions.RequestException) - Ошибка.
|
||||
# @PARAM: url (str) - URL.
|
||||
def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
|
||||
if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
|
||||
elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
|
||||
else: msg = f"Unknown network error: {e}"
|
||||
raise NetworkError(msg, url=url) from e
|
||||
# [/DEF:APIClient._handle_network_error:Function]
|
||||
|
||||
# [DEF:APIClient.upload_file:Function]
|
||||
# @PURPOSE: Загружает файл на сервер через multipart/form-data.
|
||||
# @RETURN: Ответ API в виде словаря.
|
||||
# @THROW: SupersetAPIError, NetworkError, TypeError.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: file_info (Dict[str, Any]) - Информация о файле.
|
||||
# @PARAM: extra_data (Optional[Dict]) - Дополнительные данные.
|
||||
# @PARAM: timeout (Optional[int]) - Таймаут.
|
||||
def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
|
||||
full_url = f"{self.base_url}{endpoint}"
|
||||
_headers = self.headers.copy(); _headers.pop('Content-Type', None)
|
||||
|
||||
file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")
|
||||
|
||||
files_payload = {}
|
||||
if isinstance(file_obj, (str, Path)):
|
||||
with open(file_obj, 'rb') as f:
|
||||
files_payload = {form_field: (file_name, f.read(), 'application/x-zip-compressed')}
|
||||
elif isinstance(file_obj, io.BytesIO):
|
||||
files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')}
|
||||
else:
|
||||
raise TypeError(f"Unsupported file_obj type: {type(file_obj)}")
|
||||
|
||||
return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
|
||||
# [/DEF:APIClient.upload_file:Function]
|
||||
|
||||
# [DEF:APIClient._perform_upload:Function]
|
||||
# @PURPOSE: (Helper) Выполняет POST запрос с файлом.
|
||||
# @PARAM: url (str) - URL.
|
||||
# @PARAM: files (Dict) - Файлы.
|
||||
# @PARAM: data (Optional[Dict]) - Данные.
|
||||
# @PARAM: headers (Dict) - Заголовки.
|
||||
# @PARAM: timeout (Optional[int]) - Таймаут.
|
||||
# @RETURN: Dict - Ответ.
|
||||
def _perform_upload(self, url: str, files: Dict, data: Optional[Dict], headers: Dict, timeout: Optional[int]) -> Dict:
|
||||
try:
|
||||
response = self.session.post(url, files=files, data=data or {}, headers=headers, timeout=timeout or self.request_settings["timeout"])
|
||||
response.raise_for_status()
|
||||
# Добавляем логирование для отладки
|
||||
if response.status_code == 200:
|
||||
try:
|
||||
return response.json()
|
||||
except Exception as json_e:
|
||||
self.logger.debug(f"[_perform_upload][Debug] Response is not valid JSON: {response.text[:200]}...")
|
||||
raise SupersetAPIError(f"API error during upload: Response is not valid JSON: {json_e}") from json_e
|
||||
return response.json()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise NetworkError(f"Network error during upload: {e}", url=url) from e
|
||||
# [/DEF:APIClient._perform_upload:Function]
|
||||
|
||||
# [DEF:APIClient.fetch_paginated_count:Function]
|
||||
# @PURPOSE: Получает общее количество элементов для пагинации.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: query_params (Dict) - Параметры запроса.
|
||||
# @PARAM: count_field (str) - Поле с количеством.
|
||||
# @RETURN: int - Количество.
|
||||
def fetch_paginated_count(self, endpoint: str, query_params: Dict, count_field: str = "count") -> int:
|
||||
response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query_params)}))
|
||||
return response_json.get(count_field, 0)
|
||||
# [/DEF:APIClient.fetch_paginated_count:Function]
|
||||
|
||||
# [DEF:APIClient.fetch_paginated_data:Function]
|
||||
# @PURPOSE: Автоматически собирает данные со всех страниц пагинированного эндпоинта.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: pagination_options (Dict[str, Any]) - Опции пагинации.
|
||||
# @RETURN: List[Any] - Список данных.
|
||||
def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
|
||||
base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
|
||||
results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
|
||||
assert page_size and page_size > 0, "'page_size' must be a positive number."
|
||||
|
||||
results = []
|
||||
for page in range((total_count + page_size - 1) // page_size):
|
||||
query = {**base_query, 'page': page}
|
||||
response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
|
||||
results.extend(response_json.get(results_field, []))
|
||||
return results
|
||||
# [/DEF:APIClient.fetch_paginated_data:Function]
|
||||
|
||||
# [/DEF:APIClient:Class]
|
||||
|
||||
# [/DEF:superset_tool.utils.network:Module]
|
||||
# [DEF:backend.core.utils.network:Module]
|
||||
#
|
||||
# @SEMANTICS: network, http, client, api, requests, session, authentication
|
||||
# @PURPOSE: Инкапсулирует низкоуровневую HTTP-логику для взаимодействия с Superset API, включая аутентификацию, управление сессией, retry-логику и обработку ошибок.
|
||||
# @LAYER: Infra
|
||||
# @RELATION: DEPENDS_ON -> backend.src.core.logger
|
||||
# @RELATION: DEPENDS_ON -> requests
|
||||
# @PUBLIC_API: APIClient
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import Optional, Dict, Any, List, Union, cast
|
||||
import json
|
||||
import io
|
||||
from pathlib import Path
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
import urllib3
|
||||
from urllib3.util.retry import Retry
|
||||
from ..logger import logger as app_logger, belief_scope
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:SupersetAPIError:Class]
|
||||
class SupersetAPIError(Exception):
|
||||
def __init__(self, message: str = "Superset API error", **context: Any):
|
||||
self.context = context
|
||||
super().__init__(f"[API_FAILURE] {message} | Context: {self.context}")
|
||||
|
||||
# [DEF:AuthenticationError:Class]
|
||||
class AuthenticationError(SupersetAPIError):
|
||||
def __init__(self, message: str = "Authentication failed", **context: Any):
|
||||
super().__init__(message, type="authentication", **context)
|
||||
|
||||
# [DEF:PermissionDeniedError:Class]
|
||||
class PermissionDeniedError(AuthenticationError):
|
||||
def __init__(self, message: str = "Permission denied", **context: Any):
|
||||
super().__init__(message, **context)
|
||||
|
||||
# [DEF:DashboardNotFoundError:Class]
|
||||
class DashboardNotFoundError(SupersetAPIError):
|
||||
def __init__(self, resource_id: Union[int, str], message: str = "Dashboard not found", **context: Any):
|
||||
super().__init__(f"Dashboard '{resource_id}' {message}", subtype="not_found", resource_id=resource_id, **context)
|
||||
|
||||
# [DEF:NetworkError:Class]
|
||||
class NetworkError(Exception):
|
||||
def __init__(self, message: str = "Network connection failed", **context: Any):
|
||||
self.context = context
|
||||
super().__init__(f"[NETWORK_FAILURE] {message} | Context: {self.context}")
|
||||
|
||||
# [DEF:APIClient:Class]
|
||||
# @PURPOSE: Инкапсулирует HTTP-логику для работы с API, включая сессии, аутентификацию, и обработку запросов.
|
||||
class APIClient:
|
||||
DEFAULT_TIMEOUT = 30
|
||||
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Инициализирует API клиент с конфигурацией, сессией и логгером.
|
||||
# @PARAM: config (Dict[str, Any]) - Конфигурация.
|
||||
# @PARAM: verify_ssl (bool) - Проверять ли SSL.
|
||||
# @PARAM: timeout (int) - Таймаут запросов.
|
||||
# @PRE: config must contain 'base_url' and 'auth'.
|
||||
# @POST: APIClient instance is initialized with a session.
|
||||
def __init__(self, config: Dict[str, Any], verify_ssl: bool = True, timeout: int = DEFAULT_TIMEOUT):
|
||||
with belief_scope("__init__"):
|
||||
app_logger.info("[APIClient.__init__][Entry] Initializing APIClient.")
|
||||
self.base_url: str = config.get("base_url", "")
|
||||
self.auth = config.get("auth")
|
||||
self.request_settings = {"verify_ssl": verify_ssl, "timeout": timeout}
|
||||
self.session = self._init_session()
|
||||
self._tokens: Dict[str, str] = {}
|
||||
self._authenticated = False
|
||||
app_logger.info("[APIClient.__init__][Exit] APIClient initialized.")
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:_init_session:Function]
|
||||
# @PURPOSE: Создает и настраивает `requests.Session` с retry-логикой.
|
||||
# @PRE: self.request_settings must be initialized.
|
||||
# @POST: Returns a configured requests.Session instance.
|
||||
# @RETURN: requests.Session - Настроенная сессия.
|
||||
def _init_session(self) -> requests.Session:
|
||||
with belief_scope("_init_session"):
|
||||
session = requests.Session()
|
||||
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
|
||||
adapter = HTTPAdapter(max_retries=retries)
|
||||
session.mount('http://', adapter)
|
||||
session.mount('https://', adapter)
|
||||
if not self.request_settings["verify_ssl"]:
|
||||
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
|
||||
app_logger.warning("[_init_session][State] SSL verification disabled.")
|
||||
session.verify = self.request_settings["verify_ssl"]
|
||||
return session
|
||||
# [/DEF:_init_session:Function]
|
||||
|
||||
# [DEF:authenticate:Function]
|
||||
# @PURPOSE: Выполняет аутентификацию в Superset API и получает access и CSRF токены.
|
||||
# @PRE: self.auth and self.base_url must be valid.
|
||||
# @POST: `self._tokens` заполнен, `self._authenticated` установлен в `True`.
|
||||
# @RETURN: Dict[str, str] - Словарь с токенами.
|
||||
# @THROW: AuthenticationError, NetworkError - при ошибках.
|
||||
def authenticate(self) -> Dict[str, str]:
|
||||
with belief_scope("authenticate"):
|
||||
app_logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
|
||||
try:
|
||||
login_url = f"{self.base_url}/security/login"
|
||||
response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
|
||||
response.raise_for_status()
|
||||
access_token = response.json()["access_token"]
|
||||
|
||||
csrf_url = f"{self.base_url}/security/csrf_token/"
|
||||
csrf_response = self.session.get(csrf_url, headers={"Authorization": f"Bearer {access_token}"}, timeout=self.request_settings["timeout"])
|
||||
csrf_response.raise_for_status()
|
||||
|
||||
self._tokens = {"access_token": access_token, "csrf_token": csrf_response.json()["result"]}
|
||||
self._authenticated = True
|
||||
app_logger.info("[authenticate][Exit] Authenticated successfully.")
|
||||
return self._tokens
|
||||
except requests.exceptions.HTTPError as e:
|
||||
raise AuthenticationError(f"Authentication failed: {e}") from e
|
||||
except (requests.exceptions.RequestException, KeyError) as e:
|
||||
raise NetworkError(f"Network or parsing error during authentication: {e}") from e
|
||||
# [/DEF:authenticate:Function]
|
||||
|
||||
@property
|
||||
# [DEF:headers:Function]
|
||||
# @PURPOSE: Возвращает HTTP-заголовки для аутентифицированных запросов.
|
||||
# @PRE: APIClient is initialized and authenticated or can be authenticated.
|
||||
# @POST: Returns headers including auth tokens.
|
||||
def headers(self) -> Dict[str, str]:
|
||||
with belief_scope("headers"):
|
||||
if not self._authenticated: self.authenticate()
|
||||
return {
|
||||
"Authorization": f"Bearer {self._tokens['access_token']}",
|
||||
"X-CSRFToken": self._tokens.get("csrf_token", ""),
|
||||
"Referer": self.base_url,
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
# [/DEF:headers:Function]
|
||||
|
||||
# [DEF:request:Function]
|
||||
# @PURPOSE: Выполняет универсальный HTTP-запрос к API.
|
||||
# @PARAM: method (str) - HTTP метод.
|
||||
# @PARAM: endpoint (str) - API эндпоинт.
|
||||
# @PARAM: headers (Optional[Dict]) - Дополнительные заголовки.
|
||||
# @PARAM: raw_response (bool) - Возвращать ли сырой ответ.
|
||||
# @PRE: method and endpoint must be strings.
|
||||
# @POST: Returns response content or raw Response object.
|
||||
# @RETURN: `requests.Response` если `raw_response=True`, иначе `dict`.
|
||||
# @THROW: SupersetAPIError, NetworkError и их подклассы.
|
||||
def request(self, method: str, endpoint: str, headers: Optional[Dict] = None, raw_response: bool = False, **kwargs) -> Union[requests.Response, Dict[str, Any]]:
|
||||
with belief_scope("request"):
|
||||
full_url = f"{self.base_url}{endpoint}"
|
||||
_headers = self.headers.copy()
|
||||
if headers: _headers.update(headers)
|
||||
|
||||
try:
|
||||
response = self.session.request(method, full_url, headers=_headers, **kwargs)
|
||||
response.raise_for_status()
|
||||
return response if raw_response else response.json()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
self._handle_http_error(e, endpoint)
|
||||
except requests.exceptions.RequestException as e:
|
||||
self._handle_network_error(e, full_url)
|
||||
# [/DEF:request:Function]
|
||||
|
||||
# [DEF:_handle_http_error:Function]
|
||||
# @PURPOSE: (Helper) Преобразует HTTP ошибки в кастомные исключения.
|
||||
# @PARAM: e (requests.exceptions.HTTPError) - Ошибка.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PRE: e must be a valid HTTPError with a response.
|
||||
# @POST: Raises a specific SupersetAPIError or subclass.
|
||||
def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
|
||||
with belief_scope("_handle_http_error"):
|
||||
status_code = e.response.status_code
|
||||
if status_code == 404: raise DashboardNotFoundError(endpoint) from e
|
||||
if status_code == 403: raise PermissionDeniedError() from e
|
||||
if status_code == 401: raise AuthenticationError() from e
|
||||
raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
|
||||
# [/DEF:_handle_http_error:Function]
|
||||
|
||||
# [DEF:_handle_network_error:Function]
|
||||
# @PURPOSE: (Helper) Преобразует сетевые ошибки в `NetworkError`.
|
||||
# @PARAM: e (requests.exceptions.RequestException) - Ошибка.
|
||||
# @PARAM: url (str) - URL.
|
||||
# @PRE: e must be a RequestException.
|
||||
# @POST: Raises a NetworkError.
|
||||
def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
|
||||
with belief_scope("_handle_network_error"):
|
||||
if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
|
||||
elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
|
||||
else: msg = f"Unknown network error: {e}"
|
||||
raise NetworkError(msg, url=url) from e
|
||||
# [/DEF:_handle_network_error:Function]
|
||||
|
||||
# [DEF:upload_file:Function]
|
||||
# @PURPOSE: Загружает файл на сервер через multipart/form-data.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: file_info (Dict[str, Any]) - Информация о файле.
|
||||
# @PARAM: extra_data (Optional[Dict]) - Дополнительные данные.
|
||||
# @PARAM: timeout (Optional[int]) - Таймаут.
|
||||
# @PRE: file_info must contain 'file_obj' and 'file_name'.
|
||||
# @POST: File is uploaded and response returned.
|
||||
# @RETURN: Ответ API в виде словаря.
|
||||
# @THROW: SupersetAPIError, NetworkError, TypeError.
|
||||
def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
|
||||
with belief_scope("upload_file"):
|
||||
full_url = f"{self.base_url}{endpoint}"
|
||||
_headers = self.headers.copy(); _headers.pop('Content-Type', None)
|
||||
|
||||
file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")
|
||||
|
||||
files_payload = {}
|
||||
if isinstance(file_obj, (str, Path)):
|
||||
with open(file_obj, 'rb') as f:
|
||||
files_payload = {form_field: (file_name, f.read(), 'application/x-zip-compressed')}
|
||||
elif isinstance(file_obj, io.BytesIO):
|
||||
files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')}
|
||||
else:
|
||||
raise TypeError(f"Unsupported file_obj type: {type(file_obj)}")
|
||||
|
||||
return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
|
||||
# [/DEF:upload_file:Function]
|
||||
|
||||
# [DEF:_perform_upload:Function]
|
||||
# @PURPOSE: (Helper) Выполняет POST запрос с файлом.
|
||||
# @PARAM: url (str) - URL.
|
||||
# @PARAM: files (Dict) - Файлы.
|
||||
# @PARAM: data (Optional[Dict]) - Данные.
|
||||
# @PARAM: headers (Dict) - Заголовки.
|
||||
# @PARAM: timeout (Optional[int]) - Таймаут.
|
||||
# @PRE: url, files, and headers must be provided.
|
||||
# @POST: POST request is performed and JSON response returned.
|
||||
# @RETURN: Dict - Ответ.
|
||||
def _perform_upload(self, url: str, files: Dict, data: Optional[Dict], headers: Dict, timeout: Optional[int]) -> Dict:
|
||||
with belief_scope("_perform_upload"):
|
||||
try:
|
||||
response = self.session.post(url, files=files, data=data or {}, headers=headers, timeout=timeout or self.request_settings["timeout"])
|
||||
response.raise_for_status()
|
||||
if response.status_code == 200:
|
||||
try:
|
||||
return response.json()
|
||||
except Exception as json_e:
|
||||
app_logger.debug(f"[_perform_upload][Debug] Response is not valid JSON: {response.text[:200]}...")
|
||||
raise SupersetAPIError(f"API error during upload: Response is not valid JSON: {json_e}") from json_e
|
||||
return response.json()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise NetworkError(f"Network error during upload: {e}", url=url) from e
|
||||
# [/DEF:_perform_upload:Function]
|
||||
|
||||
# [DEF:fetch_paginated_count:Function]
|
||||
# @PURPOSE: Получает общее количество элементов для пагинации.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: query_params (Dict) - Параметры запроса.
|
||||
# @PARAM: count_field (str) - Поле с количеством.
|
||||
# @PRE: query_params must be a dictionary.
|
||||
# @POST: Returns total count of items.
|
||||
# @RETURN: int - Количество.
|
||||
def fetch_paginated_count(self, endpoint: str, query_params: Dict, count_field: str = "count") -> int:
|
||||
with belief_scope("fetch_paginated_count"):
|
||||
response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query_params)}))
|
||||
return response_json.get(count_field, 0)
|
||||
# [/DEF:fetch_paginated_count:Function]
|
||||
|
||||
# [DEF:fetch_paginated_data:Function]
|
||||
# @PURPOSE: Автоматически собирает данные со всех страниц пагинированного эндпоинта.
|
||||
# @PARAM: endpoint (str) - Эндпоинт.
|
||||
# @PARAM: pagination_options (Dict[str, Any]) - Опции пагинации.
|
||||
# @PRE: pagination_options must contain 'base_query', 'total_count', 'results_field'.
|
||||
# @POST: Returns all items across all pages.
|
||||
# @RETURN: List[Any] - Список данных.
|
||||
def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
|
||||
with belief_scope("fetch_paginated_data"):
|
||||
base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
|
||||
results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
|
||||
assert page_size and page_size > 0, "'page_size' must be a positive number."
|
||||
|
||||
results = []
|
||||
for page in range((total_count + page_size - 1) // page_size):
|
||||
query = {**base_query, 'page': page}
|
||||
response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
|
||||
results.extend(response_json.get(results_field, []))
|
||||
return results
|
||||
# [/DEF:fetch_paginated_data:Function]
|
||||
|
||||
# [/DEF:APIClient:Class]
|
||||
|
||||
# [/DEF:backend.core.utils.network:Module]
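
# Usage sketch (illustrative; the URL, credential payload and Superset response field
# names are assumptions, not taken from this module):
#
#   client = APIClient(config={
#       "base_url": "https://superset.example.com/api/v1",
#       "auth": {"username": "admin", "password": "...", "provider": "db", "refresh": True},
#   }, verify_ssl=False)
#   total = client.fetch_paginated_count("/dashboard/", {"page_size": 100})
#   dashboards = client.fetch_paginated_data("/dashboard/", {
#       "base_query": {"page_size": 100},
#       "total_count": total,
#       "results_field": "result",
#   })
#
# authenticate() is called lazily the first time `headers` is accessed, so no explicit
# login call is required before the first request.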
@@ -10,6 +10,7 @@ from .core.task_manager import TaskManager
from .core.config_manager import ConfigManager
from .core.scheduler import SchedulerService
from .core.database import init_db
from .core.logger import logger, belief_scope

# Initialize singletons
# Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -20,13 +21,20 @@ config_manager = ConfigManager(config_path=str(config_path))
# Initialize database before any other services that might use it
init_db()

# [DEF:get_config_manager:Function]
# @PURPOSE: Dependency injector for the ConfigManager.
# @PRE: Global config_manager must be initialized.
# @POST: Returns shared ConfigManager instance.
# @RETURN: ConfigManager - The shared config manager instance.
def get_config_manager() -> ConfigManager:
    """Dependency injector for the ConfigManager."""
    return config_manager
    with belief_scope("get_config_manager"):
        return config_manager
# [/DEF:get_config_manager:Function]

plugin_dir = Path(__file__).parent / "plugins"

plugin_loader = PluginLoader(plugin_dir=str(plugin_dir))
from .core.logger import logger
logger.info(f"PluginLoader initialized with directory: {plugin_dir}")
logger.info(f"Available plugins: {[config.name for config in plugin_loader.get_all_plugin_configs()]}")

@@ -36,15 +44,37 @@ logger.info("TaskManager initialized")
scheduler_service = SchedulerService(task_manager, config_manager)
logger.info("SchedulerService initialized")

# [DEF:get_plugin_loader:Function]
# @PURPOSE: Dependency injector for the PluginLoader.
# @PRE: Global plugin_loader must be initialized.
# @POST: Returns shared PluginLoader instance.
# @RETURN: PluginLoader - The shared plugin loader instance.
def get_plugin_loader() -> PluginLoader:
    """Dependency injector for the PluginLoader."""
    return plugin_loader
    with belief_scope("get_plugin_loader"):
        return plugin_loader
# [/DEF:get_plugin_loader:Function]

# [DEF:get_task_manager:Function]
# @PURPOSE: Dependency injector for the TaskManager.
# @PRE: Global task_manager must be initialized.
# @POST: Returns shared TaskManager instance.
# @RETURN: TaskManager - The shared task manager instance.
def get_task_manager() -> TaskManager:
    """Dependency injector for the TaskManager."""
    return task_manager
    with belief_scope("get_task_manager"):
        return task_manager
# [/DEF:get_task_manager:Function]

# [DEF:get_scheduler_service:Function]
# @PURPOSE: Dependency injector for the SchedulerService.
# @PRE: Global scheduler_service must be initialized.
# @POST: Returns shared SchedulerService instance.
# @RETURN: SchedulerService - The shared scheduler service instance.
def get_scheduler_service() -> SchedulerService:
    """Dependency injector for the SchedulerService."""
    return scheduler_service
    with belief_scope("get_scheduler_service"):
        return scheduler_service
# [/DEF:get_scheduler_service:Function]

# [/DEF:Dependencies:Module]
|
||||
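A hedged sketch of how these injectors are typically consumed from a FastAPI route. The route path, module layout, and the TaskManager accessor are hypothetical; only the injector names and PluginLoader.get_all_plugin_configs() come from this commit.

from fastapi import APIRouter, Depends
from .dependencies import get_plugin_loader, get_task_manager

router = APIRouter()

@router.get("/api/plugins")
def list_plugins(loader = Depends(get_plugin_loader)):
    # FastAPI calls get_plugin_loader() per request and hands back the shared singleton
    return [cfg.name for cfg in loader.get_all_plugin_configs()]

@router.get("/api/tasks/{task_id}")
def get_task(task_id: str, tm = Depends(get_task_manager)):
    return tm.get_task(task_id)  # assumed accessor; the real TaskManager API may differ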
34
backend/src/models/connection.py
Normal file
@@ -0,0 +1,34 @@
# [DEF:backend.src.models.connection:Module]
#
# @SEMANTICS: database, connection, configuration, sqlalchemy, sqlite
# @PURPOSE: Defines the database schema for external database connection configurations.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> sqlalchemy
#
# @INVARIANT: All primary keys are UUID strings.

# [SECTION: IMPORTS]
from sqlalchemy import Column, String, Integer, DateTime
from sqlalchemy.sql import func
from .mapping import Base
import uuid
# [/SECTION]

# [DEF:ConnectionConfig:Class]
# @PURPOSE: Stores credentials for external databases used for column mapping.
class ConnectionConfig(Base):
__tablename__ = "connection_configs"

id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String, nullable=False)
type = Column(String, nullable=False)  # e.g., "postgres"
host = Column(String, nullable=True)
port = Column(Integer, nullable=True)
database = Column(String, nullable=True)
username = Column(String, nullable=True)
password = Column(String, nullable=True)  # Encrypted/Obfuscated password
created_at = Column(DateTime(timezone=True), server_default=func.now())
updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now())
# [/DEF:ConnectionConfig:Class]

# [/DEF:backend.src.models.connection:Module]
73
backend/src/models/git.py
Normal file
@@ -0,0 +1,73 @@
"""
[DEF:GitModels:Module]
Git-specific SQLAlchemy models for configuration and repository tracking.
@RELATION: specs/011-git-integration-dashboard/data-model.md
"""

import enum
from datetime import datetime
from sqlalchemy import Column, String, Integer, DateTime, Enum, ForeignKey, Boolean
from sqlalchemy.dialects.postgresql import UUID
import uuid
from src.core.database import Base

class GitProvider(str, enum.Enum):
GITHUB = "GITHUB"
GITLAB = "GITLAB"
GITEA = "GITEA"

class GitStatus(str, enum.Enum):
CONNECTED = "CONNECTED"
FAILED = "FAILED"
UNKNOWN = "UNKNOWN"

class SyncStatus(str, enum.Enum):
CLEAN = "CLEAN"
DIRTY = "DIRTY"
CONFLICT = "CONFLICT"

class GitServerConfig(Base):
"""
[DEF:GitServerConfig:Class]
Configuration for a Git server connection.
"""
__tablename__ = "git_server_configs"

id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String(255), nullable=False)
provider = Column(Enum(GitProvider), nullable=False)
url = Column(String(255), nullable=False)
pat = Column(String(255), nullable=False)  # PERSONAL ACCESS TOKEN
default_repository = Column(String(255), nullable=True)
status = Column(Enum(GitStatus), default=GitStatus.UNKNOWN)
last_validated = Column(DateTime, default=datetime.utcnow)

class GitRepository(Base):
"""
[DEF:GitRepository:Class]
Tracking for a local Git repository linked to a dashboard.
"""
__tablename__ = "git_repositories"

id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
dashboard_id = Column(Integer, nullable=False, unique=True)
config_id = Column(String(36), ForeignKey("git_server_configs.id"), nullable=False)
remote_url = Column(String(255), nullable=False)
local_path = Column(String(255), nullable=False)
current_branch = Column(String(255), default="main")
sync_status = Column(Enum(SyncStatus), default=SyncStatus.CLEAN)

class DeploymentEnvironment(Base):
"""
[DEF:DeploymentEnvironment:Class]
Target Superset environments for dashboard deployment.
"""
__tablename__ = "deployment_environments"

id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String(255), nullable=False)
superset_url = Column(String(255), nullable=False)
superset_token = Column(String(255), nullable=False)
is_active = Column(Boolean, default=True)

# [/DEF:GitModels:Module]
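A minimal sketch of registering a Git server with these models, assuming `SessionLocal`, `engine`, and `Base` are exported by src.core.database as elsewhere in the backend; the URL and token are placeholders.

from src.core.database import SessionLocal, engine, Base  # engine export is an assumption
from src.models.git import GitServerConfig, GitProvider, GitStatus

Base.metadata.create_all(bind=engine)  # create git_server_configs and related tables if missing

db = SessionLocal()
try:
    cfg = GitServerConfig(
        name="internal-gitea",
        provider=GitProvider.GITEA,
        url="https://git.example.com",
        pat="<personal-access-token>",
        status=GitStatus.UNKNOWN,
    )
    db.add(cfg)
    db.commit()
finally:
    db.close()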
@@ -27,6 +27,7 @@ class TaskRecord(Base):
finished_at = Column(DateTime(timezone=True), nullable=True)
logs = Column(JSON, nullable=True)  # Store structured logs as JSON
error = Column(String, nullable=True)
result = Column(JSON, nullable=True)
created_at = Column(DateTime(timezone=True), server_default=func.now())
params = Column(JSON, nullable=True)
# [/DEF:TaskRecord:Class]
@@ -11,10 +11,10 @@ from pathlib import Path
from requests.exceptions import RequestException

from ..core.plugin_base import PluginBase
from superset_tool.client import SupersetClient
from superset_tool.exceptions import SupersetAPIError
from superset_tool.utils.logger import SupersetLogger
from superset_tool.utils.fileio import (
from ..core.logger import belief_scope
from ..core.superset_client import SupersetClient
from ..core.utils.network import SupersetAPIError
from ..core.utils.fileio import (
save_and_unpack_dashboard,
archive_exports,
sanitize_filename,
@@ -22,7 +22,6 @@ from superset_tool.utils.fileio import (
remove_empty_directories,
RetentionPolicy
)
from superset_tool.utils.init_clients import setup_clients
from ..dependencies import get_config_manager

# [DEF:BackupPlugin:Class]
@@ -33,24 +32,58 @@ class BackupPlugin(PluginBase):
"""

@property
# [DEF:id:Function]
# @PURPOSE: Returns the unique identifier for the backup plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string ID.
# @RETURN: str - "superset-backup"
def id(self) -> str:
return "superset-backup"
with belief_scope("id"):
return "superset-backup"
# [/DEF:id:Function]

@property
# [DEF:name:Function]
# @PURPOSE: Returns the human-readable name of the backup plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string name.
# @RETURN: str - Plugin name.
def name(self) -> str:
return "Superset Dashboard Backup"
with belief_scope("name"):
return "Superset Dashboard Backup"
# [/DEF:name:Function]

@property
# [DEF:description:Function]
# @PURPOSE: Returns a description of the backup plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string description.
# @RETURN: str - Plugin description.
def description(self) -> str:
return "Backs up all dashboards from a Superset instance."
with belief_scope("description"):
return "Backs up all dashboards from a Superset instance."
# [/DEF:description:Function]

@property
# [DEF:version:Function]
# @PURPOSE: Returns the version of the backup plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string version.
# @RETURN: str - "1.0.0"
def version(self) -> str:
return "1.0.0"
with belief_scope("version"):
return "1.0.0"
# [/DEF:version:Function]

# [DEF:get_schema:Function]
# @PURPOSE: Returns the JSON schema for backup plugin parameters.
# @PRE: Plugin instance exists.
# @POST: Returns dictionary schema.
# @RETURN: Dict[str, Any] - JSON schema.
def get_schema(self) -> Dict[str, Any]:
config_manager = get_config_manager()
envs = [e.name for e in config_manager.get_environments()]
with belief_scope("get_schema"):
config_manager = get_config_manager()
envs = [e.name for e in config_manager.get_environments()]
default_path = config_manager.get_config().settings.backup_path

return {
@@ -71,79 +104,86 @@ class BackupPlugin(PluginBase):
},
"required": ["env", "backup_path"],
}
# [/DEF:get_schema:Function]

# [DEF:execute:Function]
# @PURPOSE: Executes the dashboard backup logic.
# @PARAM: params (Dict[str, Any]) - Backup parameters (env, backup_path).
# @PRE: Target environment must be configured. params must be a dictionary.
# @POST: All dashboards are exported and archived.
async def execute(self, params: Dict[str, Any]):
config_manager = get_config_manager()
env_id = params.get("environment_id")

# Resolve environment name if environment_id is provided
if env_id:
env_config = next((e for e in config_manager.get_environments() if e.id == env_id), None)
if env_config:
params["env"] = env_config.name

env = params.get("env")
if not env:
raise KeyError("env")

backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
backup_path = Path(backup_path_str)

logger = SupersetLogger(log_dir=backup_path / "Logs", console=True)
logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")

try:
with belief_scope("execute"):
config_manager = get_config_manager()
if not config_manager.has_environments():
raise ValueError("No Superset environments configured. Please add an environment in Settings.")
env_id = params.get("environment_id")

# Resolve environment name if environment_id is provided
if env_id:
env_config = next((e for e in config_manager.get_environments() if e.id == env_id), None)
if env_config:
params["env"] = env_config.name

env = params.get("env")
if not env:
raise KeyError("env")

backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
backup_path = Path(backup_path_str)

from ..core.logger import logger as app_logger
app_logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")

try:
config_manager = get_config_manager()
if not config_manager.has_environments():
raise ValueError("No Superset environments configured. Please add an environment in Settings.")

env_config = config_manager.get_environment(env)
if not env_config:
raise ValueError(f"Environment '{env}' not found in configuration.")

clients = setup_clients(logger, custom_envs=config_manager.get_environments())
client = clients.get(env)

if not client:
raise ValueError(f"Environment '{env}' not found in configuration.")

dashboard_count, dashboard_meta = client.get_dashboards()
logger.info(f"[BackupPlugin][Progress] Found {dashboard_count} dashboards to export in {env}.")
client = SupersetClient(env_config)

dashboard_count, dashboard_meta = client.get_dashboards()
app_logger.info(f"[BackupPlugin][Progress] Found {dashboard_count} dashboards to export in {env}.")

if dashboard_count == 0:
logger.info("[BackupPlugin][Exit] No dashboards to back up.")
return
if dashboard_count == 0:
app_logger.info("[BackupPlugin][Exit] No dashboards to back up.")
return

for db in dashboard_meta:
dashboard_id = db.get('id')
dashboard_title = db.get('dashboard_title', 'Unknown Dashboard')
if not dashboard_id:
continue
for db in dashboard_meta:
dashboard_id = db.get('id')
dashboard_title = db.get('dashboard_title', 'Unknown Dashboard')
if not dashboard_id:
continue

try:
dashboard_base_dir_name = sanitize_filename(f"{dashboard_title}")
dashboard_dir = backup_path / env.upper() / dashboard_base_dir_name
dashboard_dir.mkdir(parents=True, exist_ok=True)
try:
dashboard_base_dir_name = sanitize_filename(f"{dashboard_title}")
dashboard_dir = backup_path / env.upper() / dashboard_base_dir_name
dashboard_dir.mkdir(parents=True, exist_ok=True)

zip_content, filename = client.export_dashboard(dashboard_id)
zip_content, filename = client.export_dashboard(dashboard_id)

save_and_unpack_dashboard(
zip_content=zip_content,
original_filename=filename,
output_dir=dashboard_dir,
unpack=False,
logger=logger
)
save_and_unpack_dashboard(
zip_content=zip_content,
original_filename=filename,
output_dir=dashboard_dir,
unpack=False
)

archive_exports(str(dashboard_dir), policy=RetentionPolicy(), logger=logger)
archive_exports(str(dashboard_dir), policy=RetentionPolicy())

except (SupersetAPIError, RequestException, IOError, OSError) as db_error:
logger.error(f"[BackupPlugin][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
continue

consolidate_archive_folders(backup_path / env.upper(), logger=logger)
remove_empty_directories(str(backup_path / env.upper()), logger=logger)
except (SupersetAPIError, RequestException, IOError, OSError) as db_error:
app_logger.error(f"[BackupPlugin][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
continue

consolidate_archive_folders(backup_path / env.upper())
remove_empty_directories(str(backup_path / env.upper()))

logger.info(f"[BackupPlugin][CoherenceCheck:Passed] Backup logic completed for {env}.")
app_logger.info(f"[BackupPlugin][CoherenceCheck:Passed] Backup logic completed for {env}.")

except (RequestException, IOError, KeyError) as e:
logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
raise e
except (RequestException, IOError, KeyError) as e:
app_logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
raise e
# [/DEF:execute:Function]
# [/DEF:BackupPlugin:Class]
# [/DEF:BackupPlugin:Module]
187
backend/src/plugins/debug.py
Normal file
@@ -0,0 +1,187 @@
# [DEF:DebugPluginModule:Module]
# @SEMANTICS: plugin, debug, api, database, superset
# @PURPOSE: Implements a plugin for system diagnostics and debugging Superset API responses.
# @LAYER: Plugins
# @RELATION: Inherits from PluginBase. Uses SupersetClient from core.
# @CONSTRAINT: Must use belief_scope for logging.

# [SECTION: IMPORTS]
from typing import Dict, Any, Optional
from ..core.plugin_base import PluginBase
from ..core.superset_client import SupersetClient
from ..core.logger import logger, belief_scope
# [/SECTION]

# [DEF:DebugPlugin:Class]
# @PURPOSE: Plugin for system diagnostics and debugging.
class DebugPlugin(PluginBase):
"""
Plugin for system diagnostics and debugging.
"""

@property
# [DEF:id:Function]
# @PURPOSE: Returns the unique identifier for the debug plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string ID.
# @RETURN: str - "system-debug"
def id(self) -> str:
with belief_scope("id"):
return "system-debug"
# [/DEF:id:Function]

@property
# [DEF:name:Function]
# @PURPOSE: Returns the human-readable name of the debug plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string name.
# @RETURN: str - Plugin name.
def name(self) -> str:
with belief_scope("name"):
return "System Debug"
# [/DEF:name:Function]

@property
# [DEF:description:Function]
# @PURPOSE: Returns a description of the debug plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string description.
# @RETURN: str - Plugin description.
def description(self) -> str:
with belief_scope("description"):
return "Run system diagnostics and debug Superset API responses."
# [/DEF:description:Function]

@property
# [DEF:version:Function]
# @PURPOSE: Returns the version of the debug plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string version.
# @RETURN: str - "1.0.0"
def version(self) -> str:
with belief_scope("version"):
return "1.0.0"
# [/DEF:version:Function]

# [DEF:get_schema:Function]
# @PURPOSE: Returns the JSON schema for the debug plugin parameters.
# @PRE: Plugin instance exists.
# @POST: Returns dictionary schema.
# @RETURN: Dict[str, Any] - JSON schema.
def get_schema(self) -> Dict[str, Any]:
with belief_scope("get_schema"):
return {
"type": "object",
"properties": {
"action": {
"type": "string",
"title": "Action",
"enum": ["test-db-api", "get-dataset-structure"],
"default": "test-db-api"
},
"env": {
"type": "string",
"title": "Environment",
"description": "The Superset environment (for dataset structure)."
},
"dataset_id": {
"type": "integer",
"title": "Dataset ID",
"description": "The ID of the dataset (for dataset structure)."
},
"source_env": {
"type": "string",
"title": "Source Environment",
"description": "Source env for DB API test."
},
"target_env": {
"type": "string",
"title": "Target Environment",
"description": "Target env for DB API test."
}
},
"required": ["action"]
}
# [/DEF:get_schema:Function]

# [DEF:execute:Function]
# @PURPOSE: Executes the debug logic.
# @PARAM: params (Dict[str, Any]) - Debug parameters.
# @PRE: action must be provided in params.
# @POST: Debug action is executed and results returned.
# @RETURN: Dict[str, Any] - Execution results.
async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
with belief_scope("execute"):
action = params.get("action")

if action == "test-db-api":
return await self._test_db_api(params)
elif action == "get-dataset-structure":
return await self._get_dataset_structure(params)
else:
raise ValueError(f"Unknown action: {action}")
# [/DEF:execute:Function]

# [DEF:_test_db_api:Function]
# @PURPOSE: Tests database API connectivity for source and target environments.
# @PRE: source_env and target_env params exist in params.
# @POST: Returns DB counts for both envs.
# @PARAM: params (Dict) - Plugin parameters.
# @RETURN: Dict - Comparison results.
async def _test_db_api(self, params: Dict[str, Any]) -> Dict[str, Any]:
with belief_scope("_test_db_api"):
source_env_name = params.get("source_env")
target_env_name = params.get("target_env")

if not source_env_name or not target_env_name:
raise ValueError("source_env and target_env are required for test-db-api")

from ..dependencies import get_config_manager
config_manager = get_config_manager()

results = {}
for name in [source_env_name, target_env_name]:
env_config = config_manager.get_environment(name)
if not env_config:
raise ValueError(f"Environment '{name}' not found.")

client = SupersetClient(env_config)
client.authenticate()
count, dbs = client.get_databases()
results[name] = {
"count": count,
"databases": dbs
}

return results
# [/DEF:_test_db_api:Function]

# [DEF:_get_dataset_structure:Function]
# @PURPOSE: Retrieves the structure of a dataset.
# @PRE: env and dataset_id params exist in params.
# @POST: Returns dataset JSON structure.
# @PARAM: params (Dict) - Plugin parameters.
# @RETURN: Dict - Dataset structure.
async def _get_dataset_structure(self, params: Dict[str, Any]) -> Dict[str, Any]:
with belief_scope("_get_dataset_structure"):
env_name = params.get("env")
dataset_id = params.get("dataset_id")

if not env_name or dataset_id is None:
raise ValueError("env and dataset_id are required for get-dataset-structure")

from ..dependencies import get_config_manager
config_manager = get_config_manager()
env_config = config_manager.get_environment(env_name)
if not env_config:
raise ValueError(f"Environment '{env_name}' not found.")

client = SupersetClient(env_config)
client.authenticate()

dataset_response = client.get_dataset(dataset_id)
return dataset_response.get('result') or {}
# [/DEF:_get_dataset_structure:Function]

# [/DEF:DebugPlugin:Class]
# [/DEF:DebugPluginModule:Module]
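A short sketch of the two actions DebugPlugin supports. In the running application these payloads arrive through the task API; calling execute() directly, and the environment names and dataset ID shown here, are illustrative only.

import asyncio
from src.plugins.debug import DebugPlugin

plugin = DebugPlugin()

# Compare database counts between two configured environments
asyncio.run(plugin.execute({"action": "test-db-api", "source_env": "dev", "target_env": "prod"}))

# Dump the raw structure of a single dataset
asyncio.run(plugin.execute({"action": "get-dataset-structure", "env": "dev", "dataset_id": 17}))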
345
backend/src/plugins/git_plugin.py
Normal file
@@ -0,0 +1,345 @@
# [DEF:backend.src.plugins.git_plugin:Module]
#
# @SEMANTICS: git, plugin, dashboard, version_control, sync, deploy
# @PURPOSE: Provides a plugin for versioning and deploying Superset dashboards.
# @LAYER: Plugin
# @RELATION: INHERITS_FROM -> src.core.plugin_base.PluginBase
# @RELATION: USES -> src.services.git_service.GitService
# @RELATION: USES -> src.core.superset_client.SupersetClient
# @RELATION: USES -> src.core.config_manager.ConfigManager
#
# @INVARIANT: All Git operations must go through GitService.
# @CONSTRAINT: The plugin works only with unpacked Superset YAML exports.

# [SECTION: IMPORTS]
import os
import io
import shutil
import zipfile
from pathlib import Path
from typing import Dict, Any, Optional
from src.core.plugin_base import PluginBase
from src.services.git_service import GitService
from src.core.logger import logger, belief_scope
from src.core.config_manager import ConfigManager
from src.core.superset_client import SupersetClient
# [/SECTION]

# [DEF:GitPlugin:Class]
# @PURPOSE: Git Integration plugin implementation for dashboard version control.
class GitPlugin(PluginBase):

# [DEF:__init__:Function]
# @PURPOSE: Initializes the plugin and its dependencies.
# @POST: git_service and config_manager are initialized.
def __init__(self):
with belief_scope("GitPlugin.__init__"):
logger.info("[GitPlugin.__init__][Entry] Initializing GitPlugin.")
self.git_service = GitService()

# Robust config path resolution:
# 1. Try absolute path from src/dependencies.py style if possible
# 2. Try relative paths based on common execution patterns
if os.path.exists("../config.json"):
config_path = "../config.json"
elif os.path.exists("config.json"):
config_path = "config.json"
else:
# Fallback to the one initialized in dependencies if we can import it
try:
from src.dependencies import config_manager
self.config_manager = config_manager
logger.info("[GitPlugin.__init__][Exit] GitPlugin initialized using shared config_manager.")
return
except:
config_path = "config.json"

self.config_manager = ConfigManager(config_path)
logger.info(f"[GitPlugin.__init__][Exit] GitPlugin initialized with {config_path}")
# [/DEF:__init__:Function]

@property
def id(self) -> str:
return "git-integration"

@property
def name(self) -> str:
return "Git Integration"

@property
def description(self) -> str:
return "Version control for Superset dashboards"

@property
def version(self) -> str:
return "0.1.0"

# [DEF:get_schema:Function]
# @PURPOSE: Returns the JSON schema of the parameters used to execute plugin tasks.
# @RETURN: Dict[str, Any] - Parameter schema.
def get_schema(self) -> Dict[str, Any]:
with belief_scope("GitPlugin.get_schema"):
return {
"type": "object",
"properties": {
"operation": {"type": "string", "enum": ["sync", "deploy", "history"]},
"dashboard_id": {"type": "integer"},
"environment_id": {"type": "string"},
"source_env_id": {"type": "string"}
},
"required": ["operation", "dashboard_id"]
}
# [/DEF:get_schema:Function]

# [DEF:initialize:Function]
# @PURPOSE: Performs initial plugin setup.
# @POST: The plugin is ready to execute tasks.
async def initialize(self):
with belief_scope("GitPlugin.initialize"):
logger.info("[GitPlugin.initialize][Action] Initializing Git Integration Plugin logic.")

# [DEF:execute:Function]
# @PURPOSE: Main entry point for executing plugin tasks.
# @PRE: task_data contains 'operation' and 'dashboard_id'.
# @POST: Returns the result of the requested operation.
# @PARAM: task_data (Dict[str, Any]) - Task data.
# @RETURN: Dict[str, Any] - Status and message.
# @RELATION: CALLS -> self._handle_sync
# @RELATION: CALLS -> self._handle_deploy
async def execute(self, task_data: Dict[str, Any]) -> Dict[str, Any]:
with belief_scope("GitPlugin.execute"):
operation = task_data.get("operation")
dashboard_id = task_data.get("dashboard_id")

logger.info(f"[GitPlugin.execute][Entry] Executing operation: {operation} for dashboard {dashboard_id}")

if operation == "sync":
source_env_id = task_data.get("source_env_id")
result = await self._handle_sync(dashboard_id, source_env_id)
elif operation == "deploy":
env_id = task_data.get("environment_id")
result = await self._handle_deploy(dashboard_id, env_id)
elif operation == "history":
result = {"status": "success", "message": "History available via API"}
else:
logger.error(f"[GitPlugin.execute][Coherence:Failed] Unknown operation: {operation}")
raise ValueError(f"Unknown operation: {operation}")

logger.info(f"[GitPlugin.execute][Exit] Operation {operation} completed.")
return result
# [/DEF:execute:Function]

# [DEF:_handle_sync:Function]
# @PURPOSE: Exports a dashboard from Superset and unpacks it into the Git repository.
# @PRE: A repository for the dashboard must already exist.
# @POST: Files in the repository are updated to the dashboard's current state in Superset.
# @PARAM: dashboard_id (int) - Dashboard ID.
# @PARAM: source_env_id (Optional[str]) - Source environment ID.
# @RETURN: Dict[str, str] - Sync result.
# @SIDE_EFFECT: Modifies files in the repository's local working directory.
# @RELATION: CALLS -> src.services.git_service.GitService.get_repo
# @RELATION: CALLS -> src.core.superset_client.SupersetClient.export_dashboard
async def _handle_sync(self, dashboard_id: int, source_env_id: Optional[str] = None) -> Dict[str, str]:
with belief_scope("GitPlugin._handle_sync"):
try:
# 1. Fetch the repository
repo = self.git_service.get_repo(dashboard_id)
repo_path = Path(repo.working_dir)
logger.info(f"[_handle_sync][Action] Target repo path: {repo_path}")

# 2. Set up the Superset client
env = self._get_env(source_env_id)
client = SupersetClient(env)
client.authenticate()

# 3. Export the dashboard
logger.info(f"[_handle_sync][Action] Exporting dashboard {dashboard_id} from {env.name}")
zip_bytes, _ = client.export_dashboard(dashboard_id)

# 4. Unpack, flattening the directory structure
logger.info(f"[_handle_sync][Action] Unpacking export to {repo_path}")

# Folders/files we expect Superset to produce
managed_dirs = ["dashboards", "charts", "datasets", "databases"]
managed_files = ["metadata.yaml"]

# Clean up old data before unpacking so no "ghost" files remain
for d in managed_dirs:
d_path = repo_path / d
if d_path.exists() and d_path.is_dir():
shutil.rmtree(d_path)
for f in managed_files:
f_path = repo_path / f
if f_path.exists():
f_path.unlink()

with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
# Superset exports everything into a dashboard_export_timestamp/ subfolder;
# we need to detect that folder name
namelist = zf.namelist()
if not namelist:
raise ValueError("Export ZIP is empty")

root_folder = namelist[0].split('/')[0]
logger.info(f"[_handle_sync][Action] Detected root folder in ZIP: {root_folder}")

for member in zf.infolist():
if member.filename.startswith(root_folder + "/") and len(member.filename) > len(root_folder) + 1:
# Strip the folder prefix
relative_path = member.filename[len(root_folder)+1:]
target_path = repo_path / relative_path

if member.is_dir():
target_path.mkdir(parents=True, exist_ok=True)
else:
target_path.parent.mkdir(parents=True, exist_ok=True)
with zf.open(member) as source, open(target_path, "wb") as target:
shutil.copyfileobj(source, target)

# 5. Automatically stage the changes (no commit, so the user can review the diff)
try:
repo.git.add(A=True)
logger.info(f"[_handle_sync][Action] Changes staged in git")
except Exception as ge:
logger.warning(f"[_handle_sync][Action] Failed to stage changes: {ge}")

logger.info(f"[_handle_sync][Coherence:OK] Dashboard {dashboard_id} synced successfully.")
return {"status": "success", "message": "Dashboard synced and flattened in local repository"}

except Exception as e:
logger.error(f"[_handle_sync][Coherence:Failed] Sync failed: {e}")
raise
# [/DEF:_handle_sync:Function]

# [DEF:_handle_deploy:Function]
# @PURPOSE: Packs the repository into a ZIP and imports it into the target Superset environment.
# @PRE: environment_id must refer to a configured environment.
# @POST: The dashboard is imported into the target Superset.
# @PARAM: dashboard_id (int) - Dashboard ID.
# @PARAM: env_id (str) - Target environment ID.
# @RETURN: Dict[str, Any] - Deployment result.
# @SIDE_EFFECT: Creates and removes a temporary ZIP file.
# @RELATION: CALLS -> src.core.superset_client.SupersetClient.import_dashboard
async def _handle_deploy(self, dashboard_id: int, env_id: str) -> Dict[str, Any]:
with belief_scope("GitPlugin._handle_deploy"):
try:
if not env_id:
raise ValueError("Target environment ID required for deployment")

# 1. Fetch the repository
repo = self.git_service.get_repo(dashboard_id)
repo_path = Path(repo.working_dir)

# 2. Pack into a ZIP
logger.info(f"[_handle_deploy][Action] Packing repository {repo_path} for deployment.")
zip_buffer = io.BytesIO()

# Superset expects a root directory in the ZIP (e.g., dashboard_export_20240101T000000/)
root_dir_name = f"dashboard_export_{dashboard_id}"

with zipfile.ZipFile(zip_buffer, "w", zipfile.ZIP_DEFLATED) as zf:
for root, dirs, files in os.walk(repo_path):
if ".git" in dirs:
dirs.remove(".git")
for file in files:
if file == ".git" or file.endswith(".zip"): continue
file_path = Path(root) / file
# Prepend the root directory name to the archive path
arcname = Path(root_dir_name) / file_path.relative_to(repo_path)
zf.write(file_path, arcname)

zip_buffer.seek(0)

# 3. Set up the Superset client
env = self.config_manager.get_environment(env_id)
if not env:
raise ValueError(f"Environment {env_id} not found")

client = SupersetClient(env)
client.authenticate()

# 4. Import
temp_zip_path = repo_path / f"deploy_{dashboard_id}.zip"
logger.info(f"[_handle_deploy][Action] Saving temporary zip to {temp_zip_path}")
with open(temp_zip_path, "wb") as f:
f.write(zip_buffer.getvalue())

try:
logger.info(f"[_handle_deploy][Action] Importing dashboard to {env.name}")
result = client.import_dashboard(temp_zip_path)
logger.info(f"[_handle_deploy][Coherence:OK] Deployment successful for dashboard {dashboard_id}.")
return {"status": "success", "message": f"Dashboard deployed to {env.name}", "details": result}
finally:
if temp_zip_path.exists():
os.remove(temp_zip_path)

except Exception as e:
logger.error(f"[_handle_deploy][Coherence:Failed] Deployment failed: {e}")
raise
# [/DEF:_handle_deploy:Function]

# [DEF:_get_env:Function]
# @PURPOSE: Helper method for resolving an environment configuration.
# @PARAM: env_id (Optional[str]) - Environment ID.
# @RETURN: Environment - Environment configuration object.
def _get_env(self, env_id: Optional[str] = None):
with belief_scope("GitPlugin._get_env"):
logger.info(f"[_get_env][Entry] Fetching environment for ID: {env_id}")

# Priority 1: ConfigManager (config.json)
if env_id:
env = self.config_manager.get_environment(env_id)
if env:
logger.info(f"[_get_env][Exit] Found environment by ID in ConfigManager: {env.name}")
return env

# Priority 2: Database (DeploymentEnvironment)
from src.core.database import SessionLocal
from src.models.git import DeploymentEnvironment

db = SessionLocal()
try:
if env_id:
db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.id == env_id).first()
else:
# If no ID, try to find active or any environment in DB
db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.is_active == True).first()
if not db_env:
db_env = db.query(DeploymentEnvironment).first()

if db_env:
logger.info(f"[_get_env][Exit] Found environment in DB: {db_env.name}")
from src.core.config_models import Environment
# Use token as password for SupersetClient
return Environment(
id=db_env.id,
name=db_env.name,
url=db_env.superset_url,
username="admin",
password=db_env.superset_token,
verify_ssl=True
)
finally:
db.close()

# Priority 3: ConfigManager Default (if no env_id provided)
envs = self.config_manager.get_environments()
if envs:
if env_id:
# If env_id was provided but not found in DB or specifically by ID in config,
# but we have other envs, maybe it's one of them?
env = next((e for e in envs if e.id == env_id), None)
if env:
logger.info(f"[_get_env][Exit] Found environment {env_id} in ConfigManager list")
return env

if not env_id:
logger.info(f"[_get_env][Exit] Using first environment from ConfigManager: {envs[0].name}")
return envs[0]

logger.error(f"[_get_env][Coherence:Failed] No environments configured (searched config.json and DB). env_id={env_id}")
raise ValueError("No environments configured. Please add a Superset Environment in Settings.")
# [/DEF:_get_env:Function]

# [/DEF:GitPlugin:Class]
# [/DEF:backend.src.plugins.git_plugin:Module]
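An illustrative invocation of GitPlugin only; in the application these payloads normally arrive through the TaskManager rather than a direct call, and the dashboard ID and environment IDs below are placeholders.

import asyncio
from src.plugins.git_plugin import GitPlugin

plugin = GitPlugin()

# Pull the latest export of dashboard 42 into its local repository (staged, not committed)
asyncio.run(plugin.execute({"operation": "sync", "dashboard_id": 42, "source_env_id": "dev"}))

# Push the repository contents to a target Superset environment
asyncio.run(plugin.execute({"operation": "deploy", "dashboard_id": 42, "environment_id": "prod"}))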
195
backend/src/plugins/mapper.py
Normal file
@@ -0,0 +1,195 @@
# [DEF:MapperPluginModule:Module]
# @SEMANTICS: plugin, mapper, datasets, postgresql, excel
# @PURPOSE: Implements a plugin for mapping dataset columns using external database connections or Excel files.
# @LAYER: Plugins
# @RELATION: Inherits from PluginBase. Uses DatasetMapper from superset_tool.
# @CONSTRAINT: Must use belief_scope for logging.

# [SECTION: IMPORTS]
from typing import Dict, Any, Optional
from ..core.plugin_base import PluginBase
from ..core.superset_client import SupersetClient
from ..core.logger import logger, belief_scope
from ..core.database import SessionLocal
from ..models.connection import ConnectionConfig
from ..core.utils.dataset_mapper import DatasetMapper
# [/SECTION]

# [DEF:MapperPlugin:Class]
# @PURPOSE: Plugin for mapping dataset column verbose names.
class MapperPlugin(PluginBase):
"""
Plugin for mapping dataset column verbose names.
"""

@property
# [DEF:id:Function]
# @PURPOSE: Returns the unique identifier for the mapper plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string ID.
# @RETURN: str - "dataset-mapper"
def id(self) -> str:
with belief_scope("id"):
return "dataset-mapper"
# [/DEF:id:Function]

@property
# [DEF:name:Function]
# @PURPOSE: Returns the human-readable name of the mapper plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string name.
# @RETURN: str - Plugin name.
def name(self) -> str:
with belief_scope("name"):
return "Dataset Mapper"
# [/DEF:name:Function]

@property
# [DEF:description:Function]
# @PURPOSE: Returns a description of the mapper plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string description.
# @RETURN: str - Plugin description.
def description(self) -> str:
with belief_scope("description"):
return "Map dataset column verbose names using PostgreSQL comments or Excel files."
# [/DEF:description:Function]

@property
# [DEF:version:Function]
# @PURPOSE: Returns the version of the mapper plugin.
# @PRE: Plugin instance exists.
# @POST: Returns string version.
# @RETURN: str - "1.0.0"
def version(self) -> str:
with belief_scope("version"):
return "1.0.0"
# [/DEF:version:Function]

# [DEF:get_schema:Function]
# @PURPOSE: Returns the JSON schema for the mapper plugin parameters.
# @PRE: Plugin instance exists.
# @POST: Returns dictionary schema.
# @RETURN: Dict[str, Any] - JSON schema.
def get_schema(self) -> Dict[str, Any]:
with belief_scope("get_schema"):
return {
"type": "object",
"properties": {
"env": {
"type": "string",
"title": "Environment",
"description": "The Superset environment (e.g., 'dev')."
},
"dataset_id": {
"type": "integer",
"title": "Dataset ID",
"description": "The ID of the dataset to update."
},
"source": {
"type": "string",
"title": "Mapping Source",
"enum": ["postgres", "excel"],
"default": "postgres"
},
"connection_id": {
"type": "string",
"title": "Saved Connection",
"description": "The ID of a saved database connection (for postgres source)."
},
"table_name": {
"type": "string",
"title": "Table Name",
"description": "Target table name in PostgreSQL."
},
"table_schema": {
"type": "string",
"title": "Table Schema",
"description": "Target table schema in PostgreSQL.",
"default": "public"
},
"excel_path": {
"type": "string",
"title": "Excel Path",
"description": "Path to the Excel file (for excel source)."
}
},
"required": ["env", "dataset_id", "source"]
}
# [/DEF:get_schema:Function]

# [DEF:execute:Function]
# @PURPOSE: Executes the dataset mapping logic.
# @PARAM: params (Dict[str, Any]) - Mapping parameters.
# @PRE: Params contain valid 'env', 'dataset_id', and 'source'. params must be a dictionary.
# @POST: Updates the dataset in Superset.
# @RETURN: Dict[str, Any] - Execution status.
async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
with belief_scope("execute"):
env_name = params.get("env")
dataset_id = params.get("dataset_id")
source = params.get("source")

if not env_name or dataset_id is None or not source:
logger.error("[MapperPlugin.execute][State] Missing required parameters.")
raise ValueError("Missing required parameters: env, dataset_id, source")

# Get config and initialize client
from ..dependencies import get_config_manager
config_manager = get_config_manager()
env_config = config_manager.get_environment(env_name)
if not env_config:
logger.error(f"[MapperPlugin.execute][State] Environment '{env_name}' not found.")
raise ValueError(f"Environment '{env_name}' not found in configuration.")

client = SupersetClient(env_config)
client.authenticate()

postgres_config = None
if source == "postgres":
connection_id = params.get("connection_id")
if not connection_id:
logger.error("[MapperPlugin.execute][State] connection_id is required for postgres source.")
raise ValueError("connection_id is required for postgres source.")

# Load connection from DB
db = SessionLocal()
try:
conn_config = db.query(ConnectionConfig).filter(ConnectionConfig.id == connection_id).first()
if not conn_config:
logger.error(f"[MapperPlugin.execute][State] Connection {connection_id} not found.")
raise ValueError(f"Connection {connection_id} not found.")

postgres_config = {
'dbname': conn_config.database,
'user': conn_config.username,
'password': conn_config.password,
'host': conn_config.host,
'port': str(conn_config.port) if conn_config.port else '5432'
}
finally:
db.close()

logger.info(f"[MapperPlugin.execute][Action] Starting mapping for dataset {dataset_id} in {env_name}")

mapper = DatasetMapper()

try:
mapper.run_mapping(
superset_client=client,
dataset_id=dataset_id,
source=source,
postgres_config=postgres_config,
excel_path=params.get("excel_path"),
table_name=params.get("table_name"),
table_schema=params.get("table_schema") or "public"
)
logger.info(f"[MapperPlugin.execute][Success] Mapping completed for dataset {dataset_id}")
return {"status": "success", "dataset_id": dataset_id}
except Exception as e:
logger.error(f"[MapperPlugin.execute][Failure] Mapping failed: {e}")
raise
# [/DEF:execute:Function]

# [/DEF:MapperPlugin:Class]
# [/DEF:MapperPluginModule:Module]
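A hedged example of the parameter shape MapperPlugin.execute() expects; the environment name, dataset ID, and file path are placeholders, and a direct call is shown only for illustration.

import asyncio
from src.plugins.mapper import MapperPlugin

params = {
    "env": "dev",
    "dataset_id": 123,
    "source": "excel",  # or "postgres" together with a saved connection_id
    "excel_path": "/data/mappings/sales_columns.xlsx",
}
asyncio.run(MapperPlugin().execute(params))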
@@ -12,11 +12,10 @@ import zipfile
import re

from ..core.plugin_base import PluginBase
from superset_tool.client import SupersetClient
from superset_tool.utils.init_clients import setup_clients
from superset_tool.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
from ..core.logger import belief_scope
from ..core.superset_client import SupersetClient
from ..core.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
from ..dependencies import get_config_manager
from superset_tool.utils.logger import SupersetLogger
from ..core.migration_engine import MigrationEngine
from ..core.database import SessionLocal
from ..models.mapping import DatabaseMapping, Environment
@@ -29,23 +28,57 @@ class MigrationPlugin(PluginBase):
"""

@property
# [DEF:id:Function]
# @PURPOSE: Returns the unique identifier for the migration plugin.
# @PRE: None.
# @POST: Returns "superset-migration".
# @RETURN: str - "superset-migration"
def id(self) -> str:
return "superset-migration"
with belief_scope("id"):
return "superset-migration"
# [/DEF:id:Function]

@property
# [DEF:name:Function]
# @PURPOSE: Returns the human-readable name of the migration plugin.
# @PRE: None.
# @POST: Returns the plugin name.
# @RETURN: str - Plugin name.
def name(self) -> str:
return "Superset Dashboard Migration"
with belief_scope("name"):
return "Superset Dashboard Migration"
# [/DEF:name:Function]

@property
# [DEF:description:Function]
# @PURPOSE: Returns a description of the migration plugin.
# @PRE: None.
# @POST: Returns the plugin description.
# @RETURN: str - Plugin description.
def description(self) -> str:
return "Migrates dashboards between Superset environments."
with belief_scope("description"):
return "Migrates dashboards between Superset environments."
# [/DEF:description:Function]

@property
# [DEF:version:Function]
# @PURPOSE: Returns the version of the migration plugin.
# @PRE: None.
# @POST: Returns "1.0.0".
# @RETURN: str - "1.0.0"
def version(self) -> str:
return "1.0.0"
with belief_scope("version"):
return "1.0.0"
# [/DEF:version:Function]

# [DEF:get_schema:Function]
# @PURPOSE: Returns the JSON schema for migration plugin parameters.
# @PRE: Config manager is available.
# @POST: Returns a valid JSON schema dictionary.
# @RETURN: Dict[str, Any] - JSON schema.
def get_schema(self) -> Dict[str, Any]:
config_manager = get_config_manager()
with belief_scope("get_schema"):
config_manager = get_config_manager()
envs = [e.name for e in config_manager.get_environments()]

return {
@@ -87,11 +120,18 @@ class MigrationPlugin(PluginBase):
},
"required": ["from_env", "to_env", "dashboard_regex"],
}
# [/DEF:get_schema:Function]

# [DEF:execute:Function]
# @PURPOSE: Executes the dashboard migration logic.
# @PARAM: params (Dict[str, Any]) - Migration parameters.
# @PRE: Source and target environments must be configured.
# @POST: Selected dashboards are migrated.
async def execute(self, params: Dict[str, Any]):
source_env_id = params.get("source_env_id")
target_env_id = params.get("target_env_id")
selected_ids = params.get("selected_ids")
with belief_scope("MigrationPlugin.execute"):
source_env_id = params.get("source_env_id")
target_env_id = params.get("target_env_id")
selected_ids = params.get("selected_ids")

# Legacy support or alternative params
from_env_name = params.get("from_env")
@@ -108,30 +148,78 @@ class MigrationPlugin(PluginBase):
from ..dependencies import get_task_manager
tm = get_task_manager()

class TaskLoggerProxy(SupersetLogger):
class TaskLoggerProxy:
# [DEF:__init__:Function]
# @PURPOSE: Initializes the proxy logger.
# @PRE: None.
# @POST: Instance is initialized.
def __init__(self):
# Initialize parent with dummy values since we override methods
super().__init__(console=False)
with belief_scope("__init__"):
# Initialize parent with dummy values since we override methods
pass
# [/DEF:__init__:Function]

# [DEF:debug:Function]
# @PURPOSE: Logs a debug message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def debug(self, msg, *args, extra=None, **kwargs):
if task_id: tm._add_log(task_id, "DEBUG", msg, extra or {})
with belief_scope("debug"):
if task_id: tm._add_log(task_id, "DEBUG", msg, extra or {})
# [/DEF:debug:Function]

# [DEF:info:Function]
# @PURPOSE: Logs an info message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def info(self, msg, *args, extra=None, **kwargs):
if task_id: tm._add_log(task_id, "INFO", msg, extra or {})
with belief_scope("info"):
if task_id: tm._add_log(task_id, "INFO", msg, extra or {})
# [/DEF:info:Function]

# [DEF:warning:Function]
# @PURPOSE: Logs a warning message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def warning(self, msg, *args, extra=None, **kwargs):
if task_id: tm._add_log(task_id, "WARNING", msg, extra or {})
with belief_scope("warning"):
if task_id: tm._add_log(task_id, "WARNING", msg, extra or {})
# [/DEF:warning:Function]

# [DEF:error:Function]
# @PURPOSE: Logs an error message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def error(self, msg, *args, extra=None, **kwargs):
if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
with belief_scope("error"):
if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
# [/DEF:error:Function]

# [DEF:critical:Function]
# @PURPOSE: Logs a critical message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def critical(self, msg, *args, extra=None, **kwargs):
if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
with belief_scope("critical"):
if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
# [/DEF:critical:Function]

# [DEF:exception:Function]
# @PURPOSE: Logs an exception message to the task manager.
# @PRE: msg is a string.
# @POST: Log is added to task manager if task_id exists.
def exception(self, msg, *args, **kwargs):
if task_id: tm._add_log(task_id, "ERROR", msg, {"exception": True})
with belief_scope("exception"):
if task_id: tm._add_log(task_id, "ERROR", msg, {"exception": True})
# [/DEF:exception:Function]

logger = TaskLoggerProxy()
logger.info(f"[MigrationPlugin][Entry] Starting migration task.")
logger.info(f"[MigrationPlugin][Action] Params: {params}")

try:
config_manager = get_config_manager()
with belief_scope("execute"):
config_manager = get_config_manager()
environments = config_manager.get_environments()

# Resolve environments
@@ -156,9 +244,8 @@ class MigrationPlugin(PluginBase):

logger.info(f"[MigrationPlugin][State] Resolved environments: {from_env_name} -> {to_env_name}")

all_clients = setup_clients(logger, custom_envs=environments)
from_c = all_clients.get(from_env_name)
to_c = all_clients.get(to_env_name)
from_c = SupersetClient(src_env)
to_c = SupersetClient(tgt_env)

if not from_c or not to_c:
raise ValueError(f"Clients not initialized for environments: {from_env_name}, {to_env_name}")
@@ -289,12 +376,12 @@ class MigrationPlugin(PluginBase):
continue

logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)
# [/DEF:MigrationPlugin.execute:Action]

logger.info("[MigrationPlugin][Exit] Migration finished.")

logger.info("[MigrationPlugin][Exit] Migration finished.")
except Exception as e:
logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
raise e
# [/DEF:MigrationPlugin.execute:Action]
# [/DEF:execute:Function]
# [/DEF:MigrationPlugin:Class]
# [/DEF:MigrationPlugin:Module]
202
backend/src/plugins/search.py
Normal file
202
backend/src/plugins/search.py
Normal file
@@ -0,0 +1,202 @@
|
||||
# [DEF:SearchPluginModule:Module]
|
||||
# @SEMANTICS: plugin, search, datasets, regex, superset
|
||||
# @PURPOSE: Implements a plugin for searching text patterns across all datasets in a specific Superset environment.
|
||||
# @LAYER: Plugins
|
||||
# @RELATION: Inherits from PluginBase. Uses SupersetClient from core.
|
||||
# @CONSTRAINT: Must use belief_scope for logging.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
import re
|
||||
from typing import Dict, Any, List, Optional
|
||||
from ..core.plugin_base import PluginBase
|
||||
from ..core.superset_client import SupersetClient
|
||||
from ..core.logger import logger, belief_scope
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:SearchPlugin:Class]
|
||||
# @PURPOSE: Plugin for searching text patterns in Superset datasets.
|
||||
class SearchPlugin(PluginBase):
|
||||
"""
|
||||
Plugin for searching text patterns in Superset datasets.
|
||||
"""
|
||||
|
||||
@property
|
||||
# [DEF:id:Function]
|
||||
# @PURPOSE: Returns the unique identifier for the search plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string ID.
|
||||
# @RETURN: str - "search-datasets"
|
||||
def id(self) -> str:
|
||||
with belief_scope("id"):
|
||||
return "search-datasets"
|
||||
# [/DEF:id:Function]
|
||||
|
||||
@property
|
||||
# [DEF:name:Function]
|
||||
# @PURPOSE: Returns the human-readable name of the search plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string name.
|
||||
# @RETURN: str - Plugin name.
|
||||
def name(self) -> str:
|
||||
with belief_scope("name"):
|
||||
return "Search Datasets"
|
||||
# [/DEF:name:Function]
|
||||
|
||||
@property
|
||||
# [DEF:description:Function]
|
||||
# @PURPOSE: Returns a description of the search plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string description.
|
||||
# @RETURN: str - Plugin description.
|
||||
def description(self) -> str:
|
||||
with belief_scope("description"):
|
||||
return "Search for text patterns across all datasets in a specific environment."
|
||||
# [/DEF:description:Function]
|
||||
|
||||
@property
|
||||
# [DEF:version:Function]
|
||||
# @PURPOSE: Returns the version of the search plugin.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns string version.
|
||||
# @RETURN: str - "1.0.0"
|
||||
def version(self) -> str:
|
||||
with belief_scope("version"):
|
||||
return "1.0.0"
|
||||
# [/DEF:version:Function]
|
||||
|
||||
# [DEF:get_schema:Function]
|
||||
# @PURPOSE: Returns the JSON schema for the search plugin parameters.
|
||||
# @PRE: Plugin instance exists.
|
||||
# @POST: Returns dictionary schema.
|
||||
# @RETURN: Dict[str, Any] - JSON schema.
|
||||
def get_schema(self) -> Dict[str, Any]:
|
||||
with belief_scope("get_schema"):
|
||||
return {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"env": {
|
||||
"type": "string",
|
||||
"title": "Environment",
|
||||
"description": "The Superset environment to search in (e.g., 'dev', 'prod')."
|
||||
},
|
||||
"query": {
|
||||
"type": "string",
|
||||
"title": "Search Query (Regex)",
|
||||
"description": "The regex pattern to search for."
|
||||
}
|
||||
},
|
||||
"required": ["env", "query"]
|
||||
}
|
||||
# [/DEF:get_schema:Function]
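
# Illustrative example only: a params payload that satisfies this schema would be
# {"env": "dev", "query": "dm_view\\.\\w+"} -- both keys are required, and "query"
# must be a valid regular expression.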
|
||||
|
||||
# [DEF:execute:Function]
|
||||
# @PURPOSE: Executes the dataset search logic.
|
||||
# @PARAM: params (Dict[str, Any]) - Search parameters.
|
||||
# @PRE: Params contain valid 'env' and 'query'.
|
||||
# @POST: Returns a dictionary with count and results list.
|
||||
# @RETURN: Dict[str, Any] - Search results.
|
||||
async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
|
||||
with belief_scope("SearchPlugin.execute", f"params={params}"):
|
||||
env_name = params.get("env")
|
||||
search_query = params.get("query")
|
||||
|
||||
if not env_name or not search_query:
|
||||
logger.error("[SearchPlugin.execute][State] Missing required parameters.")
|
||||
raise ValueError("Missing required parameters: env, query")
|
||||
|
||||
# Get config and initialize client
|
||||
from ..dependencies import get_config_manager
|
||||
config_manager = get_config_manager()
|
||||
env_config = config_manager.get_environment(env_name)
|
||||
if not env_config:
|
||||
logger.error(f"[SearchPlugin.execute][State] Environment '{env_name}' not found.")
|
||||
raise ValueError(f"Environment '{env_name}' not found in configuration.")
|
||||
|
||||
client = SupersetClient(env_config)
|
||||
client.authenticate()
|
||||
|
||||
logger.info(f"[SearchPlugin.execute][Action] Searching for pattern: '{search_query}' in environment: {env_name}")
|
||||
|
||||
try:
|
||||
# Ported logic from search_script.py
|
||||
_, datasets = client.get_datasets(query={"columns": ["id", "table_name", "sql", "database", "columns"]})
|
||||
|
||||
if not datasets:
|
||||
logger.warning("[SearchPlugin.execute][State] No datasets found.")
|
||||
return {"count": 0, "results": []}
|
||||
|
||||
pattern = re.compile(search_query, re.IGNORECASE)
|
||||
results = []
|
||||
|
||||
for dataset in datasets:
|
||||
dataset_id = dataset.get('id')
|
||||
dataset_name = dataset.get('table_name', 'Unknown')
|
||||
if not dataset_id:
|
||||
continue
|
||||
|
||||
for field, value in dataset.items():
|
||||
value_str = str(value)
|
||||
match_obj = pattern.search(value_str)
if match_obj:
results.append({
|
||||
"dataset_id": dataset_id,
|
||||
"dataset_name": dataset_name,
|
||||
"field": field,
|
||||
"match_context": self._get_context(value_str, match_obj.group() if match_obj else ""),
|
||||
"full_value": value_str
|
||||
})
|
||||
|
||||
logger.info(f"[SearchPlugin.execute][Success] Found matches in {len(results)} locations.")
|
||||
return {
|
||||
"count": len(results),
|
||||
"results": results
|
||||
}
|
||||
|
||||
except re.error as e:
|
||||
logger.error(f"[SearchPlugin.execute][Failure] Invalid regex pattern: {e}")
|
||||
raise ValueError(f"Invalid regex pattern: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"[SearchPlugin.execute][Failure] Error during search: {e}")
|
||||
raise
|
||||
# [/DEF:execute:Function]
|
||||
|
||||
# [DEF:_get_context:Function]
|
||||
# @PURPOSE: Extracts a small context around the match for display.
|
||||
# @PARAM: text (str) - The full text to extract context from.
|
||||
# @PARAM: match_text (str) - The matched text pattern.
|
||||
# @PARAM: context_lines (int) - Number of lines of context to include.
|
||||
# @PRE: text and match_text must be strings.
|
||||
# @POST: Returns context string.
|
||||
# @RETURN: str - Extracted context.
|
||||
def _get_context(self, text: str, match_text: str, context_lines: int = 1) -> str:
|
||||
"""
|
||||
Extracts a small context around the match for display.
|
||||
"""
|
||||
with belief_scope("_get_context"):
|
||||
if not match_text:
|
||||
return text[:100] + "..." if len(text) > 100 else text
|
||||
|
||||
lines = text.splitlines()
|
||||
match_line_index = -1
|
||||
for i, line in enumerate(lines):
|
||||
if match_text in line:
|
||||
match_line_index = i
|
||||
break
|
||||
|
||||
if match_line_index != -1:
|
||||
start = max(0, match_line_index - context_lines)
|
||||
end = min(len(lines), match_line_index + context_lines + 1)
|
||||
context = []
|
||||
for i in range(start, end):
|
||||
line_content = lines[i]
|
||||
if i == match_line_index:
|
||||
context.append(f"==> {line_content}")
|
||||
else:
|
||||
context.append(f" {line_content}")
|
||||
return "\n".join(context)
|
||||
|
||||
return text[:100] + "..." if len(text) > 100 else text
|
||||
# [/DEF:_get_context:Function]
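
# Illustrative example only: for a match on the middle line of a three-line SQL value,
# _get_context returns the neighbouring lines with the matched line marked, e.g.:
#     SELECT id,
# ==>     revenue_usd,
#         dt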
|
||||
|
||||
# [/DEF:SearchPlugin:Class]
|
||||
# [/DEF:SearchPluginModule:Module]
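
# A minimal usage sketch (illustrative only): the import path, the event-loop wiring and
# the "dev" environment name are assumptions, not part of this change. The plugin can be
# exercised directly by awaiting execute() with a params dict that matches get_schema().
import asyncio
from src.plugins.search import SearchPlugin  # assumed import path

async def run_search():
    plugin = SearchPlugin()
    # Both "env" and "query" are required; "dev" must exist in config.json.
    result = await plugin.execute({"env": "dev", "query": r"dm_view\.\w+"})
    print(f"Found {result['count']} matches")
    for hit in result["results"][:5]:
        print(hit["dataset_name"], hit["field"])

asyncio.run(run_search())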
|
||||
380
backend/src/services/git_service.py
Normal file
@@ -0,0 +1,380 @@
|
||||
# [DEF:backend.src.services.git_service:Module]
|
||||
#
|
||||
# @SEMANTICS: git, service, gitpython, repository, version_control
|
||||
# @PURPOSE: Core Git logic using GitPython to manage dashboard repositories.
|
||||
# @LAYER: Service
|
||||
# @RELATION: INHERITS_FROM -> None
|
||||
# @RELATION: USED_BY -> src.api.routes.git
|
||||
# @RELATION: USED_BY -> src.plugins.git_plugin
|
||||
#
|
||||
# @INVARIANT: All Git operations must be performed on a valid local directory.
|
||||
|
||||
import os
|
||||
import shutil
|
||||
import httpx
|
||||
from git import Repo, RemoteProgress
|
||||
from fastapi import HTTPException
|
||||
from typing import List, Optional
|
||||
from datetime import datetime
|
||||
from src.core.logger import logger, belief_scope
|
||||
from src.models.git import GitProvider
|
||||
|
||||
# [DEF:GitService:Class]
|
||||
# @PURPOSE: Wrapper for GitPython operations with semantic logging and error handling.
|
||||
class GitService:
|
||||
"""
|
||||
Wrapper for GitPython operations.
|
||||
"""
|
||||
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the GitService with a base path for repositories.
|
||||
# @PARAM: base_path (str) - Root directory for all Git clones.
|
||||
def __init__(self, base_path: str = "backend/git_repos"):
|
||||
with belief_scope("GitService.__init__"):
|
||||
self.base_path = base_path
|
||||
if not os.path.exists(self.base_path):
|
||||
os.makedirs(self.base_path)
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:_get_repo_path:Function]
|
||||
# @PURPOSE: Resolves the local filesystem path for a dashboard's repository.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @RETURN: str
|
||||
def _get_repo_path(self, dashboard_id: int) -> str:
|
||||
return os.path.join(self.base_path, str(dashboard_id))
|
||||
# [/DEF:_get_repo_path:Function]
|
||||
|
||||
# [DEF:init_repo:Function]
|
||||
# @PURPOSE: Initialize or clone a repository for a dashboard.
|
||||
# @PARAM: dashboard_id (int)
|
||||
# @PARAM: remote_url (str)
|
||||
# @PARAM: pat (str) - Personal Access Token for authentication.
|
||||
# @RETURN: Repo - GitPython Repo object.
|
||||
def init_repo(self, dashboard_id: int, remote_url: str, pat: str) -> Repo:
|
||||
with belief_scope("GitService.init_repo"):
|
||||
repo_path = self._get_repo_path(dashboard_id)
|
||||
|
||||
# Inject PAT into remote URL if needed
|
||||
if pat and "://" in remote_url:
|
||||
proto, rest = remote_url.split("://", 1)
|
||||
auth_url = f"{proto}://oauth2:{pat}@{rest}"
|
||||
else:
|
||||
auth_url = remote_url
|
||||
|
||||
if os.path.exists(repo_path):
|
||||
logger.info(f"[init_repo][Action] Opening existing repo at {repo_path}")
|
||||
return Repo(repo_path)
|
||||
|
||||
logger.info(f"[init_repo][Action] Cloning {remote_url} to {repo_path}")
|
||||
return Repo.clone_from(auth_url, repo_path)
|
||||
# [/DEF:init_repo:Function]
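
# Illustrative example only: for remote_url "https://gitlab.example.com/bi/dashboards.git"
# and a non-empty PAT, auth_url becomes "https://oauth2:<PAT>@gitlab.example.com/bi/dashboards.git".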
|
||||
|
||||
# [DEF:get_repo:Function]
|
||||
# @PURPOSE: Get Repo object for a dashboard.
|
||||
# @PRE: Repository must exist on disk.
|
||||
# @RETURN: Repo
|
||||
def get_repo(self, dashboard_id: int) -> Repo:
|
||||
with belief_scope("GitService.get_repo"):
|
||||
repo_path = self._get_repo_path(dashboard_id)
|
||||
if not os.path.exists(repo_path):
|
||||
logger.error(f"[get_repo][Coherence:Failed] Repository for dashboard {dashboard_id} does not exist")
|
||||
raise HTTPException(status_code=404, detail=f"Repository for dashboard {dashboard_id} not found")
|
||||
try:
|
||||
return Repo(repo_path)
|
||||
except Exception as e:
|
||||
logger.error(f"[get_repo][Coherence:Failed] Failed to open repository at {repo_path}: {e}")
|
||||
raise HTTPException(status_code=500, detail="Failed to open local Git repository")
|
||||
# [/DEF:get_repo:Function]
|
||||
|
||||
# [DEF:list_branches:Function]
|
||||
# @PURPOSE: List all branches for a dashboard's repository.
|
||||
# @RETURN: List[dict]
|
||||
def list_branches(self, dashboard_id: int) -> List[dict]:
|
||||
with belief_scope("GitService.list_branches"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
logger.info(f"[list_branches][Action] Listing branches for {dashboard_id}. Refs: {repo.refs}")
|
||||
branches = []
|
||||
|
||||
# Add existing refs
|
||||
for ref in repo.refs:
|
||||
try:
|
||||
# Strip prefixes for UI
|
||||
name = ref.name.replace('refs/heads/', '').replace('refs/remotes/origin/', '')
|
||||
|
||||
# Avoid duplicates (e.g. local and remote with same name)
|
||||
if any(b['name'] == name for b in branches):
|
||||
continue
|
||||
|
||||
branches.append({
|
||||
"name": name,
|
||||
"commit_hash": ref.commit.hexsha if hasattr(ref, 'commit') else "0000000",
|
||||
"is_remote": ref.is_remote() if hasattr(ref, 'is_remote') else False,
|
||||
"last_updated": datetime.fromtimestamp(ref.commit.committed_date) if hasattr(ref, 'commit') else datetime.utcnow()
|
||||
})
|
||||
except Exception as e:
|
||||
logger.warning(f"[list_branches][Action] Skipping ref {ref}: {e}")
|
||||
|
||||
# Ensure the current active branch is in the list even if it has no commits or refs
|
||||
try:
|
||||
active_name = repo.active_branch.name
|
||||
if not any(b['name'] == active_name for b in branches):
|
||||
branches.append({
|
||||
"name": active_name,
|
||||
"commit_hash": "0000000",
|
||||
"is_remote": False,
|
||||
"last_updated": datetime.utcnow()
|
||||
})
|
||||
except Exception as e:
|
||||
logger.warning(f"[list_branches][Action] Could not determine active branch: {e}")
|
||||
# If everything else failed and list is still empty, add default
|
||||
if not branches:
|
||||
branches.append({
|
||||
"name": "main",
|
||||
"commit_hash": "0000000",
|
||||
"is_remote": False,
|
||||
"last_updated": datetime.utcnow()
|
||||
})
|
||||
|
||||
return branches
|
||||
# [/DEF:list_branches:Function]
|
||||
|
||||
# [DEF:create_branch:Function]
|
||||
# @PURPOSE: Create a new branch from an existing one.
|
||||
# @PARAM: name (str) - New branch name.
|
||||
# @PARAM: from_branch (str) - Source branch.
|
||||
def create_branch(self, dashboard_id: int, name: str, from_branch: str = "main"):
|
||||
with belief_scope("GitService.create_branch"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
logger.info(f"[create_branch][Action] Creating branch {name} from {from_branch}")
|
||||
|
||||
# Handle empty repository case (no commits)
|
||||
if not repo.heads and not repo.remotes:
|
||||
logger.warning(f"[create_branch][Action] Repository is empty. Creating initial commit to enable branching.")
|
||||
readme_path = os.path.join(repo.working_dir, "README.md")
|
||||
if not os.path.exists(readme_path):
|
||||
with open(readme_path, "w") as f:
|
||||
f.write(f"# Dashboard {dashboard_id}\nGit repository for Superset dashboard integration.")
|
||||
repo.index.add(["README.md"])
|
||||
repo.index.commit("Initial commit")
|
||||
|
||||
# Verify source branch exists
|
||||
try:
|
||||
repo.commit(from_branch)
|
||||
except Exception:
|
||||
logger.warning(f"[create_branch][Action] Source branch {from_branch} not found, using HEAD")
|
||||
from_branch = repo.head
|
||||
|
||||
try:
|
||||
new_branch = repo.create_head(name, from_branch)
|
||||
return new_branch
|
||||
except Exception as e:
|
||||
logger.error(f"[create_branch][Coherence:Failed] {e}")
|
||||
raise
|
||||
# [/DEF:create_branch:Function]
|
||||
|
||||
# [DEF:checkout_branch:Function]
|
||||
# @PURPOSE: Switch to a specific branch.
|
||||
def checkout_branch(self, dashboard_id: int, name: str):
|
||||
with belief_scope("GitService.checkout_branch"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
logger.info(f"[checkout_branch][Action] Checking out branch {name}")
|
||||
repo.git.checkout(name)
|
||||
# [/DEF:checkout_branch:Function]
|
||||
|
||||
# [DEF:commit_changes:Function]
|
||||
# @PURPOSE: Stage and commit changes.
|
||||
# @PARAM: message (str) - Commit message.
|
||||
# @PARAM: files (List[str]) - Optional list of specific files to stage.
|
||||
def commit_changes(self, dashboard_id: int, message: str, files: List[str] = None):
|
||||
with belief_scope("GitService.commit_changes"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
|
||||
# Check if there are any changes to commit
|
||||
if not repo.is_dirty(untracked_files=True) and not files:
|
||||
logger.info(f"[commit_changes][Action] No changes to commit for dashboard {dashboard_id}")
|
||||
return
|
||||
|
||||
if files:
|
||||
logger.info(f"[commit_changes][Action] Staging files: {files}")
|
||||
repo.index.add(files)
|
||||
else:
|
||||
logger.info("[commit_changes][Action] Staging all changes")
|
||||
repo.git.add(A=True)
|
||||
|
||||
repo.index.commit(message)
|
||||
logger.info(f"[commit_changes][Coherence:OK] Committed changes with message: {message}")
|
||||
# [/DEF:commit_changes:Function]
|
||||
|
||||
# [DEF:push_changes:Function]
|
||||
# @PURPOSE: Push local commits to remote.
|
||||
def push_changes(self, dashboard_id: int):
|
||||
with belief_scope("GitService.push_changes"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
|
||||
# Ensure we have something to push
|
||||
if not repo.heads:
|
||||
logger.warning(f"[push_changes][Coherence:Failed] No local branches to push for dashboard {dashboard_id}")
|
||||
return
|
||||
|
||||
try:
|
||||
origin = repo.remote(name='origin')
|
||||
except ValueError:
|
||||
logger.error(f"[push_changes][Coherence:Failed] Remote 'origin' not found for dashboard {dashboard_id}")
|
||||
raise HTTPException(status_code=400, detail="Remote 'origin' not configured")
|
||||
|
||||
# Check if current branch has an upstream
|
||||
try:
|
||||
current_branch = repo.active_branch
|
||||
logger.info(f"[push_changes][Action] Pushing branch {current_branch.name} to origin")
|
||||
# Push the current branch to the same-named ref on origin
|
||||
push_info = origin.push(refspec=f'{current_branch.name}:{current_branch.name}')
|
||||
for info in push_info:
|
||||
if info.flags & info.ERROR:
|
||||
logger.error(f"[push_changes][Coherence:Failed] Error pushing ref {info.remote_ref_string}: {info.summary}")
|
||||
raise Exception(f"Git push error for {info.remote_ref_string}: {info.summary}")
|
||||
except Exception as e:
|
||||
logger.error(f"[push_changes][Coherence:Failed] Failed to push changes: {e}")
|
||||
raise HTTPException(status_code=500, detail=f"Git push failed: {str(e)}")
|
||||
# [/DEF:push_changes:Function]
|
||||
|
||||
# [DEF:pull_changes:Function]
|
||||
# @PURPOSE: Pull changes from remote.
|
||||
def pull_changes(self, dashboard_id: int):
|
||||
with belief_scope("GitService.pull_changes"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
try:
|
||||
origin = repo.remote(name='origin')
|
||||
logger.info("[pull_changes][Action] Pulling changes from origin")
|
||||
fetch_info = origin.pull()
|
||||
for info in fetch_info:
|
||||
if info.flags & info.ERROR:
|
||||
logger.error(f"[pull_changes][Coherence:Failed] Error pulling ref {info.ref}: {info.note}")
|
||||
raise Exception(f"Git pull error for {info.ref}: {info.note}")
|
||||
except ValueError:
|
||||
logger.error(f"[pull_changes][Coherence:Failed] Remote 'origin' not found for dashboard {dashboard_id}")
|
||||
raise HTTPException(status_code=400, detail="Remote 'origin' not configured")
|
||||
except Exception as e:
|
||||
logger.error(f"[pull_changes][Coherence:Failed] Failed to pull changes: {e}")
|
||||
raise HTTPException(status_code=500, detail=f"Git pull failed: {str(e)}")
|
||||
# [/DEF:pull_changes:Function]
|
||||
|
||||
# [DEF:get_status:Function]
|
||||
# @PURPOSE: Get current repository status (dirty files, untracked, etc.)
|
||||
# @RETURN: dict
|
||||
def get_status(self, dashboard_id: int) -> dict:
|
||||
with belief_scope("GitService.get_status"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
|
||||
# Handle empty repository (no commits)
|
||||
has_commits = False
|
||||
try:
|
||||
repo.head.commit
|
||||
has_commits = True
|
||||
except Exception:
|
||||
has_commits = False
|
||||
|
||||
return {
|
||||
"is_dirty": repo.is_dirty(untracked_files=True),
|
||||
"untracked_files": repo.untracked_files,
|
||||
"modified_files": [item.a_path for item in repo.index.diff(None)],
|
||||
"staged_files": [item.a_path for item in repo.index.diff("HEAD")] if has_commits else [],
|
||||
"current_branch": repo.active_branch.name
|
||||
}
|
||||
# [/DEF:get_status:Function]
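
# Illustrative example only: a typical return value looks like
# {"is_dirty": True, "untracked_files": ["charts/new_chart.yaml"],
#  "modified_files": ["dashboard.yaml"], "staged_files": [], "current_branch": "main"}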
|
||||
|
||||
# [DEF:get_diff:Function]
|
||||
# @PURPOSE: Generate diff for a file or the whole repository.
|
||||
# @PARAM: file_path (str) - Optional specific file.
|
||||
# @PARAM: staged (bool) - Whether to show staged changes.
|
||||
# @RETURN: str
|
||||
def get_diff(self, dashboard_id: int, file_path: str = None, staged: bool = False) -> str:
|
||||
with belief_scope("GitService.get_diff"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
diff_args = []
|
||||
if staged:
|
||||
diff_args.append("--staged")
|
||||
|
||||
if file_path:
|
||||
return repo.git.diff(*diff_args, "--", file_path)
|
||||
return repo.git.diff(*diff_args)
|
||||
# [/DEF:get_diff:Function]
|
||||
|
||||
# [DEF:get_commit_history:Function]
|
||||
# @PURPOSE: Retrieve commit history for a repository.
|
||||
# @PARAM: limit (int) - Max number of commits to return.
|
||||
# @RETURN: List[dict]
|
||||
def get_commit_history(self, dashboard_id: int, limit: int = 50) -> List[dict]:
|
||||
with belief_scope("GitService.get_commit_history"):
|
||||
repo = self.get_repo(dashboard_id)
|
||||
commits = []
|
||||
try:
|
||||
# Check if there are any commits at all
|
||||
if not repo.heads and not repo.remotes:
|
||||
return []
|
||||
|
||||
for commit in repo.iter_commits(max_count=limit):
|
||||
commits.append({
|
||||
"hash": commit.hexsha,
|
||||
"author": commit.author.name,
|
||||
"email": commit.author.email,
|
||||
"timestamp": datetime.fromtimestamp(commit.committed_date),
|
||||
"message": commit.message.strip(),
|
||||
"files_changed": list(commit.stats.files.keys())
|
||||
})
|
||||
except Exception as e:
|
||||
logger.warning(f"[get_commit_history][Action] Could not retrieve commit history for dashboard {dashboard_id}: {e}")
|
||||
return []
|
||||
return commits
|
||||
# [/DEF:get_commit_history:Function]
|
||||
|
||||
# [DEF:test_connection:Function]
|
||||
# @PURPOSE: Test connection to Git provider using PAT.
|
||||
# @PARAM: provider (GitProvider)
|
||||
# @PARAM: url (str)
|
||||
# @PARAM: pat (str)
|
||||
# @RETURN: bool
|
||||
async def test_connection(self, provider: GitProvider, url: str, pat: str) -> bool:
|
||||
with belief_scope("GitService.test_connection"):
|
||||
# Check for offline mode or local-only URLs
|
||||
if ".local" in url or "localhost" in url:
|
||||
logger.info("[test_connection][Action] Local/Offline mode detected for URL")
|
||||
return True
|
||||
|
||||
if not url.startswith(('http://', 'https://')):
|
||||
logger.error(f"[test_connection][Coherence:Failed] Invalid URL protocol: {url}")
|
||||
return False
|
||||
|
||||
if not pat or not pat.strip():
|
||||
logger.error("[test_connection][Coherence:Failed] Git PAT is missing or empty")
|
||||
return False
|
||||
|
||||
pat = pat.strip()
|
||||
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
if provider == GitProvider.GITHUB:
|
||||
headers = {"Authorization": f"token {pat}"}
|
||||
api_url = "https://api.github.com/user" if "github.com" in url else f"{url.rstrip('/')}/api/v3/user"
|
||||
resp = await client.get(api_url, headers=headers)
|
||||
elif provider == GitProvider.GITLAB:
|
||||
headers = {"PRIVATE-TOKEN": pat}
|
||||
api_url = f"{url.rstrip('/')}/api/v4/user"
|
||||
resp = await client.get(api_url, headers=headers)
|
||||
elif provider == GitProvider.GITEA:
|
||||
headers = {"Authorization": f"token {pat}"}
|
||||
api_url = f"{url.rstrip('/')}/api/v1/user"
|
||||
resp = await client.get(api_url, headers=headers)
|
||||
else:
|
||||
return False
|
||||
|
||||
if resp.status_code != 200:
|
||||
logger.error(f"[test_connection][Coherence:Failed] Git connection test failed for {provider} at {api_url}. Status: {resp.status_code}")
|
||||
return resp.status_code == 200
|
||||
except Exception as e:
|
||||
logger.error(f"[test_connection][Coherence:Failed] Error testing git connection: {e}")
|
||||
return False
|
||||
# [/DEF:test_connection:Function]
|
||||
|
||||
# [/DEF:GitService:Class]
|
||||
# [/DEF:backend.src.services.git_service:Module]
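
# A minimal end-to-end sketch (illustrative only): the import path, remote URL, PAT and
# dashboard id below are placeholders, not values from this change.
from src.services.git_service import GitService  # assumed import path

service = GitService(base_path="backend/git_repos")
dashboard_id = 184  # placeholder id

# Clone on first use; subsequent calls open the existing working copy.
repo = service.init_repo(dashboard_id, "https://gitlab.example.com/bi/dashboards.git", pat="<PAT>")

service.create_branch(dashboard_id, "feature/export-2026-01", from_branch="main")
service.checkout_branch(dashboard_id, "feature/export-2026-01")

# ... write exported dashboard YAML files into repo.working_dir here ...

service.commit_changes(dashboard_id, "Export dashboard 184")
service.push_changes(dashboard_id)
print(service.get_status(dashboard_id))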
|
||||
@@ -10,59 +10,61 @@
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import List, Dict
|
||||
from backend.src.core.logger import belief_scope
|
||||
from backend.src.core.superset_client import SupersetClient
|
||||
from backend.src.core.utils.matching import suggest_mappings
|
||||
from superset_tool.models import SupersetConfig
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:MappingService:Class]
|
||||
# @PURPOSE: Service for handling database mapping logic.
|
||||
class MappingService:
|
||||
|
||||
# [DEF:MappingService.__init__:Function]
|
||||
# [DEF:__init__:Function]
|
||||
# @PURPOSE: Initializes the mapping service with a config manager.
|
||||
# @PRE: config_manager is provided.
|
||||
# @PARAM: config_manager (ConfigManager) - The configuration manager.
|
||||
# @POST: Service is initialized.
|
||||
def __init__(self, config_manager):
|
||||
self.config_manager = config_manager
|
||||
# [/DEF:MappingService.__init__:Function]
|
||||
with belief_scope("MappingService.__init__"):
|
||||
self.config_manager = config_manager
|
||||
# [/DEF:__init__:Function]
|
||||
|
||||
# [DEF:MappingService._get_client:Function]
|
||||
# [DEF:_get_client:Function]
|
||||
# @PURPOSE: Helper to get an initialized SupersetClient for an environment.
|
||||
# @PARAM: env_id (str) - The ID of the environment.
|
||||
# @PRE: environment must exist in config.
|
||||
# @POST: Returns an initialized SupersetClient.
|
||||
# @RETURN: SupersetClient - Initialized client.
|
||||
def _get_client(self, env_id: str) -> SupersetClient:
|
||||
envs = self.config_manager.get_environments()
|
||||
env = next((e for e in envs if e.id == env_id), None)
|
||||
if not env:
|
||||
raise ValueError(f"Environment {env_id} not found")
|
||||
|
||||
superset_config = SupersetConfig(
|
||||
env=env.name,
|
||||
base_url=env.url,
|
||||
auth={
|
||||
"provider": "db",
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"refresh": "false"
|
||||
}
|
||||
)
|
||||
return SupersetClient(superset_config)
|
||||
# [/DEF:MappingService._get_client:Function]
|
||||
with belief_scope("MappingService._get_client", f"env_id={env_id}"):
|
||||
envs = self.config_manager.get_environments()
|
||||
env = next((e for e in envs if e.id == env_id), None)
|
||||
if not env:
|
||||
raise ValueError(f"Environment {env_id} not found")
|
||||
|
||||
return SupersetClient(env)
|
||||
# [/DEF:_get_client:Function]
|
||||
|
||||
# [DEF:MappingService.get_suggestions:Function]
|
||||
# [DEF:get_suggestions:Function]
|
||||
# @PURPOSE: Fetches databases from both environments and returns fuzzy matching suggestions.
|
||||
# @PARAM: source_env_id (str) - Source environment ID.
|
||||
# @PARAM: target_env_id (str) - Target environment ID.
|
||||
# @PRE: Both environments must be accessible.
|
||||
# @POST: Returns fuzzy-matched database suggestions.
|
||||
# @RETURN: List[Dict] - Suggested mappings.
|
||||
async def get_suggestions(self, source_env_id: str, target_env_id: str) -> List[Dict]:
|
||||
"""
|
||||
Get suggested mappings between two environments.
|
||||
"""
|
||||
source_client = self._get_client(source_env_id)
|
||||
target_client = self._get_client(target_env_id)
|
||||
|
||||
source_dbs = source_client.get_databases_summary()
|
||||
target_dbs = target_client.get_databases_summary()
|
||||
|
||||
return suggest_mappings(source_dbs, target_dbs)
|
||||
# [/DEF:MappingService.get_suggestions:Function]
|
||||
with belief_scope("MappingService.get_suggestions", f"source={source_env_id}, target={target_env_id}"):
|
||||
"""
|
||||
Get suggested mappings between two environments.
|
||||
"""
|
||||
source_client = self._get_client(source_env_id)
|
||||
target_client = self._get_client(target_env_id)
|
||||
|
||||
source_dbs = source_client.get_databases_summary()
|
||||
target_dbs = target_client.get_databases_summary()
|
||||
|
||||
return suggest_mappings(source_dbs, target_dbs)
|
||||
# [/DEF:get_suggestions:Function]
|
||||
|
||||
# [/DEF:MappingService:Class]
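
# A minimal usage sketch (illustrative only): the import paths and environment ids are
# assumptions, not part of this change.
import asyncio
from src.dependencies import get_config_manager  # assumed import path
from src.services.mapping_service import MappingService  # assumed import path

async def preview_mappings():
    service = MappingService(get_config_manager())
    # Ids must match Environment.id values from config.json.
    suggestions = await service.get_suggestions("dev-env-id", "prod-env-id")
    for suggestion in suggestions:
        print(suggestion)

asyncio.run(preview_mappings())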
|
||||
|
||||
|
||||
BIN
backend/tasks.db
Binary file not shown.
@@ -1,7 +1,11 @@
|
||||
import pytest
|
||||
from src.core.logger import belief_scope, logger
|
||||
|
||||
|
||||
# [DEF:test_belief_scope_logs_entry_action_exit:Function]
|
||||
# @PURPOSE: Test that belief_scope generates [ID][Entry], [ID][Action], and [ID][Exit] logs.
|
||||
# @PRE: belief_scope is available. caplog fixture is used.
|
||||
# @POST: Logs are verified to contain Entry, Action, and Exit tags.
|
||||
def test_belief_scope_logs_entry_action_exit(caplog):
|
||||
"""Test that belief_scope generates [ID][Entry], [ID][Action], and [ID][Exit] logs."""
|
||||
caplog.set_level("INFO")
|
||||
@@ -15,8 +19,13 @@ def test_belief_scope_logs_entry_action_exit(caplog):
|
||||
assert any("[TestFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
|
||||
assert any("[TestFunction][Action] Doing something important" in msg for msg in log_messages), "Action log not found"
|
||||
assert any("[TestFunction][Exit]" in msg for msg in log_messages), "Exit log not found"
|
||||
# [/DEF:test_belief_scope_logs_entry_action_exit:Function]
|
||||
|
||||
|
||||
# [DEF:test_belief_scope_error_handling:Function]
|
||||
# @PURPOSE: Test that belief_scope logs Coherence:Failed on exception.
|
||||
# @PRE: belief_scope is available. caplog fixture is used.
|
||||
# @POST: Logs are verified to contain Coherence:Failed tag.
|
||||
def test_belief_scope_error_handling(caplog):
|
||||
"""Test that belief_scope logs Coherence:Failed on exception."""
|
||||
caplog.set_level("INFO")
|
||||
@@ -30,8 +39,13 @@ def test_belief_scope_error_handling(caplog):
|
||||
assert any("[FailingFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
|
||||
assert any("[FailingFunction][Coherence:Failed]" in msg for msg in log_messages), "Failed coherence log not found"
|
||||
# Exit should not be logged on failure
|
||||
# [/DEF:test_belief_scope_error_handling:Function]
|
||||
|
||||
|
||||
# [DEF:test_belief_scope_success_coherence:Function]
|
||||
# @PURPOSE: Test that belief_scope logs Coherence:OK on success.
|
||||
# @PRE: belief_scope is available. caplog fixture is used.
|
||||
# @POST: Logs are verified to contain Coherence:OK tag.
|
||||
def test_belief_scope_success_coherence(caplog):
|
||||
"""Test that belief_scope logs Coherence:OK on success."""
|
||||
caplog.set_level("INFO")
|
||||
@@ -41,4 +55,5 @@ def test_belief_scope_success_coherence(caplog):
|
||||
|
||||
log_messages = [record.message for record in caplog.records]
|
||||
|
||||
assert any("[SuccessFunction][Coherence:OK]" in msg for msg in log_messages), "Success coherence log not found"
|
||||
assert any("[SuccessFunction][Coherence:OK]" in msg for msg in log_messages), "Success coherence log not found"
|
||||
# [/DEF:test_belief_scope_success_coherence:Function]
|
||||
@@ -1,49 +1,23 @@
|
||||
import pytest
|
||||
from superset_tool.models import SupersetConfig
|
||||
from src.core.config_models import Environment
|
||||
from src.core.logger import belief_scope
|
||||
|
||||
def test_superset_config_url_normalization():
|
||||
auth = {
|
||||
"provider": "db",
|
||||
"username": "admin",
|
||||
"password": "password",
|
||||
"refresh": "token"
|
||||
}
|
||||
|
||||
# Test with /api/v1 already present
|
||||
config = SupersetConfig(
|
||||
env="dev",
|
||||
base_url="http://localhost:8088/api/v1",
|
||||
auth=auth
|
||||
)
|
||||
assert config.base_url == "http://localhost:8088/api/v1"
|
||||
|
||||
# Test without /api/v1
|
||||
config = SupersetConfig(
|
||||
env="dev",
|
||||
base_url="http://localhost:8088",
|
||||
auth=auth
|
||||
)
|
||||
assert config.base_url == "http://localhost:8088/api/v1"
|
||||
|
||||
# Test with trailing slash
|
||||
config = SupersetConfig(
|
||||
env="dev",
|
||||
base_url="http://localhost:8088/",
|
||||
auth=auth
|
||||
)
|
||||
assert config.base_url == "http://localhost:8088/api/v1"
|
||||
|
||||
def test_superset_config_invalid_url():
|
||||
auth = {
|
||||
"provider": "db",
|
||||
"username": "admin",
|
||||
"password": "password",
|
||||
"refresh": "token"
|
||||
}
|
||||
|
||||
with pytest.raises(ValueError, match="Must start with http:// or https://"):
|
||||
SupersetConfig(
|
||||
env="dev",
|
||||
base_url="localhost:8088",
|
||||
auth=auth
|
||||
# [DEF:test_environment_model:Function]
|
||||
# @PURPOSE: Tests that Environment model correctly stores values.
|
||||
# @PRE: Environment class is available.
|
||||
# @POST: Values are verified.
|
||||
def test_environment_model():
|
||||
with belief_scope("test_environment_model"):
|
||||
env = Environment(
|
||||
id="test-id",
|
||||
name="test-env",
|
||||
url="http://localhost:8088/api/v1",
|
||||
username="admin",
|
||||
password="password"
|
||||
)
|
||||
assert env.id == "test-id"
|
||||
assert env.name == "test-env"
|
||||
assert env.url == "http://localhost:8088/api/v1"
|
||||
# [/DEF:test_environment_model:Function]
|
||||
|
||||
|
||||
|
||||
163
backup_script.py
@@ -1,163 +0,0 @@
|
||||
# [DEF:backup_script:Module]
|
||||
#
|
||||
# @SEMANTICS: backup, superset, automation, dashboard
|
||||
# @PURPOSE: This module is responsible for automated backup of Superset dashboards.
|
||||
# @LAYER: App
|
||||
# @RELATION: DEPENDS_ON -> superset_tool.client
|
||||
# @RELATION: DEPENDS_ON -> superset_tool.utils
|
||||
# @PUBLIC_API: BackupConfig, backup_dashboards, main
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
import logging
|
||||
import sys
|
||||
from pathlib import Path
|
||||
from dataclasses import dataclass,field
|
||||
from requests.exceptions import RequestException
|
||||
from superset_tool.client import SupersetClient
|
||||
from superset_tool.exceptions import SupersetAPIError
|
||||
from superset_tool.utils.logger import SupersetLogger
|
||||
from superset_tool.utils.fileio import (
|
||||
save_and_unpack_dashboard,
|
||||
archive_exports,
|
||||
sanitize_filename,
|
||||
consolidate_archive_folders,
|
||||
remove_empty_directories,
|
||||
RetentionPolicy
|
||||
)
|
||||
from superset_tool.utils.init_clients import setup_clients
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:BackupConfig:DataClass]
|
||||
# @PURPOSE: Stores the configuration for the backup process.
|
||||
@dataclass
|
||||
class BackupConfig:
|
||||
"""Конфигурация для процесса бэкапа."""
|
||||
consolidate: bool = True
|
||||
rotate_archive: bool = True
|
||||
clean_folders: bool = True
|
||||
retention_policy: RetentionPolicy = field(default_factory=RetentionPolicy)
|
||||
# [/DEF:BackupConfig:DataClass]
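
# Illustrative example only: BackupConfig(consolidate=False, rotate_archive=True) keeps
# per-dashboard archive rotation, skips consolidation, and leaves clean_folders at its
# default (True); retention_policy falls back to the RetentionPolicy defaults.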
|
||||
|
||||
# [DEF:backup_dashboards:Function]
|
||||
# @PURPOSE: Backs up all available dashboards for the given client and environment, skipping export errors.
|
||||
# @PRE: `client` must be an initialized `SupersetClient` instance.
# @PRE: `env_name` must be a string identifying the environment.
# @PRE: `backup_root` must be a valid path to the backup root directory.
# @POST: Dashboards are exported and saved. Export errors are logged and do not stop the script.
|
||||
# @RELATION: CALLS -> client.get_dashboards
|
||||
# @RELATION: CALLS -> client.export_dashboard
|
||||
# @RELATION: CALLS -> save_and_unpack_dashboard
|
||||
# @RELATION: CALLS -> archive_exports
|
||||
# @RELATION: CALLS -> consolidate_archive_folders
|
||||
# @RELATION: CALLS -> remove_empty_directories
|
||||
# @PARAM: client (SupersetClient) - Client for accessing the Superset API.
# @PARAM: env_name (str) - Environment name (e.g., 'PROD').
# @PARAM: backup_root (Path) - Root directory where backups are stored.
# @PARAM: logger (SupersetLogger) - Logger instance.
# @PARAM: config (BackupConfig) - Backup process configuration.
# @RETURN: bool - `True` if all dashboards were exported without critical errors, `False` otherwise.
|
||||
def backup_dashboards(
|
||||
client: SupersetClient,
|
||||
env_name: str,
|
||||
backup_root: Path,
|
||||
logger: SupersetLogger,
|
||||
config: BackupConfig
|
||||
) -> bool:
|
||||
logger.info(f"[backup_dashboards][Entry] Starting backup for {env_name}.")
|
||||
try:
|
||||
dashboard_count, dashboard_meta = client.get_dashboards()
|
||||
logger.info(f"[backup_dashboards][Progress] Found {dashboard_count} dashboards to export in {env_name}.")
|
||||
if dashboard_count == 0:
|
||||
return True
|
||||
|
||||
success_count = 0
|
||||
for db in dashboard_meta:
|
||||
dashboard_id = db.get('id')
|
||||
dashboard_title = db.get('dashboard_title', 'Unknown Dashboard')
|
||||
if not dashboard_id:
|
||||
continue
|
||||
|
||||
try:
|
||||
dashboard_base_dir_name = sanitize_filename(f"{dashboard_title}")
|
||||
dashboard_dir = backup_root / env_name / dashboard_base_dir_name
|
||||
dashboard_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
zip_content, filename = client.export_dashboard(dashboard_id)
|
||||
|
||||
save_and_unpack_dashboard(
|
||||
zip_content=zip_content,
|
||||
original_filename=filename,
|
||||
output_dir=dashboard_dir,
|
||||
unpack=False,
|
||||
logger=logger
|
||||
)
|
||||
|
||||
if config.rotate_archive:
|
||||
archive_exports(str(dashboard_dir), policy=config.retention_policy, logger=logger)
|
||||
|
||||
success_count += 1
|
||||
except (SupersetAPIError, RequestException, IOError, OSError) as db_error:
|
||||
logger.error(f"[backup_dashboards][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
|
||||
continue
|
||||
|
||||
if config.consolidate:
|
||||
consolidate_archive_folders(backup_root / env_name , logger=logger)
|
||||
|
||||
if config.clean_folders:
|
||||
remove_empty_directories(str(backup_root / env_name), logger=logger)
|
||||
|
||||
logger.info(f"[backup_dashboards][CoherenceCheck:Passed] Backup logic completed.")
|
||||
return success_count == dashboard_count
|
||||
except (RequestException, IOError) as e:
|
||||
logger.critical(f"[backup_dashboards][Failure] Fatal error during backup for {env_name}: {e}", exc_info=True)
|
||||
return False
|
||||
# [/DEF:backup_dashboards:Function]
|
||||
|
||||
# [DEF:main:Function]
|
||||
# @PURPOSE: Main entry point for running the backup process.
|
||||
# @RELATION: CALLS -> setup_clients
|
||||
# @RELATION: CALLS -> backup_dashboards
|
||||
# @RETURN: int - Exit code (0 - success, 1 - error).
|
||||
def main() -> int:
|
||||
log_dir = Path("P:\\Superset\\010 Бекапы\\Logs")
|
||||
logger = SupersetLogger(log_dir=log_dir, level=logging.INFO, console=True)
|
||||
logger.info("[main][Entry] Starting Superset backup process.")
|
||||
|
||||
exit_code = 0
|
||||
try:
|
||||
clients = setup_clients(logger)
|
||||
superset_backup_repo = Path("P:\\Superset\\010 Бекапы")
|
||||
superset_backup_repo.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
results = {}
|
||||
environments = ['dev', 'sbx', 'prod', 'preprod']
|
||||
backup_config = BackupConfig(rotate_archive=True)
|
||||
|
||||
for env in environments:
|
||||
try:
|
||||
results[env] = backup_dashboards(
|
||||
clients[env],
|
||||
env.upper(),
|
||||
superset_backup_repo,
|
||||
logger=logger,
|
||||
config=backup_config
|
||||
)
|
||||
except Exception as env_error:
|
||||
logger.critical(f"[main][Failure] Critical error for environment {env}: {env_error}", exc_info=True)
|
||||
results[env] = False
|
||||
|
||||
if not all(results.values()):
|
||||
exit_code = 1
|
||||
|
||||
except (RequestException, IOError) as e:
|
||||
logger.critical(f"[main][Failure] Fatal error in main execution: {e}", exc_info=True)
|
||||
exit_code = 1
|
||||
|
||||
logger.info("[main][Exit] Superset backup process finished.")
|
||||
return exit_code
|
||||
# [/DEF:main:Function]
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(main())
|
||||
|
||||
# [/DEF:backup_script:Module]
|
||||
@@ -0,0 +1,55 @@
|
||||
slice_name: "FI-0083 \u0421\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u043A\u0430\
|
||||
\ \u043F\u043E \u0414\u0417/\u041F\u0414\u0417"
|
||||
description: null
|
||||
certified_by: null
|
||||
certification_details: null
|
||||
viz_type: pivot_table_v2
|
||||
params:
|
||||
datasource: 859__table
|
||||
viz_type: pivot_table_v2
|
||||
slice_id: 4019
|
||||
groupbyColumns:
|
||||
- dt
|
||||
groupbyRows:
|
||||
- counterparty_search_name
|
||||
- attribute
|
||||
time_grain_sqla: P1M
|
||||
temporal_columns_lookup:
|
||||
dt: true
|
||||
metrics:
|
||||
- m_debt_amount
|
||||
- m_overdue_amount
|
||||
metricsLayout: COLUMNS
|
||||
adhoc_filters:
|
||||
- clause: WHERE
|
||||
comparator: No filter
|
||||
expressionType: SIMPLE
|
||||
operator: TEMPORAL_RANGE
|
||||
subject: dt
|
||||
row_limit: '90000'
|
||||
order_desc: false
|
||||
aggregateFunction: Sum
|
||||
combineMetric: true
|
||||
valueFormat: SMART_NUMBER
|
||||
date_format: smart_date
|
||||
rowOrder: key_a_to_z
|
||||
colOrder: key_a_to_z
|
||||
value_font_size: 12
|
||||
header_font_size: 12
|
||||
label_align: left
|
||||
column_config:
|
||||
m_debt_amount:
|
||||
d3NumberFormat: ',d'
|
||||
m_overdue_amount:
|
||||
d3NumberFormat: ',d'
|
||||
conditional_formatting: []
|
||||
extra_form_data: {}
|
||||
dashboards:
|
||||
- 184
|
||||
query_context: '{"datasource":{"id":859,"type":"table"},"force":false,"queries":[{"filters":[{"col":"dt","op":"TEMPORAL_RANGE","val":"No
|
||||
filter"}],"extras":{"having":"","where":""},"applied_time_extras":{},"columns":[{"timeGrain":"P1M","columnType":"BASE_AXIS","sqlExpression":"dt","label":"dt","expressionType":"SQL"},"counterparty_search_name","attribute"],"metrics":["m_debt_amount","m_overdue_amount"],"orderby":[["m_debt_amount",true]],"annotation_layers":[],"row_limit":90000,"series_limit":0,"order_desc":false,"url_params":{},"custom_params":{},"custom_form_data":{}}],"form_data":{"datasource":"859__table","viz_type":"pivot_table_v2","slice_id":4019,"groupbyColumns":["dt"],"groupbyRows":["counterparty_search_name","attribute"],"time_grain_sqla":"P1M","temporal_columns_lookup":{"dt":true},"metrics":["m_debt_amount","m_overdue_amount"],"metricsLayout":"COLUMNS","adhoc_filters":[{"clause":"WHERE","comparator":"No
|
||||
filter","expressionType":"SIMPLE","operator":"TEMPORAL_RANGE","subject":"dt"}],"row_limit":"90000","order_desc":false,"aggregateFunction":"Sum","combineMetric":true,"valueFormat":"SMART_NUMBER","date_format":"smart_date","rowOrder":"key_a_to_z","colOrder":"key_a_to_z","value_font_size":12,"header_font_size":12,"label_align":"left","column_config":{"m_debt_amount":{"d3NumberFormat":",d"},"m_overdue_amount":{"d3NumberFormat":",d"}},"conditional_formatting":[],"extra_form_data":{},"dashboards":[184],"force":false,"result_format":"json","result_type":"full"},"result_format":"json","result_type":"full"}'
|
||||
cache_timeout: null
|
||||
uuid: 9c293065-73e2-4d9b-a175-d188ff8ef575
|
||||
version: 1.0.0
|
||||
dataset_uuid: 9e645dc0-da25-4f61-9465-6e649b0bc4b1
|
||||
@@ -0,0 +1,13 @@
|
||||
database_name: Prod Clickhouse
|
||||
sqlalchemy_uri: clickhousedb+connect://viz_superset_click_prod:XXXXXXXXXX@rgm-s-khclk.hq.root.ad:443/dm
|
||||
cache_timeout: null
|
||||
expose_in_sqllab: true
|
||||
allow_run_async: false
|
||||
allow_ctas: false
|
||||
allow_cvas: false
|
||||
allow_dml: true
|
||||
allow_file_upload: false
|
||||
extra:
|
||||
allows_virtual_table_explore: true
|
||||
uuid: 97aced68-326a-4094-b381-27980560efa9
|
||||
version: 1.0.0
|
||||
@@ -0,0 +1,119 @@
|
||||
table_name: "FI-0080-06 \u041A\u0430\u043B\u0435\u043D\u0434\u0430\u0440\u044C (\u041E\
|
||||
\u0431\u0449\u0438\u0439 \u0441\u043F\u0440\u0430\u0432\u043E\u0447\u043D\u0438\u043A\
|
||||
)"
|
||||
main_dttm_col: null
|
||||
description: null
|
||||
default_endpoint: null
|
||||
offset: 0
|
||||
cache_timeout: null
|
||||
schema: dm_view
|
||||
sql: "-- [HEADER]\r\n-- [\u041D\u0410\u0417\u041D\u0410\u0427\u0415\u041D\u0418\u0415\
|
||||
]: \u041F\u043E\u043B\u0443\u0447\u0435\u043D\u0438\u0435 \u0434\u0438\u0430\u043F\
|
||||
\u0430\u0437\u043E\u043D\u0430 \u0434\u0430\u0442 \u0434\u043B\u044F \u043E\u0442\
|
||||
\u0447\u0435\u0442\u0430 \u043E \u0437\u0430\u0434\u043E\u043B\u0436\u0435\u043D\
|
||||
\u043D\u043E\u0441\u0442\u044F\u0445 \u043F\u043E \u043E\u0431\u043E\u0440\u043E\
|
||||
\u0442\u043D\u044B\u043C \u0441\u0440\u0435\u0434\u0441\u0442\u0432\u0430\u043C\r\
|
||||
\n-- [\u041A\u041B\u042E\u0427\u0415\u0412\u042B\u0415 \u041A\u041E\u041B\u041E\u041D\
|
||||
\u041A\u0418]:\r\n-- - from_dt_txt: \u041D\u0430\u0447\u0430\u043B\u044C\u043D\
|
||||
\u0430\u044F \u0434\u0430\u0442\u0430 \u0432 \u0444\u043E\u0440\u043C\u0430\u0442\
|
||||
\u0435 DD.MM.YYYY\r\n-- - to_dt_txt: \u041A\u043E\u043D\u0435\u0447\u043D\u0430\
|
||||
\u044F \u0434\u0430\u0442\u0430 \u0432 \u0444\u043E\u0440\u043C\u0430\u0442\u0435\
|
||||
\ DD.MM.YYYY\r\n-- [JINJA \u041F\u0410\u0420\u0410\u041C\u0415\u0422\u0420\u042B\
|
||||
]:\r\n-- - {{ filter_values(\"yes_no_check\") }}: \u0424\u0438\u043B\u044C\u0442\
|
||||
\u0440 \"\u0414\u0430/\u041D\u0435\u0442\" \u0434\u043B\u044F \u043E\u0433\u0440\
|
||||
\u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F \u0432\u044B\u0431\u043E\u0440\u043A\
|
||||
\u0438 \u043F\u043E \u0434\u0430\u0442\u0435\r\n-- [\u041B\u041E\u0413\u0418\u041A\
|
||||
\u0410]: \u041E\u043F\u0440\u0435\u0434\u0435\u043B\u044F\u0435\u0442 \u043F\u043E\
|
||||
\u0440\u043E\u0433\u043E\u0432\u0443\u044E \u0434\u0430\u0442\u0443 \u0432 \u0437\
|
||||
\u0430\u0432\u0438\u0441\u0438\u043C\u043E\u0441\u0442\u0438 \u043E\u0442 \u0442\
|
||||
\u0435\u043A\u0443\u0449\u0435\u0433\u043E \u0434\u043D\u044F \u043C\u0435\u0441\
|
||||
\u044F\u0446\u0430 \u0438 \u0444\u0438\u043B\u044C\u0442\u0440\u0443\u0435\u0442\
|
||||
\ \u0434\u0430\u043D\u043D\u044B\u0435\r\n\r\nWITH date_threshold AS (\r\n SELECT\
|
||||
\ \r\n -- \u041E\u043F\u0440\u0435\u0434\u0435\u043B\u044F\u0435\u043C \u043F\
|
||||
\u043E\u0440\u043E\u0433\u043E\u0432\u0443\u044E \u0434\u0430\u0442\u0443 \u0432\
|
||||
\ \u0437\u0430\u0432\u0438\u0441\u0438\u043C\u043E\u0441\u0442\u0438 \u043E\u0442\
|
||||
\ \u0442\u0435\u043A\u0443\u0449\u0435\u0433\u043E \u0434\u043D\u044F \r\n \
|
||||
\ CASE \r\n WHEN toDayOfMonth(now()) <= 10 THEN \r\n \
|
||||
\ toStartOfMonth(dateSub(MONTH, 1, now())) \r\n ELSE \r\n \
|
||||
\ toStartOfMonth(now()) \r\n END AS cutoff_date \r\n),\r\nfiltered_dates\
|
||||
\ AS (\r\n SELECT \r\n dt,\r\n formatDateTime(dt, '%d.%m.%Y') AS\
|
||||
\ from_dt_txt,\r\n formatDateTime(dt, '%d.%m.%Y') AS to_dt_txt\r\n \
|
||||
\ --dt as from_dt_txt,\r\n -- dt as to_dt_txt\r\n FROM dm_view.account_debt_for_working_capital_final\r\
|
||||
\n WHERE 1=1\r\n -- \u0411\u0435\u0437\u043E\u043F\u0430\u0441\u043D\u0430\
|
||||
\u044F \u043F\u0440\u043E\u0432\u0435\u0440\u043A\u0430 \u0444\u0438\u043B\u044C\
|
||||
\u0442\u0440\u0430\r\n {% if filter_values(\"yes_no_check\") | length !=\
|
||||
\ 0 %}\r\n {% if filter_values(\"yes_no_check\")[0] == \"\u0414\u0430\
|
||||
\" %}\r\n AND dt < (SELECT cutoff_date FROM date_threshold)\r\n \
|
||||
\ {% endif %}\r\n {% endif %}\r\n)\r\nSELECT \r\ndt,\r\n from_dt_txt,\r\
|
||||
\n to_dt_txt,\r\n formatDateTime(toLastDayOfMonth(dt), '%d.%m.%Y') as last_day_of_month_dt_txt\r\
|
||||
\nFROM \r\n filtered_dates\r\nGROUP BY \r\n dt, from_dt_txt, to_dt_txt\r\n\
|
||||
ORDER BY \r\n dt DESC"
|
||||
params: null
|
||||
template_params: null
|
||||
filter_select_enabled: true
|
||||
fetch_values_predicate: null
|
||||
extra: null
|
||||
normalize_columns: false
|
||||
uuid: fca62707-6947-4440-a16b-70cb6a5cea5b
|
||||
metrics:
|
||||
- metric_name: max_date
|
||||
verbose_name: max_date
|
||||
metric_type: count
|
||||
expression: max(dt)
|
||||
description: null
|
||||
d3format: null
|
||||
currency: null
|
||||
extra:
|
||||
warning_markdown: ''
|
||||
warning_text: null
|
||||
columns:
|
||||
- column_name: from_dt_txt
|
||||
verbose_name: null
|
||||
is_dttm: true
|
||||
is_active: true
|
||||
type: String
|
||||
advanced_data_type: null
|
||||
groupby: true
|
||||
filterable: true
|
||||
expression: null
|
||||
description: null
|
||||
python_date_format: '%Y'
|
||||
extra: {}
|
||||
- column_name: dt
|
||||
verbose_name: null
|
||||
is_dttm: true
|
||||
is_active: true
|
||||
type: Date
|
||||
advanced_data_type: null
|
||||
groupby: true
|
||||
filterable: true
|
||||
expression: null
|
||||
description: null
|
||||
python_date_format: null
|
||||
extra: {}
|
||||
- column_name: last_day_of_month_dt_txt
|
||||
verbose_name: null
|
||||
is_dttm: false
|
||||
is_active: true
|
||||
type: String
|
||||
advanced_data_type: null
|
||||
groupby: true
|
||||
filterable: true
|
||||
expression: null
|
||||
description: null
|
||||
python_date_format: null
|
||||
extra: {}
|
||||
- column_name: to_dt_txt
|
||||
verbose_name: null
|
||||
is_dttm: true
|
||||
is_active: true
|
||||
type: String
|
||||
advanced_data_type: null
|
||||
groupby: true
|
||||
filterable: true
|
||||
expression: null
|
||||
description: null
|
||||
python_date_format: null
|
||||
extra: {}
|
||||
version: 1.0.0
|
||||
database_uuid: 97aced68-326a-4094-b381-27980560efa9
|
||||
@@ -0,0 +1,190 @@
|
||||
table_name: "FI-0090 \u0421\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u043A\u0430\
|
||||
\ \u043F\u043E \u0414\u0417/\u041F\u0414\u0417"
|
||||
main_dttm_col: dt
|
||||
description: null
|
||||
default_endpoint: null
|
||||
offset: 0
|
||||
cache_timeout: null
|
||||
schema: dm_view
|
||||
sql: "-- [JINJA_BLOCK] \u0426\u0435\u043D\u0442\u0440\u0430\u043B\u0438\u0437\u043E\
|
||||
\u0432\u0430\u043D\u043D\u043E\u0435 \u043E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\
|
||||
\u043D\u0438\u0435 \u0432\u0441\u0435\u0445 Jinja \u043F\u0435\u0440\u0435\u043C\
|
||||
\u0435\u043D\u043D\u044B\u0445\r\n{% set raw_to = filter_values('last_day_of_month_dt_txt')[0]\
|
||||
\ \r\n if filter_values('last_day_of_month_dt_txt') else '01.05.2025'\
|
||||
\ %}\r\n\r\n{# \u0440\u0430\u0437\u0431\u0438\u0432\u0430\u0435\u043C \xABDD.MM.YYYY\xBB\
|
||||
\ \u043D\u0430 \u0447\u0430\u0441\u0442\u0438 #}\r\n{% set to_parts = raw_to.split('.')\
|
||||
\ %}\r\n\r\n{# \u0441\u043E\u0431\u0438\u0440\u0430\u0435\u043C ISO\u2011\u0441\u0442\
|
||||
\u0440\u043E\u043A\u0443 \xABYYYY-MM-DD\xBB #}\r\n{% set to_dt = to_parts[2] \
|
||||
\ ~ '-' ~ to_parts[1] ~ '-' ~ to_parts[0] %}\r\n\r\nwith \r\ncp_relations_type\
|
||||
\ AS (\r\n select * from ( SELECT \r\n ctd.counterparty_code AS counterparty_code,\r\
|
||||
\n min(dt_from) as dt_from,\r\n max(dt_to) as dt_to,\r\n crt.relation_type_code\
|
||||
\ || ' ' || crt.relation_type_name AS relation_type_code_name\r\n FROM\r\n \
|
||||
\ dm_view.counterparty_td ctd\r\n JOIN dm_view.counterparty_relation_type_texts\
|
||||
\ crt \r\n ON ctd.relation_type_code = crt.relation_type_code\r\n GROUP\
|
||||
\ BY\r\n ctd.counterparty_code, ctd.counterparty_full_name,\r\n crt.relation_type_code,crt.relation_type_name)\r\
|
||||
\n WHERE \r\n dt_from <= toDate('{{to_dt }}') AND \r\n \
|
||||
\ dt_to >= toDate('{{to_dt }}')\r\n ),\r\nt_debt as \r\n(SELECT dt, \r\n\
|
||||
counterparty_search_name,\r\ncp_relations_type.relation_type_code_name as relation_type_code_name,\r\
|
||||
\nunit_balance_code || ' ' || unit_balance_name as unit_balance_code_name,\r\n'1.\
|
||||
\ \u0421\u0443\u043C\u043C\u0430' as attribute,\r\nsum(debt_balance_subposition_no_revaluation_usd_amount)\
|
||||
\ as debt_amount,\r\nsumIf(debt_balance_subposition_no_revaluation_usd_amount,dt_overdue\
|
||||
\ < dt) as overdue_amount\r\nfrom dm_view.account_debt_for_working_capital t_debt\r\
|
||||
\njoin cp_relations_type ON\r\ncp_relations_type.counterparty_code = t_debt.counterparty_code\r\
|
||||
\nwhere dt = toLastDayOfMonth(dt)\r\nand match(general_ledger_account_code,'((62)|(60)|(76))')\r\
|
||||
\nand debit_or_credit = 'S'\r\nand account_type = 'D'\r\nand dt between addMonths(toDate('{{to_dt\
|
||||
\ }}'),-12) and toDate('{{to_dt }}')\r\ngroup by dt, counterparty_search_name,unit_balance_code_name,relation_type_code_name\r\
|
||||
\n),\r\n\r\nt_transaction_count_base as \r\n(\r\nselect *,\r\ncp_relations_type.relation_type_code_name\
|
||||
\ as relation_type_code_name,\r\nunit_balance_code || ' ' || unit_balance_name as\
|
||||
\ unit_balance_code_name,\r\n case when dt_overdue<dt_clearing then\r\n \
|
||||
\ dateDiff(day, dt_overdue, dt_clearing) \r\n else 0\r\n end\
|
||||
\ as overdue_days\r\nfrom dm_view.accounting_documents_leading_to_debt t_docs\r\n\
|
||||
join cp_relations_type ON\r\ncp_relations_type.counterparty_code = t_docs.counterparty_code\r\
|
||||
\nwhere 1=1\r\n\r\nand match(general_ledger_account_code,'((62)|(60)|(76))')\r\n\
|
||||
and debit_or_credit = 'S'\r\nand account_type = 'D'\r\n)\r\n\r\nselect * from t_debt\r\
|
||||
\n\r\nunion all \r\n\r\nselect toLastDayOfMonth(dt_debt) as dt, \r\ncounterparty_search_name,\r\
|
||||
\nrelation_type_code_name,\r\nunit_balance_code_name,\r\n'2. \u043A\u043E\u043B\u0438\
|
||||
\u0447\u0435\u0441\u0442\u0432\u043E \u0442\u0440\u0430\u043D\u0437\u0430\u043A\u0446\
|
||||
\u0438\u0439 \u0432 \u043C\u0435\u0441\u044F\u0446' as attribute,\r\ncount(1) as\
|
||||
\ debt_amount,\r\nnull as overdue_amount\r\nfrom t_transaction_count_base\r\nwhere\
|
||||
\ dt_debt between addMonths(toDate('{{to_dt }}'),-12) and toDate('{{to_dt }}')\r\
|
||||
\ngroup by toLastDayOfMonth(dt_debt), \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\
|
||||
\nunit_balance_code_name,attribute\r\n\r\nunion all \r\n\r\nselect toLastDayOfMonth(dt_clearing)\
|
||||
\ as dt, \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\nunit_balance_code_name,\r\
|
||||
\n'2. \u043A\u043E\u043B\u0438\u0447\u0435\u0441\u0442\u0432\u043E \u0442\u0440\u0430\
|
||||
\u043D\u0437\u0430\u043A\u0446\u0438\u0439 \u0432 \u043C\u0435\u0441\u044F\u0446\
|
||||
' as attribute,\r\nnull as debt_amount,\r\ncount(1) as overdue_amount\r\nfrom t_transaction_count_base\r\
|
||||
\nwhere dt_clearing between addMonths(toDate('{{to_dt }}'),-12) and toDate('{{to_dt\
|
||||
\ }}')\r\nand overdue_days > 0\r\ngroup by toLastDayOfMonth(dt_clearing), \r\ncounterparty_search_name,\r\
|
||||
\nrelation_type_code_name,\r\nunit_balance_code_name,attribute\r\n\r\nunion all\
|
||||
\ \r\n\r\nselect toLastDayOfMonth(dt_clearing) as dt, \r\ncounterparty_search_name,\r\
|
||||
\nrelation_type_code_name,\r\nunit_balance_code_name,\r\nmultiIf(\r\noverdue_days\
|
||||
\ < 30,'3. \u0434\u043E 30',\r\noverdue_days between 30 and 60, '4. \u043E\u0442\
|
||||
\ 30 \u0434\u043E 60',\r\noverdue_days between 61 and 90, '5. \u043E\u0442 61 \u0434\
|
||||
\u043E 90',\r\noverdue_days>90,'6. \u0431\u043E\u043B\u0435\u0435 90 \u0434\u043D\
|
||||
',\r\nnull\r\n)\r\n as attribute,\r\nnull as debt_amount,\r\ncount(1) as overdue_amount\r\
|
||||
\nfrom t_transaction_count_base\r\nwhere dt_clearing between addMonths(toDate('{{to_dt\
|
||||
\ }}'),-12) and toDate('{{to_dt }}')\r\nand overdue_days > 0\r\ngroup by toLastDayOfMonth(dt_clearing),\
|
||||
\ \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\nattribute,unit_balance_code_name,attribute\r\
|
||||
\n"
|
||||
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
uuid: 9e645dc0-da25-4f61-9465-6e649b0bc4b1
metrics:
- metric_name: m_debt_amount
  verbose_name: "\u0414\u0417, $"
  metric_type: count
  expression: sum(debt_amount)
  description: null
  d3format: null
  currency: null
  extra:
    warning_markdown: ''
  warning_text: null
- metric_name: m_overdue_amount
  verbose_name: "\u041F\u0414\u0417, $"
  metric_type: null
  expression: sum(overdue_amount)
  description: null
  d3format: null
  currency: null
  extra:
    warning_markdown: ''
  warning_text: null
columns:
- column_name: debt_amount
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(Decimal(38, 2))
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: overdue_amount
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(Decimal(38, 2))
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: dt
  verbose_name: null
  is_dttm: true
  is_active: true
  type: Nullable(Date)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: unit_balance_code_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: relation_type_code_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: counterparty_search_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: attribute
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
version: 1.0.0
database_uuid: 97aced68-326a-4094-b381-27980560efa9
@@ -0,0 +1,3 @@
version: 1.0.0
type: Dashboard
timestamp: '2026-01-14T11:21:08.078620+00:00'
@@ -1,79 +0,0 @@
# [DEF:debug_db_api:Module]
#
# @SEMANTICS: debug, api, database, script
# @PURPOSE: Script for debugging the structure of the database API response.
# @LAYER: App
# @RELATION: DEPENDS_ON -> superset_tool.client
# @RELATION: DEPENDS_ON -> superset_tool.utils
# @PUBLIC_API: debug_database_api

# [SECTION: IMPORTS]
import json
import logging
from superset_tool.client import SupersetClient
from superset_tool.utils.init_clients import setup_clients
from superset_tool.utils.logger import SupersetLogger
# [/SECTION]

# [DEF:debug_database_api:Function]
# @PURPOSE: Debugs the structure of the database API response.
# @RELATION: CALLS -> setup_clients
# @RELATION: CALLS -> client.get_databases
def debug_database_api():
    logger = SupersetLogger(name="debug_db_api", level=logging.DEBUG)

    # Initialize the clients
    clients = setup_clients(logger)
    # Log JWT bearer tokens for each client
    for env_name, client in clients.items():
        try:
            # Ensure authentication (access token fetched via headers property)
            _ = client.headers
            token = client.network._tokens.get("access_token")
            logger.info(f"[debug_database_api][Token] Bearer token for {env_name}: {token}")
        except Exception as exc:
            logger.error(f"[debug_database_api][Token] Failed to retrieve token for {env_name}: {exc}", exc_info=True)

    # Check which environments are available
    print("Available environments:")
    for env_name, client in clients.items():
        print(f"  {env_name}: {client.config.base_url}")

    # Pick two environments for testing
    if len(clients) < 2:
        print("Not enough environments for testing")
        return

    env_names = list(clients.keys())[:2]
    from_env, to_env = env_names[0], env_names[1]

    from_client = clients[from_env]
    to_client = clients[to_env]

    print(f"\nTesting the API against environments: {from_env} -> {to_env}")

    try:
        # Fetch the list of databases from the first environment
        print(f"\nFetching databases from {from_env}:")
        count, dbs = from_client.get_databases()
        print(f"Found {count} databases")
        print("Full API response:")
        print(json.dumps({"count": count, "result": dbs}, indent=2, ensure_ascii=False))

        # Fetch the list of databases from the second environment
        print(f"\nFetching databases from {to_env}:")
        count, dbs = to_client.get_databases()
        print(f"Found {count} databases")
        print("Full API response:")
        print(json.dumps({"count": count, "result": dbs}, indent=2, ensure_ascii=False))

    except Exception as e:
        print(f"Error while testing the API: {e}")
        import traceback
        traceback.print_exc()
# [/DEF:debug_database_api:Function]

if __name__ == "__main__":
    debug_database_api()

# [/DEF:debug_db_api:Module]
7
frontend/.eslintignore
Normal file
@@ -0,0 +1,7 @@
node_modules/
dist/
build/
.svelte-kit/
.vite/
coverage/
*.min.js
9
frontend/.prettierignore
Normal file
@@ -0,0 +1,9 @@
node_modules/
dist/
build/
.svelte-kit/
.vite/
coverage/
package-lock.json
yarn.lock
pnpm-lock.yaml
@@ -1,13 +0,0 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>frontend</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.js"></script>
  </body>
</html>
@@ -1,4 +1,5 @@
{
  "extends": "./.svelte-kit/tsconfig.json",
  "compilerOptions": {
    "moduleResolution": "bundler",
    "target": "ESNext",

10
frontend/package-lock.json
generated
@@ -912,7 +912,6 @@
|
||||
"integrity": "sha512-Vp3zX/qlwerQmHMP6x0Ry1oY7eKKRcOWGc2P59srOp4zcqyn+etJyQpELgOi4+ZSUgteX8Y387NuwruLgGXLUQ==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"@standard-schema/spec": "^1.0.0",
|
||||
"@sveltejs/acorn-typescript": "^1.0.5",
|
||||
@@ -952,7 +951,6 @@
|
||||
"integrity": "sha512-YZs/OSKOQAQCnJvM/P+F1URotNnYNeU3P2s4oIpzm1uFaqUEqRxUB0g5ejMjEb5Gjb9/PiBI5Ktrq4rUUF8UVQ==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"@sveltejs/vite-plugin-svelte-inspector": "^5.0.0",
|
||||
"debug": "^4.4.1",
|
||||
@@ -1006,7 +1004,6 @@
|
||||
"integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"bin": {
|
||||
"acorn": "bin/acorn"
|
||||
},
|
||||
@@ -1155,7 +1152,6 @@
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"baseline-browser-mapping": "^2.9.0",
|
||||
"caniuse-lite": "^1.0.30001759",
|
||||
@@ -1613,7 +1609,6 @@
|
||||
"integrity": "sha512-/imKNG4EbWNrVjoNC/1H5/9GFy+tqjGBHCaSsN+P2RnPqjsLmv6UD3Ej+Kj8nBWaRAwyk7kK5ZUc+OEatnTR3A==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"bin": {
|
||||
"jiti": "bin/jiti.js"
|
||||
}
|
||||
@@ -1851,7 +1846,6 @@
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"nanoid": "^3.3.11",
|
||||
"picocolors": "^1.1.1",
|
||||
@@ -2224,7 +2218,6 @@
|
||||
"integrity": "sha512-ZhLtvroYxUxr+HQJfMZEDRsGsmU46x12RvAv/zi9584f5KOX7bUrEbhPJ7cKFmUvZTJXi/CFZUYwDC6M1FigPw==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"@jridgewell/remapping": "^2.3.4",
|
||||
"@jridgewell/sourcemap-codec": "^1.5.0",
|
||||
@@ -2348,7 +2341,6 @@
|
||||
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"engines": {
|
||||
"node": ">=12"
|
||||
},
|
||||
@@ -2430,7 +2422,6 @@
|
||||
"integrity": "sha512-dZwN5L1VlUBewiP6H9s2+B3e3Jg96D0vzN+Ry73sOefebhYr9f94wwkMNN/9ouoU8pV1BqA1d1zGk8928cx0rg==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"esbuild": "^0.27.0",
|
||||
"fdir": "^6.5.0",
|
||||
@@ -2524,7 +2515,6 @@
|
||||
"integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"engines": {
|
||||
"node": ">=12"
|
||||
},
|
||||
|
||||
@@ -1,113 +0,0 @@
|
||||
<!-- [DEF:App:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: main, entrypoint, layout, navigation
|
||||
@PURPOSE: The root component of the frontend application. Manages navigation and layout.
|
||||
@LAYER: UI
|
||||
@RELATION: DEPENDS_ON -> frontend/src/pages/Dashboard.svelte
|
||||
@RELATION: DEPENDS_ON -> frontend/src/pages/Settings.svelte
|
||||
@RELATION: DEPENDS_ON -> frontend/src/lib/stores.js
|
||||
|
||||
@INVARIANT: Navigation state must be persisted in the currentPage store.
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { get } from 'svelte/store';
|
||||
import Dashboard from './pages/Dashboard.svelte';
|
||||
import Settings from './pages/Settings.svelte';
|
||||
import { selectedPlugin, selectedTask, currentPage } from './lib/stores.js';
|
||||
import TaskRunner from './components/TaskRunner.svelte';
|
||||
import DynamicForm from './components/DynamicForm.svelte';
|
||||
import { api } from './lib/api.js';
|
||||
import Toast from './components/Toast.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [DEF:handleFormSubmit:Function]
|
||||
/**
|
||||
* @purpose Handles form submission for task creation.
|
||||
* @param {CustomEvent} event - The submit event from DynamicForm.
|
||||
*/
|
||||
async function handleFormSubmit(event) {
|
||||
console.log("[App.handleFormSubmit][Action] Handling form submission for task creation.");
|
||||
const params = event.detail;
|
||||
try {
|
||||
const plugin = get(selectedPlugin);
|
||||
const task = await api.createTask(plugin.id, params);
|
||||
selectedTask.set(task);
|
||||
selectedPlugin.set(null);
|
||||
console.log(`[App.handleFormSubmit][Coherence:OK] Task created id=${task.id}`);
|
||||
} catch (error) {
|
||||
console.error(`[App.handleFormSubmit][Coherence:Failed] Task creation failed error=${error}`);
|
||||
}
|
||||
}
|
||||
// [/DEF:handleFormSubmit:Function]
|
||||
|
||||
// [DEF:navigate:Function]
|
||||
/**
|
||||
* @purpose Changes the current page and resets state.
|
||||
* @param {string} page - Target page name.
|
||||
*/
|
||||
function navigate(page) {
|
||||
console.log(`[App.navigate][Action] Navigating to ${page}.`);
|
||||
// Reset selection first
|
||||
if (page !== get(currentPage)) {
|
||||
selectedPlugin.set(null);
|
||||
selectedTask.set(null);
|
||||
}
|
||||
// Then set page
|
||||
currentPage.set(page);
|
||||
}
|
||||
// [/DEF:navigate:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<Toast />
|
||||
|
||||
<main class="bg-gray-50 min-h-screen">
|
||||
<header class="bg-white shadow-md p-4 flex justify-between items-center">
|
||||
<button
|
||||
type="button"
|
||||
class="text-3xl font-bold text-gray-800 focus:outline-none"
|
||||
on:click={() => navigate('dashboard')}
|
||||
>
|
||||
Superset Tools
|
||||
</button>
|
||||
<nav class="space-x-4">
|
||||
<button
|
||||
type="button"
|
||||
on:click={() => navigate('dashboard')}
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$currentPage === 'dashboard' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Dashboard
|
||||
</button>
|
||||
<button
|
||||
type="button"
|
||||
on:click={() => navigate('settings')}
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$currentPage === 'settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Settings
|
||||
</button>
|
||||
</nav>
|
||||
</header>
|
||||
|
||||
<div class="p-4">
|
||||
{#if $currentPage === 'settings'}
|
||||
<Settings />
|
||||
{:else if $selectedTask}
|
||||
<TaskRunner />
|
||||
<button on:click={() => selectedTask.set(null)} class="mt-4 bg-blue-500 text-white p-2 rounded">
|
||||
Back to Task List
|
||||
</button>
|
||||
{:else if $selectedPlugin}
|
||||
<h2 class="text-2xl font-bold mb-4">{$selectedPlugin.name}</h2>
|
||||
<DynamicForm schema={$selectedPlugin.schema} on:submit={handleFormSubmit} />
|
||||
<button on:click={() => selectedPlugin.set(null)} class="mt-4 bg-gray-500 text-white p-2 rounded">
|
||||
Back to Dashboard
|
||||
</button>
|
||||
{:else}
|
||||
<Dashboard />
|
||||
{/if}
|
||||
</div>
|
||||
</main>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:App:Component] -->
|
||||
@@ -12,6 +12,7 @@
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher } from 'svelte';
|
||||
import type { DashboardMetadata } from '../types/dashboard';
|
||||
import GitManager from './git/GitManager.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
@@ -27,6 +28,12 @@
|
||||
let sortDirection: "asc" | "desc" = "asc";
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: UI STATE]
|
||||
let showGitManager = false;
|
||||
let gitDashboardId: number | null = null;
|
||||
let gitDashboardTitle = "";
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: DERIVED]
|
||||
$: filteredDashboards = dashboards.filter(d =>
|
||||
d.title.toLowerCase().includes(filterText.toLowerCase())
|
||||
@@ -61,6 +68,8 @@
|
||||
|
||||
// [DEF:handleSort:Function]
|
||||
// @PURPOSE: Toggles sort direction or changes sort column.
|
||||
// @PRE: column name is provided.
|
||||
// @POST: sortColumn and sortDirection state updated.
|
||||
function handleSort(column: keyof DashboardMetadata) {
|
||||
if (sortColumn === column) {
|
||||
sortDirection = sortDirection === "asc" ? "desc" : "asc";
|
||||
@@ -73,6 +82,8 @@
|
||||
|
||||
// [DEF:handleSelectionChange:Function]
|
||||
// @PURPOSE: Handles individual checkbox changes.
|
||||
// @PRE: dashboard ID and checked status provided.
|
||||
// @POST: selectedIds array updated and selectionChanged event dispatched.
|
||||
function handleSelectionChange(id: number, checked: boolean) {
|
||||
let newSelected = [...selectedIds];
|
||||
if (checked) {
|
||||
@@ -87,6 +98,8 @@
|
||||
|
||||
// [DEF:handleSelectAll:Function]
|
||||
// @PURPOSE: Handles select all checkbox.
|
||||
// @PRE: checked status provided.
|
||||
// @POST: selectedIds array updated for all paginated items and event dispatched.
|
||||
function handleSelectAll(checked: boolean) {
|
||||
let newSelected = [...selectedIds];
|
||||
if (checked) {
|
||||
@@ -105,6 +118,8 @@
|
||||
|
||||
// [DEF:goToPage:Function]
|
||||
// @PURPOSE: Changes current page.
|
||||
// @PRE: page index is provided.
|
||||
// @POST: currentPage state updated if within valid range.
|
||||
function goToPage(page: number) {
|
||||
if (page >= 0 && page < totalPages) {
|
||||
currentPage = page;
|
||||
@@ -112,6 +127,17 @@
|
||||
}
|
||||
// [/DEF:goToPage:Function]
|
||||
|
||||
// [DEF:openGit:Function]
|
||||
/**
|
||||
* @purpose Opens the Git management modal for a dashboard.
|
||||
*/
|
||||
function openGit(dashboard: DashboardMetadata) {
|
||||
gitDashboardId = dashboard.id;
|
||||
gitDashboardTitle = dashboard.title;
|
||||
showGitManager = true;
|
||||
}
|
||||
// [/DEF:openGit:Function]
|
||||
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
@@ -148,6 +174,7 @@
|
||||
<th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('status')}>
|
||||
Status {sortColumn === 'status' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
|
||||
</th>
|
||||
<th class="px-4 py-2 border-b">Git</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
@@ -167,6 +194,14 @@
|
||||
{dashboard.status}
|
||||
</span>
|
||||
</td>
|
||||
<td class="px-4 py-2 border-b">
|
||||
<button
|
||||
on:click={() => openGit(dashboard)}
|
||||
class="text-indigo-600 hover:text-indigo-900 text-sm font-medium"
|
||||
>
|
||||
Manage Git
|
||||
</button>
|
||||
</td>
|
||||
</tr>
|
||||
{/each}
|
||||
</tbody>
|
||||
@@ -196,6 +231,15 @@
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{#if showGitManager && gitDashboardId}
|
||||
<GitManager
|
||||
dashboardId={gitDashboardId}
|
||||
dashboardTitle={gitDashboardTitle}
|
||||
bind:show={showGitManager}
|
||||
/>
|
||||
{/if}
|
||||
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
|
||||
@@ -23,6 +23,8 @@
|
||||
// [DEF:handleSubmit:Function]
|
||||
/**
|
||||
* @purpose Dispatches the submit event with the form data.
|
||||
* @pre formData contains user input.
|
||||
* @post 'submit' event is dispatched with formData.
|
||||
*/
|
||||
function handleSubmit() {
|
||||
console.log("[DynamicForm][Action] Submitting form data.", { formData });
|
||||
@@ -33,6 +35,8 @@
|
||||
// [DEF:initializeForm:Function]
|
||||
/**
|
||||
* @purpose Initialize form data with default values from the schema.
|
||||
* @pre schema is provided and contains properties.
|
||||
* @post formData is initialized with default values or empty strings.
|
||||
*/
|
||||
function initializeForm() {
|
||||
if (schema && schema.properties) {
|
||||
|
||||
@@ -24,6 +24,8 @@
|
||||
// [DEF:handleSelect:Function]
|
||||
/**
|
||||
* @purpose Dispatches the selection change event.
|
||||
* @pre event.target must be an HTMLSelectElement.
|
||||
* @post selectedId is updated and 'change' event is dispatched.
|
||||
* @param {Event} event - The change event from the select element.
|
||||
*/
|
||||
function handleSelect(event: Event) {
|
||||
|
||||
@@ -25,6 +25,8 @@
|
||||
// [DEF:updateMapping:Function]
|
||||
/**
|
||||
* @purpose Updates a mapping for a specific source database.
|
||||
* @pre sourceUuid and targetUuid are provided.
|
||||
* @post 'update' event is dispatched.
|
||||
*/
|
||||
function updateMapping(sourceUuid: string, targetUuid: string) {
|
||||
dispatch('update', { sourceUuid, targetUuid });
|
||||
@@ -34,6 +36,8 @@
|
||||
// [DEF:getSuggestion:Function]
|
||||
/**
|
||||
* @purpose Finds a suggestion for a source database.
|
||||
* @pre sourceUuid is provided.
|
||||
* @post Returns matching suggestion object or undefined.
|
||||
*/
|
||||
function getSuggestion(sourceUuid: string) {
|
||||
return suggestions.find(s => s.source_db_uuid === sourceUuid);
|
||||
|
||||
@@ -25,6 +25,8 @@
|
||||
|
||||
// [DEF:resolve:Function]
|
||||
// @PURPOSE: Dispatches the resolution event with the selected mapping.
|
||||
// @PRE: selectedTargetUuid must be set.
|
||||
// @POST: 'resolve' event is dispatched and modal is hidden.
|
||||
function resolve() {
|
||||
if (!selectedTargetUuid) return;
|
||||
dispatch('resolve', {
|
||||
@@ -38,6 +40,8 @@
|
||||
|
||||
// [DEF:cancel:Function]
|
||||
// @PURPOSE: Cancels the mapping resolution modal.
|
||||
// @PRE: Modal is open.
|
||||
// @POST: 'cancel' event is dispatched and modal is hidden.
|
||||
function cancel() {
|
||||
dispatch('cancel');
|
||||
show = false;
|
||||
|
||||
@@ -29,18 +29,39 @@
|
||||
>
|
||||
Migration
|
||||
</a>
|
||||
<a
|
||||
href="/git"
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/git') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Git
|
||||
</a>
|
||||
<a
|
||||
href="/tasks"
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/tasks') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Tasks
|
||||
</a>
|
||||
<a
|
||||
href="/settings"
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Settings
|
||||
</a>
|
||||
<div class="relative inline-block group">
|
||||
<button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/tools') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
|
||||
Tools
|
||||
</button>
|
||||
<div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
|
||||
<a href="/tools/search" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Dataset Search</a>
|
||||
<a href="/tools/mapper" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Dataset Mapper</a>
|
||||
<a href="/tools/debug" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">System Debug</a>
|
||||
</div>
|
||||
</div>
|
||||
<div class="relative inline-block group">
|
||||
<button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/settings') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
|
||||
Settings
|
||||
</button>
|
||||
<div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
|
||||
<a href="/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">General Settings</a>
|
||||
<a href="/settings/connections" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Connections</a>
|
||||
<a href="/settings/git" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Git Integration</a>
|
||||
<a href="/settings/environments" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Environments</a>
|
||||
</div>
|
||||
</div>
|
||||
</nav>
|
||||
</header>
|
||||
<!-- [/DEF:Navbar:Component] -->
|
||||
|
||||
@@ -20,6 +20,8 @@
|
||||
|
||||
// [DEF:handleSubmit:Function]
|
||||
// @PURPOSE: Validates and dispatches the passwords to resume the task.
|
||||
// @PRE: All database passwords must be entered.
|
||||
// @POST: 'resume' event is dispatched with passwords.
|
||||
function handleSubmit() {
|
||||
if (submitting) return;
|
||||
|
||||
@@ -38,6 +40,8 @@
|
||||
|
||||
// [DEF:handleCancel:Function]
|
||||
// @PURPOSE: Cancels the password prompt.
|
||||
// @PRE: Modal is open.
|
||||
// @POST: 'cancel' event is dispatched and show is set to false.
|
||||
function handleCancel() {
|
||||
dispatch('cancel');
|
||||
show = false;
|
||||
|
||||
@@ -17,6 +17,8 @@
|
||||
|
||||
// [DEF:fetchTasks:Function]
|
||||
// @PURPOSE: Fetches the list of recent tasks from the API.
|
||||
// @PRE: None.
|
||||
// @POST: tasks array is updated and selectedTask status synchronized.
|
||||
async function fetchTasks() {
|
||||
try {
|
||||
const res = await fetch('/api/tasks?limit=10');
|
||||
@@ -47,6 +49,8 @@
|
||||
|
||||
// [DEF:clearTasks:Function]
|
||||
// @PURPOSE: Clears tasks from the history, optionally filtered by status.
|
||||
// @PRE: User confirms deletion via prompt.
|
||||
// @POST: Tasks are deleted from backend and list is re-fetched.
|
||||
async function clearTasks(status = null) {
|
||||
if (!confirm('Are you sure you want to clear tasks?')) return;
|
||||
try {
|
||||
@@ -66,6 +70,8 @@
|
||||
|
||||
// [DEF:selectTask:Function]
|
||||
// @PURPOSE: Selects a task and fetches its full details.
|
||||
// @PRE: task object is provided.
|
||||
// @POST: selectedTask store is updated with full task details.
|
||||
async function selectTask(task) {
|
||||
try {
|
||||
// Fetch the full task details (including logs) before setting it as selected
|
||||
@@ -86,6 +92,8 @@
|
||||
|
||||
// [DEF:getStatusColor:Function]
|
||||
// @PURPOSE: Returns the CSS color class for a given task status.
|
||||
// @PRE: status string is provided.
|
||||
// @POST: Returns tailwind color class string.
|
||||
function getStatusColor(status) {
|
||||
switch (status) {
|
||||
case 'SUCCESS': return 'bg-green-100 text-green-800';
|
||||
@@ -100,6 +108,8 @@
|
||||
|
||||
// [DEF:onMount:Function]
|
||||
// @PURPOSE: Initializes the component by fetching tasks and starting polling.
|
||||
// @PRE: Component is mounting.
|
||||
// @POST: Tasks are fetched and 5s polling interval is started.
|
||||
onMount(() => {
|
||||
fetchTasks();
|
||||
interval = setInterval(fetchTasks, 5000); // Poll every 5s
|
||||
@@ -108,6 +118,8 @@
|
||||
|
||||
// [DEF:onDestroy:Function]
|
||||
// @PURPOSE: Cleans up the polling interval when the component is destroyed.
|
||||
// @PRE: Component is being destroyed.
|
||||
// @POST: Polling interval is cleared.
|
||||
onDestroy(() => {
|
||||
clearInterval(interval);
|
||||
});
|
||||
|
||||
@@ -17,6 +17,8 @@
|
||||
|
||||
// [DEF:getStatusColor:Function]
|
||||
// @PURPOSE: Returns the CSS color class for a given task status.
|
||||
// @PRE: status string is provided.
|
||||
// @POST: Returns tailwind color class string.
|
||||
function getStatusColor(status: string) {
|
||||
switch (status) {
|
||||
case 'SUCCESS': return 'bg-green-100 text-green-800';
|
||||
@@ -32,6 +34,8 @@
|
||||
|
||||
// [DEF:formatTime:Function]
|
||||
// @PURPOSE: Formats a date string using date-fns.
|
||||
// @PRE: dateStr is a valid date string or null.
|
||||
// @POST: Returns human-readable relative time string.
|
||||
function formatTime(dateStr: string | null) {
|
||||
if (!dateStr) return 'N/A';
|
||||
try {
|
||||
@@ -44,6 +48,8 @@
|
||||
|
||||
// [DEF:handleTaskClick:Function]
|
||||
// @PURPOSE: Dispatches a select event when a task is clicked.
|
||||
// @PRE: taskId is provided.
|
||||
// @POST: 'select' event is dispatched with task ID.
|
||||
function handleTaskClick(taskId: string) {
|
||||
dispatch('select', { id: taskId });
|
||||
}
|
||||
|
||||
@@ -1,15 +1,16 @@
|
||||
<!-- [DEF:TaskLogViewer:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: task, log, viewer, modal
|
||||
@PURPOSE: Displays detailed logs for a specific task in a modal.
|
||||
@SEMANTICS: task, log, viewer, modal, inline
|
||||
@PURPOSE: Displays detailed logs for a specific task in a modal or inline.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/lib/api.js (inferred)
|
||||
@RELATION: USES -> frontend/src/services/taskService.js
|
||||
-->
|
||||
<script>
|
||||
import { createEventDispatcher, onMount, onDestroy } from 'svelte';
|
||||
import { getTaskLogs } from '../services/taskService.js';
|
||||
|
||||
export let show = false;
|
||||
export let inline = false;
|
||||
export let taskId = null;
|
||||
export let taskStatus = null; // To know if we should poll
|
||||
|
||||
@@ -22,17 +23,27 @@
|
||||
let autoScroll = true;
|
||||
let logContainer;
|
||||
|
||||
$: shouldShow = inline || show;
|
||||
|
||||
// [DEF:fetchLogs:Function]
|
||||
// @PURPOSE: Fetches logs for the current task.
|
||||
/**
|
||||
* @purpose Fetches logs for the current task.
|
||||
* @pre taskId must be set.
|
||||
* @post logs array is updated with data from taskService.
|
||||
* @side_effect Updates logs, loading, and error state.
|
||||
*/
|
||||
async function fetchLogs() {
|
||||
if (!taskId) return;
|
||||
console.log(`[fetchLogs][Action] Fetching logs for task context={{'taskId': '${taskId}'}}`);
|
||||
try {
|
||||
logs = await getTaskLogs(taskId);
|
||||
if (autoScroll) {
|
||||
scrollToBottom();
|
||||
}
|
||||
console.log(`[fetchLogs][Coherence:OK] Logs fetched context={{'count': ${logs.length}}}`);
|
||||
} catch (e) {
|
||||
error = e.message;
|
||||
console.error(`[fetchLogs][Coherence:Failed] Error fetching logs context={{'error': '${e.message}'}}`);
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
@@ -40,7 +51,11 @@
|
||||
// [/DEF:fetchLogs:Function]
|
||||
|
||||
// [DEF:scrollToBottom:Function]
|
||||
// @PURPOSE: Scrolls the log container to the bottom.
|
||||
/**
|
||||
* @purpose Scrolls the log container to the bottom.
|
||||
* @pre logContainer element must be bound.
|
||||
* @post logContainer scrollTop is set to scrollHeight.
|
||||
*/
|
||||
function scrollToBottom() {
|
||||
if (logContainer) {
|
||||
setTimeout(() => {
|
||||
@@ -51,7 +66,11 @@
|
||||
// [/DEF:scrollToBottom:Function]
|
||||
|
||||
// [DEF:handleScroll:Function]
|
||||
// @PURPOSE: Updates auto-scroll preference based on scroll position.
|
||||
/**
|
||||
* @purpose Updates auto-scroll preference based on scroll position.
|
||||
* @pre logContainer scroll event fired.
|
||||
* @post autoScroll boolean is updated.
|
||||
*/
|
||||
function handleScroll() {
|
||||
if (!logContainer) return;
|
||||
// If user scrolls up, disable auto-scroll
|
||||
@@ -62,7 +81,11 @@
|
||||
// [/DEF:handleScroll:Function]
|
||||
|
||||
// [DEF:close:Function]
|
||||
// @PURPOSE: Closes the log viewer modal.
|
||||
/**
|
||||
* @purpose Closes the log viewer modal.
|
||||
* @pre Modal is open.
|
||||
* @post Modal is closed and close event is dispatched.
|
||||
*/
|
||||
function close() {
|
||||
dispatch('close');
|
||||
show = false;
|
||||
@@ -70,7 +93,11 @@
|
||||
// [/DEF:close:Function]
|
||||
|
||||
// [DEF:getLogLevelColor:Function]
|
||||
// @PURPOSE: Returns the CSS color class for a given log level.
|
||||
/**
|
||||
* @purpose Returns the CSS color class for a given log level.
|
||||
* @pre level string is provided.
|
||||
* @post Returns tailwind color class string.
|
||||
*/
|
||||
function getLogLevelColor(level) {
|
||||
switch (level) {
|
||||
case 'INFO': return 'text-blue-600';
|
||||
@@ -82,8 +109,10 @@
|
||||
}
|
||||
// [/DEF:getLogLevelColor:Function]
|
||||
|
||||
// React to changes in show/taskId
|
||||
$: if (show && taskId) {
|
||||
// React to changes in show/taskId/taskStatus
|
||||
$: if (shouldShow && taskId) {
|
||||
if (interval) clearInterval(interval);
|
||||
|
||||
logs = [];
|
||||
loading = true;
|
||||
error = "";
|
||||
@@ -98,74 +127,120 @@
|
||||
}
|
||||
|
||||
// [DEF:onDestroy:Function]
|
||||
// @PURPOSE: Cleans up the polling interval.
|
||||
/**
|
||||
* @purpose Cleans up the polling interval.
|
||||
* @pre Component is being destroyed.
|
||||
* @post Polling interval is cleared.
|
||||
*/
|
||||
onDestroy(() => {
|
||||
if (interval) clearInterval(interval);
|
||||
});
|
||||
// [/DEF:onDestroy:Function]
|
||||
</script>
|
||||
|
||||
{#if show}
|
||||
<div class="fixed inset-0 z-50 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
|
||||
<div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
|
||||
<!-- Background overlay -->
|
||||
<div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true" on:click={close}></div>
|
||||
{#if shouldShow}
|
||||
{#if inline}
|
||||
<div class="flex flex-col h-full w-full p-4">
|
||||
<div class="flex justify-between items-center mb-4">
|
||||
<h3 class="text-lg font-medium text-gray-900">
|
||||
Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span>
|
||||
</h3>
|
||||
<button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
|
||||
</div>
|
||||
|
||||
<div class="flex-1 border rounded-md bg-gray-50 p-4 overflow-y-auto font-mono text-sm"
|
||||
bind:this={logContainer}
|
||||
on:scroll={handleScroll}>
|
||||
{#if loading && logs.length === 0}
|
||||
<p class="text-gray-500 text-center">Loading logs...</p>
|
||||
{:else if error}
|
||||
<p class="text-red-500 text-center">{error}</p>
|
||||
{:else if logs.length === 0}
|
||||
<p class="text-gray-500 text-center">No logs available.</p>
|
||||
{:else}
|
||||
{#each logs as log}
|
||||
<div class="mb-1 hover:bg-gray-100 p-1 rounded">
|
||||
<span class="text-gray-400 text-xs mr-2">
|
||||
{new Date(log.timestamp).toLocaleTimeString()}
|
||||
</span>
|
||||
<span class="font-bold text-xs mr-2 w-16 inline-block {getLogLevelColor(log.level)}">
|
||||
[{log.level}]
|
||||
</span>
|
||||
<span class="text-gray-800 break-words">
|
||||
{log.message}
|
||||
</span>
|
||||
{#if log.context}
|
||||
<div class="ml-24 text-xs text-gray-500 mt-1 bg-gray-100 p-1 rounded overflow-x-auto">
|
||||
<pre>{JSON.stringify(log.context, null, 2)}</pre>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
{/each}
|
||||
{/if}
|
||||
</div>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="fixed inset-0 z-50 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
|
||||
<div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
|
||||
<!-- Background overlay -->
|
||||
<div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true" on:click={close}></div>
|
||||
|
||||
<span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">​</span>
|
||||
<span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">​</span>
|
||||
|
||||
<div class="inline-block align-bottom bg-white rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-4xl sm:w-full">
|
||||
<div class="bg-white px-4 pt-5 pb-4 sm:p-6 sm:pb-4">
|
||||
<div class="sm:flex sm:items-start">
|
||||
<div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900 flex justify-between items-center" id="modal-title">
|
||||
<span>Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
|
||||
<button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
|
||||
</h3>
|
||||
|
||||
<div class="mt-4 border rounded-md bg-gray-50 p-4 h-96 overflow-y-auto font-mono text-sm"
|
||||
bind:this={logContainer}
|
||||
on:scroll={handleScroll}>
|
||||
{#if loading && logs.length === 0}
|
||||
<p class="text-gray-500 text-center">Loading logs...</p>
|
||||
{:else if error}
|
||||
<p class="text-red-500 text-center">{error}</p>
|
||||
{:else if logs.length === 0}
|
||||
<p class="text-gray-500 text-center">No logs available.</p>
|
||||
{:else}
|
||||
{#each logs as log}
|
||||
<div class="mb-1 hover:bg-gray-100 p-1 rounded">
|
||||
<span class="text-gray-400 text-xs mr-2">
|
||||
{new Date(log.timestamp).toLocaleTimeString()}
|
||||
</span>
|
||||
<span class="font-bold text-xs mr-2 w-16 inline-block {getLogLevelColor(log.level)}">
|
||||
[{log.level}]
|
||||
</span>
|
||||
<span class="text-gray-800 break-words">
|
||||
{log.message}
|
||||
</span>
|
||||
{#if log.context}
|
||||
<div class="ml-24 text-xs text-gray-500 mt-1 bg-gray-100 p-1 rounded overflow-x-auto">
|
||||
<pre>{JSON.stringify(log.context, null, 2)}</pre>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
{/each}
|
||||
{/if}
|
||||
<div class="inline-block align-bottom bg-white rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-4xl sm:w-full">
|
||||
<div class="bg-white px-4 pt-5 pb-4 sm:p-6 sm:pb-4">
|
||||
<div class="sm:flex sm:items-start">
|
||||
<div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900 flex justify-between items-center" id="modal-title">
|
||||
<span>Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
|
||||
<button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
|
||||
</h3>
|
||||
|
||||
<div class="mt-4 border rounded-md bg-gray-50 p-4 h-96 overflow-y-auto font-mono text-sm"
|
||||
bind:this={logContainer}
|
||||
on:scroll={handleScroll}>
|
||||
{#if loading && logs.length === 0}
|
||||
<p class="text-gray-500 text-center">Loading logs...</p>
|
||||
{:else if error}
|
||||
<p class="text-red-500 text-center">{error}</p>
|
||||
{:else if logs.length === 0}
|
||||
<p class="text-gray-500 text-center">No logs available.</p>
|
||||
{:else}
|
||||
{#each logs as log}
|
||||
<div class="mb-1 hover:bg-gray-100 p-1 rounded">
|
||||
<span class="text-gray-400 text-xs mr-2">
|
||||
{new Date(log.timestamp).toLocaleTimeString()}
|
||||
</span>
|
||||
<span class="font-bold text-xs mr-2 w-16 inline-block {getLogLevelColor(log.level)}">
|
||||
[{log.level}]
|
||||
</span>
|
||||
<span class="text-gray-800 break-words">
|
||||
{log.message}
|
||||
</span>
|
||||
{#if log.context}
|
||||
<div class="ml-24 text-xs text-gray-500 mt-1 bg-gray-100 p-1 rounded overflow-x-auto">
|
||||
<pre>{JSON.stringify(log.context, null, 2)}</pre>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
{/each}
|
||||
{/if}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
|
||||
<button
|
||||
type="button"
|
||||
class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
|
||||
on:click={close}
|
||||
>
|
||||
Close
|
||||
</button>
|
||||
<div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
|
||||
<button
|
||||
type="button"
|
||||
class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
|
||||
on:click={close}
|
||||
>
|
||||
Close
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
{/if}
|
||||
<!-- [/DEF:TaskLogViewer:Component] -->
|
||||
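TaskLogViewer imports getTaskLogs from frontend/src/services/taskService.js, which is not part of this compare view. A minimal sketch of what that helper is assumed to look like; the /api/tasks/{id}/logs route is an assumption inferred from the other /api/tasks calls in this diff, not a confirmed endpoint:

```javascript
// Hypothetical sketch of the taskService helper used by TaskLogViewer; not
// the actual module from this repository. The endpoint path is assumed.
export async function getTaskLogs(taskId) {
  const res = await fetch(`/api/tasks/${taskId}/logs`); // assumed route
  if (!res.ok) {
    throw new Error(`Failed to fetch logs for task ${taskId}: ${res.status}`);
  }
  // Expected shape: [{ timestamp, level, message, context? }, ...]
  return res.json();
}
```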
@@ -38,6 +38,8 @@
|
||||
// [DEF:connect:Function]
|
||||
/**
|
||||
* @purpose Establishes WebSocket connection with exponential backoff.
|
||||
* @pre selectedTask must be set in the store.
|
||||
* @post WebSocket instance created and listeners attached.
|
||||
*/
|
||||
function connect() {
|
||||
const task = get(selectedTask);
|
||||
@@ -131,6 +133,8 @@
|
||||
|
||||
// [DEF:fetchTargetDatabases:Function]
|
||||
// @PURPOSE: Fetches the list of databases in the target environment.
|
||||
// @PRE: task must be selected and have a target environment parameter.
|
||||
// @POST: targetDatabases array is populated with database objects.
|
||||
async function fetchTargetDatabases() {
|
||||
const task = get(selectedTask);
|
||||
if (!task || !task.params.to_env) return;
|
||||
@@ -153,6 +157,8 @@
|
||||
|
||||
// [DEF:handleMappingResolve:Function]
|
||||
// @PURPOSE: Handles the resolution of a missing database mapping.
|
||||
// @PRE: event.detail contains sourceDbUuid, targetDbUuid, and targetDbName.
|
||||
// @POST: Mapping is saved and task is resumed.
|
||||
async function handleMappingResolve(event) {
|
||||
const task = get(selectedTask);
|
||||
const { sourceDbUuid, targetDbUuid, targetDbName } = event.detail;
|
||||
@@ -196,6 +202,8 @@
|
||||
|
||||
// [DEF:handlePasswordResume:Function]
|
||||
// @PURPOSE: Handles the submission of database passwords to resume a task.
|
||||
// @PRE: event.detail contains passwords dictionary.
|
||||
// @POST: Task resume endpoint is called with passwords.
|
||||
async function handlePasswordResume(event) {
|
||||
const task = get(selectedTask);
|
||||
const { passwords } = event.detail;
|
||||
@@ -218,6 +226,8 @@
|
||||
|
||||
// [DEF:startDataTimeout:Function]
|
||||
// @PURPOSE: Starts a timeout to detect when the log stream has stalled.
|
||||
// @PRE: None.
|
||||
// @POST: dataTimeout is set to check connection status after 5s.
|
||||
function startDataTimeout() {
|
||||
waitingForData = false;
|
||||
dataTimeout = setTimeout(() => {
|
||||
@@ -230,6 +240,8 @@
|
||||
|
||||
// [DEF:resetDataTimeout:Function]
|
||||
// @PURPOSE: Resets the data stall timeout.
|
||||
// @PRE: dataTimeout must be active.
|
||||
// @POST: dataTimeout is cleared and restarted.
|
||||
function resetDataTimeout() {
|
||||
clearTimeout(dataTimeout);
|
||||
waitingForData = false;
|
||||
@@ -239,6 +251,8 @@
|
||||
|
||||
// [DEF:onMount:Function]
|
||||
// @PURPOSE: Initializes the component and subscribes to task selection changes.
|
||||
// @PRE: Svelte component is mounting.
|
||||
// @POST: Store subscription is created and returned for cleanup.
|
||||
onMount(() => {
|
||||
// Subscribe to selectedTask changes
|
||||
const unsubscribe = selectedTask.subscribe(task => {
|
||||
@@ -267,6 +281,8 @@
|
||||
// [DEF:onDestroy:Function]
|
||||
/**
|
||||
* @purpose Close WebSocket connection when the component is destroyed.
|
||||
* @pre Component is being destroyed.
|
||||
* @post WebSocket is closed and timeouts are cleared.
|
||||
*/
|
||||
onDestroy(() => {
|
||||
clearTimeout(reconnectTimeout);
|
||||
|
||||
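The connect() function in TaskRunner.svelte is documented above as establishing the WebSocket connection with exponential backoff, but its body falls outside this hunk. A minimal sketch of that pattern, assuming nothing about the real implementation beyond the documented contract:

```javascript
// Illustrative reconnect-with-backoff sketch; names such as MAX_DELAY_MS are
// placeholders, not identifiers from TaskRunner.svelte.
let attempts = 0;
const MAX_DELAY_MS = 30000;

function connectWithBackoff(url, onMessage) {
  const ws = new WebSocket(url);
  ws.onmessage = onMessage;
  ws.onopen = () => {
    attempts = 0; // reset the backoff window after a successful connection
  };
  ws.onclose = () => {
    // Double the delay on each failure, capped at MAX_DELAY_MS.
    const delay = Math.min(1000 * 2 ** attempts, MAX_DELAY_MS);
    attempts += 1;
    setTimeout(() => connectWithBackoff(url, onMessage), delay);
  };
  return ws;
}
```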
170
frontend/src/components/git/BranchSelector.svelte
Normal file
@@ -0,0 +1,170 @@
|
||||
<!-- [DEF:BranchSelector:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: git, branch, selection, checkout
|
||||
@PURPOSE: UI for selecting and creating Git branches.
|
||||
@LAYER: Component
|
||||
@RELATION: CALLS -> gitService.getBranches
|
||||
@RELATION: CALLS -> gitService.checkoutBranch
|
||||
@RELATION: CALLS -> gitService.createBranch
|
||||
@RELATION: DISPATCHES -> change
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount, createEventDispatcher } from 'svelte';
|
||||
import { gitService } from '../../services/gitService';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let dashboardId;
|
||||
export let currentBranch = 'main';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let branches = [];
|
||||
let loading = false;
|
||||
let showCreate = false;
|
||||
let newBranchName = '';
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:onMount:Function]
|
||||
onMount(async () => {
|
||||
await loadBranches();
|
||||
});
|
||||
// [/DEF:onMount:Function]
|
||||
|
||||
// [DEF:loadBranches:Function]
|
||||
/**
|
||||
* @purpose Loads the list of branches for the dashboard.
* @post branches is updated.
|
||||
*/
|
||||
async function loadBranches() {
|
||||
console.log(`[BranchSelector][Action] Loading branches for dashboard ${dashboardId}`);
|
||||
loading = true;
|
||||
try {
|
||||
branches = await gitService.getBranches(dashboardId);
|
||||
console.log(`[BranchSelector][Coherence:OK] Loaded ${branches.length} branches`);
|
||||
} catch (e) {
|
||||
console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
|
||||
toast('Failed to load branches', 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:loadBranches:Function]
|
||||
|
||||
// [DEF:handleSelect:Function]
|
||||
function handleSelect(event) {
|
||||
handleCheckout(event.target.value);
|
||||
}
|
||||
// [/DEF:handleSelect:Function]
|
||||
|
||||
// [DEF:handleCheckout:Function]
|
||||
/**
|
||||
* @purpose Switches the current branch.
* @param {string} branchName - Branch name.
* @post currentBranch is updated and a change event is dispatched.
|
||||
*/
|
||||
async function handleCheckout(branchName) {
|
||||
console.log(`[BranchSelector][Action] Checking out branch ${branchName}`);
|
||||
try {
|
||||
await gitService.checkoutBranch(dashboardId, branchName);
|
||||
currentBranch = branchName;
|
||||
dispatch('change', { branch: branchName });
|
||||
toast(`Switched to ${branchName}`, 'success');
|
||||
console.log(`[BranchSelector][Coherence:OK] Checked out ${branchName}`);
|
||||
} catch (e) {
|
||||
console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
|
||||
toast(e.message, 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:handleCheckout:Function]
|
||||
|
||||
// [DEF:handleCreate:Function]
|
||||
/**
|
||||
* @purpose Creates a new branch.
* @post The new branch is created and the branch list is reloaded.
|
||||
*/
|
||||
async function handleCreate() {
|
||||
if (!newBranchName) return;
|
||||
console.log(`[BranchSelector][Action] Creating branch ${newBranchName} from ${currentBranch}`);
|
||||
try {
|
||||
await gitService.createBranch(dashboardId, newBranchName, currentBranch);
|
||||
toast(`Created branch ${newBranchName}`, 'success');
|
||||
showCreate = false;
|
||||
newBranchName = '';
|
||||
await loadBranches();
|
||||
console.log(`[BranchSelector][Coherence:OK] Branch created`);
|
||||
} catch (e) {
|
||||
console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
|
||||
toast(e.message, 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:handleCreate:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="space-y-2">
|
||||
<div class="flex items-center space-x-2">
|
||||
<div class="relative">
|
||||
<select
|
||||
value={currentBranch}
|
||||
on:change={handleSelect}
|
||||
disabled={loading}
|
||||
class="bg-white border rounded px-3 py-1 text-sm focus:outline-none focus:ring-2 focus:ring-blue-500"
|
||||
>
|
||||
{#each branches as branch}
|
||||
<option value={branch.name}>{branch.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
{#if loading}
|
||||
<span class="absolute -right-6 top-1">
|
||||
<svg class="animate-spin h-4 w-4 text-blue-600" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
|
||||
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
|
||||
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||
</svg>
|
||||
</span>
|
||||
{/if}
|
||||
</div>
|
||||
|
||||
<button
|
||||
on:click={() => showCreate = !showCreate}
|
||||
disabled={loading}
|
||||
class="text-blue-600 hover:text-blue-800 text-sm font-medium disabled:opacity-50"
|
||||
>
|
||||
+ New Branch
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{#if showCreate}
|
||||
<div class="flex items-center space-x-1 bg-gray-50 p-2 rounded border border-dashed">
|
||||
<input
|
||||
type="text"
|
||||
bind:value={newBranchName}
|
||||
placeholder="branch-name"
|
||||
disabled={loading}
|
||||
class="border rounded px-2 py-1 text-sm w-full max-w-[150px]"
|
||||
/>
|
||||
<button
|
||||
on:click={handleCreate}
|
||||
disabled={loading || !newBranchName}
|
||||
class="bg-green-600 text-white px-3 py-1 rounded text-xs font-medium hover:bg-green-700 disabled:opacity-50"
|
||||
>
|
||||
{loading ? '...' : 'Create'}
|
||||
</button>
|
||||
<button
|
||||
on:click={() => showCreate = false}
|
||||
disabled={loading}
|
||||
class="text-gray-500 hover:text-gray-700 text-xs px-2 py-1 disabled:opacity-50"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:BranchSelector:Component] -->
|
||||
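BranchSelector calls gitService.getBranches, gitService.checkoutBranch, and gitService.createBranch, but frontend/src/services/gitService is not included in this compare view. A hedged sketch of what such a client could look like; the /api/git/... routes and payload field names are assumptions, not the confirmed backend contract:

```javascript
// Hypothetical gitService sketch (branch operations only). Routes and payload
// keys are illustrative assumptions.
async function request(path, options = {}) {
  const res = await fetch(path, {
    headers: { 'Content-Type': 'application/json' },
    ...options,
  });
  if (!res.ok) {
    throw new Error(`Git API request failed: ${res.status}`);
  }
  return res.json();
}

export const gitService = {
  // List branches of the dashboard's local repository.
  getBranches: (dashboardId) => request(`/api/git/${dashboardId}/branches`),

  // Switch the working copy to an existing branch.
  checkoutBranch: (dashboardId, branch) =>
    request(`/api/git/${dashboardId}/checkout`, {
      method: 'POST',
      body: JSON.stringify({ branch }),
    }),

  // Create a new branch starting from the given base branch.
  createBranch: (dashboardId, name, fromBranch) =>
    request(`/api/git/${dashboardId}/branches`, {
      method: 'POST',
      body: JSON.stringify({ name, from_branch: fromBranch }),
    }),
};
```

CommitModal and CommitHistory below rely on further gitService methods (getStatus, getDiff, getHistory, commit); a matching sketch follows the CommitModal component.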
90
frontend/src/components/git/CommitHistory.svelte
Normal file
@@ -0,0 +1,90 @@
|
||||
<!-- [DEF:CommitHistory:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: git, history, commits, audit
|
||||
@PURPOSE: Displays the commit history for a specific dashboard.
|
||||
@LAYER: Component
|
||||
@RELATION: CALLS -> gitService.getHistory
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import { gitService } from '../../services/gitService';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let dashboardId;
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let history = [];
|
||||
let loading = false;
|
||||
// [/SECTION]
|
||||
|
||||
// [DEF:onMount:Function]
|
||||
/**
|
||||
* @purpose Load history when component is mounted.
|
||||
*/
|
||||
onMount(async () => {
|
||||
await loadHistory();
|
||||
});
|
||||
// [/DEF:onMount:Function]
|
||||
|
||||
// [DEF:loadHistory:Function]
|
||||
/**
|
||||
* @purpose Fetch commit history from the backend.
|
||||
* @post history state is updated.
|
||||
*/
|
||||
async function loadHistory() {
|
||||
console.log(`[CommitHistory][Action] Loading history for dashboard ${dashboardId}`);
|
||||
loading = true;
|
||||
try {
|
||||
history = await gitService.getHistory(dashboardId);
|
||||
console.log(`[CommitHistory][Coherence:OK] Loaded ${history.length} commits`);
|
||||
} catch (e) {
|
||||
console.error(`[CommitHistory][Coherence:Failed] ${e.message}`);
|
||||
toast('Failed to load commit history', 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:loadHistory:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="mt-6">
|
||||
<h3 class="text-lg font-semibold mb-4 flex justify-between items-center">
|
||||
Commit History
|
||||
<button on:click={loadHistory} class="text-sm text-blue-600 hover:underline">Refresh</button>
|
||||
</h3>
|
||||
|
||||
{#if loading}
|
||||
<div class="flex items-center space-x-2 text-gray-500">
|
||||
<svg class="animate-spin h-4 w-4 text-blue-600" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
|
||||
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
|
||||
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||
</svg>
|
||||
<span>Loading history...</span>
|
||||
</div>
|
||||
{:else if history.length === 0}
|
||||
<p class="text-gray-500 italic">No commits yet.</p>
|
||||
{:else}
|
||||
<div class="space-y-3 max-h-96 overflow-y-auto pr-2">
|
||||
{#each history as commit}
|
||||
<div class="border-l-2 border-blue-500 pl-4 py-1">
|
||||
<div class="flex justify-between items-start">
|
||||
<span class="font-medium text-sm">{commit.message}</span>
|
||||
<span class="text-xs text-gray-400 font-mono">{commit.hash.substring(0, 7)}</span>
|
||||
</div>
|
||||
<div class="text-xs text-gray-500 mt-1">
|
||||
{commit.author} • {new Date(commit.timestamp).toLocaleString()}
|
||||
</div>
|
||||
</div>
|
||||
{/each}
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:CommitHistory:Component] -->
|
||||
175
frontend/src/components/git/CommitModal.svelte
Normal file
@@ -0,0 +1,175 @@
|
||||
<!-- [DEF:CommitModal:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: git, commit, modal, version_control, diff
|
||||
@PURPOSE: Modal window for creating a commit with a preview of the changes (diff).
|
||||
@LAYER: Component
|
||||
@RELATION: CALLS -> gitService.commit
|
||||
@RELATION: CALLS -> gitService.getStatus
|
||||
@RELATION: CALLS -> gitService.getDiff
|
||||
@RELATION: DISPATCHES -> commit
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher, onMount } from 'svelte';
|
||||
import { gitService } from '../../services/gitService';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let dashboardId;
|
||||
export let show = false;
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let message = '';
|
||||
let committing = false;
|
||||
let status = null;
|
||||
let diff = '';
|
||||
let loading = false;
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:loadStatus:Function]
|
||||
/**
|
||||
* @purpose Loads the current repository status and diff.
* @pre dashboardId must be valid.
|
||||
*/
|
||||
async function loadStatus() {
|
||||
if (!dashboardId || !show) return;
|
||||
loading = true;
|
||||
try {
|
||||
console.log(`[CommitModal][Action] Loading status and diff for ${dashboardId}`);
|
||||
status = await gitService.getStatus(dashboardId);
|
||||
// Fetch both unstaged and staged diffs to show complete picture
|
||||
const unstagedDiff = await gitService.getDiff(dashboardId, null, false);
|
||||
const stagedDiff = await gitService.getDiff(dashboardId, null, true);
|
||||
|
||||
diff = "";
|
||||
if (stagedDiff) diff += "--- STAGED CHANGES ---\n" + stagedDiff + "\n\n";
|
||||
if (unstagedDiff) diff += "--- UNSTAGED CHANGES ---\n" + unstagedDiff;
|
||||
|
||||
if (!diff) diff = "";
|
||||
} catch (e) {
|
||||
console.error(`[CommitModal][Coherence:Failed] ${e.message}`);
|
||||
toast('Failed to load changes', 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:loadStatus:Function]
|
||||
|
||||
// [DEF:handleCommit:Function]
|
||||
/**
|
||||
* @purpose Creates a commit with the given message.
* @pre message must not be empty.
* @post The commit is created, the event is dispatched, and the modal is closed.
|
||||
*/
|
||||
async function handleCommit() {
|
||||
if (!message) return;
|
||||
console.log(`[CommitModal][Action] Committing changes for dashboard ${dashboardId}`);
|
||||
committing = true;
|
||||
try {
|
||||
await gitService.commit(dashboardId, message, []);
|
||||
toast('Changes committed successfully', 'success');
|
||||
dispatch('commit');
|
||||
show = false;
|
||||
message = '';
|
||||
console.log(`[CommitModal][Coherence:OK] Committed`);
|
||||
} catch (e) {
|
||||
console.error(`[CommitModal][Coherence:Failed] ${e.message}`);
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
committing = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleCommit:Function]
|
||||
|
||||
$: if (show) loadStatus();
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
{#if show}
|
||||
<div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50 p-4">
|
||||
<div class="bg-white p-6 rounded-lg shadow-xl w-full max-w-4xl max-h-[90vh] flex flex-col">
|
||||
<h2 class="text-xl font-bold mb-4">Commit Changes</h2>
|
||||
|
||||
<div class="flex flex-col md:flex-row gap-4 flex-1 overflow-hidden">
|
||||
<!-- Left: Message and Files -->
|
||||
<div class="w-full md:w-1/3 flex flex-col">
|
||||
<div class="mb-4">
|
||||
<label class="block text-sm font-medium text-gray-700 mb-1">Commit Message</label>
|
||||
<textarea
|
||||
bind:value={message}
|
||||
class="w-full border rounded p-2 h-32 focus:ring-2 focus:ring-blue-500 outline-none resize-none"
|
||||
placeholder="Describe your changes..."
|
||||
></textarea>
|
||||
</div>
|
||||
|
||||
{#if status}
|
||||
<div class="flex-1 overflow-y-auto">
|
||||
<h3 class="text-sm font-bold text-gray-500 uppercase mb-2">Changed Files</h3>
|
||||
<ul class="text-xs space-y-1">
|
||||
{#each status.staged_files as file}
|
||||
<li class="text-green-600 flex items-center font-semibold" title="Staged">
|
||||
<span class="mr-2">S</span> {file}
|
||||
</li>
|
||||
{/each}
|
||||
{#each status.modified_files as file}
|
||||
<li class="text-yellow-600 flex items-center" title="Modified (Unstaged)">
|
||||
<span class="mr-2">M</span> {file}
|
||||
</li>
|
||||
{/each}
|
||||
{#each status.untracked_files as file}
|
||||
<li class="text-blue-600 flex items-center" title="Untracked">
|
||||
<span class="mr-2">?</span> {file}
|
||||
</li>
|
||||
{/each}
|
||||
</ul>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
|
||||
<!-- Right: Diff Viewer -->
|
||||
<div class="w-full md:w-2/3 flex flex-col overflow-hidden border rounded bg-gray-50">
|
||||
<div class="bg-gray-200 px-3 py-1 text-xs font-bold text-gray-600 border-b">Changes Preview</div>
|
||||
<div class="flex-1 overflow-auto p-2">
|
||||
{#if loading}
|
||||
<div class="flex items-center justify-center h-full text-gray-500">Loading diff...</div>
|
||||
{:else if diff}
|
||||
<pre class="text-xs font-mono whitespace-pre-wrap">{diff}</pre>
|
||||
{:else}
|
||||
<div class="flex items-center justify-center h-full text-gray-500 italic">No changes detected</div>
|
||||
{/if}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="flex justify-end space-x-3 mt-6 pt-4 border-t">
|
||||
<button
|
||||
on:click={() => show = false}
|
||||
class="px-4 py-2 text-gray-600 hover:bg-gray-100 rounded"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
<button
|
||||
on:click={handleCommit}
|
||||
disabled={committing || !message || loading || (!status?.is_dirty && status?.staged_files?.length === 0)}
|
||||
class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 disabled:opacity-50"
|
||||
>
|
||||
{committing ? 'Committing...' : 'Commit'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
pre {
|
||||
tab-size: 4;
|
||||
}
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:CommitModal:Component] -->
|
||||
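As with the branch operations sketched after BranchSelector, the commit-related gitService calls used by CommitModal (getStatus, getDiff, commit) are assumed to be thin wrappers over backend routes; the paths and query parameters below are illustrative only:

```javascript
// Hypothetical commit-related gitService sketch; endpoints are assumptions.
async function gitRequest(path, options = {}) {
  const res = await fetch(path, {
    headers: { 'Content-Type': 'application/json' },
    ...options,
  });
  if (!res.ok) {
    throw new Error(`Git API request failed: ${res.status}`);
  }
  return res.json();
}

export const gitCommitApi = {
  // Staged / modified / untracked files plus an is_dirty flag.
  getStatus: (dashboardId) => gitRequest(`/api/git/${dashboardId}/status`),

  // Diff of one file (or the whole tree when filePath is null); `staged`
  // switches between the index and the working tree.
  getDiff: (dashboardId, filePath = null, staged = false) =>
    gitRequest(
      `/api/git/${dashboardId}/diff?staged=${staged}` +
        (filePath ? `&file=${encodeURIComponent(filePath)}` : '')
    ),

  // Record a commit with the given message and optional file list.
  commit: (dashboardId, message, files = []) =>
    gitRequest(`/api/git/${dashboardId}/commit`, {
      method: 'POST',
      body: JSON.stringify({ message, files }),
    }),
};
```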
142
frontend/src/components/git/ConflictResolver.svelte
Normal file
@@ -0,0 +1,142 @@
|
||||
<!-- [DEF:ConflictResolver:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: git, conflict, resolution, merge
|
||||
@PURPOSE: UI for resolving merge conflicts (Keep Mine / Keep Theirs).
|
||||
@LAYER: Component
|
||||
@RELATION: DISPATCHES -> resolve
|
||||
|
||||
@INVARIANT: User must resolve all conflicts before saving.
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher } from 'svelte';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
/** @type {Array<{file_path: string, mine: string, theirs: string}>} */
|
||||
export let conflicts = [];
|
||||
export let show = false;
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
const dispatch = createEventDispatcher();
|
||||
/** @type {Object.<string, 'mine' | 'theirs' | 'manual'>} */
|
||||
let resolutions = {};
|
||||
// [/SECTION]
|
||||
|
||||
// [DEF:resolve:Function]
|
||||
/**
|
||||
* @purpose Set resolution strategy for a file.
|
||||
* @pre file path must exist in conflicts array.
|
||||
* @post resolutions state is updated for the given file.
|
||||
* @param {string} file - File path.
|
||||
* @param {'mine'|'theirs'} strategy - Resolution strategy.
|
||||
* @side_effect Updates resolutions state.
|
||||
*/
|
||||
function resolve(file, strategy) {
|
||||
console.log(`[ConflictResolver][Action] Resolving ${file} with ${strategy}`);
|
||||
resolutions[file] = strategy;
|
||||
resolutions = { ...resolutions }; // Trigger update
|
||||
}
|
||||
// [/DEF:resolve:Function]
|
||||
|
||||
// [DEF:handleSave:Function]
|
||||
/**
|
||||
* @purpose Validate and submit resolutions.
|
||||
* @pre All conflicts must have a resolution.
|
||||
* @post 'resolve' event dispatched if valid.
|
||||
* @side_effect Dispatches event and closes modal.
|
||||
*/
|
||||
function handleSave() {
|
||||
// 1. Guard Clause (@PRE)
|
||||
const unresolved = conflicts.filter(c => !resolutions[c.file_path]);
|
||||
if (unresolved.length > 0) {
|
||||
console.warn(`[ConflictResolver][Coherence:Failed] ${unresolved.length} unresolved conflicts`);
|
||||
toast(`Please resolve all conflicts first. (${unresolved.length} remaining)`, 'error');
|
||||
return;
|
||||
}
|
||||
|
||||
// 2. Implementation
|
||||
console.log(`[ConflictResolver][Coherence:OK] All conflicts resolved`);
|
||||
dispatch('resolve', resolutions);
|
||||
show = false;
|
||||
}
|
||||
// [/DEF:handleSave:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
{#if show}
|
||||
<div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50 p-4">
|
||||
<div class="bg-white p-6 rounded-lg shadow-xl w-full max-w-5xl max-h-[90vh] flex flex-col">
|
||||
<h2 class="text-xl font-bold mb-4 text-red-600">Merge Conflicts Detected</h2>
|
||||
<p class="text-gray-600 mb-4">The following files have conflicts. Please choose how to resolve them.</p>
|
||||
|
||||
<div class="flex-1 overflow-y-auto space-y-6 mb-4 pr-2">
|
||||
{#each conflicts as conflict}
|
||||
<div class="border rounded-lg overflow-hidden">
|
||||
<div class="bg-gray-100 px-4 py-2 font-medium border-b flex justify-between items-center">
|
||||
<span>{conflict.file_path}</span>
|
||||
{#if resolutions[conflict.file_path]}
|
||||
<span class="text-xs bg-blue-100 text-blue-700 px-2 py-0.5 rounded-full uppercase font-bold">
|
||||
Resolved: {resolutions[conflict.file_path]}
|
||||
</span>
|
||||
{/if}
|
||||
</div>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-0 divide-x">
|
||||
<div class="p-0 flex flex-col">
|
||||
<div class="bg-blue-50 px-4 py-1 text-[10px] font-bold text-blue-600 uppercase border-b">Your Changes (Mine)</div>
|
||||
<div class="p-4 bg-white flex-1 overflow-auto">
|
||||
<pre class="text-xs font-mono whitespace-pre">{conflict.mine}</pre>
|
||||
</div>
|
||||
<button
|
||||
class="w-full py-2 text-sm font-medium border-t transition-colors {resolutions[conflict.file_path] === 'mine' ? 'bg-blue-600 text-white' : 'bg-gray-50 hover:bg-blue-50 text-blue-600'}"
|
||||
on:click={() => resolve(conflict.file_path, 'mine')}
|
||||
>
|
||||
Keep Mine
|
||||
</button>
|
||||
</div>
|
||||
<div class="p-0 flex flex-col">
|
||||
<div class="bg-green-50 px-4 py-1 text-[10px] font-bold text-green-600 uppercase border-b">Remote Changes (Theirs)</div>
|
||||
<div class="p-4 bg-white flex-1 overflow-auto">
|
||||
<pre class="text-xs font-mono whitespace-pre">{conflict.theirs}</pre>
|
||||
</div>
|
||||
<button
|
||||
class="w-full py-2 text-sm font-medium border-t transition-colors {resolutions[conflict.file_path] === 'theirs' ? 'bg-green-600 text-white' : 'bg-gray-50 hover:bg-green-50 text-green-600'}"
|
||||
on:click={() => resolve(conflict.file_path, 'theirs')}
|
||||
>
|
||||
Keep Theirs
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{/each}
|
||||
</div>
|
||||
|
||||
<div class="flex justify-end space-x-3 pt-4 border-t">
|
||||
<button
|
||||
on:click={() => show = false}
|
||||
class="px-4 py-2 text-gray-600 hover:bg-gray-100 rounded transition-colors"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
<button
|
||||
on:click={handleSave}
|
||||
class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 transition-colors shadow-sm"
|
||||
>
|
||||
Resolve & Continue
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
pre {
|
||||
tab-size: 4;
|
||||
}
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:ConflictResolver:Component] -->
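Note: the resolve event carries the resolutions map built above ({ [file_path]: 'mine' | 'theirs' }). A minimal sketch of a parent handler, assuming a hypothetical resolveConflicts call on gitService (GitManager further below currently leaves on:resolve as a stub):

async function handleResolve(event) {
  // event.detail is the resolutions map, e.g. { 'dashboards/sales.yaml': 'theirs' }
  await gitService.resolveConflicts(dashboardId, event.detail); // hypothetical API, not defined in this diff
  showConflicts = false;
}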
|
||||
147
frontend/src/components/git/DeploymentModal.svelte
Normal file
@@ -0,0 +1,147 @@
|
||||
<!-- [DEF:DeploymentModal:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: deployment, git, environment, modal
|
||||
@PURPOSE: Modal for deploying a dashboard to a target environment.
|
||||
@LAYER: Component
|
||||
@RELATION: CALLS -> frontend/src/services/gitService.js
|
||||
@RELATION: DISPATCHES -> deploy
|
||||
|
||||
@INVARIANT: Cannot deploy without a selected environment.
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount, createEventDispatcher } from 'svelte';
|
||||
import { gitService } from '../../services/gitService';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let dashboardId;
|
||||
export let show = false;
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let environments = [];
|
||||
let selectedEnv = '';
|
||||
let loading = false;
|
||||
let deploying = false;
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:loadStatus:Watcher]
|
||||
$: if (show) loadEnvironments();
|
||||
|
||||
// [DEF:loadEnvironments:Function]
|
||||
/**
|
||||
* @purpose Fetch available environments from API.
|
||||
* @post environments state is populated.
|
||||
* @side_effect Updates environments state.
|
||||
*/
|
||||
async function loadEnvironments() {
|
||||
console.log(`[DeploymentModal][Action] Loading environments`);
|
||||
loading = true;
|
||||
try {
|
||||
environments = await gitService.getEnvironments();
|
||||
if (environments.length > 0) {
|
||||
selectedEnv = environments[0].id;
|
||||
}
|
||||
console.log(`[DeploymentModal][Coherence:OK] Loaded ${environments.length} environments`);
|
||||
} catch (e) {
|
||||
console.error(`[DeploymentModal][Coherence:Failed] ${e.message}`);
|
||||
toast('Failed to load environments', 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:loadEnvironments:Function]
|
||||
|
||||
// [DEF:handleDeploy:Function]
|
||||
/**
|
||||
* @purpose Trigger deployment to selected environment.
|
||||
* @pre selectedEnv must be set.
|
||||
* @post deploy event dispatched on success.
|
||||
* @side_effect Triggers API call, closes modal, shows toast.
|
||||
*/
|
||||
async function handleDeploy() {
|
||||
if (!selectedEnv) return;
|
||||
console.log(`[DeploymentModal][Action] Deploying to ${selectedEnv}`);
|
||||
deploying = true;
|
||||
try {
|
||||
const result = await gitService.deploy(dashboardId, selectedEnv);
|
||||
toast(result.message || 'Deployment triggered successfully', 'success');
|
||||
dispatch('deploy');
|
||||
show = false;
|
||||
console.log(`[DeploymentModal][Coherence:OK] Deployment triggered`);
|
||||
} catch (e) {
|
||||
console.error(`[DeploymentModal][Coherence:Failed] ${e.message}`);
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
deploying = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleDeploy:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
{#if show}
|
||||
<div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
|
||||
<div class="bg-white p-6 rounded-lg shadow-xl w-96">
|
||||
<h2 class="text-xl font-bold mb-4">Deploy Dashboard</h2>
|
||||
|
||||
{#if loading}
|
||||
<p class="text-gray-500">Loading environments...</p>
|
||||
{:else if environments.length === 0}
|
||||
<p class="text-red-500 mb-4">No deployment environments configured.</p>
|
||||
<div class="flex justify-end">
|
||||
<button
|
||||
on:click={() => show = false}
|
||||
class="px-4 py-2 bg-gray-200 text-gray-800 rounded hover:bg-gray-300"
|
||||
>
|
||||
Close
|
||||
</button>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="mb-6">
|
||||
<label class="block text-sm font-medium text-gray-700 mb-2">Select Target Environment</label>
|
||||
<select
|
||||
bind:value={selectedEnv}
|
||||
class="w-full border rounded p-2 focus:ring-2 focus:ring-blue-500 outline-none bg-white"
|
||||
>
|
||||
{#each environments as env}
|
||||
<option value={env.id}>{env.name} ({env.superset_url})</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div class="flex justify-end space-x-3">
|
||||
<button
|
||||
on:click={() => show = false}
|
||||
class="px-4 py-2 text-gray-600 hover:bg-gray-100 rounded"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
<button
|
||||
on:click={handleDeploy}
|
||||
disabled={deploying || !selectedEnv}
|
||||
class="px-4 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:opacity-50 flex items-center"
|
||||
>
|
||||
{#if deploying}
|
||||
<svg class="animate-spin -ml-1 mr-2 h-4 w-4 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
|
||||
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
|
||||
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||
</svg>
|
||||
Deploying...
|
||||
{:else}
|
||||
Deploy
|
||||
{/if}
|
||||
</button>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:DeploymentModal:Component] -->
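Note: the option template above reads env.id, env.name and env.superset_url, so the records returned by gitService.getEnvironments() are expected to carry at least those fields. A minimal sketch with illustrative values:

const environments = [
  { id: 'env-prod', name: 'Production', superset_url: 'https://superset.example.com' }
];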
|
||||
284
frontend/src/components/git/GitManager.svelte
Normal file
@@ -0,0 +1,284 @@
|
||||
<!-- [DEF:GitManager:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: git, manager, dashboard, version_control, initialization
|
||||
@PURPOSE: Central component for managing Git operations for a specific dashboard.
|
||||
@LAYER: Component
|
||||
@RELATION: USES -> BranchSelector
|
||||
@RELATION: USES -> CommitModal
|
||||
@RELATION: USES -> CommitHistory
|
||||
@RELATION: USES -> DeploymentModal
|
||||
@RELATION: USES -> ConflictResolver
|
||||
@RELATION: CALLS -> gitService
|
||||
-->
|
||||
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import { gitService } from '../../services/gitService';
|
||||
import { addToast as toast } from '../../lib/toasts.js';
|
||||
import BranchSelector from './BranchSelector.svelte';
|
||||
import CommitModal from './CommitModal.svelte';
|
||||
import CommitHistory from './CommitHistory.svelte';
|
||||
import DeploymentModal from './DeploymentModal.svelte';
|
||||
import ConflictResolver from './ConflictResolver.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let dashboardId;
|
||||
export let dashboardTitle = "";
|
||||
export let show = false;
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let currentBranch = 'main';
|
||||
let showCommitModal = false;
|
||||
let showDeployModal = false;
|
||||
let showHistory = true;
|
||||
let showConflicts = false;
|
||||
let conflicts = [];
|
||||
let loading = false;
|
||||
let initialized = false;
|
||||
let checkingStatus = true;
|
||||
|
||||
// Initialization form state
|
||||
let configs = [];
|
||||
let selectedConfigId = "";
|
||||
let remoteUrl = "";
|
||||
// [/SECTION]
|
||||
|
||||
// [DEF:checkStatus:Function]
|
||||
/**
|
||||
* @purpose Checks whether a repository has been initialized for this dashboard.
|
||||
*/
|
||||
async function checkStatus() {
|
||||
checkingStatus = true;
|
||||
try {
|
||||
// If we can get branches, it means repo exists
|
||||
await gitService.getBranches(dashboardId);
|
||||
initialized = true;
|
||||
} catch (e) {
|
||||
initialized = false;
|
||||
// Load configs if not initialized
|
||||
configs = await gitService.getConfigs();
|
||||
if (configs.length > 0) selectedConfigId = configs[0].id;
|
||||
} finally {
|
||||
checkingStatus = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:checkStatus:Function]
|
||||
|
||||
// [DEF:handleInit:Function]
|
||||
/**
|
||||
* @purpose Initializes a repository for the dashboard.
|
||||
*/
|
||||
async function handleInit() {
|
||||
if (!selectedConfigId || !remoteUrl) {
|
||||
toast('Please select a Git server and provide a remote URL', 'error');
|
||||
return;
|
||||
}
|
||||
loading = true;
|
||||
try {
|
||||
await gitService.initRepository(dashboardId, selectedConfigId, remoteUrl);
|
||||
toast('Repository initialized successfully', 'success');
|
||||
initialized = true;
|
||||
} catch (e) {
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleInit:Function]
|
||||
|
||||
// [DEF:handleSync:Function]
|
||||
/**
|
||||
* @purpose Syncs the Superset state of the dashboard into the local Git repository.
|
||||
*/
|
||||
async function handleSync() {
|
||||
loading = true;
|
||||
try {
|
||||
// Try to get selected environment from localStorage (set by EnvSelector)
|
||||
const sourceEnvId = localStorage.getItem('selected_env_id');
|
||||
await gitService.sync(dashboardId, sourceEnvId);
|
||||
toast('Dashboard state synced to Git', 'success');
|
||||
} catch (e) {
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleSync:Function]
|
||||
|
||||
// [DEF:handlePush:Function]
|
||||
async function handlePush() {
|
||||
loading = true;
|
||||
try {
|
||||
await gitService.push(dashboardId);
|
||||
toast('Changes pushed to remote', 'success');
|
||||
} catch (e) {
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handlePush:Function]
|
||||
|
||||
// [DEF:handlePull:Function]
|
||||
async function handlePull() {
|
||||
loading = true;
|
||||
try {
|
||||
await gitService.pull(dashboardId);
|
||||
toast('Changes pulled from remote', 'success');
|
||||
} catch (e) {
|
||||
toast(e.message, 'error');
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handlePull:Function]
|
||||
|
||||
onMount(checkStatus);
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
{#if show}
|
||||
<div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
|
||||
<div class="bg-white p-6 rounded-lg shadow-2xl w-full max-w-4xl max-h-[90vh] overflow-y-auto">
|
||||
<div class="flex justify-between items-center mb-6 border-b pb-4">
|
||||
<div>
|
||||
<h2 class="text-2xl font-bold">Git Management: {dashboardTitle}</h2>
|
||||
<p class="text-sm text-gray-500">ID: {dashboardId}</p>
|
||||
</div>
|
||||
<button on:click={() => show = false} class="text-gray-500 hover:text-gray-700">
|
||||
<svg xmlns="http://www.w3.org/2000/svg" class="h-6 w-6" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12" />
|
||||
</svg>
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{#if checkingStatus}
|
||||
<div class="flex justify-center py-12">
|
||||
<div class="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600"></div>
|
||||
</div>
|
||||
{:else if !initialized}
|
||||
<div class="max-w-md mx-auto py-8">
|
||||
<div class="bg-blue-50 border-l-4 border-blue-400 p-4 mb-6">
|
||||
<p class="text-sm text-blue-700">
|
||||
This dashboard is not yet linked to a Git repository.
|
||||
Please configure the repository details below.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div class="space-y-4">
|
||||
<div>
|
||||
<label class="block text-sm font-medium text-gray-700">Git Server</label>
|
||||
<select bind:value={selectedConfigId} class="mt-1 block w-full border rounded p-2">
|
||||
{#each configs as config}
|
||||
<option value={config.id}>{config.name} ({config.provider})</option>
|
||||
{/each}
|
||||
</select>
|
||||
{#if configs.length === 0}
|
||||
<p class="text-xs text-red-500 mt-1">No Git servers configured. Go to Settings -> Git to add one.</p>
|
||||
{/if}
|
||||
</div>
|
||||
<div>
|
||||
<label class="block text-sm font-medium text-gray-700">Remote Repository URL</label>
|
||||
<input
|
||||
type="text"
|
||||
bind:value={remoteUrl}
|
||||
placeholder="https://github.com/org/repo.git"
|
||||
class="mt-1 block w-full border rounded p-2"
|
||||
/>
|
||||
</div>
|
||||
<button
|
||||
on:click={handleInit}
|
||||
disabled={loading || configs.length === 0}
|
||||
class="w-full bg-blue-600 text-white py-2 rounded font-medium hover:bg-blue-700 disabled:opacity-50"
|
||||
>
|
||||
{loading ? 'Initializing...' : 'Initialize Repository'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="grid grid-cols-1 md:grid-cols-3 gap-6">
|
||||
<!-- Left Column: Controls -->
|
||||
<div class="md:col-span-1 space-y-6">
|
||||
<section>
|
||||
<h3 class="text-sm font-semibold text-gray-400 uppercase tracking-wider mb-2">Branch</h3>
|
||||
<BranchSelector {dashboardId} bind:currentBranch />
|
||||
</section>
|
||||
|
||||
<section class="space-y-2">
|
||||
<h3 class="text-sm font-semibold text-gray-400 uppercase tracking-wider mb-2">Actions</h3>
|
||||
<button
|
||||
on:click={handleSync}
|
||||
disabled={loading}
|
||||
class="w-full flex items-center justify-center px-4 py-2 bg-gray-100 hover:bg-gray-200 rounded text-sm font-medium transition"
|
||||
>
|
||||
Sync from Superset
|
||||
</button>
|
||||
<button
|
||||
on:click={() => showCommitModal = true}
|
||||
disabled={loading}
|
||||
class="w-full flex items-center justify-center px-4 py-2 bg-blue-600 hover:bg-blue-700 text-white rounded text-sm font-medium transition"
|
||||
>
|
||||
Commit Changes
|
||||
</button>
|
||||
<div class="grid grid-cols-2 gap-2">
|
||||
<button
|
||||
on:click={handlePull}
|
||||
disabled={loading}
|
||||
class="flex items-center justify-center px-4 py-2 border hover:bg-gray-50 rounded text-sm font-medium transition"
|
||||
>
|
||||
Pull
|
||||
</button>
|
||||
<button
|
||||
on:click={handlePush}
|
||||
disabled={loading}
|
||||
class="flex items-center justify-center px-4 py-2 border hover:bg-gray-50 rounded text-sm font-medium transition"
|
||||
>
|
||||
Push
|
||||
</button>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section>
|
||||
<h3 class="text-sm font-semibold text-gray-400 uppercase tracking-wider mb-2">Deployment</h3>
|
||||
<button
|
||||
on:click={() => showDeployModal = true}
|
||||
disabled={loading}
|
||||
class="w-full flex items-center justify-center px-4 py-2 bg-green-600 hover:bg-green-700 text-white rounded text-sm font-medium transition"
|
||||
>
|
||||
Deploy to Environment
|
||||
</button>
|
||||
</section>
|
||||
</div>
|
||||
|
||||
<!-- Right Column: History -->
|
||||
<div class="md:col-span-2 border-l pl-6">
|
||||
<CommitHistory {dashboardId} />
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<CommitModal
|
||||
{dashboardId}
|
||||
bind:show={showCommitModal}
|
||||
on:commit={() => { /* Refresh history */ }}
|
||||
/>
|
||||
|
||||
<DeploymentModal
|
||||
{dashboardId}
|
||||
bind:show={showDeployModal}
|
||||
/>
|
||||
|
||||
<ConflictResolver
|
||||
{conflicts}
|
||||
bind:show={showConflicts}
|
||||
on:resolve={() => { /* Handle resolution */ }}
|
||||
/>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:GitManager:Component] -->
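A minimal usage sketch of GitManager from a dashboard listing page; the surrounding variable names and the import path are illustrative:

<script>
  import GitManager from '../components/git/GitManager.svelte';
  let showGit = false;
  let selected = { id: 42, dashboard_title: 'Sales Overview' }; // illustrative dashboard record
</script>

<button on:click={() => showGit = true}>Manage Git</button>

<GitManager
  dashboardId={selected.id}
  dashboardTitle={selected.dashboard_title}
  bind:show={showGit}
/>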
|
||||
108
frontend/src/components/tools/ConnectionForm.svelte
Normal file
@@ -0,0 +1,108 @@
|
||||
<!-- [DEF:ConnectionForm:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: connection, form, settings
|
||||
@PURPOSE: UI component for creating a new database connection configuration.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/services/connectionService.js
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher } from 'svelte';
|
||||
import { createConnection } from '../../services/connectionService.js';
|
||||
import { addToast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
let name = '';
|
||||
let type = 'postgres';
|
||||
let host = '';
|
||||
let port = 5432;
|
||||
let database = '';
|
||||
let username = '';
|
||||
let password = '';
|
||||
let isSubmitting = false;
|
||||
|
||||
// [DEF:handleSubmit:Function]
|
||||
// @PURPOSE: Submits the connection form to the backend.
|
||||
// @PRE: All required fields (name, host, database, username, password) must be filled.
|
||||
// @POST: A new connection is created via the connection service and a success event is dispatched.
|
||||
async function handleSubmit() {
|
||||
if (!name || !host || !database || !username || !password) {
|
||||
addToast('Please fill in all required fields', 'warning');
|
||||
return;
|
||||
}
|
||||
|
||||
isSubmitting = true;
|
||||
try {
|
||||
const newConnection = await createConnection({
|
||||
name, type, host, port, database, username, password
|
||||
});
|
||||
addToast('Connection created successfully', 'success');
|
||||
dispatch('success', newConnection);
|
||||
resetForm();
|
||||
} catch (e) {
|
||||
addToast(e.message, 'error');
|
||||
} finally {
|
||||
isSubmitting = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleSubmit:Function]
|
||||
|
||||
// [DEF:resetForm:Function]
|
||||
/* @PURPOSE: Resets the connection form fields to their default values.
|
||||
@PRE: None.
|
||||
@POST: All form input variables are reset.
|
||||
*/
|
||||
function resetForm() {
|
||||
name = '';
|
||||
host = '';
|
||||
port = 5432;
|
||||
database = '';
|
||||
username = '';
|
||||
password = '';
|
||||
}
|
||||
// [/DEF:resetForm:Function]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
|
||||
<h3 class="text-lg font-medium text-gray-900 mb-4">Add New Connection</h3>
|
||||
<form on:submit|preventDefault={handleSubmit} class="space-y-4">
|
||||
<div>
|
||||
<label for="conn-name" class="block text-sm font-medium text-gray-700">Connection Name</label>
|
||||
<input type="text" id="conn-name" bind:value={name} placeholder="e.g. Production DWH" class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="conn-host" class="block text-sm font-medium text-gray-700">Host</label>
|
||||
<input type="text" id="conn-host" bind:value={host} placeholder="10.0.0.1" class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="conn-port" class="block text-sm font-medium text-gray-700">Port</label>
|
||||
<input type="number" id="conn-port" bind:value={port} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
</div>
|
||||
<div>
|
||||
<label for="conn-db" class="block text-sm font-medium text-gray-700">Database Name</label>
|
||||
<input type="text" id="conn-db" bind:value={database} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="conn-user" class="block text-sm font-medium text-gray-700">Username</label>
|
||||
<input type="text" id="conn-user" bind:value={username} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="conn-pass" class="block text-sm font-medium text-gray-700">Password</label>
|
||||
<input type="password" id="conn-pass" bind:value={password} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
</div>
|
||||
<div class="flex justify-end pt-2">
|
||||
<button type="submit" disabled={isSubmitting} class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50">
|
||||
{isSubmitting ? 'Creating...' : 'Create Connection'}
|
||||
</button>
|
||||
</div>
|
||||
</form>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
<!-- [/DEF:ConnectionForm:Component] -->
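A minimal sketch of wiring ConnectionForm to ConnectionList on a settings page, using the fetchConnections function that ConnectionList exports further below; the import paths are illustrative:

<script>
  import ConnectionForm from '../components/tools/ConnectionForm.svelte';
  import ConnectionList from '../components/tools/ConnectionList.svelte';
  let list; // ConnectionList instance via bind:this
</script>

<ConnectionForm on:success={() => list.fetchConnections()} />
<ConnectionList bind:this={list} />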
|
||||
88
frontend/src/components/tools/ConnectionList.svelte
Normal file
@@ -0,0 +1,88 @@
|
||||
<!-- [DEF:ConnectionList:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: connection, list, settings
|
||||
@PURPOSE: UI component for listing and deleting saved database connection configurations.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/services/connectionService.js
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount, createEventDispatcher } from 'svelte';
|
||||
import { getConnections, deleteConnection } from '../../services/connectionService.js';
|
||||
import { addToast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
let connections = [];
|
||||
let isLoading = true;
|
||||
|
||||
// [DEF:fetchConnections:Function]
|
||||
// @PURPOSE: Fetches the list of connections from the backend.
|
||||
// @PRE: None.
|
||||
// @POST: connections array is populated.
|
||||
async function fetchConnections() {
|
||||
isLoading = true;
|
||||
try {
|
||||
connections = await getConnections();
|
||||
} catch (e) {
|
||||
addToast('Failed to fetch connections', 'error');
|
||||
} finally {
|
||||
isLoading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchConnections:Function]
|
||||
|
||||
// [DEF:handleDelete:Function]
|
||||
// @PURPOSE: Deletes a connection configuration.
|
||||
// @PRE: id is provided and user confirms deletion.
|
||||
// @POST: Connection is deleted from backend and list is reloaded.
|
||||
async function handleDelete(id) {
|
||||
if (!confirm('Are you sure you want to delete this connection?')) return;
|
||||
|
||||
try {
|
||||
await deleteConnection(id);
|
||||
addToast('Connection deleted', 'success');
|
||||
await fetchConnections();
|
||||
} catch (e) {
|
||||
addToast(e.message, 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:handleDelete:Function]
|
||||
|
||||
onMount(fetchConnections);
|
||||
|
||||
// Expose fetchConnections to parent
|
||||
export { fetchConnections };
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="bg-white shadow overflow-hidden sm:rounded-md border border-gray-200">
|
||||
<div class="px-4 py-5 sm:px-6 bg-gray-50 border-b border-gray-200">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900">Saved Connections</h3>
|
||||
</div>
|
||||
<ul class="divide-y divide-gray-200">
|
||||
{#if isLoading}
|
||||
<li class="p-4 text-center text-gray-500">Loading...</li>
|
||||
{:else if connections.length === 0}
|
||||
<li class="p-8 text-center text-gray-500 italic">No connections saved yet.</li>
|
||||
{:else}
|
||||
{#each connections as conn}
|
||||
<li class="p-4 flex items-center justify-between hover:bg-gray-50">
|
||||
<div>
|
||||
<div class="text-sm font-medium text-indigo-600 truncate">{conn.name}</div>
|
||||
<div class="text-xs text-gray-500">{conn.type}://{conn.username}@{conn.host}:{conn.port}/{conn.database}</div>
|
||||
</div>
|
||||
<button
|
||||
on:click={() => handleDelete(conn.id)}
|
||||
class="ml-2 inline-flex items-center px-2 py-1 border border-transparent text-xs font-medium rounded text-red-700 bg-red-100 hover:bg-red-200 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-red-500"
|
||||
>
|
||||
Delete
|
||||
</button>
|
||||
</li>
|
||||
{/each}
|
||||
{/if}
|
||||
</ul>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
<!-- [/DEF:ConnectionList:Component] -->
|
||||
190
frontend/src/components/tools/DebugTool.svelte
Normal file
@@ -0,0 +1,190 @@
|
||||
<!-- [DEF:DebugTool:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: debug, tool, api, structure
|
||||
@PURPOSE: UI component for system diagnostics and debugging API responses.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/services/toolsService.js
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import { runTask, getTaskStatus } from '../../services/toolsService.js';
|
||||
import { selectedTask } from '../../lib/stores.js';
|
||||
import { addToast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
let envs = [];
|
||||
let action = 'test-db-api';
|
||||
let selectedEnv = '';
|
||||
let datasetId = '';
|
||||
let sourceEnv = '';
|
||||
let targetEnv = '';
|
||||
let isRunning = false;
|
||||
let results = null;
|
||||
let pollInterval;
|
||||
|
||||
// [DEF:fetchEnvironments:Function]
|
||||
/**
|
||||
* @purpose Fetches available environments.
|
||||
* @pre API is available.
|
||||
* @post envs variable is populated.
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async function fetchEnvironments() {
|
||||
try {
|
||||
const res = await fetch('/api/environments');
|
||||
envs = await res.json();
|
||||
} catch (e) {
|
||||
addToast('Failed to fetch environments', 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchEnvironments:Function]
|
||||
|
||||
// [DEF:handleRunDebug:Function]
|
||||
/**
|
||||
* @purpose Triggers the debug task.
|
||||
* @pre Required fields are selected.
|
||||
* @post Task is started and polling begins.
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async function handleRunDebug() {
|
||||
isRunning = true;
|
||||
results = null;
|
||||
try {
|
||||
let params = { action };
|
||||
if (action === 'test-db-api') {
|
||||
if (!sourceEnv || !targetEnv) {
|
||||
addToast('Source and Target environments are required', 'warning');
|
||||
isRunning = false;
|
||||
return;
|
||||
}
|
||||
const sEnv = envs.find(e => e.id === sourceEnv);
|
||||
const tEnv = envs.find(e => e.id === targetEnv);
|
||||
params.source_env = sEnv.name;
|
||||
params.target_env = tEnv.name;
|
||||
} else {
|
||||
if (!selectedEnv || !datasetId) {
|
||||
addToast('Environment and Dataset ID are required', 'warning');
|
||||
isRunning = false;
|
||||
return;
|
||||
}
|
||||
const env = envs.find(e => e.id === selectedEnv);
|
||||
params.env = env.name;
|
||||
params.dataset_id = parseInt(datasetId);
|
||||
}
|
||||
|
||||
const task = await runTask('system-debug', params);
|
||||
selectedTask.set(task);
|
||||
startPolling(task.id);
|
||||
} catch (e) {
|
||||
isRunning = false;
|
||||
addToast(e.message, 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:handleRunDebug:Function]
|
||||
|
||||
// [DEF:startPolling:Function]
|
||||
/**
|
||||
* @purpose Polls for task completion.
|
||||
* @pre Task ID is valid.
|
||||
* @post Polls until success/failure.
|
||||
* @param {string} taskId - ID of the task.
|
||||
* @returns {void}
|
||||
*/
|
||||
function startPolling(taskId) {
|
||||
if (pollInterval) clearInterval(pollInterval);
|
||||
pollInterval = setInterval(async () => {
|
||||
try {
|
||||
const task = await getTaskStatus(taskId);
|
||||
selectedTask.set(task);
|
||||
if (task.status === 'SUCCESS') {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
results = task.result;
|
||||
addToast('Debug task completed', 'success');
|
||||
} else if (task.status === 'FAILED') {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
addToast('Debug task failed', 'error');
|
||||
}
|
||||
} catch (e) {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
}
|
||||
}, 2000);
|
||||
}
|
||||
// [/DEF:startPolling:Function]
|
||||
|
||||
onMount(fetchEnvironments);
|
||||
</script>
|
||||
|
||||
<div class="space-y-6">
|
||||
<div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
|
||||
<h3 class="text-lg font-medium text-gray-900 mb-4">System Diagnostics</h3>
|
||||
|
||||
<div class="mb-4">
|
||||
<label class="block text-sm font-medium text-gray-700">Debug Action</label>
|
||||
<select bind:value={action} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="test-db-api">Test Database API (Compare Envs)</option>
|
||||
<option value="get-dataset-structure">Get Dataset Structure (JSON)</option>
|
||||
</select>
|
||||
</div>
|
||||
|
||||
{#if action === 'test-db-api'}
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="src-env" class="block text-sm font-medium text-gray-700">Source Environment</label>
|
||||
<select id="src-env" bind:value={sourceEnv} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="" disabled>-- Select Source --</option>
|
||||
{#each envs as env}
|
||||
<option value={env.id}>{env.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label for="tgt-env" class="block text-sm font-medium text-gray-700">Target Environment</label>
|
||||
<select id="tgt-env" bind:value={targetEnv} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="" disabled>-- Select Target --</option>
|
||||
{#each envs as env}
|
||||
<option value={env.id}>{env.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="debug-env" class="block text-sm font-medium text-gray-700">Environment</label>
|
||||
<select id="debug-env" bind:value={selectedEnv} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="" disabled>-- Select Environment --</option>
|
||||
{#each envs as env}
|
||||
<option value={env.id}>{env.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label for="debug-ds-id" class="block text-sm font-medium text-gray-700">Dataset ID</label>
|
||||
<input type="number" id="debug-ds-id" bind:value={datasetId} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<div class="mt-4 flex justify-end">
|
||||
<button on:click={handleRunDebug} disabled={isRunning} class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50">
|
||||
{isRunning ? 'Running...' : 'Run Diagnostics'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{#if results}
|
||||
<div class="bg-white shadow overflow-hidden sm:rounded-md border border-gray-200">
|
||||
<div class="px-4 py-5 sm:px-6 bg-gray-50 border-b border-gray-200">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900">Debug Output</h3>
|
||||
</div>
|
||||
<div class="p-4">
|
||||
<pre class="text-xs text-gray-600 bg-gray-900 text-green-400 p-4 rounded-md overflow-x-auto h-96">{JSON.stringify(results, null, 2)}</pre>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/DEF:DebugTool:Component] -->
|
||||
165
frontend/src/components/tools/MapperTool.svelte
Normal file
@@ -0,0 +1,165 @@
|
||||
<!-- [DEF:MapperTool:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: mapper, tool, dataset, postgresql, excel
|
||||
@PURPOSE: UI component for mapping dataset column verbose names using the MapperPlugin.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/services/toolsService.js
|
||||
@RELATION: USES -> frontend/src/services/connectionService.js
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import { runTask } from '../../services/toolsService.js';
|
||||
import { getConnections } from '../../services/connectionService.js';
|
||||
import { selectedTask } from '../../lib/stores.js';
|
||||
import { addToast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
let envs = [];
|
||||
let connections = [];
|
||||
let selectedEnv = '';
|
||||
let datasetId = '';
|
||||
let source = 'postgres';
|
||||
let selectedConnection = '';
|
||||
let tableName = '';
|
||||
let tableSchema = 'public';
|
||||
let excelPath = '';
|
||||
let isRunning = false;
|
||||
|
||||
// [DEF:fetchData:Function]
|
||||
// @PURPOSE: Fetches environments and saved connections.
|
||||
// @PRE: None.
|
||||
// @POST: envs and connections arrays are populated.
|
||||
async function fetchData() {
|
||||
try {
|
||||
const envsRes = await fetch('/api/environments');
|
||||
envs = await envsRes.json();
|
||||
connections = await getConnections();
|
||||
} catch (e) {
|
||||
addToast('Failed to fetch data', 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchData:Function]
|
||||
|
||||
// [DEF:handleRunMapper:Function]
|
||||
// @PURPOSE: Triggers the MapperPlugin task.
|
||||
// @PRE: selectedEnv and datasetId are set; source-specific fields are valid.
|
||||
// @POST: Mapper task is started and selectedTask is updated.
|
||||
async function handleRunMapper() {
|
||||
if (!selectedEnv || !datasetId) {
|
||||
addToast('Please fill in required fields', 'warning');
|
||||
return;
|
||||
}
|
||||
|
||||
if (source === 'postgres' && (!selectedConnection || !tableName)) {
|
||||
addToast('Connection and Table Name are required for the PostgreSQL source', 'warning');
|
||||
return;
|
||||
}
|
||||
|
||||
if (source === 'excel' && !excelPath) {
|
||||
addToast('Excel file path is required for the Excel source', 'warning');
|
||||
return;
|
||||
}
|
||||
|
||||
isRunning = true;
|
||||
try {
|
||||
const env = envs.find(e => e.id === selectedEnv);
|
||||
const task = await runTask('dataset-mapper', {
|
||||
env: env.name,
|
||||
dataset_id: parseInt(datasetId),
|
||||
source,
|
||||
connection_id: selectedConnection,
|
||||
table_name: tableName,
|
||||
table_schema: tableSchema,
|
||||
excel_path: excelPath
|
||||
});
|
||||
|
||||
selectedTask.set(task);
|
||||
addToast('Mapper task started', 'success');
|
||||
} catch (e) {
|
||||
addToast(e.message, 'error');
|
||||
} finally {
|
||||
isRunning = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleRunMapper:Function]
|
||||
|
||||
onMount(fetchData);
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
|
||||
<h3 class="text-lg font-medium text-gray-900 mb-4">Dataset Column Mapper</h3>
|
||||
<div class="space-y-4">
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="mapper-env" class="block text-sm font-medium text-gray-700">Environment</label>
|
||||
<select id="mapper-env" bind:value={selectedEnv} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="" disabled>-- Select Environment --</option>
|
||||
{#each envs as env}
|
||||
<option value={env.id}>{env.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label for="mapper-ds-id" class="block text-sm font-medium text-gray-700">Dataset ID</label>
|
||||
<input type="number" id="mapper-ds-id" bind:value={datasetId} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label class="block text-sm font-medium text-gray-700">Mapping Source</label>
|
||||
<div class="mt-2 flex space-x-4">
|
||||
<label class="inline-flex items-center">
|
||||
<input type="radio" bind:group={source} value="postgres" class="focus:ring-indigo-500 h-4 w-4 text-indigo-600 border-gray-300" />
|
||||
<span class="ml-2 text-sm text-gray-700">PostgreSQL</span>
|
||||
</label>
|
||||
<label class="inline-flex items-center">
|
||||
<input type="radio" bind:group={source} value="excel" class="focus:ring-indigo-500 h-4 w-4 text-indigo-600 border-gray-300" />
|
||||
<span class="ml-2 text-sm text-gray-700">Excel</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{#if source === 'postgres'}
|
||||
<div class="space-y-4 p-4 bg-gray-50 rounded-md border border-gray-100">
|
||||
<div>
|
||||
<label for="mapper-conn" class="block text-sm font-medium text-gray-700">Saved Connection</label>
|
||||
<select id="mapper-conn" bind:value={selectedConnection} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
|
||||
<option value="" disabled>-- Select Connection --</option>
|
||||
{#each connections as conn}
|
||||
<option value={conn.id}>{conn.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="mapper-table" class="block text-sm font-medium text-gray-700">Table Name</label>
|
||||
<input type="text" id="mapper-table" bind:value={tableName} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="mapper-schema" class="block text-sm font-medium text-gray-700">Table Schema</label>
|
||||
<input type="text" id="mapper-schema" bind:value={tableSchema} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="p-4 bg-gray-50 rounded-md border border-gray-100">
|
||||
<label for="mapper-excel" class="block text-sm font-medium text-gray-700">Excel File Path</label>
|
||||
<input type="text" id="mapper-excel" bind:value={excelPath} placeholder="/path/to/mapping.xlsx" class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<div class="flex justify-end">
|
||||
<button
|
||||
on:click={handleRunMapper}
|
||||
disabled={isRunning}
|
||||
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50"
|
||||
>
|
||||
{isRunning ? 'Starting...' : 'Run Mapper'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
<!-- [/DEF:MapperTool:Component] -->
|
||||
186
frontend/src/components/tools/SearchTool.svelte
Normal file
@@ -0,0 +1,186 @@
|
||||
<!-- [DEF:SearchTool:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: search, tool, dataset, regex
|
||||
@PURPOSE: UI component for searching datasets using the SearchPlugin.
|
||||
@LAYER: UI
|
||||
@RELATION: USES -> frontend/src/services/toolsService.js
|
||||
-->
|
||||
<script>
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import { runTask, getTaskStatus } from '../../services/toolsService.js';
|
||||
import { selectedTask } from '../../lib/stores.js';
|
||||
import { addToast } from '../../lib/toasts.js';
|
||||
// [/SECTION]
|
||||
|
||||
let envs = [];
|
||||
let selectedEnv = '';
|
||||
let searchQuery = '';
|
||||
let isRunning = false;
|
||||
let results = null;
|
||||
let pollInterval;
|
||||
|
||||
// [DEF:fetchEnvironments:Function]
|
||||
// @PURPOSE: Fetches the list of available environments.
|
||||
// @PRE: None.
|
||||
// @POST: envs array is populated.
|
||||
async function fetchEnvironments() {
|
||||
try {
|
||||
const res = await fetch('/api/environments');
|
||||
envs = await res.json();
|
||||
} catch (e) {
|
||||
addToast('Failed to fetch environments', 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchEnvironments:Function]
|
||||
|
||||
// [DEF:handleSearch:Function]
|
||||
// @PURPOSE: Triggers the SearchPlugin task.
|
||||
// @PRE: selectedEnv and searchQuery must be set.
|
||||
// @POST: Task is started and polling begins.
|
||||
async function handleSearch() {
|
||||
if (!selectedEnv || !searchQuery) {
|
||||
addToast('Please select environment and enter query', 'warning');
|
||||
return;
|
||||
}
|
||||
|
||||
isRunning = true;
|
||||
results = null;
|
||||
try {
|
||||
// Find the environment name from ID
|
||||
const env = envs.find(e => e.id === selectedEnv);
|
||||
const task = await runTask('search-datasets', {
|
||||
env: env.name,
|
||||
query: searchQuery
|
||||
});
|
||||
|
||||
selectedTask.set(task);
|
||||
startPolling(task.id);
|
||||
} catch (e) {
|
||||
isRunning = false;
|
||||
addToast(e.message, 'error');
|
||||
}
|
||||
}
|
||||
// [/DEF:handleSearch:Function]
|
||||
|
||||
// [DEF:startPolling:Function]
|
||||
// @PURPOSE: Polls for task completion and results.
|
||||
// @PRE: taskId is provided.
|
||||
// @POST: pollInterval is set and results are updated on success.
|
||||
function startPolling(taskId) {
|
||||
if (pollInterval) clearInterval(pollInterval);
|
||||
|
||||
pollInterval = setInterval(async () => {
|
||||
try {
|
||||
const task = await getTaskStatus(taskId);
|
||||
selectedTask.set(task);
|
||||
|
||||
if (task.status === 'SUCCESS') {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
results = task.result;
|
||||
addToast('Search completed', 'success');
|
||||
} else if (task.status === 'FAILED') {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
addToast('Search failed', 'error');
|
||||
}
|
||||
} catch (e) {
|
||||
clearInterval(pollInterval);
|
||||
isRunning = false;
|
||||
addToast('Error polling task status', 'error');
|
||||
}
|
||||
}, 2000);
|
||||
}
|
||||
// [/DEF:startPolling:Function]
|
||||
|
||||
onMount(fetchEnvironments);
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="space-y-6">
|
||||
<div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
|
||||
<h3 class="text-lg font-medium text-gray-900 mb-4">Search Dataset Metadata</h3>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4 items-end">
|
||||
<div>
|
||||
<label for="env-select" class="block text-sm font-medium text-gray-700">Environment</label>
|
||||
<select
|
||||
id="env-select"
|
||||
bind:value={selectedEnv}
|
||||
class="mt-1 block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
|
||||
>
|
||||
<option value="" disabled>-- Select Environment --</option>
|
||||
{#each envs as env}
|
||||
<option value={env.id}>{env.name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label for="search-query" class="block text-sm font-medium text-gray-700">Regex Pattern</label>
|
||||
<input
|
||||
type="text"
|
||||
id="search-query"
|
||||
bind:value={searchQuery}
|
||||
placeholder="e.g. from dm.*\.account"
|
||||
class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mt-4 flex justify-end">
|
||||
<button
|
||||
on:click={handleSearch}
|
||||
disabled={isRunning}
|
||||
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50"
|
||||
>
|
||||
{#if isRunning}
|
||||
<svg class="animate-spin -ml-1 mr-3 h-5 w-5 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
|
||||
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
|
||||
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||
</svg>
|
||||
Searching...
|
||||
{:else}
|
||||
Search
|
||||
{/if}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{#if results}
|
||||
<div class="bg-white shadow overflow-hidden sm:rounded-md border border-gray-200">
|
||||
<div class="px-4 py-5 sm:px-6 flex justify-between items-center bg-gray-50 border-b border-gray-200">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900">
|
||||
Search Results
|
||||
</h3>
|
||||
<span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800">
|
||||
{results.count} matches
|
||||
</span>
|
||||
</div>
|
||||
<ul class="divide-y divide-gray-200">
|
||||
{#each results.results as item}
|
||||
<li class="p-4 hover:bg-gray-50">
|
||||
<div class="flex items-center justify-between">
|
||||
<div class="text-sm font-medium text-indigo-600 truncate">
|
||||
{item.dataset_name} (ID: {item.dataset_id})
|
||||
</div>
|
||||
<div class="ml-2 flex-shrink-0 flex">
|
||||
<p class="px-2 inline-flex text-xs leading-5 font-semibold rounded-full bg-green-100 text-green-800">
|
||||
Field: {item.field}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mt-2">
|
||||
<pre class="text-xs text-gray-500 bg-gray-50 p-2 rounded border border-gray-100 overflow-x-auto">{item.match_context}</pre>
|
||||
</div>
|
||||
</li>
|
||||
{/each}
|
||||
{#if results.count === 0}
|
||||
<li class="p-8 text-center text-gray-500 italic">
|
||||
No matches found for the given pattern.
|
||||
</li>
|
||||
{/if}
|
||||
</ul>
|
||||
</div>
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
<!-- [/DEF:SearchTool:Component] -->
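DebugTool and SearchTool duplicate the same setInterval polling loop around getTaskStatus. A minimal sketch of a shared helper both could call; the helper and its module path are assumptions, not part of this diff:

// e.g. frontend/src/lib/pollTask.js (hypothetical module)
import { getTaskStatus } from '../services/toolsService.js';

export function pollTask(taskId, { onDone, onFail, intervalMs = 2000 }) {
  const timer = setInterval(async () => {
    try {
      const task = await getTaskStatus(taskId);
      if (task.status === 'SUCCESS') { clearInterval(timer); onDone(task); }
      else if (task.status === 'FAILED') { clearInterval(timer); onFail(task); }
    } catch (e) {
      clearInterval(timer);
      onFail(e);
    }
  }, intervalMs);
  return () => clearInterval(timer); // cancel function for onDestroy
}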
|
||||
@@ -10,6 +10,8 @@ const API_BASE_URL = '/api';
|
||||
|
||||
// [DEF:getWsUrl:Function]
|
||||
// @PURPOSE: Returns the WebSocket URL for a specific task, with fallback logic.
|
||||
// @PRE: taskId is provided.
|
||||
// @POST: Returns valid WebSocket URL string.
|
||||
// @PARAM: taskId (string) - The ID of the task.
|
||||
// @RETURN: string - The WebSocket URL.
|
||||
export const getWsUrl = (taskId) => {
|
||||
@@ -25,6 +27,8 @@ export const getWsUrl = (taskId) => {
|
||||
|
||||
// [DEF:fetchApi:Function]
|
||||
// @PURPOSE: Generic GET request wrapper.
|
||||
// @PRE: endpoint string is provided.
|
||||
// @POST: Returns Promise resolving to JSON data or throws on error.
|
||||
// @PARAM: endpoint (string) - API endpoint.
|
||||
// @RETURN: Promise<any> - JSON response.
|
||||
async function fetchApi(endpoint) {
|
||||
@@ -45,6 +49,8 @@ async function fetchApi(endpoint) {
|
||||
|
||||
// [DEF:postApi:Function]
|
||||
// @PURPOSE: Generic POST request wrapper.
|
||||
// @PRE: endpoint and body are provided.
|
||||
// @POST: Returns Promise resolving to JSON data or throws on error.
|
||||
// @PARAM: endpoint (string) - API endpoint.
|
||||
// @PARAM: body (object) - Request payload.
|
||||
// @RETURN: Promise<any> - JSON response.
|
||||
@@ -72,6 +78,8 @@ async function postApi(endpoint, body) {
|
||||
|
||||
// [DEF:requestApi:Function]
|
||||
// @PURPOSE: Generic request wrapper.
|
||||
// @PRE: endpoint and method are provided.
|
||||
// @POST: Returns Promise resolving to JSON data or throws on error.
|
||||
async function requestApi(endpoint, method = 'GET', body = null) {
|
||||
try {
|
||||
console.log(`[api.requestApi][Action] ${method} to context={{'endpoint': '${endpoint}'}}`);
|
||||
|
||||
@@ -38,6 +38,8 @@ export const taskLogs = writable([]);
|
||||
|
||||
// [DEF:fetchPlugins:Function]
|
||||
// @PURPOSE: Fetches plugins from the API and updates the plugins store.
|
||||
// @PRE: None.
|
||||
// @POST: plugins store is updated with data from the API.
|
||||
export async function fetchPlugins() {
|
||||
try {
|
||||
console.log("[stores.fetchPlugins][Action] Fetching plugins.");
|
||||
@@ -52,6 +54,8 @@ export async function fetchPlugins() {
|
||||
|
||||
// [DEF:fetchTasks:Function]
|
||||
// @PURPOSE: Fetches tasks from the API and updates the tasks store.
|
||||
// @PRE: None.
|
||||
// @POST: tasks store is updated with data from the API.
|
||||
export async function fetchTasks() {
|
||||
try {
|
||||
console.log("[stores.fetchTasks][Action] Fetching tasks.");
|
||||
|
||||
@@ -12,6 +12,8 @@ export const toasts = writable([]);
|
||||
|
||||
// [DEF:addToast:Function]
|
||||
// @PURPOSE: Adds a new toast message.
|
||||
// @PRE: message string is provided.
|
||||
// @POST: New toast is added to the store and scheduled for removal.
|
||||
// @PARAM: message (string) - The message text.
|
||||
// @PARAM: type (string) - The type of toast (info, success, error).
|
||||
// @PARAM: duration (number) - Duration in ms before the toast is removed.
|
||||
@@ -25,6 +27,8 @@ export function addToast(message, type = 'info', duration = 3000) {
|
||||
|
||||
// [DEF:removeToast:Function]
|
||||
// @PURPOSE: Removes a toast message by ID.
|
||||
// @PRE: id is provided.
|
||||
// @POST: Toast is removed from the store.
|
||||
// @PARAM: id (string) - The ID of the toast to remove.
|
||||
function removeToast(id) {
|
||||
console.log(`[toasts.removeToast][Action] Removing toast context={{'id': '${id}'}}`);
|
||||
|
||||
@@ -1,18 +0,0 @@
|
||||
// [DEF:main:Module]
|
||||
// @SEMANTICS: entrypoint, svelte, init
|
||||
// @PURPOSE: Entry point for the Svelte application.
|
||||
// @LAYER: UI-Entry
|
||||
|
||||
import './app.css'
|
||||
import App from './App.svelte'
|
||||
|
||||
// [DEF:app_instance:Data]
|
||||
// @PURPOSE: Initialized Svelte app instance.
|
||||
const app = new App({
|
||||
target: document.getElementById('app'),
|
||||
props: {}
|
||||
})
|
||||
// [/DEF:app_instance:Data]
|
||||
|
||||
export default app
|
||||
// [/DEF:main:Module]
|
||||
@@ -17,6 +17,8 @@
// [DEF:onMount:Function]
/**
* @purpose Fetch plugins when the component mounts.
* @pre Component is mounting.
* @post plugins store is populated with available tools.
*/
onMount(async () => {
console.log("[Dashboard][Entry] Component mounted, fetching plugins.");
@@ -27,6 +29,8 @@
// [DEF:selectPlugin:Function]
/**
* @purpose Selects a plugin to display its form.
* @pre plugin object is provided.
* @post selectedPlugin store is updated.
* @param {Object} plugin - The plugin object to select.
*/
function selectPlugin(plugin) {

@@ -50,6 +50,8 @@
// [DEF:loadSettings:Function]
/**
* @purpose Loads settings from the backend.
* @pre Component mounted or refresh requested.
* @post settings object is populated with backend data.
*/
async function loadSettings() {
try {
@@ -67,6 +69,8 @@
// [DEF:handleSaveGlobal:Function]
/**
* @purpose Saves global settings to the backend.
* @pre settings.settings contains valid configuration.
* @post Backend global settings are updated.
*/
async function handleSaveGlobal() {
try {
@@ -84,6 +88,8 @@
// [DEF:handleAddOrUpdateEnv:Function]
/**
* @purpose Adds or updates an environment.
* @pre newEnv contains valid environment details.
* @post Environment list is updated on backend and reloaded locally.
*/
async function handleAddOrUpdateEnv() {
try {
@@ -108,6 +114,8 @@
// [DEF:handleDeleteEnv:Function]
/**
* @purpose Deletes an environment.
* @pre id of environment to delete is provided.
* @post Environment is removed from backend and list is reloaded.
* @param {string} id - The ID of the environment to delete.
*/
async function handleDeleteEnv(id) {
@@ -129,6 +137,8 @@
// [DEF:handleTestEnv:Function]
/**
* @purpose Tests the connection to an environment.
* @pre Environment ID is valid.
* @post Connection test result is displayed via toast.
* @param {string} id - The ID of the environment to test.
*/
async function handleTestEnv(id) {
@@ -152,6 +162,8 @@
// [DEF:editEnv:Function]
/**
* @purpose Sets the form to edit an existing environment.
* @pre env object is provided.
* @post newEnv is populated with env data and editingEnvId is set.
* @param {Object} env - The environment object to edit.
*/
function editEnv(env) {
@@ -163,6 +175,8 @@
// [DEF:resetEnvForm:Function]
/**
* @purpose Resets the environment form.
* @pre None.
* @post newEnv is reset to initial state and editingEnvId is cleared.
*/
function resetEnvForm() {
newEnv = {
@@ -14,15 +14,28 @@
pluginsStore.set(data.plugins);
}

// [DEF:selectPlugin:Function]
/* @PURPOSE: Handles plugin selection and navigation.
@PRE: plugin object must be provided.
@POST: Navigates to migration or sets selectedPlugin store.
*/
function selectPlugin(plugin) {
console.log(`[Dashboard][Action] Selecting plugin: ${plugin.id}`);
if (plugin.id === 'superset-migration') {
goto('/migration');
} else if (plugin.id === 'git-integration') {
goto('/git');
} else {
selectedPlugin.set(plugin);
}
}
// [/DEF:selectPlugin:Function]

// [DEF:handleFormSubmit:Function]
/* @PURPOSE: Handles task creation from dynamic form submission.
@PRE: event.detail must contain task parameters.
@POST: Task is created via API and selectedTask store is updated.
*/
async function handleFormSubmit(event) {
console.log("[App.handleFormSubmit][Action] Handling form submission for task creation.");
const params = event.detail;
@@ -36,6 +49,7 @@
console.error(`[App.handleFormSubmit][Coherence:Failed] Task creation failed error=${error}`);
}
}
// [/DEF:handleFormSubmit:Function]
</script>

<div class="container mx-auto p-4">
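The middle of `handleFormSubmit` is elided by the hunk. A plausible sketch of the step between reading `event.detail` and the error branch, under the contract stated in the comment; the `api.createTask` helper name and its signature are assumptions (only the `api` module and the `selectedTask` store are visible in the diff).

```ts
// Sketch only: `api.createTask` is an assumed helper; `selectedTask` is the store named in the @POST above.
import { writable } from 'svelte/store';

interface Task { id: string; status: string }

const selectedTask = writable<Task | null>(null);

async function handleFormSubmitSketch(
  api: { createTask: (pluginId: string, params: Record<string, unknown>) => Promise<Task> },
  pluginId: string,
  params: Record<string, unknown>
): Promise<void> {
  try {
    // Create the task on the backend and remember it so the task runner UI can track it.
    const task = await api.createTask(pluginId, params);
    selectedTask.set(task);
  } catch (error) {
    console.error(`[App.handleFormSubmit][Coherence:Failed] Task creation failed error=${error}`);
  }
}
```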
@@ -1,5 +1,10 @@
import { api } from '../lib/api';

// [DEF:load:Function]
/* @PURPOSE: Loads initial plugin data for the dashboard.
@PRE: None.
@POST: Returns an object with plugins or an error message.
*/
/** @type {import('./$types').PageLoad} */
export async function load() {
try {
@@ -15,3 +20,4 @@ export async function load() {
};
}
}
// [/DEF:load:Function]
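Only the opening and closing lines of this SvelteKit `load` are shown. A minimal sketch of what a load of this shape typically does; the `api.getPlugins()` method name and the error field are assumptions, only the `api` import and the "plugins or error" contract come from the diff.

```ts
// Sketch of a SvelteKit page load returning either plugins or a fallback with an error message.
// `api.getPlugins()` is an assumed method on the imported api module.
interface Plugin { id: string; name: string }

async function loadSketch(api: { getPlugins: () => Promise<{ plugins: Plugin[] }> }) {
  try {
    const data = await api.getPlugins();
    return { plugins: data.plugins };
  } catch (e) {
    // Fall back to an empty list so the dashboard still renders.
    return { plugins: [], error: e instanceof Error ? e.message : 'Failed to load plugins' };
  }
}
```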
86 frontend/src/routes/git/+page.svelte Normal file
@@ -0,0 +1,86 @@
<!-- [DEF:GitDashboardPage:Component] -->
<script lang="ts">
  import { onMount } from 'svelte';
  import DashboardGrid from '../../components/DashboardGrid.svelte';
  import { addToast as toast } from '../../lib/toasts.js';
  import type { DashboardMetadata } from '../../types/dashboard';

  let environments: any[] = [];
  let selectedEnvId = "";
  let dashboards: DashboardMetadata[] = [];
  let loading = true;
  let fetchingDashboards = false;

  async function fetchEnvironments() {
    try {
      const response = await fetch('/api/environments');
      if (!response.ok) throw new Error('Failed to fetch environments');
      environments = await response.json();
      if (environments.length > 0) {
        selectedEnvId = environments[0].id;
      }
    } catch (e) {
      toast(e.message, 'error');
    } finally {
      loading = false;
    }
  }

  async function fetchDashboards(envId: string) {
    if (!envId) return;
    fetchingDashboards = true;
    try {
      const response = await fetch(`/api/environments/${envId}/dashboards`);
      if (!response.ok) throw new Error('Failed to fetch dashboards');
      dashboards = await response.json();
    } catch (e) {
      toast(e.message, 'error');
      dashboards = [];
    } finally {
      fetchingDashboards = false;
    }
  }

  onMount(fetchEnvironments);

  $: if (selectedEnvId) {
    fetchDashboards(selectedEnvId);
    localStorage.setItem('selected_env_id', selectedEnvId);
  }
</script>

<div class="max-w-6xl mx-auto p-6">
  <div class="flex justify-between items-center mb-6">
    <h1 class="text-2xl font-bold text-gray-800">Git Dashboard Management</h1>
    <div class="flex items-center space-x-4">
      <label for="env-select" class="text-sm font-medium text-gray-700">Environment:</label>
      <select
        id="env-select"
        bind:value={selectedEnvId}
        class="border-gray-300 rounded-md shadow-sm focus:ring-blue-500 focus:border-blue-500 p-2 border bg-white"
      >
        {#each environments as env}
          <option value={env.id}>{env.name}</option>
        {/each}
      </select>
    </div>
  </div>

  {#if loading}
    <div class="flex justify-center py-12">
      <div class="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600"></div>
    </div>
  {:else}
    <div class="bg-white rounded-lg shadow p-6">
      <h2 class="text-lg font-medium mb-4">Select Dashboard to Manage</h2>
      {#if fetchingDashboards}
        <p class="text-gray-500">Loading dashboards...</p>
      {:else if dashboards.length > 0}
        <DashboardGrid {dashboards} />
      {:else}
        <p class="text-gray-500 italic">No dashboards found in this environment.</p>
      {/if}
    </div>
  {/if}
</div>
<!-- [/DEF:GitDashboardPage:Component] -->
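The new page imports `DashboardMetadata` from `../../types/dashboard`, which is not included in this diff. A hedged guess at its shape, based on the fields a Superset dashboard listing usually exposes; every field name below is an assumption except that the type describes a dashboard entry rendered by `DashboardGrid`.

```ts
// Assumed shape of DashboardMetadata; the real definition lives elsewhere in the repo.
export interface DashboardMetadata {
  id: number;              // Superset dashboard ID (assumption)
  uuid?: string;           // stable identifier useful for cross-environment matching (assumption)
  dashboard_title: string; // display title (assumption)
  changed_on?: string;     // last-modified timestamp from the Superset API (assumption)
}
```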
@@ -51,6 +51,7 @@
// [DEF:fetchEnvironments:Function]
/**
* @purpose Fetches the list of environments from the API.
* @pre None.
* @post environments state is updated.
*/
async function fetchEnvironments() {
@@ -69,6 +70,7 @@
// [DEF:fetchDashboards:Function]
/**
* @purpose Fetches dashboards for the selected source environment.
* @pre envId is a valid environment ID.
* @param envId The environment ID.
* @post dashboards state is updated.
*/
@@ -93,6 +95,8 @@
// [DEF:fetchDatabases:Function]
/**
* @purpose Fetches databases from both environments and gets suggestions.
* @pre sourceEnvId and targetEnvId must be set.
* @post sourceDatabases, targetDatabases, mappings, and suggestions are updated.
*/
async function fetchDatabases() {
if (!sourceEnvId || !targetEnvId) return;
@@ -128,6 +132,8 @@
// [DEF:handleMappingUpdate:Function]
/**
* @purpose Saves a mapping to the backend.
* @pre event.detail contains sourceUuid and targetUuid.
* @post Mapping is saved and local mappings list is updated.
*/
async function handleMappingUpdate(event: CustomEvent) {
const { sourceUuid, targetUuid } = event.detail;
@@ -162,6 +168,8 @@

// [DEF:handleViewLogs:Function]
// @PURPOSE: Opens the log viewer for a specific task.
// @PRE: event.detail contains task object.
// @POST: logViewer state updated and showLogViewer set to true.
function handleViewLogs(event: CustomEvent) {
const task = event.detail;
logViewerTaskId = task.id;
@@ -172,6 +180,8 @@

// [DEF:handlePasswordPrompt:Function]
// @PURPOSE: Reactive logic to show password prompt when a task is awaiting input.
// @PRE: selectedTask status is AWAITING_INPUT.
// @POST: showPasswordPrompt set to true with request data.
// This is triggered by TaskRunner or TaskHistory when a task needs input
// For now, we rely on the WebSocket or manual check.
// Ideally, TaskHistory or TaskRunner emits an event when input is needed.
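The reactive check described by the handlePasswordPrompt comment is not fully visible in the hunk. A minimal sketch of that logic, assuming the `selectedTask` store carries a `status` field and a `pending_input` payload; `pending_input` and its shape are assumptions, while `AWAITING_INPUT` and `showPasswordPrompt` come from the comments above.

```ts
// Sketch: show the password prompt whenever the selected task is waiting for input.
// In the component this would be a `$:` reactive statement on $selectedTask.
import { writable, get } from 'svelte/store';

interface Task { id: string; status: string; pending_input?: { fields: string[] } }

const selectedTask = writable<Task | null>(null);
let showPasswordPrompt = false;
let passwordRequest: { fields: string[] } | null = null;

function checkAwaitingInput(): void {
  const task = get(selectedTask);
  if (task && task.status === 'AWAITING_INPUT') {
    passwordRequest = task.pending_input ?? { fields: [] }; // pending_input is an assumed field name
    showPasswordPrompt = true;
  }
}
```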
@@ -194,6 +204,8 @@

// [DEF:handleResumeMigration:Function]
// @PURPOSE: Resumes a migration task with provided passwords.
// @PRE: event.detail contains passwords.
// @POST: resumeTask is called and showPasswordPrompt is hidden on success.
async function handleResumeMigration(event: CustomEvent) {
if (!$selectedTask) return;

@@ -214,6 +226,7 @@
/**
* @purpose Starts the migration process.
* @pre sourceEnvId and targetEnvId must be set and different.
* @post Migration task is started and selectedTask is updated.
*/
async function startMigration() {
if (!sourceEnvId || !targetEnvId) {
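The body of `handleResumeMigration` is cut off at the hunk boundary. A sketch of the call it presumably makes: `resumeTask` is named in the @POST comment, but its signature and error handling here are assumptions.

```ts
// Sketch: resume a paused migration task with the passwords collected from the prompt.
// The resumeTask signature is assumed; only the helper name appears in the diff comments.
async function resumeMigrationSketch(
  resumeTask: (taskId: string, passwords: Record<string, string>) => Promise<void>,
  taskId: string,
  passwords: Record<string, string>
): Promise<boolean> {
  try {
    await resumeTask(taskId, passwords);
    return true; // caller hides the password prompt on success, per the @POST contract
  } catch (error) {
    console.error(`[Migration.handleResumeMigration] Resume failed error=${error}`);
    return false;
  }
}
```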
@@ -32,6 +32,8 @@

// [DEF:fetchEnvironments:Function]
// @PURPOSE: Fetches the list of environments.
// @PRE: None.
// @POST: environments array is populated.
async function fetchEnvironments() {
try {
const response = await fetch('/api/environments');
@@ -50,6 +52,8 @@
// [DEF:fetchDatabases:Function]
/**
* @purpose Fetches databases from both environments and gets suggestions.
* @pre sourceEnvId and targetEnvId must be set.
* @post sourceDatabases, targetDatabases, mappings, and suggestions are updated.
*/
async function fetchDatabases() {
if (!sourceEnvId || !targetEnvId) return;
@@ -86,6 +90,8 @@
// [DEF:handleUpdate:Function]
/**
* @purpose Saves a mapping to the backend.
* @pre event.detail contains sourceUuid and targetUuid.
* @post Mapping is saved and local mappings list is updated.
*/
async function handleUpdate(event: CustomEvent) {
const { sourceUuid, targetUuid } = event.detail;
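`handleUpdate` is truncated after destructuring `event.detail`. A sketch of how such a mapping might be persisted; the `/api/mappings` endpoint and the snake_case payload keys are assumptions, only `sourceUuid`/`targetUuid` come from the diff.

```ts
// Sketch only: assumed endpoint and payload for saving a source→target database mapping.
async function saveMappingSketch(sourceUuid: string, targetUuid: string): Promise<void> {
  const response = await fetch('/api/mappings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ source_uuid: sourceUuid, target_uuid: targetUuid }),
  });
  if (!response.ok) {
    throw new Error('Failed to save mapping');
  }
}
```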
@@ -21,6 +21,11 @@

let editingEnvId = null;

// [DEF:handleSaveGlobal:Function]
/* @PURPOSE: Saves global application settings.
@PRE: settings.settings must contain valid configuration.
@POST: Global settings are updated via API.
*/
async function handleSaveGlobal() {
try {
console.log("[Settings.handleSaveGlobal][Action] Saving global settings.");
@@ -32,7 +37,13 @@
addToast('Failed to save global settings', 'error');
}
}
// [/DEF:handleSaveGlobal:Function]

// [DEF:handleAddOrUpdateEnv:Function]
/* @PURPOSE: Adds a new environment or updates an existing one.
@PRE: newEnv must contain valid environment details.
@POST: Environment is saved and page is reloaded to reflect changes.
*/
async function handleAddOrUpdateEnv() {
try {
console.log(`[Settings.handleAddOrUpdateEnv][Action] ${editingEnvId ? 'Updating' : 'Adding'} environment.`);
@@ -54,7 +65,13 @@
addToast('Failed to save environment', 'error');
}
}
// [/DEF:handleAddOrUpdateEnv:Function]

// [DEF:handleDeleteEnv:Function]
/* @PURPOSE: Deletes a Superset environment.
@PRE: id must be a valid environment ID.
@POST: Environment is removed and page is reloaded.
*/
async function handleDeleteEnv(id) {
if (confirm('Are you sure you want to delete this environment?')) {
try {
@@ -69,7 +86,13 @@
}
}
}
// [/DEF:handleDeleteEnv:Function]

// [DEF:handleTestEnv:Function]
/* @PURPOSE: Tests the connection to a Superset environment.
@PRE: id must be a valid environment ID.
@POST: Displays success or error toast based on connection result.
*/
async function handleTestEnv(id) {
try {
console.log(`[Settings.handleTestEnv][Action] Testing environment: ${id}`);
@@ -86,12 +109,24 @@
addToast('Failed to test connection', 'error');
}
}
// [/DEF:handleTestEnv:Function]

// [DEF:editEnv:Function]
/* @PURPOSE: Populates the environment form for editing.
@PRE: env object must be provided.
@POST: newEnv and editingEnvId are updated.
*/
function editEnv(env) {
newEnv = { ...env };
editingEnvId = env.id;
}
// [/DEF:editEnv:Function]

// [DEF:resetEnvForm:Function]
/* @PURPOSE: Resets the environment creation/edit form to default state.
@PRE: None.
@POST: newEnv is cleared and editingEnvId is set to null.
*/
function resetEnvForm() {
newEnv = {
id: '',
@@ -103,6 +138,7 @@
};
editingEnvId = null;
}
// [/DEF:resetEnvForm:Function]
</script>

<div class="container mx-auto p-4">
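The `newEnv` initializer in `resetEnvForm` is truncated at the hunk boundary; only `id: ''` is visible. A hedged sketch of the form state it appears to reset; apart from `id` and `name` (which the git page reads via `env.id`/`env.name`), the remaining fields are assumptions based on how the app would connect to a Superset environment.

```ts
// Assumed shape of the environment form state; only `id` (and `name` elsewhere) is confirmed by the diff.
interface EnvironmentForm {
  id: string;
  name: string;     // display name shown in the environment selector
  url: string;      // assumption: Superset base URL
  username: string; // assumption
  password: string; // assumption
}

function resetEnvFormSketch(): EnvironmentForm {
  return { id: '', name: '', url: '', username: '', password: '' };
}
```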
@@ -1,5 +1,10 @@
import { api } from '../../lib/api';

// [DEF:load:Function]
/* @PURPOSE: Loads application settings and environment list.
@PRE: API must be reachable.
@POST: Returns settings object or default values on error.
*/
/** @type {import('./$types').PageLoad} */
export async function load() {
try {
@@ -21,3 +26,4 @@ export async function load() {
};
}
}
// [/DEF:load:Function]
Some files were not shown because too many files have changed in this diff.