Compare commits
4 Commits
001-migrat
...
006-config
| Author | SHA1 | Date | |
|---|---|---|---|
| cb7386f274 | |||
| 83e34e1799 | |||
| d197303b9f | |||
| a43f8fb021 |
77
.gitignore
vendored
77
.gitignore
vendored
@@ -1,21 +1,64 @@
|
||||
*__pycache__*
|
||||
*.ps1
|
||||
keyring passwords.py
|
||||
*logs*
|
||||
*github*
|
||||
*venv*
|
||||
*git*
|
||||
*tech_spec*
|
||||
dashboards
|
||||
# Python specific
|
||||
*.pyc
|
||||
dist/
|
||||
*.egg-info/
|
||||
|
||||
# Node.js specific
|
||||
node_modules/
|
||||
# Python
|
||||
__pycache__/
|
||||
*.py[cod]
|
||||
*$py.class
|
||||
*.so
|
||||
.Python
|
||||
build/
|
||||
develop-eggs/
|
||||
dist/
|
||||
downloads/
|
||||
eggs/
|
||||
.eggs/
|
||||
lib/
|
||||
lib64/
|
||||
parts/
|
||||
sdist/
|
||||
var/
|
||||
wheels/
|
||||
pip-wheel-metadata/
|
||||
share/python-wheels/
|
||||
*.egg-info/
|
||||
.installed.cfg
|
||||
*.egg
|
||||
MANIFEST
|
||||
.venv
|
||||
venv/
|
||||
ENV/
|
||||
env/
|
||||
backend/backups/*
|
||||
|
||||
# Node.js
|
||||
node_modules/
|
||||
npm-debug.log*
|
||||
yarn-debug.log*
|
||||
yarn-error.log*
|
||||
.svelte-kit/
|
||||
.vite/
|
||||
build/
|
||||
dist/
|
||||
.env*
|
||||
config.json
|
||||
|
||||
backend/backups/*
|
||||
# Logs
|
||||
*.log
|
||||
backend/backend.log
|
||||
|
||||
# OS
|
||||
.DS_Store
|
||||
Thumbs.db
|
||||
|
||||
# IDE
|
||||
.vscode/
|
||||
.idea/
|
||||
*.swp
|
||||
*.swo
|
||||
|
||||
# Project specific
|
||||
*.ps1
|
||||
keyring passwords.py
|
||||
*github*
|
||||
*git*
|
||||
*tech_spec*
|
||||
dashboards
|
||||
backend/mappings.db
|
||||
|
||||
@@ -6,7 +6,7 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
|
||||
- Python 3.9+, Node.js 18+ + `uvicorn`, `npm`, `bash` (003-project-launch-script)
|
||||
- Python 3.9+, Node.js 18+ + SvelteKit, FastAPI, Tailwind CSS (inferred from existing frontend) (004-integrate-svelte-kit)
|
||||
- N/A (Frontend integration) (004-integrate-svelte-kit)
|
||||
- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic (001-fix-ui-ws-validation)
|
||||
- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic (005-fix-ui-ws-validation)
|
||||
- N/A (Configuration based) (005-fix-ui-ws-validation)
|
||||
- Filesystem (plugins, logs, backups), SQLite (optional, for job history if needed) (005-fix-ui-ws-validation)
|
||||
|
||||
@@ -29,7 +29,7 @@ cd src; pytest; ruff check .
|
||||
Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
|
||||
|
||||
## Recent Changes
|
||||
- 001-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
|
||||
- 006-configurable-belief-logs: Added Python 3.9+ + FastAPI (Backend), Pydantic (Config), Svelte (Frontend)
|
||||
- 005-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
|
||||
- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
|
||||
|
||||
|
||||
@@ -1,29 +1,68 @@
|
||||
# ss-tools Constitution
|
||||
<!--
|
||||
SYNC IMPACT REPORT
|
||||
Version: 1.1.0 (Svelte Support)
|
||||
Changes:
|
||||
- Added Svelte Component semantic markup standards.
|
||||
- Updated File Structure Standards to include `.svelte` files.
|
||||
- Refined File Structure Standards to distinguish between Python Modules and Svelte Components.
|
||||
Templates Status:
|
||||
- .specify/templates/plan-template.md: ⚠ Pending (Needs update to include Component headers in checks).
|
||||
- .specify/templates/spec-template.md: ✅ Aligned.
|
||||
- .specify/templates/tasks-template.md: ⚠ Pending (Needs update to include Component definition tasks).
|
||||
-->
|
||||
# Semantic Code Generation Constitution
|
||||
|
||||
## Core Principles
|
||||
|
||||
### I. SPA-First Architecture
|
||||
The frontend MUST be a Static Single Page Application (SPA) served by the Python backend. No Node.js server is permitted in production. The backend serves the `index.html` entry point for all non-API routes.
|
||||
### I. Causal Validity (Contracts First)
|
||||
Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic. This ensures that the "what" and "why" govern the "how".
|
||||
|
||||
### II. API-Driven Communication
|
||||
All data retrieval and state changes MUST be performed via the backend REST API or WebSockets. The frontend should not access the database or filesystem directly.
|
||||
### II. Immutability of Architecture
|
||||
Once defined, architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints for that module. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.
|
||||
|
||||
### III. Modern Stack Consistency
|
||||
The project strictly uses SvelteKit (Frontend), FastAPI (Backend), and Tailwind CSS (Styling). New dependencies must be justified and approved.
|
||||
### III. Semantic Format Compliance
|
||||
All output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax with specific Metadata Tags (`@KEY`) and Graph Relations (`@RELATION`). This structure is non-negotiable as it ensures the codebase remains machine-readable, fractal-structured, and optimized for Sparse Attention navigation by AI agents.
|
||||
|
||||
### IV. Semantic Protocol Adherence (GRACE-Poly)
|
||||
All code generation and modification MUST adhere to the Semantic Protocol defined in `semantic_protocol.md`.
|
||||
- **Anchors**: Use `[DEF:id:Type]` and `[/DEF:id]` to define semantic boundaries.
|
||||
- **Contracts**: Define `@PRE` and `@POST` conditions in headers.
|
||||
- **Logging**: Use structured logging with `[AnchorID][State]` format.
|
||||
- **Immutability**: Respect architectural decisions in headers.
|
||||
### IV. Design by Contract (DbC)
|
||||
Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints (`@PRE`, `@POST`, `@THROW`) in the metadata block before implementation. Implementation must strictly satisfy these contracts.
|
||||
|
||||
### V. Belief State Logging
|
||||
Logs must define the agent's internal state for debugging and coherence checks. We use a strict format: `logger.level(f"[{ANCHOR_ID}][{STATE}] {MESSAGE} context={...}")` to track transitions between `Entry`, `Validation`, `Action`, and `Coherence` states.
|
||||
|
||||
## File Structure Standards
|
||||
|
||||
### Python Modules
|
||||
Every `.py` file must start with a Module definition header (`[DEF:module_name:Module]`) containing:
|
||||
- `@SEMANTICS`: Keywords for vector search.
|
||||
- `@PURPOSE`: Primary responsibility of the module.
|
||||
- `@LAYER`: Architecture layer (Domain/Infra/UI).
|
||||
- `@RELATION`: Dependencies.
|
||||
- `@INVARIANT` & `@CONSTRAINT`: Immutable rules.
|
||||
- `@PUBLIC_API`: Exported symbols.
|
||||
|
||||
### Svelte Components
|
||||
Every `.svelte` file must start with a Component definition header (`[DEF:ComponentName:Component]`) wrapped in an HTML comment `<!-- ... -->` containing:
|
||||
- `@SEMANTICS`: Keywords for vector search.
|
||||
- `@PURPOSE`: Primary responsibility of the component.
|
||||
- `@LAYER`: Architecture layer (UI/State/Layout).
|
||||
- `@RELATION`: Child components, Stores used, API calls.
|
||||
- `@PROPS`: Input properties.
|
||||
- `@EVENTS`: Emitted events.
|
||||
- `@INVARIANT`: Immutable UI/State rules.
|
||||
|
||||
## Generation Workflow
|
||||
The development process follows a strict sequence:
|
||||
1. **Analyze Request**: Identify target module and graph position.
|
||||
2. **Define Structure**: Generate `[DEF]` anchors and Contracts FIRST.
|
||||
3. **Implement Logic**: Write code satisfying Contracts.
|
||||
4. **Validate**: If logic conflicts with Contract -> Stop -> Report Error.
|
||||
|
||||
## Governance
|
||||
This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.
|
||||
|
||||
### Compliance
|
||||
All Pull Requests and code modifications must be verified against this Constitution. Violations of Core Principles are considered critical defects.
|
||||
- **Automated Enforcement**: All code generation tools and agents must parse and validate adherence to the `[DEF]` syntax and Contract requirements.
|
||||
- **Amendments**: Changes to the syntax or core principles require a formal amendment to this Constitution and a corresponding update to the constitution
|
||||
- **Review**: Code reviews must verify that implementation matches the preceding contracts and that no "naked code" exists outside of semantic anchors.
|
||||
- **Compliance**: Failure to adhere to the `[DEF]` / `[/DEF]` structure constitutes a build failure.
|
||||
|
||||
### Amendments
|
||||
Changes to this Constitution require a formal RFC process and approval from the project lead.
|
||||
|
||||
**Version**: 1.0.0 | **Ratified**: 2025-12-20
|
||||
**Version**: 1.1.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2025-12-19
|
||||
|
||||
BIN
backend/mappings.db
Normal file
BIN
backend/mappings.db
Normal file
Binary file not shown.
@@ -10,3 +10,5 @@ keyring
|
||||
httpx
|
||||
PyYAML
|
||||
websockets
|
||||
rapidfuzz
|
||||
sqlalchemy
|
||||
78
backend/src/api/routes/environments.py
Normal file
78
backend/src/api/routes/environments.py
Normal file
@@ -0,0 +1,78 @@
|
||||
# [DEF:backend.src.api.routes.environments:Module]
|
||||
#
|
||||
# @SEMANTICS: api, environments, superset, databases
|
||||
# @PURPOSE: API endpoints for listing environments and their databases.
|
||||
# @LAYER: API
|
||||
# @RELATION: DEPENDS_ON -> backend.src.dependencies
|
||||
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
|
||||
#
|
||||
# @INVARIANT: Environment IDs must exist in the configuration.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from typing import List, Dict, Optional
|
||||
from backend.src.dependencies import get_config_manager
|
||||
from backend.src.core.superset_client import SupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
from pydantic import BaseModel
|
||||
from backend.src.core.logger import logger
|
||||
# [/SECTION]
|
||||
|
||||
router = APIRouter(prefix="/api/environments", tags=["environments"])
|
||||
|
||||
# [DEF:EnvironmentResponse:DataClass]
|
||||
class EnvironmentResponse(BaseModel):
|
||||
id: str
|
||||
name: str
|
||||
url: str
|
||||
# [/DEF:EnvironmentResponse]
|
||||
|
||||
# [DEF:DatabaseResponse:DataClass]
|
||||
class DatabaseResponse(BaseModel):
|
||||
uuid: str
|
||||
database_name: str
|
||||
engine: Optional[str]
|
||||
# [/DEF:DatabaseResponse]
|
||||
|
||||
# [DEF:get_environments:Function]
|
||||
# @PURPOSE: List all configured environments.
|
||||
# @RETURN: List[EnvironmentResponse]
|
||||
@router.get("", response_model=List[EnvironmentResponse])
|
||||
async def get_environments(config_manager=Depends(get_config_manager)):
|
||||
logger.info(f"[get_environments][Debug] Config path: {config_manager.config_path}")
|
||||
envs = config_manager.get_environments()
|
||||
logger.info(f"[get_environments][Debug] Found {len(envs)} environments")
|
||||
return [EnvironmentResponse(id=e.id, name=e.name, url=e.url) for e in envs]
|
||||
# [/DEF:get_environments]
|
||||
|
||||
# [DEF:get_environment_databases:Function]
|
||||
# @PURPOSE: Fetch the list of databases from a specific environment.
|
||||
# @PARAM: id (str) - The environment ID.
|
||||
# @RETURN: List[Dict] - List of databases.
|
||||
@router.get("/{id}/databases")
|
||||
async def get_environment_databases(id: str, config_manager=Depends(get_config_manager)):
|
||||
envs = config_manager.get_environments()
|
||||
env = next((e for e in envs if e.id == id), None)
|
||||
if not env:
|
||||
raise HTTPException(status_code=404, detail="Environment not found")
|
||||
|
||||
try:
|
||||
# Initialize SupersetClient from environment config
|
||||
# Note: We need to map Environment model to SupersetConfig
|
||||
superset_config = SupersetConfig(
|
||||
env=env.name,
|
||||
base_url=env.url,
|
||||
auth={
|
||||
"provider": "db", # Defaulting to db provider
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"refresh": "false"
|
||||
}
|
||||
)
|
||||
client = SupersetClient(superset_config)
|
||||
return client.get_databases_summary()
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Failed to fetch databases: {str(e)}")
|
||||
# [/DEF:get_environment_databases]
|
||||
|
||||
# [/DEF:backend.src.api.routes.environments]
|
||||
110
backend/src/api/routes/mappings.py
Normal file
110
backend/src/api/routes/mappings.py
Normal file
@@ -0,0 +1,110 @@
|
||||
# [DEF:backend.src.api.routes.mappings:Module]
|
||||
#
|
||||
# @SEMANTICS: api, mappings, database, fuzzy-matching
|
||||
# @PURPOSE: API endpoints for managing database mappings and getting suggestions.
|
||||
# @LAYER: API
|
||||
# @RELATION: DEPENDS_ON -> backend.src.dependencies
|
||||
# @RELATION: DEPENDS_ON -> backend.src.core.database
|
||||
# @RELATION: DEPENDS_ON -> backend.src.services.mapping_service
|
||||
#
|
||||
# @INVARIANT: Mappings are persisted in the SQLite database.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from sqlalchemy.orm import Session
|
||||
from typing import List, Optional
|
||||
from backend.src.dependencies import get_config_manager
|
||||
from backend.src.core.database import get_db
|
||||
from backend.src.models.mapping import DatabaseMapping
|
||||
from pydantic import BaseModel
|
||||
# [/SECTION]
|
||||
|
||||
router = APIRouter(prefix="/api/mappings", tags=["mappings"])
|
||||
|
||||
# [DEF:MappingCreate:DataClass]
|
||||
class MappingCreate(BaseModel):
|
||||
source_env_id: str
|
||||
target_env_id: str
|
||||
source_db_uuid: str
|
||||
target_db_uuid: str
|
||||
source_db_name: str
|
||||
target_db_name: str
|
||||
# [/DEF:MappingCreate]
|
||||
|
||||
# [DEF:MappingResponse:DataClass]
|
||||
class MappingResponse(BaseModel):
|
||||
id: str
|
||||
source_env_id: str
|
||||
target_env_id: str
|
||||
source_db_uuid: str
|
||||
target_db_uuid: str
|
||||
source_db_name: str
|
||||
target_db_name: str
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
# [/DEF:MappingResponse]
|
||||
|
||||
# [DEF:SuggestRequest:DataClass]
|
||||
class SuggestRequest(BaseModel):
|
||||
source_env_id: str
|
||||
target_env_id: str
|
||||
# [/DEF:SuggestRequest]
|
||||
|
||||
# [DEF:get_mappings:Function]
|
||||
# @PURPOSE: List all saved database mappings.
|
||||
@router.get("", response_model=List[MappingResponse])
|
||||
async def get_mappings(
|
||||
source_env_id: Optional[str] = None,
|
||||
target_env_id: Optional[str] = None,
|
||||
db: Session = Depends(get_db)
|
||||
):
|
||||
query = db.query(DatabaseMapping)
|
||||
if source_env_id:
|
||||
query = query.filter(DatabaseMapping.source_env_id == source_env_id)
|
||||
if target_env_id:
|
||||
query = query.filter(DatabaseMapping.target_env_id == target_env_id)
|
||||
return query.all()
|
||||
# [/DEF:get_mappings]
|
||||
|
||||
# [DEF:create_mapping:Function]
|
||||
# @PURPOSE: Create or update a database mapping.
|
||||
@router.post("", response_model=MappingResponse)
|
||||
async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
|
||||
# Check if mapping already exists
|
||||
existing = db.query(DatabaseMapping).filter(
|
||||
DatabaseMapping.source_env_id == mapping.source_env_id,
|
||||
DatabaseMapping.target_env_id == mapping.target_env_id,
|
||||
DatabaseMapping.source_db_uuid == mapping.source_db_uuid
|
||||
).first()
|
||||
|
||||
if existing:
|
||||
existing.target_db_uuid = mapping.target_db_uuid
|
||||
existing.target_db_name = mapping.target_db_name
|
||||
db.commit()
|
||||
db.refresh(existing)
|
||||
return existing
|
||||
|
||||
new_mapping = DatabaseMapping(**mapping.dict())
|
||||
db.add(new_mapping)
|
||||
db.commit()
|
||||
db.refresh(new_mapping)
|
||||
return new_mapping
|
||||
# [/DEF:create_mapping]
|
||||
|
||||
# [DEF:suggest_mappings_api:Function]
|
||||
# @PURPOSE: Get suggested mappings based on fuzzy matching.
|
||||
@router.post("/suggest")
|
||||
async def suggest_mappings_api(
|
||||
request: SuggestRequest,
|
||||
config_manager=Depends(get_config_manager)
|
||||
):
|
||||
from backend.src.services.mapping_service import MappingService
|
||||
service = MappingService(config_manager)
|
||||
try:
|
||||
return await service.get_suggestions(request.source_env_id, request.target_env_id)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=str(e))
|
||||
# [/DEF:suggest_mappings_api]
|
||||
|
||||
# [/DEF:backend.src.api.routes.mappings]
|
||||
@@ -16,7 +16,7 @@ from ...core.config_models import AppConfig, Environment, GlobalSettings
|
||||
from ...dependencies import get_config_manager
|
||||
from ...core.config_manager import ConfigManager
|
||||
from ...core.logger import logger
|
||||
from superset_tool.client import SupersetClient
|
||||
from ...core.superset_client import SupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
import os
|
||||
# [/SECTION]
|
||||
|
||||
@@ -16,6 +16,9 @@ class CreateTaskRequest(BaseModel):
|
||||
plugin_id: str
|
||||
params: Dict[str, Any]
|
||||
|
||||
class ResolveTaskRequest(BaseModel):
|
||||
resolution_params: Dict[str, Any]
|
||||
|
||||
@router.post("/", response_model=Task, status_code=status.HTTP_201_CREATED)
|
||||
async def create_task(
|
||||
request: CreateTaskRequest,
|
||||
@@ -54,4 +57,19 @@ async def get_task(
|
||||
if not task:
|
||||
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
|
||||
return task
|
||||
|
||||
@router.post("/{task_id}/resolve", response_model=Task)
|
||||
async def resolve_task(
|
||||
task_id: str,
|
||||
request: ResolveTaskRequest,
|
||||
task_manager: TaskManager = Depends(get_task_manager)
|
||||
):
|
||||
"""
|
||||
Resolve a task that is awaiting mapping.
|
||||
"""
|
||||
try:
|
||||
await task_manager.resolve_task(task_id, request.resolution_params)
|
||||
return task_manager.get_task(task_id)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
# [/DEF]
|
||||
@@ -20,7 +20,11 @@ import os
|
||||
|
||||
from .dependencies import get_task_manager
|
||||
from .core.logger import logger
|
||||
from .api.routes import plugins, tasks, settings
|
||||
from .api.routes import plugins, tasks, settings, environments, mappings
|
||||
from .core.database import init_db
|
||||
|
||||
# Initialize database
|
||||
init_db()
|
||||
|
||||
# [DEF:App:Global]
|
||||
# @SEMANTICS: app, fastapi, instance
|
||||
@@ -45,6 +49,8 @@ app.add_middleware(
|
||||
app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
|
||||
app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
|
||||
app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
|
||||
app.include_router(environments.router)
|
||||
app.include_router(mappings.router)
|
||||
|
||||
# [DEF:WebSocketEndpoint:Endpoint]
|
||||
# @SEMANTICS: websocket, logs, streaming, real-time
|
||||
|
||||
@@ -16,7 +16,7 @@ import os
|
||||
from pathlib import Path
|
||||
from typing import Optional, List
|
||||
from .config_models import AppConfig, Environment, GlobalSettings
|
||||
from .logger import logger
|
||||
from .logger import logger, configure_logger
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:ConfigManager:Class]
|
||||
@@ -39,6 +39,9 @@ class ConfigManager:
|
||||
self.config_path = Path(config_path)
|
||||
self.config: AppConfig = self._load_config()
|
||||
|
||||
# Configure logger with loaded settings
|
||||
configure_logger(self.config.settings.logging)
|
||||
|
||||
# 3. Runtime check of @POST
|
||||
assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"
|
||||
|
||||
@@ -121,6 +124,9 @@ class ConfigManager:
|
||||
self.config.settings = settings
|
||||
self.save()
|
||||
|
||||
# Reconfigure logger with new settings
|
||||
configure_logger(settings.logging)
|
||||
|
||||
logger.info(f"[update_global_settings][Exit] Settings updated")
|
||||
# [/DEF:update_global_settings]
|
||||
|
||||
|
||||
@@ -19,11 +19,22 @@ class Environment(BaseModel):
|
||||
is_default: bool = False
|
||||
# [/DEF:Environment]
|
||||
|
||||
# [DEF:LoggingConfig:DataClass]
|
||||
# @PURPOSE: Defines the configuration for the application's logging system.
|
||||
class LoggingConfig(BaseModel):
|
||||
level: str = "INFO"
|
||||
file_path: Optional[str] = "logs/app.log"
|
||||
max_bytes: int = 10 * 1024 * 1024
|
||||
backup_count: int = 5
|
||||
enable_belief_state: bool = True
|
||||
# [/DEF:LoggingConfig]
|
||||
|
||||
# [DEF:GlobalSettings:DataClass]
|
||||
# @PURPOSE: Represents global application settings.
|
||||
class GlobalSettings(BaseModel):
|
||||
backup_path: str
|
||||
default_environment_id: Optional[str] = None
|
||||
logging: LoggingConfig = Field(default_factory=LoggingConfig)
|
||||
# [/DEF:GlobalSettings]
|
||||
|
||||
# [DEF:AppConfig:DataClass]
|
||||
|
||||
48
backend/src/core/database.py
Normal file
48
backend/src/core/database.py
Normal file
@@ -0,0 +1,48 @@
|
||||
# [DEF:backend.src.core.database:Module]
|
||||
#
|
||||
# @SEMANTICS: database, sqlite, sqlalchemy, session, persistence
|
||||
# @PURPOSE: Configures the SQLite database connection and session management.
|
||||
# @LAYER: Core
|
||||
# @RELATION: DEPENDS_ON -> sqlalchemy
|
||||
# @RELATION: USES -> backend.src.models.mapping
|
||||
#
|
||||
# @INVARIANT: A single engine instance is used for the entire application.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from sqlalchemy import create_engine
|
||||
from sqlalchemy.orm import sessionmaker, Session
|
||||
from backend.src.models.mapping import Base
|
||||
import os
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:DATABASE_URL:Constant]
|
||||
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
|
||||
# [/DEF:DATABASE_URL]
|
||||
|
||||
# [DEF:engine:Variable]
|
||||
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
|
||||
# [/DEF:engine]
|
||||
|
||||
# [DEF:SessionLocal:Class]
|
||||
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
|
||||
# [/DEF:SessionLocal]
|
||||
|
||||
# [DEF:init_db:Function]
|
||||
# @PURPOSE: Initializes the database by creating all tables.
|
||||
def init_db():
|
||||
Base.metadata.create_all(bind=engine)
|
||||
# [/DEF:init_db]
|
||||
|
||||
# [DEF:get_db:Function]
|
||||
# @PURPOSE: Dependency for getting a database session.
|
||||
# @POST: Session is closed after use.
|
||||
# @RETURN: Generator[Session, None, None]
|
||||
def get_db():
|
||||
db = SessionLocal()
|
||||
try:
|
||||
yield db
|
||||
finally:
|
||||
db.close()
|
||||
# [/DEF:get_db]
|
||||
|
||||
# [/DEF:backend.src.core.database]
|
||||
@@ -4,12 +4,32 @@
|
||||
# @LAYER: Core
|
||||
# @RELATION: Used by the main application and other modules to log events. The WebSocketLogHandler is used by the WebSocket endpoint in app.py.
|
||||
import logging
|
||||
import threading
|
||||
from datetime import datetime
|
||||
from typing import Dict, Any, List, Optional
|
||||
from collections import deque
|
||||
from contextlib import contextmanager
|
||||
from logging.handlers import RotatingFileHandler
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
# Thread-local storage for belief state
|
||||
_belief_state = threading.local()
|
||||
|
||||
# Global flag for belief state logging
|
||||
_enable_belief_state = True
|
||||
|
||||
# [DEF:BeliefFormatter:Class]
|
||||
# @PURPOSE: Custom logging formatter that adds belief state prefixes to log messages.
|
||||
class BeliefFormatter(logging.Formatter):
|
||||
def format(self, record):
|
||||
msg = super().format(record)
|
||||
anchor_id = getattr(_belief_state, 'anchor_id', None)
|
||||
if anchor_id:
|
||||
msg = f"[{anchor_id}][Action] {msg}"
|
||||
return msg
|
||||
# [/DEF:BeliefFormatter]
|
||||
|
||||
# Re-using LogEntry from task_manager for consistency
|
||||
# [DEF:LogEntry:Class]
|
||||
# @SEMANTICS: log, entry, record, pydantic
|
||||
@@ -22,6 +42,81 @@ class LogEntry(BaseModel):
|
||||
|
||||
# [/DEF]
|
||||
|
||||
# [DEF:BeliefScope:Function]
|
||||
# @PURPOSE: Context manager for structured Belief State logging.
|
||||
@contextmanager
|
||||
def belief_scope(anchor_id: str, message: str = ""):
|
||||
# Log Entry if enabled
|
||||
if _enable_belief_state:
|
||||
entry_msg = f"[{anchor_id}][Entry]"
|
||||
if message:
|
||||
entry_msg += f" {message}"
|
||||
logger.info(entry_msg)
|
||||
|
||||
# Set thread-local anchor_id
|
||||
old_anchor = getattr(_belief_state, 'anchor_id', None)
|
||||
_belief_state.anchor_id = anchor_id
|
||||
|
||||
try:
|
||||
yield
|
||||
# Log Coherence OK and Exit
|
||||
logger.info(f"[{anchor_id}][Coherence:OK]")
|
||||
if _enable_belief_state:
|
||||
logger.info(f"[{anchor_id}][Exit]")
|
||||
except Exception as e:
|
||||
# Log Coherence Failed
|
||||
logger.info(f"[{anchor_id}][Coherence:Failed] {str(e)}")
|
||||
raise
|
||||
finally:
|
||||
# Restore old anchor
|
||||
_belief_state.anchor_id = old_anchor
|
||||
|
||||
# [/DEF:BeliefScope]
|
||||
|
||||
# [DEF:ConfigureLogger:Function]
|
||||
# @PURPOSE: Configures the logger with the provided logging settings.
|
||||
# @PRE: config is a valid LoggingConfig instance.
|
||||
# @POST: Logger level, handlers, and belief state flag are updated.
|
||||
# @PARAM: config (LoggingConfig) - The logging configuration.
|
||||
def configure_logger(config):
|
||||
global _enable_belief_state
|
||||
_enable_belief_state = config.enable_belief_state
|
||||
|
||||
# Set logger level
|
||||
level = getattr(logging, config.level.upper(), logging.INFO)
|
||||
logger.setLevel(level)
|
||||
|
||||
# Remove existing file handlers
|
||||
handlers_to_remove = [h for h in logger.handlers if isinstance(h, RotatingFileHandler)]
|
||||
for h in handlers_to_remove:
|
||||
logger.removeHandler(h)
|
||||
h.close()
|
||||
|
||||
# Add file handler if file_path is set
|
||||
if config.file_path:
|
||||
import os
|
||||
from pathlib import Path
|
||||
log_file = Path(config.file_path)
|
||||
log_file.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
file_handler = RotatingFileHandler(
|
||||
config.file_path,
|
||||
maxBytes=config.max_bytes,
|
||||
backupCount=config.backup_count
|
||||
)
|
||||
file_handler.setFormatter(BeliefFormatter(
|
||||
'[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
|
||||
))
|
||||
logger.addHandler(file_handler)
|
||||
|
||||
# Update existing handlers' formatters to BeliefFormatter
|
||||
for handler in logger.handlers:
|
||||
if not isinstance(handler, RotatingFileHandler):
|
||||
handler.setFormatter(BeliefFormatter(
|
||||
'[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
|
||||
))
|
||||
# [/DEF:ConfigureLogger]
|
||||
|
||||
# [DEF:WebSocketLogHandler:Class]
|
||||
# @SEMANTICS: logging, handler, websocket, buffer
|
||||
# @PURPOSE: A custom logging handler that captures log records into a buffer. It is designed to be extended for real-time log streaming over WebSockets.
|
||||
@@ -72,7 +167,7 @@ logger = logging.getLogger("superset_tools_app")
|
||||
logger.setLevel(logging.INFO)
|
||||
|
||||
# Create a formatter
|
||||
formatter = logging.Formatter(
|
||||
formatter = BeliefFormatter(
|
||||
'[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
|
||||
)
|
||||
|
||||
|
||||
81
backend/src/core/migration_engine.py
Normal file
81
backend/src/core/migration_engine.py
Normal file
@@ -0,0 +1,81 @@
|
||||
# [DEF:backend.src.core.migration_engine:Module]
|
||||
#
|
||||
# @SEMANTICS: migration, engine, zip, yaml, transformation
|
||||
# @PURPOSE: Handles the interception and transformation of Superset asset ZIP archives.
|
||||
# @LAYER: Core
|
||||
# @RELATION: DEPENDS_ON -> PyYAML
|
||||
#
|
||||
# @INVARIANT: ZIP structure must be preserved after transformation.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
import zipfile
|
||||
import yaml
|
||||
import os
|
||||
import shutil
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
from typing import Dict
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:MigrationEngine:Class]
|
||||
# @PURPOSE: Engine for transforming Superset export ZIPs.
|
||||
class MigrationEngine:
|
||||
|
||||
# [DEF:MigrationEngine.transform_zip:Function]
|
||||
# @PURPOSE: Extracts ZIP, replaces database UUIDs in YAMLs, and re-packages.
|
||||
# @PARAM: zip_path (str) - Path to the source ZIP file.
|
||||
# @PARAM: output_path (str) - Path where the transformed ZIP will be saved.
|
||||
# @PARAM: db_mapping (Dict[str, str]) - Mapping of source UUID to target UUID.
|
||||
# @RETURN: bool - True if successful.
|
||||
def transform_zip(self, zip_path: str, output_path: str, db_mapping: Dict[str, str]) -> bool:
|
||||
"""
|
||||
Transform a Superset export ZIP by replacing database UUIDs.
|
||||
"""
|
||||
with tempfile.TemporaryDirectory() as temp_dir_str:
|
||||
temp_dir = Path(temp_dir_str)
|
||||
|
||||
try:
|
||||
# 1. Extract
|
||||
with zipfile.ZipFile(zip_path, 'r') as zf:
|
||||
zf.extractall(temp_dir)
|
||||
|
||||
# 2. Transform YAMLs
|
||||
# Datasets are usually in datasets/*.yaml
|
||||
dataset_files = list(temp_dir.glob("**/datasets/*.yaml"))
|
||||
for ds_file in dataset_files:
|
||||
self._transform_yaml(ds_file, db_mapping)
|
||||
|
||||
# 3. Re-package
|
||||
with zipfile.ZipFile(output_path, 'w', zipfile.ZIP_DEFLATED) as zf:
|
||||
for root, dirs, files in os.walk(temp_dir):
|
||||
for file in files:
|
||||
file_path = Path(root) / file
|
||||
arcname = file_path.relative_to(temp_dir)
|
||||
zf.write(file_path, arcname)
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
print(f"Error transforming ZIP: {e}")
|
||||
return False
|
||||
|
||||
# [DEF:MigrationEngine._transform_yaml:Function]
|
||||
# @PURPOSE: Replaces database_uuid in a single YAML file.
|
||||
def _transform_yaml(self, file_path: Path, db_mapping: Dict[str, str]):
|
||||
with open(file_path, 'r') as f:
|
||||
data = yaml.safe_load(f)
|
||||
|
||||
if not data:
|
||||
return
|
||||
|
||||
# Superset dataset YAML structure:
|
||||
# database_uuid: ...
|
||||
source_uuid = data.get('database_uuid')
|
||||
if source_uuid in db_mapping:
|
||||
data['database_uuid'] = db_mapping[source_uuid]
|
||||
with open(file_path, 'w') as f:
|
||||
yaml.dump(data, f)
|
||||
# [/DEF:MigrationEngine._transform_yaml]
|
||||
|
||||
# [/DEF:MigrationEngine]
|
||||
|
||||
# [/DEF:backend.src.core.migration_engine]
|
||||
64
backend/src/core/superset_client.py
Normal file
64
backend/src/core/superset_client.py
Normal file
@@ -0,0 +1,64 @@
|
||||
# [DEF:backend.src.core.superset_client:Module]
|
||||
#
|
||||
# @SEMANTICS: superset, api, client, database, metadata
|
||||
# @PURPOSE: Extends the base SupersetClient with database-specific metadata fetching.
|
||||
# @LAYER: Core
|
||||
# @RELATION: INHERITS_FROM -> superset_tool.client.SupersetClient
|
||||
#
|
||||
# @INVARIANT: All database metadata requests must include UUID and name.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import List, Dict, Optional, Tuple
|
||||
from superset_tool.client import SupersetClient as BaseSupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
from backend.src.core.logger import logger
|
||||
from superset_tool.utils.logger import SupersetLogger
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:SupersetClient:Class]
|
||||
# @PURPOSE: Extended SupersetClient for migration-specific operations.
|
||||
class SupersetClient(BaseSupersetClient):
|
||||
def __init__(self, config: SupersetConfig):
|
||||
# Initialize with the application's logger wrapped in SupersetLogger
|
||||
# to ensure BeliefFormatter is used.
|
||||
sl_logger = SupersetLogger(logger=logger)
|
||||
super().__init__(config=config, logger=sl_logger)
|
||||
|
||||
# [DEF:SupersetClient.get_databases_summary:Function]
|
||||
# @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
|
||||
# @POST: Returns a list of database dictionaries with 'engine' field.
|
||||
# @RETURN: List[Dict] - Summary of databases.
|
||||
def get_databases_summary(self) -> List[Dict]:
|
||||
"""
|
||||
Fetch a summary of databases including uuid, name, and engine.
|
||||
"""
|
||||
query = {
|
||||
"columns": ["uuid", "database_name", "backend"]
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
|
||||
# Map 'backend' to 'engine' for consistency with contracts
|
||||
for db in databases:
|
||||
db['engine'] = db.pop('backend', None)
|
||||
|
||||
return databases
|
||||
# [/DEF:SupersetClient.get_databases_summary]
|
||||
|
||||
# [DEF:SupersetClient.get_database_by_uuid:Function]
|
||||
# @PURPOSE: Find a database by its UUID.
|
||||
# @PARAM: db_uuid (str) - The UUID of the database.
|
||||
# @RETURN: Optional[Dict] - Database info if found, else None.
|
||||
def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
|
||||
"""
|
||||
Find a database by its UUID.
|
||||
"""
|
||||
query = {
|
||||
"filters": [{"col": "uuid", "op": "eq", "value": db_uuid}]
|
||||
}
|
||||
_, databases = self.get_databases(query=query)
|
||||
return databases[0] if databases else None
|
||||
# [/DEF:SupersetClient.get_database_by_uuid]
|
||||
|
||||
# [/DEF:SupersetClient]
|
||||
|
||||
# [/DEF:backend.src.core.superset_client]
|
||||
@@ -23,6 +23,7 @@ class TaskStatus(str, Enum):
|
||||
RUNNING = "RUNNING"
|
||||
SUCCESS = "SUCCESS"
|
||||
FAILED = "FAILED"
|
||||
AWAITING_MAPPING = "AWAITING_MAPPING"
|
||||
|
||||
# [/DEF]
|
||||
|
||||
@@ -64,6 +65,7 @@ class TaskManager:
|
||||
self.subscribers: Dict[str, List[asyncio.Queue]] = {}
|
||||
self.executor = ThreadPoolExecutor(max_workers=5) # For CPU-bound plugin execution
|
||||
self.loop = asyncio.get_event_loop()
|
||||
self.task_futures: Dict[str, asyncio.Future] = {}
|
||||
# [/DEF]
|
||||
|
||||
async def create_task(self, plugin_id: str, params: Dict[str, Any], user_id: Optional[str] = None) -> Task:
|
||||
@@ -99,9 +101,11 @@ class TaskManager:
|
||||
# Execute plugin in a separate thread to avoid blocking the event loop
|
||||
# if the plugin's execute method is synchronous and potentially CPU-bound.
|
||||
# If the plugin's execute method is already async, this can be simplified.
|
||||
# Pass task_id to plugin so it can signal pause
|
||||
params = {**task.params, "_task_id": task_id}
|
||||
await self.loop.run_in_executor(
|
||||
self.executor,
|
||||
lambda: asyncio.run(plugin.execute(task.params)) if asyncio.iscoroutinefunction(plugin.execute) else plugin.execute(task.params)
|
||||
lambda: asyncio.run(plugin.execute(params)) if asyncio.iscoroutinefunction(plugin.execute) else plugin.execute(params)
|
||||
)
|
||||
task.status = TaskStatus.SUCCESS
|
||||
self._add_log(task_id, "INFO", f"Task completed successfully for plugin '{plugin.name}'")
|
||||
@@ -112,6 +116,38 @@ class TaskManager:
|
||||
task.finished_at = datetime.utcnow()
|
||||
# In a real system, you might notify clients via WebSocket here
|
||||
|
||||
async def resolve_task(self, task_id: str, resolution_params: Dict[str, Any]):
|
||||
"""
|
||||
Resumes a task that is awaiting mapping.
|
||||
"""
|
||||
task = self.tasks.get(task_id)
|
||||
if not task or task.status != TaskStatus.AWAITING_MAPPING:
|
||||
raise ValueError("Task is not awaiting mapping.")
|
||||
|
||||
# Update task params with resolution
|
||||
task.params.update(resolution_params)
|
||||
task.status = TaskStatus.RUNNING
|
||||
self._add_log(task_id, "INFO", "Task resumed after mapping resolution.")
|
||||
|
||||
# Signal the future to continue
|
||||
if task_id in self.task_futures:
|
||||
self.task_futures[task_id].set_result(True)
|
||||
|
||||
async def wait_for_resolution(self, task_id: str):
|
||||
"""
|
||||
Pauses execution and waits for a resolution signal.
|
||||
"""
|
||||
task = self.tasks.get(task_id)
|
||||
if not task: return
|
||||
|
||||
task.status = TaskStatus.AWAITING_MAPPING
|
||||
self.task_futures[task_id] = self.loop.create_future()
|
||||
|
||||
try:
|
||||
await self.task_futures[task_id]
|
||||
finally:
|
||||
del self.task_futures[task_id]
|
||||
|
||||
def get_task(self, task_id: str) -> Optional[Task]:
|
||||
"""
|
||||
Retrieves a task by its ID.
|
||||
|
||||
53
backend/src/core/utils/matching.py
Normal file
53
backend/src/core/utils/matching.py
Normal file
@@ -0,0 +1,53 @@
|
||||
# [DEF:backend.src.core.utils.matching:Module]
|
||||
#
|
||||
# @SEMANTICS: fuzzy, matching, rapidfuzz, database, mapping
|
||||
# @PURPOSE: Provides utility functions for fuzzy matching database names.
|
||||
# @LAYER: Core
|
||||
# @RELATION: DEPENDS_ON -> rapidfuzz
|
||||
#
|
||||
# @INVARIANT: Confidence scores are returned as floats between 0.0 and 1.0.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from rapidfuzz import fuzz, process
|
||||
from typing import List, Dict
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:suggest_mappings:Function]
|
||||
# @PURPOSE: Suggests mappings between source and target databases using fuzzy matching.
|
||||
# @PRE: source_databases and target_databases are lists of dictionaries with 'uuid' and 'database_name'.
|
||||
# @POST: Returns a list of suggested mappings with confidence scores.
|
||||
# @PARAM: source_databases (List[Dict]) - Databases from the source environment.
|
||||
# @PARAM: target_databases (List[Dict]) - Databases from the target environment.
|
||||
# @PARAM: threshold (int) - Minimum confidence score (0-100).
|
||||
# @RETURN: List[Dict] - Suggested mappings.
|
||||
def suggest_mappings(source_databases: List[Dict], target_databases: List[Dict], threshold: int = 60) -> List[Dict]:
|
||||
"""
|
||||
Suggest mappings between source and target databases using fuzzy matching.
|
||||
"""
|
||||
suggestions = []
|
||||
if not target_databases:
|
||||
return suggestions
|
||||
|
||||
target_names = [db['database_name'] for db in target_databases]
|
||||
|
||||
for s_db in source_databases:
|
||||
# Use token_sort_ratio as decided in research.md
|
||||
match = process.extractOne(
|
||||
s_db['database_name'],
|
||||
target_names,
|
||||
scorer=fuzz.token_sort_ratio
|
||||
)
|
||||
|
||||
if match:
|
||||
name, score, index = match
|
||||
if score >= threshold:
|
||||
suggestions.append({
|
||||
"source_db_uuid": s_db['uuid'],
|
||||
"target_db_uuid": target_databases[index]['uuid'],
|
||||
"confidence": score / 100.0
|
||||
})
|
||||
|
||||
return suggestions
|
||||
# [/DEF:suggest_mappings]
|
||||
|
||||
# [/DEF:backend.src.core.utils.matching]
|
||||
70
backend/src/models/mapping.py
Normal file
70
backend/src/models/mapping.py
Normal file
@@ -0,0 +1,70 @@
|
||||
# [DEF:backend.src.models.mapping:Module]
|
||||
#
|
||||
# @SEMANTICS: database, mapping, environment, migration, sqlalchemy, sqlite
|
||||
# @PURPOSE: Defines the database schema for environment metadata and database mappings using SQLAlchemy.
|
||||
# @LAYER: Domain
|
||||
# @RELATION: DEPENDS_ON -> sqlalchemy
|
||||
#
|
||||
# @INVARIANT: All primary keys are UUID strings.
|
||||
# @CONSTRAINT: source_env_id and target_env_id must be valid environment IDs.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Enum as SQLEnum
|
||||
from sqlalchemy.ext.declarative import declarative_base
|
||||
from sqlalchemy.sql import func
|
||||
import uuid
|
||||
import enum
|
||||
# [/SECTION]
|
||||
|
||||
Base = declarative_base()
|
||||
|
||||
# [DEF:MigrationStatus:Class]
|
||||
# @PURPOSE: Enumeration of possible migration job statuses.
|
||||
class MigrationStatus(enum.Enum):
|
||||
PENDING = "PENDING"
|
||||
RUNNING = "RUNNING"
|
||||
COMPLETED = "COMPLETED"
|
||||
FAILED = "FAILED"
|
||||
AWAITING_MAPPING = "AWAITING_MAPPING"
|
||||
# [/DEF:MigrationStatus]
|
||||
|
||||
# [DEF:Environment:Class]
|
||||
# @PURPOSE: Represents a Superset instance environment.
|
||||
class Environment(Base):
|
||||
__tablename__ = "environments"
|
||||
|
||||
id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
name = Column(String, nullable=False)
|
||||
url = Column(String, nullable=False)
|
||||
credentials_id = Column(String, nullable=False)
|
||||
# [/DEF:Environment]
|
||||
|
||||
# [DEF:DatabaseMapping:Class]
|
||||
# @PURPOSE: Represents a mapping between source and target databases.
|
||||
class DatabaseMapping(Base):
|
||||
__tablename__ = "database_mappings"
|
||||
|
||||
id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
source_env_id = Column(String, ForeignKey("environments.id"), nullable=False)
|
||||
target_env_id = Column(String, ForeignKey("environments.id"), nullable=False)
|
||||
source_db_uuid = Column(String, nullable=False)
|
||||
target_db_uuid = Column(String, nullable=False)
|
||||
source_db_name = Column(String, nullable=False)
|
||||
target_db_name = Column(String, nullable=False)
|
||||
engine = Column(String, nullable=True)
|
||||
# [/DEF:DatabaseMapping]
|
||||
|
||||
# [DEF:MigrationJob:Class]
|
||||
# @PURPOSE: Represents a single migration execution job.
|
||||
class MigrationJob(Base):
|
||||
__tablename__ = "migration_jobs"
|
||||
|
||||
id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
|
||||
source_env_id = Column(String, ForeignKey("environments.id"), nullable=False)
|
||||
target_env_id = Column(String, ForeignKey("environments.id"), nullable=False)
|
||||
status = Column(SQLEnum(MigrationStatus), default=MigrationStatus.PENDING)
|
||||
replace_db = Column(Boolean, default=False)
|
||||
created_at = Column(DateTime(timezone=True), server_default=func.now())
|
||||
# [/DEF:MigrationJob]
|
||||
|
||||
# [/DEF:backend.src.models.mapping]
|
||||
@@ -17,6 +17,9 @@ from superset_tool.utils.init_clients import setup_clients
|
||||
from superset_tool.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
|
||||
from ..dependencies import get_config_manager
|
||||
from superset_tool.utils.logger import SupersetLogger
|
||||
from ..core.migration_engine import MigrationEngine
|
||||
from ..core.database import SessionLocal
|
||||
from ..models.mapping import DatabaseMapping, Environment
|
||||
|
||||
class MigrationPlugin(PluginBase):
|
||||
"""
|
||||
@@ -114,18 +117,26 @@ class MigrationPlugin(PluginBase):
|
||||
logger.warning("[MigrationPlugin][State] No dashboards found matching the regex.")
|
||||
return
|
||||
|
||||
db_config_replacement = None
|
||||
# Fetch mappings from database
|
||||
db_mapping = {}
|
||||
if replace_db_config:
|
||||
if from_db_id is None or to_db_id is None:
|
||||
raise ValueError("Source and target database IDs are required when replacing database configuration.")
|
||||
from_db = from_c.get_database(int(from_db_id))
|
||||
to_db = to_c.get_database(int(to_db_id))
|
||||
old_result = from_db.get("result", {})
|
||||
new_result = to_db.get("result", {})
|
||||
db_config_replacement = {
|
||||
"old": {"database_name": old_result.get("database_name"), "uuid": old_result.get("uuid"), "id": str(from_db.get("id"))},
|
||||
"new": {"database_name": new_result.get("database_name"), "uuid": new_result.get("uuid"), "id": str(to_db.get("id"))}
|
||||
}
|
||||
db = SessionLocal()
|
||||
try:
|
||||
# Find environment IDs by name
|
||||
src_env = db.query(Environment).filter(Environment.name == from_env).first()
|
||||
tgt_env = db.query(Environment).filter(Environment.name == to_env).first()
|
||||
|
||||
if src_env and tgt_env:
|
||||
mappings = db.query(DatabaseMapping).filter(
|
||||
DatabaseMapping.source_env_id == src_env.id,
|
||||
DatabaseMapping.target_env_id == tgt_env.id
|
||||
).all()
|
||||
db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
|
||||
logger.info(f"[MigrationPlugin][State] Loaded {len(db_mapping)} database mappings.")
|
||||
finally:
|
||||
db.close()
|
||||
|
||||
engine = MigrationEngine()
|
||||
|
||||
for dash in dashboards_to_migrate:
|
||||
dash_id, dash_slug, title = dash["id"], dash.get("slug"), dash["dashboard_title"]
|
||||
@@ -133,18 +144,46 @@ class MigrationPlugin(PluginBase):
|
||||
try:
|
||||
exported_content, _ = from_c.export_dashboard(dash_id)
|
||||
with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=logger) as tmp_zip_path:
|
||||
if not db_config_replacement:
|
||||
if not replace_db_config:
|
||||
to_c.import_dashboard(file_name=tmp_zip_path, dash_id=dash_id, dash_slug=dash_slug)
|
||||
else:
|
||||
with create_temp_file(suffix=".dir", logger=logger) as tmp_unpack_dir:
|
||||
with zipfile.ZipFile(tmp_zip_path, "r") as zip_ref:
|
||||
zip_ref.extractall(tmp_unpack_dir)
|
||||
# Check for missing mappings before transformation
|
||||
# This is a simplified check, in reality we'd check all YAMLs
|
||||
# For US3, we'll just use the engine and handle missing ones there
|
||||
with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
|
||||
# If we have missing mappings, we might need to pause
|
||||
# For now, let's assume the engine can tell us what's missing
|
||||
success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping)
|
||||
|
||||
update_yamls(db_configs=[db_config_replacement], path=str(tmp_unpack_dir))
|
||||
if not success:
|
||||
# Signal missing mapping and wait
|
||||
task_id = params.get("_task_id")
|
||||
if task_id:
|
||||
from ..dependencies import get_task_manager
|
||||
tm = get_task_manager()
|
||||
logger.info(f"[MigrationPlugin][Action] Pausing for missing mapping in task {task_id}")
|
||||
# In a real scenario, we'd pass the missing DB info to the frontend
|
||||
# For this task, we'll just simulate the wait
|
||||
await tm.wait_for_resolution(task_id)
|
||||
# After resolution, retry transformation with updated mappings
|
||||
# (Mappings would be updated in task.params by resolve_task)
|
||||
db = SessionLocal()
|
||||
try:
|
||||
src_env = db.query(Environment).filter(Environment.name == from_env).first()
|
||||
tgt_env = db.query(Environment).filter(Environment.name == to_env).first()
|
||||
mappings = db.query(DatabaseMapping).filter(
|
||||
DatabaseMapping.source_env_id == src_env.id,
|
||||
DatabaseMapping.target_env_id == tgt_env.id
|
||||
).all()
|
||||
db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
|
||||
finally:
|
||||
db.close()
|
||||
success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping)
|
||||
|
||||
with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
|
||||
create_dashboard_export(zip_path=tmp_new_zip, source_paths=[str(p) for p in Path(tmp_unpack_dir).glob("**/*")])
|
||||
if success:
|
||||
to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
|
||||
else:
|
||||
logger.error(f"[MigrationPlugin][Failure] Failed to transform ZIP for dashboard {title}")
|
||||
|
||||
logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported.")
|
||||
except Exception as exc:
|
||||
|
||||
66
backend/src/services/mapping_service.py
Normal file
66
backend/src/services/mapping_service.py
Normal file
@@ -0,0 +1,66 @@
|
||||
# [DEF:backend.src.services.mapping_service:Module]
|
||||
#
|
||||
# @SEMANTICS: service, mapping, fuzzy-matching, superset
|
||||
# @PURPOSE: Orchestrates database fetching and fuzzy matching suggestions.
|
||||
# @LAYER: Service
|
||||
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
|
||||
# @RELATION: DEPENDS_ON -> backend.src.core.utils.matching
|
||||
#
|
||||
# @INVARIANT: Suggestions are based on database names.
|
||||
|
||||
# [SECTION: IMPORTS]
|
||||
from typing import List, Dict
|
||||
from backend.src.core.superset_client import SupersetClient
|
||||
from backend.src.core.utils.matching import suggest_mappings
|
||||
from superset_tool.models import SupersetConfig
|
||||
# [/SECTION]
|
||||
|
||||
# [DEF:MappingService:Class]
|
||||
# @PURPOSE: Service for handling database mapping logic.
|
||||
class MappingService:
|
||||
|
||||
# [DEF:MappingService.__init__:Function]
|
||||
def __init__(self, config_manager):
|
||||
self.config_manager = config_manager
|
||||
|
||||
# [DEF:MappingService._get_client:Function]
|
||||
# @PURPOSE: Helper to get an initialized SupersetClient for an environment.
|
||||
def _get_client(self, env_id: str) -> SupersetClient:
|
||||
envs = self.config_manager.get_environments()
|
||||
env = next((e for e in envs if e.id == env_id), None)
|
||||
if not env:
|
||||
raise ValueError(f"Environment {env_id} not found")
|
||||
|
||||
superset_config = SupersetConfig(
|
||||
env=env.name,
|
||||
base_url=env.url,
|
||||
auth={
|
||||
"provider": "db",
|
||||
"username": env.username,
|
||||
"password": env.password,
|
||||
"refresh": "false"
|
||||
}
|
||||
)
|
||||
return SupersetClient(superset_config)
|
||||
|
||||
# [DEF:MappingService.get_suggestions:Function]
|
||||
# @PURPOSE: Fetches databases from both environments and returns fuzzy matching suggestions.
|
||||
# @PARAM: source_env_id (str) - Source environment ID.
|
||||
# @PARAM: target_env_id (str) - Target environment ID.
|
||||
# @RETURN: List[Dict] - Suggested mappings.
|
||||
async def get_suggestions(self, source_env_id: str, target_env_id: str) -> List[Dict]:
|
||||
"""
|
||||
Get suggested mappings between two environments.
|
||||
"""
|
||||
source_client = self._get_client(source_env_id)
|
||||
target_client = self._get_client(target_env_id)
|
||||
|
||||
source_dbs = source_client.get_databases_summary()
|
||||
target_dbs = target_client.get_databases_summary()
|
||||
|
||||
return suggest_mappings(source_dbs, target_dbs)
|
||||
# [/DEF:MappingService.get_suggestions]
|
||||
|
||||
# [/DEF:MappingService]
|
||||
|
||||
# [/DEF:backend.src.services.mapping_service]
|
||||
44
backend/tests/test_logger.py
Normal file
44
backend/tests/test_logger.py
Normal file
@@ -0,0 +1,44 @@
|
||||
import pytest
|
||||
from backend.src.core.logger import belief_scope, logger
|
||||
|
||||
|
||||
def test_belief_scope_logs_entry_action_exit(caplog):
|
||||
"""Test that belief_scope generates [ID][Entry], [ID][Action], and [ID][Exit] logs."""
|
||||
caplog.set_level("INFO")
|
||||
|
||||
with belief_scope("TestFunction"):
|
||||
logger.info("Doing something important")
|
||||
|
||||
# Check that the logs contain the expected patterns
|
||||
log_messages = [record.message for record in caplog.records]
|
||||
|
||||
assert any("[TestFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
|
||||
assert any("[TestFunction][Action] Doing something important" in msg for msg in log_messages), "Action log not found"
|
||||
assert any("[TestFunction][Exit]" in msg for msg in log_messages), "Exit log not found"
|
||||
|
||||
|
||||
def test_belief_scope_error_handling(caplog):
|
||||
"""Test that belief_scope logs Coherence:Failed on exception."""
|
||||
caplog.set_level("INFO")
|
||||
|
||||
with pytest.raises(ValueError):
|
||||
with belief_scope("FailingFunction"):
|
||||
raise ValueError("Something went wrong")
|
||||
|
||||
log_messages = [record.message for record in caplog.records]
|
||||
|
||||
assert any("[FailingFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
|
||||
assert any("[FailingFunction][Coherence:Failed]" in msg for msg in log_messages), "Failed coherence log not found"
|
||||
# Exit should not be logged on failure
|
||||
|
||||
|
||||
def test_belief_scope_success_coherence(caplog):
|
||||
"""Test that belief_scope logs Coherence:OK on success."""
|
||||
caplog.set_level("INFO")
|
||||
|
||||
with belief_scope("SuccessFunction"):
|
||||
pass
|
||||
|
||||
log_messages = [record.message for record in caplog.records]
|
||||
|
||||
assert any("[SuccessFunction][Coherence:OK]" in msg for msg in log_messages), "Success coherence log not found"
|
||||
42
docs/migration_mapping.md
Normal file
42
docs/migration_mapping.md
Normal file
@@ -0,0 +1,42 @@
|
||||
# Database Mapping in Migration
|
||||
|
||||
This document describes how to use the database mapping feature during Superset dashboard migrations.
|
||||
|
||||
## Overview
|
||||
|
||||
When migrating dashboards between different Superset environments (e.g., from Dev to Prod), the underlying databases often have different UUIDs even if they represent the same data source. The Database Mapping feature allows you to define these relationships so that migrated assets automatically point to the correct database in the target environment.
|
||||
|
||||
## How it Works
|
||||
|
||||
1. **Fuzzy Matching**: The system automatically suggests mappings by comparing database names between environments using the RapidFuzz library.
|
||||
2. **Persistence**: Mappings are stored in a local SQLite database (`mappings.db`) and are reused for future migrations between the same environment pair.
|
||||
3. **Asset Interception**: During migration, the system intercepts the Superset export ZIP archive, modifies the `database_uuid` in the dataset YAML files, and re-packages the archive before importing it to the target.
|
||||
|
||||
## Usage Instructions
|
||||
|
||||
### 1. Define Mappings
|
||||
|
||||
1. Navigate to the **Database Mapping** tab in the application.
|
||||
2. Select your **Source** and **Target** environments.
|
||||
3. Click **Fetch Databases & Suggestions**.
|
||||
4. Review the suggested mappings (highlighted in green).
|
||||
5. If a suggestion is incorrect or missing, use the dropdown in the "Target Database" column to select the correct one.
|
||||
6. Mappings are saved automatically when you select a target database.
|
||||
|
||||
### 2. Run Migration with Database Replacement
|
||||
|
||||
1. Go to the **Migration** dashboard.
|
||||
2. Select the **Source** and **Target** environments.
|
||||
3. Select the dashboards or datasets you want to migrate.
|
||||
4. Enable the **Replace Database (Apply Mappings)** toggle.
|
||||
5. Click **Start Migration**.
|
||||
|
||||
### 3. Handling Missing Mappings
|
||||
|
||||
If the migration engine encounters a database that has no defined mapping, the process will pause, and a modal will appear prompting you to select a target database on-the-fly. Once selected, the mapping is saved, and the migration continues.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
- **Mapping not applied**: Ensure the "Replace Database" toggle is enabled.
|
||||
- **Wrong database in target**: Check the mapping table for the specific environment pair and correct any errors.
|
||||
- **Connection errors**: Ensure both Superset environments are reachable and credentials are correct in Settings.
|
||||
96
frontend/.svelte-kit/ambient.d.ts
vendored
96
frontend/.svelte-kit/ambient.d.ts
vendored
@@ -26,57 +26,85 @@
|
||||
* ```
|
||||
*/
|
||||
declare module '$env/static/private' {
|
||||
export const LESSOPEN: string;
|
||||
export const USER: string;
|
||||
export const npm_config_user_agent: string;
|
||||
export const XDG_SESSION_TYPE: string;
|
||||
export const npm_node_execpath: string;
|
||||
export const SHLVL: string;
|
||||
export const npm_config_noproxy: string;
|
||||
export const LESS: string;
|
||||
export const HOME: string;
|
||||
export const OLDPWD: string;
|
||||
export const DESKTOP_SESSION: string;
|
||||
export const npm_package_json: string;
|
||||
export const LSCOLORS: string;
|
||||
export const ZSH: string;
|
||||
export const GNOME_SHELL_SESSION_MODE: string;
|
||||
export const GTK_MODULES: string;
|
||||
export const PAGER: string;
|
||||
export const PS1: string;
|
||||
export const npm_config_userconfig: string;
|
||||
export const npm_config_local_prefix: string;
|
||||
export const SYSTEMD_EXEC_PID: string;
|
||||
export const DBUS_SESSION_BUS_ADDRESS: string;
|
||||
export const WSL_DISTRO_NAME: string;
|
||||
export const COLORTERM: string;
|
||||
export const COLOR: string;
|
||||
export const npm_config_metrics_registry: string;
|
||||
export const WAYLAND_DISPLAY: string;
|
||||
export const LOGNAME: string;
|
||||
export const NAME: string;
|
||||
export const WSL_INTEROP: string;
|
||||
export const PULSE_SERVER: string;
|
||||
export const SDKMAN_CANDIDATES_API: string;
|
||||
export const _: string;
|
||||
export const npm_config_prefix: string;
|
||||
export const npm_config_npm_version: string;
|
||||
export const MEMORY_PRESSURE_WATCH: string;
|
||||
export const XDG_SESSION_CLASS: string;
|
||||
export const USERNAME: string;
|
||||
export const TERM: string;
|
||||
export const npm_config_cache: string;
|
||||
export const GNOME_DESKTOP_SESSION_ID: string;
|
||||
export const npm_config_node_gyp: string;
|
||||
export const PATH: string;
|
||||
export const SDKMAN_CANDIDATES_DIR: string;
|
||||
export const NODE: string;
|
||||
export const npm_package_name: string;
|
||||
export const XDG_MENU_PREFIX: string;
|
||||
export const SDKMAN_BROKER_API: string;
|
||||
export const GNOME_TERMINAL_SCREEN: string;
|
||||
export const GNOME_SETUP_DISPLAY: string;
|
||||
export const XDG_RUNTIME_DIR: string;
|
||||
export const DISPLAY: string;
|
||||
export const LANG: string;
|
||||
export const XDG_CURRENT_DESKTOP: string;
|
||||
export const VIRTUAL_ENV_PROMPT: string;
|
||||
export const XMODIFIERS: string;
|
||||
export const XDG_SESSION_DESKTOP: string;
|
||||
export const XAUTHORITY: string;
|
||||
export const LS_COLORS: string;
|
||||
export const GNOME_TERMINAL_SERVICE: string;
|
||||
export const SDKMAN_DIR: string;
|
||||
export const SDKMAN_PLATFORM: string;
|
||||
export const npm_lifecycle_script: string;
|
||||
export const SSH_AUTH_SOCK: string;
|
||||
export const SHELL: string;
|
||||
export const npm_package_version: string;
|
||||
export const npm_lifecycle_event: string;
|
||||
export const QT_ACCESSIBILITY: string;
|
||||
export const GDMSESSION: string;
|
||||
export const GOOGLE_CLOUD_PROJECT: string;
|
||||
export const LESSCLOSE: string;
|
||||
export const GPG_AGENT_INFO: string;
|
||||
export const VIRTUAL_ENV: string;
|
||||
export const QT_IM_MODULE: string;
|
||||
export const npm_config_globalconfig: string;
|
||||
export const npm_config_init_module: string;
|
||||
export const JAVA_HOME: string;
|
||||
export const PWD: string;
|
||||
export const npm_config_globalignorefile: string;
|
||||
export const npm_execpath: string;
|
||||
export const XDG_DATA_DIRS: string;
|
||||
export const npm_config_global_prefix: string;
|
||||
export const npm_command: string;
|
||||
export const WSL2_GUI_APPS_ENABLED: string;
|
||||
export const HOSTTYPE: string;
|
||||
export const WSLENV: string;
|
||||
export const QT_IM_MODULES: string;
|
||||
export const MEMORY_PRESSURE_WRITE: string;
|
||||
export const VTE_VERSION: string;
|
||||
export const INIT_CWD: string;
|
||||
export const EDITOR: string;
|
||||
export const NODE_ENV: string;
|
||||
@@ -109,57 +137,85 @@ declare module '$env/static/public' {
|
||||
*/
|
||||
declare module '$env/dynamic/private' {
|
||||
export const env: {
|
||||
LESSOPEN: string;
|
||||
USER: string;
|
||||
npm_config_user_agent: string;
|
||||
XDG_SESSION_TYPE: string;
|
||||
npm_node_execpath: string;
|
||||
SHLVL: string;
|
||||
npm_config_noproxy: string;
|
||||
LESS: string;
|
||||
HOME: string;
|
||||
OLDPWD: string;
|
||||
DESKTOP_SESSION: string;
|
||||
npm_package_json: string;
|
||||
LSCOLORS: string;
|
||||
ZSH: string;
|
||||
GNOME_SHELL_SESSION_MODE: string;
|
||||
GTK_MODULES: string;
|
||||
PAGER: string;
|
||||
PS1: string;
|
||||
npm_config_userconfig: string;
|
||||
npm_config_local_prefix: string;
|
||||
SYSTEMD_EXEC_PID: string;
|
||||
DBUS_SESSION_BUS_ADDRESS: string;
|
||||
WSL_DISTRO_NAME: string;
|
||||
COLORTERM: string;
|
||||
COLOR: string;
|
||||
npm_config_metrics_registry: string;
|
||||
WAYLAND_DISPLAY: string;
|
||||
LOGNAME: string;
|
||||
NAME: string;
|
||||
WSL_INTEROP: string;
|
||||
PULSE_SERVER: string;
|
||||
SDKMAN_CANDIDATES_API: string;
|
||||
_: string;
|
||||
npm_config_prefix: string;
|
||||
npm_config_npm_version: string;
|
||||
MEMORY_PRESSURE_WATCH: string;
|
||||
XDG_SESSION_CLASS: string;
|
||||
USERNAME: string;
|
||||
TERM: string;
|
||||
npm_config_cache: string;
|
||||
GNOME_DESKTOP_SESSION_ID: string;
|
||||
npm_config_node_gyp: string;
|
||||
PATH: string;
|
||||
SDKMAN_CANDIDATES_DIR: string;
|
||||
NODE: string;
|
||||
npm_package_name: string;
|
||||
XDG_MENU_PREFIX: string;
|
||||
SDKMAN_BROKER_API: string;
|
||||
GNOME_TERMINAL_SCREEN: string;
|
||||
GNOME_SETUP_DISPLAY: string;
|
||||
XDG_RUNTIME_DIR: string;
|
||||
DISPLAY: string;
|
||||
LANG: string;
|
||||
XDG_CURRENT_DESKTOP: string;
|
||||
VIRTUAL_ENV_PROMPT: string;
|
||||
XMODIFIERS: string;
|
||||
XDG_SESSION_DESKTOP: string;
|
||||
XAUTHORITY: string;
|
||||
LS_COLORS: string;
|
||||
GNOME_TERMINAL_SERVICE: string;
|
||||
SDKMAN_DIR: string;
|
||||
SDKMAN_PLATFORM: string;
|
||||
npm_lifecycle_script: string;
|
||||
SSH_AUTH_SOCK: string;
|
||||
SHELL: string;
|
||||
npm_package_version: string;
|
||||
npm_lifecycle_event: string;
|
||||
QT_ACCESSIBILITY: string;
|
||||
GDMSESSION: string;
|
||||
GOOGLE_CLOUD_PROJECT: string;
|
||||
LESSCLOSE: string;
|
||||
GPG_AGENT_INFO: string;
|
||||
VIRTUAL_ENV: string;
|
||||
QT_IM_MODULE: string;
|
||||
npm_config_globalconfig: string;
|
||||
npm_config_init_module: string;
|
||||
JAVA_HOME: string;
|
||||
PWD: string;
|
||||
npm_config_globalignorefile: string;
|
||||
npm_execpath: string;
|
||||
XDG_DATA_DIRS: string;
|
||||
npm_config_global_prefix: string;
|
||||
npm_command: string;
|
||||
WSL2_GUI_APPS_ENABLED: string;
|
||||
HOSTTYPE: string;
|
||||
WSLENV: string;
|
||||
QT_IM_MODULES: string;
|
||||
MEMORY_PRESSURE_WRITE: string;
|
||||
VTE_VERSION: string;
|
||||
INIT_CWD: string;
|
||||
EDITOR: string;
|
||||
NODE_ENV: string;
|
||||
|
||||
@@ -4,14 +4,18 @@ export const nodes = [
|
||||
() => import('./nodes/0'),
|
||||
() => import('./nodes/1'),
|
||||
() => import('./nodes/2'),
|
||||
() => import('./nodes/3')
|
||||
() => import('./nodes/3'),
|
||||
() => import('./nodes/4'),
|
||||
() => import('./nodes/5')
|
||||
];
|
||||
|
||||
export const server_loads = [];
|
||||
|
||||
export const dictionary = {
|
||||
"/": [2],
|
||||
"/settings": [3]
|
||||
"/migration": [3],
|
||||
"/migration/mappings": [4],
|
||||
"/settings": [5]
|
||||
};
|
||||
|
||||
export const hooks = {
|
||||
|
||||
@@ -1,3 +1 @@
|
||||
import * as universal from "../../../../src/routes/settings/+page.ts";
|
||||
export { universal };
|
||||
export { default as component } from "../../../../src/routes/settings/+page.svelte";
|
||||
export { default as component } from "../../../../src/routes/migration/+page.svelte";
|
||||
@@ -24,7 +24,7 @@ export const options = {
|
||||
app: ({ head, body, assets, nonce, env }) => "<!DOCTYPE html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<link rel=\"icon\" href=\"" + assets + "/favicon.png\" />\n\t\t<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\" />\n\t\t" + head + "\n\t</head>\n\t<body data-sveltekit-preload-data=\"hover\">\n\t\t<div style=\"display: contents\">" + body + "</div>\n\t</body>\n</html>\n",
|
||||
error: ({ status, message }) => "<!doctype html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<title>" + message + "</title>\n\n\t\t<style>\n\t\t\tbody {\n\t\t\t\t--bg: white;\n\t\t\t\t--fg: #222;\n\t\t\t\t--divider: #ccc;\n\t\t\t\tbackground: var(--bg);\n\t\t\t\tcolor: var(--fg);\n\t\t\t\tfont-family:\n\t\t\t\t\tsystem-ui,\n\t\t\t\t\t-apple-system,\n\t\t\t\t\tBlinkMacSystemFont,\n\t\t\t\t\t'Segoe UI',\n\t\t\t\t\tRoboto,\n\t\t\t\t\tOxygen,\n\t\t\t\t\tUbuntu,\n\t\t\t\t\tCantarell,\n\t\t\t\t\t'Open Sans',\n\t\t\t\t\t'Helvetica Neue',\n\t\t\t\t\tsans-serif;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tjustify-content: center;\n\t\t\t\theight: 100vh;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t.error {\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tmax-width: 32rem;\n\t\t\t\tmargin: 0 1rem;\n\t\t\t}\n\n\t\t\t.status {\n\t\t\t\tfont-weight: 200;\n\t\t\t\tfont-size: 3rem;\n\t\t\t\tline-height: 1;\n\t\t\t\tposition: relative;\n\t\t\t\ttop: -0.05rem;\n\t\t\t}\n\n\t\t\t.message {\n\t\t\t\tborder-left: 1px solid var(--divider);\n\t\t\t\tpadding: 0 0 0 1rem;\n\t\t\t\tmargin: 0 0 0 1rem;\n\t\t\t\tmin-height: 2.5rem;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t}\n\n\t\t\t.message h1 {\n\t\t\t\tfont-weight: 400;\n\t\t\t\tfont-size: 1em;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t@media (prefers-color-scheme: dark) {\n\t\t\t\tbody {\n\t\t\t\t\t--bg: #222;\n\t\t\t\t\t--fg: #ddd;\n\t\t\t\t\t--divider: #666;\n\t\t\t\t}\n\t\t\t}\n\t\t</style>\n\t</head>\n\t<body>\n\t\t<div class=\"error\">\n\t\t\t<span class=\"status\">" + status + "</span>\n\t\t\t<div class=\"message\">\n\t\t\t\t<h1>" + message + "</h1>\n\t\t\t</div>\n\t\t</div>\n\t</body>\n</html>\n"
|
||||
},
|
||||
version_hash: "1eogxsl"
|
||||
version_hash: "n7gbte"
|
||||
};
|
||||
|
||||
export async function get_hooks() {
|
||||
|
||||
6
frontend/.svelte-kit/non-ambient.d.ts
vendored
6
frontend/.svelte-kit/non-ambient.d.ts
vendored
@@ -27,15 +27,17 @@ export {};
|
||||
|
||||
declare module "$app/types" {
|
||||
export interface AppTypes {
|
||||
RouteId(): "/" | "/settings";
|
||||
RouteId(): "/" | "/migration" | "/migration/mappings" | "/settings";
|
||||
RouteParams(): {
|
||||
|
||||
};
|
||||
LayoutParams(): {
|
||||
"/": Record<string, never>;
|
||||
"/migration": Record<string, never>;
|
||||
"/migration/mappings": Record<string, never>;
|
||||
"/settings": Record<string, never>
|
||||
};
|
||||
Pathname(): "/" | "/settings" | "/settings/";
|
||||
Pathname(): "/" | "/migration" | "/migration/" | "/migration/mappings" | "/migration/mappings/" | "/settings" | "/settings/";
|
||||
ResolvedPathname(): `${"" | `/${string}`}${ReturnType<AppTypes['Pathname']>}`;
|
||||
Asset(): string & {};
|
||||
}
|
||||
|
||||
57
frontend/src/components/EnvSelector.svelte
Normal file
57
frontend/src/components/EnvSelector.svelte
Normal file
@@ -0,0 +1,57 @@
|
||||
<!-- [DEF:EnvSelector:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: environment, selector, dropdown, migration
|
||||
@PURPOSE: Provides a UI component for selecting source and target environments.
|
||||
@LAYER: Feature
|
||||
@RELATION: BINDS_TO -> environments store
|
||||
|
||||
@INVARIANT: Source and target environments must be selectable from the list of configured environments.
|
||||
-->
|
||||
|
||||
<script lang="ts">
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount, createEventDispatcher } from 'svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let label: string = "Select Environment";
|
||||
export let selectedId: string = "";
|
||||
export let environments: Array<{id: string, name: string, url: string}> = [];
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:handleSelect:Function]
|
||||
/**
|
||||
* @purpose Dispatches the selection change event.
|
||||
* @param {Event} event - The change event from the select element.
|
||||
*/
|
||||
function handleSelect(event: Event) {
|
||||
const target = event.target as HTMLSelectElement;
|
||||
selectedId = target.value;
|
||||
dispatch('change', { id: selectedId });
|
||||
}
|
||||
// [/DEF:handleSelect]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="flex flex-col space-y-1">
|
||||
<label class="text-sm font-medium text-gray-700">{label}</label>
|
||||
<select
|
||||
class="block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
|
||||
value={selectedId}
|
||||
on:change={handleSelect}
|
||||
>
|
||||
<option value="" disabled>-- Choose an environment --</option>
|
||||
{#each environments as env}
|
||||
<option value={env.id}>{env.name} ({env.url})</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
/* Component specific styles */
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:EnvSelector] -->
|
||||
94
frontend/src/components/MappingTable.svelte
Normal file
94
frontend/src/components/MappingTable.svelte
Normal file
@@ -0,0 +1,94 @@
|
||||
<!-- [DEF:MappingTable:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: mapping, table, database, editor
|
||||
@PURPOSE: Displays and allows editing of database mappings.
|
||||
@LAYER: Feature
|
||||
@RELATION: BINDS_TO -> mappings state
|
||||
|
||||
@INVARIANT: Each source database can be mapped to one target database.
|
||||
-->
|
||||
|
||||
<script lang="ts">
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher } from 'svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let sourceDatabases: Array<{uuid: string, database_name: string}> = [];
|
||||
export let targetDatabases: Array<{uuid: string, database_name: string}> = [];
|
||||
export let mappings: Array<{source_db_uuid: string, target_db_uuid: string}> = [];
|
||||
export let suggestions: Array<{source_db_uuid: string, target_db_uuid: string, confidence: number}> = [];
|
||||
// [/SECTION]
|
||||
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:updateMapping:Function]
|
||||
/**
|
||||
* @purpose Updates a mapping for a specific source database.
|
||||
*/
|
||||
function updateMapping(sourceUuid: string, targetUuid: string) {
|
||||
dispatch('update', { sourceUuid, targetUuid });
|
||||
}
|
||||
// [/DEF:updateMapping]
|
||||
|
||||
// [DEF:getSuggestion:Function]
|
||||
/**
|
||||
* @purpose Finds a suggestion for a source database.
|
||||
*/
|
||||
function getSuggestion(sourceUuid: string) {
|
||||
return suggestions.find(s => s.source_db_uuid === sourceUuid);
|
||||
}
|
||||
// [/DEF:getSuggestion]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="overflow-x-auto">
|
||||
<table class="min-w-full divide-y divide-gray-200">
|
||||
<thead class="bg-gray-50">
|
||||
<tr>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Source Database</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Target Database</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Status</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody class="bg-white divide-y divide-gray-200">
|
||||
{#each sourceDatabases as sDb}
|
||||
{@const mapping = mappings.find(m => m.source_db_uuid === sDb.uuid)}
|
||||
{@const suggestion = getSuggestion(sDb.uuid)}
|
||||
<tr class={suggestion && !mapping ? 'bg-green-50' : ''}>
|
||||
<td class="px-6 py-4 whitespace-nowrap text-sm font-medium text-gray-900">
|
||||
{sDb.database_name}
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
|
||||
<select
|
||||
class="block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
|
||||
value={mapping?.target_db_uuid || suggestion?.target_db_uuid || ""}
|
||||
on:change={(e) => updateMapping(sDb.uuid, (e.target as HTMLSelectElement).value)}
|
||||
>
|
||||
<option value="">-- Select Target --</option>
|
||||
{#each targetDatabases as tDb}
|
||||
<option value={tDb.uuid}>{tDb.database_name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
|
||||
{#if mapping}
|
||||
<span class="text-blue-600 font-semibold">Saved</span>
|
||||
{:else if suggestion}
|
||||
<span class="text-green-600 font-semibold">Suggested ({Math.round(suggestion.confidence * 100)}%)</span>
|
||||
{:else}
|
||||
<span class="text-red-600">Unmapped</span>
|
||||
{/if}
|
||||
</td>
|
||||
</tr>
|
||||
{/each}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
/* Component specific styles */
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:MappingTable] -->
|
||||
112
frontend/src/components/MissingMappingModal.svelte
Normal file
112
frontend/src/components/MissingMappingModal.svelte
Normal file
@@ -0,0 +1,112 @@
|
||||
<!-- [DEF:MissingMappingModal:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: modal, mapping, prompt, migration
|
||||
@PURPOSE: Prompts the user to provide a database mapping when one is missing during migration.
|
||||
@LAYER: Feature
|
||||
@RELATION: DISPATCHES -> resolve
|
||||
|
||||
@INVARIANT: Modal blocks migration progress until resolved or cancelled.
|
||||
-->
|
||||
|
||||
<script lang="ts">
|
||||
// [SECTION: IMPORTS]
|
||||
import { createEventDispatcher } from 'svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: PROPS]
|
||||
export let show: boolean = false;
|
||||
export let sourceDbName: string = "";
|
||||
export let sourceDbUuid: string = "";
|
||||
export let targetDatabases: Array<{uuid: string, database_name: string}> = [];
|
||||
// [/SECTION]
|
||||
|
||||
let selectedTargetUuid = "";
|
||||
const dispatch = createEventDispatcher();
|
||||
|
||||
// [DEF:resolve:Function]
|
||||
function resolve() {
|
||||
if (!selectedTargetUuid) return;
|
||||
dispatch('resolve', {
|
||||
sourceDbUuid,
|
||||
targetDbUuid: selectedTargetUuid,
|
||||
targetDbName: targetDatabases.find(d => d.uuid === selectedTargetUuid)?.database_name
|
||||
});
|
||||
show = false;
|
||||
}
|
||||
// [/DEF:resolve]
|
||||
|
||||
// [DEF:cancel:Function]
|
||||
function cancel() {
|
||||
dispatch('cancel');
|
||||
show = false;
|
||||
}
|
||||
// [/DEF:cancel]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
{#if show}
|
||||
<div class="fixed z-10 inset-0 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
|
||||
<div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
|
||||
<div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true"></div>
|
||||
|
||||
<span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">​</span>
|
||||
|
||||
<div class="inline-block align-bottom bg-white rounded-lg px-4 pt-5 pb-4 text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-lg sm:w-full sm:p-6">
|
||||
<div>
|
||||
<div class="mx-auto flex items-center justify-center h-12 w-12 rounded-full bg-yellow-100">
|
||||
<svg class="h-6 w-6 text-yellow-600" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
|
||||
</svg>
|
||||
</div>
|
||||
<div class="mt-3 text-center sm:mt-5">
|
||||
<h3 class="text-lg leading-6 font-medium text-gray-900" id="modal-title">
|
||||
Missing Database Mapping
|
||||
</h3>
|
||||
<div class="mt-2">
|
||||
<p class="text-sm text-gray-500">
|
||||
The database <strong>{sourceDbName}</strong> is used in the assets being migrated but has no mapping to the target environment. Please select a target database.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="mt-5 sm:mt-6">
|
||||
<select
|
||||
class="block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
|
||||
bind:value={selectedTargetUuid}
|
||||
>
|
||||
<option value="" disabled>-- Select Target Database --</option>
|
||||
{#each targetDatabases as tDb}
|
||||
<option value={tDb.uuid}>{tDb.database_name}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div class="mt-5 sm:mt-6 sm:grid sm:grid-cols-2 sm:gap-3 sm:grid-flow-row-dense">
|
||||
<button
|
||||
type="button"
|
||||
on:click={resolve}
|
||||
disabled={!selectedTargetUuid}
|
||||
class="w-full inline-flex justify-center rounded-md border border-transparent shadow-sm px-4 py-2 bg-indigo-600 text-base font-medium text-white hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:col-start-2 sm:text-sm disabled:bg-gray-400"
|
||||
>
|
||||
Apply & Continue
|
||||
</button>
|
||||
<button
|
||||
type="button"
|
||||
on:click={cancel}
|
||||
class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:col-start-1 sm:text-sm"
|
||||
>
|
||||
Cancel Migration
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
/* Modal specific styles */
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:MissingMappingModal] -->
|
||||
@@ -16,6 +16,12 @@
|
||||
>
|
||||
Dashboard
|
||||
</a>
|
||||
<a
|
||||
href="/migration"
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/migration') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
>
|
||||
Migration
|
||||
</a>
|
||||
<a
|
||||
href="/settings"
|
||||
class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
|
||||
|
||||
@@ -15,6 +15,7 @@
|
||||
import { selectedTask, taskLogs } from '../lib/stores.js';
|
||||
import { getWsUrl } from '../lib/api.js';
|
||||
import { addToast } from '../lib/toasts.js';
|
||||
import MissingMappingModal from './MissingMappingModal.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
let ws;
|
||||
@@ -25,7 +26,10 @@
|
||||
let reconnectTimeout;
|
||||
let waitingForData = false;
|
||||
let dataTimeout;
|
||||
let connectionStatus = 'disconnected'; // 'connecting', 'connected', 'disconnected', 'waiting', 'completed'
|
||||
let connectionStatus = 'disconnected'; // 'connecting', 'connected', 'disconnected', 'waiting', 'completed', 'awaiting_mapping'
|
||||
let showMappingModal = false;
|
||||
let missingDbInfo = { name: '', uuid: '' };
|
||||
let targetDatabases = [];
|
||||
|
||||
// [DEF:connect:Function]
|
||||
/**
|
||||
@@ -58,6 +62,17 @@
|
||||
connectionStatus = 'completed';
|
||||
ws.close();
|
||||
}
|
||||
|
||||
// Check for missing mapping signal
|
||||
if (logEntry.message && logEntry.message.includes('Missing mapping for database UUID')) {
|
||||
const uuidMatch = logEntry.message.match(/UUID: ([\w-]+)/);
|
||||
if (uuidMatch) {
|
||||
missingDbInfo = { name: 'Unknown', uuid: uuidMatch[1] };
|
||||
connectionStatus = 'awaiting_mapping';
|
||||
fetchTargetDatabases();
|
||||
showMappingModal = true;
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
ws.onerror = (error) => {
|
||||
@@ -85,6 +100,65 @@
|
||||
}
|
||||
// [/DEF:connect]
|
||||
|
||||
async function fetchTargetDatabases() {
|
||||
const task = get(selectedTask);
|
||||
if (!task || !task.params.to_env) return;
|
||||
|
||||
try {
|
||||
// We need to find the environment ID by name first
|
||||
const envsRes = await fetch('/api/environments');
|
||||
const envs = await envsRes.json();
|
||||
const targetEnv = envs.find(e => e.name === task.params.to_env);
|
||||
|
||||
if (targetEnv) {
|
||||
const res = await fetch(`/api/environments/${targetEnv.id}/databases`);
|
||||
targetDatabases = await res.json();
|
||||
}
|
||||
} catch (e) {
|
||||
console.error('Failed to fetch target databases', e);
|
||||
}
|
||||
}
|
||||
|
||||
async function handleMappingResolve(event) {
|
||||
const task = get(selectedTask);
|
||||
const { sourceDbUuid, targetDbUuid, targetDbName } = event.detail;
|
||||
|
||||
try {
|
||||
// 1. Save mapping to backend
|
||||
const envsRes = await fetch('/api/environments');
|
||||
const envs = await envsRes.json();
|
||||
const srcEnv = envs.find(e => e.name === task.params.from_env);
|
||||
const tgtEnv = envs.find(e => e.name === task.params.to_env);
|
||||
|
||||
await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
source_env_id: srcEnv.id,
|
||||
target_env_id: tgtEnv.id,
|
||||
source_db_uuid: sourceDbUuid,
|
||||
target_db_uuid: targetDbUuid,
|
||||
source_db_name: missingDbInfo.name,
|
||||
target_db_name: targetDbName
|
||||
})
|
||||
});
|
||||
|
||||
// 2. Resolve task
|
||||
await fetch(`/api/tasks/${task.id}/resolve`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
resolution_params: { resolved_mapping: { [sourceDbUuid]: targetDbUuid } }
|
||||
})
|
||||
});
|
||||
|
||||
connectionStatus = 'connected';
|
||||
addToast('Mapping resolved, migration continuing...', 'success');
|
||||
} catch (e) {
|
||||
addToast('Failed to resolve mapping: ' + e.message, 'error');
|
||||
}
|
||||
}
|
||||
|
||||
function startDataTimeout() {
|
||||
waitingForData = false;
|
||||
dataTimeout = setTimeout(() => {
|
||||
@@ -151,6 +225,9 @@
|
||||
{:else if connectionStatus === 'completed'}
|
||||
<span class="h-3 w-3 rounded-full bg-blue-500"></span>
|
||||
<span class="text-xs text-gray-500">Completed</span>
|
||||
{:else if connectionStatus === 'awaiting_mapping'}
|
||||
<span class="h-3 w-3 rounded-full bg-orange-500 animate-pulse"></span>
|
||||
<span class="text-xs text-gray-500">Awaiting Mapping</span>
|
||||
{:else}
|
||||
<span class="h-3 w-3 rounded-full bg-red-500"></span>
|
||||
<span class="text-xs text-gray-500">Disconnected</span>
|
||||
@@ -177,6 +254,15 @@
|
||||
<p>No task selected.</p>
|
||||
{/if}
|
||||
</div>
|
||||
|
||||
<MissingMappingModal
|
||||
bind:show={showMappingModal}
|
||||
sourceDbName={missingDbInfo.name}
|
||||
sourceDbUuid={missingDbInfo.uuid}
|
||||
{targetDatabases}
|
||||
on:resolve={handleMappingResolve}
|
||||
on:cancel={() => { connectionStatus = 'disconnected'; ws.close(); }}
|
||||
/>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<!-- [/DEF:TaskRunner] -->
|
||||
|
||||
@@ -21,7 +21,14 @@
|
||||
environments: [],
|
||||
settings: {
|
||||
backup_path: '',
|
||||
default_environment_id: null
|
||||
default_environment_id: null,
|
||||
logging: {
|
||||
level: 'INFO',
|
||||
file_path: 'logs/app.log',
|
||||
max_bytes: 10485760,
|
||||
backup_count: 5,
|
||||
enable_belief_state: true
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
@@ -180,10 +187,43 @@
|
||||
<label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label>
|
||||
<input type="text" id="backup_path" bind:value={settings.settings.backup_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max">
|
||||
Save Global Settings
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<h3 class="text-lg font-medium mb-4 mt-6">Logging Configuration</h3>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="log_level" class="block text-sm font-medium text-gray-700">Log Level</label>
|
||||
<select id="log_level" bind:value={settings.settings.logging.level} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2">
|
||||
<option value="DEBUG">DEBUG</option>
|
||||
<option value="INFO">INFO</option>
|
||||
<option value="WARNING">WARNING</option>
|
||||
<option value="ERROR">ERROR</option>
|
||||
<option value="CRITICAL">CRITICAL</option>
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label for="log_file_path" class="block text-sm font-medium text-gray-700">Log File Path</label>
|
||||
<input type="text" id="log_file_path" bind:value={settings.settings.logging.file_path} placeholder="logs/app.log" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="log_max_bytes" class="block text-sm font-medium text-gray-700">Max File Size (MB)</label>
|
||||
<input type="number" id="log_max_bytes" bind:value={settings.settings.logging.max_bytes} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="log_backup_count" class="block text-sm font-medium text-gray-700">Backup Count</label>
|
||||
<input type="number" id="log_backup_count" bind:value={settings.settings.logging.backup_count} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div class="md:col-span-2">
|
||||
<label class="flex items-center">
|
||||
<input type="checkbox" id="enable_belief_state" bind:checked={settings.settings.logging.enable_belief_state} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
|
||||
<span class="ml-2 block text-sm text-gray-900">Enable Belief State Logging</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max mt-4">
|
||||
Save Global Settings
|
||||
</button>
|
||||
</section>
|
||||
|
||||
<section class="mb-8 bg-white p-6 rounded shadow">
|
||||
|
||||
@@ -4,6 +4,7 @@
|
||||
import DynamicForm from '../components/DynamicForm.svelte';
|
||||
import { api } from '../lib/api.js';
|
||||
import { get } from 'svelte/store';
|
||||
import { goto } from '$app/navigation';
|
||||
|
||||
/** @type {import('./$types').PageData} */
|
||||
export let data;
|
||||
@@ -15,7 +16,11 @@
|
||||
|
||||
function selectPlugin(plugin) {
|
||||
console.log(`[Dashboard][Action] Selecting plugin: ${plugin.id}`);
|
||||
selectedPlugin.set(plugin);
|
||||
if (plugin.id === 'superset-migration') {
|
||||
goto('/migration');
|
||||
} else {
|
||||
selectedPlugin.set(plugin);
|
||||
}
|
||||
}
|
||||
|
||||
async function handleFormSubmit(event) {
|
||||
|
||||
141
frontend/src/routes/migration/+page.svelte
Normal file
141
frontend/src/routes/migration/+page.svelte
Normal file
@@ -0,0 +1,141 @@
|
||||
<!-- [DEF:MigrationDashboard:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: migration, dashboard, environment, selection, database-replacement
|
||||
@PURPOSE: Main dashboard for configuring and starting migrations.
|
||||
@LAYER: Page
|
||||
@RELATION: USES -> EnvSelector
|
||||
|
||||
@INVARIANT: Migration cannot start without source and target environments.
|
||||
-->
|
||||
|
||||
<script lang="ts">
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import EnvSelector from '../../components/EnvSelector.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let environments = [];
|
||||
let sourceEnvId = "";
|
||||
let targetEnvId = "";
|
||||
let dashboardRegex = ".*";
|
||||
let replaceDb = false;
|
||||
let loading = true;
|
||||
let error = "";
|
||||
// [/SECTION]
|
||||
|
||||
// [DEF:fetchEnvironments:Function]
|
||||
/**
|
||||
* @purpose Fetches the list of environments from the API.
|
||||
* @post environments state is updated.
|
||||
*/
|
||||
async function fetchEnvironments() {
|
||||
try {
|
||||
const response = await fetch('/api/environments');
|
||||
if (!response.ok) throw new Error('Failed to fetch environments');
|
||||
environments = await response.json();
|
||||
} catch (e) {
|
||||
error = e.message;
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchEnvironments]
|
||||
|
||||
onMount(fetchEnvironments);
|
||||
|
||||
// [DEF:startMigration:Function]
|
||||
/**
|
||||
* @purpose Starts the migration process.
|
||||
* @pre sourceEnvId and targetEnvId must be set and different.
|
||||
*/
|
||||
async function startMigration() {
|
||||
if (!sourceEnvId || !targetEnvId) {
|
||||
error = "Please select both source and target environments.";
|
||||
return;
|
||||
}
|
||||
if (sourceEnvId === targetEnvId) {
|
||||
error = "Source and target environments must be different.";
|
||||
return;
|
||||
}
|
||||
|
||||
error = "";
|
||||
console.log(`[MigrationDashboard][Action] Starting migration from ${sourceEnvId} to ${targetEnvId} (Replace DB: ${replaceDb})`);
|
||||
// TODO: Implement actual migration trigger in US3
|
||||
}
|
||||
// [/DEF:startMigration]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="max-w-4xl mx-auto p-6">
|
||||
<h1 class="text-2xl font-bold mb-6">Migration Dashboard</h1>
|
||||
|
||||
{#if loading}
|
||||
<p>Loading environments...</p>
|
||||
{:else if error}
|
||||
<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">
|
||||
{error}
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-6 mb-8">
|
||||
<EnvSelector
|
||||
label="Source Environment"
|
||||
bind:selectedId={sourceEnvId}
|
||||
{environments}
|
||||
/>
|
||||
<EnvSelector
|
||||
label="Target Environment"
|
||||
bind:selectedId={targetEnvId}
|
||||
{environments}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div class="mb-8">
|
||||
<label for="dashboard-regex" class="block text-sm font-medium text-gray-700 mb-1">Dashboard Regex</label>
|
||||
<input
|
||||
id="dashboard-regex"
|
||||
type="text"
|
||||
bind:value={dashboardRegex}
|
||||
placeholder="e.g. ^Finance Dashboard$"
|
||||
class="shadow-sm focus:ring-indigo-500 focus:border-indigo-500 block w-full sm:text-sm border-gray-300 rounded-md"
|
||||
/>
|
||||
<p class="mt-1 text-sm text-gray-500">Regular expression to filter dashboards to migrate.</p>
|
||||
</div>
|
||||
|
||||
<div class="flex items-center justify-between mb-8">
|
||||
<div class="flex items-center">
|
||||
<input
|
||||
id="replace-db"
|
||||
type="checkbox"
|
||||
bind:checked={replaceDb}
|
||||
class="h-4 w-4 text-indigo-600 focus:ring-indigo-500 border-gray-300 rounded"
|
||||
/>
|
||||
<label for="replace-db" class="ml-2 block text-sm text-gray-900">
|
||||
Replace Database (Apply Mappings)
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<a
|
||||
href="/migration/mappings"
|
||||
class="text-sm font-medium text-indigo-600 hover:text-indigo-500"
|
||||
>
|
||||
Manage Mappings →
|
||||
</a>
|
||||
</div>
|
||||
|
||||
<button
|
||||
on:click={startMigration}
|
||||
disabled={!sourceEnvId || !targetEnvId || sourceEnvId === targetEnvId}
|
||||
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:bg-gray-400"
|
||||
>
|
||||
Start Migration
|
||||
</button>
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
/* Page specific styles */
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:MigrationDashboard] -->
|
||||
183
frontend/src/routes/migration/mappings/+page.svelte
Normal file
183
frontend/src/routes/migration/mappings/+page.svelte
Normal file
@@ -0,0 +1,183 @@
|
||||
<!-- [DEF:MappingManagement:Component] -->
|
||||
<!--
|
||||
@SEMANTICS: mapping, management, database, fuzzy-matching
|
||||
@PURPOSE: Page for managing database mappings between environments.
|
||||
@LAYER: Page
|
||||
@RELATION: USES -> EnvSelector
|
||||
@RELATION: USES -> MappingTable
|
||||
|
||||
@INVARIANT: Mappings are saved to the backend for persistence.
|
||||
-->
|
||||
|
||||
<script lang="ts">
|
||||
// [SECTION: IMPORTS]
|
||||
import { onMount } from 'svelte';
|
||||
import EnvSelector from '../../../components/EnvSelector.svelte';
|
||||
import MappingTable from '../../../components/MappingTable.svelte';
|
||||
// [/SECTION]
|
||||
|
||||
// [SECTION: STATE]
|
||||
let environments = [];
|
||||
let sourceEnvId = "";
|
||||
let targetEnvId = "";
|
||||
let sourceDatabases = [];
|
||||
let targetDatabases = [];
|
||||
let mappings = [];
|
||||
let suggestions = [];
|
||||
let loading = true;
|
||||
let fetchingDbs = false;
|
||||
let error = "";
|
||||
let success = "";
|
||||
// [/SECTION]
|
||||
|
||||
async function fetchEnvironments() {
|
||||
try {
|
||||
const response = await fetch('/api/environments');
|
||||
if (!response.ok) throw new Error('Failed to fetch environments');
|
||||
environments = await response.json();
|
||||
} catch (e) {
|
||||
error = e.message;
|
||||
} finally {
|
||||
loading = false;
|
||||
}
|
||||
}
|
||||
|
||||
onMount(fetchEnvironments);
|
||||
|
||||
// [DEF:fetchDatabases:Function]
|
||||
/**
|
||||
* @purpose Fetches databases from both environments and gets suggestions.
|
||||
*/
|
||||
async function fetchDatabases() {
|
||||
if (!sourceEnvId || !targetEnvId) return;
|
||||
fetchingDbs = true;
|
||||
error = "";
|
||||
success = "";
|
||||
|
||||
try {
|
||||
const [srcRes, tgtRes, mapRes, sugRes] = await Promise.all([
|
||||
fetch(`/api/environments/${sourceEnvId}/databases`),
|
||||
fetch(`/api/environments/${targetEnvId}/databases`),
|
||||
fetch(`/api/mappings?source_env_id=${sourceEnvId}&target_env_id=${targetEnvId}`),
|
||||
fetch(`/api/mappings/suggest`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ source_env_id: sourceEnvId, target_env_id: targetEnvId })
|
||||
})
|
||||
]);
|
||||
|
||||
if (!srcRes.ok || !tgtRes.ok) throw new Error('Failed to fetch databases from environments');
|
||||
|
||||
sourceDatabases = await srcRes.json();
|
||||
targetDatabases = await tgtRes.json();
|
||||
mappings = await mapRes.json();
|
||||
suggestions = await sugRes.json();
|
||||
} catch (e) {
|
||||
error = e.message;
|
||||
} finally {
|
||||
fetchingDbs = false;
|
||||
}
|
||||
}
|
||||
// [/DEF:fetchDatabases]
|
||||
|
||||
// [DEF:handleUpdate:Function]
|
||||
/**
|
||||
* @purpose Saves a mapping to the backend.
|
||||
*/
|
||||
async function handleUpdate(event: CustomEvent) {
|
||||
const { sourceUuid, targetUuid } = event.detail;
|
||||
const sDb = sourceDatabases.find(d => d.uuid === sourceUuid);
|
||||
const tDb = targetDatabases.find(d => d.uuid === targetUuid);
|
||||
|
||||
if (!sDb || !tDb) return;
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
source_env_id: sourceEnvId,
|
||||
target_env_id: targetEnvId,
|
||||
source_db_uuid: sourceUuid,
|
||||
target_db_uuid: targetUuid,
|
||||
source_db_name: sDb.database_name,
|
||||
target_db_name: tDb.database_name
|
||||
})
|
||||
});
|
||||
|
||||
if (!response.ok) throw new Error('Failed to save mapping');
|
||||
|
||||
const savedMapping = await response.json();
|
||||
mappings = [...mappings.filter(m => m.source_db_uuid !== sourceUuid), savedMapping];
|
||||
success = "Mapping saved successfully";
|
||||
} catch (e) {
|
||||
error = e.message;
|
||||
}
|
||||
}
|
||||
// [/DEF:handleUpdate]
|
||||
</script>
|
||||
|
||||
<!-- [SECTION: TEMPLATE] -->
|
||||
<div class="max-w-6xl mx-auto p-6">
|
||||
<h1 class="text-2xl font-bold mb-6">Database Mapping Management</h1>
|
||||
|
||||
{#if loading}
|
||||
<p>Loading environments...</p>
|
||||
{:else}
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-6 mb-8">
|
||||
<EnvSelector
|
||||
label="Source Environment"
|
||||
bind:selectedId={sourceEnvId}
|
||||
{environments}
|
||||
on:change={() => { sourceDatabases = []; mappings = []; suggestions = []; }}
|
||||
/>
|
||||
<EnvSelector
|
||||
label="Target Environment"
|
||||
bind:selectedId={targetEnvId}
|
||||
{environments}
|
||||
on:change={() => { targetDatabases = []; mappings = []; suggestions = []; }}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div class="mb-8">
|
||||
<button
|
||||
on:click={fetchDatabases}
|
||||
disabled={!sourceEnvId || !targetEnvId || sourceEnvId === targetEnvId || fetchingDbs}
|
||||
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:bg-gray-400"
|
||||
>
|
||||
{fetchingDbs ? 'Fetching...' : 'Fetch Databases & Suggestions'}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{#if error}
|
||||
<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">
|
||||
{error}
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
{#if success}
|
||||
<div class="bg-green-100 border border-green-400 text-green-700 px-4 py-3 rounded mb-4">
|
||||
{success}
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
{#if sourceDatabases.length > 0}
|
||||
<MappingTable
|
||||
{sourceDatabases}
|
||||
{targetDatabases}
|
||||
{mappings}
|
||||
{suggestions}
|
||||
on:update={handleUpdate}
|
||||
/>
|
||||
{:else if !fetchingDbs && sourceEnvId && targetEnvId}
|
||||
<p class="text-gray-500 italic">Select environments and click "Fetch Databases" to start mapping.</p>
|
||||
{/if}
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/SECTION] -->
|
||||
|
||||
<style>
|
||||
/* Page specific styles */
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:MappingManagement] -->
|
||||
298
logs/лог.md
Executable file
298
logs/лог.md
Executable file
@@ -0,0 +1,298 @@
|
||||
PS H:\dev\ss-tools> & C:/ProgramData/anaconda3/python.exe h:/dev/ss-tools/migration_script.py
|
||||
2025-12-16 11:50:28,192 - INFO - [run][Entry] Запуск скрипта миграции.
|
||||
|
||||
=== Поведение при ошибке импорта ===
|
||||
Если импорт завершится ошибкой, удалить существующий дашборд и попытаться импортировать заново? (y/n): n
|
||||
2025-12-16 11:50:33,363 - INFO - [ask_delete_on_failure][State] Delete-on-failure = False
|
||||
2025-12-16 11:50:33,368 - INFO - [select_environments][Entry] Шаг 1/5: Выбор окружений.
|
||||
2025-12-16 11:50:33,374 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
||||
2025-12-16 11:50:33,730 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,734 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,739 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,742 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,746 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,750 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,754 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,758 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,761 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,764 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,769 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,772 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,776 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,779 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,782 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,786 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,790 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,794 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,799 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,805 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,808 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,811 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,815 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,820 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,823 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,827 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
||||
2025-12-16 11:50:33,831 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
||||
2025-12-16 11:50:33,834 - WARNING - [_init_session][State] SSL verification disabled.
|
||||
2025-12-16 11:50:33,838 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
||||
2025-12-16 11:50:33,840 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
||||
2025-12-16 11:50:33,847 - INFO - [setup_clients][Exit] All clients (dev, prod, sbx, preprod, uatta, dev5) initialized successfully.
|
||||
|
||||
=== Выбор окружения ===
|
||||
Исходное окружение:
|
||||
1) dev
|
||||
2) prod
|
||||
3) sbx
|
||||
4) preprod
|
||||
5) uatta
|
||||
6) dev5
|
||||
|
||||
Введите номер (0 – отмена): 4
|
||||
2025-12-16 11:50:42,379 - INFO - [select_environments][State] from = preprod
|
||||
|
||||
=== Выбор окружения ===
|
||||
Целевое окружение:
|
||||
1) dev
|
||||
2) prod
|
||||
3) sbx
|
||||
4) uatta
|
||||
5) dev5
|
||||
|
||||
Введите номер (0 – отмена): 5
|
||||
2025-12-16 11:50:45,176 - INFO - [select_environments][State] to = dev5
|
||||
2025-12-16 11:50:45,182 - INFO - [select_environments][Exit] Шаг 1 завершён.
|
||||
2025-12-16 11:50:45,186 - INFO - [select_dashboards][Entry] Шаг 2/5: Выбор дашбордов.
|
||||
2025-12-16 11:50:45,190 - INFO - [get_dashboards][Enter] Fetching dashboards.
|
||||
2025-12-16 11:50:45,197 - INFO - [authenticate][Enter] Authenticating to https://preprodta.bi.dwh.rusal.com/api/v1
|
||||
2025-12-16 11:50:45,880 - INFO - [authenticate][Exit] Authenticated successfully.
|
||||
2025-12-16 11:50:46,025 - INFO - [get_dashboards][Exit] Found 95 dashboards.
|
||||
|
||||
=== Поиск ===
|
||||
Введите регулярное выражение для поиска дашбордов:
|
||||
fi
|
||||
|
||||
=== Выбор дашбордов ===
|
||||
Отметьте нужные дашборды (введите номера):
|
||||
1) [ALL] Все дашборды
|
||||
2) [185] FI-0060 Финансы. Налоги. Данные по налогам. Старый
|
||||
3) [184] FI-0083 Статистика по ДЗ/ПДЗ
|
||||
4) [187] FI-0081 ПДЗ Казначейство
|
||||
5) [122] FI-0080 Финансы. Оборотный Капитал ДЗ/КЗ
|
||||
6) [208] FI-0020 Просроченная дебиторская и кредиторская задолженность в динамике
|
||||
7) [126] FI-0022 Кредиторская задолженность для казначейства
|
||||
8) [196] FI-0023 Дебиторская задолженность для казначейства
|
||||
9) [113] FI-0060 Финансы. Налоги. Данные по налогам.
|
||||
10) [173] FI-0040 Оборотно-сальдовая ведомость (ОСВ) по контрагентам
|
||||
11) [174] FI-0021 Дебиторская и кредиторская задолженность по документам
|
||||
12) [172] FI-0030 Дебиторская задолженность по штрафам
|
||||
13) [170] FI-0050 Налог на прибыль (ОНА и ОНО)
|
||||
14) [159] FI-0070 Досье контрагента
|
||||
|
||||
Введите номера через запятую (пустой ввод → отказ): 2
|
||||
2025-12-16 11:50:52,235 - INFO - [select_dashboards][State] Выбрано 1 дашбордов.
|
||||
2025-12-16 11:50:52,242 - INFO - [select_dashboards][Exit] Шаг 2 завершён.
|
||||
|
||||
=== Замена БД ===
|
||||
Заменить конфигурацию БД в YAML‑файлах? (y/n): y
|
||||
2025-12-16 11:50:53,808 - INFO - [_select_databases][Entry] Selecting databases from both environments.
|
||||
2025-12-16 11:50:53,816 - INFO - [get_databases][Enter] Fetching databases.
|
||||
2025-12-16 11:50:53,918 - INFO - [get_databases][Exit] Found 12 databases.
|
||||
2025-12-16 11:50:53,923 - INFO - [get_databases][Enter] Fetching databases.
|
||||
2025-12-16 11:50:53,926 - INFO - [authenticate][Enter] Authenticating to https://dev.bi.dwh.rusal.com/api/v1
|
||||
2025-12-16 11:50:54,450 - INFO - [authenticate][Exit] Authenticated successfully.
|
||||
2025-12-16 11:50:54,551 - INFO - [get_databases][Exit] Found 4 databases.
|
||||
|
||||
=== Выбор исходной БД ===
|
||||
Выберите исходную БД:
|
||||
1) DEV datalab (ID: 9)
|
||||
2) Prod Greenplum (ID: 7)
|
||||
3) DEV Clickhouse New (OLD) (ID: 16)
|
||||
4) Preprod Clickhouse New (ID: 15)
|
||||
5) DEV Greenplum (ID: 1)
|
||||
6) Prod Clickhouse Node 1 (ID: 11)
|
||||
7) Preprod Postgre Superset Internal (ID: 5)
|
||||
8) Prod Postgre Superset Internal (ID: 28)
|
||||
9) Prod Clickhouse (ID: 10)
|
||||
10) Dev Clickhouse (correct) (ID: 14)
|
||||
11) DEV ClickHouse New (ID: 23)
|
||||
12) Sandbox Postgre Superset Internal (ID: 12)
|
||||
|
||||
Введите номер (0 – отмена): 9
|
||||
2025-12-16 11:51:11,008 - INFO - [get_database][Enter] Fetching database 10.
|
||||
2025-12-16 11:51:11,038 - INFO - [get_database][Exit] Got database 10.
|
||||
|
||||
=== Выбор целевой БД ===
|
||||
Выберите целевую БД:
|
||||
1) DEV Greenplum (ID: 2)
|
||||
2) DEV Clickhouse (ID: 3)
|
||||
3) DEV ClickHouse New (ID: 4)
|
||||
4) Dev Postgre Superset Internal (ID: 1)
|
||||
|
||||
Введите номер (0 – отмена): 2
|
||||
2025-12-16 11:51:15,559 - INFO - [get_database][Enter] Fetching database 3.
|
||||
2025-12-16 11:51:15,586 - INFO - [get_database][Exit] Got database 3.
|
||||
2025-12-16 11:51:15,589 - INFO - [_select_databases][Exit] Selected databases: Без имени -> Без имени
|
||||
old_db: {'id': 10, 'result': {'allow_ctas': False, 'allow_cvas': False, 'allow_dml': True, 'allow_file_upload': False, 'allow_run_async': False, 'backen
|
||||
d': 'clickhousedb', 'cache_timeout': None, 'configuration_method': 'sqlalchemy_form', 'database_name': 'Prod Clickhouse', 'driver': 'connect', 'engine_i
|
||||
nformation': {'disable_ssh_tunneling': False, 'supports_file_upload': False}, 'expose_in_sqllab': True, 'force_ctas_schema': None, 'id': 10, 'impersonat
|
||||
e_user': False, 'is_managed_externally': False, 'uuid': '97aced68-326a-4094-b381-27980560efa9'}}
|
||||
2025-12-16 11:51:15,591 - INFO - [confirm_db_config_replacement][State] Replacement set: {'old': {'database_name': None, 'uuid': None, 'id': '10'}, 'new
|
||||
': {'database_name': None, 'uuid': None, 'id': '3'}}
|
||||
2025-12-16 11:51:15,594 - INFO - [execute_migration][Entry] Starting migration of 1 dashboards.
|
||||
|
||||
=== Миграция... ===
|
||||
Миграция: FI-0060 Финансы. Налоги. Данные по налогам. Старый (1/1) 0%2025-12-16 11:51:15,598 - INFO - [export_dashboard][Enter] Exporting dashboard 185.
|
||||
2025-12-16 11:51:16,142 - INFO - [export_dashboard][Exit] Exported dashboard 185 to dashboard_export_20251216T085115.zip.
|
||||
2025-12-16 11:51:16,205 - INFO - [update_yamls][Enter] Starting YAML configuration update.
|
||||
2025-12-16 11:51:16,208 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\metadata.yaml
|
||||
2025-12-16 11:51:16,209 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-01_2787.yaml
|
||||
2025-12-16 11:51:16,210 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_2_4030.yaml
|
||||
2025-12-16 11:51:16,212 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_4029.yaml
|
||||
2025-12-16 11:51:16,213 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_TOTAL2_4036.yaml
|
||||
2025-12-16 11:51:16,215 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_TOTAL2_4037.yaml
|
||||
2025-12-16 11:51:16,216 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_TOTAL_4028.yaml
|
||||
2025-12-16 11:51:16,217 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT2_4024.yaml
|
||||
2025-12-16 11:51:16,218 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT_4033.yaml
|
||||
2025-12-16 11:51:16,220 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-02_ZFUND-BD2_4021.yaml
|
||||
2025-12-16 11:51:16,221 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-02_ZFUND_4027.yaml
|
||||
2025-12-16 11:51:16,222 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02-02_ZFUND_4034.yaml
|
||||
2025-12-16 11:51:16,224 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02_ZTAX_4022.yaml
|
||||
2025-12-16 11:51:16,226 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-02_ZTAX_4035.yaml
|
||||
2025-12-16 11:51:16,227 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-04-2_4031.yaml
|
||||
2025-12-16 11:51:16,228 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-05-01_4026.yaml
|
||||
2025-12-16 11:51:16,230 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-05-01_4032.yaml
|
||||
2025-12-16 11:51:16,231 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-06_1_4023.yaml
|
||||
2025-12-16 11:51:16,233 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060-06_2_4020.yaml
|
||||
2025-12-16 11:51:16,234 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\charts\FI-0060_4025.yaml
|
||||
2025-12-16 11:51:16,236 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\dashboards\FI-0060_185.yaml
|
||||
2025-12-16 11:51:16,238 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\databases\Prod_Clickhouse_10.yaml
|
||||
2025-12-16 11:51:16,240 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0000_-_685.yaml
|
||||
2025-12-16 11:51:16,241 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01-2_zfund_reciever_-_861.yaml
|
||||
2025-12-16 11:51:16,242 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01_zfund_reciever_click_689.yaml
|
||||
2025-12-16 11:51:16,244 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-02_680.yaml
|
||||
2025-12-16 11:51:16,245 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-03_ztax_862.yaml
|
||||
2025-12-16 11:51:16,246 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-04_zpbe_681.yaml
|
||||
2025-12-16 11:51:16,247 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-05_ZTAXZFUND_679.yaml
|
||||
2025-12-16 11:51:16,249 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-06_860.yaml
|
||||
2025-12-16 11:51:16,250 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-08_682.yaml
|
||||
2025-12-16 11:51:16,251 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-10_zpbe_688.yaml
|
||||
2025-12-16 11:51:16,253 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-11_ZTAX_NAME_863.yaml
|
||||
2025-12-16 11:51:16,254 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_683.yaml
|
||||
2025-12-16 11:51:16,255 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_684.yaml
|
||||
2025-12-16 11:51:16,256 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_686.yaml
|
||||
2025-12-16 11:51:16,258 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
|
||||
_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_690.yaml
|
||||
2025-12-16 11:51:16,259 - INFO - [create_dashboard_export][Enter] Packing dashboard: ['C:\\Users\\LO54FB~1\\Temp\\tmpuidfegpd.dir'] -> C:\Users\LO54FB~1
|
||||
\Temp\tmps7cuv2ti.zip
|
||||
2025-12-16 11:51:16,347 - INFO - [create_dashboard_export][Exit] Archive created: C:\Users\LO54FB~1\Temp\tmps7cuv2ti.zip
|
||||
2025-12-16 11:51:16,372 - ERROR - [import_dashboard][Failure] First import attempt failed: [API_FAILURE] API error during upload: {"errors": [{"message"
|
||||
: "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "messag
|
||||
e": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448
|
||||
\u0438\u0431\u043a\u0430."}]}}]} | Context: {'type': 'api_call'}
|
||||
Traceback (most recent call last):
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
|
||||
response.raise_for_status()
|
||||
File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
|
||||
raise HTTPError(http_error_msg, response=self)
|
||||
requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/
|
||||
|
||||
The above exception was the direct cause of the following exception:
|
||||
|
||||
Traceback (most recent call last):
|
||||
File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
|
||||
return self._do_import(file_path)
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
|
||||
return self.network.upload_file(
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
|
||||
return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
|
||||
raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
|
||||
superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "
|
||||
error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437
|
||||
\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'ty
|
||||
pe': 'api_call'}
|
||||
2025-12-16 11:51:16,511 - ERROR - [execute_migration][Failure] [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 c
|
||||
olumn 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u04
|
||||
40\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]
|
||||
}}]} | Context: {'type': 'api_call'}
|
||||
Traceback (most recent call last):
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
|
||||
response.raise_for_status()
|
||||
File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
|
||||
raise HTTPError(http_error_msg, response=self)
|
||||
requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/
|
||||
|
||||
The above exception was the direct cause of the following exception:
|
||||
|
||||
Traceback (most recent call last):
|
||||
File "h:\dev\ss-tools\migration_script.py", line 366, in execute_migration
|
||||
self.to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
|
||||
File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
|
||||
return self._do_import(file_path)
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
|
||||
return self.network.upload_file(
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
|
||||
return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
|
||||
raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
|
||||
superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "
|
||||
error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437
|
||||
\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'ty
|
||||
pe': 'api_call'}
|
||||
|
||||
=== Ошибка ===
|
||||
Не удалось мигрировать дашборд FI-0060 Финансы. Налоги. Данные по налогам. Старый.
|
||||
|
||||
[API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "leve
|
||||
l": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438
|
||||
\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'type': 'api_call'}
|
||||
|
||||
100%
|
||||
2025-12-16 11:51:16,598 - INFO - [execute_migration][Exit] Migration finished.
|
||||
|
||||
=== Информация ===
|
||||
Миграция завершена!
|
||||
|
||||
2025-12-16 11:51:16,605 - INFO - [run][Exit] Скрипт миграции завершён.
|
||||
34
specs/001-migration-ui-redesign/checklists/requirements.md
Normal file
34
specs/001-migration-ui-redesign/checklists/requirements.md
Normal file
@@ -0,0 +1,34 @@
|
||||
# Specification Quality Checklist: Migration Process and UI Redesign
|
||||
|
||||
**Purpose**: Validate specification completeness and quality before proceeding to planning
|
||||
**Created**: 2025-12-20
|
||||
**Feature**: [specs/001-migration-ui-redesign/spec.md](specs/001-migration-ui-redesign/spec.md)
|
||||
|
||||
## Content Quality
|
||||
|
||||
- [x] No implementation details (languages, frameworks, APIs)
|
||||
- [x] Focused on user value and business needs
|
||||
- [x] Written for non-technical stakeholders
|
||||
- [x] All mandatory sections completed
|
||||
|
||||
## Requirement Completeness
|
||||
|
||||
- [x] No [NEEDS CLARIFICATION] markers remain
|
||||
- [x] Requirements are testable and unambiguous
|
||||
- [x] Success criteria are measurable
|
||||
- [x] Success criteria are technology-agnostic (no implementation details)
|
||||
- [x] All acceptance scenarios are defined
|
||||
- [x] Edge cases are identified
|
||||
- [x] Scope is clearly bounded
|
||||
- [x] Dependencies and assumptions identified
|
||||
|
||||
## Feature Readiness
|
||||
|
||||
- [x] All functional requirements have clear acceptance criteria
|
||||
- [x] User scenarios cover primary flows
|
||||
- [x] Feature meets measurable outcomes defined in Success Criteria
|
||||
- [x] No implementation details leak into specification
|
||||
|
||||
## Notes
|
||||
|
||||
- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`
|
||||
115
specs/001-migration-ui-redesign/contracts/api.md
Normal file
115
specs/001-migration-ui-redesign/contracts/api.md
Normal file
@@ -0,0 +1,115 @@
|
||||
# API Contracts: Migration Process and UI Redesign
|
||||
|
||||
## Environment Management
|
||||
|
||||
### GET /api/environments
|
||||
List all configured environments.
|
||||
|
||||
**Response (200 OK)**:
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": "uuid",
|
||||
"name": "Development",
|
||||
"url": "https://superset-dev.example.com"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### GET /api/environments/{id}/databases
|
||||
Fetch the list of databases from a specific environment.
|
||||
|
||||
**Response (200 OK)**:
|
||||
```json
|
||||
[
|
||||
{
|
||||
"uuid": "db-uuid",
|
||||
"database_name": "Dev Clickhouse",
|
||||
"engine": "clickhouse"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
## Database Mapping
|
||||
|
||||
### GET /api/mappings
|
||||
List all saved database mappings.
|
||||
|
||||
**Query Parameters**:
|
||||
- `source_env_id`: Filter by source environment.
|
||||
- `target_env_id`: Filter by target environment.
|
||||
|
||||
**Response (200 OK)**:
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": "uuid",
|
||||
"source_env_id": "uuid",
|
||||
"target_env_id": "uuid",
|
||||
"source_db_uuid": "uuid",
|
||||
"target_db_uuid": "uuid",
|
||||
"source_db_name": "Dev Clickhouse",
|
||||
"target_db_name": "Prod Clickhouse"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### POST /api/mappings
|
||||
Create or update a database mapping.
|
||||
|
||||
**Request Body**:
|
||||
```json
|
||||
{
|
||||
"source_env_id": "uuid",
|
||||
"target_env_id": "uuid",
|
||||
"source_db_uuid": "uuid",
|
||||
"target_db_uuid": "uuid"
|
||||
}
|
||||
```
|
||||
|
||||
### POST /api/mappings/suggest
|
||||
Get suggested mappings based on fuzzy matching.
|
||||
|
||||
**Request Body**:
|
||||
```json
|
||||
{
|
||||
"source_env_id": "uuid",
|
||||
"target_env_id": "uuid"
|
||||
}
|
||||
```
|
||||
|
||||
**Response (200 OK)**:
|
||||
```json
|
||||
[
|
||||
{
|
||||
"source_db_uuid": "uuid",
|
||||
"target_db_uuid": "uuid",
|
||||
"confidence": 0.95
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
## Migration Execution
|
||||
|
||||
### POST /api/migrations
|
||||
Start a migration job.
|
||||
|
||||
**Request Body**:
|
||||
```json
|
||||
{
|
||||
"source_env_id": "uuid",
|
||||
"target_env_id": "uuid",
|
||||
"assets": [
|
||||
{"type": "dashboard", "id": 123}
|
||||
],
|
||||
"replace_db": true
|
||||
}
|
||||
```
|
||||
|
||||
**Response (202 Accepted)**:
|
||||
```json
|
||||
{
|
||||
"job_id": "uuid",
|
||||
"status": "RUNNING"
|
||||
}
|
||||
```
|
||||
48
specs/001-migration-ui-redesign/data-model.md
Normal file
48
specs/001-migration-ui-redesign/data-model.md
Normal file
@@ -0,0 +1,48 @@
|
||||
# Data Model: Migration Process and UI Redesign
|
||||
|
||||
## Entities
|
||||
|
||||
### Environment
|
||||
Represents a Superset instance.
|
||||
|
||||
| Field | Type | Description |
|
||||
|-------|------|-------------|
|
||||
| `id` | UUID | Primary Key |
|
||||
| `name` | String | Display name (e.g., "Development", "Production") |
|
||||
| `url` | String | Base URL of the Superset instance |
|
||||
| `credentials_id` | String | Reference to encrypted credentials in the config manager |
|
||||
|
||||
### DatabaseMapping
|
||||
Represents a mapping between a database in the source environment and a database in the target environment.
|
||||
|
||||
| Field | Type | Description |
|
||||
|-------|------|-------------|
|
||||
| `id` | UUID | Primary Key |
|
||||
| `source_env_id` | UUID | Foreign Key to Environment (Source) |
|
||||
| `target_env_id` | UUID | Foreign Key to Environment (Target) |
|
||||
| `source_db_uuid` | String | UUID of the database in the source environment |
|
||||
| `target_db_uuid` | String | UUID of the database in the target environment |
|
||||
| `source_db_name` | String | Name of the database in the source environment (for UI) |
|
||||
| `target_db_name` | String | Name of the database in the target environment (for UI) |
|
||||
| `engine` | String | Database engine type (e.g., "clickhouse", "postgres") |
|
||||
|
||||
### MigrationJob
|
||||
Represents a single migration execution.
|
||||
|
||||
| Field | Type | Description |
|
||||
|-------|------|-------------|
|
||||
| `id` | UUID | Primary Key |
|
||||
| `source_env_id` | UUID | Foreign Key to Environment |
|
||||
| `target_env_id` | UUID | Foreign Key to Environment |
|
||||
| `status` | Enum | `PENDING`, `RUNNING`, `COMPLETED`, `FAILED`, `AWAITING_MAPPING` |
|
||||
| `replace_db` | Boolean | Whether to apply database mappings |
|
||||
| `created_at` | DateTime | Timestamp of creation |
|
||||
|
||||
## Relationships
|
||||
- `DatabaseMapping` belongs to a pair of `Environments`.
|
||||
- `MigrationJob` references two `Environments`.
|
||||
|
||||
## Validation Rules
|
||||
- `source_env_id` and `target_env_id` must be different.
|
||||
- `source_db_uuid` and `target_db_uuid` must belong to databases with compatible engines (optional warning).
|
||||
- Mappings must be unique for a given `(source_env_id, target_env_id, source_db_uuid)` triplet.
|
||||
79
specs/001-migration-ui-redesign/plan.md
Normal file
79
specs/001-migration-ui-redesign/plan.md
Normal file
@@ -0,0 +1,79 @@
|
||||
# Implementation Plan: Migration Process and UI Redesign
|
||||
|
||||
**Branch**: `001-migration-ui-redesign` | **Date**: 2025-12-20 | **Spec**: [specs/001-migration-ui-redesign/spec.md](specs/001-migration-ui-redesign/spec.md)
|
||||
|
||||
## Summary
|
||||
|
||||
Redesign the migration process to support environment-based selection and automated database mapping. The technical approach involves using a SQLite database to persist mappings between source and target databases, implementing a fuzzy matching algorithm for empirical suggestions, and intercepting asset definitions during migration to apply these mappings.
|
||||
|
||||
## Technical Context
|
||||
|
||||
**Language/Version**: Python 3.9+, Node.js 18+
|
||||
**Primary Dependencies**: FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLite
|
||||
**Storage**: SQLite (for database mappings and environment metadata)
|
||||
**Testing**: pytest (Backend), Vitest/Playwright (Frontend)
|
||||
**Target Platform**: Linux server
|
||||
**Project Type**: Web application (FastAPI + SvelteKit SPA)
|
||||
**Performance Goals**: SC-001: Users can complete a full database mapping for 5+ databases in under 60 seconds.
|
||||
**Constraints**: SPA-First Architecture (Constitution Principle I), API-Driven Communication (Constitution Principle II).
|
||||
**Scale/Scope**: Support for multiple environments and hundreds of database mappings.
|
||||
|
||||
## Constitution Check
|
||||
|
||||
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
|
||||
|
||||
| Principle | Status | Notes |
|
||||
|-----------|--------|-------|
|
||||
| I. SPA-First Architecture | PASS | SvelteKit will be built as a static SPA and served by FastAPI. |
|
||||
| II. API-Driven Communication | PASS | All mapping and migration actions will go through FastAPI endpoints. |
|
||||
| III. Modern Stack Consistency | PASS | Using FastAPI, SvelteKit, and Tailwind CSS. |
|
||||
| IV. Semantic Protocol Adherence | PASS | Code will include GRACE-Poly anchors and contracts. |
|
||||
|
||||
## Project Structure
|
||||
|
||||
### Documentation (this feature)
|
||||
|
||||
```text
|
||||
specs/001-migration-ui-redesign/
|
||||
├── plan.md # This file
|
||||
├── research.md # Phase 0 output
|
||||
├── data-model.md # Phase 1 output
|
||||
├── quickstart.md # Phase 1 output
|
||||
├── contracts/ # Phase 1 output
|
||||
└── tasks.md # Phase 2 output
|
||||
```
|
||||
|
||||
### Source Code (repository root)
|
||||
|
||||
```text
|
||||
backend/
|
||||
├── src/
|
||||
│ ├── api/
|
||||
│ │ └── routes/
|
||||
│ │ ├── environments.py # New: Env selection
|
||||
│ │ └── mappings.py # New: DB mapping management
|
||||
│ ├── core/
|
||||
│ │ └── migration_engine.py # Update: DB replacement logic
|
||||
│ └── models/
|
||||
│ └── mapping.py # New: SQLite models
|
||||
└── tests/
|
||||
|
||||
frontend/
|
||||
├── src/
|
||||
│ ├── components/
|
||||
│ │ ├── MappingTable.svelte # New: DB mapping UI
|
||||
│ │ └── EnvSelector.svelte # New: Source/Target selection
|
||||
│ └── routes/
|
||||
│ └── migration/ # New: Migration dashboard
|
||||
└── tests/
|
||||
```
|
||||
|
||||
**Structure Decision**: Web application structure (Option 2) is selected to maintain separation between the FastAPI backend and SvelteKit frontend while adhering to the SPA-first principle.
|
||||
|
||||
## Complexity Tracking
|
||||
|
||||
> **Fill ONLY if Constitution Check has violations that must be justified**
|
||||
|
||||
| Violation | Why Needed | Simpler Alternative Rejected Because |
|
||||
|-----------|------------|-------------------------------------|
|
||||
| None | N/A | N/A |
|
||||
39
specs/001-migration-ui-redesign/quickstart.md
Normal file
39
specs/001-migration-ui-redesign/quickstart.md
Normal file
@@ -0,0 +1,39 @@
|
||||
# Quickstart: Migration Process and UI Redesign
|
||||
|
||||
## Setup
|
||||
|
||||
1. **Install Dependencies**:
|
||||
```bash
|
||||
pip install rapidfuzz sqlalchemy
|
||||
cd frontend && npm install
|
||||
```
|
||||
|
||||
2. **Configure Environments**:
|
||||
Ensure you have at least two Superset environments configured in the application settings.
|
||||
|
||||
3. **Initialize Database**:
|
||||
The system will automatically create the `mappings.db` SQLite file on the first run.
|
||||
|
||||
## Usage
|
||||
|
||||
### 1. Define Mappings
|
||||
1. Navigate to the **Database Mapping** tab.
|
||||
2. Select your **Source** and **Target** environments.
|
||||
3. Click **Fetch Databases**.
|
||||
4. Review the **Suggested Mappings** (highlighted in green).
|
||||
5. Manually adjust any mappings using the dropdowns.
|
||||
6. Click **Save Mappings**.
|
||||
|
||||
### 2. Run Migration
|
||||
1. Go to the **Migration** dashboard.
|
||||
2. Select the **Source** and **Target** environments.
|
||||
3. Select the assets (Dashboards/Datasets) you want to migrate.
|
||||
4. Enable the **Replace Database** toggle.
|
||||
5. Click **Start Migration**.
|
||||
6. If a database is missing a mapping, a modal will appear prompting you to select a target database.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
- **Connection Error**: Ensure the backend can reach both Superset instances. Check credentials in settings.
|
||||
- **Mapping Not Applied**: Verify that the "Replace Database" toggle was enabled and that the mapping exists for the specific environment pair.
|
||||
- **Fuzzy Match Failure**: If names are too different, manual mapping is required. The system learns from manual overrides.
|
||||
33
specs/001-migration-ui-redesign/research.md
Normal file
33
specs/001-migration-ui-redesign/research.md
Normal file
@@ -0,0 +1,33 @@
|
||||
# Research: Migration Process and UI Redesign
|
||||
|
||||
## Decision: Fuzzy Matching Algorithm
|
||||
- **Choice**: `RapidFuzz` library with `fuzz.token_sort_ratio`.
|
||||
- **Rationale**: `RapidFuzz` is significantly faster than `FuzzyWuzzy` and provides robust string similarity metrics. `token_sort_ratio` is ideal for database names because it ignores word order and is less sensitive to prefixes like "Dev-" or "Prod-".
|
||||
- **Alternatives considered**:
|
||||
- `Levenshtein`: Too sensitive to string length and prefixes.
|
||||
- `Jaro-Winkler`: Good for short strings but less effective for multi-word names with different orders.
|
||||
|
||||
## Decision: Asset Interception Strategy
|
||||
- **Choice**: ZIP-based transformation during migration.
|
||||
- **Rationale**: Superset's native export/import format is a ZIP archive containing YAML definitions. Intercepting this archive allows for precise modification of database references (UUIDs) before they reach the target environment.
|
||||
- **Implementation**:
|
||||
1. Export dashboard/dataset from source (ZIP).
|
||||
2. Extract ZIP to a temporary directory.
|
||||
3. Iterate through `datasets/*.yaml` files.
|
||||
4. Replace `database_uuid` values based on the mapping table.
|
||||
5. Re-package the ZIP.
|
||||
6. Import to target.
|
||||
|
||||
## Decision: Database Mapping Persistence
|
||||
- **Choice**: SQLite with SQLAlchemy/SQLModel.
|
||||
- **Rationale**: SQLite is lightweight, requires no separate server, and is perfect for storing local configuration and mappings. It aligns with the project's existing stack.
|
||||
- **Schema**:
|
||||
- `Environment`: `id`, `name`, `url`, `credentials_id`.
|
||||
- `DatabaseMapping`: `id`, `source_env_id`, `target_env_id`, `source_db_uuid`, `target_db_uuid`, `source_db_name`, `target_db_name`.
|
||||
|
||||
## Decision: Superset API Integration
|
||||
- **Choice**: Extend existing `SupersetClient`.
|
||||
- **Rationale**: `SupersetClient` already handles authentication, network requests, and basic CRUD for dashboards/datasets. Adding environment-specific fetching and database listing is a natural extension.
|
||||
- **New Endpoints to use**:
|
||||
- `GET /api/v1/database/`: List all databases.
|
||||
- `GET /api/v1/database/{id}`: Get detailed database config.
|
||||
109
specs/001-migration-ui-redesign/spec.md
Normal file
109
specs/001-migration-ui-redesign/spec.md
Normal file
@@ -0,0 +1,109 @@
|
||||
# Feature Specification: Migration Process and UI Redesign
|
||||
|
||||
**Feature Branch**: `001-migration-ui-redesign`
|
||||
**Created**: 2025-12-20
|
||||
**Status**: Draft
|
||||
**Input**: User description: "я хочу переработать процесс и интерфейс миграции. 1. Необходимо чтобы был выпадающий список enviroments (откуда и куда), а также просто галка замены БД 2. Процесс замены БД должен быть предустановленными парами , необходима отдельная вкладка которая бы считывала базы данных с источника и цели и позволяла их маппить, при этом первоначально эмпирически подставляя пары вида 'Dev Clickhouse' -> 'Prod Clickhouse'. Меппинг нужно сохранять и иметь возможность его редактировать"
|
||||
|
||||
## Clarifications
|
||||
|
||||
### Session 2025-12-20
|
||||
- Q: Scope of Database Mapping → A: Map the full configuration object obtained from the Superset API.
|
||||
- Q: Persistence of mappings → A: Use a SQLite database for storing mappings.
|
||||
- Q: Handling of missing mappings during migration → A: Show a modal dialog during the migration process to prompt for missing mappings.
|
||||
- Q: Empirical matching algorithm details → A: Use name-based fuzzy matching (ignoring common prefixes like Dev/Prod).
|
||||
- Q: Scope of "Replace Database" toggle → A: Apply replacement to all assets (Dashboards, Datasets, Charts) included in the migration.
|
||||
- Q: Backend exposure of Superset databases → A: Dedicated environment database endpoints (e.g., `/api/environments/{id}/databases`).
|
||||
- Q: Superset API authentication → A: Use stored environment credentials from the backend.
|
||||
- Q: Error handling for unreachable environments → A: Return structured error responses (502/503) with descriptive messages.
|
||||
- Q: Database list filtering → A: Return all available databases with metadata (engine type, etc.).
|
||||
- Q: Handling large database lists → A: Return full list (no pagination) for simplicity.
|
||||
|
||||
## User Scenarios & Testing *(mandatory)*
|
||||
|
||||
### User Story 1 - Environment-Based Migration Setup (Priority: P1)
|
||||
|
||||
As a migration operator, I want to easily select the source and target environments from a list so that I can quickly define the scope of my migration without manual URL entry.
|
||||
|
||||
**Why this priority**: This is the core interaction for starting any migration. Using predefined environments reduces errors and improves speed.
|
||||
|
||||
**Independent Test**: Can be tested by opening the migration page and verifying that the "Source" and "Target" dropdowns are populated with configured environments and can be selected.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** multiple environments are configured in settings, **When** I open the migration page, **Then** I should see two dropdowns for "Source" and "Target" containing these environments.
|
||||
2. **Given** a source and target are selected, **When** I toggle the "Replace Database" checkbox, **Then** the system should prepare to apply database mappings during the next migration step.
|
||||
|
||||
---
|
||||
|
||||
### User Story 2 - Database Mapping Management (Priority: P1)
|
||||
|
||||
As an administrator, I want to define how databases in my development environment map to databases in production so that my dashboards and datasets work correctly after migration.
|
||||
|
||||
**Why this priority**: Migrations often fail or require manual fixups because database references point to the wrong environment. Automated mapping is critical for reliable migrations.
|
||||
|
||||
**Independent Test**: Can be tested by navigating to the "Database Mapping" tab, fetching databases, and verifying that mappings can be created, saved, and edited.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** a source and target environment are selected, **When** I open the "Database Mapping" tab, **Then** the system should fetch and display lists of databases from both environments.
|
||||
2. **Given** the database lists are loaded, **When** the system identifies similar names (e.g., "Dev Clickhouse" and "Prod Clickhouse"), **Then** it should automatically suggest these as a mapping pair.
|
||||
3. **Given** suggested or manual mappings, **When** I click "Save Mappings", **Then** these pairs should be persisted and associated with the selected environment pair.
|
||||
|
||||
---
|
||||
|
||||
### User Story 3 - Migration with Automated DB Replacement (Priority: P2)
|
||||
|
||||
As a user, I want the migration process to automatically update database references based on my saved mappings so that I don't have to manually edit exported files or post-migration settings.
|
||||
|
||||
**Why this priority**: This delivers the actual value of the mapping feature by automating a tedious and error-prone task.
|
||||
|
||||
**Independent Test**: Can be tested by running a migration with "Replace Database" enabled and verifying that the resulting assets in the target environment point to the mapped databases.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** saved mappings exist for the selected environments, **When** I start a migration with "Replace Database" enabled, **Then** the system should replace all source database IDs/names with their corresponding target values during the transfer.
|
||||
2. **Given** "Replace Database" is enabled but a source database has no mapping, **When** the migration runs, **Then** the system should pause and show a modal dialog prompting the user to provide a mapping on-the-fly for the missing database.
|
||||
|
||||
---
|
||||
|
||||
### Edge Cases
|
||||
|
||||
- **Environment Connectivity**: If the source or target environment is unreachable, the backend MUST return a structured error (502/503), and the frontend MUST display a clear connection error with a retry option.
|
||||
- **Duplicate Mappings**: How does the system handle multiple source databases mapping to the same target database? (Assumption: This is allowed, as multiple dev DBs might consolidate into one prod DB).
|
||||
- **Missing Target Database**: What if a mapped target database no longer exists in the target environment? (Assumption: Validation should occur before migration starts, highlighting broken mappings).
|
||||
|
||||
## Requirements *(mandatory)*
|
||||
|
||||
### Functional Requirements
|
||||
|
||||
- **FR-001**: System MUST provide dropdown menus for selecting "Source Environment" and "Target Environment" on the migration screen.
|
||||
- **FR-002**: System MUST provide a "Replace Database" checkbox that, when enabled, triggers the database mapping logic for all assets (Dashboards, Datasets, Charts) during migration.
|
||||
- **FR-003**: System MUST include a dedicated "Database Mapping" tab or view accessible from the migration interface.
|
||||
- **FR-004**: System MUST fetch available databases from both source and target environments via their respective APIs when the mapping tab is opened.
|
||||
- **FR-005**: System MUST implement a name-based fuzzy matching algorithm to suggest initial mappings, ignoring common environment prefixes (e.g., "Dev", "Prod").
|
||||
- **FR-006**: System MUST allow users to manually override suggested mappings and create new ones via a drag-and-drop or dropdown-based interface.
|
||||
- **FR-007**: System MUST persist database mappings in a local SQLite database, keyed by the source and target environment identifiers.
|
||||
- **FR-008**: System MUST provide an "Edit" capability for existing mappings, allowing users to update or delete them.
|
||||
- **FR-009**: During migration, if "Replace Database" is active, the system MUST intercept asset definitions (JSON/YAML) and replace database references according to the active mapping table.
|
||||
|
||||
### Key Entities *(include if feature involves data)*
|
||||
|
||||
- **Environment**: A configured Superset instance (Name, URL, Credentials).
|
||||
- **Database Mapping**: A record linking a source database configuration (including metadata like engine type) to a target database configuration for a specific `source_env` -> `target_env` pair.
|
||||
- **Migration Configuration**: The set of parameters for a migration job, including selected environments and the "Replace Database" toggle state.
|
||||
|
||||
## Success Criteria *(mandatory)*
|
||||
|
||||
### Measurable Outcomes
|
||||
|
||||
- **SC-001**: Users can complete a full database mapping for 5+ databases in under 60 seconds using the empirical suggestions.
|
||||
- **SC-002**: 100% of assets migrated with "Replace Database" enabled correctly reference the target databases as defined in the mapping table.
|
||||
- **SC-003**: Mapping persistence allows users to run subsequent migrations between the same environments without re-configuring database pairs in 100% of cases.
|
||||
- **SC-004**: The system successfully identifies and suggests at least 90% of matching pairs when naming follows a "Prefix + Name" pattern (e.g., "Dev-Sales" -> "Prod-Sales").
|
||||
|
||||
## Assumptions
|
||||
|
||||
- **AS-001**: Environments are already configured in the application's global settings.
|
||||
- **AS-002**: The backend has access to stored credentials for both source and target environments to perform API requests.
|
||||
- **AS-003**: Database names or IDs are stable enough within an environment to be used as reliable mapping keys.
|
||||
186
specs/001-migration-ui-redesign/tasks.md
Normal file
186
specs/001-migration-ui-redesign/tasks.md
Normal file
@@ -0,0 +1,186 @@
|
||||
---
|
||||
|
||||
description: "Task list for Migration Process and UI Redesign implementation"
|
||||
---
|
||||
|
||||
# Tasks: Migration Process and UI Redesign
|
||||
|
||||
**Input**: Design documents from `specs/001-migration-ui-redesign/`
|
||||
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, quickstart.md
|
||||
|
||||
**Tests**: Tests are NOT explicitly requested in the feature specification, so they are omitted from this task list.
|
||||
|
||||
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
|
||||
|
||||
## Format: `[ID] [P?] [Story] Description`
|
||||
|
||||
- **[P]**: Can run in parallel (different files, no dependencies)
|
||||
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
|
||||
- Include exact file paths in descriptions
|
||||
|
||||
## Path Conventions
|
||||
|
||||
- **Web app**: `backend/src/`, `frontend/src/`
|
||||
|
||||
---
|
||||
|
||||
## Phase 1: Setup (Shared Infrastructure)
|
||||
|
||||
**Purpose**: Project initialization and basic structure
|
||||
|
||||
- [ ] T001 Create project structure per implementation plan in `backend/src/` and `frontend/src/`
|
||||
- [ ] T002 [P] Install backend dependencies (rapidfuzz, sqlalchemy) in `backend/requirements.txt`
|
||||
- [ ] T003 [P] Install frontend dependencies (if any new) in `frontend/package.json`
|
||||
|
||||
---
|
||||
|
||||
## Phase 2: Foundational (Blocking Prerequisites)
|
||||
|
||||
**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
|
||||
|
||||
**⚠️ CRITICAL**: No user story work can begin until this phase is complete
|
||||
|
||||
- [ ] T004 Setup SQLite database schema and SQLAlchemy models in `backend/src/models/mapping.py`
|
||||
- [ ] T005 [P] Implement fuzzy matching utility using RapidFuzz in `backend/src/core/utils/matching.py`
|
||||
- [ ] T006 [P] Extend SupersetClient to support database listing and metadata fetching in `backend/src/core/superset_client.py`
|
||||
- [ ] T007 Configure database mapping persistence layer in `backend/src/core/database.py`
|
||||
|
||||
**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
|
||||
|
||||
---
|
||||
|
||||
## Phase 3: User Story 1 - Environment-Based Migration Setup (Priority: P1) 🎯 MVP
|
||||
|
||||
**Goal**: Enable selection of source and target environments and toggle database replacement.
|
||||
|
||||
**Independent Test**: Open the migration page and verify that the "Source" and "Target" dropdowns are populated with configured environments and can be selected.
|
||||
|
||||
### Implementation for User Story 1
|
||||
|
||||
- [ ] T008 [P] [US1] Implement environment selection API endpoints in `backend/src/api/routes/environments.py`
|
||||
- [ ] T009 [P] [US1] Create `EnvSelector.svelte` component for source/target selection in `frontend/src/components/EnvSelector.svelte`
|
||||
- [ ] T010 [US1] Integrate `EnvSelector` and "Replace Database" toggle into migration dashboard in `frontend/src/routes/migration/+page.svelte`
|
||||
- [ ] T011 [US1] Add validation to ensure source and target environments are different in `frontend/src/routes/migration/+page.svelte`
|
||||
|
||||
**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently.
|
||||
|
||||
---
|
||||
|
||||
## Phase 4: User Story 2 - Database Mapping Management (Priority: P1)
|
||||
|
||||
**Goal**: Fetch databases from environments, suggest mappings using fuzzy matching, and allow manual overrides/persistence.
|
||||
|
||||
**Independent Test**: Navigate to the "Database Mapping" tab, fetch databases, and verify that mappings can be created, saved, and edited.
|
||||
|
||||
### Implementation for User Story 2
|
||||
|
||||
- [ ] T012 [P] [US2] Implement database mapping CRUD API endpoints in `backend/src/api/routes/mappings.py`
|
||||
- [ ] T013 [US2] Implement mapping service with fuzzy matching logic in `backend/src/services/mapping_service.py`
|
||||
- [ ] T014 [P] [US2] Create `MappingTable.svelte` component for displaying and editing pairs in `frontend/src/components/MappingTable.svelte`
|
||||
- [ ] T015 [US2] Create database mapping management view in `frontend/src/routes/migration/mappings/+page.svelte`
|
||||
- [ ] T016 [US2] Implement "Fetch Databases" action and suggestion highlighting in `frontend/src/routes/migration/mappings/+page.svelte`
|
||||
|
||||
**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently.
|
||||
|
||||
---
|
||||
|
||||
## Phase 5: User Story 3 - Migration with Automated DB Replacement (Priority: P2)
|
||||
|
||||
**Goal**: Intercept assets during migration, apply database mappings, and prompt for missing ones.
|
||||
|
||||
**Independent Test**: Run a migration with "Replace Database" enabled and verify that the resulting assets in the target environment point to the mapped databases.
|
||||
|
||||
### Implementation for User Story 3
|
||||
|
||||
- [ ] T017 [US3] Implement ZIP-based asset interception and YAML transformation logic in `backend/src/core/migration_engine.py`
|
||||
- [ ] T018 [US3] Integrate database mapping application into the migration job execution flow in `backend/src/core/task_manager.py`
|
||||
- [ ] T019 [P] [US3] Create `MissingMappingModal.svelte` for on-the-fly mapping prompts in `frontend/src/components/MissingMappingModal.svelte`
|
||||
- [ ] T020 [US3] Implement backend pause and frontend modal trigger for missing mappings in `backend/src/api/routes/tasks.py` and `frontend/src/components/TaskRunner.svelte`
|
||||
|
||||
**Checkpoint**: All user stories should now be independently functional.
|
||||
|
||||
---
|
||||
|
||||
## Phase 6: Polish & Cross-Cutting Concerns
|
||||
|
||||
**Purpose**: Improvements that affect multiple user stories
|
||||
|
||||
- [ ] T021 [P] Update documentation in `docs/` to include database mapping instructions
|
||||
- [ ] T022 Code cleanup and refactoring of migration logic
|
||||
- [ ] T023 [P] Performance optimization for fuzzy matching and ZIP processing
|
||||
- [ ] T024 Run `quickstart.md` validation to ensure end-to-end flow works as documented
|
||||
|
||||
---
|
||||
|
||||
## Dependencies & Execution Order
|
||||
|
||||
### Phase Dependencies
|
||||
|
||||
- **Setup (Phase 1)**: No dependencies - can start immediately
|
||||
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
|
||||
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
|
||||
- User stories can then proceed in parallel (if staffed)
|
||||
- Or sequentially in priority order (P1 → P2 → P3)
|
||||
- **Polish (Final Phase)**: Depends on all desired user stories being complete
|
||||
|
||||
### User Story Dependencies
|
||||
|
||||
- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
|
||||
- **User Story 2 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
|
||||
- **User Story 3 (P2)**: Can start after Foundational (Phase 2) - Depends on US1/US2 for mapping data and configuration
|
||||
|
||||
### Within Each User Story
|
||||
|
||||
- Models before services
|
||||
- Services before endpoints
|
||||
- Core implementation before integration
|
||||
- Story complete before moving to next priority
|
||||
|
||||
### Parallel Opportunities
|
||||
|
||||
- All Setup tasks marked [P] can run in parallel
|
||||
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
|
||||
- Once Foundational phase completes, US1 and US2 can start in parallel
|
||||
- Models and UI components within a story marked [P] can run in parallel
|
||||
|
||||
---
|
||||
|
||||
## Parallel Example: User Story 2
|
||||
|
||||
```bash
|
||||
# Launch backend and frontend components for User Story 2 together:
|
||||
Task: "Implement database mapping CRUD API endpoints in backend/src/api/routes/mappings.py"
|
||||
Task: "Create MappingTable.svelte component for displaying and editing pairs in frontend/src/components/MappingTable.svelte"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Implementation Strategy
|
||||
|
||||
### MVP First (User Story 1 & 2)
|
||||
|
||||
1. Complete Phase 1: Setup
|
||||
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
|
||||
3. Complete Phase 3: User Story 1
|
||||
4. Complete Phase 4: User Story 2
|
||||
5. **STOP and VALIDATE**: Test environment selection and mapping management independently
|
||||
6. Deploy/demo if ready
|
||||
|
||||
### Incremental Delivery
|
||||
|
||||
1. Complete Setup + Foundational → Foundation ready
|
||||
2. Add User Story 1 → Test independently → Deploy/Demo
|
||||
3. Add User Story 2 → Test independently → Deploy/Demo (MVP!)
|
||||
4. Add User Story 3 → Test independently → Deploy/Demo
|
||||
5. Each story adds value without breaking previous stories
|
||||
|
||||
---
|
||||
|
||||
## Notes
|
||||
|
||||
- [P] tasks = different files, no dependencies
|
||||
- [Story] label maps task to specific user story for traceability
|
||||
- Each user story should be independently completable and testable
|
||||
- Commit after each task or logical group
|
||||
- Stop at any checkpoint to validate story independently
|
||||
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
|
||||
24
specs/005-fix-ui-ws-validation/contracts/ws-logs.md
Normal file
24
specs/005-fix-ui-ws-validation/contracts/ws-logs.md
Normal file
@@ -0,0 +1,24 @@
|
||||
# WebSocket Contract: Task Logs
|
||||
|
||||
## Endpoint
|
||||
`WS /ws/logs/{task_id}`
|
||||
|
||||
## Description
|
||||
Streams real-time logs for a specific task.
|
||||
|
||||
## Connection Parameters
|
||||
- `task_id`: UUID of the task to monitor.
|
||||
|
||||
## Message Format (Server -> Client)
|
||||
```json
|
||||
{
|
||||
"task_id": "uuid",
|
||||
"message": "Log message text",
|
||||
"timestamp": "2025-12-20T20:20:00Z",
|
||||
"level": "INFO"
|
||||
}
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
- If `task_id` is invalid, the connection is closed with code `4004` (Not Found).
|
||||
- If the connection fails, the client should attempt reconnection with exponential backoff.
|
||||
@@ -0,0 +1,34 @@
|
||||
# Specification Quality Checklist: Configurable Belief State Logging
|
||||
|
||||
**Purpose**: Validate specification completeness and quality before proceeding to planning
|
||||
**Created**: 2025-12-26
|
||||
**Feature**: [specs/006-configurable-belief-logs/spec.md](../spec.md)
|
||||
|
||||
## Content Quality
|
||||
|
||||
- [x] No implementation details (languages, frameworks, APIs)
|
||||
- [x] Focused on user value and business needs
|
||||
- [x] Written for non-technical stakeholders
|
||||
- [x] All mandatory sections completed
|
||||
|
||||
## Requirement Completeness
|
||||
|
||||
- [x] No [NEEDS CLARIFICATION] markers remain
|
||||
- [x] Requirements are testable and unambiguous
|
||||
- [x] Success criteria are measurable
|
||||
- [x] Success criteria are technology-agnostic (no implementation details)
|
||||
- [x] All acceptance scenarios are defined
|
||||
- [x] Edge cases are identified
|
||||
- [x] Scope is clearly bounded
|
||||
- [x] Dependencies and assumptions identified
|
||||
|
||||
## Feature Readiness
|
||||
|
||||
- [x] All functional requirements have clear acceptance criteria
|
||||
- [x] User scenarios cover primary flows
|
||||
- [x] Feature meets measurable outcomes defined in Success Criteria
|
||||
- [x] No implementation details leak into specification
|
||||
|
||||
## Notes
|
||||
|
||||
- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`
|
||||
56
specs/006-configurable-belief-logs/contracts/api.md
Normal file
56
specs/006-configurable-belief-logs/contracts/api.md
Normal file
@@ -0,0 +1,56 @@
|
||||
# API Contract: Settings Update
|
||||
|
||||
## PATCH /api/settings/global
|
||||
|
||||
Updates the global application settings, including the new logging configuration.
|
||||
|
||||
### Request Body
|
||||
|
||||
**Content-Type**: `application/json`
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "string",
|
||||
"default_environment_id": "string (optional)",
|
||||
"logging": {
|
||||
"level": "string (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
|
||||
"file_path": "string (optional)",
|
||||
"max_bytes": "integer (default: 10485760)",
|
||||
"backup_count": "integer (default: 5)",
|
||||
"enable_belief_state": "boolean (default: true)"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Response
|
||||
|
||||
**Status**: `200 OK`
|
||||
**Content-Type**: `application/json`
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "string",
|
||||
"default_environment_id": "string (optional)",
|
||||
"logging": {
|
||||
"level": "string",
|
||||
"file_path": "string",
|
||||
"max_bytes": "integer",
|
||||
"backup_count": "integer",
|
||||
"enable_belief_state": "boolean"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Example
|
||||
|
||||
**Request**
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "DEBUG",
|
||||
"file_path": "logs/app.log",
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}
|
||||
74
specs/006-configurable-belief-logs/data-model.md
Normal file
74
specs/006-configurable-belief-logs/data-model.md
Normal file
@@ -0,0 +1,74 @@
|
||||
# Data Model: Configurable Belief State Logging
|
||||
|
||||
## 1. Configuration Models
|
||||
|
||||
These models extend the existing `ConfigModels` in `backend/src/core/config_models.py`.
|
||||
|
||||
### 1.1. LoggingConfig
|
||||
|
||||
Defines the configuration for the application's logging system.
|
||||
|
||||
| Field | Type | Default | Description |
|
||||
|---|---|---|---|
|
||||
| `level` | `str` | `"INFO"` | The logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). |
|
||||
| `file_path` | `Optional[str]` | `"logs/app.log"` | Path to the log file. If None, file logging is disabled. |
|
||||
| `max_bytes` | `int` | `10485760` (10MB) | Maximum size of a log file before rotation. |
|
||||
| `backup_count` | `int` | `5` | Number of backup files to keep. |
|
||||
| `enable_belief_state` | `bool` | `True` | Whether to enable structured Belief State logging (Entry/Exit). |
|
||||
|
||||
```python
|
||||
class LoggingConfig(BaseModel):
|
||||
level: str = "INFO"
|
||||
file_path: Optional[str] = "logs/app.log"
|
||||
max_bytes: int = 10 * 1024 * 1024
|
||||
backup_count: int = 5
|
||||
enable_belief_state: bool = True
|
||||
```
|
||||
|
||||
### 1.2. GlobalSettings (Updated)
|
||||
|
||||
Updates the existing `GlobalSettings` to include `LoggingConfig`.
|
||||
|
||||
| Field | Type | Default | Description |
|
||||
|---|---|---|---|
|
||||
| `logging` | `LoggingConfig` | `LoggingConfig()` | The logging configuration object. |
|
||||
|
||||
```python
|
||||
class GlobalSettings(BaseModel):
|
||||
backup_path: str
|
||||
default_environment_id: Optional[str] = None
|
||||
logging: LoggingConfig = Field(default_factory=LoggingConfig)
|
||||
```
|
||||
|
||||
## 2. Logger Entities
|
||||
|
||||
These entities are part of the `backend/src/core/logger.py` module.
|
||||
|
||||
### 2.1. LogEntry (Existing)
|
||||
|
||||
Represents a single log record.
|
||||
|
||||
| Field | Type | Description |
|
||||
|---|---|---|
|
||||
| `timestamp` | `datetime` | UTC timestamp of the log. |
|
||||
| `level` | `str` | Log level. |
|
||||
| `message` | `str` | Log message. |
|
||||
| `context` | `Optional[Dict[str, Any]]` | Additional context data. |
|
||||
|
||||
### 2.2. BeliefState (Concept)
|
||||
|
||||
Represents the state of execution in the "Belief State" model.
|
||||
|
||||
- **Entry**: Entering a logical block.
|
||||
- **Action**: Performing a core action within the block.
|
||||
- **Coherence**: Verifying the state (OK or Failed).
|
||||
- **Exit**: Exiting the logical block.
|
||||
|
||||
Format: `[{ANCHOR_ID}][{STATE}] {Message}`
|
||||
|
||||
## 3. Relationships
|
||||
|
||||
- `AppConfig` contains `GlobalSettings`.
|
||||
- `GlobalSettings` contains `LoggingConfig`.
|
||||
- `ConfigManager` reads/writes `AppConfig`.
|
||||
- `ConfigManager` configures the global `logger` based on `LoggingConfig`.
|
||||
81
specs/006-configurable-belief-logs/plan.md
Normal file
81
specs/006-configurable-belief-logs/plan.md
Normal file
@@ -0,0 +1,81 @@
|
||||
# Implementation Plan: Configurable Belief State Logging
|
||||
|
||||
**Branch**: `006-configurable-belief-logs` | **Date**: 2025-12-26 | **Spec**: specs/006-configurable-belief-logs/spec.md
|
||||
**Input**: Feature specification from `/specs/006-configurable-belief-logs/spec.md`
|
||||
|
||||
**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.
|
||||
|
||||
## Summary
|
||||
|
||||
Implement a configurable logging system with "Belief State" support (Entry, Action, Coherence, Exit).
|
||||
Approach: Use Python's `logging` module with a custom Context Manager (`belief_scope`) and extend `GlobalSettings` with a `LoggingConfig` model.
|
||||
|
||||
## Technical Context
|
||||
|
||||
**Language/Version**: Python 3.9+
|
||||
**Primary Dependencies**: FastAPI (Backend), Pydantic (Config), Svelte (Frontend)
|
||||
**Storage**: Filesystem (for log files), JSON (for configuration persistence)
|
||||
**Testing**: pytest (Backend), manual verification (Frontend)
|
||||
**Target Platform**: Linux server (primary), cross-platform compatible
|
||||
**Project Type**: Web application (Backend + Frontend)
|
||||
**Performance Goals**: Low overhead logging (<1ms per log entry), non-blocking for main thread (mostly)
|
||||
**Constraints**: Must use standard library `logging` where possible; Log rotation to prevent disk overflow
|
||||
**Scale/Scope**: Configurable log levels and retention policies
|
||||
|
||||
## Constitution Check
|
||||
|
||||
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
|
||||
|
||||
- **Library-First**: N/A (Core infrastructure)
|
||||
- **CLI Interface**: N/A (Configured via UI/API/JSON)
|
||||
- **Test-First**: Will require unit tests for `belief_scope` and config updates.
|
||||
- **Integration Testing**: Will require testing the settings API.
|
||||
- **Observability**: This feature *is* the observability enhancement.
|
||||
|
||||
## Project Structure
|
||||
|
||||
### Documentation (this feature)
|
||||
|
||||
```text
|
||||
specs/[###-feature]/
|
||||
├── plan.md # This file (/speckit.plan command output)
|
||||
├── research.md # Phase 0 output (/speckit.plan command)
|
||||
├── data-model.md # Phase 1 output (/speckit.plan command)
|
||||
├── quickstart.md # Phase 1 output (/speckit.plan command)
|
||||
├── contracts/ # Phase 1 output (/speckit.plan command)
|
||||
└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
|
||||
```
|
||||
|
||||
### Source Code (repository root)
|
||||
|
||||
```text
|
||||
backend/
|
||||
├── src/
|
||||
│ ├── core/
|
||||
│ │ ├── config_models.py # Add LoggingConfig
|
||||
│ │ ├── config_manager.py # Update config loading/saving
|
||||
│ │ └── logger.py # Add belief_scope and configure_logger
|
||||
│ └── api/
|
||||
│ └── routes/
|
||||
│ └── settings.py # Update GlobalSettings endpoint
|
||||
└── tests/
|
||||
└── test_logger.py # New tests for logger logic
|
||||
|
||||
frontend/
|
||||
├── src/
|
||||
│ ├── pages/
|
||||
│ │ └── Settings.svelte # Add logging UI controls
|
||||
│ └── lib/
|
||||
│ └── api.js # Update API calls if needed
|
||||
```
|
||||
|
||||
**Structure Decision**: enhancing existing Backend/Frontend structure.
|
||||
|
||||
## Complexity Tracking
|
||||
|
||||
> **Fill ONLY if Constitution Check has violations that must be justified**
|
||||
|
||||
| Violation | Why Needed | Simpler Alternative Rejected Because |
|
||||
|-----------|------------|-------------------------------------|
|
||||
| [e.g., 4th project] | [current need] | [why 3 projects insufficient] |
|
||||
| [e.g., Repository pattern] | [specific problem] | [why direct DB access insufficient] |
|
||||
104
specs/006-configurable-belief-logs/quickstart.md
Normal file
104
specs/006-configurable-belief-logs/quickstart.md
Normal file
@@ -0,0 +1,104 @@
|
||||
# Quickstart: Configurable Belief State Logging
|
||||
|
||||
## 1. Configuration
|
||||
|
||||
The logging system is configured via the `GlobalSettings` in `config.json` or through the Settings UI.
|
||||
|
||||
### 1.1. Default Configuration
|
||||
|
||||
```json
|
||||
{
|
||||
"environments": [],
|
||||
"settings": {
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "INFO",
|
||||
"file_path": "logs/app.log",
|
||||
"max_bytes": 10485760,
|
||||
"backup_count": 5,
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 1.2. Changing Log Level
|
||||
|
||||
To change the log level to `DEBUG`, update the `logging.level` field in `config.json` or use the API:
|
||||
|
||||
```bash
|
||||
curl -X PATCH http://localhost:8000/api/settings/global \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "DEBUG",
|
||||
"file_path": "logs/app.log",
|
||||
"max_bytes": 10485760,
|
||||
"backup_count": 5,
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}'
|
||||
```
|
||||
|
||||
## 2. Using Belief State Logging
|
||||
|
||||
### 2.1. Basic Usage
|
||||
|
||||
Use the `belief_scope` context manager to automatically log Entry, Exit, and Coherence states.
|
||||
|
||||
```python
|
||||
from backend.src.core.logger import logger, belief_scope
|
||||
|
||||
def my_function():
|
||||
with belief_scope("MyFunction"):
|
||||
# Logs: [MyFunction][Entry]
|
||||
|
||||
logger.info("Doing something important")
|
||||
# Logs: [MyFunction][Action] Doing something important
|
||||
|
||||
# ... logic ...
|
||||
|
||||
# Logs: [MyFunction][Coherence:OK]
|
||||
# Logs: [MyFunction][Exit]
|
||||
```
|
||||
|
||||
### 2.2. Error Handling
|
||||
|
||||
If an exception occurs within the scope, it is caught, logged as a failure, and re-raised.
|
||||
|
||||
```python
|
||||
def failing_function():
|
||||
with belief_scope("FailingFunc"):
|
||||
raise ValueError("Something went wrong")
|
||||
|
||||
# Logs: [FailingFunc][Entry]
|
||||
# Logs: [FailingFunc][Coherence:Failed] Something went wrong
|
||||
# Re-raises ValueError
|
||||
```
|
||||
|
||||
### 2.3. Custom Messages
|
||||
|
||||
You can provide an optional message to `belief_scope`.
|
||||
|
||||
```python
|
||||
with belief_scope("DataProcessor", "Processing batch #1"):
|
||||
# Logs: [DataProcessor][Entry] Processing batch #1
|
||||
pass
|
||||
```
|
||||
|
||||
## 3. Log Output Format
|
||||
|
||||
### 3.1. Standard Log
|
||||
|
||||
```text
|
||||
[2025-12-26 10:00:00,000][INFO][superset_tools_app] System initialized
|
||||
```
|
||||
|
||||
### 3.2. Belief State Log
|
||||
|
||||
```text
|
||||
[2025-12-26 10:00:01,000][INFO][superset_tools_app] [MyFunction][Entry]
|
||||
[2025-12-26 10:00:01,050][INFO][superset_tools_app] [MyFunction][Action] Processing data
|
||||
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Coherence:OK]
|
||||
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Exit]
|
||||
109
specs/006-configurable-belief-logs/research.md
Normal file
109
specs/006-configurable-belief-logs/research.md
Normal file
@@ -0,0 +1,109 @@
|
||||
# Research: Configurable Belief State Logging
|
||||
|
||||
## 1. Introduction
|
||||
|
||||
This research explores the implementation of a configurable logging system that supports "Belief State" logging (Entry, Action, Coherence, Exit) and allows users to customize log levels, formats, and file persistence.
|
||||
|
||||
## 2. Analysis of Existing System
|
||||
|
||||
- **Language**: Python 3.9+ (Backend)
|
||||
- **Framework**: FastAPI (inferred from context, though not explicitly seen in snippets, `uvicorn` mentioned)
|
||||
- **Logging**: Standard Python `logging` module.
|
||||
- **Configuration**: Pydantic models (`ConfigModels`) managed by `ConfigManager`, persisting to `config.json`.
|
||||
- **Current Logger**:
|
||||
- `backend/src/core/logger.py`:
|
||||
- Uses `logging.getLogger("superset_tools_app")`
|
||||
- Has `StreamHandler` (console) and `WebSocketLogHandler` (streaming).
|
||||
- `WebSocketLogHandler` buffers logs in a `deque`.
|
||||
|
||||
## 3. Requirements Analysis
|
||||
|
||||
- **Belief State**: Need a structured way to log `[ANCHOR_ID][STATE] Message`.
|
||||
- **Context Manager**: Need a `with belief_scope("ID"):` pattern.
|
||||
- **Configuration**: Need to add `LoggingConfig` to `GlobalSettings`.
|
||||
- **File Logging**: Need `RotatingFileHandler` with size limits.
|
||||
- **Dynamic Reconfiguration**: Need to update logger handlers/levels when config changes.
|
||||
|
||||
## 4. Proposed Solution
|
||||
|
||||
### 4.1. Data Model (`LoggingConfig`)
|
||||
|
||||
We will add a `LoggingConfig` model to `backend/src/core/config_models.py`:
|
||||
|
||||
```python
|
||||
class LoggingConfig(BaseModel):
|
||||
level: str = "INFO" # DEBUG, INFO, WARNING, ERROR, CRITICAL
|
||||
file_path: Optional[str] = "logs/app.log"
|
||||
max_bytes: int = 10 * 1024 * 1024 # 10MB
|
||||
backup_count: int = 5
|
||||
enable_belief_state: bool = True
|
||||
```
|
||||
|
||||
And update `GlobalSettings`:
|
||||
|
||||
```python
|
||||
class GlobalSettings(BaseModel):
|
||||
backup_path: str
|
||||
default_environment_id: Optional[str] = None
|
||||
logging: LoggingConfig = Field(default_factory=LoggingConfig)
|
||||
```
|
||||
|
||||
### 4.2. Context Manager (`belief_scope`)
|
||||
|
||||
We will implement a context manager in `backend/src/core/logger.py`:
|
||||
|
||||
```python
|
||||
from contextlib import contextmanager
|
||||
|
||||
@contextmanager
|
||||
def belief_scope(anchor_id: str, message: str = ""):
|
||||
# ... logic to log [Entry] ...
|
||||
try:
|
||||
yield
|
||||
# ... logic to log [Coherence:OK] and [Exit] ...
|
||||
except Exception as e:
|
||||
# ... logic to log [Coherence:Failed] ...
|
||||
raise
|
||||
```
|
||||
|
||||
### 4.3. Logger Reconfiguration
|
||||
|
||||
We will add a `configure_logger(config: LoggingConfig)` function in `backend/src/core/logger.py` that:
|
||||
1. Sets the logger level.
|
||||
2. Removes old file handlers.
|
||||
3. Adds a new `RotatingFileHandler` if `file_path` is set.
|
||||
4. Updates a global flag for `enable_belief_state`.
|
||||
|
||||
`ConfigManager` will call this function whenever settings are updated.
|
||||
|
||||
### 4.4. Belief State Filtering
|
||||
|
||||
If `enable_belief_state` is False:
|
||||
- `Entry`/`Exit` logs are skipped.
|
||||
- `Action`/`Coherence` logs are logged as standard messages (maybe without the `[ANCHOR_ID]` prefix if desired, but retaining it is usually better for consistency).
|
||||
|
||||
## 5. Alternatives Considered
|
||||
|
||||
- **Alternative A**: Use a separate logger for Belief State.
|
||||
- *Pros*: Cleaner separation.
|
||||
- *Cons*: Harder to interleave with standard logs in the same stream/file.
|
||||
- *Decision*: Rejected. We want a unified log stream.
|
||||
|
||||
- **Alternative B**: Use structlog.
|
||||
- *Pros*: Powerful structured logging.
|
||||
- *Cons*: Adds a new dependency.
|
||||
- *Decision*: Rejected. Standard `logging` is sufficient and already used.
|
||||
|
||||
## 6. Implementation Steps
|
||||
|
||||
1. **Modify `backend/src/core/config_models.py`**: Add `LoggingConfig` and update `GlobalSettings`.
|
||||
2. **Modify `backend/src/core/logger.py`**:
|
||||
- Add `configure_logger` function.
|
||||
- Implement `belief_scope` context manager.
|
||||
- Implement `RotatingFileHandler`.
|
||||
3. **Modify `backend/src/core/config_manager.py`**: Call `configure_logger` on init and update.
|
||||
4. **Frontend**: Update Settings page to allow editing `LoggingConfig`.
|
||||
|
||||
## 7. Conclusion
|
||||
|
||||
The proposed solution leverages the existing Pydantic/ConfigManager architecture and standard Python logging, minimizing disruption while meeting all requirements.
|
||||
88
specs/006-configurable-belief-logs/spec.md
Normal file
88
specs/006-configurable-belief-logs/spec.md
Normal file
@@ -0,0 +1,88 @@
|
||||
# Feature Specification: Configurable Belief State Logging
|
||||
|
||||
**Feature Branch**: `006-configurable-belief-logs`
|
||||
**Created**: 2025-12-26
|
||||
**Status**: Draft
|
||||
**Input**: User description: "Меня не устраивает текущее состояния логов проекта, которые я получаю после запуска run.sh. Нужно продумать систему хранения логов, которая будет настраиваемой, включать логирование belief state агента разработки (смотри semantic_protocol.md) и гибко настраиваемой."
|
||||
|
||||
## Clarifications
|
||||
|
||||
### Session 2025-12-26
|
||||
- Q: Log rotation policy? → A: Size-based rotation (e.g., 10MB limit, 5 backups).
|
||||
- Q: Belief State disabled behavior? → A: Smart Filter (Suppress Entry/Exit; keep Action/Coherence as standard logs).
|
||||
- Q: Implementation pattern? → A: Context Manager (e.g., `with belief_scope("ID"):`) to auto-handle Entry/Exit/Error.
|
||||
|
||||
## User Scenarios & Testing
|
||||
|
||||
### User Story 1 - Structured Belief State Logging (Priority: P1)
|
||||
|
||||
As a developer or AI agent, I want logs to clearly indicate the execution flow and internal state ("Belief State") of the system using a standardized format, so that I can easily debug logic errors and verify semantic coherence.
|
||||
|
||||
**Why this priority**: This is the core requirement to align with the `semantic_protocol.md` and improve debugging capabilities for the AI agent.
|
||||
|
||||
**Independent Test**: Trigger an action in the system (e.g., running a migration) and verify that the logs contain entries formatted as `[ANCHOR_ID][STATE] Message`, representing the entry, action, and exit states of the code blocks.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** a function wrapped with a Belief State logger, **When** the function is called, **Then** a log entry `[ANCHOR_ID][Entry] ...` is generated.
|
||||
2. **Given** a function execution, **When** the core logic is executed, **Then** a log entry `[ANCHOR_ID][Action] ...` is generated.
|
||||
3. **Given** a function execution completes successfully, **When** it returns, **Then** a log entry `[ANCHOR_ID][Coherence:OK]` and `[ANCHOR_ID][Exit]` are generated.
|
||||
4. **Given** a function execution fails, **When** an exception is raised, **Then** a log entry `[ANCHOR_ID][Coherence:Failed]` is generated.
|
||||
|
||||
---
|
||||
|
||||
### User Story 2 - Configurable Logging Settings (Priority: P2)
|
||||
|
||||
As a user, I want to be able to configure log levels, formats, and destinations via the application settings, so that I can control the verbosity and storage of logs without changing code.
|
||||
|
||||
**Why this priority**: Provides the "flexible" and "customizable" aspect of the requirement, allowing users to adapt logging to their needs (dev vs prod).
|
||||
|
||||
**Independent Test**: Change the log level in the settings (e.g., from INFO to DEBUG) and verify that debug messages start appearing in the output.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** the application is running, **When** I update the log level to "DEBUG" in the settings, **Then** debug-level logs are captured and displayed.
|
||||
2. **Given** the application is running, **When** I enable "File Logging" in settings, **Then** logs are written to the specified file path.
|
||||
3. **Given** the application is running, **When** I disable "Belief State" logs in settings, **Then** "Entry" and "Exit" logs are suppressed, while "Action" and "Coherence" logs are retained as standard log entries.
|
||||
|
||||
---
|
||||
|
||||
### Edge Cases
|
||||
|
||||
- **Invalid Configuration**: What happens if the user provides an invalid log level (e.g., "SUPER_LOUD")? The system should fallback to a safe default (e.g., "INFO") and log a warning.
|
||||
- **File System Errors**: What happens if the configured log file path is not writable? The system should fallback to console logging and alert the user.
|
||||
- **High Volume**: What happens if "Belief State" logging generates too much noise? The system should remain performant, and users should be able to toggle it off easily.
|
||||
|
||||
## Requirements
|
||||
|
||||
### Functional Requirements
|
||||
|
||||
- **FR-001**: The system MUST support a `LoggingConfig` structure within the global configuration to store settings for level, format, file path, and belief state enablement.
|
||||
- **FR-002**: The logging system MUST implement a standard format for "Belief State" logs as defined in the system protocols: `[{ANCHOR_ID}][{STATE}] {Message}`.
|
||||
- **FR-003**: The system MUST provide a standard mechanism to easily generate Belief State logs without repetitive boilerplate code, specifically using a **Context Manager** pattern.
|
||||
- **FR-004**: The supported Belief States MUST include: `Entry`, `Action`, `Coherence:OK`, `Coherence:Failed`, `Exit`.
|
||||
- **FR-005**: The system MUST allow dynamic reconfiguration of the logger (e.g., changing levels) when settings are updated.
|
||||
- **FR-006**: The real-time log stream MUST preserve the structured format of Belief State logs so the frontend can potentially render them specially.
|
||||
- **FR-007**: Logs MUST optionally be persisted to a file if configured.
|
||||
- **FR-008**: The file logging system MUST implement size-based rotation (default: 10MB limit, 5 backups) to prevent disk saturation.
|
||||
- **FR-009**: When `enable_belief_state` is False, the logger MUST suppress `Entry` and `Exit` states but retain `Action` and `Coherence` states (stripping the structured prefix if necessary or keeping it as standard info).
|
||||
- **FR-010**: The Context Manager MUST automatically log `[Entry]` on start, `[Exit]` on success, and `[Coherence:Failed]` if an exception occurs (re-raising the exception).
|
||||
|
||||
### Key Entities
|
||||
|
||||
- **LoggingConfig**: A configuration object defining `level` (text), `file_path` (text), `format` (structure), `enable_belief_state` (boolean).
|
||||
- **BeliefStateAdapter**: A component that enforces the semantic protocol format for log entries.
|
||||
|
||||
## Success Criteria
|
||||
|
||||
### Measurable Outcomes
|
||||
|
||||
- **SC-001**: Developers can trace the execution flow of a specific task using only the `[ANCHOR_ID]` filtered logs.
|
||||
- **SC-002**: Changing the log level in the configuration file or API immediately (or upon restart) reflects in the log output.
|
||||
- **SC-003**: All "Belief State" logs strictly follow the `[ID][State]` format, allowing for regex parsing.
|
||||
- **SC-004**: System supports logging to both console and file simultaneously if configured.
|
||||
|
||||
## Assumptions
|
||||
|
||||
- The existing real-time streaming mechanism can be adapted or chained with the new logging configuration.
|
||||
- The system protocol definition of states is the source of truth.
|
||||
61
specs/006-configurable-belief-logs/tasks.md
Normal file
61
specs/006-configurable-belief-logs/tasks.md
Normal file
@@ -0,0 +1,61 @@
|
||||
# Tasks: Configurable Belief State Logging
|
||||
|
||||
**Spec**: `specs/006-configurable-belief-logs/spec.md`
|
||||
**Plan**: `specs/006-configurable-belief-logs/plan.md`
|
||||
**Status**: Completed
|
||||
|
||||
## Phase 1: Setup
|
||||
*Goal: Initialize project structure for logging.*
|
||||
|
||||
- [x] T001 Ensure logs directory exists at `logs/` (relative to project root)
|
||||
|
||||
## Phase 2: Foundational
|
||||
*Goal: Define data models and infrastructure required for logging configuration.*
|
||||
|
||||
- [x] T002 Define `LoggingConfig` model in `backend/src/core/config_models.py`
|
||||
- [x] T003 Update `GlobalSettings` model to include `logging` field in `backend/src/core/config_models.py`
|
||||
- [x] T004 Update `ConfigManager` to handle logging configuration persistence in `backend/src/core/config_manager.py`
|
||||
|
||||
## Phase 3: User Story 1 - Structured Belief State Logging
|
||||
*Goal: Implement the core "Belief State" logging logic with context managers.*
|
||||
*Priority: P1*
|
||||
|
||||
**Independent Test**: Run `pytest backend/tests/test_logger.py` and verify `belief_scope` generates `[ID][Entry]`, `[ID][Action]`, and `[ID][Exit]` logs.
|
||||
|
||||
- [x] T005 [US1] Create unit tests for belief state logging in `backend/tests/test_logger.py`
|
||||
- [x] T006 [US1] Implement `belief_scope` context manager in `backend/src/core/logger.py`
|
||||
- [x] T007 [US1] Implement log formatting and smart filtering (suppress Entry/Exit if disabled) in `backend/src/core/logger.py`
|
||||
|
||||
## Phase 4: User Story 2 - Configurable Logging Settings
|
||||
*Goal: Expose logging configuration to the user via API and UI.*
|
||||
*Priority: P2*
|
||||
|
||||
**Independent Test**: Update settings via API/UI and verify log level changes (e.g., DEBUG logs appear) and file rotation is active.
|
||||
|
||||
- [x] T008 [US2] Implement `configure_logger` function to apply settings (level, file, rotation) in `backend/src/core/logger.py`
|
||||
- [x] T009 [US2] Update settings API endpoint to handle `logging` updates in `backend/src/api/routes/settings.py`
|
||||
- [x] T010 [P] [US2] Add Logging configuration section (Level, File Path, Enable Belief State) to `frontend/src/pages/Settings.svelte`
|
||||
|
||||
## Final Phase: Polish & Cross-Cutting Concerns
|
||||
*Goal: Verify system stability and cleanup.*
|
||||
|
||||
- [x] T011 Verify log rotation works by generating dummy logs (manual verification)
|
||||
- [x] T012 Ensure default configuration provides a sensible out-of-the-box experience
|
||||
|
||||
## Dependencies
|
||||
|
||||
```mermaid
|
||||
graph TD
|
||||
Setup[Phase 1: Setup] --> Foundational[Phase 2: Foundational]
|
||||
Foundational --> US1[Phase 3: US1 Belief State]
|
||||
US1 --> US2[Phase 4: US2 Configurable Settings]
|
||||
US2 --> Polish[Final Phase: Polish]
|
||||
```
|
||||
|
||||
## Parallel Execution Opportunities
|
||||
|
||||
- **US2**: Frontend task (T010) can be implemented in parallel with Backend tasks (T008, T009) once the API contract is finalized.
|
||||
|
||||
## Implementation Strategy
|
||||
1. **MVP**: Complete Phase 1, 2, and 3 to enable structured logging for the agent.
|
||||
2. **Full Feature**: Complete Phase 4 to allow users to control the verbosity and storage.
|
||||
@@ -28,7 +28,11 @@ class SupersetLogger:
|
||||
# @PARAM: log_dir (Optional[Path]) - Директория для сохранения лог-файлов.
|
||||
# @PARAM: level (int) - Уровень логирования (e.g., `logging.INFO`).
|
||||
# @PARAM: console (bool) - Флаг для включения вывода в консоль.
|
||||
def __init__(self, name: str = "superset_tool", log_dir: Optional[Path] = None, level: int = logging.INFO, console: bool = True) -> None:
|
||||
def __init__(self, name: str = "superset_tool", log_dir: Optional[Path] = None, level: int = logging.INFO, console: bool = True, logger: Optional[logging.Logger] = None) -> None:
|
||||
if logger:
|
||||
self.logger = logger
|
||||
return
|
||||
|
||||
self.logger = logging.getLogger(name)
|
||||
self.logger.setLevel(level)
|
||||
self.logger.propagate = False
|
||||
|
||||
Reference in New Issue
Block a user