14 Commits

SHA1 Message Date
45c077b928 +api rework 2025-12-30 20:08:48 +03:00
9ed3a5992d cleaned 2025-12-30 18:20:40 +03:00
a032fe8457 Password promt 2025-12-30 17:21:12 +03:00
4c9d554432 TaskManager refactor 2025-12-29 10:13:37 +03:00
6962a78112 mappings+migrate 2025-12-27 10:16:41 +03:00
3d75a21127 tech_lead / coder 2roles 2025-12-27 08:02:59 +03:00
07914c8728 semantic add 2025-12-27 07:14:08 +03:00
cddc259b76 new loggers logic in constitution 2025-12-27 06:51:28 +03:00
dcbf0a7d7f tasks ready 2025-12-27 06:37:03 +03:00
65f61c1f80 Merge branch '001-migration-ui-redesign' into master 2025-12-27 05:58:35 +03:00
cb7386f274 superset_tool logger rework 2025-12-27 05:53:30 +03:00
83e34e1799 feat(logging): implement configurable belief state logging
- Add LoggingConfig model and logging field to GlobalSettings
- Implement belief_scope context manager for structured logging
- Add configure_logger for dynamic level and file rotation settings
- Add logging configuration UI to Settings page
- Update ConfigManager to apply logging settings on initialization and updates
2025-12-27 05:39:33 +03:00
d197303b9f 006 plan ready 2025-12-26 19:36:49 +03:00
a43f8fb021 001-migration-ui-redesign (#3)
Reviewed-on: #3
2025-12-26 18:17:58 +03:00
125 changed files with 5831 additions and 11167 deletions

.gitignore

@@ -60,4 +60,5 @@ keyring passwords.py
 *github*
 *git*
 *tech_spec*
 dashboards
+backend/mappings.db


@@ -9,6 +9,13 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
 - Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic (005-fix-ui-ws-validation)
 - N/A (Configuration based) (005-fix-ui-ws-validation)
 - Filesystem (plugins, logs, backups), SQLite (optional, for job history if needed) (005-fix-ui-ws-validation)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS (007-migration-dashboard-grid)
+- N/A (Superset API integration) (007-migration-dashboard-grid)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API (007-migration-dashboard-grid)
+- N/A (Superset API integration - read-only for metadata) (007-migration-dashboard-grid)
+- Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
+- SQLite (optional for job history), existing database for mappings (008-migration-ui-improvements)
+- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
 - Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
@@ -29,9 +36,9 @@ cd src; pytest; ruff check .
 Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
 ## Recent Changes
-- 005-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
-- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
-- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
+- 008-migration-ui-improvements: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
+- 008-migration-ui-improvements: Added Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
+- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
 <!-- MANUAL ADDITIONS START -->

.kilocodemodes Normal file

@@ -0,0 +1,25 @@
customModes:
  - slug: tester
    name: Tester
    description: QA and Plan Verification Specialist
    roleDefinition: >-
      You are Kilo Code, acting as a QA and Verification Specialist. Your primary goal is to validate that the project implementation aligns strictly with the defined specifications and task plans.
      Your responsibilities include:
      - Reading and analyzing task plans and specifications (typically in the `specs/` directory).
      - Verifying that implemented code matches the requirements.
      - Executing tests and validating system behavior via CLI or Browser.
      - Updating the status of tasks in the plan files (e.g., marking checkboxes [x]) as they are verified.
      - Identifying and reporting missing features or bugs.
    whenToUse: >-
      Use this mode when you need to audit the progress of a project, verify completed tasks against the plan, run quality assurance checks, or update the status of task lists in specification documents.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    customInstructions: >-
      1. Always begin by loading the relevant plan or task list from the `specs/` directory.
      2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit.
      3. When updating task lists, ensure you only mark items as complete if you have verified them.


@@ -1,29 +1,99 @@
-# ss-tools Constitution
+<!--
+SYNC IMPACT REPORT
+Version: 1.5.0 (Fractal Complexity Limit)
+Changes:
+- Added Section VI (Fractal Complexity Limit) to enforce strict module (~300 lines) and function (~30-50 lines) size limits.
+- Aims to maintain semantic coherence and avoid "Attention Sink".
+Templates Status:
+- .specify/templates/plan-template.md: ✅ Aligned.
+- .specify/templates/spec-template.md: ✅ Aligned.
+- .specify/templates/tasks-arch-template.md: ✅ Aligned (New role-based split).
+- .specify/templates/tasks-dev-template.md: ✅ Aligned (New role-based split).
+-->
+# Semantic Code Generation Constitution
 ## Core Principles
-### I. SPA-First Architecture
-The frontend MUST be a Static Single Page Application (SPA) served by the Python backend. No Node.js server is permitted in production. The backend serves the `index.html` entry point for all non-API routes.
-### II. API-Driven Communication
-All data retrieval and state changes MUST be performed via the backend REST API or WebSockets. The frontend should not access the database or filesystem directly.
-### III. Modern Stack Consistency
-The project strictly uses SvelteKit (Frontend), FastAPI (Backend), and Tailwind CSS (Styling). New dependencies must be justified and approved.
-### IV. Semantic Protocol Adherence (GRACE-Poly)
-All code generation and modification MUST adhere to the Semantic Protocol defined in `semantic_protocol.md`.
-- **Anchors**: Use `[DEF:id:Type]` and `[/DEF:id]` to define semantic boundaries.
-- **Contracts**: Define `@PRE` and `@POST` conditions in headers.
-- **Logging**: Use structured logging with `[AnchorID][State]` format.
-- **Immutability**: Respect architectural decisions in headers.
+### I. Causal Validity (Contracts First)
+Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic. This ensures that the "what" and "why" govern the "how".
+### II. Immutability of Architecture
+Once defined, architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints for that module. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.
+### III. Semantic Format Compliance
+All output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax with specific Metadata Tags (`@KEY`) and Graph Relations (`@RELATION`). **Crucially, the closing anchor must strictly match the full content of the opening anchor (e.g., `[DEF:identifier:Type]` must close with `[/DEF:identifier:Type]`).**
+**Standardized Graph Relations**
+To ensure the integrity of the Semantic Graph, `@RELATION` must use a strict taxonomy:
+- `DEPENDS_ON` (Structural dependency)
+- `CALLS` (Flow control)
+- `CREATES` (Instantiation)
+- `INHERITS_FROM` / `IMPLEMENTS` (OOP hierarchy)
+- `READS_STATE` / `WRITES_STATE` (Data flow)
+- `DISPATCHES` / `HANDLES` (Event flow)
+Ad-hoc relationships are forbidden. This structure is non-negotiable as it ensures the codebase remains machine-readable, fractal-structured, and optimized for Sparse Attention navigation by AI agents.
+### IV. Design by Contract (DbC)
+Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints (`@PRE`, `@POST`, `@THROW`) in the metadata block before implementation. Implementation must strictly satisfy these contracts.
+### V. Belief State Logging
+Logs must define the agent's internal state for debugging and coherence checks. We use a strict format: `[{ANCHOR_ID}][{STATE}] {MESSAGE}`. For Python, a **Context Manager** pattern MUST be used to automatically handle `Entry`, `Exit`, and `Coherence` states, ensuring structural integrity and error capturing.
+### VI. Fractal Complexity Limit
+To maintain semantic coherence and avoid "Attention Sink" issues:
+- **Module Size**: If a Module body exceeds ~300 lines (or logical complexity), it MUST be refactored into sub-modules or a package structure.
+- **Function Size**: Functions should fit within a standard attention "chunk" (approx. 30-50 lines). If larger, logic MUST be decomposed into helper functions with their own contracts.
+This ensures every vector embedding remains sharp and focused.
+## File Structure Standards
+### Python Modules
+Every `.py` file must start with a Module definition header (`[DEF:module_name:Module]`) containing:
+- `@SEMANTICS`: Keywords for vector search.
+- `@PURPOSE`: Primary responsibility of the module.
+- `@LAYER`: Architecture layer (Domain/Infra/UI).
+- `@RELATION`: Dependencies.
+- `@INVARIANT` & `@CONSTRAINT`: Immutable rules.
+- `@PUBLIC_API`: Exported symbols.
+### Svelte Components
+Every `.svelte` file must start with a Component definition header (`[DEF:ComponentName:Component]`) wrapped in an HTML comment `<!-- ... -->` containing:
+- `@SEMANTICS`: Keywords for vector search.
+- `@PURPOSE`: Primary responsibility of the component.
+- `@LAYER`: Architecture layer (UI/State/Layout).
+- `@RELATION`: Child components, Stores used, API calls.
+- `@PROPS`: Input properties.
+- `@EVENTS`: Emitted events.
+- `@INVARIANT`: Immutable UI/State rules.
+## Generation Workflow
+The development process follows a strict sequence enforced by Agent Roles:
+### 1. Architecture Phase (Mode: `tech-lead`)
+**Input**: `tasks-arch.md`
+**Responsibility**:
+- Analyze request and graph position.
+- Generate `[DEF]` anchors, Headers, and Contracts (`@PRE`, `@POST`).
+- **Output**: Scaffolding files with no implementation logic.
+### 2. Implementation Phase (Mode: `code`)
+**Input**: `tasks-dev.md` + Scaffolding files
+**Responsibility**:
+- Read contracts defined by Architect.
+- Write implementation code that strictly satisfies contracts.
+- **Output**: Working code with passing tests.
+### 3. Validation
+If logic conflicts with Contract -> Stop -> Report Error.
 ## Governance
-### Compliance
-All Pull Requests and code modifications must be verified against this Constitution. Violations of Core Principles are considered critical defects.
-### Amendments
-Changes to this Constitution require a formal RFC process and approval from the project lead.
-**Version**: 1.0.0 | **Ratified**: 2025-12-20
+This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.
+- **Automated Enforcement**: All code generation tools and agents must parse and validate adherence to the `[DEF]` syntax and Contract requirements.
+- **Amendments**: Changes to the syntax or core principles require a formal amendment to this Constitution and a corresponding update to the constitution version.
+- **Review**: Code reviews must verify that implementation matches the preceding contracts and that no "naked code" exists outside of semantic anchors.
+- **Compliance**: Failure to adhere to the `[DEF]` / `[/DEF]` structure (including matching closing tags) constitutes a build failure.
+**Version**: 1.5.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2025-12-27
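
To make the mandated format concrete, here is a minimal sketch of a compliant Python module (identifiers are hypothetical; note that each closing anchor repeats the full opening anchor, as Principle III requires):

```python
# [DEF:backend.src.core.title_utils:Module]
# @SEMANTICS: title, normalization, example
# @PURPOSE: Illustrate the mandated module header and contract format.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.core.logger
# @INVARIANT: Stateless; no module-level mutable state.
# @CONSTRAINT: Pure functions only.
# @PUBLIC_API: normalize_title

# [DEF:normalize_title:Function]
# @PURPOSE: Normalize a dashboard title for case-insensitive comparison.
# @PRE: title is a non-empty string.
# @POST: Returns a lowercase, whitespace-trimmed string.
def normalize_title(title: str) -> str:
    # Runtime check of @PRE, mirroring the assert style used elsewhere in this repo
    assert isinstance(title, str) and title, "@PRE violated: non-empty string required"
    return title.strip().lower()
# [/DEF:normalize_title:Function]

# [/DEF:backend.src.core.title_utils:Module]
```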

check-prerequisites.sh

@@ -9,8 +9,8 @@
 #
 # OPTIONS:
 #   --json            Output in JSON format
-#   --require-tasks   Require tasks.md to exist (for implementation phase)
-#   --include-tasks   Include tasks.md in AVAILABLE_DOCS list
+#   --require-tasks   Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
+#   --include-tasks   Include task files in AVAILABLE_DOCS list
 #   --paths-only      Only output path variables (no validation)
 #   --help, -h        Show help message
 #
@@ -49,8 +49,8 @@ Consolidated prerequisite checking for Spec-Driven Development workflow.
 OPTIONS:
     --json            Output in JSON format
-    --require-tasks   Require tasks.md to exist (for implementation phase)
-    --include-tasks   Include tasks.md in AVAILABLE_DOCS list
+    --require-tasks   Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
+    --include-tasks   Include task files in AVAILABLE_DOCS list
     --paths-only      Only output path variables (no prerequisite validation)
     --help, -h        Show this help message
@@ -58,7 +58,7 @@ EXAMPLES:
     # Check task prerequisites (plan.md required)
     ./check-prerequisites.sh --json
-    # Check implementation prerequisites (plan.md + tasks.md required)
+    # Check implementation prerequisites (plan.md + task files required)
     ./check-prerequisites.sh --json --require-tasks --include-tasks
     # Get feature paths only (no validation)
@@ -86,15 +86,16 @@ check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1
 if $PATHS_ONLY; then
     if $JSON_MODE; then
         # Minimal JSON paths payload (no validation performed)
-        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
-            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
+        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS_ARCH":"%s","TASKS_DEV":"%s"}\n' \
+            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS_ARCH" "$TASKS_DEV"
     else
         echo "REPO_ROOT: $REPO_ROOT"
        echo "BRANCH: $CURRENT_BRANCH"
        echo "FEATURE_DIR: $FEATURE_DIR"
        echo "FEATURE_SPEC: $FEATURE_SPEC"
        echo "IMPL_PLAN: $IMPL_PLAN"
-        echo "TASKS: $TASKS"
+        echo "TASKS_ARCH: $TASKS_ARCH"
+        echo "TASKS_DEV: $TASKS_DEV"
     fi
     exit 0
 fi
@@ -112,11 +113,20 @@ if [[ ! -f "$IMPL_PLAN" ]]; then
     exit 1
 fi
-# Check for tasks.md if required
-if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
-    echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
-    echo "Run /speckit.tasks first to create the task list." >&2
-    exit 1
+# Check for task files if required
+if $REQUIRE_TASKS; then
+    # Check for split tasks first
+    if [[ -f "$TASKS_ARCH" ]] && [[ -f "$TASKS_DEV" ]]; then
+        : # Split tasks exist, proceed
+    # Fallback to unified tasks.md
+    elif [[ -f "$TASKS" ]]; then
+        : # Unified tasks exist, proceed
+    else
+        echo "ERROR: No valid task files found in $FEATURE_DIR" >&2
+        echo "Expected 'tasks-arch.md' AND 'tasks-dev.md' (split) OR 'tasks.md' (unified)" >&2
+        echo "Run /speckit.tasks first to create the task lists." >&2
+        exit 1
+    fi
 fi
@@ -133,9 +143,14 @@ fi
 [[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")
-# Include tasks.md if requested and it exists
-if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
-    docs+=("tasks.md")
+# Include task files if requested and they exist
+if $INCLUDE_TASKS; then
+    if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
+        [[ -f "$TASKS_ARCH" ]] && docs+=("tasks-arch.md")
+        [[ -f "$TASKS_DEV" ]] && docs+=("tasks-dev.md")
+    elif [[ -f "$TASKS" ]]; then
+        docs+=("tasks.md")
+    fi
 fi
@@ -161,6 +176,11 @@ else
     check_file "$QUICKSTART" "quickstart.md"
     if $INCLUDE_TASKS; then
-        check_file "$TASKS" "tasks.md"
+        if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
+            check_file "$TASKS_ARCH" "tasks-arch.md"
+            check_file "$TASKS_DEV" "tasks-dev.md"
+        else
+            check_file "$TASKS" "tasks.md"
+        fi
     fi
 fi


@@ -143,7 +143,9 @@ HAS_GIT='$has_git_repo'
 FEATURE_DIR='$feature_dir'
 FEATURE_SPEC='$feature_dir/spec.md'
 IMPL_PLAN='$feature_dir/plan.md'
-TASKS='$feature_dir/tasks.md'
+TASKS_ARCH='$feature_dir/tasks-arch.md'
+TASKS_DEV='$feature_dir/tasks-dev.md'
+TASKS='$feature_dir/tasks.md' # Deprecated
 RESEARCH='$feature_dir/research.md'
 DATA_MODEL='$feature_dir/data-model.md'
 QUICKSTART='$feature_dir/quickstart.md'

.specify/templates/tasks-arch-template.md Normal file

@@ -0,0 +1,35 @@
---
description: "Architecture task list template (Contracts & Scaffolding)"
---
# Architecture Tasks: [FEATURE NAME]
**Role**: Architect Agent
**Goal**: Define the "What" and "Why" (Contracts, Scaffolding, Models) before implementation.
**Input**: Design documents from `/specs/[###-feature-name]/`
**Output**: Files with `[DEF]` anchors, `@PRE`/`@POST` contracts, and `@RELATION` mappings. No business logic.
## Phase 1: Setup & Models
- [ ] A001 Create/Update data models in [path] with `[DEF]` and contracts
- [ ] A002 Define API route structure/contracts in [path]
- [ ] A003 Define shared utilities/interfaces
## Phase 2: User Story 1 - [Title]
- [ ] A004 [US1] Define contracts for [Component/Service] in [path]
- [ ] A005 [US1] Define contracts for [Endpoint] in [path]
- [ ] A006 [US1] Define contracts for [Frontend Component] in [path]
## Phase 3: User Story 2 - [Title]
- [ ] A007 [US2] Define contracts for [Component/Service] in [path]
- [ ] A008 [US2] Define contracts for [Endpoint] in [path]
## Handover Checklist
- [ ] All new files created with `[DEF]` anchors
- [ ] All functions/classes have `@PURPOSE`, `@PRE`, `@POST` tags
- [ ] No "naked code" (logic outside of anchors)
- [ ] `tasks-dev.md` is ready for the Developer Agent

.specify/templates/tasks-dev-template.md Normal file

@@ -0,0 +1,35 @@
---
description: "Developer task list template (Implementation Logic)"
---
# Developer Tasks: [FEATURE NAME]
**Role**: Developer Agent
**Goal**: Implement the "How" (Logic, State, Error Handling) inside the defined contracts.
**Input**: `tasks-arch.md` (completed), Scaffolding files with `[DEF]` anchors.
**Output**: Working code that satisfies `@PRE`/`@POST` conditions.
## Phase 1: Setup & Models
- [ ] D001 Implement logic for [Model] in [path]
- [ ] D002 Implement logic for [API Route] in [path]
- [ ] D003 Implement shared utilities
## Phase 2: User Story 1 - [Title]
- [ ] D004 [US1] Implement logic for [Component/Service] in [path]
- [ ] D005 [US1] Implement logic for [Endpoint] in [path]
- [ ] D006 [US1] Implement logic for [Frontend Component] in [path]
- [ ] D007 [US1] Verify semantic compliance and belief state logging
## Phase 3: User Story 2 - [Title]
- [ ] D008 [US2] Implement logic for [Component/Service] in [path]
- [ ] D009 [US2] Implement logic for [Endpoint] in [path]
## Polish & Quality Assurance
- [ ] DXXX Verify all tests pass
- [ ] DXXX Check error handling and edge cases
- [ ] DXXX Ensure code style compliance

.specify/templates/tasks-template.md (deleted)

@@ -1,251 +0,0 @@
---
description: "Task list template for feature implementation"
---
# Tasks: [FEATURE NAME]
**Input**: Design documents from `/specs/[###-feature-name]/`
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/
**Tests**: The examples below include test tasks. Tests are OPTIONAL - only include them if explicitly requested in the feature specification.
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
## Format: `[ID] [P?] [Story] Description`
- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions
## Path Conventions
- **Single project**: `src/`, `tests/` at repository root
- **Web app**: `backend/src/`, `frontend/src/`
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
- Paths shown below assume single project - adjust based on plan.md structure
<!--
============================================================================
IMPORTANT: The tasks below are SAMPLE TASKS for illustration purposes only.
The /speckit.tasks command MUST replace these with actual tasks based on:
- User stories from spec.md (with their priorities P1, P2, P3...)
- Feature requirements from plan.md
- Entities from data-model.md
- Endpoints from contracts/
Tasks MUST be organized by user story so each story can be:
- Implemented independently
- Tested independently
- Delivered as an MVP increment
DO NOT keep these sample tasks in the generated tasks.md file.
============================================================================
-->
## Phase 1: Setup (Shared Infrastructure)
**Purpose**: Project initialization and basic structure
- [ ] T001 Create project structure per implementation plan
- [ ] T002 Initialize [language] project with [framework] dependencies
- [ ] T003 [P] Configure linting and formatting tools
---
## Phase 2: Foundational (Blocking Prerequisites)
**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
**⚠️ CRITICAL**: No user story work can begin until this phase is complete
Examples of foundational tasks (adjust based on your project):
- [ ] T004 Setup database schema and migrations framework
- [ ] T005 [P] Implement authentication/authorization framework
- [ ] T006 [P] Setup API routing and middleware structure
- [ ] T007 Create base models/entities that all stories depend on
- [ ] T008 Configure error handling and logging infrastructure
- [ ] T009 Setup environment configuration management
**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
---
## Phase 3: User Story 1 - [Title] (Priority: P1) 🎯 MVP
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 1 (OPTIONAL - only if tests requested) ⚠️
> **NOTE: Write these tests FIRST, ensure they FAIL before implementation**
- [ ] T010 [P] [US1] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T011 [P] [US1] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 1
- [ ] T012 [P] [US1] Create [Entity1] model in src/models/[entity1].py
- [ ] T013 [P] [US1] Create [Entity2] model in src/models/[entity2].py
- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T016 [US1] Add validation and error handling
- [ ] T017 [US1] Add logging for user story 1 operations
**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently
---
## Phase 4: User Story 2 - [Title] (Priority: P2)
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 2 (OPTIONAL - only if tests requested) ⚠️
- [ ] T018 [P] [US2] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T019 [P] [US2] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 2
- [ ] T020 [P] [US2] Create [Entity] model in src/models/[entity].py
- [ ] T021 [US2] Implement [Service] in src/services/[service].py
- [ ] T022 [US2] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T023 [US2] Integrate with User Story 1 components (if needed)
**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently
---
## Phase 5: User Story 3 - [Title] (Priority: P3)
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 3 (OPTIONAL - only if tests requested) ⚠️
- [ ] T024 [P] [US3] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T025 [P] [US3] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 3
- [ ] T026 [P] [US3] Create [Entity] model in src/models/[entity].py
- [ ] T027 [US3] Implement [Service] in src/services/[service].py
- [ ] T028 [US3] Implement [endpoint/feature] in src/[location]/[file].py
**Checkpoint**: All user stories should now be independently functional
---
[Add more user story phases as needed, following the same pattern]
---
## Phase N: Polish & Cross-Cutting Concerns
**Purpose**: Improvements that affect multiple user stories
- [ ] TXXX [P] Documentation updates in docs/
- [ ] TXXX Code cleanup and refactoring
- [ ] TXXX Performance optimization across all stories
- [ ] TXXX [P] Additional unit tests (if requested) in tests/unit/
- [ ] TXXX Security hardening
- [ ] TXXX Run quickstart.md validation
---
## Dependencies & Execution Order
### Phase Dependencies
- **Setup (Phase 1)**: No dependencies - can start immediately
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
- User stories can then proceed in parallel (if staffed)
- Or sequentially in priority order (P1 → P2 → P3)
- **Polish (Final Phase)**: Depends on all desired user stories being complete
### User Story Dependencies
- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable
### Within Each User Story
- Tests (if included) MUST be written and FAIL before implementation
- Models before services
- Services before endpoints
- Core implementation before integration
- Story complete before moving to next priority
### Parallel Opportunities
- All Setup tasks marked [P] can run in parallel
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
- Once Foundational phase completes, all user stories can start in parallel (if team capacity allows)
- All tests for a user story marked [P] can run in parallel
- Models within a story marked [P] can run in parallel
- Different user stories can be worked on in parallel by different team members
---
## Parallel Example: User Story 1
```bash
# Launch all tests for User Story 1 together (if tests requested):
Task: "Contract test for [endpoint] in tests/contract/test_[name].py"
Task: "Integration test for [user journey] in tests/integration/test_[name].py"
# Launch all models for User Story 1 together:
Task: "Create [Entity1] model in src/models/[entity1].py"
Task: "Create [Entity2] model in src/models/[entity2].py"
```
---
## Implementation Strategy
### MVP First (User Story 1 Only)
1. Complete Phase 1: Setup
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
3. Complete Phase 3: User Story 1
4. **STOP and VALIDATE**: Test User Story 1 independently
5. Deploy/demo if ready
### Incremental Delivery
1. Complete Setup + Foundational → Foundation ready
2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
3. Add User Story 2 → Test independently → Deploy/Demo
4. Add User Story 3 → Test independently → Deploy/Demo
5. Each story adds value without breaking previous stories
### Parallel Team Strategy
With multiple developers:
1. Team completes Setup + Foundational together
2. Once Foundational is done:
- Developer A: User Story 1
- Developer B: User Story 2
- Developer C: User Story 3
3. Stories complete and integrate independently
---
## Notes
- [P] tasks = different files, no dependencies
- [Story] label maps task to specific user story for traceability
- Each user story should be independently completable and testable
- Verify tests fail before implementing
- Commit after each task or logical group
- Stop at any checkpoint to validate story independently
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence

Binary file not shown.

backend/migrations.db Normal file

Binary file not shown.

backend/src/api/routes/environments.py

@@ -39,6 +39,9 @@ class DatabaseResponse(BaseModel):
 @router.get("", response_model=List[EnvironmentResponse])
 async def get_environments(config_manager=Depends(get_config_manager)):
     envs = config_manager.get_environments()
+    # Ensure envs is a list
+    if not isinstance(envs, list):
+        envs = []
     return [EnvironmentResponse(id=e.id, name=e.name, url=e.url) for e in envs]
 # [/DEF:get_environments]

backend/src/api/routes/migration.py Normal file

@@ -0,0 +1,76 @@
# [DEF:backend.src.api.routes.migration:Module]
# @SEMANTICS: api, migration, dashboards
# @PURPOSE: API endpoints for migration operations.
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.dependencies
# @RELATION: DEPENDS_ON -> backend.src.models.dashboard
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict
from backend.src.dependencies import get_config_manager, get_task_manager
from backend.src.models.dashboard import DashboardMetadata, DashboardSelection
from backend.src.core.superset_client import SupersetClient
from superset_tool.models import SupersetConfig

router = APIRouter(prefix="/api", tags=["migration"])

# [DEF:get_dashboards:Function]
# @PURPOSE: Fetch all dashboards from the specified environment for the grid.
# @PRE: Environment ID must be valid.
# @POST: Returns a list of dashboard metadata.
# @PARAM: env_id (str) - The ID of the environment to fetch from.
# @RETURN: List[DashboardMetadata]
@router.get("/environments/{env_id}/dashboards", response_model=List[DashboardMetadata])
async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)):
    environments = config_manager.get_environments()
    env = next((e for e in environments if e.id == env_id), None)
    if not env:
        raise HTTPException(status_code=404, detail="Environment not found")
    config = SupersetConfig(
        env=env.name,
        base_url=env.url,
        auth={'provider': 'db', 'username': env.username, 'password': env.password, 'refresh': False},
        verify_ssl=True,
        timeout=30
    )
    client = SupersetClient(config)
    dashboards = client.get_dashboards_summary()
    return dashboards
# [/DEF:get_dashboards]

# [DEF:execute_migration:Function]
# @PURPOSE: Execute the migration of selected dashboards.
# @PRE: Selection must be valid and environments must exist.
# @POST: Starts the migration task and returns the task ID.
# @PARAM: selection (DashboardSelection) - The dashboards to migrate.
# @RETURN: Dict - {"task_id": str, "message": str}
@router.post("/migration/execute")
async def execute_migration(selection: DashboardSelection, config_manager=Depends(get_config_manager), task_manager=Depends(get_task_manager)):
    # Validate environments exist
    environments = config_manager.get_environments()
    env_ids = {e.id for e in environments}
    if selection.source_env_id not in env_ids or selection.target_env_id not in env_ids:
        raise HTTPException(status_code=400, detail="Invalid source or target environment")
    # Create migration task with debug logging
    from ...core.logger import logger
    # Include replace_db_config in the task parameters
    task_params = selection.dict()
    task_params['replace_db_config'] = selection.replace_db_config
    logger.info(f"Creating migration task with params: {task_params}")
    logger.info(f"Available environments: {env_ids}")
    logger.info(f"Source env: {selection.source_env_id}, Target env: {selection.target_env_id}")
    try:
        task = await task_manager.create_task("superset-migration", task_params)
        logger.info(f"Task created successfully: {task.id}")
        return {"task_id": task.id, "message": "Migration initiated"}
    except Exception as e:
        logger.error(f"Task creation failed: {e}")
        raise HTTPException(status_code=500, detail=f"Failed to create migration task: {str(e)}")
# [/DEF:execute_migration]
# [/DEF:backend.src.api.routes.migration]
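
A quick sketch of how a client might drive these new endpoints (the server address and the `dashboard_ids` field of `DashboardSelection` are assumptions; environment ids are illustrative):

```python
import requests  # any HTTP client works; requests is assumed installed

BASE = "http://localhost:8000"  # hypothetical local dev server

# List dashboards available in a source environment
dashboards = requests.get(f"{BASE}/api/environments/env-1/dashboards").json()

# Start a migration for a couple of them
payload = {
    "source_env_id": "env-1",
    "target_env_id": "env-2",
    "dashboard_ids": [d["id"] for d in dashboards[:2]],  # field name assumed
    "replace_db_config": False,
}
resp = requests.post(f"{BASE}/api/migration/execute", json=payload)
print(resp.json())  # e.g. {"task_id": "...", "message": "Migration initiated"}
```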

backend/src/api/routes/settings.py

@@ -16,7 +16,7 @@ from ...core.config_models import AppConfig, Environment, GlobalSettings
 from ...dependencies import get_config_manager
 from ...core.config_manager import ConfigManager
 from ...core.logger import logger
-from superset_tool.client import SupersetClient
+from ...core.superset_client import SupersetClient
 from superset_tool.models import SupersetConfig
 import os
 # [/SECTION]

backend/src/api/routes/tasks.py

@@ -3,11 +3,11 @@
 # @PURPOSE: Defines the FastAPI router for task-related endpoints, allowing clients to create, list, and get the status of tasks.
 # @LAYER: UI (API)
 # @RELATION: Depends on the TaskManager. It is included by the main app.
-from typing import List, Dict, Any
+from typing import List, Dict, Any, Optional
 from fastapi import APIRouter, Depends, HTTPException, status
 from pydantic import BaseModel
-from ...core.task_manager import TaskManager, Task
+from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
 from ...dependencies import get_task_manager
 router = APIRouter()
@@ -19,7 +19,10 @@ class CreateTaskRequest(BaseModel):
 class ResolveTaskRequest(BaseModel):
     resolution_params: Dict[str, Any]
-@router.post("/", response_model=Task, status_code=status.HTTP_201_CREATED)
+class ResumeTaskRequest(BaseModel):
+    passwords: Dict[str, str]
+@router.post("", response_model=Task, status_code=status.HTTP_201_CREATED)
 async def create_task(
     request: CreateTaskRequest,
     task_manager: TaskManager = Depends(get_task_manager)
@@ -36,14 +39,17 @@ async def create_task(
     except ValueError as e:
         raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))
-@router.get("/", response_model=List[Task])
+@router.get("", response_model=List[Task])
 async def list_tasks(
+    limit: int = 10,
+    offset: int = 0,
+    status: Optional[TaskStatus] = None,
     task_manager: TaskManager = Depends(get_task_manager)
 ):
     """
-    Retrieve a list of all tasks.
+    Retrieve a list of tasks with pagination and optional status filter.
     """
-    return task_manager.get_all_tasks()
+    return task_manager.get_tasks(limit=limit, offset=offset, status=status)
 @router.get("/{task_id}", response_model=Task)
 async def get_task(
@@ -58,6 +64,19 @@ async def get_task(
         raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
     return task
+@router.get("/{task_id}/logs", response_model=List[LogEntry])
+async def get_task_logs(
+    task_id: str,
+    task_manager: TaskManager = Depends(get_task_manager)
+):
+    """
+    Retrieve logs for a specific task.
+    """
+    task = task_manager.get_task(task_id)
+    if not task:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
+    return task_manager.get_task_logs(task_id)
 @router.post("/{task_id}/resolve", response_model=Task)
 async def resolve_task(
     task_id: str,
@@ -72,4 +91,30 @@ async def resolve_task(
         return task_manager.get_task(task_id)
     except ValueError as e:
         raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
+@router.post("/{task_id}/resume", response_model=Task)
+async def resume_task(
+    task_id: str,
+    request: ResumeTaskRequest,
+    task_manager: TaskManager = Depends(get_task_manager)
+):
+    """
+    Resume a task that is awaiting input (e.g., passwords).
+    """
+    try:
+        task_manager.resume_task_with_password(task_id, request.passwords)
+        return task_manager.get_task(task_id)
+    except ValueError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
+@router.delete("", status_code=status.HTTP_204_NO_CONTENT)
+async def clear_tasks(
+    status: Optional[TaskStatus] = None,
+    task_manager: TaskManager = Depends(get_task_manager)
+):
+    """
+    Clear tasks matching the status filter. If no filter, clears all non-running tasks.
+    """
+    task_manager.clear_tasks(status)
+    return
 # [/DEF]
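
A sketch of the reworked task API from a client's perspective (base URL assumed; the router is mounted at `/api/tasks` in app.py, and the collection routes now use `""` so the paths carry no trailing slash):

```python
import requests  # assumed available

BASE = "http://localhost:8000/api/tasks"

# Paginated listing with an optional status filter (TaskStatus values are upper-case strings)
tasks = requests.get(BASE, params={"limit": 10, "offset": 0, "status": "SUCCESS"}).json()

if tasks:
    task_id = tasks[0]["id"]
    logs = requests.get(f"{BASE}/{task_id}/logs").json()  # per-task log entries
    print(len(logs), "log entries")
    # Resume a task stuck awaiting input; the password key format is an assumption
    requests.post(f"{BASE}/{task_id}/resume", json={"passwords": {"env-2": "secret"}})

# Clear finished tasks; without a filter, all non-running tasks are cleared
requests.delete(BASE, params={"status": "FAILED"})
```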

backend/src/app.py

@@ -11,7 +11,7 @@ from pathlib import Path
 project_root = Path(__file__).resolve().parent.parent.parent
 sys.path.append(str(project_root))
-from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends
+from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
@@ -20,7 +20,7 @@ import os
 from .dependencies import get_task_manager
 from .core.logger import logger
-from .api.routes import plugins, tasks, settings, environments, mappings
+from .api.routes import plugins, tasks, settings, environments, mappings, migration
 from .core.database import init_db
 # Initialize database
@@ -45,12 +45,20 @@ app.add_middleware(
 )
+@app.middleware("http")
+async def log_requests(request: Request, call_next):
+    logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
+    response = await call_next(request)
+    logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
+    return response
 # Include API routes
 app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
 app.include_router(environments.router)
 app.include_router(mappings.router)
+app.include_router(migration.router)
 # [DEF:WebSocketEndpoint:Endpoint]
 # @SEMANTICS: websocket, logs, streaming, real-time
@@ -62,16 +70,30 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
     task_manager = get_task_manager()
     queue = await task_manager.subscribe_logs(task_id)
     try:
-        # Send initial logs if any
+        # Stream new logs
+        logger.info(f"Starting log stream for task {task_id}")
+        # Send initial logs first to build context
         initial_logs = task_manager.get_task_logs(task_id)
         for log_entry in initial_logs:
-            # Convert datetime to string for JSON serialization
             log_dict = log_entry.dict()
             log_dict['timestamp'] = log_dict['timestamp'].isoformat()
             await websocket.send_json(log_dict)
-        # Stream new logs
-        logger.info(f"Starting log stream for task {task_id}")
+        # Force a check for AWAITING_INPUT status immediately upon connection
+        # This ensures that if the task is already waiting when the user connects, they get the prompt.
+        task = task_manager.get_task(task_id)
+        if task and task.status == "AWAITING_INPUT" and task.input_request:
+            # Construct a synthetic log entry to trigger the frontend handler
+            # This is a bit of a hack but avoids changing the websocket protocol significantly
+            synthetic_log = {
+                "timestamp": task.logs[-1].timestamp.isoformat() if task.logs else "2024-01-01T00:00:00",
+                "level": "INFO",
+                "message": "Task paused for user input (Connection Re-established)",
+                "context": {"input_request": task.input_request}
+            }
+            await websocket.send_json(synthetic_log)
         while True:
             log_entry = await queue.get()
             log_dict = log_entry.dict()
@@ -83,7 +105,9 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
             if "Task completed successfully" in log_entry.message or "Task failed" in log_entry.message:
                 # Wait a bit to ensure client receives the last message
                 await asyncio.sleep(2)
-                break
+                # DO NOT BREAK here - allow client to keep connection open if they want to review logs
+                # or until they disconnect. Breaking closes the socket immediately.
+                # break
     except WebSocketDisconnect:
         logger.info(f"WebSocket connection disconnected for task {task_id}")

backend/src/core/config_manager.py

@@ -16,7 +16,7 @@ import os
 from pathlib import Path
 from typing import Optional, List
 from .config_models import AppConfig, Environment, GlobalSettings
-from .logger import logger
+from .logger import logger, configure_logger
 # [/SECTION]
 # [DEF:ConfigManager:Class]
@@ -38,10 +38,13 @@ class ConfigManager:
         # 2. Logic implementation
         self.config_path = Path(config_path)
         self.config: AppConfig = self._load_config()
+        # Configure logger with loaded settings
+        configure_logger(self.config.settings.logging)
         # 3. Runtime check of @POST
         assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"
         logger.info(f"[ConfigManager][Exit] Initialized")
     # [/DEF:__init__]
@@ -69,6 +72,8 @@ class ConfigManager:
             return config
         except Exception as e:
             logger.error(f"[_load_config][Coherence:Failed] Error loading config: {e}")
+            # Fallback but try to preserve existing settings if possible?
+            # For now, return default to be safe, but log the error prominently.
             return AppConfig(
                 environments=[],
                 settings=GlobalSettings(backup_path="backups")
@@ -120,7 +125,10 @@ class ConfigManager:
         # 2. Logic implementation
         self.config.settings = settings
         self.save()
+        # Reconfigure logger with new settings
+        configure_logger(settings.logging)
         logger.info(f"[update_global_settings][Exit] Settings updated")
     # [/DEF:update_global_settings]
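
The resulting lifecycle, sketched (config path and import root assumed): logging settings are applied once at construction and re-applied on every settings update:

```python
from backend.src.core.config_manager import ConfigManager  # import root assumed

cm = ConfigManager("config.yaml")    # __init__ -> _load_config() -> configure_logger(...)

settings = cm.config.settings
settings.logging.level = "DEBUG"
cm.update_global_settings(settings)  # save(), then configure_logger(settings.logging)
```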

backend/src/core/config_models.py

@@ -19,11 +19,27 @@ class Environment(BaseModel):
     is_default: bool = False
 # [/DEF:Environment]
+# [DEF:LoggingConfig:DataClass]
+# @PURPOSE: Defines the configuration for the application's logging system.
+class LoggingConfig(BaseModel):
+    level: str = "INFO"
+    file_path: Optional[str] = "logs/app.log"
+    max_bytes: int = 10 * 1024 * 1024
+    backup_count: int = 5
+    enable_belief_state: bool = True
+# [/DEF:LoggingConfig]
 # [DEF:GlobalSettings:DataClass]
 # @PURPOSE: Represents global application settings.
 class GlobalSettings(BaseModel):
     backup_path: str
     default_environment_id: Optional[str] = None
+    logging: LoggingConfig = Field(default_factory=LoggingConfig)
+    # Task retention settings
+    task_retention_days: int = 30
+    task_retention_limit: int = 100
+    pagination_limit: int = 10
 # [/DEF:GlobalSettings]
 # [DEF:AppConfig:DataClass]
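
A sketch of constructing the new settings models (import path assumed from the backend layout):

```python
from backend.src.core.config_models import GlobalSettings, LoggingConfig

settings = GlobalSettings(
    backup_path="backups",
    logging=LoggingConfig(
        level="DEBUG",
        file_path="logs/app.log",
        max_bytes=5 * 1024 * 1024,  # rotate at 5 MB instead of the 10 MB default
        backup_count=3,
        enable_belief_state=True,
    ),
    task_retention_days=14,         # override the 30-day default
)
print(settings.logging.level)       # "DEBUG"
```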

backend/src/core/logger.py

@@ -4,12 +4,32 @@
 # @LAYER: Core
 # @RELATION: Used by the main application and other modules to log events. The WebSocketLogHandler is used by the WebSocket endpoint in app.py.
 import logging
+import threading
 from datetime import datetime
 from typing import Dict, Any, List, Optional
 from collections import deque
+from contextlib import contextmanager
+from logging.handlers import RotatingFileHandler
 from pydantic import BaseModel, Field
+# Thread-local storage for belief state
+_belief_state = threading.local()
+# Global flag for belief state logging
+_enable_belief_state = True
+# [DEF:BeliefFormatter:Class]
+# @PURPOSE: Custom logging formatter that adds belief state prefixes to log messages.
+class BeliefFormatter(logging.Formatter):
+    def format(self, record):
+        msg = super().format(record)
+        anchor_id = getattr(_belief_state, 'anchor_id', None)
+        if anchor_id:
+            msg = f"[{anchor_id}][Action] {msg}"
+        return msg
+# [/DEF:BeliefFormatter]
 # Re-using LogEntry from task_manager for consistency
 # [DEF:LogEntry:Class]
 # @SEMANTICS: log, entry, record, pydantic
@@ -22,6 +42,81 @@ class LogEntry(BaseModel):
 # [/DEF]
+# [DEF:BeliefScope:Function]
+# @PURPOSE: Context manager for structured Belief State logging.
+@contextmanager
+def belief_scope(anchor_id: str, message: str = ""):
+    # Log Entry if enabled
+    if _enable_belief_state:
+        entry_msg = f"[{anchor_id}][Entry]"
+        if message:
+            entry_msg += f" {message}"
+        logger.info(entry_msg)
+    # Set thread-local anchor_id
+    old_anchor = getattr(_belief_state, 'anchor_id', None)
+    _belief_state.anchor_id = anchor_id
+    try:
+        yield
+        # Log Coherence OK and Exit
+        logger.info(f"[{anchor_id}][Coherence:OK]")
+        if _enable_belief_state:
+            logger.info(f"[{anchor_id}][Exit]")
+    except Exception as e:
+        # Log Coherence Failed
+        logger.info(f"[{anchor_id}][Coherence:Failed] {str(e)}")
+        raise
+    finally:
+        # Restore old anchor
+        _belief_state.anchor_id = old_anchor
+# [/DEF:BeliefScope]
+# [DEF:ConfigureLogger:Function]
+# @PURPOSE: Configures the logger with the provided logging settings.
+# @PRE: config is a valid LoggingConfig instance.
+# @POST: Logger level, handlers, and belief state flag are updated.
+# @PARAM: config (LoggingConfig) - The logging configuration.
+def configure_logger(config):
+    global _enable_belief_state
+    _enable_belief_state = config.enable_belief_state
+    # Set logger level
+    level = getattr(logging, config.level.upper(), logging.INFO)
+    logger.setLevel(level)
+    # Remove existing file handlers
+    handlers_to_remove = [h for h in logger.handlers if isinstance(h, RotatingFileHandler)]
+    for h in handlers_to_remove:
+        logger.removeHandler(h)
+        h.close()
+    # Add file handler if file_path is set
+    if config.file_path:
+        import os
+        from pathlib import Path
+        log_file = Path(config.file_path)
+        log_file.parent.mkdir(parents=True, exist_ok=True)
+        file_handler = RotatingFileHandler(
+            config.file_path,
+            maxBytes=config.max_bytes,
+            backupCount=config.backup_count
+        )
+        file_handler.setFormatter(BeliefFormatter(
+            '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
+        ))
+        logger.addHandler(file_handler)
+    # Update existing handlers' formatters to BeliefFormatter
+    for handler in logger.handlers:
+        if not isinstance(handler, RotatingFileHandler):
+            handler.setFormatter(BeliefFormatter(
+                '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
+            ))
+# [/DEF:ConfigureLogger]
 # [DEF:WebSocketLogHandler:Class]
 # @SEMANTICS: logging, handler, websocket, buffer
 # @PURPOSE: A custom logging handler that captures log records into a buffer. It is designed to be extended for real-time log streaming over WebSockets.
@@ -72,7 +167,7 @@ logger = logging.getLogger("superset_tools_app")
 logger.setLevel(logging.INFO)
 # Create a formatter
-formatter = logging.Formatter(
+formatter = BeliefFormatter(
     '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
 )
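
A usage sketch of the new helpers (import paths assumed from the backend layout):

```python
from backend.src.core.config_models import LoggingConfig
from backend.src.core.logger import logger, belief_scope, configure_logger

configure_logger(LoggingConfig(level="INFO", file_path=None))  # console-only setup

with belief_scope("ExampleService.do_work", "starting batch"):
    # While the scope is active, BeliefFormatter prefixes records with
    # "[ExampleService.do_work][Action] ..." via the thread-local anchor.
    logger.info("processing item 1")
# Entry/Exit markers are gated by enable_belief_state; Coherence:OK is logged on a
# clean exit and Coherence:Failed (with the error) when an exception propagates.
```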

backend/src/core/migration_engine.py

@@ -15,6 +15,8 @@ import shutil
 import tempfile
 from pathlib import Path
 from typing import Dict
+from .logger import logger, belief_scope
+import yaml
 # [/SECTION]
 # [DEF:MigrationEngine:Class]
@@ -26,37 +28,51 @@ class MigrationEngine:
     # @PARAM: zip_path (str) - Path to the source ZIP file.
     # @PARAM: output_path (str) - Path where the transformed ZIP will be saved.
     # @PARAM: db_mapping (Dict[str, str]) - Mapping of source UUID to target UUID.
+    # @PARAM: strip_databases (bool) - Whether to remove the databases directory from the archive.
     # @RETURN: bool - True if successful.
-    def transform_zip(self, zip_path: str, output_path: str, db_mapping: Dict[str, str]) -> bool:
+    def transform_zip(self, zip_path: str, output_path: str, db_mapping: Dict[str, str], strip_databases: bool = True) -> bool:
         """
         Transform a Superset export ZIP by replacing database UUIDs.
         """
-        with tempfile.TemporaryDirectory() as temp_dir_str:
-            temp_dir = Path(temp_dir_str)
-            try:
-                # 1. Extract
-                with zipfile.ZipFile(zip_path, 'r') as zf:
-                    zf.extractall(temp_dir)
-                # 2. Transform YAMLs
-                # Datasets are usually in datasets/*.yaml
-                dataset_files = list(temp_dir.glob("**/datasets/*.yaml"))
-                for ds_file in dataset_files:
-                    self._transform_yaml(ds_file, db_mapping)
-                # 3. Re-package
-                with zipfile.ZipFile(output_path, 'w', zipfile.ZIP_DEFLATED) as zf:
-                    for root, dirs, files in os.walk(temp_dir):
-                        for file in files:
-                            file_path = Path(root) / file
-                            arcname = file_path.relative_to(temp_dir)
-                            zf.write(file_path, arcname)
-                return True
-            except Exception as e:
-                print(f"Error transforming ZIP: {e}")
-                return False
+        with belief_scope("MigrationEngine.transform_zip"):
+            with tempfile.TemporaryDirectory() as temp_dir_str:
+                temp_dir = Path(temp_dir_str)
+                try:
+                    # 1. Extract
+                    logger.info(f"[MigrationEngine.transform_zip][Action] Extracting ZIP: {zip_path}")
+                    with zipfile.ZipFile(zip_path, 'r') as zf:
+                        zf.extractall(temp_dir)
+                    # 2. Transform YAMLs
+                    # Datasets are usually in datasets/*.yaml
+                    dataset_files = list(temp_dir.glob("**/datasets/**/*.yaml")) + list(temp_dir.glob("**/datasets/*.yaml"))
+                    dataset_files = list(set(dataset_files))
+                    logger.info(f"[MigrationEngine.transform_zip][State] Found {len(dataset_files)} dataset files.")
+                    for ds_file in dataset_files:
+                        logger.info(f"[MigrationEngine.transform_zip][Action] Transforming dataset: {ds_file}")
+                        self._transform_yaml(ds_file, db_mapping)
+                    # 3. Re-package
+                    logger.info(f"[MigrationEngine.transform_zip][Action] Re-packaging ZIP to: {output_path} (strip_databases={strip_databases})")
+                    with zipfile.ZipFile(output_path, 'w', zipfile.ZIP_DEFLATED) as zf:
+                        for root, dirs, files in os.walk(temp_dir):
+                            rel_root = Path(root).relative_to(temp_dir)
+                            if strip_databases and "databases" in rel_root.parts:
+                                logger.info(f"[MigrationEngine.transform_zip][Action] Skipping file in databases directory: {rel_root}")
+                                continue
+                            for file in files:
+                                file_path = Path(root) / file
+                                arcname = file_path.relative_to(temp_dir)
+                                zf.write(file_path, arcname)
+                    return True
+                except Exception as e:
+                    logger.error(f"[MigrationEngine.transform_zip][Coherence:Failed] Error transforming ZIP: {e}")
+                    return False
 # [DEF:MigrationEngine._transform_yaml:Function]
 # @PURPOSE: Replaces database_uuid in a single YAML file.
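
Driving the reworked method, sketched (the engine's module path and no-argument constructor are assumptions; paths and UUIDs are illustrative):

```python
from backend.src.core.migration_engine import MigrationEngine  # module path assumed

engine = MigrationEngine()  # constructor assumed to take no required arguments
ok = engine.transform_zip(
    zip_path="backups/dashboard_export.zip",
    output_path="backups/dashboard_export.transformed.zip",
    db_mapping={
        # source database UUID -> target database UUID
        "aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa": "bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb",
    },
    strip_databases=True,  # drop databases/ so the target keeps its own connections
)
assert ok, "transform failed; see [MigrationEngine.transform_zip][Coherence:Failed] in the logs"
```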


@@ -47,12 +47,17 @@ class PluginLoader:
         Loads a single Python module and extracts PluginBase subclasses.
         """
         # Try to determine the correct package prefix based on how the app is running
-        if "backend.src" in __name__:
+        # For standalone execution, we need to handle the import differently
+        if __name__ == "__main__" or "test" in __name__:
+            # When running as standalone or in tests, use relative import
+            package_name = f"plugins.{module_name}"
+        elif "backend.src" in __name__:
             package_prefix = "backend.src.plugins"
+            package_name = f"{package_prefix}.{module_name}"
         else:
             package_prefix = "src.plugins"
-        package_name = f"{package_prefix}.{module_name}"
+            package_name = f"{package_prefix}.{module_name}"
         # print(f"DEBUG: Loading plugin {module_name} as {package_name}")
         spec = importlib.util.spec_from_file_location(package_name, file_path)
         if spec is None or spec.loader is None:
@@ -106,9 +111,11 @@ class PluginLoader:
             # validate(instance={}, schema=schema)
             self._plugins[plugin_id] = plugin_instance
             self._plugin_configs[plugin_id] = plugin_config
-            print(f"Plugin '{plugin_instance.name}' (ID: {plugin_id}) loaded successfully.")  # Replace with proper logging
+            from ..core.logger import logger
+            logger.info(f"Plugin '{plugin_instance.name}' (ID: {plugin_id}) loaded successfully.")
         except Exception as e:
-            print(f"Error validating plugin '{plugin_instance.name}' (ID: {plugin_id}): {e}")  # Replace with proper logging
+            from ..core.logger import logger
+            logger.error(f"Error validating plugin '{plugin_instance.name}' (ID: {plugin_id}): {e}")
     def get_plugin(self, plugin_id: str) -> Optional[PluginBase]:

View File

@@ -52,6 +52,32 @@ class SupersetClient(BaseSupersetClient):
         return databases[0] if databases else None
     # [/DEF:SupersetClient.get_database_by_uuid]

+    # [DEF:SupersetClient.get_dashboards_summary:Function]
+    # @PURPOSE: Fetches dashboard metadata optimized for the grid.
+    # @POST: Returns a list of dashboard dictionaries.
+    # @RETURN: List[Dict]
+    def get_dashboards_summary(self) -> List[Dict]:
+        """
+        Fetches dashboard metadata optimized for the grid.
+        Returns a list of dictionaries mapped to DashboardMetadata fields.
+        """
+        query = {
+            "columns": ["id", "dashboard_title", "changed_on_utc", "published"]
+        }
+        _, dashboards = self.get_dashboards(query=query)
+        # Map fields to DashboardMetadata schema
+        result = []
+        for dash in dashboards:
+            result.append({
+                "id": dash.get("id"),
+                "title": dash.get("dashboard_title"),
+                "last_modified": dash.get("changed_on_utc"),
+                "status": "published" if dash.get("published") else "draft"
+            })
+        return result
+    # [/DEF:SupersetClient.get_dashboards_summary]
+
 # [/DEF:SupersetClient]
 # [/DEF:backend.src.core.superset_client]
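A minimal consumption sketch (client construction is omitted and assumed to be an authenticated SupersetClient):

    for dash in client.get_dashboards_summary():
        # Keys match the DashboardMetadata model: id, title, last_modified, status
        print(dash["id"], dash["title"], dash["status"])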

View File

@@ -1,203 +0,0 @@
# [DEF:TaskManagerModule:Module]
# @SEMANTICS: task, manager, lifecycle, execution, state
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking. It uses a thread pool to run plugins asynchronously.
# @LAYER: Core
# @RELATION: Depends on PluginLoader to get plugin instances. It is used by the API layer to create and query tasks.
import asyncio
import uuid
from datetime import datetime
from enum import Enum
from typing import Dict, Any, List, Optional
from concurrent.futures import ThreadPoolExecutor
from pydantic import BaseModel, Field
# Assuming PluginBase and PluginConfig are defined in plugin_base.py
# from .plugin_base import PluginBase, PluginConfig # Not needed here, TaskManager interacts with the PluginLoader
# [DEF:TaskStatus:Enum]
# @SEMANTICS: task, status, state, enum
# @PURPOSE: Defines the possible states a task can be in during its lifecycle.
class TaskStatus(str, Enum):
PENDING = "PENDING"
RUNNING = "RUNNING"
SUCCESS = "SUCCESS"
FAILED = "FAILED"
AWAITING_MAPPING = "AWAITING_MAPPING"
# [/DEF]
# [DEF:LogEntry:Class]
# @SEMANTICS: log, entry, record, pydantic
# @PURPOSE: A Pydantic model representing a single, structured log entry associated with a task.
class LogEntry(BaseModel):
timestamp: datetime = Field(default_factory=datetime.utcnow)
level: str
message: str
context: Optional[Dict[str, Any]] = None
# [/DEF]
# [DEF:Task:Class]
# @SEMANTICS: task, job, execution, state, pydantic
# @PURPOSE: A Pydantic model representing a single execution instance of a plugin, including its status, parameters, and logs.
class Task(BaseModel):
id: str = Field(default_factory=lambda: str(uuid.uuid4()))
plugin_id: str
status: TaskStatus = TaskStatus.PENDING
started_at: Optional[datetime] = None
finished_at: Optional[datetime] = None
user_id: Optional[str] = None
logs: List[LogEntry] = Field(default_factory=list)
params: Dict[str, Any] = Field(default_factory=dict)
# [/DEF]
# [DEF:TaskManager:Class]
# @SEMANTICS: task, manager, lifecycle, execution, state
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking.
class TaskManager:
"""
Manages the lifecycle of tasks, including their creation, execution, and state tracking.
"""
def __init__(self, plugin_loader):
self.plugin_loader = plugin_loader
self.tasks: Dict[str, Task] = {}
self.subscribers: Dict[str, List[asyncio.Queue]] = {}
self.executor = ThreadPoolExecutor(max_workers=5) # For CPU-bound plugin execution
self.loop = asyncio.get_event_loop()
self.task_futures: Dict[str, asyncio.Future] = {}
# [/DEF]
async def create_task(self, plugin_id: str, params: Dict[str, Any], user_id: Optional[str] = None) -> Task:
"""
Creates and queues a new task for execution.
"""
if not self.plugin_loader.has_plugin(plugin_id):
raise ValueError(f"Plugin with ID '{plugin_id}' not found.")
plugin = self.plugin_loader.get_plugin(plugin_id)
# Validate params against plugin schema (this will be done at a higher level, e.g., API route)
# For now, a basic check
if not isinstance(params, dict):
raise ValueError("Task parameters must be a dictionary.")
task = Task(plugin_id=plugin_id, params=params, user_id=user_id)
self.tasks[task.id] = task
self.loop.create_task(self._run_task(task.id)) # Schedule task for execution
return task
async def _run_task(self, task_id: str):
"""
Internal method to execute a task.
"""
task = self.tasks[task_id]
plugin = self.plugin_loader.get_plugin(task.plugin_id)
task.status = TaskStatus.RUNNING
task.started_at = datetime.utcnow()
self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'")
try:
# Execute plugin in a separate thread to avoid blocking the event loop
# if the plugin's execute method is synchronous and potentially CPU-bound.
# If the plugin's execute method is already async, this can be simplified.
# Pass task_id to plugin so it can signal pause
params = {**task.params, "_task_id": task_id}
await self.loop.run_in_executor(
self.executor,
lambda: asyncio.run(plugin.execute(params)) if asyncio.iscoroutinefunction(plugin.execute) else plugin.execute(params)
)
task.status = TaskStatus.SUCCESS
self._add_log(task_id, "INFO", f"Task completed successfully for plugin '{plugin.name}'")
except Exception as e:
task.status = TaskStatus.FAILED
self._add_log(task_id, "ERROR", f"Task failed: {e}", {"error_type": type(e).__name__})
finally:
task.finished_at = datetime.utcnow()
# In a real system, you might notify clients via WebSocket here
async def resolve_task(self, task_id: str, resolution_params: Dict[str, Any]):
"""
Resumes a task that is awaiting mapping.
"""
task = self.tasks.get(task_id)
if not task or task.status != TaskStatus.AWAITING_MAPPING:
raise ValueError("Task is not awaiting mapping.")
# Update task params with resolution
task.params.update(resolution_params)
task.status = TaskStatus.RUNNING
self._add_log(task_id, "INFO", "Task resumed after mapping resolution.")
# Signal the future to continue
if task_id in self.task_futures:
self.task_futures[task_id].set_result(True)
async def wait_for_resolution(self, task_id: str):
"""
Pauses execution and waits for a resolution signal.
"""
task = self.tasks.get(task_id)
if not task: return
task.status = TaskStatus.AWAITING_MAPPING
self.task_futures[task_id] = self.loop.create_future()
try:
await self.task_futures[task_id]
finally:
del self.task_futures[task_id]
def get_task(self, task_id: str) -> Optional[Task]:
"""
Retrieves a task by its ID.
"""
return self.tasks.get(task_id)
def get_all_tasks(self) -> List[Task]:
"""
Retrieves all registered tasks.
"""
return list(self.tasks.values())
def get_task_logs(self, task_id: str) -> List[LogEntry]:
"""
Retrieves logs for a specific task.
"""
task = self.tasks.get(task_id)
return task.logs if task else []
def _add_log(self, task_id: str, level: str, message: str, context: Optional[Dict[str, Any]] = None):
"""
Adds a log entry to a task and notifies subscribers.
"""
task = self.tasks.get(task_id)
if not task:
return
log_entry = LogEntry(level=level, message=message, context=context)
task.logs.append(log_entry)
# Notify subscribers
if task_id in self.subscribers:
for queue in self.subscribers[task_id]:
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
async def subscribe_logs(self, task_id: str) -> asyncio.Queue:
"""
Subscribes to real-time logs for a task.
"""
queue = asyncio.Queue()
if task_id not in self.subscribers:
self.subscribers[task_id] = []
self.subscribers[task_id].append(queue)
return queue
def unsubscribe_logs(self, task_id: str, queue: asyncio.Queue):
"""
Unsubscribes from real-time logs for a task.
"""
if task_id in self.subscribers:
self.subscribers[task_id].remove(queue)
if not self.subscribers[task_id]:
del self.subscribers[task_id]

View File

@@ -0,0 +1,12 @@
# [DEF:TaskManagerPackage:Module]
# @SEMANTICS: task, manager, package, exports
# @PURPOSE: Exports the public API of the task manager package.
# @LAYER: Core
# @RELATION: Aggregates models and manager.
from .models import Task, TaskStatus, LogEntry
from .manager import TaskManager
__all__ = ["TaskManager", "Task", "TaskStatus", "LogEntry"]
# [/DEF:TaskManagerPackage:Module]

View File

@@ -0,0 +1,379 @@
# [DEF:TaskManagerModule:Module]
# @SEMANTICS: task, manager, lifecycle, execution, state
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking. It uses a thread pool to run plugins asynchronously.
# @LAYER: Core
# @RELATION: Depends on PluginLoader to get plugin instances. It is used by the API layer to create and query tasks.
# @INVARIANT: Task IDs are unique.
# @CONSTRAINT: Must use belief_scope for logging.
# [SECTION: IMPORTS]
import asyncio
from datetime import datetime
from typing import Dict, Any, List, Optional
from concurrent.futures import ThreadPoolExecutor
from .models import Task, TaskStatus, LogEntry
from .persistence import TaskPersistenceService
from ..logger import logger, belief_scope
# [/SECTION]
# [DEF:TaskManager:Class]
# @SEMANTICS: task, manager, lifecycle, execution, state
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking.
class TaskManager:
"""
Manages the lifecycle of tasks, including their creation, execution, and state tracking.
"""
# [DEF:TaskManager.__init__:Function]
# @PURPOSE: Initialize the TaskManager with dependencies.
# @PRE: plugin_loader is initialized.
# @POST: TaskManager is ready to accept tasks.
# @PARAM: plugin_loader - The plugin loader instance.
def __init__(self, plugin_loader):
with belief_scope("TaskManager.__init__"):
self.plugin_loader = plugin_loader
self.tasks: Dict[str, Task] = {}
self.subscribers: Dict[str, List[asyncio.Queue]] = {}
self.executor = ThreadPoolExecutor(max_workers=5) # For CPU-bound plugin execution
self.persistence_service = TaskPersistenceService()
try:
self.loop = asyncio.get_running_loop()
except RuntimeError:
self.loop = asyncio.get_event_loop()
self.task_futures: Dict[str, asyncio.Future] = {}
# Load persisted tasks on startup
self.load_persisted_tasks()
# [/DEF:TaskManager.__init__:Function]
# [DEF:TaskManager.create_task:Function]
# @PURPOSE: Creates and queues a new task for execution.
# @PRE: Plugin with plugin_id exists. Params are valid.
# @POST: Task is created, added to registry, and scheduled for execution.
# @PARAM: plugin_id (str) - The ID of the plugin to run.
# @PARAM: params (Dict[str, Any]) - Parameters for the plugin.
# @PARAM: user_id (Optional[str]) - ID of the user requesting the task.
# @RETURN: Task - The created task instance.
# @THROWS: ValueError if plugin not found or params invalid.
async def create_task(self, plugin_id: str, params: Dict[str, Any], user_id: Optional[str] = None) -> Task:
with belief_scope("TaskManager.create_task", f"plugin_id={plugin_id}"):
if not self.plugin_loader.has_plugin(plugin_id):
logger.error(f"Plugin with ID '{plugin_id}' not found.")
raise ValueError(f"Plugin with ID '{plugin_id}' not found.")
plugin = self.plugin_loader.get_plugin(plugin_id)
if not isinstance(params, dict):
logger.error("Task parameters must be a dictionary.")
raise ValueError("Task parameters must be a dictionary.")
task = Task(plugin_id=plugin_id, params=params, user_id=user_id)
self.tasks[task.id] = task
logger.info(f"Task {task.id} created and scheduled for execution")
self.loop.create_task(self._run_task(task.id)) # Schedule task for execution
return task
# [/DEF:TaskManager.create_task:Function]
# [DEF:TaskManager._run_task:Function]
# @PURPOSE: Internal method to execute a task.
# @PRE: Task exists in registry.
# @POST: Task is executed, status updated to SUCCESS or FAILED.
# @PARAM: task_id (str) - The ID of the task to run.
async def _run_task(self, task_id: str):
with belief_scope("TaskManager._run_task", f"task_id={task_id}"):
task = self.tasks[task_id]
plugin = self.plugin_loader.get_plugin(task.plugin_id)
logger.info(f"Starting execution of task {task_id} for plugin '{plugin.name}'")
task.status = TaskStatus.RUNNING
task.started_at = datetime.utcnow()
self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'")
try:
# Execute plugin
params = {**task.params, "_task_id": task_id}
if asyncio.iscoroutinefunction(plugin.execute):
await plugin.execute(params)
else:
await self.loop.run_in_executor(
self.executor,
plugin.execute,
params
)
logger.info(f"Task {task_id} completed successfully")
task.status = TaskStatus.SUCCESS
self._add_log(task_id, "INFO", f"Task completed successfully for plugin '{plugin.name}'")
except Exception as e:
logger.error(f"Task {task_id} failed: {e}")
task.status = TaskStatus.FAILED
self._add_log(task_id, "ERROR", f"Task failed: {e}", {"error_type": type(e).__name__})
finally:
task.finished_at = datetime.utcnow()
logger.info(f"Task {task_id} execution finished with status: {task.status}")
# [/DEF:TaskManager._run_task:Function]
# [DEF:TaskManager.resolve_task:Function]
# @PURPOSE: Resumes a task that is awaiting mapping.
# @PRE: Task exists and is in AWAITING_MAPPING state.
# @POST: Task status updated to RUNNING, params updated, execution resumed.
# @PARAM: task_id (str) - The ID of the task.
# @PARAM: resolution_params (Dict[str, Any]) - Params to resolve the wait.
# @THROWS: ValueError if task not found or not awaiting mapping.
async def resolve_task(self, task_id: str, resolution_params: Dict[str, Any]):
with belief_scope("TaskManager.resolve_task", f"task_id={task_id}"):
task = self.tasks.get(task_id)
if not task or task.status != TaskStatus.AWAITING_MAPPING:
raise ValueError("Task is not awaiting mapping.")
# Update task params with resolution
task.params.update(resolution_params)
task.status = TaskStatus.RUNNING
self._add_log(task_id, "INFO", "Task resumed after mapping resolution.")
# Signal the future to continue
if task_id in self.task_futures:
self.task_futures[task_id].set_result(True)
# [/DEF:TaskManager.resolve_task:Function]
# [DEF:TaskManager.wait_for_resolution:Function]
# @PURPOSE: Pauses execution and waits for a resolution signal.
# @PRE: Task exists.
# @POST: Execution pauses until future is set.
# @PARAM: task_id (str) - The ID of the task.
async def wait_for_resolution(self, task_id: str):
with belief_scope("TaskManager.wait_for_resolution", f"task_id={task_id}"):
task = self.tasks.get(task_id)
if not task: return
task.status = TaskStatus.AWAITING_MAPPING
self.task_futures[task_id] = self.loop.create_future()
try:
await self.task_futures[task_id]
finally:
if task_id in self.task_futures:
del self.task_futures[task_id]
# [/DEF:TaskManager.wait_for_resolution:Function]
# [DEF:TaskManager.wait_for_input:Function]
# @PURPOSE: Pauses execution and waits for user input.
# @PRE: Task exists.
# @POST: Execution pauses until future is set via resume_task_with_password.
# @PARAM: task_id (str) - The ID of the task.
async def wait_for_input(self, task_id: str):
with belief_scope("TaskManager.wait_for_input", f"task_id={task_id}"):
task = self.tasks.get(task_id)
if not task: return
# Status is already set to AWAITING_INPUT by await_input()
self.task_futures[task_id] = self.loop.create_future()
try:
await self.task_futures[task_id]
finally:
if task_id in self.task_futures:
del self.task_futures[task_id]
# [/DEF:TaskManager.wait_for_input:Function]
# [DEF:TaskManager.get_task:Function]
# @PURPOSE: Retrieves a task by its ID.
# @PARAM: task_id (str) - ID of the task.
# @RETURN: Optional[Task] - The task or None.
def get_task(self, task_id: str) -> Optional[Task]:
return self.tasks.get(task_id)
# [/DEF:TaskManager.get_task:Function]
# [DEF:TaskManager.get_all_tasks:Function]
# @PURPOSE: Retrieves all registered tasks.
# @RETURN: List[Task] - All tasks.
def get_all_tasks(self) -> List[Task]:
return list(self.tasks.values())
# [/DEF:TaskManager.get_all_tasks:Function]
# [DEF:TaskManager.get_tasks:Function]
# @PURPOSE: Retrieves tasks with pagination and optional status filter.
# @PRE: limit and offset are non-negative integers.
# @POST: Returns a list of tasks sorted by start_time descending.
# @PARAM: limit (int) - Maximum number of tasks to return.
# @PARAM: offset (int) - Number of tasks to skip.
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
# @RETURN: List[Task] - List of tasks matching criteria.
def get_tasks(self, limit: int = 10, offset: int = 0, status: Optional[TaskStatus] = None) -> List[Task]:
tasks = list(self.tasks.values())
if status:
tasks = [t for t in tasks if t.status == status]
# Sort by start_time descending (most recent first)
tasks.sort(key=lambda t: t.started_at or datetime.min, reverse=True)
return tasks[offset:offset + limit]
# [/DEF:TaskManager.get_tasks:Function]
# [DEF:TaskManager.get_task_logs:Function]
# @PURPOSE: Retrieves logs for a specific task.
# @PARAM: task_id (str) - ID of the task.
# @RETURN: List[LogEntry] - List of log entries.
def get_task_logs(self, task_id: str) -> List[LogEntry]:
task = self.tasks.get(task_id)
return task.logs if task else []
# [/DEF:TaskManager.get_task_logs:Function]
# [DEF:TaskManager._add_log:Function]
# @PURPOSE: Adds a log entry to a task and notifies subscribers.
# @PRE: Task exists.
# @POST: Log added to task and pushed to queues.
# @PARAM: task_id (str) - ID of the task.
# @PARAM: level (str) - Log level.
# @PARAM: message (str) - Log message.
# @PARAM: context (Optional[Dict]) - Log context.
def _add_log(self, task_id: str, level: str, message: str, context: Optional[Dict[str, Any]] = None):
task = self.tasks.get(task_id)
if not task:
return
log_entry = LogEntry(level=level, message=message, context=context)
task.logs.append(log_entry)
# Notify subscribers
if task_id in self.subscribers:
for queue in self.subscribers[task_id]:
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
# [/DEF:TaskManager._add_log:Function]
# [DEF:TaskManager.subscribe_logs:Function]
# @PURPOSE: Subscribes to real-time logs for a task.
# @PARAM: task_id (str) - ID of the task.
# @RETURN: asyncio.Queue - Queue for log entries.
async def subscribe_logs(self, task_id: str) -> asyncio.Queue:
queue = asyncio.Queue()
if task_id not in self.subscribers:
self.subscribers[task_id] = []
self.subscribers[task_id].append(queue)
return queue
# [/DEF:TaskManager.subscribe_logs:Function]
# [DEF:TaskManager.unsubscribe_logs:Function]
# @PURPOSE: Unsubscribes from real-time logs for a task.
# @PARAM: task_id (str) - ID of the task.
# @PARAM: queue (asyncio.Queue) - Queue to remove.
def unsubscribe_logs(self, task_id: str, queue: asyncio.Queue):
if task_id in self.subscribers:
if queue in self.subscribers[task_id]:
self.subscribers[task_id].remove(queue)
if not self.subscribers[task_id]:
del self.subscribers[task_id]
# [/DEF:TaskManager.unsubscribe_logs:Function]
# [DEF:TaskManager.persist_awaiting_input_tasks:Function]
# @PURPOSE: Persist tasks in AWAITING_INPUT state using persistence service.
def persist_awaiting_input_tasks(self) -> None:
self.persistence_service.persist_tasks(list(self.tasks.values()))
# [/DEF:TaskManager.persist_awaiting_input_tasks:Function]
# [DEF:TaskManager.load_persisted_tasks:Function]
# @PURPOSE: Load persisted tasks using persistence service.
def load_persisted_tasks(self) -> None:
loaded_tasks = self.persistence_service.load_tasks()
for task in loaded_tasks:
if task.id not in self.tasks:
self.tasks[task.id] = task
# [/DEF:TaskManager.load_persisted_tasks:Function]
# [DEF:TaskManager.await_input:Function]
# @PURPOSE: Transition a task to AWAITING_INPUT state with input request.
# @PRE: Task exists and is in RUNNING state.
# @POST: Task status changed to AWAITING_INPUT, input_request set, persisted.
# @PARAM: task_id (str) - ID of the task.
# @PARAM: input_request (Dict) - Details about required input.
# @THROWS: ValueError if task not found or not RUNNING.
def await_input(self, task_id: str, input_request: Dict[str, Any]) -> None:
with belief_scope("TaskManager.await_input", f"task_id={task_id}"):
task = self.tasks.get(task_id)
if not task:
raise ValueError(f"Task {task_id} not found")
if task.status != TaskStatus.RUNNING:
raise ValueError(f"Task {task_id} is not RUNNING (current: {task.status})")
task.status = TaskStatus.AWAITING_INPUT
task.input_required = True
task.input_request = input_request
self._add_log(task_id, "INFO", "Task paused for user input", {"input_request": input_request})
self.persist_awaiting_input_tasks()
# [/DEF:TaskManager.await_input:Function]
# [DEF:TaskManager.resume_task_with_password:Function]
# @PURPOSE: Resume a task that is awaiting input with provided passwords.
# @PRE: Task exists and is in AWAITING_INPUT state.
# @POST: Task status changed to RUNNING, passwords injected, task resumed.
# @PARAM: task_id (str) - ID of the task.
# @PARAM: passwords (Dict[str, str]) - Mapping of database name to password.
# @THROWS: ValueError if task not found, not awaiting input, or passwords invalid.
def resume_task_with_password(self, task_id: str, passwords: Dict[str, str]) -> None:
with belief_scope("TaskManager.resume_task_with_password", f"task_id={task_id}"):
task = self.tasks.get(task_id)
if not task:
raise ValueError(f"Task {task_id} not found")
if task.status != TaskStatus.AWAITING_INPUT:
raise ValueError(f"Task {task_id} is not AWAITING_INPUT (current: {task.status})")
if not isinstance(passwords, dict) or not passwords:
raise ValueError("Passwords must be a non-empty dictionary")
task.params["passwords"] = passwords
task.input_required = False
task.input_request = None
task.status = TaskStatus.RUNNING
self._add_log(task_id, "INFO", "Task resumed with passwords", {"databases": list(passwords.keys())})
if task_id in self.task_futures:
self.task_futures[task_id].set_result(True)
# Remove from persistence as it's no longer awaiting input
self.persistence_service.delete_tasks([task_id])
# [/DEF:TaskManager.resume_task_with_password:Function]
# [DEF:TaskManager.clear_tasks:Function]
# @PURPOSE: Clears tasks based on status filter.
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
# @RETURN: int - Number of tasks cleared.
def clear_tasks(self, status: Optional[TaskStatus] = None) -> int:
with belief_scope("TaskManager.clear_tasks"):
tasks_to_remove = []
for task_id, task in list(self.tasks.items()):
                # If a status filter is provided, remove only tasks with that status.
                # Otherwise remove everything except active tasks: RUNNING (executing)
                # and AWAITING_INPUT / AWAITING_MAPPING (paused, waiting on the user).
should_remove = False
if status:
if task.status == status:
should_remove = True
else:
# Clear all non-active tasks (keep RUNNING, AWAITING_INPUT, AWAITING_MAPPING)
if task.status not in [TaskStatus.RUNNING, TaskStatus.AWAITING_INPUT, TaskStatus.AWAITING_MAPPING]:
should_remove = True
if should_remove:
tasks_to_remove.append(task_id)
for tid in tasks_to_remove:
# Cancel future if exists (e.g. for AWAITING_INPUT/MAPPING)
if tid in self.task_futures:
self.task_futures[tid].cancel()
del self.task_futures[tid]
del self.tasks[tid]
# Remove from persistence
self.persistence_service.delete_tasks(tasks_to_remove)
logger.info(f"Cleared {len(tasks_to_remove)} tasks.")
return len(tasks_to_remove)
# [/DEF:TaskManager.clear_tasks:Function]
# [/DEF:TaskManager:Class]
# [/DEF:TaskManagerModule:Module]
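A sketch of the pause/resume round trip these methods enable (task_id and plugin_loader are assumed to exist; the HTTP wiring is omitted). The plugin side parks on a future; the API side releases it once the user submits credentials:

    async def plugin_side(tm: TaskManager, task_id: str):
        tm.await_input(task_id, {"type": "database_password", "databases": ["PostgreSQL"]})
        await tm.wait_for_input(task_id)   # suspends until resume_task_with_password fires
        task = tm.get_task(task_id)
        passwords = task.params.get("passwords", {})  # injected by the resume call
        # ... retry the import using `passwords` ...

    def api_side(tm: TaskManager, task_id: str):
        # Called from the REST endpoint once the user submits the password form
        tm.resume_task_with_password(task_id, {"PostgreSQL": "s3cret"})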

View File

@@ -0,0 +1,67 @@
# [DEF:TaskManagerModels:Module]
# @SEMANTICS: task, models, pydantic, enum, state
# @PURPOSE: Defines the data models and enumerations used by the Task Manager.
# @LAYER: Core
# @RELATION: Used by TaskManager and API routes.
# @INVARIANT: Task IDs are immutable once created.
# @CONSTRAINT: Must use Pydantic for data validation.
# [SECTION: IMPORTS]
import uuid
from datetime import datetime
from enum import Enum
from typing import Dict, Any, List, Optional
from pydantic import BaseModel, Field
# [/SECTION]
# [DEF:TaskStatus:Enum]
# @SEMANTICS: task, status, state, enum
# @PURPOSE: Defines the possible states a task can be in during its lifecycle.
class TaskStatus(str, Enum):
PENDING = "PENDING"
RUNNING = "RUNNING"
SUCCESS = "SUCCESS"
FAILED = "FAILED"
AWAITING_MAPPING = "AWAITING_MAPPING"
AWAITING_INPUT = "AWAITING_INPUT"
# [/DEF:TaskStatus:Enum]
# [DEF:LogEntry:Class]
# @SEMANTICS: log, entry, record, pydantic
# @PURPOSE: A Pydantic model representing a single, structured log entry associated with a task.
class LogEntry(BaseModel):
timestamp: datetime = Field(default_factory=datetime.utcnow)
level: str
message: str
context: Optional[Dict[str, Any]] = None
# [/DEF:LogEntry:Class]
# [DEF:Task:Class]
# @SEMANTICS: task, job, execution, state, pydantic
# @PURPOSE: A Pydantic model representing a single execution instance of a plugin, including its status, parameters, and logs.
class Task(BaseModel):
id: str = Field(default_factory=lambda: str(uuid.uuid4()))
plugin_id: str
status: TaskStatus = TaskStatus.PENDING
started_at: Optional[datetime] = None
finished_at: Optional[datetime] = None
user_id: Optional[str] = None
logs: List[LogEntry] = Field(default_factory=list)
params: Dict[str, Any] = Field(default_factory=dict)
input_required: bool = False
input_request: Optional[Dict[str, Any]] = None
# [DEF:Task.__init__:Function]
# @PURPOSE: Initializes the Task model and validates input_request for AWAITING_INPUT status.
# @PRE: If status is AWAITING_INPUT, input_request must be provided.
# @POST: Task instance is created or ValueError is raised.
# @PARAM: **data - Keyword arguments for model initialization.
def __init__(self, **data):
super().__init__(**data)
if self.status == TaskStatus.AWAITING_INPUT and not self.input_request:
raise ValueError("input_request is required when status is AWAITING_INPUT")
# [/DEF:Task.__init__:Function]
# [/DEF:Task:Class]
# [/DEF:TaskManagerModels:Module]
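The custom __init__ makes the AWAITING_INPUT invariant observable at construction time. A short sketch (import path assumed from the module tags):

    from backend.src.core.task_manager.models import Task, TaskStatus

    # Valid: AWAITING_INPUT with an input_request attached
    t = Task(plugin_id="superset-migration",
             status=TaskStatus.AWAITING_INPUT,
             input_request={"type": "database_password", "databases": ["PostgreSQL"]})

    # Invalid: AWAITING_INPUT without input_request raises
    try:
        Task(plugin_id="superset-migration", status=TaskStatus.AWAITING_INPUT)
    except ValueError as e:
        print(e)  # "input_request is required when status is AWAITING_INPUT"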

View File

@@ -0,0 +1,158 @@
# [DEF:TaskPersistenceModule:Module]
# @SEMANTICS: persistence, sqlite, task, storage
# @PURPOSE: Handles the persistence of tasks, specifically those awaiting user input, to a SQLite database.
# @LAYER: Core
# @RELATION: Used by TaskManager to save and load tasks.
# @INVARIANT: Database schema must match the Task model structure.
# @CONSTRAINT: Uses synchronous SQLite operations (blocking), should be used carefully.
# [SECTION: IMPORTS]
import sqlite3
import json
from datetime import datetime
from pathlib import Path
from typing import Dict, List, Optional, Any
from .models import Task, TaskStatus
from ..logger import logger, belief_scope
# [/SECTION]
# [DEF:TaskPersistenceService:Class]
# @SEMANTICS: persistence, service, database
# @PURPOSE: Provides methods to save and load tasks from a local SQLite database.
class TaskPersistenceService:
def __init__(self, db_path: Optional[Path] = None):
if db_path is None:
self.db_path = Path(__file__).parent.parent.parent.parent / "migrations.db"
else:
self.db_path = db_path
self._ensure_db_exists()
# [DEF:TaskPersistenceService._ensure_db_exists:Function]
# @PURPOSE: Ensures the database directory and table exist.
# @PRE: None.
# @POST: Database file and table are created if they didn't exist.
def _ensure_db_exists(self) -> None:
with belief_scope("TaskPersistenceService._ensure_db_exists"):
self.db_path.parent.mkdir(parents=True, exist_ok=True)
conn = sqlite3.connect(str(self.db_path))
cursor = conn.cursor()
cursor.execute("""
CREATE TABLE IF NOT EXISTS persistent_tasks (
id TEXT PRIMARY KEY,
plugin_id TEXT NOT NULL,
status TEXT NOT NULL,
created_at TEXT NOT NULL,
updated_at TEXT NOT NULL,
input_request JSON,
context JSON
)
""")
conn.commit()
conn.close()
# [/DEF:TaskPersistenceService._ensure_db_exists:Function]
# [DEF:TaskPersistenceService.persist_tasks:Function]
# @PURPOSE: Persists a list of tasks to the database.
# @PRE: Tasks list contains valid Task objects.
# @POST: Tasks matching the criteria (AWAITING_INPUT) are saved/updated in the DB.
# @PARAM: tasks (List[Task]) - The list of tasks to check and persist.
def persist_tasks(self, tasks: List[Task]) -> None:
with belief_scope("TaskPersistenceService.persist_tasks"):
conn = sqlite3.connect(str(self.db_path))
cursor = conn.cursor()
count = 0
for task in tasks:
if task.status == TaskStatus.AWAITING_INPUT:
cursor.execute("""
INSERT OR REPLACE INTO persistent_tasks
(id, plugin_id, status, created_at, updated_at, input_request, context)
VALUES (?, ?, ?, ?, ?, ?, ?)
""", (
task.id,
task.plugin_id,
task.status.value,
task.started_at.isoformat() if task.started_at else datetime.utcnow().isoformat(),
datetime.utcnow().isoformat(),
json.dumps(task.input_request) if task.input_request else None,
json.dumps(task.params)
))
count += 1
conn.commit()
conn.close()
logger.info(f"Persisted {count} tasks awaiting input.")
# [/DEF:TaskPersistenceService.persist_tasks:Function]
# [DEF:TaskPersistenceService.load_tasks:Function]
# @PURPOSE: Loads persisted tasks from the database.
# @PRE: Database exists.
# @POST: Returns a list of Task objects reconstructed from the DB.
# @RETURN: List[Task] - The loaded tasks.
def load_tasks(self) -> List[Task]:
with belief_scope("TaskPersistenceService.load_tasks"):
if not self.db_path.exists():
return []
conn = sqlite3.connect(str(self.db_path))
cursor = conn.cursor()
# Check if plugin_id column exists (migration for existing db)
cursor.execute("PRAGMA table_info(persistent_tasks)")
columns = [info[1] for info in cursor.fetchall()]
has_plugin_id = "plugin_id" in columns
if has_plugin_id:
cursor.execute("SELECT id, plugin_id, status, created_at, input_request, context FROM persistent_tasks")
else:
cursor.execute("SELECT id, status, created_at, input_request, context FROM persistent_tasks")
rows = cursor.fetchall()
loaded_tasks = []
for row in rows:
if has_plugin_id:
task_id, plugin_id, status, created_at, input_request_json, context_json = row
else:
task_id, status, created_at, input_request_json, context_json = row
plugin_id = "superset-migration" # Default fallback
try:
task = Task(
id=task_id,
plugin_id=plugin_id,
status=TaskStatus(status),
started_at=datetime.fromisoformat(created_at),
input_required=True,
input_request=json.loads(input_request_json) if input_request_json else None,
params=json.loads(context_json) if context_json else {}
)
loaded_tasks.append(task)
except Exception as e:
logger.error(f"Failed to load task {task_id}: {e}")
conn.close()
return loaded_tasks
# [/DEF:TaskPersistenceService.load_tasks:Function]
# [DEF:TaskPersistenceService.delete_tasks:Function]
# @PURPOSE: Deletes specific tasks from the database.
# @PARAM: task_ids (List[str]) - List of task IDs to delete.
def delete_tasks(self, task_ids: List[str]) -> None:
if not task_ids:
return
with belief_scope("TaskPersistenceService.delete_tasks"):
conn = sqlite3.connect(str(self.db_path))
cursor = conn.cursor()
placeholders = ', '.join('?' for _ in task_ids)
cursor.execute(f"DELETE FROM persistent_tasks WHERE id IN ({placeholders})", task_ids)
conn.commit()
conn.close()
# [/DEF:TaskPersistenceService.delete_tasks:Function]
# [/DEF:TaskPersistenceService:Class]
# [/DEF:TaskPersistenceModule:Module]
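A round-trip sketch against a throwaway database file (imports assumed from the modules above; only AWAITING_INPUT tasks are ever written):

    from pathlib import Path

    svc = TaskPersistenceService(db_path=Path("/tmp/tasks_test.db"))
    task = Task(plugin_id="superset-migration",
                status=TaskStatus.AWAITING_INPUT,
                input_request={"type": "database_password", "databases": ["PostgreSQL"]})
    svc.persist_tasks([task])          # filters for AWAITING_INPUT internally
    restored = svc.load_tasks()
    assert restored[0].id == task.id
    svc.delete_tasks([task.id])        # cleanup once the task resumes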

View File

@@ -21,7 +21,12 @@ def get_config_manager() -> ConfigManager:
 plugin_dir = Path(__file__).parent / "plugins"
 plugin_loader = PluginLoader(plugin_dir=str(plugin_dir))
+
+from .core.logger import logger
+logger.info(f"PluginLoader initialized with directory: {plugin_dir}")
+logger.info(f"Available plugins: {[config.name for config in plugin_loader.get_all_plugin_configs()]}")
+
 task_manager = TaskManager(plugin_loader)
+logger.info("TaskManager initialized")

 def get_plugin_loader() -> PluginLoader:
     """Dependency injector for the PluginLoader."""

View File

@@ -0,0 +1,28 @@
# [DEF:backend.src.models.dashboard:Module]
# @SEMANTICS: dashboard, model, metadata, migration
# @PURPOSE: Defines data models for dashboard metadata and selection.
# @LAYER: Model
# @RELATION: USED_BY -> backend.src.api.routes.migration
from pydantic import BaseModel
from typing import List
# [DEF:DashboardMetadata:Class]
# @PURPOSE: Represents a dashboard available for migration.
class DashboardMetadata(BaseModel):
id: int
title: str
last_modified: str
status: str
# [/DEF:DashboardMetadata]
# [DEF:DashboardSelection:Class]
# @PURPOSE: Represents the user's selection of dashboards to migrate.
class DashboardSelection(BaseModel):
selected_ids: List[int]
source_env_id: str
target_env_id: str
replace_db_config: bool = False
# [/DEF:DashboardSelection]
# [/DEF:backend.src.models.dashboard]
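A construction sketch showing how a grid selection feeds the migration API (environment IDs are hypothetical):

    selection = DashboardSelection(
        selected_ids=[12, 34],
        source_env_id="env-dev",
        target_env_id="env-prod",
        replace_db_config=True,
    )
    # The API layer can forward this straight into the plugin params:
    params = selection.model_dump()  # pydantic v2; use .dict() on v1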

View File

@@ -87,34 +87,96 @@ class MigrationPlugin(PluginBase):
         }

     async def execute(self, params: Dict[str, Any]):
-        from_env = params["from_env"]
-        to_env = params["to_env"]
-        dashboard_regex = params["dashboard_regex"]
+        source_env_id = params.get("source_env_id")
+        target_env_id = params.get("target_env_id")
+        selected_ids = params.get("selected_ids")
+        # Legacy support or alternative params
+        from_env_name = params.get("from_env")
+        to_env_name = params.get("to_env")
+        dashboard_regex = params.get("dashboard_regex")
         replace_db_config = params.get("replace_db_config", False)
         from_db_id = params.get("from_db_id")
         to_db_id = params.get("to_db_id")
-        logger = SupersetLogger(log_dir=Path.cwd() / "logs", console=True)
-        logger.info(f"[MigrationPlugin][Entry] Starting migration from {from_env} to {to_env}.")
+        # [DEF:MigrationPlugin.execute:Action]
+        # @PURPOSE: Execute the migration logic with proper task logging.
+        task_id = params.get("_task_id")
+        from ..dependencies import get_task_manager
+        tm = get_task_manager()
+
+        class TaskLoggerProxy(SupersetLogger):
+            def __init__(self):
+                # Initialize parent with dummy values since we override methods
+                super().__init__(console=False)
+            def debug(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "DEBUG", msg, extra or {})
+            def info(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "INFO", msg, extra or {})
+            def warning(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "WARNING", msg, extra or {})
+            def error(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
+            def critical(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
+            def exception(self, msg, *args, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, {"exception": True})
+
+        logger = TaskLoggerProxy()
+        logger.info(f"[MigrationPlugin][Entry] Starting migration task.")
+        logger.info(f"[MigrationPlugin][Action] Params: {params}")
         try:
             config_manager = get_config_manager()
-            all_clients = setup_clients(logger, custom_envs=config_manager.get_environments())
-            from_c = all_clients.get(from_env)
-            to_c = all_clients.get(to_env)
+            environments = config_manager.get_environments()
+
+            # Resolve environments
+            src_env = None
+            tgt_env = None
+            if source_env_id:
+                src_env = next((e for e in environments if e.id == source_env_id), None)
+            elif from_env_name:
+                src_env = next((e for e in environments if e.name == from_env_name), None)
+            if target_env_id:
+                tgt_env = next((e for e in environments if e.id == target_env_id), None)
+            elif to_env_name:
+                tgt_env = next((e for e in environments if e.name == to_env_name), None)
+            if not src_env or not tgt_env:
+                raise ValueError(f"Could not resolve source or target environment. Source: {source_env_id or from_env_name}, Target: {target_env_id or to_env_name}")
+            from_env_name = src_env.name
+            to_env_name = tgt_env.name
+            logger.info(f"[MigrationPlugin][State] Resolved environments: {from_env_name} -> {to_env_name}")
+
+            all_clients = setup_clients(logger, custom_envs=environments)
+            from_c = all_clients.get(from_env_name)
+            to_c = all_clients.get(to_env_name)
             if not from_c or not to_c:
-                raise ValueError(f"One or both environments ('{from_env}', '{to_env}') not found in configuration.")
+                raise ValueError(f"Clients not initialized for environments: {from_env_name}, {to_env_name}")
             _, all_dashboards = from_c.get_dashboards()
-            regex_str = str(dashboard_regex)
-            dashboards_to_migrate = [
-                d for d in all_dashboards if re.search(regex_str, d["dashboard_title"], re.IGNORECASE)
-            ]
+            dashboards_to_migrate = []
+            if selected_ids:
+                dashboards_to_migrate = [d for d in all_dashboards if d["id"] in selected_ids]
+            elif dashboard_regex:
+                regex_str = str(dashboard_regex)
+                dashboards_to_migrate = [
+                    d for d in all_dashboards if re.search(regex_str, d["dashboard_title"], re.IGNORECASE)
+                ]
+            else:
+                logger.warning("[MigrationPlugin][State] No selection criteria provided (selected_ids or dashboard_regex).")
+                return
             if not dashboards_to_migrate:
-                logger.warning("[MigrationPlugin][State] No dashboards found matching the regex.")
+                logger.warning("[MigrationPlugin][State] No dashboards found matching criteria.")
                 return
             # Fetch mappings from database

@@ -123,8 +185,8 @@ class MigrationPlugin(PluginBase):
             db = SessionLocal()
             try:
                 # Find environment IDs by name
-                src_env = db.query(Environment).filter(Environment.name == from_env).first()
-                tgt_env = db.query(Environment).filter(Environment.name == to_env).first()
+                src_env = db.query(Environment).filter(Environment.name == from_env_name).first()
+                tgt_env = db.query(Environment).filter(Environment.name == to_env_name).first()
                 if src_env and tgt_env:
                     mappings = db.query(DatabaseMapping).filter(

@@ -144,49 +206,86 @@ class MigrationPlugin(PluginBase):
                 try:
                     exported_content, _ = from_c.export_dashboard(dash_id)
                     with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=logger) as tmp_zip_path:
-                        if not replace_db_config:
-                            to_c.import_dashboard(file_name=tmp_zip_path, dash_id=dash_id, dash_slug=dash_slug)
-                        else:
-                            # Check for missing mappings before transformation
-                            # This is a simplified check, in reality we'd check all YAMLs
-                            # For US3, we'll just use the engine and handle missing ones there
-                            with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
-                                # If we have missing mappings, we might need to pause
-                                # For now, let's assume the engine can tell us what's missing
-                                success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping)
-                                if not success:
-                                    # Signal missing mapping and wait
-                                    task_id = params.get("_task_id")
-                                    if task_id:
-                                        from ..dependencies import get_task_manager
-                                        tm = get_task_manager()
-                                        logger.info(f"[MigrationPlugin][Action] Pausing for missing mapping in task {task_id}")
-                                        # In a real scenario, we'd pass the missing DB info to the frontend
-                                        # For this task, we'll just simulate the wait
-                                        await tm.wait_for_resolution(task_id)
-                                        # After resolution, retry transformation with updated mappings
-                                        # (Mappings would be updated in task.params by resolve_task)
-                                        db = SessionLocal()
-                                        try:
-                                            src_env = db.query(Environment).filter(Environment.name == from_env).first()
-                                            tgt_env = db.query(Environment).filter(Environment.name == to_env).first()
-                                            mappings = db.query(DatabaseMapping).filter(
-                                                DatabaseMapping.source_env_id == src_env.id,
-                                                DatabaseMapping.target_env_id == tgt_env.id
-                                            ).all()
-                                            db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
-                                        finally:
-                                            db.close()
-                                        success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping)
+                        # Always transform to strip databases to avoid password errors
+                        with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
+                            success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping, strip_databases=False)
+
+                            if not success and replace_db_config:
+                                # Signal missing mapping and wait (only if we care about mappings)
+                                if task_id:
+                                    logger.info(f"[MigrationPlugin][Action] Pausing for missing mapping in task {task_id}")
+                                    # In a real scenario, we'd pass the missing DB info to the frontend
+                                    # For this task, we'll just simulate the wait
+                                    await tm.wait_for_resolution(task_id)
+                                    # After resolution, retry transformation with updated mappings
+                                    # (Mappings would be updated in task.params by resolve_task)
+                                    db = SessionLocal()
+                                    try:
+                                        src_env = db.query(Environment).filter(Environment.name == from_env_name).first()
+                                        tgt_env = db.query(Environment).filter(Environment.name == to_env_name).first()
+                                        mappings = db.query(DatabaseMapping).filter(
+                                            DatabaseMapping.source_env_id == src_env.id,
+                                            DatabaseMapping.target_env_id == tgt_env.id
+                                        ).all()
+                                        db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
+                                    finally:
+                                        db.close()
+                                    success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping, strip_databases=False)
                             if success:
                                 to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
                             else:
                                 logger.error(f"[MigrationPlugin][Failure] Failed to transform ZIP for dashboard {title}")
                     logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported.")
                 except Exception as exc:
+                    # Check for password error
+                    error_msg = str(exc)
+                    # The error message from Superset is often a JSON string inside a string.
+                    # We need to robustly detect the password requirement.
+                    # Typical error: "Error importing dashboard: databases/PostgreSQL.yaml: {'_schema': ['Must provide a password for the database']}"
+                    if "Must provide a password for the database" in error_msg:
+                        # Extract database name
+                        # Try to find "databases/DBNAME.yaml" pattern
+                        import re
+                        db_name = "unknown"
+                        match = re.search(r"databases/([^.]+)\.yaml", error_msg)
+                        if match:
+                            db_name = match.group(1)
+                        else:
+                            # Fallback: try to find 'database 'NAME'' pattern
+                            match_alt = re.search(r"database '([^']+)'", error_msg)
+                            if match_alt:
+                                db_name = match_alt.group(1)
+                        logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")
+                        if task_id:
+                            input_request = {
+                                "type": "database_password",
+                                "databases": [db_name],
+                                "error_message": error_msg
+                            }
+                            tm.await_input(task_id, input_request)
+                            # Wait for user input
+                            await tm.wait_for_input(task_id)
+                            # Resume with passwords
+                            task = tm.get_task(task_id)
+                            passwords = task.params.get("passwords", {})
+                            # Retry import with password
+                            if passwords:
+                                logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
+                                to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug, passwords=passwords)
+                                logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
+                                # Clear passwords from params after use for security
+                                if "passwords" in task.params:
+                                    del task.params["passwords"]
+                                continue
                     logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)
         logger.info("[MigrationPlugin][Exit] Migration finished.")

View File

@@ -0,0 +1,44 @@
import pytest
from backend.src.core.logger import belief_scope, logger
def test_belief_scope_logs_entry_action_exit(caplog):
"""Test that belief_scope generates [ID][Entry], [ID][Action], and [ID][Exit] logs."""
caplog.set_level("INFO")
with belief_scope("TestFunction"):
logger.info("Doing something important")
# Check that the logs contain the expected patterns
log_messages = [record.message for record in caplog.records]
assert any("[TestFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
assert any("[TestFunction][Action] Doing something important" in msg for msg in log_messages), "Action log not found"
assert any("[TestFunction][Exit]" in msg for msg in log_messages), "Exit log not found"
def test_belief_scope_error_handling(caplog):
"""Test that belief_scope logs Coherence:Failed on exception."""
caplog.set_level("INFO")
with pytest.raises(ValueError):
with belief_scope("FailingFunction"):
raise ValueError("Something went wrong")
log_messages = [record.message for record in caplog.records]
assert any("[FailingFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
assert any("[FailingFunction][Coherence:Failed]" in msg for msg in log_messages), "Failed coherence log not found"
# Exit should not be logged on failure
def test_belief_scope_success_coherence(caplog):
"""Test that belief_scope logs Coherence:OK on success."""
caplog.set_level("INFO")
with belief_scope("SuccessFunction"):
pass
log_messages = [record.message for record in caplog.records]
assert any("[SuccessFunction][Coherence:OK]" in msg for msg in log_messages), "Success coherence log not found"

View File

@@ -1,186 +0,0 @@
// this file is generated — do not edit it
/// <reference types="@sveltejs/kit" />
/**
* Environment variables [loaded by Vite](https://vitejs.dev/guide/env-and-mode.html#env-files) from `.env` files and `process.env`. Like [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), this module cannot be imported into client-side code. This module only includes variables that _do not_ begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) _and do_ start with [`config.kit.env.privatePrefix`](https://svelte.dev/docs/kit/configuration#env) (if configured).
*
* _Unlike_ [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), the values exported from this module are statically injected into your bundle at build time, enabling optimisations like dead code elimination.
*
* ```ts
* import { API_KEY } from '$env/static/private';
* ```
*
* Note that all environment variables referenced in your code should be declared (for example in an `.env` file), even if they don't have a value until the app is deployed:
*
* ```
* MY_FEATURE_FLAG=""
* ```
*
* You can override `.env` values from the command line like so:
*
* ```sh
* MY_FEATURE_FLAG="enabled" npm run dev
* ```
*/
declare module '$env/static/private' {
export const LESSOPEN: string;
export const USER: string;
export const npm_config_user_agent: string;
export const npm_node_execpath: string;
export const SHLVL: string;
export const npm_config_noproxy: string;
export const HOME: string;
export const OLDPWD: string;
export const npm_package_json: string;
export const PS1: string;
export const npm_config_userconfig: string;
export const npm_config_local_prefix: string;
export const DBUS_SESSION_BUS_ADDRESS: string;
export const WSL_DISTRO_NAME: string;
export const COLOR: string;
export const WAYLAND_DISPLAY: string;
export const LOGNAME: string;
export const NAME: string;
export const WSL_INTEROP: string;
export const PULSE_SERVER: string;
export const _: string;
export const npm_config_prefix: string;
export const npm_config_npm_version: string;
export const TERM: string;
export const npm_config_cache: string;
export const npm_config_node_gyp: string;
export const PATH: string;
export const NODE: string;
export const npm_package_name: string;
export const XDG_RUNTIME_DIR: string;
export const DISPLAY: string;
export const LANG: string;
export const VIRTUAL_ENV_PROMPT: string;
export const LS_COLORS: string;
export const npm_lifecycle_script: string;
export const SHELL: string;
export const npm_package_version: string;
export const npm_lifecycle_event: string;
export const GOOGLE_CLOUD_PROJECT: string;
export const LESSCLOSE: string;
export const VIRTUAL_ENV: string;
export const npm_config_globalconfig: string;
export const npm_config_init_module: string;
export const PWD: string;
export const npm_execpath: string;
export const XDG_DATA_DIRS: string;
export const npm_config_global_prefix: string;
export const npm_command: string;
export const WSL2_GUI_APPS_ENABLED: string;
export const HOSTTYPE: string;
export const WSLENV: string;
export const INIT_CWD: string;
export const EDITOR: string;
export const NODE_ENV: string;
}
/**
* Similar to [`$env/static/private`](https://svelte.dev/docs/kit/$env-static-private), except that it only includes environment variables that begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) (which defaults to `PUBLIC_`), and can therefore safely be exposed to client-side code.
*
* Values are replaced statically at build time.
*
* ```ts
* import { PUBLIC_BASE_URL } from '$env/static/public';
* ```
*/
declare module '$env/static/public' {
export const PUBLIC_WS_URL: string;
}
/**
* This module provides access to runtime environment variables, as defined by the platform you're running on. For example if you're using [`adapter-node`](https://github.com/sveltejs/kit/tree/main/packages/adapter-node) (or running [`vite preview`](https://svelte.dev/docs/kit/cli)), this is equivalent to `process.env`. This module only includes variables that _do not_ begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) _and do_ start with [`config.kit.env.privatePrefix`](https://svelte.dev/docs/kit/configuration#env) (if configured).
*
* This module cannot be imported into client-side code.
*
* ```ts
* import { env } from '$env/dynamic/private';
* console.log(env.DEPLOYMENT_SPECIFIC_VARIABLE);
* ```
*
* > [!NOTE] In `dev`, `$env/dynamic` always includes environment variables from `.env`. In `prod`, this behavior will depend on your adapter.
*/
declare module '$env/dynamic/private' {
export const env: {
LESSOPEN: string;
USER: string;
npm_config_user_agent: string;
npm_node_execpath: string;
SHLVL: string;
npm_config_noproxy: string;
HOME: string;
OLDPWD: string;
npm_package_json: string;
PS1: string;
npm_config_userconfig: string;
npm_config_local_prefix: string;
DBUS_SESSION_BUS_ADDRESS: string;
WSL_DISTRO_NAME: string;
COLOR: string;
WAYLAND_DISPLAY: string;
LOGNAME: string;
NAME: string;
WSL_INTEROP: string;
PULSE_SERVER: string;
_: string;
npm_config_prefix: string;
npm_config_npm_version: string;
TERM: string;
npm_config_cache: string;
npm_config_node_gyp: string;
PATH: string;
NODE: string;
npm_package_name: string;
XDG_RUNTIME_DIR: string;
DISPLAY: string;
LANG: string;
VIRTUAL_ENV_PROMPT: string;
LS_COLORS: string;
npm_lifecycle_script: string;
SHELL: string;
npm_package_version: string;
npm_lifecycle_event: string;
GOOGLE_CLOUD_PROJECT: string;
LESSCLOSE: string;
VIRTUAL_ENV: string;
npm_config_globalconfig: string;
npm_config_init_module: string;
PWD: string;
npm_execpath: string;
XDG_DATA_DIRS: string;
npm_config_global_prefix: string;
npm_command: string;
WSL2_GUI_APPS_ENABLED: string;
HOSTTYPE: string;
WSLENV: string;
INIT_CWD: string;
EDITOR: string;
NODE_ENV: string;
[key: `PUBLIC_${string}`]: undefined;
[key: `${string}`]: string | undefined;
}
}
/**
* Similar to [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), but only includes variables that begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) (which defaults to `PUBLIC_`), and can therefore safely be exposed to client-side code.
*
* Note that public dynamic environment variables must all be sent from the server to the client, causing larger network requests — when possible, use `$env/static/public` instead.
*
* ```ts
* import { env } from '$env/dynamic/public';
* console.log(env.PUBLIC_DEPLOYMENT_SPECIFIC_VARIABLE);
* ```
*/
declare module '$env/dynamic/public' {
export const env: {
PUBLIC_WS_URL: string;
[key: `PUBLIC_${string}`]: string | undefined;
}
}

View File

@@ -1,31 +0,0 @@
export { matchers } from './matchers.js';
export const nodes = [
() => import('./nodes/0'),
() => import('./nodes/1'),
() => import('./nodes/2'),
() => import('./nodes/3')
];
export const server_loads = [];
export const dictionary = {
"/": [2],
"/settings": [3]
};
export const hooks = {
handleError: (({ error }) => { console.error(error) }),
reroute: (() => {}),
transport: {}
};
export const decoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.decode]));
export const encoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.encode]));
export const hash = false;
export const decode = (type, value) => decoders[type](value);
export { default as root } from '../root.js';

View File

@@ -1 +0,0 @@
export const matchers = {};

View File

@@ -1,3 +0,0 @@
import * as universal from "../../../../src/routes/+layout.ts";
export { universal };
export { default as component } from "../../../../src/routes/+layout.svelte";

View File

@@ -1 +0,0 @@
export { default as component } from "../../../../src/routes/+error.svelte";

View File

@@ -1,3 +0,0 @@
import * as universal from "../../../../src/routes/+page.ts";
export { universal };
export { default as component } from "../../../../src/routes/+page.svelte";

View File

@@ -1,3 +0,0 @@
import * as universal from "../../../../src/routes/settings/+page.ts";
export { universal };
export { default as component } from "../../../../src/routes/settings/+page.svelte";

View File

@@ -1,35 +0,0 @@
export { matchers } from './matchers.js';
export const nodes = [
() => import('./nodes/0'),
() => import('./nodes/1'),
() => import('./nodes/2'),
() => import('./nodes/3'),
() => import('./nodes/4'),
() => import('./nodes/5')
];
export const server_loads = [];
export const dictionary = {
"/": [2],
"/migration": [3],
"/migration/mappings": [4],
"/settings": [5]
};
export const hooks = {
handleError: (({ error }) => { console.error(error) }),
reroute: (() => {}),
transport: {}
};
export const decoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.decode]));
export const encoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.encode]));
export const hash = false;
export const decode = (type, value) => decoders[type](value);
export { default as root } from '../root.js';

View File

@@ -1 +0,0 @@
export const matchers = {};

View File

@@ -1,3 +0,0 @@
import * as universal from "../../../../src/routes/+layout.ts";
export { universal };
export { default as component } from "../../../../src/routes/+layout.svelte";

View File

@@ -1 +0,0 @@
export { default as component } from "../../../../src/routes/+error.svelte";

View File

@@ -1,3 +0,0 @@
import * as universal from "../../../../src/routes/+page.ts";
export { universal };
export { default as component } from "../../../../src/routes/+page.svelte";

View File

@@ -1 +0,0 @@
export { default as component } from "../../../../src/routes/migration/+page.svelte";

View File

@@ -1,3 +0,0 @@
import { asClassComponent } from 'svelte/legacy';
import Root from './root.svelte';
export default asClassComponent(Root);

View File

@@ -1,68 +0,0 @@
<!-- This file is generated by @sveltejs/kit — do not edit it! -->
<svelte:options runes={true} />
<script>
import { setContext, onMount, tick } from 'svelte';
import { browser } from '$app/environment';
// stores
let { stores, page, constructors, components = [], form, data_0 = null, data_1 = null } = $props();
if (!browser) {
// svelte-ignore state_referenced_locally
setContext('__svelte__', stores);
}
if (browser) {
$effect.pre(() => stores.page.set(page));
} else {
// svelte-ignore state_referenced_locally
stores.page.set(page);
}
$effect(() => {
stores;page;constructors;components;form;data_0;data_1;
stores.page.notify();
});
let mounted = $state(false);
let navigated = $state(false);
let title = $state(null);
onMount(() => {
const unsubscribe = stores.page.subscribe(() => {
if (mounted) {
navigated = true;
tick().then(() => {
title = document.title || 'untitled page';
});
}
});
mounted = true;
return unsubscribe;
});
const Pyramid_1=$derived(constructors[1])
</script>
{#if constructors[1]}
{@const Pyramid_0 = constructors[0]}
<!-- svelte-ignore binding_property_non_reactive -->
<Pyramid_0 bind:this={components[0]} data={data_0} {form} params={page.params}>
<!-- svelte-ignore binding_property_non_reactive -->
<Pyramid_1 bind:this={components[1]} data={data_1} {form} params={page.params} />
</Pyramid_0>
{:else}
{@const Pyramid_0 = constructors[0]}
<!-- svelte-ignore binding_property_non_reactive -->
<Pyramid_0 bind:this={components[0]} data={data_0} {form} params={page.params} />
{/if}
{#if mounted}
<div id="svelte-announcer" aria-live="assertive" aria-atomic="true" style="position: absolute; left: 0; top: 0; clip: rect(0 0 0 0); clip-path: inset(50%); overflow: hidden; white-space: nowrap; width: 1px; height: 1px">
{#if navigated}
{title}
{/if}
</div>
{/if}


@@ -1,53 +0,0 @@
import root from '../root.js';
import { set_building, set_prerendering } from '__sveltekit/environment';
import { set_assets } from '$app/paths/internal/server';
import { set_manifest, set_read_implementation } from '__sveltekit/server';
import { set_private_env, set_public_env } from '../../../node_modules/@sveltejs/kit/src/runtime/shared-server.js';
export const options = {
app_template_contains_nonce: false,
async: false,
csp: {"mode":"auto","directives":{"upgrade-insecure-requests":false,"block-all-mixed-content":false},"reportOnly":{"upgrade-insecure-requests":false,"block-all-mixed-content":false}},
csrf_check_origin: true,
csrf_trusted_origins: [],
embedded: false,
env_public_prefix: 'PUBLIC_',
env_private_prefix: '',
hash_routing: false,
hooks: null, // added lazily, via `get_hooks`
preload_strategy: "modulepreload",
root,
service_worker: false,
service_worker_options: undefined,
templates: {
app: ({ head, body, assets, nonce, env }) => "<!DOCTYPE html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<link rel=\"icon\" href=\"" + assets + "/favicon.png\" />\n\t\t<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\" />\n\t\t" + head + "\n\t</head>\n\t<body data-sveltekit-preload-data=\"hover\">\n\t\t<div style=\"display: contents\">" + body + "</div>\n\t</body>\n</html>\n",
error: ({ status, message }) => "<!doctype html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<title>" + message + "</title>\n\n\t\t<style>\n\t\t\tbody {\n\t\t\t\t--bg: white;\n\t\t\t\t--fg: #222;\n\t\t\t\t--divider: #ccc;\n\t\t\t\tbackground: var(--bg);\n\t\t\t\tcolor: var(--fg);\n\t\t\t\tfont-family:\n\t\t\t\t\tsystem-ui,\n\t\t\t\t\t-apple-system,\n\t\t\t\t\tBlinkMacSystemFont,\n\t\t\t\t\t'Segoe UI',\n\t\t\t\t\tRoboto,\n\t\t\t\t\tOxygen,\n\t\t\t\t\tUbuntu,\n\t\t\t\t\tCantarell,\n\t\t\t\t\t'Open Sans',\n\t\t\t\t\t'Helvetica Neue',\n\t\t\t\t\tsans-serif;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tjustify-content: center;\n\t\t\t\theight: 100vh;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t.error {\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tmax-width: 32rem;\n\t\t\t\tmargin: 0 1rem;\n\t\t\t}\n\n\t\t\t.status {\n\t\t\t\tfont-weight: 200;\n\t\t\t\tfont-size: 3rem;\n\t\t\t\tline-height: 1;\n\t\t\t\tposition: relative;\n\t\t\t\ttop: -0.05rem;\n\t\t\t}\n\n\t\t\t.message {\n\t\t\t\tborder-left: 1px solid var(--divider);\n\t\t\t\tpadding: 0 0 0 1rem;\n\t\t\t\tmargin: 0 0 0 1rem;\n\t\t\t\tmin-height: 2.5rem;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t}\n\n\t\t\t.message h1 {\n\t\t\t\tfont-weight: 400;\n\t\t\t\tfont-size: 1em;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t@media (prefers-color-scheme: dark) {\n\t\t\t\tbody {\n\t\t\t\t\t--bg: #222;\n\t\t\t\t\t--fg: #ddd;\n\t\t\t\t\t--divider: #666;\n\t\t\t\t}\n\t\t\t}\n\t\t</style>\n\t</head>\n\t<body>\n\t\t<div class=\"error\">\n\t\t\t<span class=\"status\">" + status + "</span>\n\t\t\t<div class=\"message\">\n\t\t\t\t<h1>" + message + "</h1>\n\t\t\t</div>\n\t\t</div>\n\t</body>\n</html>\n"
},
version_hash: "1pvaiah"
};
export async function get_hooks() {
let handle;
let handleFetch;
let handleError;
let handleValidationError;
let init;
let reroute;
let transport;
return {
handle,
handleFetch,
handleError,
handleValidationError,
init,
reroute,
transport
};
}
export { set_assets, set_building, set_manifest, set_prerendering, set_private_env, set_public_env, set_read_implementation };
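
(Every hook returned by `get_hooks()` above is undefined because the project defines no hooks file. For reference, a minimal `src/hooks.server.js` using the standard SvelteKit `handle` signature would populate that slot; the logging line is illustrative:)

// Minimal src/hooks.server.js — get_hooks() above would surface this as `handle`.
export async function handle({ event, resolve }) {
  console.log(`[hooks.handle] ${event.request.method} ${event.url.pathname}`);
  return resolve(event);
}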


@@ -1,44 +0,0 @@
// this file is generated — do not edit it
declare module "svelte/elements" {
export interface HTMLAttributes<T> {
'data-sveltekit-keepfocus'?: true | '' | 'off' | undefined | null;
'data-sveltekit-noscroll'?: true | '' | 'off' | undefined | null;
'data-sveltekit-preload-code'?:
| true
| ''
| 'eager'
| 'viewport'
| 'hover'
| 'tap'
| 'off'
| undefined
| null;
'data-sveltekit-preload-data'?: true | '' | 'hover' | 'tap' | 'off' | undefined | null;
'data-sveltekit-reload'?: true | '' | 'off' | undefined | null;
'data-sveltekit-replacestate'?: true | '' | 'off' | undefined | null;
}
}
export {};
declare module "$app/types" {
export interface AppTypes {
RouteId(): "/" | "/migration" | "/migration/mappings" | "/settings";
RouteParams(): {
};
LayoutParams(): {
"/": Record<string, never>;
"/migration": Record<string, never>;
"/migration/mappings": Record<string, never>;
"/settings": Record<string, never>
};
Pathname(): "/" | "/migration" | "/migration/" | "/migration/mappings" | "/migration/mappings/" | "/settings" | "/settings/";
ResolvedPathname(): `${"" | `/${string}`}${ReturnType<AppTypes['Pathname']>}`;
Asset(): string & {};
}
}


@@ -1,162 +0,0 @@
{
".svelte-kit/generated/client-optimized/app.js": {
"file": "_app/immutable/entry/app.BXnpILpp.js",
"name": "entry/app",
"src": ".svelte-kit/generated/client-optimized/app.js",
"isEntry": true,
"imports": [
"_BtL0wB3H.js",
"_cv2LK44M.js",
"_BxZpmA7Z.js",
"_vVxDbqKK.js"
],
"dynamicImports": [
".svelte-kit/generated/client-optimized/nodes/0.js",
".svelte-kit/generated/client-optimized/nodes/1.js",
".svelte-kit/generated/client-optimized/nodes/2.js",
".svelte-kit/generated/client-optimized/nodes/3.js"
]
},
".svelte-kit/generated/client-optimized/nodes/0.js": {
"file": "_app/immutable/nodes/0.DZdF_zz-.js",
"name": "nodes/0",
"src": ".svelte-kit/generated/client-optimized/nodes/0.js",
"isEntry": true,
"isDynamicEntry": true,
"imports": [
"_cv2LK44M.js",
"_CRLlKr96.js",
"_BtL0wB3H.js",
"_xdjHc-A2.js",
"_DXE57cnx.js",
"_Dbod7Wv8.js"
],
"css": [
"_app/immutable/assets/0.RZHRvmcL.css"
]
},
".svelte-kit/generated/client-optimized/nodes/1.js": {
"file": "_app/immutable/nodes/1.Bh-fCbID.js",
"name": "nodes/1",
"src": ".svelte-kit/generated/client-optimized/nodes/1.js",
"isEntry": true,
"isDynamicEntry": true,
"imports": [
"_cv2LK44M.js",
"_CRLlKr96.js",
"_BtL0wB3H.js",
"_DXE57cnx.js"
]
},
".svelte-kit/generated/client-optimized/nodes/2.js": {
"file": "_app/immutable/nodes/2.BmiXdPHI.js",
"name": "nodes/2",
"src": ".svelte-kit/generated/client-optimized/nodes/2.js",
"isEntry": true,
"isDynamicEntry": true,
"imports": [
"_DyPeVqDG.js",
"_cv2LK44M.js",
"_CRLlKr96.js",
"_BtL0wB3H.js",
"_vVxDbqKK.js",
"_Dbod7Wv8.js",
"_BxZpmA7Z.js",
"_xdjHc-A2.js"
]
},
".svelte-kit/generated/client-optimized/nodes/3.js": {
"file": "_app/immutable/nodes/3.guWMyWpk.js",
"name": "nodes/3",
"src": ".svelte-kit/generated/client-optimized/nodes/3.js",
"isEntry": true,
"isDynamicEntry": true,
"imports": [
"_DyPeVqDG.js",
"_cv2LK44M.js",
"_CRLlKr96.js",
"_BtL0wB3H.js",
"_vVxDbqKK.js",
"_Dbod7Wv8.js"
]
},
"_BtL0wB3H.js": {
"file": "_app/immutable/chunks/BtL0wB3H.js",
"name": "index"
},
"_BxZpmA7Z.js": {
"file": "_app/immutable/chunks/BxZpmA7Z.js",
"name": "index-client",
"imports": [
"_BtL0wB3H.js"
]
},
"_CRLlKr96.js": {
"file": "_app/immutable/chunks/CRLlKr96.js",
"name": "legacy",
"imports": [
"_BtL0wB3H.js"
]
},
"_D0iaTcAo.js": {
"file": "_app/immutable/chunks/D0iaTcAo.js",
"name": "entry",
"imports": [
"_BtL0wB3H.js",
"_BxZpmA7Z.js"
]
},
"_DXE57cnx.js": {
"file": "_app/immutable/chunks/DXE57cnx.js",
"name": "stores",
"imports": [
"_D0iaTcAo.js"
]
},
"_Dbod7Wv8.js": {
"file": "_app/immutable/chunks/Dbod7Wv8.js",
"name": "toasts",
"imports": [
"_BtL0wB3H.js"
]
},
"_DyPeVqDG.js": {
"file": "_app/immutable/chunks/DyPeVqDG.js",
"name": "api",
"imports": [
"_BtL0wB3H.js",
"_Dbod7Wv8.js"
]
},
"_cv2LK44M.js": {
"file": "_app/immutable/chunks/cv2LK44M.js",
"name": "disclose-version",
"imports": [
"_BtL0wB3H.js"
]
},
"_vVxDbqKK.js": {
"file": "_app/immutable/chunks/vVxDbqKK.js",
"name": "props",
"imports": [
"_BtL0wB3H.js",
"_cv2LK44M.js"
]
},
"_xdjHc-A2.js": {
"file": "_app/immutable/chunks/xdjHc-A2.js",
"name": "class",
"imports": [
"_BtL0wB3H.js"
]
},
"node_modules/@sveltejs/kit/src/runtime/client/entry.js": {
"file": "_app/immutable/entry/start.BHAeOrfR.js",
"name": "entry/start",
"src": "node_modules/@sveltejs/kit/src/runtime/client/entry.js",
"isEntry": true,
"imports": [
"_D0iaTcAo.js"
]
}
}
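
(The JSON above is the Vite client manifest: keys are source modules, values are hashed output files plus their chunk dependencies. A sketch of reading it server-side to derive modulepreload targets — the filename is assumed, only Node APIs are used:)

// Illustrative: derive preload targets for the client entry from the manifest above.
import { readFileSync } from "node:fs";

const manifest = JSON.parse(readFileSync("client-manifest.json", "utf8")); // assumed filename
const entry = manifest[".svelte-kit/generated/client-optimized/app.js"];
console.log(entry.file);                             // "_app/immutable/entry/app.BXnpILpp.js"
const preload = entry.imports.map((key) => manifest[key].file);
console.log(preload);                                // hashed chunks to <link rel="modulepreload">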


@@ -1 +0,0 @@
{"version":"1766262590857"}


@@ -1 +0,0 @@
export const env={"PUBLIC_WS_URL":"ws://localhost:8000"}
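
(Application code reads the generated public env above through SvelteKit's standard `$env/dynamic/public` module:)

// Consuming the generated public env (standard SvelteKit module):
import { env } from "$env/dynamic/public";
const socket = new WebSocket(env.PUBLIC_WS_URL); // "ws://localhost:8000" per the file above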


@@ -1,180 +0,0 @@
{
".svelte-kit/generated/server/internal.js": {
"file": "internal.js",
"name": "internal",
"src": ".svelte-kit/generated/server/internal.js",
"isEntry": true,
"imports": [
"_internal.js",
"_environment.js"
]
},
"_api.js": {
"file": "chunks/api.js",
"name": "api",
"imports": [
"_toasts.js"
]
},
"_environment.js": {
"file": "chunks/environment.js",
"name": "environment"
},
"_equality.js": {
"file": "chunks/equality.js",
"name": "equality"
},
"_exports.js": {
"file": "chunks/exports.js",
"name": "exports"
},
"_false.js": {
"file": "chunks/false.js",
"name": "false"
},
"_index.js": {
"file": "chunks/index.js",
"name": "index",
"imports": [
"_equality.js"
]
},
"_index2.js": {
"file": "chunks/index2.js",
"name": "index",
"imports": [
"_false.js",
"_equality.js"
]
},
"_internal.js": {
"file": "chunks/internal.js",
"name": "internal",
"imports": [
"_index2.js",
"_equality.js",
"_environment.js"
]
},
"_shared.js": {
"file": "chunks/shared.js",
"name": "shared",
"imports": [
"_utils.js"
]
},
"_stores.js": {
"file": "chunks/stores.js",
"name": "stores",
"imports": [
"_index2.js",
"_exports.js",
"_utils.js",
"_equality.js"
]
},
"_toasts.js": {
"file": "chunks/toasts.js",
"name": "toasts",
"imports": [
"_index.js"
]
},
"_utils.js": {
"file": "chunks/utils.js",
"name": "utils"
},
"node_modules/@sveltejs/kit/src/runtime/app/server/remote/index.js": {
"file": "remote-entry.js",
"name": "remote-entry",
"src": "node_modules/@sveltejs/kit/src/runtime/app/server/remote/index.js",
"isEntry": true,
"imports": [
"_shared.js",
"_false.js",
"_environment.js"
]
},
"node_modules/@sveltejs/kit/src/runtime/server/index.js": {
"file": "index.js",
"name": "index",
"src": "node_modules/@sveltejs/kit/src/runtime/server/index.js",
"isEntry": true,
"imports": [
"_false.js",
"_environment.js",
"_shared.js",
"_exports.js",
"_utils.js",
"_index.js",
"_internal.js"
]
},
"src/routes/+error.svelte": {
"file": "entries/pages/_error.svelte.js",
"name": "entries/pages/_error.svelte",
"src": "src/routes/+error.svelte",
"isEntry": true,
"imports": [
"_index2.js",
"_stores.js"
]
},
"src/routes/+layout.svelte": {
"file": "entries/pages/_layout.svelte.js",
"name": "entries/pages/_layout.svelte",
"src": "src/routes/+layout.svelte",
"isEntry": true,
"imports": [
"_index2.js",
"_stores.js",
"_toasts.js"
],
"css": [
"_app/immutable/assets/_layout.RZHRvmcL.css"
]
},
"src/routes/+layout.ts": {
"file": "entries/pages/_layout.ts.js",
"name": "entries/pages/_layout.ts",
"src": "src/routes/+layout.ts",
"isEntry": true
},
"src/routes/+page.svelte": {
"file": "entries/pages/_page.svelte.js",
"name": "entries/pages/_page.svelte",
"src": "src/routes/+page.svelte",
"isEntry": true,
"imports": [
"_index2.js",
"_index.js"
]
},
"src/routes/+page.ts": {
"file": "entries/pages/_page.ts.js",
"name": "entries/pages/_page.ts",
"src": "src/routes/+page.ts",
"isEntry": true,
"imports": [
"_api.js"
]
},
"src/routes/settings/+page.svelte": {
"file": "entries/pages/settings/_page.svelte.js",
"name": "entries/pages/settings/_page.svelte",
"src": "src/routes/settings/+page.svelte",
"isEntry": true,
"imports": [
"_index2.js"
]
},
"src/routes/settings/+page.ts": {
"file": "entries/pages/settings/_page.ts.js",
"name": "entries/pages/settings/_page.ts",
"src": "src/routes/settings/+page.ts",
"isEntry": true,
"imports": [
"_api.js"
]
}
}


@@ -1,77 +0,0 @@
import { a as addToast } from "./toasts.js";
const API_BASE_URL = "/api";
async function fetchApi(endpoint) {
try {
console.log(`[api.fetchApi][Action] Fetching from context={{'endpoint': '${endpoint}'}}`);
const response = await fetch(`${API_BASE_URL}${endpoint}`);
if (!response.ok) {
throw new Error(`API request failed with status ${response.status}`);
}
return await response.json();
} catch (error) {
console.error(`[api.fetchApi][Coherence:Failed] Error fetching from ${endpoint}:`, error);
addToast(error.message, "error");
throw error;
}
}
async function postApi(endpoint, body) {
try {
console.log(`[api.postApi][Action] Posting to context={{'endpoint': '${endpoint}'}}`);
const response = await fetch(`${API_BASE_URL}${endpoint}`, {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify(body)
});
if (!response.ok) {
throw new Error(`API request failed with status ${response.status}`);
}
return await response.json();
} catch (error) {
console.error(`[api.postApi][Coherence:Failed] Error posting to ${endpoint}:`, error);
addToast(error.message, "error");
throw error;
}
}
async function requestApi(endpoint, method = "GET", body = null) {
try {
console.log(`[api.requestApi][Action] ${method} to context={{'endpoint': '${endpoint}'}}`);
const options = {
method,
headers: {
"Content-Type": "application/json"
}
};
if (body) {
options.body = JSON.stringify(body);
}
const response = await fetch(`${API_BASE_URL}${endpoint}`, options);
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `API request failed with status ${response.status}`);
}
return await response.json();
} catch (error) {
console.error(`[api.requestApi][Coherence:Failed] Error ${method} to ${endpoint}:`, error);
addToast(error.message, "error");
throw error;
}
}
const api = {
getPlugins: () => fetchApi("/plugins/"),
getTasks: () => fetchApi("/tasks/"),
getTask: (taskId) => fetchApi(`/tasks/${taskId}`),
createTask: (pluginId, params) => postApi("/tasks/", { plugin_id: pluginId, params }),
// Settings
getSettings: () => fetchApi("/settings/"),
updateGlobalSettings: (settings) => requestApi("/settings/global", "PATCH", settings),
getEnvironments: () => fetchApi("/settings/environments"),
addEnvironment: (env) => postApi("/settings/environments", env),
updateEnvironment: (id, env) => requestApi(`/settings/environments/${id}`, "PUT", env),
deleteEnvironment: (id) => requestApi(`/settings/environments/${id}`, "DELETE"),
testEnvironmentConnection: (id) => postApi(`/settings/environments/${id}/test`, {})
};
export {
api as a
};
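
(Typical usage of the compiled `api` client above, which the chunk re-exports as `a`; the plugin id and payloads here are hypothetical:)

// Illustrative usage of the api client above; ids and payloads are made up.
import { a as api } from "./api.js";

const plugins = await api.getPlugins();                 // GET  /api/plugins/
const task = await api.createTask("superset_migration", // POST /api/tasks/
                                  { dashboard_id: 42 });
await api.updateGlobalSettings({ logging: { level: "DEBUG" } }); // PATCH /api/settings/global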


@@ -1,34 +0,0 @@
let base = "";
let assets = base;
const app_dir = "_app";
const relative = true;
const initial = { base, assets };
function override(paths) {
base = paths.base;
assets = paths.assets;
}
function reset() {
base = initial.base;
assets = initial.assets;
}
function set_assets(path) {
assets = initial.assets = path;
}
let prerendering = false;
function set_building() {
}
function set_prerendering() {
prerendering = true;
}
export {
assets as a,
base as b,
app_dir as c,
reset as d,
set_building as e,
set_prerendering as f,
override as o,
prerendering as p,
relative as r,
set_assets as s
};


@@ -1,51 +0,0 @@
var is_array = Array.isArray;
var index_of = Array.prototype.indexOf;
var array_from = Array.from;
var define_property = Object.defineProperty;
var get_descriptor = Object.getOwnPropertyDescriptor;
var object_prototype = Object.prototype;
var array_prototype = Array.prototype;
var get_prototype_of = Object.getPrototypeOf;
var is_extensible = Object.isExtensible;
const noop = () => {
};
function run_all(arr) {
for (var i = 0; i < arr.length; i++) {
arr[i]();
}
}
function deferred() {
var resolve;
var reject;
var promise = new Promise((res, rej) => {
resolve = res;
reject = rej;
});
return { promise, resolve, reject };
}
function equals(value) {
return value === this.v;
}
function safe_not_equal(a, b) {
return a != a ? b == b : a !== b || a !== null && typeof a === "object" || typeof a === "function";
}
function safe_equals(value) {
return !safe_not_equal(value, this.v);
}
export {
array_from as a,
deferred as b,
array_prototype as c,
define_property as d,
equals as e,
get_prototype_of as f,
get_descriptor as g,
is_extensible as h,
is_array as i,
index_of as j,
safe_not_equal as k,
noop as n,
object_prototype as o,
run_all as r,
safe_equals as s
};
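
(`safe_not_equal` above implements Svelte's change-detection semantics: `NaN` counts as unchanged, while objects and functions always count as changed, since they may have been mutated in place. By inspection of the implementation:)

// Semantics of safe_not_equal(), derived from the code above:
import { k as safe_not_equal } from "./equality.js";

const fn = () => {};
safe_not_equal(NaN, NaN); // false — NaN is treated as equal to itself
safe_not_equal(1, 1);     // false
safe_not_equal({}, {});   // true  — any object counts as changed
safe_not_equal(fn, fn);   // true  — even the same function reference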


@@ -1,174 +0,0 @@
const SCHEME = /^[a-z][a-z\d+\-.]+:/i;
const internal = new URL("sveltekit-internal://");
function resolve(base, path) {
if (path[0] === "/" && path[1] === "/") return path;
let url = new URL(base, internal);
url = new URL(path, url);
return url.protocol === internal.protocol ? url.pathname + url.search + url.hash : url.href;
}
function normalize_path(path, trailing_slash) {
if (path === "/" || trailing_slash === "ignore") return path;
if (trailing_slash === "never") {
return path.endsWith("/") ? path.slice(0, -1) : path;
} else if (trailing_slash === "always" && !path.endsWith("/")) {
return path + "/";
}
return path;
}
function decode_pathname(pathname) {
return pathname.split("%25").map(decodeURI).join("%25");
}
function decode_params(params) {
for (const key in params) {
params[key] = decodeURIComponent(params[key]);
}
return params;
}
function make_trackable(url, callback, search_params_callback, allow_hash = false) {
const tracked = new URL(url);
Object.defineProperty(tracked, "searchParams", {
value: new Proxy(tracked.searchParams, {
get(obj, key) {
if (key === "get" || key === "getAll" || key === "has") {
return (param) => {
search_params_callback(param);
return obj[key](param);
};
}
callback();
const value = Reflect.get(obj, key);
return typeof value === "function" ? value.bind(obj) : value;
}
}),
enumerable: true,
configurable: true
});
const tracked_url_properties = ["href", "pathname", "search", "toString", "toJSON"];
if (allow_hash) tracked_url_properties.push("hash");
for (const property of tracked_url_properties) {
Object.defineProperty(tracked, property, {
get() {
callback();
return url[property];
},
enumerable: true,
configurable: true
});
}
{
tracked[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
return inspect(url, opts);
};
tracked.searchParams[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
return inspect(url.searchParams, opts);
};
}
if (!allow_hash) {
disable_hash(tracked);
}
return tracked;
}
function disable_hash(url) {
allow_nodejs_console_log(url);
Object.defineProperty(url, "hash", {
get() {
throw new Error(
"Cannot access event.url.hash. Consider using `page.url.hash` inside a component instead"
);
}
});
}
function disable_search(url) {
allow_nodejs_console_log(url);
for (const property of ["search", "searchParams"]) {
Object.defineProperty(url, property, {
get() {
throw new Error(`Cannot access url.${property} on a page with prerendering enabled`);
}
});
}
}
function allow_nodejs_console_log(url) {
{
url[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
return inspect(new URL(url), opts);
};
}
}
function validator(expected) {
function validate(module, file) {
if (!module) return;
for (const key in module) {
if (key[0] === "_" || expected.has(key)) continue;
const values = [...expected.values()];
const hint = hint_for_supported_files(key, file?.slice(file.lastIndexOf("."))) ?? `valid exports are ${values.join(", ")}, or anything with a '_' prefix`;
throw new Error(`Invalid export '${key}'${file ? ` in ${file}` : ""} (${hint})`);
}
}
return validate;
}
function hint_for_supported_files(key, ext = ".js") {
const supported_files = [];
if (valid_layout_exports.has(key)) {
supported_files.push(`+layout${ext}`);
}
if (valid_page_exports.has(key)) {
supported_files.push(`+page${ext}`);
}
if (valid_layout_server_exports.has(key)) {
supported_files.push(`+layout.server${ext}`);
}
if (valid_page_server_exports.has(key)) {
supported_files.push(`+page.server${ext}`);
}
if (valid_server_exports.has(key)) {
supported_files.push(`+server${ext}`);
}
if (supported_files.length > 0) {
return `'${key}' is a valid export in ${supported_files.slice(0, -1).join(", ")}${supported_files.length > 1 ? " or " : ""}${supported_files.at(-1)}`;
}
}
const valid_layout_exports = /* @__PURE__ */ new Set([
"load",
"prerender",
"csr",
"ssr",
"trailingSlash",
"config"
]);
const valid_page_exports = /* @__PURE__ */ new Set([...valid_layout_exports, "entries"]);
const valid_layout_server_exports = /* @__PURE__ */ new Set([...valid_layout_exports]);
const valid_page_server_exports = /* @__PURE__ */ new Set([...valid_layout_server_exports, "actions", "entries"]);
const valid_server_exports = /* @__PURE__ */ new Set([
"GET",
"POST",
"PATCH",
"PUT",
"DELETE",
"OPTIONS",
"HEAD",
"fallback",
"prerender",
"trailingSlash",
"config",
"entries"
]);
const validate_layout_exports = validator(valid_layout_exports);
const validate_page_exports = validator(valid_page_exports);
const validate_layout_server_exports = validator(valid_layout_server_exports);
const validate_page_server_exports = validator(valid_page_server_exports);
const validate_server_exports = validator(valid_server_exports);
export {
SCHEME as S,
decode_params as a,
validate_layout_exports as b,
validate_page_server_exports as c,
disable_search as d,
validate_page_exports as e,
decode_pathname as f,
validate_server_exports as g,
make_trackable as m,
normalize_path as n,
resolve as r,
validate_layout_server_exports as v
};
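
(Behavior of the path helpers above, by inspection of their implementations; `trailing_slash` takes "never", "always", or "ignore":)

// Path-helper behavior, derived from the code above:
import { n as normalize_path, r as resolve } from "./exports.js";

normalize_path("/migration/", "never");  // -> "/migration"
normalize_path("/migration", "always");  // -> "/migration/"
resolve("/migration/mappings", "edit");  // -> "/migration/edit" (relative URL resolution)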

View File

@@ -1,4 +0,0 @@
const BROWSER = false;
export {
BROWSER as B
};


@@ -1,59 +0,0 @@
import { n as noop, k as safe_not_equal } from "./equality.js";
import "clsx";
const subscriber_queue = [];
function readable(value, start) {
return {
subscribe: writable(value, start).subscribe
};
}
function writable(value, start = noop) {
let stop = null;
const subscribers = /* @__PURE__ */ new Set();
function set(new_value) {
if (safe_not_equal(value, new_value)) {
value = new_value;
if (stop) {
const run_queue = !subscriber_queue.length;
for (const subscriber of subscribers) {
subscriber[1]();
subscriber_queue.push(subscriber, value);
}
if (run_queue) {
for (let i = 0; i < subscriber_queue.length; i += 2) {
subscriber_queue[i][0](subscriber_queue[i + 1]);
}
subscriber_queue.length = 0;
}
}
}
}
function update(fn) {
set(fn(
/** @type {T} */
value
));
}
function subscribe(run, invalidate = noop) {
const subscriber = [run, invalidate];
subscribers.add(subscriber);
if (subscribers.size === 1) {
stop = start(set, update) || noop;
}
run(
/** @type {T} */
value
);
return () => {
subscribers.delete(subscriber);
if (subscribers.size === 0 && stop) {
stop();
stop = null;
}
};
}
return { set, update, subscribe };
}
export {
readable as r,
writable as w
};
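
(The `writable` implementation above follows Svelte's store contract: subscribers are called immediately on subscribe, the optional `start` function runs when the first subscriber arrives, and its cleanup runs when the last one leaves. For example:)

// Store-contract usage of the writable() above:
import { w as writable } from "./index.js";

const count = writable(0, (set) => {
  console.log("first subscriber");      // start: runs on 0 -> 1 subscribers
  return () => console.log("teardown"); // stop: runs on 1 -> 0 subscribers
});
const unsubscribe = count.subscribe((v) => console.log("count =", v)); // logs 0 immediately
count.set(1);               // logs 1
count.update((n) => n + 1); // logs 2
unsubscribe();              // logs "teardown"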

File diff suppressed because it is too large


@@ -1,982 +0,0 @@
import { H as HYDRATION_ERROR, C as COMMENT_NODE, a as HYDRATION_END, g as get_next_sibling, b as HYDRATION_START, c as HYDRATION_START_ELSE, e as effect_tracking, d as get, s as source, r as render_effect, u as untrack, i as increment, q as queue_micro_task, f as active_effect, h as block, j as branch, B as Batch, p as pause_effect, k as create_text, l as set_active_effect, m as set_active_reaction, n as set_component_context, o as handle_error, t as active_reaction, v as component_context, w as move_effect, x as internal_set, y as destroy_effect, z as invoke_error_boundary, A as svelte_boundary_reset_onerror, E as EFFECT_TRANSPARENT, D as EFFECT_PRESERVED, F as BOUNDARY_EFFECT, G as init_operations, I as get_first_child, J as hydration_failed, K as clear_text_content, L as component_root, M as is_passive_event, N as push, O as pop, P as set, Q as LEGACY_PROPS, R as flushSync, S as mutable_source, T as render, U as setContext } from "./index2.js";
import { d as define_property, a as array_from } from "./equality.js";
import "clsx";
import "./environment.js";
let public_env = {};
function set_private_env(environment) {
}
function set_public_env(environment) {
public_env = environment;
}
function hydration_mismatch(location) {
{
console.warn(`https://svelte.dev/e/hydration_mismatch`);
}
}
function svelte_boundary_reset_noop() {
{
console.warn(`https://svelte.dev/e/svelte_boundary_reset_noop`);
}
}
let hydrating = false;
function set_hydrating(value) {
hydrating = value;
}
let hydrate_node;
function set_hydrate_node(node) {
if (node === null) {
hydration_mismatch();
throw HYDRATION_ERROR;
}
return hydrate_node = node;
}
function hydrate_next() {
return set_hydrate_node(get_next_sibling(hydrate_node));
}
function next(count = 1) {
if (hydrating) {
var i = count;
var node = hydrate_node;
while (i--) {
node = /** @type {TemplateNode} */
get_next_sibling(node);
}
hydrate_node = node;
}
}
function skip_nodes(remove = true) {
var depth = 0;
var node = hydrate_node;
while (true) {
if (node.nodeType === COMMENT_NODE) {
var data = (
/** @type {Comment} */
node.data
);
if (data === HYDRATION_END) {
if (depth === 0) return node;
depth -= 1;
} else if (data === HYDRATION_START || data === HYDRATION_START_ELSE) {
depth += 1;
}
}
var next2 = (
/** @type {TemplateNode} */
get_next_sibling(node)
);
if (remove) node.remove();
node = next2;
}
}
function createSubscriber(start) {
let subscribers = 0;
let version = source(0);
let stop;
return () => {
if (effect_tracking()) {
get(version);
render_effect(() => {
if (subscribers === 0) {
stop = untrack(() => start(() => increment(version)));
}
subscribers += 1;
return () => {
queue_micro_task(() => {
subscribers -= 1;
if (subscribers === 0) {
stop?.();
stop = void 0;
increment(version);
}
});
};
});
}
};
}
var flags = EFFECT_TRANSPARENT | EFFECT_PRESERVED | BOUNDARY_EFFECT;
function boundary(node, props, children) {
new Boundary(node, props, children);
}
class Boundary {
/** @type {Boundary | null} */
parent;
#pending = false;
/** @type {TemplateNode} */
#anchor;
/** @type {TemplateNode | null} */
#hydrate_open = hydrating ? hydrate_node : null;
/** @type {BoundaryProps} */
#props;
/** @type {((anchor: Node) => void)} */
#children;
/** @type {Effect} */
#effect;
/** @type {Effect | null} */
#main_effect = null;
/** @type {Effect | null} */
#pending_effect = null;
/** @type {Effect | null} */
#failed_effect = null;
/** @type {DocumentFragment | null} */
#offscreen_fragment = null;
/** @type {TemplateNode | null} */
#pending_anchor = null;
#local_pending_count = 0;
#pending_count = 0;
#is_creating_fallback = false;
/**
* A source containing the number of pending async deriveds/expressions.
* Only created if `$effect.pending()` is used inside the boundary,
* otherwise updating the source results in needless `Batch.ensure()`
* calls followed by no-op flushes
* @type {Source<number> | null}
*/
#effect_pending = null;
#effect_pending_subscriber = createSubscriber(() => {
this.#effect_pending = source(this.#local_pending_count);
return () => {
this.#effect_pending = null;
};
});
/**
* @param {TemplateNode} node
* @param {BoundaryProps} props
* @param {((anchor: Node) => void)} children
*/
constructor(node, props, children) {
this.#anchor = node;
this.#props = props;
this.#children = children;
this.parent = /** @type {Effect} */
active_effect.b;
this.#pending = !!this.#props.pending;
this.#effect = block(() => {
active_effect.b = this;
if (hydrating) {
const comment = this.#hydrate_open;
hydrate_next();
const server_rendered_pending = (
/** @type {Comment} */
comment.nodeType === COMMENT_NODE && /** @type {Comment} */
comment.data === HYDRATION_START_ELSE
);
if (server_rendered_pending) {
this.#hydrate_pending_content();
} else {
this.#hydrate_resolved_content();
}
} else {
var anchor = this.#get_anchor();
try {
this.#main_effect = branch(() => children(anchor));
} catch (error) {
this.error(error);
}
if (this.#pending_count > 0) {
this.#show_pending_snippet();
} else {
this.#pending = false;
}
}
return () => {
this.#pending_anchor?.remove();
};
}, flags);
if (hydrating) {
this.#anchor = hydrate_node;
}
}
#hydrate_resolved_content() {
try {
this.#main_effect = branch(() => this.#children(this.#anchor));
} catch (error) {
this.error(error);
}
this.#pending = false;
}
#hydrate_pending_content() {
const pending = this.#props.pending;
if (!pending) {
return;
}
this.#pending_effect = branch(() => pending(this.#anchor));
Batch.enqueue(() => {
var anchor = this.#get_anchor();
this.#main_effect = this.#run(() => {
Batch.ensure();
return branch(() => this.#children(anchor));
});
if (this.#pending_count > 0) {
this.#show_pending_snippet();
} else {
pause_effect(
/** @type {Effect} */
this.#pending_effect,
() => {
this.#pending_effect = null;
}
);
this.#pending = false;
}
});
}
#get_anchor() {
var anchor = this.#anchor;
if (this.#pending) {
this.#pending_anchor = create_text();
this.#anchor.before(this.#pending_anchor);
anchor = this.#pending_anchor;
}
return anchor;
}
/**
* Returns `true` if the effect exists inside a boundary whose pending snippet is shown
* @returns {boolean}
*/
is_pending() {
return this.#pending || !!this.parent && this.parent.is_pending();
}
has_pending_snippet() {
return !!this.#props.pending;
}
/**
* @param {() => Effect | null} fn
*/
#run(fn) {
var previous_effect = active_effect;
var previous_reaction = active_reaction;
var previous_ctx = component_context;
set_active_effect(this.#effect);
set_active_reaction(this.#effect);
set_component_context(this.#effect.ctx);
try {
return fn();
} catch (e) {
handle_error(e);
return null;
} finally {
set_active_effect(previous_effect);
set_active_reaction(previous_reaction);
set_component_context(previous_ctx);
}
}
#show_pending_snippet() {
const pending = (
/** @type {(anchor: Node) => void} */
this.#props.pending
);
if (this.#main_effect !== null) {
this.#offscreen_fragment = document.createDocumentFragment();
this.#offscreen_fragment.append(
/** @type {TemplateNode} */
this.#pending_anchor
);
move_effect(this.#main_effect, this.#offscreen_fragment);
}
if (this.#pending_effect === null) {
this.#pending_effect = branch(() => pending(this.#anchor));
}
}
/**
* Updates the pending count associated with the currently visible pending snippet,
* if any, such that we can replace the snippet with content once work is done
* @param {1 | -1} d
*/
#update_pending_count(d) {
if (!this.has_pending_snippet()) {
if (this.parent) {
this.parent.#update_pending_count(d);
}
return;
}
this.#pending_count += d;
if (this.#pending_count === 0) {
this.#pending = false;
if (this.#pending_effect) {
pause_effect(this.#pending_effect, () => {
this.#pending_effect = null;
});
}
if (this.#offscreen_fragment) {
this.#anchor.before(this.#offscreen_fragment);
this.#offscreen_fragment = null;
}
}
}
/**
* Update the source that powers `$effect.pending()` inside this boundary,
* and controls when the current `pending` snippet (if any) is removed.
* Do not call from inside the class
* @param {1 | -1} d
*/
update_pending_count(d) {
this.#update_pending_count(d);
this.#local_pending_count += d;
if (this.#effect_pending) {
internal_set(this.#effect_pending, this.#local_pending_count);
}
}
get_effect_pending() {
this.#effect_pending_subscriber();
return get(
/** @type {Source<number>} */
this.#effect_pending
);
}
/** @param {unknown} error */
error(error) {
var onerror = this.#props.onerror;
let failed = this.#props.failed;
if (this.#is_creating_fallback || !onerror && !failed) {
throw error;
}
if (this.#main_effect) {
destroy_effect(this.#main_effect);
this.#main_effect = null;
}
if (this.#pending_effect) {
destroy_effect(this.#pending_effect);
this.#pending_effect = null;
}
if (this.#failed_effect) {
destroy_effect(this.#failed_effect);
this.#failed_effect = null;
}
if (hydrating) {
set_hydrate_node(
/** @type {TemplateNode} */
this.#hydrate_open
);
next();
set_hydrate_node(skip_nodes());
}
var did_reset = false;
var calling_on_error = false;
const reset = () => {
if (did_reset) {
svelte_boundary_reset_noop();
return;
}
did_reset = true;
if (calling_on_error) {
svelte_boundary_reset_onerror();
}
Batch.ensure();
this.#local_pending_count = 0;
if (this.#failed_effect !== null) {
pause_effect(this.#failed_effect, () => {
this.#failed_effect = null;
});
}
this.#pending = this.has_pending_snippet();
this.#main_effect = this.#run(() => {
this.#is_creating_fallback = false;
return branch(() => this.#children(this.#anchor));
});
if (this.#pending_count > 0) {
this.#show_pending_snippet();
} else {
this.#pending = false;
}
};
var previous_reaction = active_reaction;
try {
set_active_reaction(null);
calling_on_error = true;
onerror?.(error, reset);
calling_on_error = false;
} catch (error2) {
invoke_error_boundary(error2, this.#effect && this.#effect.parent);
} finally {
set_active_reaction(previous_reaction);
}
if (failed) {
queue_micro_task(() => {
this.#failed_effect = this.#run(() => {
Batch.ensure();
this.#is_creating_fallback = true;
try {
return branch(() => {
failed(
this.#anchor,
() => error,
() => reset
);
});
} catch (error2) {
invoke_error_boundary(
error2,
/** @type {Effect} */
this.#effect.parent
);
return null;
} finally {
this.#is_creating_fallback = false;
}
});
});
}
}
}
const all_registered_events = /* @__PURE__ */ new Set();
const root_event_handles = /* @__PURE__ */ new Set();
let last_propagated_event = null;
function handle_event_propagation(event) {
var handler_element = this;
var owner_document = (
/** @type {Node} */
handler_element.ownerDocument
);
var event_name = event.type;
var path = event.composedPath?.() || [];
var current_target = (
/** @type {null | Element} */
path[0] || event.target
);
last_propagated_event = event;
var path_idx = 0;
var handled_at = last_propagated_event === event && event.__root;
if (handled_at) {
var at_idx = path.indexOf(handled_at);
if (at_idx !== -1 && (handler_element === document || handler_element === /** @type {any} */
window)) {
event.__root = handler_element;
return;
}
var handler_idx = path.indexOf(handler_element);
if (handler_idx === -1) {
return;
}
if (at_idx <= handler_idx) {
path_idx = at_idx;
}
}
current_target = /** @type {Element} */
path[path_idx] || event.target;
if (current_target === handler_element) return;
define_property(event, "currentTarget", {
configurable: true,
get() {
return current_target || owner_document;
}
});
var previous_reaction = active_reaction;
var previous_effect = active_effect;
set_active_reaction(null);
set_active_effect(null);
try {
var throw_error;
var other_errors = [];
while (current_target !== null) {
var parent_element = current_target.assignedSlot || current_target.parentNode || /** @type {any} */
current_target.host || null;
try {
var delegated = current_target["__" + event_name];
if (delegated != null && (!/** @type {any} */
current_target.disabled || // DOM could've been updated already by the time this is reached, so we check this as well
// -> the target could not have been disabled because it emits the event in the first place
event.target === current_target)) {
delegated.call(current_target, event);
}
} catch (error) {
if (throw_error) {
other_errors.push(error);
} else {
throw_error = error;
}
}
if (event.cancelBubble || parent_element === handler_element || parent_element === null) {
break;
}
current_target = parent_element;
}
if (throw_error) {
for (let error of other_errors) {
queueMicrotask(() => {
throw error;
});
}
throw throw_error;
}
} finally {
event.__root = handler_element;
delete event.currentTarget;
set_active_reaction(previous_reaction);
set_active_effect(previous_effect);
}
}
function assign_nodes(start, end) {
var effect = (
/** @type {Effect} */
active_effect
);
if (effect.nodes === null) {
effect.nodes = { start, end, a: null, t: null };
}
}
function mount(component, options2) {
return _mount(component, options2);
}
function hydrate(component, options2) {
init_operations();
options2.intro = options2.intro ?? false;
const target = options2.target;
const was_hydrating = hydrating;
const previous_hydrate_node = hydrate_node;
try {
var anchor = get_first_child(target);
while (anchor && (anchor.nodeType !== COMMENT_NODE || /** @type {Comment} */
anchor.data !== HYDRATION_START)) {
anchor = get_next_sibling(anchor);
}
if (!anchor) {
throw HYDRATION_ERROR;
}
set_hydrating(true);
set_hydrate_node(
/** @type {Comment} */
anchor
);
const instance = _mount(component, { ...options2, anchor });
set_hydrating(false);
return (
/** @type {Exports} */
instance
);
} catch (error) {
if (error instanceof Error && error.message.split("\n").some((line) => line.startsWith("https://svelte.dev/e/"))) {
throw error;
}
if (error !== HYDRATION_ERROR) {
console.warn("Failed to hydrate: ", error);
}
if (options2.recover === false) {
hydration_failed();
}
init_operations();
clear_text_content(target);
set_hydrating(false);
return mount(component, options2);
} finally {
set_hydrating(was_hydrating);
set_hydrate_node(previous_hydrate_node);
}
}
const document_listeners = /* @__PURE__ */ new Map();
function _mount(Component, { target, anchor, props = {}, events, context, intro = true }) {
init_operations();
var registered_events = /* @__PURE__ */ new Set();
var event_handle = (events2) => {
for (var i = 0; i < events2.length; i++) {
var event_name = events2[i];
if (registered_events.has(event_name)) continue;
registered_events.add(event_name);
var passive = is_passive_event(event_name);
target.addEventListener(event_name, handle_event_propagation, { passive });
var n = document_listeners.get(event_name);
if (n === void 0) {
document.addEventListener(event_name, handle_event_propagation, { passive });
document_listeners.set(event_name, 1);
} else {
document_listeners.set(event_name, n + 1);
}
}
};
event_handle(array_from(all_registered_events));
root_event_handles.add(event_handle);
var component = void 0;
var unmount2 = component_root(() => {
var anchor_node = anchor ?? target.appendChild(create_text());
boundary(
/** @type {TemplateNode} */
anchor_node,
{
pending: () => {
}
},
(anchor_node2) => {
if (context) {
push({});
var ctx = (
/** @type {ComponentContext} */
component_context
);
ctx.c = context;
}
if (events) {
props.$$events = events;
}
if (hydrating) {
assign_nodes(
/** @type {TemplateNode} */
anchor_node2,
null
);
}
component = Component(anchor_node2, props) || {};
if (hydrating) {
active_effect.nodes.end = hydrate_node;
if (hydrate_node === null || hydrate_node.nodeType !== COMMENT_NODE || /** @type {Comment} */
hydrate_node.data !== HYDRATION_END) {
hydration_mismatch();
throw HYDRATION_ERROR;
}
}
if (context) {
pop();
}
}
);
return () => {
for (var event_name of registered_events) {
target.removeEventListener(event_name, handle_event_propagation);
var n = (
/** @type {number} */
document_listeners.get(event_name)
);
if (--n === 0) {
document.removeEventListener(event_name, handle_event_propagation);
document_listeners.delete(event_name);
} else {
document_listeners.set(event_name, n);
}
}
root_event_handles.delete(event_handle);
if (anchor_node !== anchor) {
anchor_node.parentNode?.removeChild(anchor_node);
}
};
});
mounted_components.set(component, unmount2);
return component;
}
let mounted_components = /* @__PURE__ */ new WeakMap();
function unmount(component, options2) {
const fn = mounted_components.get(component);
if (fn) {
mounted_components.delete(component);
return fn(options2);
}
return Promise.resolve();
}
function asClassComponent$1(component) {
return class extends Svelte4Component {
/** @param {any} options */
constructor(options2) {
super({
component,
...options2
});
}
};
}
class Svelte4Component {
/** @type {any} */
#events;
/** @type {Record<string, any>} */
#instance;
/**
* @param {ComponentConstructorOptions & {
* component: any;
* }} options
*/
constructor(options2) {
var sources = /* @__PURE__ */ new Map();
var add_source = (key, value) => {
var s = mutable_source(value, false, false);
sources.set(key, s);
return s;
};
const props = new Proxy(
{ ...options2.props || {}, $$events: {} },
{
get(target, prop) {
return get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop)));
},
has(target, prop) {
if (prop === LEGACY_PROPS) return true;
get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop)));
return Reflect.has(target, prop);
},
set(target, prop, value) {
set(sources.get(prop) ?? add_source(prop, value), value);
return Reflect.set(target, prop, value);
}
}
);
this.#instance = (options2.hydrate ? hydrate : mount)(options2.component, {
target: options2.target,
anchor: options2.anchor,
props,
context: options2.context,
intro: options2.intro ?? false,
recover: options2.recover
});
if (!options2?.props?.$$host || options2.sync === false) {
flushSync();
}
this.#events = props.$$events;
for (const key of Object.keys(this.#instance)) {
if (key === "$set" || key === "$destroy" || key === "$on") continue;
define_property(this, key, {
get() {
return this.#instance[key];
},
/** @param {any} value */
set(value) {
this.#instance[key] = value;
},
enumerable: true
});
}
this.#instance.$set = /** @param {Record<string, any>} next */
(next2) => {
Object.assign(props, next2);
};
this.#instance.$destroy = () => {
unmount(this.#instance);
};
}
/** @param {Record<string, any>} props */
$set(props) {
this.#instance.$set(props);
}
/**
* @param {string} event
* @param {(...args: any[]) => any} callback
* @returns {any}
*/
$on(event, callback) {
this.#events[event] = this.#events[event] || [];
const cb = (...args) => callback.call(this, ...args);
this.#events[event].push(cb);
return () => {
this.#events[event] = this.#events[event].filter(
/** @param {any} fn */
(fn) => fn !== cb
);
};
}
$destroy() {
this.#instance.$destroy();
}
}
let read_implementation = null;
function set_read_implementation(fn) {
read_implementation = fn;
}
function set_manifest(_) {
}
function asClassComponent(component) {
const component_constructor = asClassComponent$1(component);
const _render = (props, { context, csp } = {}) => {
const result = render(component, { props, context, csp });
const munged = Object.defineProperties(
/** @type {LegacyRenderResult & PromiseLike<LegacyRenderResult>} */
{},
{
css: {
value: { code: "", map: null }
},
head: {
get: () => result.head
},
html: {
get: () => result.body
},
then: {
/**
* this is not type-safe, but honestly it's the best I can do right now, and it's a straightforward function.
*
* @template TResult1
* @template [TResult2=never]
* @param { (value: LegacyRenderResult) => TResult1 } onfulfilled
* @param { (reason: unknown) => TResult2 } onrejected
*/
value: (onfulfilled, onrejected) => {
{
const user_result = onfulfilled({
css: munged.css,
head: munged.head,
html: munged.html
});
return Promise.resolve(user_result);
}
}
}
}
);
return munged;
};
component_constructor.render = _render;
return component_constructor;
}
function Root($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
let {
stores,
page,
constructors,
components = [],
form,
data_0 = null,
data_1 = null
} = $$props;
{
setContext("__svelte__", stores);
}
{
stores.page.set(page);
}
const Pyramid_1 = constructors[1];
if (constructors[1]) {
$$renderer2.push("<!--[-->");
const Pyramid_0 = constructors[0];
$$renderer2.push(`<!---->`);
Pyramid_0($$renderer2, {
data: data_0,
form,
params: page.params,
children: ($$renderer3) => {
$$renderer3.push(`<!---->`);
Pyramid_1($$renderer3, { data: data_1, form, params: page.params });
$$renderer3.push(`<!---->`);
},
$$slots: { default: true }
});
$$renderer2.push(`<!---->`);
} else {
$$renderer2.push("<!--[!-->");
const Pyramid_0 = constructors[0];
$$renderer2.push(`<!---->`);
Pyramid_0($$renderer2, { data: data_0, form, params: page.params });
$$renderer2.push(`<!---->`);
}
$$renderer2.push(`<!--]--> `);
{
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]-->`);
});
}
const root = asClassComponent(Root);
const options = {
app_template_contains_nonce: false,
async: false,
csp: { "mode": "auto", "directives": { "upgrade-insecure-requests": false, "block-all-mixed-content": false }, "reportOnly": { "upgrade-insecure-requests": false, "block-all-mixed-content": false } },
csrf_check_origin: true,
csrf_trusted_origins: [],
embedded: false,
env_public_prefix: "PUBLIC_",
env_private_prefix: "",
hash_routing: false,
hooks: null,
// added lazily, via `get_hooks`
preload_strategy: "modulepreload",
root,
service_worker: false,
service_worker_options: void 0,
templates: {
app: ({ head, body, assets, nonce, env }) => '<!DOCTYPE html>\n<html lang="en">\n <head>\n <meta charset="utf-8" />\n <link rel="icon" href="' + assets + '/favicon.png" />\n <meta name="viewport" content="width=device-width, initial-scale=1" />\n ' + head + '\n </head>\n <body data-sveltekit-preload-data="hover">\n <div style="display: contents">' + body + "</div>\n </body>\n</html>\n",
error: ({ status, message }) => '<!doctype html>\n<html lang="en">\n <head>\n <meta charset="utf-8" />\n <title>' + message + `</title>
<style>
body {
--bg: white;
--fg: #222;
--divider: #ccc;
background: var(--bg);
color: var(--fg);
font-family:
system-ui,
-apple-system,
BlinkMacSystemFont,
'Segoe UI',
Roboto,
Oxygen,
Ubuntu,
Cantarell,
'Open Sans',
'Helvetica Neue',
sans-serif;
display: flex;
align-items: center;
justify-content: center;
height: 100vh;
margin: 0;
}
.error {
display: flex;
align-items: center;
max-width: 32rem;
margin: 0 1rem;
}
.status {
font-weight: 200;
font-size: 3rem;
line-height: 1;
position: relative;
top: -0.05rem;
}
.message {
border-left: 1px solid var(--divider);
padding: 0 0 0 1rem;
margin: 0 0 0 1rem;
min-height: 2.5rem;
display: flex;
align-items: center;
}
.message h1 {
font-weight: 400;
font-size: 1em;
margin: 0;
}
@media (prefers-color-scheme: dark) {
body {
--bg: #222;
--fg: #ddd;
--divider: #666;
}
}
</style>
</head>
<body>
<div class="error">
<span class="status">` + status + '</span>\n <div class="message">\n <h1>' + message + "</h1>\n </div>\n </div>\n </body>\n</html>\n"
},
version_hash: "1ootf77"
};
async function get_hooks() {
let handle;
let handleFetch;
let handleError;
let handleValidationError;
let init;
let reroute;
let transport;
return {
handle,
handleFetch,
handleError,
handleValidationError,
init,
reroute,
transport
};
}
export {
set_public_env as a,
set_read_implementation as b,
set_manifest as c,
get_hooks as g,
options as o,
public_env as p,
read_implementation as r,
set_private_env as s
};
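
(The runtime above backs Svelte 5's public `mount`/`unmount` API as well as the Svelte-4-style class wrapper. Minimal client-side usage; the component import is hypothetical:)

// Minimal mount/unmount usage (public Svelte 5 API backed by the runtime above):
import { mount, unmount } from "svelte";
import App from "./App.svelte"; // hypothetical component

const app = mount(App, { target: document.body, props: { answer: 42 } });
// ...later, tear the component down again:
unmount(app);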


@@ -1,522 +0,0 @@
import * as devalue from "devalue";
import { t as text_decoder, b as base64_encode, c as base64_decode } from "./utils.js";
function set_nested_value(object, path_string, value) {
if (path_string.startsWith("n:")) {
path_string = path_string.slice(2);
value = value === "" ? void 0 : parseFloat(value);
} else if (path_string.startsWith("b:")) {
path_string = path_string.slice(2);
value = value === "on";
}
deep_set(object, split_path(path_string), value);
}
function convert_formdata(data) {
const result = {};
for (let key of data.keys()) {
const is_array = key.endsWith("[]");
let values = data.getAll(key);
if (is_array) key = key.slice(0, -2);
if (values.length > 1 && !is_array) {
throw new Error(`Form cannot contain duplicated keys — "${key}" has ${values.length} values`);
}
values = values.filter(
(entry) => typeof entry === "string" || entry.name !== "" || entry.size > 0
);
if (key.startsWith("n:")) {
key = key.slice(2);
values = values.map((v) => v === "" ? void 0 : parseFloat(
/** @type {string} */
v
));
} else if (key.startsWith("b:")) {
key = key.slice(2);
values = values.map((v) => v === "on");
}
set_nested_value(result, key, is_array ? values : values[0]);
}
return result;
}
const BINARY_FORM_CONTENT_TYPE = "application/x-sveltekit-formdata";
const BINARY_FORM_VERSION = 0;
async function deserialize_binary_form(request) {
if (request.headers.get("content-type") !== BINARY_FORM_CONTENT_TYPE) {
const form_data = await request.formData();
return { data: convert_formdata(form_data), meta: {}, form_data };
}
if (!request.body) {
throw new Error("Could not deserialize binary form: no body");
}
const reader = request.body.getReader();
const chunks = [];
async function get_chunk(index) {
if (index in chunks) return chunks[index];
let i = chunks.length;
while (i <= index) {
chunks[i] = reader.read().then((chunk) => chunk.value);
i++;
}
return chunks[index];
}
async function get_buffer(offset, length) {
let start_chunk;
let chunk_start = 0;
let chunk_index;
for (chunk_index = 0; ; chunk_index++) {
const chunk = await get_chunk(chunk_index);
if (!chunk) return null;
const chunk_end = chunk_start + chunk.byteLength;
if (offset >= chunk_start && offset < chunk_end) {
start_chunk = chunk;
break;
}
chunk_start = chunk_end;
}
if (offset + length <= chunk_start + start_chunk.byteLength) {
return start_chunk.subarray(offset - chunk_start, offset + length - chunk_start);
}
const buffer = new Uint8Array(length);
buffer.set(start_chunk.subarray(offset - chunk_start));
let cursor = start_chunk.byteLength - offset + chunk_start;
while (cursor < length) {
chunk_index++;
let chunk = await get_chunk(chunk_index);
if (!chunk) return null;
if (chunk.byteLength > length - cursor) {
chunk = chunk.subarray(0, length - cursor);
}
buffer.set(chunk, cursor);
cursor += chunk.byteLength;
}
return buffer;
}
const header = await get_buffer(0, 1 + 4 + 2);
if (!header) throw new Error("Could not deserialize binary form: too short");
if (header[0] !== BINARY_FORM_VERSION) {
throw new Error(
`Could not deserialize binary form: got version ${header[0]}, expected version ${BINARY_FORM_VERSION}`
);
}
const header_view = new DataView(header.buffer, header.byteOffset, header.byteLength);
const data_length = header_view.getUint32(1, true);
const file_offsets_length = header_view.getUint16(5, true);
const data_buffer = await get_buffer(1 + 4 + 2, data_length);
if (!data_buffer) throw new Error("Could not deserialize binary form: data too short");
let file_offsets;
let files_start_offset;
if (file_offsets_length > 0) {
const file_offsets_buffer = await get_buffer(1 + 4 + 2 + data_length, file_offsets_length);
if (!file_offsets_buffer)
throw new Error("Could not deserialize binary form: file offset table too short");
file_offsets = /** @type {Array<number>} */
JSON.parse(text_decoder.decode(file_offsets_buffer));
files_start_offset = 1 + 4 + 2 + data_length + file_offsets_length;
}
const [data, meta] = devalue.parse(text_decoder.decode(data_buffer), {
File: ([name, type, size, last_modified, index]) => {
return new Proxy(
new LazyFile(
name,
type,
size,
last_modified,
get_chunk,
files_start_offset + file_offsets[index]
),
{
getPrototypeOf() {
return File.prototype;
}
}
);
}
});
void (async () => {
let has_more = true;
while (has_more) {
const chunk = await get_chunk(chunks.length);
has_more = !!chunk;
}
})();
return { data, meta, form_data: null };
}
class LazyFile {
/** @type {(index: number) => Promise<Uint8Array<ArrayBuffer> | undefined>} */
#get_chunk;
/** @type {number} */
#offset;
/**
* @param {string} name
* @param {string} type
* @param {number} size
* @param {number} last_modified
* @param {(index: number) => Promise<Uint8Array<ArrayBuffer> | undefined>} get_chunk
* @param {number} offset
*/
constructor(name, type, size, last_modified, get_chunk, offset) {
this.name = name;
this.type = type;
this.size = size;
this.lastModified = last_modified;
this.webkitRelativePath = "";
this.#get_chunk = get_chunk;
this.#offset = offset;
this.arrayBuffer = this.arrayBuffer.bind(this);
this.bytes = this.bytes.bind(this);
this.slice = this.slice.bind(this);
this.stream = this.stream.bind(this);
this.text = this.text.bind(this);
}
/** @type {ArrayBuffer | undefined} */
#buffer;
async arrayBuffer() {
this.#buffer ??= await new Response(this.stream()).arrayBuffer();
return this.#buffer;
}
async bytes() {
return new Uint8Array(await this.arrayBuffer());
}
/**
* @param {number=} start
* @param {number=} end
* @param {string=} contentType
*/
slice(start = 0, end = this.size, contentType = this.type) {
if (start < 0) {
start = Math.max(this.size + start, 0);
} else {
start = Math.min(start, this.size);
}
if (end < 0) {
end = Math.max(this.size + end, 0);
} else {
end = Math.min(end, this.size);
}
const size = Math.max(end - start, 0);
const file = new LazyFile(
this.name,
contentType,
size,
this.lastModified,
this.#get_chunk,
this.#offset + start
);
return file;
}
stream() {
let cursor = 0;
let chunk_index = 0;
return new ReadableStream({
start: async (controller) => {
let chunk_start = 0;
let start_chunk = null;
for (chunk_index = 0; ; chunk_index++) {
const chunk = await this.#get_chunk(chunk_index);
if (!chunk) return null;
const chunk_end = chunk_start + chunk.byteLength;
if (this.#offset >= chunk_start && this.#offset < chunk_end) {
start_chunk = chunk;
break;
}
chunk_start = chunk_end;
}
if (this.#offset + this.size <= chunk_start + start_chunk.byteLength) {
controller.enqueue(
start_chunk.subarray(this.#offset - chunk_start, this.#offset + this.size - chunk_start)
);
controller.close();
} else {
controller.enqueue(start_chunk.subarray(this.#offset - chunk_start));
cursor = start_chunk.byteLength - this.#offset + chunk_start;
}
},
pull: async (controller) => {
chunk_index++;
let chunk = await this.#get_chunk(chunk_index);
if (!chunk) {
controller.error("Could not deserialize binary form: incomplete file data");
controller.close();
return;
}
if (chunk.byteLength > this.size - cursor) {
chunk = chunk.subarray(0, this.size - cursor);
}
controller.enqueue(chunk);
cursor += chunk.byteLength;
if (cursor >= this.size) {
controller.close();
}
}
});
}
async text() {
return text_decoder.decode(await this.arrayBuffer());
}
}
const path_regex = /^[a-zA-Z_$]\w*(\.[a-zA-Z_$]\w*|\[\d+\])*$/;
function split_path(path) {
if (!path_regex.test(path)) {
throw new Error(`Invalid path ${path}`);
}
return path.split(/\.|\[|\]/).filter(Boolean);
}
function check_prototype_pollution(key) {
if (key === "__proto__" || key === "constructor" || key === "prototype") {
throw new Error(
`Invalid key "${key}"`
);
}
}
function deep_set(object, keys, value) {
let current = object;
for (let i = 0; i < keys.length - 1; i += 1) {
const key = keys[i];
check_prototype_pollution(key);
const is_array = /^\d+$/.test(keys[i + 1]);
const exists = key in current;
const inner = current[key];
if (exists && is_array !== Array.isArray(inner)) {
throw new Error(`Invalid array key ${keys[i + 1]}`);
}
if (!exists) {
current[key] = is_array ? [] : {};
}
current = current[key];
}
const final_key = keys[keys.length - 1];
check_prototype_pollution(final_key);
current[final_key] = value;
}
function normalize_issue(issue, server = false) {
const normalized = { name: "", path: [], message: issue.message, server };
if (issue.path !== void 0) {
let name = "";
for (const segment of issue.path) {
const key = (
/** @type {string | number} */
typeof segment === "object" ? segment.key : segment
);
normalized.path.push(key);
if (typeof key === "number") {
name += `[${key}]`;
} else if (typeof key === "string") {
name += name === "" ? key : "." + key;
}
}
normalized.name = name;
}
return normalized;
}
function flatten_issues(issues) {
const result = {};
for (const issue of issues) {
(result.$ ??= []).push(issue);
let name = "";
if (issue.path !== void 0) {
for (const key of issue.path) {
if (typeof key === "number") {
name += `[${key}]`;
} else if (typeof key === "string") {
name += name === "" ? key : "." + key;
}
(result[name] ??= []).push(issue);
}
}
}
return result;
}
function deep_get(object, path) {
let current = object;
for (const key of path) {
if (current == null || typeof current !== "object") {
return current;
}
current = current[key];
}
return current;
}
function create_field_proxy(target, get_input, set_input, get_issues, path = []) {
const get_value = () => {
return deep_get(get_input(), path);
};
return new Proxy(target, {
get(target2, prop) {
if (typeof prop === "symbol") return target2[prop];
if (/^\d+$/.test(prop)) {
return create_field_proxy({}, get_input, set_input, get_issues, [
...path,
parseInt(prop, 10)
]);
}
const key = build_path_string(path);
if (prop === "set") {
const set_func = function(newValue) {
set_input(path, newValue);
return newValue;
};
return create_field_proxy(set_func, get_input, set_input, get_issues, [...path, prop]);
}
if (prop === "value") {
return create_field_proxy(get_value, get_input, set_input, get_issues, [...path, prop]);
}
if (prop === "issues" || prop === "allIssues") {
const issues_func = () => {
const all_issues = get_issues()[key === "" ? "$" : key];
if (prop === "allIssues") {
return all_issues?.map((issue) => ({
path: issue.path,
message: issue.message
}));
}
return all_issues?.filter((issue) => issue.name === key)?.map((issue) => ({
path: issue.path,
message: issue.message
}));
};
return create_field_proxy(issues_func, get_input, set_input, get_issues, [...path, prop]);
}
if (prop === "as") {
const as_func = (type, input_value) => {
const is_array = type === "file multiple" || type === "select multiple" || type === "checkbox" && typeof input_value === "string";
const prefix = type === "number" || type === "range" ? "n:" : type === "checkbox" && !is_array ? "b:" : "";
const base_props = {
name: prefix + key + (is_array ? "[]" : ""),
get "aria-invalid"() {
const issues = get_issues();
return key in issues ? "true" : void 0;
}
};
if (type !== "text" && type !== "select" && type !== "select multiple") {
base_props.type = type === "file multiple" ? "file" : type;
}
if (type === "submit" || type === "hidden") {
return Object.defineProperties(base_props, {
value: { value: input_value, enumerable: true }
});
}
if (type === "select" || type === "select multiple") {
return Object.defineProperties(base_props, {
multiple: { value: is_array, enumerable: true },
value: {
enumerable: true,
get() {
return get_value();
}
}
});
}
if (type === "checkbox" || type === "radio") {
return Object.defineProperties(base_props, {
value: { value: input_value ?? "on", enumerable: true },
checked: {
enumerable: true,
get() {
const value = get_value();
if (type === "radio") {
return value === input_value;
}
if (is_array) {
return (value ?? []).includes(input_value);
}
return value;
}
}
});
}
if (type === "file" || type === "file multiple") {
return Object.defineProperties(base_props, {
multiple: { value: is_array, enumerable: true },
files: {
enumerable: true,
get() {
const value = get_value();
if (value instanceof File) {
if (typeof DataTransfer !== "undefined") {
const fileList = new DataTransfer();
fileList.items.add(value);
return fileList.files;
}
return { 0: value, length: 1 };
}
if (Array.isArray(value) && value.every((f) => f instanceof File)) {
if (typeof DataTransfer !== "undefined") {
const fileList = new DataTransfer();
value.forEach((file) => fileList.items.add(file));
return fileList.files;
}
const fileListLike = { length: value.length };
value.forEach((file, index) => {
fileListLike[index] = file;
});
return fileListLike;
}
return null;
}
}
});
}
return Object.defineProperties(base_props, {
value: {
enumerable: true,
get() {
const value = get_value();
return value != null ? String(value) : "";
}
}
});
};
return create_field_proxy(as_func, get_input, set_input, get_issues, [...path, "as"]);
}
return create_field_proxy({}, get_input, set_input, get_issues, [...path, prop]);
}
});
}
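// Illustrative: given `fields = create_field_proxy({}, get_input, set_input, get_issues)`,
// property access accumulates a path and `as(...)` yields spreadable input props:
//
//   fields.user.email.as("text")   // -> { name: "user.email", "aria-invalid": ... }
//   fields.age.as("number")        // -> { name: "n:age", type: "number", value: ... }
//   fields.tags[0].value()         // reads the input at path ["tags", 0]
//   fields.title.set("Hello")      // writes via set_input(["title"], "Hello")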
function build_path_string(path) {
let result = "";
for (const segment of path) {
if (typeof segment === "number") {
result += `[${segment}]`;
} else {
result += result === "" ? segment : "." + segment;
}
}
return result;
}
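// Illustrative: build_path_string(["user", 0, "email"]) -> "user[0].email"
// (numeric segments render as [n], string segments are dot-joined).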
const INVALIDATED_PARAM = "x-sveltekit-invalidated";
const TRAILING_SLASH_PARAM = "x-sveltekit-trailing-slash";
function stringify(data, transport) {
const encoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.encode]));
return devalue.stringify(data, encoders);
}
function stringify_remote_arg(value, transport) {
if (value === void 0) return "";
const json_string = stringify(value, transport);
const bytes = new TextEncoder().encode(json_string);
return base64_encode(bytes).replaceAll("=", "").replaceAll("+", "-").replaceAll("/", "_");
}
function parse_remote_arg(string, transport) {
if (!string) return void 0;
const json_string = text_decoder.decode(
// no need to add back `=` characters, atob can handle it
base64_decode(string.replaceAll("-", "+").replaceAll("_", "/"))
);
const decoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.decode]));
return devalue.parse(json_string, decoders);
}
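// Illustrative round-trip (an empty transport is assumed): stringify_remote_arg
// devalue-encodes the argument, then base64url-encodes it without padding so it
// can sit in a URL path segment; parse_remote_arg reverses both steps:
//
//   const payload = stringify_remote_arg({ page: 2 }, {});
//   parse_remote_arg(payload, {});  // -> { page: 2 }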
function create_remote_key(id, payload) {
return id + "/" + payload;
}
export {
BINARY_FORM_CONTENT_TYPE as B,
INVALIDATED_PARAM as I,
TRAILING_SLASH_PARAM as T,
stringify_remote_arg as a,
create_field_proxy as b,
create_remote_key as c,
deserialize_binary_form as d,
set_nested_value as e,
flatten_issues as f,
deep_set as g,
normalize_issue as n,
parse_remote_arg as p,
stringify as s
};

View File

@@ -1,44 +0,0 @@
import { a0 as getContext } from "./index2.js";
import "clsx";
import "@sveltejs/kit/internal";
import "./exports.js";
import "./utils.js";
import "@sveltejs/kit/internal/server";
import { n as noop } from "./equality.js";
const is_legacy = noop.toString().includes("$$") || /function \w+\(\) \{\}/.test(noop.toString());
if (is_legacy) {
({
data: {},
form: null,
error: null,
params: {},
route: { id: null },
state: {},
status: -1,
url: new URL("https://example.com")
});
}
const getStores = () => {
const stores = getContext("__svelte__");
return {
/** @type {typeof page} */
page: {
subscribe: stores.page.subscribe
},
/** @type {typeof navigating} */
navigating: {
subscribe: stores.navigating.subscribe
},
/** @type {typeof updated} */
updated: stores.updated
};
};
const page = {
subscribe(fn) {
const store = getStores().page;
return store.subscribe(fn);
}
};
export {
page as p
};

View File

@@ -1,16 +0,0 @@
import { w as writable } from "./index.js";
const toasts = writable([]);
function addToast(message, type = "info", duration = 3e3) {
const id = Math.random().toString(36).substr(2, 9);
console.log(`[toasts.addToast][Action] Adding toast context={{'id': '${id}', 'type': '${type}', 'message': '${message}'}}`);
toasts.update((all) => [...all, { id, message, type }]);
setTimeout(() => removeToast(id), duration);
}
function removeToast(id) {
console.log(`[toasts.removeToast][Action] Removing toast context={{'id': '${id}'}}`);
toasts.update((all) => all.filter((t) => t.id !== id));
}
export {
addToast as a,
toasts as t
};

View File

@@ -1,43 +0,0 @@
const text_encoder = new TextEncoder();
const text_decoder = new TextDecoder();
function get_relative_path(from, to) {
const from_parts = from.split(/[/\\]/);
const to_parts = to.split(/[/\\]/);
from_parts.pop();
while (from_parts[0] === to_parts[0]) {
from_parts.shift();
to_parts.shift();
}
let i = from_parts.length;
while (i--) from_parts[i] = "..";
return from_parts.concat(to_parts).join("/");
}
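// Illustrative: get_relative_path("a/b/c.js", "a/d/e.js") -> "../d/e.js"
// (shared leading segments are dropped; the remaining `from` depth becomes "..").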
function base64_encode(bytes) {
if (globalThis.Buffer) {
return globalThis.Buffer.from(bytes).toString("base64");
}
let binary = "";
for (let i = 0; i < bytes.length; i++) {
binary += String.fromCharCode(bytes[i]);
}
return btoa(binary);
}
function base64_decode(encoded) {
if (globalThis.Buffer) {
const buffer = globalThis.Buffer.from(encoded, "base64");
return new Uint8Array(buffer);
}
const binary = atob(encoded);
const bytes = new Uint8Array(binary.length);
for (let i = 0; i < binary.length; i++) {
bytes[i] = binary.charCodeAt(i);
}
return bytes;
}
export {
text_encoder as a,
base64_encode as b,
base64_decode as c,
get_relative_path as g,
text_decoder as t
};

View File

@@ -1,12 +0,0 @@
import { _ as escape_html, X as store_get, Y as unsubscribe_stores } from "../../chunks/index2.js";
import { p as page } from "../../chunks/stores.js";
function _error($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
var $$store_subs;
$$renderer2.push(`<div class="container mx-auto p-4 text-center mt-20"><h1 class="text-6xl font-bold text-gray-800 mb-4">${escape_html(store_get($$store_subs ??= {}, "$page", page).status)}</h1> <p class="text-2xl text-gray-600 mb-8">${escape_html(store_get($$store_subs ??= {}, "$page", page).error?.message || "Page not found")}</p> <a href="/" class="bg-blue-500 text-white px-6 py-3 rounded-lg hover:bg-blue-600 transition-colors">Back to Dashboard</a></div>`);
if ($$store_subs) unsubscribe_stores($$store_subs);
});
}
export {
_error as default
};

View File

@@ -1,38 +0,0 @@
import { V as attr_class, W as stringify, X as store_get, Y as unsubscribe_stores, Z as ensure_array_like, _ as escape_html, $ as slot } from "../../chunks/index2.js";
import { p as page } from "../../chunks/stores.js";
import "clsx";
import { t as toasts } from "../../chunks/toasts.js";
function Navbar($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
var $$store_subs;
$$renderer2.push(`<header class="bg-white shadow-md p-4 flex justify-between items-center"><a href="/" class="text-3xl font-bold text-gray-800 focus:outline-none">Superset Tools</a> <nav class="space-x-4"><a href="/"${attr_class(`text-gray-600 hover:text-blue-600 font-medium ${stringify(store_get($$store_subs ??= {}, "$page", page).url.pathname === "/" ? "text-blue-600 border-b-2 border-blue-600" : "")}`)}>Dashboard</a> <a href="/settings"${attr_class(`text-gray-600 hover:text-blue-600 font-medium ${stringify(store_get($$store_subs ??= {}, "$page", page).url.pathname === "/settings" ? "text-blue-600 border-b-2 border-blue-600" : "")}`)}>Settings</a></nav></header>`);
if ($$store_subs) unsubscribe_stores($$store_subs);
});
}
function Footer($$renderer) {
$$renderer.push(`<footer class="bg-white border-t p-4 mt-8 text-center text-gray-500 text-sm">© 2025 Superset Tools. All rights reserved.</footer>`);
}
function Toast($$renderer) {
var $$store_subs;
$$renderer.push(`<div class="fixed bottom-0 right-0 p-4 space-y-2"><!--[-->`);
const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$toasts", toasts));
for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
let toast = each_array[$$index];
$$renderer.push(`<div${attr_class(`p-4 rounded-md shadow-lg text-white ${stringify(toast.type === "info" && "bg-blue-500")} ${stringify(toast.type === "success" && "bg-green-500")} ${stringify(toast.type === "error" && "bg-red-500")} `)}>${escape_html(toast.message)}</div>`);
}
$$renderer.push(`<!--]--></div>`);
if ($$store_subs) unsubscribe_stores($$store_subs);
}
function _layout($$renderer, $$props) {
Toast($$renderer);
$$renderer.push(`<!----> <main class="bg-gray-50 min-h-screen flex flex-col">`);
Navbar($$renderer);
$$renderer.push(`<!----> <div class="p-4 flex-grow"><!--[-->`);
slot($$renderer, $$props, "default", {});
$$renderer.push(`<!--]--></div> `);
Footer($$renderer);
$$renderer.push(`<!----></main>`);
}
export {
_layout as default
};

View File

@@ -1,6 +0,0 @@
const ssr = false;
const prerender = false;
export {
prerender,
ssr
};

View File

@@ -1,132 +0,0 @@
import { a1 as ssr_context, X as store_get, _ as escape_html, Z as ensure_array_like, V as attr_class, Y as unsubscribe_stores, a2 as attr, a3 as bind_props } from "../../chunks/index2.js";
import { w as writable } from "../../chunks/index.js";
import "clsx";
function onDestroy(fn) {
/** @type {SSRContext} */
ssr_context.r.on_destroy(fn);
}
const plugins = writable([]);
const selectedPlugin = writable(null);
const selectedTask = writable(null);
const taskLogs = writable([]);
function TaskRunner($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
var $$store_subs;
onDestroy(() => {
});
$$renderer2.push(`<div class="p-4 border rounded-lg bg-white shadow-md">`);
if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<h2 class="text-xl font-semibold mb-2">Task: ${escape_html(store_get($$store_subs ??= {}, "$selectedTask", selectedTask).plugin_id)}</h2> <div class="bg-gray-900 text-white font-mono text-sm p-4 rounded-md h-96 overflow-y-auto"><!--[-->`);
const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$taskLogs", taskLogs));
for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
let log = each_array[$$index];
$$renderer2.push(`<div><span class="text-gray-400">${escape_html(new Date(log.timestamp).toLocaleTimeString())}</span> <span${attr_class(log.level === "ERROR" ? "text-red-500" : "text-green-400")}>[${escape_html(log.level)}]</span> <span>${escape_html(log.message)}</span></div>`);
}
$$renderer2.push(`<!--]--></div>`);
} else {
$$renderer2.push("<!--[!-->");
$$renderer2.push(`<p>No task selected.</p>`);
}
$$renderer2.push(`<!--]--></div>`);
if ($$store_subs) unsubscribe_stores($$store_subs);
});
}
function DynamicForm($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
let schema = $$props["schema"];
let formData = {};
function initializeForm() {
if (schema && schema.properties) {
for (const key in schema.properties) {
formData[key] = schema.properties[key].default || "";
}
}
}
initializeForm();
$$renderer2.push(`<form class="space-y-4">`);
if (schema && schema.properties) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<!--[-->`);
const each_array = ensure_array_like(Object.entries(schema.properties));
for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
let [key, prop] = each_array[$$index];
$$renderer2.push(`<div class="flex flex-col"><label${attr("for", key)} class="mb-1 font-semibold text-gray-700">${escape_html(prop.title || key)}</label> `);
if (prop.type === "string") {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<input type="text"${attr("id", key)}${attr("value", formData[key])}${attr("placeholder", prop.description || "")} class="p-2 border rounded-md"/>`);
} else {
$$renderer2.push("<!--[!-->");
if (prop.type === "number" || prop.type === "integer") {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<input type="number"${attr("id", key)}${attr("value", formData[key])}${attr("placeholder", prop.description || "")} class="p-2 border rounded-md"/>`);
} else {
$$renderer2.push("<!--[!-->");
if (prop.type === "boolean") {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<input type="checkbox"${attr("id", key)}${attr("checked", formData[key], true)} class="h-5 w-5"/>`);
} else {
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]-->`);
}
$$renderer2.push(`<!--]-->`);
}
$$renderer2.push(`<!--]--></div>`);
}
$$renderer2.push(`<!--]--> <button type="submit" class="w-full bg-green-500 text-white p-2 rounded-md hover:bg-green-600">Run Task</button>`);
} else {
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]--></form>`);
bind_props($$props, { schema });
});
}
function _page($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
var $$store_subs;
let data = $$props["data"];
if (data.plugins) {
plugins.set(data.plugins);
}
$$renderer2.push(`<div class="container mx-auto p-4">`);
if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) {
$$renderer2.push("<!--[-->");
TaskRunner($$renderer2);
$$renderer2.push(`<!----> <button class="mt-4 bg-blue-500 text-white p-2 rounded">Back to Task List</button>`);
} else {
$$renderer2.push("<!--[!-->");
if (store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin)) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<h2 class="text-2xl font-bold mb-4">${escape_html(store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).name)}</h2> `);
DynamicForm($$renderer2, {
schema: store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).schema
});
$$renderer2.push(`<!----> <button class="mt-4 bg-gray-500 text-white p-2 rounded">Back to Dashboard</button>`);
} else {
$$renderer2.push("<!--[!-->");
$$renderer2.push(`<h1 class="text-2xl font-bold mb-4">Available Tools</h1> `);
if (data.error) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">${escape_html(data.error)}</div>`);
} else {
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]--> <div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4"><!--[-->`);
const each_array = ensure_array_like(data.plugins);
for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
let plugin = each_array[$$index];
$$renderer2.push(`<div class="border rounded-lg p-4 cursor-pointer hover:bg-gray-100" role="button" tabindex="0"><h2 class="text-xl font-semibold">${escape_html(plugin.name)}</h2> <p class="text-gray-600">${escape_html(plugin.description)}</p> <span class="text-sm text-gray-400">v${escape_html(plugin.version)}</span></div>`);
}
$$renderer2.push(`<!--]--></div>`);
}
$$renderer2.push(`<!--]-->`);
}
$$renderer2.push(`<!--]--></div>`);
if ($$store_subs) unsubscribe_stores($$store_subs);
bind_props($$props, { data });
});
}
export {
_page as default
};

View File

@@ -1,18 +0,0 @@
import { a as api } from "../../chunks/api.js";
async function load() {
try {
const plugins = await api.getPlugins();
return {
plugins
};
} catch (error) {
console.error("Failed to load plugins:", error);
return {
plugins: [],
error: "Failed to load plugins"
};
}
}
export {
load
};

View File

@@ -1,45 +0,0 @@
import { _ as escape_html, a2 as attr, Z as ensure_array_like, a3 as bind_props } from "../../../chunks/index2.js";
function _page($$renderer, $$props) {
$$renderer.component(($$renderer2) => {
let data = $$props["data"];
let settings = data.settings;
let newEnv = {
id: "",
name: "",
url: "",
username: "",
password: "",
is_default: false
};
settings = data.settings;
$$renderer2.push(`<div class="container mx-auto p-4"><h1 class="text-2xl font-bold mb-6">Settings</h1> `);
if (data.error) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">${escape_html(data.error)}</div>`);
} else {
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]--> <section class="mb-8 bg-white p-6 rounded shadow"><h2 class="text-xl font-semibold mb-4">Global Settings</h2> <div class="grid grid-cols-1 gap-4"><div><label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label> <input type="text" id="backup_path"${attr("value", settings.settings.backup_path)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <button class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max">Save Global Settings</button></div></section> <section class="mb-8 bg-white p-6 rounded shadow"><h2 class="text-xl font-semibold mb-4">Superset Environments</h2> `);
if (settings.environments.length === 0) {
$$renderer2.push("<!--[-->");
$$renderer2.push(`<div class="mb-4 p-4 bg-yellow-100 border-l-4 border-yellow-500 text-yellow-700"><p class="font-bold">Warning</p> <p>No Superset environments configured. You must add at least one environment to perform backups or migrations.</p></div>`);
} else {
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]--> <div class="mb-6 overflow-x-auto"><table class="min-w-full divide-y divide-gray-200"><thead class="bg-gray-50"><tr><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Name</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">URL</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Username</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Default</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Actions</th></tr></thead><tbody class="bg-white divide-y divide-gray-200"><!--[-->`);
const each_array = ensure_array_like(settings.environments);
for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
let env = each_array[$$index];
$$renderer2.push(`<tr><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.name)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.url)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.username)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.is_default ? "Yes" : "No")}</td><td class="px-6 py-4 whitespace-nowrap"><button class="text-green-600 hover:text-green-900 mr-4">Test</button> <button class="text-indigo-600 hover:text-indigo-900 mr-4">Edit</button> <button class="text-red-600 hover:text-red-900">Delete</button></td></tr>`);
}
$$renderer2.push(`<!--]--></tbody></table></div> <div class="bg-gray-50 p-4 rounded"><h3 class="text-lg font-medium mb-4">${escape_html("Add")} Environment</h3> <div class="grid grid-cols-1 md:grid-cols-2 gap-4"><div><label for="env_id" class="block text-sm font-medium text-gray-700">ID</label> <input type="text" id="env_id"${attr("value", newEnv.id)}${attr("disabled", false, true)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_name" class="block text-sm font-medium text-gray-700">Name</label> <input type="text" id="env_name"${attr("value", newEnv.name)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_url" class="block text-sm font-medium text-gray-700">URL</label> <input type="text" id="env_url"${attr("value", newEnv.url)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_user" class="block text-sm font-medium text-gray-700">Username</label> <input type="text" id="env_user"${attr("value", newEnv.username)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_pass" class="block text-sm font-medium text-gray-700">Password</label> <input type="password" id="env_pass"${attr("value", newEnv.password)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div class="flex items-center"><input type="checkbox" id="env_default"${attr("checked", newEnv.is_default, true)} class="h-4 w-4 text-blue-600 border-gray-300 rounded"/> <label for="env_default" class="ml-2 block text-sm text-gray-900">Default Environment</label></div></div> <div class="mt-4 flex gap-2"><button class="bg-green-500 text-white px-4 py-2 rounded hover:bg-green-600">${escape_html("Add")} Environment</button> `);
{
$$renderer2.push("<!--[!-->");
}
$$renderer2.push(`<!--]--></div></div></section></div>`);
bind_props($$props, { data });
});
}
export {
_page as default
};

View File

@@ -1,24 +0,0 @@
import { a as api } from "../../../chunks/api.js";
async function load() {
try {
const settings = await api.getSettings();
return {
settings
};
} catch (error) {
console.error("Failed to load settings:", error);
return {
settings: {
environments: [],
settings: {
backup_path: "",
default_environment_id: null
}
},
error: "Failed to load settings"
};
}
}
export {
load
};

File diff suppressed because it is too large

View File

@@ -1,13 +0,0 @@
import { g, o, c, s, a, b } from "./chunks/internal.js";
import { s as s2, e, f } from "./chunks/environment.js";
export {
g as get_hooks,
o as options,
s2 as set_assets,
e as set_building,
c as set_manifest,
f as set_prerendering,
s as set_private_env,
a as set_public_env,
b as set_read_implementation
};

View File

@@ -1,47 +0,0 @@
export const manifest = (() => {
function __memo(fn) {
let value;
return () => value ??= (value = fn());
}
return {
appDir: "_app",
appPath: "_app",
assets: new Set([]),
mimeTypes: {},
_: {
client: {start:"_app/immutable/entry/start.BHAeOrfR.js",app:"_app/immutable/entry/app.BXnpILpp.js",imports:["_app/immutable/entry/start.BHAeOrfR.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/entry/app.BXnpILpp.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/vVxDbqKK.js"],stylesheets:[],fonts:[],uses_env_dynamic_public:false},
nodes: [
__memo(() => import('./nodes/0.js')),
__memo(() => import('./nodes/1.js')),
__memo(() => import('./nodes/2.js')),
__memo(() => import('./nodes/3.js'))
],
remotes: {
},
routes: [
{
id: "/",
pattern: /^\/$/,
params: [],
page: { layouts: [0,], errors: [1,], leaf: 2 },
endpoint: null
},
{
id: "/settings",
pattern: /^\/settings\/?$/,
params: [],
page: { layouts: [0,], errors: [1,], leaf: 3 },
endpoint: null
}
],
prerendered_routes: new Set([]),
matchers: async () => {
return { };
},
server_assets: {}
}
}
})();

View File

@@ -1,47 +0,0 @@
export const manifest = (() => {
function __memo(fn) {
let value;
return () => value ??= (value = fn());
}
return {
appDir: "_app",
appPath: "_app",
assets: new Set([]),
mimeTypes: {},
_: {
client: {start:"_app/immutable/entry/start.BHAeOrfR.js",app:"_app/immutable/entry/app.BXnpILpp.js",imports:["_app/immutable/entry/start.BHAeOrfR.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/entry/app.BXnpILpp.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/vVxDbqKK.js"],stylesheets:[],fonts:[],uses_env_dynamic_public:false},
nodes: [
__memo(() => import('./nodes/0.js')),
__memo(() => import('./nodes/1.js')),
__memo(() => import('./nodes/2.js')),
__memo(() => import('./nodes/3.js'))
],
remotes: {
},
routes: [
{
id: "/",
pattern: /^\/$/,
params: [],
page: { layouts: [0,], errors: [1,], leaf: 2 },
endpoint: null
},
{
id: "/settings",
pattern: /^\/settings\/?$/,
params: [],
page: { layouts: [0,], errors: [1,], leaf: 3 },
endpoint: null
}
],
prerendered_routes: new Set([]),
matchers: async () => {
return { };
},
server_assets: {}
}
}
})();

View File

@@ -1,13 +0,0 @@
export const index = 0;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_layout.svelte.js')).default;
export const universal = {
"ssr": false,
"prerender": false
};
export const universal_id = "src/routes/+layout.ts";
export const imports = ["_app/immutable/nodes/0.DZdF_zz-.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/xdjHc-A2.js","_app/immutable/chunks/DXE57cnx.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/Dbod7Wv8.js"];
export const stylesheets = ["_app/immutable/assets/0.RZHRvmcL.css"];
export const fonts = [];

View File

@@ -1,8 +0,0 @@
export const index = 1;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_error.svelte.js')).default;
export const imports = ["_app/immutable/nodes/1.Bh-fCbID.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/DXE57cnx.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BxZpmA7Z.js"];
export const stylesheets = [];
export const fonts = [];

View File

@@ -1,14 +0,0 @@
export const index = 2;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_page.svelte.js')).default;
export const universal = {
"ssr": false,
"prerender": false,
"load": null
};
export const universal_id = "src/routes/+page.ts";
export const imports = ["_app/immutable/nodes/2.BmiXdPHI.js","_app/immutable/chunks/DyPeVqDG.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/Dbod7Wv8.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/vVxDbqKK.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/xdjHc-A2.js"];
export const stylesheets = [];
export const fonts = [];

View File

@@ -1,14 +0,0 @@
export const index = 3;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/settings/_page.svelte.js')).default;
export const universal = {
"ssr": false,
"prerender": false,
"load": null
};
export const universal_id = "src/routes/settings/+page.ts";
export const imports = ["_app/immutable/nodes/3.guWMyWpk.js","_app/immutable/chunks/DyPeVqDG.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/Dbod7Wv8.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/vVxDbqKK.js"];
export const stylesheets = [];
export const fonts = [];

View File

@@ -1,562 +0,0 @@
import { get_request_store, with_request_store } from "@sveltejs/kit/internal/server";
import { parse } from "devalue";
import { error, json } from "@sveltejs/kit";
import { a as stringify_remote_arg, f as flatten_issues, b as create_field_proxy, n as normalize_issue, e as set_nested_value, g as deep_set, s as stringify, c as create_remote_key } from "./chunks/shared.js";
import { ValidationError } from "@sveltejs/kit/internal";
import { B as BROWSER } from "./chunks/false.js";
import { b as base, c as app_dir, p as prerendering } from "./chunks/environment.js";
function create_validator(validate_or_fn, maybe_fn) {
if (!maybe_fn) {
return (arg) => {
if (arg !== void 0) {
error(400, "Bad Request");
}
};
}
if (validate_or_fn === "unchecked") {
return (arg) => arg;
}
if ("~standard" in validate_or_fn) {
return async (arg) => {
const { event, state } = get_request_store();
const result = await validate_or_fn["~standard"].validate(arg);
if (result.issues) {
error(
400,
await state.handleValidationError({
issues: result.issues,
event
})
);
}
return result.value;
};
}
throw new Error(
'Invalid validator passed to remote function. Expected "unchecked" or a Standard Schema (https://standardschema.dev)'
);
}
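// Illustrative: the three validator shapes create_validator accepts, shown with
// hypothetical handlers (`v.object`/`v.number` stand in for any Standard Schema
// library such as Valibot or Zod):
//
//   query(() => listAll())                                       // no validator: arg must be undefined
//   query("unchecked", (arg) => useRaw(arg))                     // arg passed through as-is
//   query(v.object({ id: v.number() }), (arg) => byId(arg.id))   // validated, issues -> 400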
async function get_response(info, arg, state, get_result) {
await 0;
const cache = get_cache(info, state);
return cache[stringify_remote_arg(arg, state.transport)] ??= get_result();
}
function parse_remote_response(data, transport) {
const revivers = {};
for (const key in transport) {
revivers[key] = transport[key].decode;
}
return parse(data, revivers);
}
async function run_remote_function(event, state, allow_cookies, arg, validate, fn) {
const store = {
event: {
...event,
setHeaders: () => {
throw new Error("setHeaders is not allowed in remote functions");
},
cookies: {
...event.cookies,
set: (name, value, opts) => {
if (!allow_cookies) {
throw new Error("Cannot set cookies in `query` or `prerender` functions");
}
if (opts.path && !opts.path.startsWith("/")) {
throw new Error("Cookies set in remote functions must have an absolute path");
}
return event.cookies.set(name, value, opts);
},
delete: (name, opts) => {
if (!allow_cookies) {
throw new Error("Cannot delete cookies in `query` or `prerender` functions");
}
if (opts.path && !opts.path.startsWith("/")) {
throw new Error("Cookies deleted in remote functions must have an absolute path");
}
return event.cookies.delete(name, opts);
}
}
},
state: {
...state,
is_in_remote_function: true
}
};
const validated = await with_request_store(store, () => validate(arg));
return with_request_store(store, () => fn(validated));
}
function get_cache(info, state = get_request_store().state) {
let cache = state.remote_data?.get(info);
if (cache === void 0) {
cache = {};
(state.remote_data ??= /* @__PURE__ */ new Map()).set(info, cache);
}
return cache;
}
// @__NO_SIDE_EFFECTS__
function command(validate_or_fn, maybe_fn) {
const fn = maybe_fn ?? validate_or_fn;
const validate = create_validator(validate_or_fn, maybe_fn);
const __ = { type: "command", id: "", name: "" };
const wrapper = (arg) => {
const { event, state } = get_request_store();
if (state.is_endpoint_request) {
if (!["POST", "PUT", "PATCH", "DELETE"].includes(event.request.method)) {
throw new Error(
`Cannot call a command (\`${__.name}(${maybe_fn ? "..." : ""})\`) from a ${event.request.method} handler`
);
}
} else if (!event.isRemoteRequest) {
throw new Error(
`Cannot call a command (\`${__.name}(${maybe_fn ? "..." : ""})\`) during server-side rendering`
);
}
state.refreshes ??= {};
const promise = Promise.resolve(run_remote_function(event, state, true, arg, validate, fn));
promise.updates = () => {
throw new Error(`Cannot call '${__.name}(...).updates(...)' on the server`);
};
return (
/** @type {ReturnType<RemoteCommand<Input, Output>>} */
promise
);
};
Object.defineProperty(wrapper, "__", { value: __ });
Object.defineProperty(wrapper, "pending", {
get: () => 0
});
return wrapper;
}
// @__NO_SIDE_EFFECTS__
function form(validate_or_fn, maybe_fn) {
const fn = maybe_fn ?? validate_or_fn;
const schema = !maybe_fn || validate_or_fn === "unchecked" ? null : (
/** @type {any} */
validate_or_fn
);
function create_instance(key) {
const instance = {};
instance.method = "POST";
Object.defineProperty(instance, "enhance", {
value: () => {
return { action: instance.action, method: instance.method };
}
});
const button_props = {
type: "submit",
onclick: () => {
}
};
Object.defineProperty(button_props, "enhance", {
value: () => {
return { type: "submit", formaction: instance.buttonProps.formaction, onclick: () => {
} };
}
});
Object.defineProperty(instance, "buttonProps", {
value: button_props
});
const __ = {
type: "form",
name: "",
id: "",
fn: async (data, meta, form_data) => {
const output = {};
output.submission = true;
const { event, state } = get_request_store();
const validated = await schema?.["~standard"].validate(data);
if (meta.validate_only) {
return validated?.issues?.map((issue) => normalize_issue(issue, true)) ?? [];
}
if (validated?.issues !== void 0) {
handle_issues(output, validated.issues, form_data);
} else {
if (validated !== void 0) {
data = validated.value;
}
state.refreshes ??= {};
const issue = create_issues();
try {
output.result = await run_remote_function(
event,
state,
true,
data,
(d) => d,
(data2) => !maybe_fn ? fn() : fn(data2, issue)
);
} catch (e) {
if (e instanceof ValidationError) {
handle_issues(output, e.issues, form_data);
} else {
throw e;
}
}
}
if (!event.isRemoteRequest) {
get_cache(__, state)[""] ??= output;
}
return output;
}
};
Object.defineProperty(instance, "__", { value: __ });
Object.defineProperty(instance, "action", {
get: () => `?/remote=${__.id}`,
enumerable: true
});
Object.defineProperty(button_props, "formaction", {
get: () => `?/remote=${__.id}`,
enumerable: true
});
Object.defineProperty(instance, "fields", {
get() {
const data = get_cache(__)?.[""];
const issues = flatten_issues(data?.issues ?? []);
return create_field_proxy(
{},
() => data?.input ?? {},
(path, value) => {
if (data?.submission) {
return;
}
const input = path.length === 0 ? value : deep_set(data?.input ?? {}, path.map(String), value);
(get_cache(__)[""] ??= {}).input = input;
},
() => issues
);
}
});
Object.defineProperty(instance, "result", {
get() {
try {
return get_cache(__)?.[""]?.result;
} catch {
return void 0;
}
}
});
Object.defineProperty(instance, "pending", {
get: () => 0
});
Object.defineProperty(button_props, "pending", {
get: () => 0
});
Object.defineProperty(instance, "preflight", {
// preflight is a noop on the server
value: () => instance
});
Object.defineProperty(instance, "validate", {
value: () => {
throw new Error("Cannot call validate() on the server");
}
});
if (key == void 0) {
Object.defineProperty(instance, "for", {
/** @type {RemoteForm<any, any>['for']} */
value: (key2) => {
const { state } = get_request_store();
const cache_key = __.id + "|" + JSON.stringify(key2);
let instance2 = (state.form_instances ??= /* @__PURE__ */ new Map()).get(cache_key);
if (!instance2) {
instance2 = create_instance(key2);
instance2.__.id = `${__.id}/${encodeURIComponent(JSON.stringify(key2))}`;
instance2.__.name = __.name;
state.form_instances.set(cache_key, instance2);
}
return instance2;
}
});
}
return instance;
}
return create_instance();
}
function handle_issues(output, issues, form_data) {
output.issues = issues.map((issue) => normalize_issue(issue, true));
if (form_data) {
output.input = {};
for (let key of form_data.keys()) {
if (/^[.\]]?_/.test(key)) continue;
const is_array = key.endsWith("[]");
const values = form_data.getAll(key).filter((value) => typeof value === "string");
if (is_array) key = key.slice(0, -2);
set_nested_value(
/** @type {Record<string, any>} */
output.input,
key,
is_array ? values : values[0]
);
}
}
}
function create_issues() {
return (
/** @type {InvalidField<any>} */
new Proxy(
/** @param {string} message */
(message) => {
if (typeof message !== "string") {
throw new Error(
"`invalid` should now be imported from `@sveltejs/kit` to throw validation issues. The second parameter provided to the form function (renamed to `issue`) is still used to construct issues, e.g. `invalid(issue.field('message'))`. For more info see https://github.com/sveltejs/kit/pulls/14768"
);
}
return create_issue(message);
},
{
get(target, prop) {
if (typeof prop === "symbol") return (
/** @type {any} */
target[prop]
);
return create_issue_proxy(prop, []);
}
}
)
);
function create_issue(message, path = []) {
return {
message,
path
};
}
function create_issue_proxy(key, path) {
const new_path = [...path, key];
const issue_func = (message) => create_issue(message, new_path);
return new Proxy(issue_func, {
get(target, prop) {
if (typeof prop === "symbol") return (
/** @type {any} */
target[prop]
);
if (/^\d+$/.test(prop)) {
return create_issue_proxy(parseInt(prop, 10), new_path);
}
return create_issue_proxy(prop, new_path);
}
});
}
}
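// Illustrative: the `issue` proxy handed to form handlers builds path-scoped
// validation issues; numeric property accesses become array indices:
//
//   issue("top-level problem")       // -> { message: "top-level problem", path: [] }
//   issue.users[0].email("Invalid")  // -> { message: "Invalid", path: ["users", 0, "email"] }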
// @__NO_SIDE_EFFECTS__
function prerender(validate_or_fn, fn_or_options, maybe_options) {
const maybe_fn = typeof fn_or_options === "function" ? fn_or_options : void 0;
const options = maybe_options ?? (maybe_fn ? void 0 : fn_or_options);
const fn = maybe_fn ?? validate_or_fn;
const validate = create_validator(validate_or_fn, maybe_fn);
const __ = {
type: "prerender",
id: "",
name: "",
has_arg: !!maybe_fn,
inputs: options?.inputs,
dynamic: options?.dynamic
};
const wrapper = (arg) => {
const promise = (async () => {
const { event, state } = get_request_store();
const payload = stringify_remote_arg(arg, state.transport);
const id = __.id;
const url = `${base}/${app_dir}/remote/${id}${payload ? `/${payload}` : ""}`;
if (!state.prerendering && !BROWSER && !event.isRemoteRequest) {
try {
return await get_response(__, arg, state, async () => {
const key = stringify_remote_arg(arg, state.transport);
const cache = get_cache(__, state);
const promise3 = cache[key] ??= fetch(new URL(url, event.url.origin).href).then(
async (response) => {
if (!response.ok) {
throw new Error("Prerendered response not found");
}
const prerendered = await response.json();
if (prerendered.type === "error") {
error(prerendered.status, prerendered.error);
}
return prerendered.result;
}
);
return parse_remote_response(await promise3, state.transport);
});
} catch {
}
}
if (state.prerendering?.remote_responses.has(url)) {
return (
/** @type {Promise<any>} */
state.prerendering.remote_responses.get(url)
);
}
const promise2 = get_response(
__,
arg,
state,
() => run_remote_function(event, state, false, arg, validate, fn)
);
if (state.prerendering) {
state.prerendering.remote_responses.set(url, promise2);
}
const result = await promise2;
if (state.prerendering) {
const body = { type: "result", result: stringify(result, state.transport) };
state.prerendering.dependencies.set(url, {
body: JSON.stringify(body),
response: json(body)
});
}
return result;
})();
promise.catch(() => {
});
return (
/** @type {RemoteResource<Output>} */
promise
);
};
Object.defineProperty(wrapper, "__", { value: __ });
return wrapper;
}
// @__NO_SIDE_EFFECTS__
function query(validate_or_fn, maybe_fn) {
const fn = maybe_fn ?? validate_or_fn;
const validate = create_validator(validate_or_fn, maybe_fn);
const __ = { type: "query", id: "", name: "" };
const wrapper = (arg) => {
if (prerendering) {
throw new Error(
`Cannot call query '${__.name}' while prerendering, as prerendered pages need static data. Use 'prerender' from $app/server instead`
);
}
const { event, state } = get_request_store();
const get_remote_function_result = () => run_remote_function(event, state, false, arg, validate, fn);
const promise = get_response(__, arg, state, get_remote_function_result);
promise.catch(() => {
});
promise.set = (value) => update_refresh_value(get_refresh_context(__, "set", arg), value);
promise.refresh = () => {
const refresh_context = get_refresh_context(__, "refresh", arg);
const is_immediate_refresh = !refresh_context.cache[refresh_context.cache_key];
const value = is_immediate_refresh ? promise : get_remote_function_result();
return update_refresh_value(refresh_context, value, is_immediate_refresh);
};
promise.withOverride = () => {
throw new Error(`Cannot call '${__.name}.withOverride()' on the server`);
};
return (
/** @type {RemoteQuery<Output>} */
promise
);
};
Object.defineProperty(wrapper, "__", { value: __ });
return wrapper;
}
// @__NO_SIDE_EFFECTS__
function batch(validate_or_fn, maybe_fn) {
const fn = maybe_fn ?? validate_or_fn;
const validate = create_validator(validate_or_fn, maybe_fn);
const __ = {
type: "query_batch",
id: "",
name: "",
run: (args) => {
const { event, state } = get_request_store();
return run_remote_function(
event,
state,
false,
args,
(array) => Promise.all(array.map(validate)),
fn
);
}
};
let batching = { args: [], resolvers: [] };
const wrapper = (arg) => {
if (prerendering) {
throw new Error(
`Cannot call query.batch '${__.name}' while prerendering, as prerendered pages need static data. Use 'prerender' from $app/server instead`
);
}
const { event, state } = get_request_store();
const get_remote_function_result = () => {
return new Promise((resolve, reject) => {
batching.args.push(arg);
batching.resolvers.push({ resolve, reject });
if (batching.args.length > 1) return;
setTimeout(async () => {
const batched = batching;
batching = { args: [], resolvers: [] };
try {
const get_result = await run_remote_function(
event,
state,
false,
batched.args,
(array) => Promise.all(array.map(validate)),
fn
);
for (let i = 0; i < batched.resolvers.length; i++) {
try {
batched.resolvers[i].resolve(get_result(batched.args[i], i));
} catch (error2) {
batched.resolvers[i].reject(error2);
}
}
} catch (error2) {
for (const resolver of batched.resolvers) {
resolver.reject(error2);
}
}
}, 0);
});
};
const promise = get_response(__, arg, state, get_remote_function_result);
promise.catch(() => {
});
promise.set = (value) => update_refresh_value(get_refresh_context(__, "set", arg), value);
promise.refresh = () => {
const refresh_context = get_refresh_context(__, "refresh", arg);
const is_immediate_refresh = !refresh_context.cache[refresh_context.cache_key];
const value = is_immediate_refresh ? promise : get_remote_function_result();
return update_refresh_value(refresh_context, value, is_immediate_refresh);
};
promise.withOverride = () => {
throw new Error(`Cannot call '${__.name}.withOverride()' on the server`);
};
return (
/** @type {RemoteQuery<Output>} */
promise
);
};
Object.defineProperty(wrapper, "__", { value: __ });
return wrapper;
}
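// Illustrative: query.batch collects every call made in the same macrotask,
// runs the handler once with all arguments, and expects it to return a
// per-argument lookup (db.loadUsers is a hypothetical loader):
//
//   const getUser = query.batch(v.number(), async (ids) => {
//     const users = await db.loadUsers(ids);            // one round trip
//     return (id) => users.find((u) => u.id === id);    // called once per original arg
//   });
//   // getUser(1) and getUser(2) in the same tick -> a single db.loadUsers([1, 2])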
Object.defineProperty(query, "batch", { value: batch, enumerable: true });
function get_refresh_context(__, action, arg) {
const { state } = get_request_store();
const { refreshes } = state;
if (!refreshes) {
const name = __.type === "query_batch" ? `query.batch '${__.name}'` : `query '${__.name}'`;
throw new Error(
`Cannot call ${action} on ${name} because it is not executed in the context of a command/form remote function`
);
}
const cache = get_cache(__, state);
const cache_key = stringify_remote_arg(arg, state.transport);
const refreshes_key = create_remote_key(__.id, cache_key);
return { __, state, refreshes, refreshes_key, cache, cache_key };
}
function update_refresh_value({ __, refreshes, refreshes_key, cache, cache_key }, value, is_immediate_refresh = false) {
const promise = Promise.resolve(value);
if (!is_immediate_refresh) {
cache[cache_key] = promise;
}
if (__.id) {
refreshes[refreshes_key] = promise;
}
return promise.then(() => {
});
}
export {
command,
form,
prerender,
query
};

View File

@@ -1,52 +0,0 @@
{
"compilerOptions": {
"paths": {
"$lib": [
"../src/lib"
],
"$lib/*": [
"../src/lib/*"
],
"$app/types": [
"./types/index.d.ts"
]
},
"rootDirs": [
"..",
"./types"
],
"verbatimModuleSyntax": true,
"isolatedModules": true,
"lib": [
"esnext",
"DOM",
"DOM.Iterable"
],
"moduleResolution": "bundler",
"module": "esnext",
"noEmit": true,
"target": "esnext"
},
"include": [
"ambient.d.ts",
"non-ambient.d.ts",
"./types/**/$types.d.ts",
"../vite.config.js",
"../vite.config.ts",
"../src/**/*.js",
"../src/**/*.ts",
"../src/**/*.svelte",
"../tests/**/*.js",
"../tests/**/*.ts",
"../tests/**/*.svelte"
],
"exclude": [
"../node_modules/**",
"../src/service-worker.js",
"../src/service-worker/**/*.js",
"../src/service-worker.ts",
"../src/service-worker/**/*.ts",
"../src/service-worker.d.ts",
"../src/service-worker/**/*.d.ts"
]
}

View File

@@ -0,0 +1,205 @@
<!-- [DEF:DashboardGrid:Component] -->
<!--
@SEMANTICS: dashboard, grid, selection, pagination
@PURPOSE: Displays a grid of dashboards with selection and pagination.
@LAYER: Component
@RELATION: USED_BY -> frontend/src/routes/migration/+page.svelte
@INVARIANT: Selected IDs must be a subset of available dashboards.
-->
<script lang="ts">
// [SECTION: IMPORTS]
import { createEventDispatcher } from 'svelte';
import type { DashboardMetadata } from '../types/dashboard';
// [/SECTION]
// [SECTION: PROPS]
export let dashboards: DashboardMetadata[] = [];
export let selectedIds: number[] = [];
// [/SECTION]
// [SECTION: STATE]
let filterText = "";
let currentPage = 0;
let pageSize = 20;
let sortColumn: keyof DashboardMetadata = "title";
let sortDirection: "asc" | "desc" = "asc";
// [/SECTION]
// [SECTION: DERIVED]
$: filteredDashboards = dashboards.filter(d =>
d.title.toLowerCase().includes(filterText.toLowerCase())
);
$: sortedDashboards = [...filteredDashboards].sort((a, b) => {
let aVal = a[sortColumn];
let bVal = b[sortColumn];
if (sortColumn === "id") {
aVal = Number(aVal);
bVal = Number(bVal);
}
if (aVal < bVal) return sortDirection === "asc" ? -1 : 1;
if (aVal > bVal) return sortDirection === "asc" ? 1 : -1;
return 0;
});
$: paginatedDashboards = sortedDashboards.slice(
currentPage * pageSize,
(currentPage + 1) * pageSize
);
$: totalPages = Math.ceil(sortedDashboards.length / pageSize);
// Clamp the page index when filtering shrinks the result set past the current page.
$: if (currentPage >= totalPages) currentPage = Math.max(0, totalPages - 1);
$: allSelected = paginatedDashboards.length > 0 && paginatedDashboards.every(d => selectedIds.includes(d.id));
$: someSelected = paginatedDashboards.some(d => selectedIds.includes(d.id));
// [/SECTION]
// [SECTION: EVENTS]
const dispatch = createEventDispatcher<{ selectionChanged: number[] }>();
// [/SECTION]
// [DEF:handleSort:Function]
// @PURPOSE: Toggles sort direction or changes sort column.
function handleSort(column: keyof DashboardMetadata) {
if (sortColumn === column) {
sortDirection = sortDirection === "asc" ? "desc" : "asc";
} else {
sortColumn = column;
sortDirection = "asc";
}
}
// [/DEF:handleSort]
// [DEF:handleSelectionChange:Function]
// @PURPOSE: Handles individual checkbox changes.
function handleSelectionChange(id: number, checked: boolean) {
let newSelected = [...selectedIds];
if (checked) {
if (!newSelected.includes(id)) newSelected.push(id);
} else {
newSelected = newSelected.filter(sid => sid !== id);
}
selectedIds = newSelected;
dispatch('selectionChanged', newSelected);
}
// [/DEF:handleSelectionChange]
// [DEF:handleSelectAll:Function]
// @PURPOSE: Handles select all checkbox.
function handleSelectAll(checked: boolean) {
let newSelected = [...selectedIds];
if (checked) {
paginatedDashboards.forEach(d => {
if (!newSelected.includes(d.id)) newSelected.push(d.id);
});
} else {
paginatedDashboards.forEach(d => {
newSelected = newSelected.filter(sid => sid !== d.id);
});
}
selectedIds = newSelected;
dispatch('selectionChanged', newSelected);
}
// [/DEF:handleSelectAll]
// [DEF:goToPage:Function]
// @PURPOSE: Changes current page.
function goToPage(page: number) {
if (page >= 0 && page < totalPages) {
currentPage = page;
}
}
// [/DEF:goToPage]
</script>
<!-- [SECTION: TEMPLATE] -->
<div class="dashboard-grid">
<!-- Filter Input -->
<div class="mb-4">
<input
type="text"
bind:value={filterText}
placeholder="Search dashboards..."
class="w-full px-3 py-2 border border-gray-300 rounded-md focus:outline-none focus:ring-2 focus:ring-blue-500"
/>
</div>
<!-- Grid/Table -->
<div class="overflow-x-auto">
<table class="min-w-full bg-white border border-gray-300">
<thead class="bg-gray-50">
<tr>
<th class="px-4 py-2 border-b">
<input
type="checkbox"
checked={allSelected}
indeterminate={someSelected && !allSelected}
on:change={(e) => handleSelectAll((e.target as HTMLInputElement).checked)}
/>
</th>
<th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('title')}>
Title {sortColumn === 'title' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
</th>
<th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('last_modified')}>
Last Modified {sortColumn === 'last_modified' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
</th>
<th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('status')}>
Status {sortColumn === 'status' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
</th>
</tr>
</thead>
<tbody>
{#each paginatedDashboards as dashboard (dashboard.id)}
<tr class="hover:bg-gray-50">
<td class="px-4 py-2 border-b">
<input
type="checkbox"
checked={selectedIds.includes(dashboard.id)}
on:change={(e) => handleSelectionChange(dashboard.id, (e.target as HTMLInputElement).checked)}
/>
</td>
<td class="px-4 py-2 border-b">{dashboard.title}</td>
<td class="px-4 py-2 border-b">{new Date(dashboard.last_modified).toLocaleDateString()}</td>
<td class="px-4 py-2 border-b">
<span class="px-2 py-1 text-xs font-medium rounded-full {dashboard.status === 'published' ? 'bg-green-100 text-green-800' : 'bg-gray-100 text-gray-800'}">
{dashboard.status}
</span>
</td>
</tr>
{/each}
</tbody>
</table>
</div>
<!-- Pagination Controls -->
<div class="flex items-center justify-between mt-4">
<div class="text-sm text-gray-700">
Showing {sortedDashboards.length === 0 ? 0 : currentPage * pageSize + 1} to {Math.min((currentPage + 1) * pageSize, sortedDashboards.length)} of {sortedDashboards.length} dashboards
</div>
<div class="flex space-x-2">
<button
class="px-3 py-1 text-sm border border-gray-300 rounded-md hover:bg-gray-50 disabled:opacity-50 disabled:cursor-not-allowed"
disabled={currentPage === 0}
on:click={() => goToPage(currentPage - 1)}
>
Previous
</button>
<button
class="px-3 py-1 text-sm border border-gray-300 rounded-md hover:bg-gray-50 disabled:opacity-50 disabled:cursor-not-allowed"
disabled={currentPage >= totalPages - 1}
on:click={() => goToPage(currentPage + 1)}
>
Next
</button>
</div>
</div>
</div>
<!-- [/SECTION] -->
<style>
/* Component styles */
</style>
<!-- [/DEF:DashboardGrid] -->
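<!-- [USAGE:DashboardGrid] Illustrative wiring sketch. The page path comes from
@RELATION above; the import paths and loading code are assumed:

  <script lang="ts">
    import DashboardGrid from '$lib/components/DashboardGrid.svelte';
    import type { DashboardMetadata } from '$lib/types/dashboard';

    let dashboards: DashboardMetadata[] = [];   // loaded from the Superset API
    let selectedIds: number[] = [];
  </script>

  <DashboardGrid
    {dashboards}
    {selectedIds}
    on:selectionChanged={(e) => (selectedIds = e.detail)}
  />

The selectionChanged payload is the complete array of selected IDs, so the
parent can overwrite its copy rather than diff individual toggles.
-->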

View File

@@ -36,8 +36,9 @@
 <!-- [SECTION: TEMPLATE] -->
 <div class="flex flex-col space-y-1">
-  <label class="text-sm font-medium text-gray-700">{label}</label>
+  <label for="env-select" class="text-sm font-medium text-gray-700">{label}</label>
   <select
+    id="env-select"
     class="block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
     value={selectedId}
     on:change={handleSelect}

View File

@@ -16,6 +16,12 @@
 >
   Dashboard
 </a>
+<a
+  href="/migration"
+  class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/migration') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
+>
+  Migration
+</a>
 <a
   href="/settings"
   class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"

View File

@@ -0,0 +1,123 @@
<!-- [DEF:PasswordPrompt:Component] -->
<!--
@SEMANTICS: password, prompt, modal, input, security
@PURPOSE: A modal component to prompt the user for database passwords when a migration task is paused.
@LAYER: UI
@RELATION: USES -> frontend/src/lib/api.js (inferred)
@RELATION: EMITS -> resume, cancel
-->
<script>
import { createEventDispatcher } from 'svelte';
export let show = false;
export let databases = []; // List of database names requiring passwords
export let errorMessage = "";
const dispatch = createEventDispatcher();
let passwords = {};
let submitting = false;
function handleSubmit() {
if (submitting) return;
// Ensure a password was entered for every database
const missing = databases.filter(db => !passwords[db]);
if (missing.length > 0) {
alert(`Please enter passwords for: ${missing.join(', ')}`);
return;
}
submitting = true;
dispatch('resume', { passwords });
// Resetting the submitting state is handled by the parent or when the modal closes
}
function handleCancel() {
dispatch('cancel');
show = false;
}
// Reset passwords and the submitting flag when the modal closes
$: if (!show) {
passwords = {};
submitting = false;
}
</script>
{#if show}
<div class="fixed inset-0 z-50 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
<div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
<!-- Background overlay -->
<div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true" on:click={handleCancel}></div>
<span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">&#8203;</span>
<div class="inline-block align-bottom bg-white rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-lg sm:w-full">
<div class="bg-white px-4 pt-5 pb-4 sm:p-6 sm:pb-4">
<div class="sm:flex sm:items-start">
<div class="mx-auto flex-shrink-0 flex items-center justify-center h-12 w-12 rounded-full bg-red-100 sm:mx-0 sm:h-10 sm:w-10">
<!-- Heroicon name: outline/lock-closed -->
<svg class="h-6 w-6 text-red-600" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke="currentColor" aria-hidden="true">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
</div>
<div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
<h3 class="text-lg leading-6 font-medium text-gray-900" id="modal-title">
Database Password Required
</h3>
<div class="mt-2">
<p class="text-sm text-gray-500 mb-4">
The migration process requires passwords for the following databases to proceed.
</p>
{#if errorMessage}
<div class="mb-4 p-2 bg-red-50 text-red-700 text-xs rounded border border-red-200">
Error: {errorMessage}
</div>
{/if}
<form on:submit|preventDefault={handleSubmit} class="space-y-4">
{#each databases as dbName}
<div>
<label for="password-{dbName}" class="block text-sm font-medium text-gray-700">
Password for {dbName}
</label>
<input
type="password"
id="password-{dbName}"
bind:value={passwords[dbName]}
class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm p-2 border"
placeholder="Enter password"
required
/>
</div>
{/each}
</form>
</div>
</div>
</div>
</div>
<div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
<button
type="button"
class="w-full inline-flex justify-center rounded-md border border-transparent shadow-sm px-4 py-2 bg-indigo-600 text-base font-medium text-white hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:ml-3 sm:w-auto sm:text-sm disabled:opacity-50"
on:click={handleSubmit}
disabled={submitting}
>
{submitting ? 'Resuming...' : 'Resume Migration'}
</button>
<button
type="button"
class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
on:click={handleCancel}
disabled={submitting}
>
Cancel
</button>
</div>
</div>
</div>
</div>
{/if}
<!-- [/DEF:PasswordPrompt] -->
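<!-- [USAGE:PasswordPrompt] Illustrative wiring sketch. `resumeTask`/`cancelTask`
are assumed helpers around the backend tasks API, and `input_required` is
assumed to carry the database names the task is waiting on:

  <PasswordPrompt
    show={task?.status === 'AWAITING_INPUT'}
    databases={task?.input_required ?? []}
    errorMessage={resumeError}
    on:resume={(e) => resumeTask(task.id, e.detail.passwords)}
    on:cancel={() => cancelTask(task.id)}
  />

On resume the component stays in its "Resuming..." state until the parent
closes it (show = false), which also clears the entered passwords.
-->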

View File

@@ -0,0 +1,179 @@
<!-- [DEF:TaskHistory:Component] -->
<!--
@SEMANTICS: task, history, list, status, monitoring
@PURPOSE: Displays a list of recent tasks with their status and allows selecting them for viewing logs.
@LAYER: UI
@RELATION: USES -> frontend/src/lib/stores.js
@RELATION: USES -> frontend/src/lib/api.js (inferred)
-->
<script>
import { onMount, onDestroy } from 'svelte';
import { selectedTask } from '../lib/stores.js';
let tasks = [];
let loading = true;
let error = "";
let interval;
async function fetchTasks() {
try {
const res = await fetch('/api/tasks?limit=10');
if (!res.ok) throw new Error('Failed to fetch tasks');
tasks = await res.json();
// [DEBUG] Check for tasks requiring attention
tasks.forEach(t => {
if (t.status === 'AWAITING_MAPPING' || t.status === 'AWAITING_INPUT') {
console.log(`[TaskHistory] Task ${t.id} is in state ${t.status}. Input required: ${t.input_required}`);
}
});
// Update selected task if it exists in the list (for status updates)
if ($selectedTask) {
const updatedTask = tasks.find(t => t.id === $selectedTask.id);
if (updatedTask && updatedTask.status !== $selectedTask.status) {
selectedTask.set(updatedTask);
}
}
} catch (e) {
error = e.message;
} finally {
loading = false;
}
}
async function clearTasks(status = null) {
if (!confirm('Are you sure you want to clear tasks?')) return;
try {
let url = '/api/tasks';
const params = new URLSearchParams();
if (status) params.append('status', status);
const res = await fetch(`${url}?${params.toString()}`, { method: 'DELETE' });
if (!res.ok) throw new Error('Failed to clear tasks');
await fetchTasks();
} catch (e) {
error = e.message;
}
}
async function selectTask(task) {
try {
// Fetch the full task details (including logs) before setting it as selected
const res = await fetch(`/api/tasks/${task.id}`);
if (res.ok) {
const fullTask = await res.json();
selectedTask.set(fullTask);
} else {
// Fallback to the list version if fetch fails
selectedTask.set(task);
}
} catch (e) {
console.error("Failed to fetch full task details:", e);
selectedTask.set(task);
}
}
function getStatusColor(status) {
switch (status) {
case 'SUCCESS': return 'bg-green-100 text-green-800';
case 'FAILED': return 'bg-red-100 text-red-800';
case 'RUNNING': return 'bg-blue-100 text-blue-800';
case 'AWAITING_INPUT': return 'bg-orange-100 text-orange-800';
case 'AWAITING_MAPPING': return 'bg-yellow-100 text-yellow-800';
default: return 'bg-gray-100 text-gray-800';
}
}
onMount(() => {
fetchTasks();
interval = setInterval(fetchTasks, 5000); // Poll every 5s
});
onDestroy(() => {
clearInterval(interval);
});
</script>
<div class="bg-white shadow overflow-hidden sm:rounded-lg mb-8">
<div class="px-4 py-5 sm:px-6 flex justify-between items-center">
<h3 class="text-lg leading-6 font-medium text-gray-900">
Recent Tasks
</h3>
<div class="flex space-x-4 items-center">
<div class="relative inline-block text-left group">
<button class="text-sm text-red-600 hover:text-red-900 focus:outline-none flex items-center py-2">
Clear Tasks
<svg class="ml-1 h-4 w-4" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7"></path></svg>
</button>
<!-- Added a transparent bridge to prevent menu closing when moving cursor -->
<div class="absolute h-2 w-full top-full left-0"></div>
<div class="origin-top-right absolute right-0 mt-2 w-48 rounded-md shadow-lg bg-white ring-1 ring-black ring-opacity-5 focus:outline-none hidden group-hover:block z-50">
<div class="py-1">
<button on:click={() => clearTasks()} class="block w-full text-left px-4 py-2 text-sm text-gray-700 hover:bg-gray-100">Clear All Non-Running</button>
<button on:click={() => clearTasks('FAILED')} class="block w-full text-left px-4 py-2 text-sm text-gray-700 hover:bg-gray-100">Clear Failed</button>
<button on:click={() => clearTasks('AWAITING_INPUT')} class="block w-full text-left px-4 py-2 text-sm text-gray-700 hover:bg-gray-100">Clear Awaiting Input</button>
</div>
</div>
</div>
<button
on:click={fetchTasks}
class="text-sm text-indigo-600 hover:text-indigo-900 focus:outline-none"
>
Refresh
</button>
</div>
</div>
{#if loading && tasks.length === 0}
<div class="p-4 text-center text-gray-500">Loading tasks...</div>
{:else if error}
<div class="p-4 text-center text-red-500">{error}</div>
{:else if tasks.length === 0}
<div class="p-4 text-center text-gray-500">No recent tasks found.</div>
{:else}
<ul class="divide-y divide-gray-200">
{#each tasks as task}
<li>
<button
class="w-full text-left block hover:bg-gray-50 focus:outline-none focus:bg-gray-50 transition duration-150 ease-in-out"
class:bg-indigo-50={$selectedTask && $selectedTask.id === task.id}
on:click={() => selectTask(task)}
>
<div class="px-4 py-4 sm:px-6">
<div class="flex items-center justify-between">
<p class="text-sm font-medium text-indigo-600 truncate">
{task.plugin_id}
<span class="text-gray-500 text-xs ml-2">({task.id.slice(0, 8)})</span>
</p>
<div class="ml-2 flex-shrink-0 flex">
<p class="px-2 inline-flex text-xs leading-5 font-semibold rounded-full {getStatusColor(task.status)}">
{task.status}
</p>
</div>
</div>
<div class="mt-2 sm:flex sm:justify-between">
<div class="sm:flex">
<p class="flex items-center text-sm text-gray-500">
{#if task.params.from_env && task.params.to_env}
{task.params.from_env} &rarr; {task.params.to_env}
{:else}
Params: {Object.keys(task.params).length} keys
{/if}
</p>
</div>
<div class="mt-2 flex items-center text-sm text-gray-500 sm:mt-0">
<p>
Started: {new Date(task.started_at || task.created_at || Date.now()).toLocaleString()}
</p>
</div>
</div>
</div>
</button>
</li>
{/each}
</ul>
{/if}
</div>
<!-- [/DEF:TaskHistory] -->


@@ -0,0 +1,153 @@
<!-- [DEF:TaskLogViewer:Component] -->
<!--
@SEMANTICS: task, log, viewer, modal
@PURPOSE: Displays detailed logs for a specific task in a modal.
@LAYER: UI
@RELATION: USES -> frontend/src/services/taskService.js
-->
<script>
import { createEventDispatcher, onMount, onDestroy } from 'svelte';
import { getTaskLogs } from '../services/taskService.js';
export let show = false;
export let taskId = null;
export let taskStatus = null; // To know if we should poll
const dispatch = createEventDispatcher();
let logs = [];
let loading = false;
let error = "";
let interval;
let autoScroll = true;
let logContainer;
async function fetchLogs() {
if (!taskId) return;
try {
logs = await getTaskLogs(taskId);
if (autoScroll) {
scrollToBottom();
}
} catch (e) {
error = e.message;
} finally {
loading = false;
}
}
function scrollToBottom() {
if (logContainer) {
setTimeout(() => {
logContainer.scrollTop = logContainer.scrollHeight;
}, 0);
}
}
function handleScroll() {
if (!logContainer) return;
// If user scrolls up, disable auto-scroll
const { scrollTop, scrollHeight, clientHeight } = logContainer;
const atBottom = scrollHeight - scrollTop - clientHeight < 50;
autoScroll = atBottom;
}
function close() {
dispatch('close');
show = false;
}
function getLogLevelColor(level) {
switch (level) {
case 'INFO': return 'text-blue-600';
case 'WARNING': return 'text-yellow-600';
case 'ERROR': return 'text-red-600';
case 'DEBUG': return 'text-gray-500';
default: return 'text-gray-800';
}
}
// React to changes in show/taskId
$: if (show && taskId) {
logs = [];
loading = true;
error = "";
fetchLogs();
// Reset any previous poller first; this reactive block re-runs whenever
// show, taskId, or taskStatus change, so it must not leak intervals.
if (interval) clearInterval(interval);
// Poll while the task can still produce new log entries
if (taskStatus === 'RUNNING' || taskStatus === 'AWAITING_INPUT' || taskStatus === 'AWAITING_MAPPING') {
interval = setInterval(fetchLogs, 3000);
}
} else {
if (interval) clearInterval(interval);
}
onDestroy(() => {
if (interval) clearInterval(interval);
});
</script>
{#if show}
<div class="fixed inset-0 z-50 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
<div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
<!-- Background overlay -->
<div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true" on:click={close}></div>
<span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">&#8203;</span>
<div class="inline-block align-bottom bg-white rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-4xl sm:w-full">
<div class="bg-white px-4 pt-5 pb-4 sm:p-6 sm:pb-4">
<div class="sm:flex sm:items-start">
<div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
<h3 class="text-lg leading-6 font-medium text-gray-900 flex justify-between items-center" id="modal-title">
<span>Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
<button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
</h3>
<div class="mt-4 border rounded-md bg-gray-50 p-4 h-96 overflow-y-auto font-mono text-sm"
bind:this={logContainer}
on:scroll={handleScroll}>
{#if loading && logs.length === 0}
<p class="text-gray-500 text-center">Loading logs...</p>
{:else if error}
<p class="text-red-500 text-center">{error}</p>
{:else if logs.length === 0}
<p class="text-gray-500 text-center">No logs available.</p>
{:else}
{#each logs as log}
<div class="mb-1 hover:bg-gray-100 p-1 rounded">
<span class="text-gray-400 text-xs mr-2">
{new Date(log.timestamp).toLocaleTimeString()}
</span>
<span class="font-bold text-xs mr-2 w-16 inline-block {getLogLevelColor(log.level)}">
[{log.level}]
</span>
<span class="text-gray-800 break-words">
{log.message}
</span>
{#if log.context}
<div class="ml-24 text-xs text-gray-500 mt-1 bg-gray-100 p-1 rounded overflow-x-auto">
<pre>{JSON.stringify(log.context, null, 2)}</pre>
</div>
{/if}
</div>
{/each}
{/if}
</div>
</div>
</div>
</div>
<div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
<button
type="button"
class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
on:click={close}
>
Close
</button>
</div>
</div>
</div>
</div>
{/if}
<!-- [/DEF:TaskLogViewer] -->


@@ -16,6 +16,7 @@
import { getWsUrl } from '../lib/api.js';
import { addToast } from '../lib/toasts.js';
import MissingMappingModal from './MissingMappingModal.svelte';
import PasswordPrompt from './PasswordPrompt.svelte';
// [/SECTION]
let ws;
@@ -26,10 +27,13 @@
let reconnectTimeout;
let waitingForData = false;
let dataTimeout;
-let connectionStatus = 'disconnected'; // 'connecting', 'connected', 'disconnected', 'waiting', 'completed', 'awaiting_mapping'
let connectionStatus = 'disconnected'; // 'connecting', 'connected', 'disconnected', 'waiting', 'completed', 'awaiting_mapping', 'awaiting_input'
let showMappingModal = false;
let missingDbInfo = { name: '', uuid: '' };
let targetDatabases = [];
let showPasswordPrompt = false;
let passwordPromptData = { databases: [], errorMessage: '' };
// [DEF:connect:Function]
/**
@@ -73,8 +77,33 @@
showMappingModal = true;
}
}
// Check for password request via log context or message
// Note: The backend logs "Task paused for user input" with context
if (logEntry.message && logEntry.message.includes('Task paused for user input') && logEntry.context && logEntry.context.input_request) {
const request = logEntry.context.input_request;
if (request.type === 'database_password') {
connectionStatus = 'awaiting_input';
passwordPromptData = {
databases: request.databases || [],
errorMessage: request.error_message || ''
};
showPasswordPrompt = true;
}
}
};
// Check if task is already awaiting input (e.g. when re-selecting task)
// We use the 'task' variable from the outer scope (connect function)
if (task && task.status === 'AWAITING_INPUT' && task.input_request && task.input_request.type === 'database_password') {
connectionStatus = 'awaiting_input';
passwordPromptData = {
databases: task.input_request.databases || [],
errorMessage: task.input_request.error_message || ''
};
showPasswordPrompt = true;
}
ws.onerror = (error) => {
console.error('[TaskRunner][Coherence:Failed] WebSocket error:', error);
connectionStatus = 'disconnected';
@@ -158,6 +187,25 @@
addToast('Failed to resolve mapping: ' + e.message, 'error');
}
}
async function handlePasswordResume(event) {
const task = get(selectedTask);
const { passwords } = event.detail;
try {
await fetch(`/api/tasks/${task.id}/resume`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ passwords })
});
showPasswordPrompt = false;
connectionStatus = 'connected';
addToast('Passwords submitted, resuming migration...', 'success');
} catch (e) {
addToast('Failed to resume task: ' + e.message, 'error');
}
}
function startDataTimeout() {
waitingForData = false;
@@ -184,7 +232,15 @@
clearTimeout(reconnectTimeout);
reconnectAttempts = 0;
connectionStatus = 'disconnected';
-taskLogs.set([]);
// Initialize logs from the task object if available
if (task.logs && Array.isArray(task.logs)) {
console.log(`[TaskRunner] Loaded ${task.logs.length} existing logs.`);
taskLogs.set(task.logs);
} else {
taskLogs.set([]);
}
connect();
}
});
@@ -228,6 +284,9 @@
{:else if connectionStatus === 'awaiting_mapping'}
<span class="h-3 w-3 rounded-full bg-orange-500 animate-pulse"></span>
<span class="text-xs text-gray-500">Awaiting Mapping</span>
{:else if connectionStatus === 'awaiting_input'}
<span class="h-3 w-3 rounded-full bg-orange-500 animate-pulse"></span>
<span class="text-xs text-gray-500">Awaiting Input</span>
{:else}
<span class="h-3 w-3 rounded-full bg-red-500"></span>
<span class="text-xs text-gray-500">Disconnected</span>
@@ -235,18 +294,46 @@
</div>
</div>
-<div class="bg-gray-900 text-white font-mono text-sm p-4 rounded-md h-96 overflow-y-auto relative">
<!-- Task Info Section -->
<div class="mb-4 bg-gray-50 p-3 rounded text-sm border border-gray-200">
<details open>
<summary class="cursor-pointer font-medium text-gray-700 focus:outline-none hover:text-indigo-600">Task Details & Parameters</summary>
<div class="mt-2 pl-2 border-l-2 border-indigo-200">
<div class="grid grid-cols-2 gap-2 mb-2">
<div><span class="font-semibold">ID:</span> <span class="text-gray-600">{$selectedTask.id}</span></div>
<div><span class="font-semibold">Status:</span> <span class="text-gray-600">{$selectedTask.status}</span></div>
<div><span class="font-semibold">Started:</span> <span class="text-gray-600">{new Date($selectedTask.started_at || $selectedTask.created_at || Date.now()).toLocaleString()}</span></div>
<div><span class="font-semibold">Plugin:</span> <span class="text-gray-600">{$selectedTask.plugin_id}</span></div>
</div>
<div class="mt-1">
<span class="font-semibold">Parameters:</span>
<pre class="text-xs bg-gray-100 p-2 rounded mt-1 overflow-x-auto border border-gray-200">{JSON.stringify($selectedTask.params, null, 2)}</pre>
</div>
</div>
</details>
</div>
<div class="bg-gray-900 text-white font-mono text-sm p-4 rounded-md h-96 overflow-y-auto relative shadow-inner">
{#if $taskLogs.length === 0}
<div class="text-gray-500 italic text-center mt-10">No logs available for this task.</div>
{/if}
{#each $taskLogs as log}
-<div>
<div class="hover:bg-gray-800 px-1 rounded">
-<span class="text-gray-400">{new Date(log.timestamp).toLocaleTimeString()}</span>
<span class="text-gray-500 select-none text-xs w-20 inline-block">{new Date(log.timestamp).toLocaleTimeString()}</span>
-<span class="{log.level === 'ERROR' ? 'text-red-500' : 'text-green-400'}">[{log.level}]</span>
<span class="{log.level === 'ERROR' ? 'text-red-500 font-bold' : log.level === 'WARNING' ? 'text-yellow-400' : 'text-green-400'} w-16 inline-block">[{log.level}]</span>
<span>{log.message}</span>
{#if log.context}
<details class="ml-24">
<summary class="text-xs text-gray-500 cursor-pointer hover:text-gray-300">Context</summary>
<pre class="text-xs text-gray-400 pl-2 border-l border-gray-700 mt-1">{JSON.stringify(log.context, null, 2)}</pre>
</details>
{/if}
</div>
{/each}
-{#if waitingForData}
{#if waitingForData && connectionStatus === 'connected'}
-<div class="text-gray-500 italic mt-2 animate-pulse">
<div class="text-gray-500 italic mt-2 animate-pulse border-t border-gray-800 pt-2">
-Waiting for data...
Waiting for new logs...
</div>
{/if}
</div>
@@ -263,6 +350,14 @@
on:resolve={handleMappingResolve}
on:cancel={() => { connectionStatus = 'disconnected'; ws.close(); }}
/>
<PasswordPrompt
bind:show={showPasswordPrompt}
databases={passwordPromptData.databases}
errorMessage={passwordPromptData.errorMessage}
on:resume={handlePasswordResume}
on:cancel={() => { showPasswordPrompt = false; }}
/>
<!-- [/SECTION] -->
<!-- [/DEF:TaskRunner] -->


@@ -21,7 +21,14 @@
environments: [],
settings: {
backup_path: '',
-default_environment_id: null
default_environment_id: null,
logging: {
level: 'INFO',
file_path: 'logs/app.log',
max_bytes: 10485760,
backup_count: 5,
enable_belief_state: true
}
}
};
@@ -180,10 +187,43 @@
<label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label> <label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label>
<input type="text" id="backup_path" bind:value={settings.settings.backup_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" /> <input type="text" id="backup_path" bind:value={settings.settings.backup_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div> </div>
<button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max">
Save Global Settings
</button>
</div> </div>
<h3 class="text-lg font-medium mb-4 mt-6">Logging Configuration</h3>
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label for="log_level" class="block text-sm font-medium text-gray-700">Log Level</label>
<select id="log_level" bind:value={settings.settings.logging.level} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2">
<option value="DEBUG">DEBUG</option>
<option value="INFO">INFO</option>
<option value="WARNING">WARNING</option>
<option value="ERROR">ERROR</option>
<option value="CRITICAL">CRITICAL</option>
</select>
</div>
<div>
<label for="log_file_path" class="block text-sm font-medium text-gray-700">Log File Path</label>
<input type="text" id="log_file_path" bind:value={settings.settings.logging.file_path} placeholder="logs/app.log" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
<div>
<label for="log_max_bytes" class="block text-sm font-medium text-gray-700">Max File Size (MB)</label>
<input type="number" id="log_max_bytes" bind:value={settings.settings.logging.max_bytes} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
<div>
<label for="log_backup_count" class="block text-sm font-medium text-gray-700">Backup Count</label>
<input type="number" id="log_backup_count" bind:value={settings.settings.logging.backup_count} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
<div class="md:col-span-2">
<label class="flex items-center">
<input type="checkbox" id="enable_belief_state" bind:checked={settings.settings.logging.enable_belief_state} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
<span class="ml-2 block text-sm text-gray-900">Enable Belief State Logging</span>
</label>
</div>
</div>
<button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max mt-4">
Save Global Settings
</button>
</section>
<section class="mb-8 bg-white p-6 rounded shadow">


@@ -12,16 +12,40 @@
// [SECTION: IMPORTS]
import { onMount } from 'svelte';
import EnvSelector from '../../components/EnvSelector.svelte';
import DashboardGrid from '../../components/DashboardGrid.svelte';
import MappingTable from '../../components/MappingTable.svelte';
import TaskRunner from '../../components/TaskRunner.svelte';
import TaskHistory from '../../components/TaskHistory.svelte';
import TaskLogViewer from '../../components/TaskLogViewer.svelte';
import PasswordPrompt from '../../components/PasswordPrompt.svelte';
import { selectedTask } from '../../lib/stores.js';
import { resumeTask } from '../../services/taskService.js';
import type { DashboardMetadata, DashboardSelection } from '../../types/dashboard';
// [/SECTION]
// [SECTION: STATE]
-let environments = [];
let environments: any[] = [];
let sourceEnvId = "";
let targetEnvId = "";
-let dashboardRegex = ".*";
let replaceDb = false;
let loading = true;
let error = "";
let dashboards: DashboardMetadata[] = [];
let selectedDashboardIds: number[] = [];
let sourceDatabases: any[] = [];
let targetDatabases: any[] = [];
let mappings: any[] = [];
let suggestions: any[] = [];
let fetchingDbs = false;
// UI State for Modals
let showLogViewer = false;
let logViewerTaskId: string | null = null;
let logViewerTaskStatus: string | null = null;
let showPasswordPrompt = false;
let passwordPromptDatabases: string[] = [];
let passwordPromptErrorMessage = "";
// [/SECTION]
// [DEF:fetchEnvironments:Function]
@@ -42,8 +66,144 @@
}
// [/DEF:fetchEnvironments]
// [DEF:fetchDashboards:Function]
/**
* @purpose Fetches dashboards for the selected source environment.
* @param envId The environment ID.
* @post dashboards state is updated.
*/
async function fetchDashboards(envId: string) {
try {
const response = await fetch(`/api/environments/${envId}/dashboards`);
if (!response.ok) throw new Error('Failed to fetch dashboards');
dashboards = await response.json();
selectedDashboardIds = []; // Reset selection when env changes
} catch (e) {
error = e.message;
dashboards = [];
}
}
// [/DEF:fetchDashboards]
onMount(fetchEnvironments);
// Reactive: fetch dashboards when source env changes
$: if (sourceEnvId) fetchDashboards(sourceEnvId);
// [DEF:fetchDatabases:Function]
/**
* @purpose Fetches databases from both environments and gets suggestions.
*/
async function fetchDatabases() {
if (!sourceEnvId || !targetEnvId) return;
fetchingDbs = true;
error = "";
try {
const [srcRes, tgtRes, mapRes, sugRes] = await Promise.all([
fetch(`/api/environments/${sourceEnvId}/databases`),
fetch(`/api/environments/${targetEnvId}/databases`),
fetch(`/api/mappings?source_env_id=${sourceEnvId}&target_env_id=${targetEnvId}`),
fetch(`/api/mappings/suggest`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ source_env_id: sourceEnvId, target_env_id: targetEnvId })
})
]);
if (!srcRes.ok || !tgtRes.ok) throw new Error('Failed to fetch databases from environments');
sourceDatabases = await srcRes.json();
targetDatabases = await tgtRes.json();
mappings = await mapRes.json();
suggestions = await sugRes.json();
} catch (e) {
error = e.message;
} finally {
fetchingDbs = false;
}
}
// [/DEF:fetchDatabases]
// [DEF:handleMappingUpdate:Function]
/**
* @purpose Saves a mapping to the backend.
*/
async function handleMappingUpdate(event: CustomEvent) {
const { sourceUuid, targetUuid } = event.detail;
const sDb = sourceDatabases.find(d => d.uuid === sourceUuid);
const tDb = targetDatabases.find(d => d.uuid === targetUuid);
if (!sDb || !tDb) return;
try {
const response = await fetch('/api/mappings', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
source_env_id: sourceEnvId,
target_env_id: targetEnvId,
source_db_uuid: sourceUuid,
target_db_uuid: targetUuid,
source_db_name: sDb.database_name,
target_db_name: tDb.database_name
})
});
if (!response.ok) throw new Error('Failed to save mapping');
const savedMapping = await response.json();
mappings = [...mappings.filter(m => m.source_db_uuid !== sourceUuid), savedMapping];
} catch (e) {
error = e.message;
}
}
// [/DEF:handleMappingUpdate]
// [DEF:handleViewLogs:Function]
function handleViewLogs(event: CustomEvent) {
const task = event.detail;
logViewerTaskId = task.id;
logViewerTaskStatus = task.status;
showLogViewer = true;
}
// [/DEF:handleViewLogs]
// [DEF:handlePasswordPrompt:Function]
// Watch the selected task and open the password prompt when the backend
// reports that input is required. (Longer term, TaskRunner or TaskHistory
// could emit a dedicated event instead of this reactive check.)
$: if ($selectedTask && $selectedTask.status === 'AWAITING_INPUT' && $selectedTask.input_request) {
const req = $selectedTask.input_request;
if (req.type === 'database_password') {
passwordPromptDatabases = req.databases || [];
passwordPromptErrorMessage = req.error_message || "";
showPasswordPrompt = true;
}
} else if (!$selectedTask || $selectedTask.status !== 'AWAITING_INPUT') {
// Intentionally do not auto-close the prompt when the task leaves
// AWAITING_INPUT; the user or the resume success handler closes it.
}
async function handleResumeMigration(event: CustomEvent) {
if (!$selectedTask) return;
const { passwords } = event.detail;
try {
await resumeTask($selectedTask.id, passwords);
showPasswordPrompt = false;
// Task status update will be handled by store/websocket
} catch (e) {
console.error("Failed to resume task:", e);
passwordPromptErrorMessage = e.message;
// Keep prompt open
}
}
// [DEF:startMigration:Function]
/**
* @purpose Starts the migration process.
@@ -58,10 +218,57 @@
error = "Source and target environments must be different."; error = "Source and target environments must be different.";
return; return;
} }
if (selectedDashboardIds.length === 0) {
error = "Please select at least one dashboard to migrate.";
return;
}
error = ""; error = "";
console.log(`[MigrationDashboard][Action] Starting migration from ${sourceEnvId} to ${targetEnvId} (Replace DB: ${replaceDb})`); try {
// TODO: Implement actual migration trigger in US3 const selection: DashboardSelection = {
selected_ids: selectedDashboardIds,
source_env_id: sourceEnvId,
target_env_id: targetEnvId,
replace_db_config: replaceDb
};
console.log(`[MigrationDashboard][Action] Starting migration with selection:`, selection);
const response = await fetch('/api/migration/execute', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(selection)
});
console.log(`[MigrationDashboard][Action] API response status: ${response.status}`);
if (!response.ok) throw new Error(`Failed to start migration: ${response.status} ${response.statusText}`);
const result = await response.json();
console.log(`[MigrationDashboard][Action] Migration started: ${result.task_id} - ${result.message}`);
// Give the backend a brief moment to make the new task retrievable
await new Promise(r => setTimeout(r, 500));
// Fetch full task details and switch to TaskRunner view
try {
const taskRes = await fetch(`/api/tasks/${result.task_id}`);
if (taskRes.ok) {
const task = await taskRes.json();
selectedTask.set(task);
} else {
// Fallback: create a temporary task object to switch view immediately
console.warn("Could not fetch task details immediately, using placeholder.");
selectedTask.set({
id: result.task_id,
plugin_id: 'superset-migration',
status: 'RUNNING',
logs: [],
params: {}
});
}
} catch (fetchErr) {
console.error("Failed to fetch new task details:", fetchErr);
}
} catch (e) {
console.error(`[MigrationDashboard][Failure] Migration failed:`, e);
error = e.message;
}
}
// [/DEF:startMigration]
</script>
@@ -69,60 +276,120 @@
<!-- [SECTION: TEMPLATE] -->
<div class="max-w-4xl mx-auto p-6">
<h1 class="text-2xl font-bold mb-6">Migration Dashboard</h1>
<TaskHistory on:viewLogs={handleViewLogs} />
-{#if loading}
-<p>Loading environments...</p>
-{:else if error}
-<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">
-{error}
-</div>
{#if $selectedTask}
<div class="mt-6">
<TaskRunner />
<button
on:click={() => selectedTask.set(null)}
class="mt-4 inline-flex items-center px-4 py-2 border border-gray-300 shadow-sm text-sm font-medium rounded-md text-gray-700 bg-white hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500"
>
Back to New Migration
</button>
</div>
{:else}
{#if loading}
<p>Loading environments...</p>
{:else if error}
<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">
{error}
</div>
{/if}
<div class="grid grid-cols-1 md:grid-cols-2 gap-6 mb-8">
<EnvSelector
label="Source Environment"
bind:selectedId={sourceEnvId}
{environments}
/>
<EnvSelector
label="Target Environment"
bind:selectedId={targetEnvId}
{environments}
/>
</div>
<!-- [DEF:DashboardSelectionSection] -->
<div class="mb-8">
<h2 class="text-lg font-medium mb-4">Select Dashboards</h2>
{#if sourceEnvId}
<DashboardGrid
{dashboards}
bind:selectedIds={selectedDashboardIds}
/>
{:else}
<p class="text-gray-500 italic">Select a source environment to view dashboards.</p>
{/if}
</div>
<!-- [/DEF:DashboardSelectionSection] -->
<div class="flex items-center mb-4">
<input
id="replace-db"
type="checkbox"
bind:checked={replaceDb}
on:change={() => { if (replaceDb && sourceDatabases.length === 0) fetchDatabases(); }}
class="h-4 w-4 text-indigo-600 focus:ring-indigo-500 border-gray-300 rounded"
/>
<label for="replace-db" class="ml-2 block text-sm text-gray-900">
Replace Database (Apply Mappings)
</label>
</div>
{#if replaceDb}
<div class="mb-8 p-4 border rounded-md bg-gray-50">
<h3 class="text-md font-medium mb-4">Database Mappings</h3>
{#if fetchingDbs}
<p>Loading databases and suggestions...</p>
{:else if sourceDatabases.length > 0}
<MappingTable
{sourceDatabases}
{targetDatabases}
{mappings}
{suggestions}
on:update={handleMappingUpdate}
/>
{:else if sourceEnvId && targetEnvId}
<button
on:click={fetchDatabases}
class="text-indigo-600 hover:text-indigo-500 text-sm font-medium"
>
Refresh Databases & Suggestions
</button>
{/if}
</div>
{/if}
<button
on:click={startMigration}
disabled={!sourceEnvId || !targetEnvId || sourceEnvId === targetEnvId || selectedDashboardIds.length === 0}
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:bg-gray-400"
>
Start Migration
</button>
{/if}
<div class="grid grid-cols-1 md:grid-cols-2 gap-6 mb-8">
<EnvSelector
label="Source Environment"
bind:selectedId={sourceEnvId}
{environments}
/>
<EnvSelector
label="Target Environment"
bind:selectedId={targetEnvId}
{environments}
/>
</div>
<div class="mb-8">
<label for="dashboard-regex" class="block text-sm font-medium text-gray-700 mb-1">Dashboard Regex</label>
<input
id="dashboard-regex"
type="text"
bind:value={dashboardRegex}
placeholder="e.g. ^Finance Dashboard$"
class="shadow-sm focus:ring-indigo-500 focus:border-indigo-500 block w-full sm:text-sm border-gray-300 rounded-md"
/>
<p class="mt-1 text-sm text-gray-500">Regular expression to filter dashboards to migrate.</p>
</div>
<div class="flex items-center mb-8">
<input
id="replace-db"
type="checkbox"
bind:checked={replaceDb}
class="h-4 w-4 text-indigo-600 focus:ring-indigo-500 border-gray-300 rounded"
/>
<label for="replace-db" class="ml-2 block text-sm text-gray-900">
Replace Database (Apply Mappings)
</label>
</div>
<button
on:click={startMigration}
disabled={!sourceEnvId || !targetEnvId || sourceEnvId === targetEnvId}
class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:bg-gray-400"
>
Start Migration
</button>
</div>
<!-- Modals -->
<TaskLogViewer
bind:show={showLogViewer}
taskId={logViewerTaskId}
taskStatus={logViewerTaskStatus}
on:close={() => showLogViewer = false}
/>
<PasswordPrompt
bind:show={showPasswordPrompt}
databases={passwordPromptDatabases}
errorMessage={passwordPromptErrorMessage}
on:resume={handleResumeMigration}
on:cancel={() => showPasswordPrompt = false}
/>
<!-- [/SECTION] -->
<style>


@@ -0,0 +1,120 @@
/**
* Service for interacting with the Task Management API.
*/
const API_BASE = '/api/tasks';
/**
* Fetch a list of tasks with pagination and optional status filter.
* @param {number} limit - Maximum number of tasks to return.
* @param {number} offset - Number of tasks to skip.
* @param {string|null} status - Filter by task status (optional).
* @returns {Promise<Array>} List of tasks.
*/
export async function getTasks(limit = 10, offset = 0, status = null) {
const params = new URLSearchParams({
limit: limit.toString(),
offset: offset.toString()
});
if (status) {
params.append('status', status);
}
const response = await fetch(`${API_BASE}?${params.toString()}`);
if (!response.ok) {
throw new Error(`Failed to fetch tasks: ${response.statusText}`);
}
return await response.json();
}
/**
* Fetch details for a specific task.
* @param {string} taskId - The ID of the task.
* @returns {Promise<Object>} Task details.
*/
export async function getTask(taskId) {
const response = await fetch(`${API_BASE}/${taskId}`);
if (!response.ok) {
throw new Error(`Failed to fetch task ${taskId}: ${response.statusText}`);
}
return await response.json();
}
/**
* Fetch logs for a specific task.
* @param {string} taskId - The ID of the task.
* @returns {Promise<Array>} List of log entries.
*/
export async function getTaskLogs(taskId) {
// The backend does not yet expose a dedicated GET /api/tasks/{task_id}/logs
// endpoint (planned as T017 in Phase 3), so fetch the full task object
// and return its embedded logs.
const task = await getTask(taskId);
return task.logs || [];
}
/**
* Resume a task that is awaiting input (e.g., passwords).
* @param {string} taskId - The ID of the task.
* @param {Object} passwords - Map of database names to passwords.
* @returns {Promise<Object>} Updated task object.
*/
export async function resumeTask(taskId, passwords) {
const response = await fetch(`${API_BASE}/${taskId}/resume`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ passwords })
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `Failed to resume task: ${response.statusText}`);
}
return await response.json();
}
/**
* Resolve a task that is awaiting mapping.
* @param {string} taskId - The ID of the task.
* @param {Object} resolutionParams - Resolution parameters.
* @returns {Promise<Object>} Updated task object.
*/
export async function resolveTask(taskId, resolutionParams) {
const response = await fetch(`${API_BASE}/${taskId}/resolve`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ resolution_params: resolutionParams })
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `Failed to resolve task: ${response.statusText}`);
}
return await response.json();
}
/**
* Clear tasks based on status.
* @param {string|null} status - Filter by task status (optional).
*/
export async function clearTasks(status = null) {
const params = new URLSearchParams();
if (status) {
params.append('status', status);
}
const response = await fetch(`${API_BASE}?${params.toString()}`, {
method: 'DELETE'
});
if (!response.ok) {
throw new Error(`Failed to clear tasks: ${response.statusText}`);
}
}


@@ -0,0 +1,13 @@
export interface DashboardMetadata {
id: number;
title: string;
last_modified: string;
status: string;
}
export interface DashboardSelection {
selected_ids: number[];
source_env_id: string;
target_env_id: string;
replace_db_config?: boolean;
}


@@ -6,13 +6,16 @@ export default defineConfig({
server: {
proxy: {
'/api': {
-target: 'http://localhost:8000',
target: 'http://127.0.0.1:8000',
-changeOrigin: true
changeOrigin: true,
secure: false,
rewrite: (path) => path.replace(/^\/api/, '/api')
},
'/ws': {
-target: 'ws://localhost:8000',
target: 'ws://127.0.0.1:8000',
ws: true,
-changeOrigin: true
changeOrigin: true,
secure: false
}
}
}

mappings.db (new binary file, not shown)


@@ -11,8 +11,9 @@ This protocol standardizes the "Semantic Bridge" between the two languages using
## I. CORE REQUIREMENTS
1. **Causal Validity:** Semantic definitions (Contracts) must ALWAYS precede implementation code.
2. **Immutability:** Architectural decisions defined in the Module/Component Header are treated as immutable constraints.
-3. **Format Compliance:** Output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax for structure.
3. **Format Compliance:** Output must strictly follow the `[DEF:...:...]` / `[/DEF:...:...]` anchor syntax for structure.
4. **Logic over Assertion:** Contracts define the *logic flow*. Do not generate explicit `assert` statements unless requested. The code logic itself must inherently satisfy the Pre/Post conditions (e.g., via control flow, guards, or types).
5. **Fractal Complexity:** Modules and functions must adhere to strict size limits (~300 lines/module, ~30-50 lines/function) to maintain semantic focus.
---
@@ -25,13 +26,13 @@ Used to define the boundaries of Modules, Classes, Components, and Functions.
* **Python:**
* Start: `# [DEF:identifier:Type]`
-* End: `# [/DEF:identifier]`
* End: `# [/DEF:identifier:Type]`
* **Svelte (Top-level):**
* Start: `<!-- [DEF:ComponentName:Component] -->`
-* End: `<!-- [/DEF:ComponentName] -->`
* End: `<!-- [/DEF:ComponentName:Component] -->`
* **Svelte (Script/JS/TS):**
* Start: `// [DEF:funcName:Function]`
-* End: `// [/DEF:funcName]`
* End: `// [/DEF:funcName:Function]`
**Types:** `Module`, `Component`, `Class`, `Function`, `Store`, `Action`.
@@ -63,7 +64,7 @@ Defines high-level dependencies.
# ... IMPLEMENTATION ...
-# [/DEF:module_name]
# [/DEF:module_name:Module]
```
### 2. Svelte Component Header (`.svelte`)
@@ -81,20 +82,20 @@ Defines high-level dependencies.
<script lang="ts">
// [SECTION: IMPORTS]
// ...
-// [/SECTION]
// [/SECTION: IMPORTS]
// ... LOGIC IMPLEMENTATION ...
</script>
<!-- [SECTION: TEMPLATE] -->
...
-<!-- [/SECTION] -->
<!-- [/SECTION: TEMPLATE] -->
<style>
/* ... */
</style>
-<!-- [/DEF:ComponentName] -->
<!-- [/DEF:ComponentName:Component] -->
```
---
@@ -122,7 +123,7 @@ def calculate_total(items: List[Item]) -> Decimal:
# Logic ensuring @POST
return total
-# [/DEF:calculate_total]
# [/DEF:calculate_total:Function]
```
### 2. Svelte/JS Contract Style (JSDoc)
@@ -145,7 +146,7 @@ async function updateUserProfile(profileData) {
// ...
}
-// [/DEF:updateUserProfile]
// [/DEF:updateUserProfile:Function]
```
---
@@ -154,21 +155,32 @@ async function updateUserProfile(profileData) {
Logs delineate the agent's internal state.
-* **Python:** `logger.info(f"[{ANCHOR_ID}][{STATE}] Msg")`
* **Python:** MUST use a Context Manager (e.g., `with belief_scope("ANCHOR_ID"):`) to ensure state consistency and automatic handling of Entry/Exit/Error states.
* Manual logging (inside scope): `logger.info(f"[{ANCHOR_ID}][{STATE}] Msg")`
* **Svelte/JS:** `console.log(\`[${ANCHOR_ID}][${STATE}] Msg\`)`
**Required States:**
-1. `Entry` (Start of block)
1. `Entry` (Start of block - Auto-logged by Context Manager)
-2. `Action` (Key business logic)
2. `Action` (Key business logic - Manual log)
-3. `Coherence:OK` (Logic successfully completed)
3. `Coherence:OK` (Logic successfully completed - Auto-logged by Context Manager)
-4. `Coherence:Failed` (Error handling)
4. `Coherence:Failed` (Exception/Error - Auto-logged by Context Manager)
-5. `Exit` (End of block)
5. `Exit` (End of block - Auto-logged by Context Manager)
---
-## VI. GENERATION WORKFLOW
## VI. FRACTAL COMPLEXITY LIMIT
To maintain semantic coherence and avoid "Attention Sink" issues:
* **Module Size:** If a Module body exceeds ~300 lines (or logical complexity), it MUST be refactored into sub-modules or a package structure.
* **Function Size:** Functions should fit within a standard attention "chunk" (approx. 30-50 lines). If larger, logic MUST be decomposed into helper functions with their own contracts.
This ensures every vector embedding remains sharp and focused.
---
## VII. GENERATION WORKFLOW
1. **Context Analysis:** Identify language (Python vs Svelte) and Architecture Layer.
-2. **Scaffolding:** Generate the `[DEF]` Anchors and Header/Contract **before** writing any logic.
2. **Scaffolding:** Generate the `[DEF:...:...]` Anchors and Header/Contract **before** writing any logic.
3. **Implementation:** Write the code. Ensure the code logic handles the `@PRE` conditions (e.g., via `if/return` or guards) and satisfies `@POST` conditions naturally. **Do not write explicit `assert` statements unless debugging mode is requested.**
-4. **Closure:** Ensure every `[DEF]` is closed with `[/DEF]` to accumulate semantic context.
4. **Closure:** Ensure every `[DEF:...:...]` is closed with `[/DEF:...:...]` to accumulate semantic context.


@@ -0,0 +1,34 @@
# Specification Quality Checklist: Configurable Belief State Logging
**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2025-12-26
**Feature**: [specs/006-configurable-belief-logs/spec.md](../spec.md)
## Content Quality
- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed
## Requirement Completeness
- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified
## Feature Readiness
- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification
## Notes
- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`


@@ -0,0 +1,56 @@
# API Contract: Settings Update
## PATCH /api/settings/global
Updates the global application settings, including the new logging configuration.
### Request Body
**Content-Type**: `application/json`
```json
{
"backup_path": "string",
"default_environment_id": "string (optional)",
"logging": {
"level": "string (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
"file_path": "string (optional)",
"max_bytes": "integer (default: 10485760)",
"backup_count": "integer (default: 5)",
"enable_belief_state": "boolean (default: true)"
}
}
```
### Response
**Status**: `200 OK`
**Content-Type**: `application/json`
```json
{
"backup_path": "string",
"default_environment_id": "string (optional)",
"logging": {
"level": "string",
"file_path": "string",
"max_bytes": "integer",
"backup_count": "integer",
"enable_belief_state": "boolean"
}
}
```
### Example
**Request**
```json
{
"backup_path": "backups",
"logging": {
"level": "DEBUG",
"file_path": "logs/app.log",
"enable_belief_state": true
}
}
```
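For reference, the same request can be issued from Python. A minimal sketch using the `requests` library (the host/port and field values mirror the examples above and are assumptions, not part of the contract):

```python
import requests

# Hypothetical payload mirroring the example request above.
payload = {
    "backup_path": "backups",
    "logging": {
        "level": "DEBUG",
        "file_path": "logs/app.log",
        "enable_belief_state": True,
    },
}

resp = requests.patch(
    "http://localhost:8000/api/settings/global",
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["logging"]["level"])  # expect "DEBUG" if the update applied
```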


@@ -0,0 +1,74 @@
# Data Model: Configurable Belief State Logging
## 1. Configuration Models
These models extend the existing `ConfigModels` in `backend/src/core/config_models.py`.
### 1.1. LoggingConfig
Defines the configuration for the application's logging system.
| Field | Type | Default | Description |
|---|---|---|---|
| `level` | `str` | `"INFO"` | The logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). |
| `file_path` | `Optional[str]` | `"logs/app.log"` | Path to the log file. If None, file logging is disabled. |
| `max_bytes` | `int` | `10485760` (10MB) | Maximum size of a log file before rotation. |
| `backup_count` | `int` | `5` | Number of backup files to keep. |
| `enable_belief_state` | `bool` | `True` | Whether to enable structured Belief State logging (Entry/Exit). |
```python
from typing import Optional

from pydantic import BaseModel

class LoggingConfig(BaseModel):
level: str = "INFO"
file_path: Optional[str] = "logs/app.log"
max_bytes: int = 10 * 1024 * 1024
backup_count: int = 5
enable_belief_state: bool = True
```
### 1.2. GlobalSettings (Updated)
Updates the existing `GlobalSettings` to include `LoggingConfig`.
| Field | Type | Default | Description |
|---|---|---|---|
| `logging` | `LoggingConfig` | `LoggingConfig()` | The logging configuration object. |
```python
from typing import Optional

from pydantic import BaseModel, Field

class GlobalSettings(BaseModel):
backup_path: str
default_environment_id: Optional[str] = None
logging: LoggingConfig = Field(default_factory=LoggingConfig)
```
## 2. Logger Entities
These entities are part of the `backend/src/core/logger.py` module.
### 2.1. LogEntry (Existing)
Represents a single log record.
| Field | Type | Description |
|---|---|---|
| `timestamp` | `datetime` | UTC timestamp of the log. |
| `level` | `str` | Log level. |
| `message` | `str` | Log message. |
| `context` | `Optional[Dict[str, Any]]` | Additional context data. |
### 2.2. BeliefState (Concept)
Represents the state of execution in the "Belief State" model.
- **Entry**: Entering a logical block.
- **Action**: Performing a core action within the block.
- **Coherence**: Verifying the state (OK or Failed).
- **Exit**: Exiting the logical block.
Format: `[{ANCHOR_ID}][{STATE}] {Message}`
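A minimal sketch of how such a context manager could look (illustrative only; the real `belief_scope` in `backend/src/core/logger.py` may differ in details such as log levels or context fields):

```python
import logging
from contextlib import contextmanager

logger = logging.getLogger("superset_tools_app")

@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
    """Log Entry/Coherence/Exit belief states around a logical block."""
    logger.info(f"[{anchor_id}][Entry] {message}".rstrip())
    try:
        yield
        # Reached only if the block completed without raising.
        logger.info(f"[{anchor_id}][Coherence:OK]")
        logger.info(f"[{anchor_id}][Exit]")
    except Exception as exc:
        logger.error(f"[{anchor_id}][Coherence:Failed] {exc}")
        raise
```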
## 3. Relationships
- `AppConfig` contains `GlobalSettings`.
- `GlobalSettings` contains `LoggingConfig`.
- `ConfigManager` reads/writes `AppConfig`.
- `ConfigManager` configures the global `logger` based on `LoggingConfig`.


@@ -0,0 +1,81 @@
# Implementation Plan: Configurable Belief State Logging
**Branch**: `006-configurable-belief-logs` | **Date**: 2025-12-26 | **Spec**: specs/006-configurable-belief-logs/spec.md
**Input**: Feature specification from `/specs/006-configurable-belief-logs/spec.md`
**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.
## Summary
Implement a configurable logging system with "Belief State" support (Entry, Action, Coherence, Exit).
Approach: Use Python's `logging` module with a custom Context Manager (`belief_scope`) and extend `GlobalSettings` with a `LoggingConfig` model.
## Technical Context
**Language/Version**: Python 3.9+
**Primary Dependencies**: FastAPI (Backend), Pydantic (Config), Svelte (Frontend)
**Storage**: Filesystem (for log files), JSON (for configuration persistence)
**Testing**: pytest (Backend), manual verification (Frontend)
**Target Platform**: Linux server (primary), cross-platform compatible
**Project Type**: Web application (Backend + Frontend)
**Performance Goals**: Low-overhead logging (<1ms per log entry), largely non-blocking for the main thread
**Constraints**: Must use standard library `logging` where possible; Log rotation to prevent disk overflow
**Scale/Scope**: Configurable log levels and retention policies
## Constitution Check
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
- **Library-First**: N/A (Core infrastructure)
- **CLI Interface**: N/A (Configured via UI/API/JSON)
- **Test-First**: Will require unit tests for `belief_scope` and config updates; see the sketch after this list.
- **Integration Testing**: Will require testing the settings API.
- **Observability**: This feature *is* the observability enhancement.
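As a sketch of what the Test-First item could look like in practice (pytest; assumes the `belief_scope` API shown in the quickstart and that the application logger propagates to the root logger, both of which are assumptions here):

```python
import logging

import pytest

from backend.src.core.logger import belief_scope

def test_belief_scope_logs_entry_and_exit(caplog):
    with caplog.at_level(logging.INFO):
        with belief_scope("Demo"):
            pass
    messages = [r.getMessage() for r in caplog.records]
    assert any("[Demo][Entry]" in m for m in messages)
    assert any("[Demo][Exit]" in m for m in messages)

def test_belief_scope_logs_failure_and_reraises(caplog):
    with caplog.at_level(logging.INFO):
        with pytest.raises(ValueError):
            with belief_scope("Demo"):
                raise ValueError("boom")
    assert any("[Demo][Coherence:Failed]" in r.getMessage() for r in caplog.records)
```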
## Project Structure
### Documentation (this feature)
```text
specs/[###-feature]/
├── plan.md # This file (/speckit.plan command output)
├── research.md # Phase 0 output (/speckit.plan command)
├── data-model.md # Phase 1 output (/speckit.plan command)
├── quickstart.md # Phase 1 output (/speckit.plan command)
├── contracts/ # Phase 1 output (/speckit.plan command)
└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
```
### Source Code (repository root)
```text
backend/
├── src/
│ ├── core/
│ │ ├── config_models.py # Add LoggingConfig
│ │ ├── config_manager.py # Update config loading/saving
│ │ └── logger.py # Add belief_scope and configure_logger
│ └── api/
│ └── routes/
│ └── settings.py # Update GlobalSettings endpoint
└── tests/
└── test_logger.py # New tests for logger logic
frontend/
├── src/
│ ├── pages/
│ │ └── Settings.svelte # Add logging UI controls
│ └── lib/
│ └── api.js # Update API calls if needed
```
**Structure Decision**: Enhance the existing Backend/Frontend structure.
## Complexity Tracking
> **Fill ONLY if Constitution Check has violations that must be justified**
| Violation | Why Needed | Simpler Alternative Rejected Because |
|-----------|------------|-------------------------------------|
| [e.g., 4th project] | [current need] | [why 3 projects insufficient] |
| [e.g., Repository pattern] | [specific problem] | [why direct DB access insufficient] |


@@ -0,0 +1,104 @@
# Quickstart: Configurable Belief State Logging
## 1. Configuration
The logging system is configured via the `GlobalSettings` in `config.json` or through the Settings UI.
### 1.1. Default Configuration
```json
{
"environments": [],
"settings": {
"backup_path": "backups",
"logging": {
"level": "INFO",
"file_path": "logs/app.log",
"max_bytes": 10485760,
"backup_count": 5,
"enable_belief_state": true
}
}
}
```
### 1.2. Changing Log Level
To change the log level to `DEBUG`, update the `logging.level` field in `config.json` or use the API:
```bash
curl -X PATCH http://localhost:8000/api/settings/global \
-H "Content-Type: application/json" \
-d '{
"backup_path": "backups",
"logging": {
"level": "DEBUG",
"file_path": "logs/app.log",
"max_bytes": 10485760,
"backup_count": 5,
"enable_belief_state": true
}
}'
```
## 2. Using Belief State Logging
### 2.1. Basic Usage
Use the `belief_scope` context manager to automatically log Entry, Exit, and Coherence states.
```python
from backend.src.core.logger import logger, belief_scope
def my_function():
with belief_scope("MyFunction"):
# Logs: [MyFunction][Entry]
logger.info("Doing something important")
# Logs: [MyFunction][Action] Doing something important
# ... logic ...
# Logs: [MyFunction][Coherence:OK]
# Logs: [MyFunction][Exit]
```
### 2.2. Error Handling
If an exception occurs within the scope, it is caught, logged as a failure, and re-raised.
```python
def failing_function():
with belief_scope("FailingFunc"):
raise ValueError("Something went wrong")
# Logs: [FailingFunc][Entry]
# Logs: [FailingFunc][Coherence:Failed] Something went wrong
# Re-raises ValueError
```
### 2.3. Custom Messages
You can provide an optional message to `belief_scope`.
```python
with belief_scope("DataProcessor", "Processing batch #1"):
# Logs: [DataProcessor][Entry] Processing batch #1
pass
```
## 3. Log Output Format
### 3.1. Standard Log
```text
[2025-12-26 10:00:00,000][INFO][superset_tools_app] System initialized
```
### 3.2. Belief State Log
```text
[2025-12-26 10:00:01,000][INFO][superset_tools_app] [MyFunction][Entry]
[2025-12-26 10:00:01,050][INFO][superset_tools_app] [MyFunction][Action] Processing data
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Coherence:OK]
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Exit]
```

Some files were not shown because too many files have changed in this diff.