Compare commits

Comparing 4c9d554432...009-backup (7 commits)

| SHA1 |
|---|
| 297b29986d |
| 4c6fc8256d |
| a747a163c8 |
| fce0941e98 |
| 45c077b928 |
| 9ed3a5992d |
| a032fe8457 |
@@ -16,6 +16,8 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
 - Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
 - SQLite (optional for job history), existing database for mappings (008-migration-ui-improvements)
 - Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
+- Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS (009-backup-scheduler)
+- SQLite (`tasks.db`), JSON (`config.json`) (009-backup-scheduler)

 - Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)

@@ -36,9 +38,9 @@ cd src; pytest; ruff check .
 Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions

 ## Recent Changes
-- 008-migration-ui-improvements: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
-- 008-migration-ui-improvements: Added Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
-- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
+- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
+- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
+- 009-backup-scheduler: Added [if applicable, e.g., PostgreSQL, CoreData, files or N/A]


 <!-- MANUAL ADDITIONS START -->
.kilocodemodes (new file, 75 lines)
@@ -0,0 +1,75 @@
customModes:
  - slug: tester
    name: Tester
    description: QA and Plan Verification Specialist
    roleDefinition: >-
      You are Kilo Code, acting as a QA and Verification Specialist. Your primary goal is to validate that the project implementation aligns strictly with the defined specifications and task plans.

      Your responsibilities include:
      - Reading and analyzing task plans and specifications (typically in the `specs/` directory).
      - Verifying that implemented code matches the requirements.
      - Executing tests and validating system behavior via CLI or Browser.
      - Updating the status of tasks in the plan files (e.g., marking checkboxes [x]) as they are verified.
      - Identifying and reporting missing features or bugs.
    whenToUse: >-
      Use this mode when you need to audit the progress of a project, verify completed tasks against the plan, run quality assurance checks, or update the status of task lists in specification documents.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    customInstructions: >-
      1. Always begin by loading the relevant plan or task list from the `specs/` directory.
      2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit.
      3. When updating task lists, ensure you only mark items as complete if you have verified them.
  - slug: semantic
    name: Semantic Agent
    description: Codebase semantic mapping and compliance expert
    roleDefinition: >-
      You are Kilo Code, a Semantic Agent responsible for maintaining the semantic integrity of the codebase. Your primary goal is to ensure that all code entities (Modules, Classes, Functions, Components) are properly annotated with semantic anchors and tags as defined in `semantic_protocol.md`.

      Your core responsibilities are:
      1. **Semantic Mapping**: You run and maintain the `generate_semantic_map.py` script to generate up-to-date semantic maps (`semantics/semantic_map.json`, `specs/project_map.md`) and compliance reports (`semantics/reports/*.md`).
      2. **Compliance Auditing**: You analyze the generated compliance reports to identify files with low semantic coverage or parsing errors.
      3. **Semantic Enrichment**: You actively edit code files to add missing semantic anchors (`[DEF:...]`, `[/DEF:...]`) and mandatory tags (`@PURPOSE`, `@LAYER`, etc.) to improve the global compliance score.
      4. **Protocol Enforcement**: You strictly adhere to the syntax and rules defined in `semantic_protocol.md` when modifying code.

      You have access to the full codebase and tools to read, write, and execute scripts. You should prioritize fixing "Critical Parsing Errors" (unclosed anchors) before addressing missing metadata.
    whenToUse: >-
      Use this mode when you need to update the project's semantic map, fix semantic compliance issues (missing anchors/tags), or analyze the codebase structure. This mode is specialized for maintaining the `semantic_protocol.md` standards.
    groups:
      - read
      - edit
      - command
      - browser
      - mcp
    customInstructions: >-
      Always check `semantics/reports/` for the latest compliance status before starting work.
      When fixing a file, try to fix all semantic issues in that file at once.
      After making a batch of fixes, run `python3 generate_semantic_map.py` to verify improvements.
  - slug: product-manager
    name: Product Manager
    description: Executes SpecKit workflows for feature management
    roleDefinition: >-
      You are Kilo Code, acting as a Product Manager. Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.

      You act as the orchestrator for:
      - Specification (`speckit.specify`, `speckit.clarify`)
      - Planning (`speckit.plan`)
      - Task Management (`speckit.tasks`, `speckit.taskstoissues`)
      - Quality Assurance (`speckit.analyze`, `speckit.checklist`)
      - Governance (`speckit.constitution`)
      - Implementation Oversight (`speckit.implement`)

      For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
    whenToUse: >-
      Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
    groups:
      - read
      - edit
      - command
      - mcp
    customInstructions: >-
      1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command.
      2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
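For a quick sanity check that the new modes file parses and defines the expected modes, something like the following works (an illustrative sketch only, not part of this change; PyYAML is already a project dependency):

```python
# Sketch: confirm .kilocodemodes is valid YAML and list the mode slugs.
import yaml

with open(".kilocodemodes") as f:
    data = yaml.safe_load(f)

for mode in data.get("customModes", []):
    # Each entry defines slug, name, description, roleDefinition, groups, ...
    print(f"{mode['slug']}: {mode.get('description', '')}")
```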
@@ -9,8 +9,8 @@
 #
 # OPTIONS:
 #   --json            Output in JSON format
-#   --require-tasks   Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
-#   --include-tasks   Include task files in AVAILABLE_DOCS list
+#   --require-tasks   Require tasks.md to exist (for implementation phase)
+#   --include-tasks   Include tasks.md in AVAILABLE_DOCS list
 #   --paths-only      Only output path variables (no validation)
 #   --help, -h        Show help message
 #
@@ -49,8 +49,8 @@ Consolidated prerequisite checking for Spec-Driven Development workflow.

 OPTIONS:
   --json            Output in JSON format
-  --require-tasks   Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
-  --include-tasks   Include task files in AVAILABLE_DOCS list
+  --require-tasks   Require tasks.md to exist (for implementation phase)
+  --include-tasks   Include tasks.md in AVAILABLE_DOCS list
   --paths-only      Only output path variables (no prerequisite validation)
   --help, -h        Show this help message

@@ -58,7 +58,7 @@ EXAMPLES:
   # Check task prerequisites (plan.md required)
   ./check-prerequisites.sh --json

-  # Check implementation prerequisites (plan.md + task files required)
+  # Check implementation prerequisites (plan.md + tasks.md required)
   ./check-prerequisites.sh --json --require-tasks --include-tasks

   # Get feature paths only (no validation)
@@ -86,16 +86,15 @@ check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1
 if $PATHS_ONLY; then
     if $JSON_MODE; then
         # Minimal JSON paths payload (no validation performed)
-        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS_ARCH":"%s","TASKS_DEV":"%s"}\n' \
-            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS_ARCH" "$TASKS_DEV"
+        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
+            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
     else
         echo "REPO_ROOT: $REPO_ROOT"
         echo "BRANCH: $CURRENT_BRANCH"
         echo "FEATURE_DIR: $FEATURE_DIR"
         echo "FEATURE_SPEC: $FEATURE_SPEC"
         echo "IMPL_PLAN: $IMPL_PLAN"
-        echo "TASKS_ARCH: $TASKS_ARCH"
-        echo "TASKS_DEV: $TASKS_DEV"
+        echo "TASKS: $TASKS"
     fi
     exit 0
 fi
@@ -113,21 +112,12 @@ if [[ ! -f "$IMPL_PLAN" ]]; then
     exit 1
 fi

-# Check for task files if required
-if $REQUIRE_TASKS; then
-    # Check for split tasks first
-    if [[ -f "$TASKS_ARCH" ]] && [[ -f "$TASKS_DEV" ]]; then
-        : # Split tasks exist, proceed
-    # Fallback to unified tasks.md
-    elif [[ -f "$TASKS" ]]; then
-        : # Unified tasks exist, proceed
-    else
-        echo "ERROR: No valid task files found in $FEATURE_DIR" >&2
-        echo "Expected 'tasks-arch.md' AND 'tasks-dev.md' (split) OR 'tasks.md' (unified)" >&2
-        echo "Run /speckit.tasks first to create the task lists." >&2
-        exit 1
-    fi
-fi
+# Check for tasks.md if required
+if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
+    echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
+    echo "Run /speckit.tasks first to create the task list." >&2
+    exit 1
+fi

 # Build list of available documents
 docs=()
@@ -143,15 +133,10 @@ fi

 [[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")

-# Include task files if requested and they exist
-if $INCLUDE_TASKS; then
-    if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
-        [[ -f "$TASKS_ARCH" ]] && docs+=("tasks-arch.md")
-        [[ -f "$TASKS_DEV" ]] && docs+=("tasks-dev.md")
-    elif [[ -f "$TASKS" ]]; then
-        docs+=("tasks.md")
-    fi
-fi
+# Include tasks.md if requested and it exists
+if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
+    docs+=("tasks.md")
+fi

 # Output results
 if $JSON_MODE; then
@@ -176,11 +161,6 @@ else
     check_file "$QUICKSTART" "quickstart.md"

     if $INCLUDE_TASKS; then
-        if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
-            check_file "$TASKS_ARCH" "tasks-arch.md"
-            check_file "$TASKS_DEV" "tasks-dev.md"
-        else
-            check_file "$TASKS" "tasks.md"
-        fi
+        check_file "$TASKS" "tasks.md"
     fi
 fi

@@ -143,9 +143,7 @@ HAS_GIT='$has_git_repo'
 FEATURE_DIR='$feature_dir'
 FEATURE_SPEC='$feature_dir/spec.md'
 IMPL_PLAN='$feature_dir/plan.md'
-TASKS_ARCH='$feature_dir/tasks-arch.md'
-TASKS_DEV='$feature_dir/tasks-dev.md'
-TASKS='$feature_dir/tasks.md' # Deprecated
+TASKS='$feature_dir/tasks.md'
 RESEARCH='$feature_dir/research.md'
 DATA_MODEL='$feature_dir/data-model.md'
 QUICKSTART='$feature_dir/quickstart.md'
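For reference, a caller can consume the simplified JSON payload like this (a minimal sketch, assuming the script is invoked from the repo root; the single TASKS key replaces the removed TASKS_ARCH/TASKS_DEV pair):

```python
# Sketch: read the consolidated paths payload from check-prerequisites.sh.
import json
import subprocess

result = subprocess.run(
    ["./check-prerequisites.sh", "--json", "--paths-only"],
    capture_output=True, text=True, check=True,
)
paths = json.loads(result.stdout)
print(paths["TASKS"])  # e.g. <FEATURE_DIR>/tasks.md
```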
.specify/templates/tasks-template.md (new file, 251 lines)
@@ -0,0 +1,251 @@
---
description: "Task list template for feature implementation"
---

# Tasks: [FEATURE NAME]

**Input**: Design documents from `/specs/[###-feature-name]/`
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/

**Tests**: The examples below include test tasks. Tests are OPTIONAL - only include them if explicitly requested in the feature specification.

**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.

## Format: `[ID] [P?] [Story] Description`

- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions

## Path Conventions

- **Single project**: `src/`, `tests/` at repository root
- **Web app**: `backend/src/`, `frontend/src/`
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
- Paths shown below assume single project - adjust based on plan.md structure

<!--
============================================================================
IMPORTANT: The tasks below are SAMPLE TASKS for illustration purposes only.

The /speckit.tasks command MUST replace these with actual tasks based on:
- User stories from spec.md (with their priorities P1, P2, P3...)
- Feature requirements from plan.md
- Entities from data-model.md
- Endpoints from contracts/

Tasks MUST be organized by user story so each story can be:
- Implemented independently
- Tested independently
- Delivered as an MVP increment

DO NOT keep these sample tasks in the generated tasks.md file.
============================================================================
-->

## Phase 1: Setup (Shared Infrastructure)

**Purpose**: Project initialization and basic structure

- [ ] T001 Create project structure per implementation plan
- [ ] T002 Initialize [language] project with [framework] dependencies
- [ ] T003 [P] Configure linting and formatting tools

---

## Phase 2: Foundational (Blocking Prerequisites)

**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented

**⚠️ CRITICAL**: No user story work can begin until this phase is complete

Examples of foundational tasks (adjust based on your project):

- [ ] T004 Setup database schema and migrations framework
- [ ] T005 [P] Implement authentication/authorization framework
- [ ] T006 [P] Setup API routing and middleware structure
- [ ] T007 Create base models/entities that all stories depend on
- [ ] T008 Configure error handling and logging infrastructure
- [ ] T009 Setup environment configuration management

**Checkpoint**: Foundation ready - user story implementation can now begin in parallel

---

## Phase 3: User Story 1 - [Title] (Priority: P1) 🎯 MVP

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 1 (OPTIONAL - only if tests requested) ⚠️

> **NOTE: Write these tests FIRST, ensure they FAIL before implementation**

- [ ] T010 [P] [US1] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T011 [P] [US1] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 1

- [ ] T012 [P] [US1] Create [Entity1] model in src/models/[entity1].py
- [ ] T013 [P] [US1] Create [Entity2] model in src/models/[entity2].py
- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T016 [US1] Add validation and error handling
- [ ] T017 [US1] Add logging for user story 1 operations

**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently

---

## Phase 4: User Story 2 - [Title] (Priority: P2)

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 2 (OPTIONAL - only if tests requested) ⚠️

- [ ] T018 [P] [US2] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T019 [P] [US2] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 2

- [ ] T020 [P] [US2] Create [Entity] model in src/models/[entity].py
- [ ] T021 [US2] Implement [Service] in src/services/[service].py
- [ ] T022 [US2] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T023 [US2] Integrate with User Story 1 components (if needed)

**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently

---

## Phase 5: User Story 3 - [Title] (Priority: P3)

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 3 (OPTIONAL - only if tests requested) ⚠️

- [ ] T024 [P] [US3] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T025 [P] [US3] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 3

- [ ] T026 [P] [US3] Create [Entity] model in src/models/[entity].py
- [ ] T027 [US3] Implement [Service] in src/services/[service].py
- [ ] T028 [US3] Implement [endpoint/feature] in src/[location]/[file].py

**Checkpoint**: All user stories should now be independently functional

---

[Add more user story phases as needed, following the same pattern]

---

## Phase N: Polish & Cross-Cutting Concerns

**Purpose**: Improvements that affect multiple user stories

- [ ] TXXX [P] Documentation updates in docs/
- [ ] TXXX Code cleanup and refactoring
- [ ] TXXX Performance optimization across all stories
- [ ] TXXX [P] Additional unit tests (if requested) in tests/unit/
- [ ] TXXX Security hardening
- [ ] TXXX Run quickstart.md validation

---

## Dependencies & Execution Order

### Phase Dependencies

- **Setup (Phase 1)**: No dependencies - can start immediately
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
  - User stories can then proceed in parallel (if staffed)
  - Or sequentially in priority order (P1 → P2 → P3)
- **Polish (Final Phase)**: Depends on all desired user stories being complete

### User Story Dependencies

- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable

### Within Each User Story

- Tests (if included) MUST be written and FAIL before implementation
- Models before services
- Services before endpoints
- Core implementation before integration
- Story complete before moving to next priority

### Parallel Opportunities

- All Setup tasks marked [P] can run in parallel
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
- Once Foundational phase completes, all user stories can start in parallel (if team capacity allows)
- All tests for a user story marked [P] can run in parallel
- Models within a story marked [P] can run in parallel
- Different user stories can be worked on in parallel by different team members

---

## Parallel Example: User Story 1

```bash
# Launch all tests for User Story 1 together (if tests requested):
Task: "Contract test for [endpoint] in tests/contract/test_[name].py"
Task: "Integration test for [user journey] in tests/integration/test_[name].py"

# Launch all models for User Story 1 together:
Task: "Create [Entity1] model in src/models/[entity1].py"
Task: "Create [Entity2] model in src/models/[entity2].py"
```

---

## Implementation Strategy

### MVP First (User Story 1 Only)

1. Complete Phase 1: Setup
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
3. Complete Phase 3: User Story 1
4. **STOP and VALIDATE**: Test User Story 1 independently
5. Deploy/demo if ready

### Incremental Delivery

1. Complete Setup + Foundational → Foundation ready
2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
3. Add User Story 2 → Test independently → Deploy/Demo
4. Add User Story 3 → Test independently → Deploy/Demo
5. Each story adds value without breaking previous stories

### Parallel Team Strategy

With multiple developers:

1. Team completes Setup + Foundational together
2. Once Foundational is done:
   - Developer A: User Story 1
   - Developer B: User Story 2
   - Developer C: User Story 3
3. Stories complete and integrate independently

---

## Notes

- [P] tasks = different files, no dependencies
- [Story] label maps task to specific user story for traceability
- Each user story should be independently completable and testable
- Verify tests fail before implementing
- Commit after each task or logical group
- Stop at any checkpoint to validate story independently
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
Binary file not shown.
@@ -1,14 +1,43 @@
-fastapi
-uvicorn
-pydantic
-authlib
-python-multipart
-starlette
-jsonschema
-requests
-keyring
-httpx
-PyYAML
-websockets
-rapidfuzz
-sqlalchemy
+annotated-doc==0.0.4
+annotated-types==0.7.0
+anyio==4.12.0
+APScheduler==3.11.2
+attrs==25.4.0
+Authlib==1.6.6
+certifi==2025.11.12
+cffi==2.0.0
+charset-normalizer==3.4.4
+click==8.3.1
+cryptography==46.0.3
+fastapi==0.126.0
+greenlet==3.3.0
+h11==0.16.0
+httpcore==1.0.9
+httpx==0.28.1
+idna==3.11
+jaraco.classes==3.4.0
+jaraco.context==6.0.1
+jaraco.functools==4.3.0
+jeepney==0.9.0
+jsonschema==4.25.1
+jsonschema-specifications==2025.9.1
+keyring==25.7.0
+more-itertools==10.8.0
+pycparser==2.23
+pydantic==2.12.5
+pydantic_core==2.41.5
+python-multipart==0.0.21
+PyYAML==6.0.3
+RapidFuzz==3.14.3
+referencing==0.37.0
+requests==2.32.5
+rpds-py==0.30.0
+SecretStorage==3.5.0
+SQLAlchemy==2.0.45
+starlette==0.50.0
+typing-inspection==0.4.2
+typing_extensions==4.15.0
+tzlocal==5.3.1
+urllib3==2.6.2
+uvicorn==0.38.0
+websockets==15.0.1
@@ -49,4 +49,4 @@ async def get_current_user(token: str = Depends(oauth2_scheme)):
     )
     # A real implementation would return a user object.
     return {"placeholder_user": "user@example.com"}
-# [/DEF]
+# [/DEF:AuthModule:Module]
@@ -11,27 +11,35 @@
 # [SECTION: IMPORTS]
 from fastapi import APIRouter, Depends, HTTPException
 from typing import List, Dict, Optional
-from backend.src.dependencies import get_config_manager
+from backend.src.dependencies import get_config_manager, get_scheduler_service
 from backend.src.core.superset_client import SupersetClient
 from superset_tool.models import SupersetConfig
-from pydantic import BaseModel
+from pydantic import BaseModel, Field
+from backend.src.core.config_models import Environment as EnvModel
 # [/SECTION]

-router = APIRouter(prefix="/api/environments", tags=["environments"])
+router = APIRouter()

+# [DEF:ScheduleSchema:DataClass]
+class ScheduleSchema(BaseModel):
+    enabled: bool = False
+    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
+# [/DEF:ScheduleSchema:DataClass]

 # [DEF:EnvironmentResponse:DataClass]
 class EnvironmentResponse(BaseModel):
     id: str
     name: str
     url: str
-# [/DEF:EnvironmentResponse]
+    backup_schedule: Optional[ScheduleSchema] = None
+# [/DEF:EnvironmentResponse:DataClass]

 # [DEF:DatabaseResponse:DataClass]
 class DatabaseResponse(BaseModel):
     uuid: str
     database_name: str
     engine: Optional[str]
-# [/DEF:DatabaseResponse]
+# [/DEF:DatabaseResponse:DataClass]

 # [DEF:get_environments:Function]
 # @PURPOSE: List all configured environments.
@@ -39,8 +47,49 @@ class DatabaseResponse(BaseModel):
 @router.get("", response_model=List[EnvironmentResponse])
 async def get_environments(config_manager=Depends(get_config_manager)):
     envs = config_manager.get_environments()
-    return [EnvironmentResponse(id=e.id, name=e.name, url=e.url) for e in envs]
-# [/DEF:get_environments]
+    # Ensure envs is a list
+    if not isinstance(envs, list):
+        envs = []
+    return [
+        EnvironmentResponse(
+            id=e.id,
+            name=e.name,
+            url=e.url,
+            backup_schedule=ScheduleSchema(
+                enabled=e.backup_schedule.enabled,
+                cron_expression=e.backup_schedule.cron_expression
+            ) if e.backup_schedule else None
+        ) for e in envs
+    ]
+# [/DEF:get_environments:Function]
+
+# [DEF:update_environment_schedule:Function]
+# @PURPOSE: Update backup schedule for an environment.
+# @PARAM: id (str) - The environment ID.
+# @PARAM: schedule (ScheduleSchema) - The new schedule.
+@router.put("/{id}/schedule")
+async def update_environment_schedule(
+    id: str,
+    schedule: ScheduleSchema,
+    config_manager=Depends(get_config_manager),
+    scheduler_service=Depends(get_scheduler_service)
+):
+    envs = config_manager.get_environments()
+    env = next((e for e in envs if e.id == id), None)
+    if not env:
+        raise HTTPException(status_code=404, detail="Environment not found")
+
+    # Update environment config
+    env.backup_schedule.enabled = schedule.enabled
+    env.backup_schedule.cron_expression = schedule.cron_expression
+
+    config_manager.update_environment(id, env)
+
+    # Refresh scheduler
+    scheduler_service.load_schedules()
+
+    return {"message": "Schedule updated successfully"}
+# [/DEF:update_environment_schedule:Function]

 # [DEF:get_environment_databases:Function]
 # @PURPOSE: Fetch the list of databases from a specific environment.
@@ -70,6 +119,6 @@ async def get_environment_databases(id: str, config_manager=Depends(get_config_m
         return client.get_databases_summary()
     except Exception as e:
         raise HTTPException(status_code=500, detail=f"Failed to fetch databases: {str(e)}")
-# [/DEF:get_environment_databases]
+# [/DEF:get_environment_databases:Function]

-# [/DEF:backend.src.api.routes.environments]
+# [/DEF:backend.src.api.routes.environments:Module]
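The cron pattern on `ScheduleSchema` can be exercised on its own; a minimal sketch (the model is copied from the hunk above, the example expressions are illustrative):

```python
# Sketch: exercise the cron_expression pattern from ScheduleSchema above.
from pydantic import BaseModel, Field, ValidationError

class ScheduleSchema(BaseModel):
    enabled: bool = False
    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')

ScheduleSchema(enabled=True, cron_expression="0 2 * * *")  # five-field form: nightly at 02:00
ScheduleSchema(cron_expression="@daily")                   # macro form
try:
    ScheduleSchema(cron_expression="every night")          # no digit/star fields: rejected
except ValidationError as e:
    print(e.errors()[0]["type"])  # string_pattern_mismatch
```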
@@ -29,7 +29,7 @@ class MappingCreate(BaseModel):
     target_db_uuid: str
     source_db_name: str
     target_db_name: str
-# [/DEF:MappingCreate]
+# [/DEF:MappingCreate:DataClass]

 # [DEF:MappingResponse:DataClass]
 class MappingResponse(BaseModel):
@@ -43,13 +43,13 @@ class MappingResponse(BaseModel):

     class Config:
         from_attributes = True
-# [/DEF:MappingResponse]
+# [/DEF:MappingResponse:DataClass]

 # [DEF:SuggestRequest:DataClass]
 class SuggestRequest(BaseModel):
     source_env_id: str
     target_env_id: str
-# [/DEF:SuggestRequest]
+# [/DEF:SuggestRequest:DataClass]

 # [DEF:get_mappings:Function]
 # @PURPOSE: List all saved database mappings.
@@ -65,7 +65,7 @@ async def get_mappings(
     if target_env_id:
         query = query.filter(DatabaseMapping.target_env_id == target_env_id)
     return query.all()
-# [/DEF:get_mappings]
+# [/DEF:get_mappings:Function]

 # [DEF:create_mapping:Function]
 # @PURPOSE: Create or update a database mapping.
@@ -90,7 +90,7 @@ async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
     db.commit()
     db.refresh(new_mapping)
     return new_mapping
-# [/DEF:create_mapping]
+# [/DEF:create_mapping:Function]

 # [DEF:suggest_mappings_api:Function]
 # @PURPOSE: Get suggested mappings based on fuzzy matching.
@@ -105,6 +105,6 @@ async def suggest_mappings_api(
         return await service.get_suggestions(request.source_env_id, request.target_env_id)
     except Exception as e:
         raise HTTPException(status_code=500, detail=str(e))
-# [/DEF:suggest_mappings_api]
+# [/DEF:suggest_mappings_api:Function]

-# [/DEF:backend.src.api.routes.mappings]
+# [/DEF:backend.src.api.routes.mappings:Module]
@@ -37,7 +37,7 @@ async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)
     client = SupersetClient(config)
     dashboards = client.get_dashboards_summary()
     return dashboards
-# [/DEF:get_dashboards]
+# [/DEF:get_dashboards:Function]

 # [DEF:execute_migration:Function]
 # @PURPOSE: Execute the migration of selected dashboards.
@@ -55,17 +55,22 @@ async def execute_migration(selection: DashboardSelection, config_manager=Depend

     # Create migration task with debug logging
     from ...core.logger import logger
-    logger.info(f"Creating migration task with selection: {selection.dict()}")
+
+    # Include replace_db_config in the task parameters
+    task_params = selection.dict()
+    task_params['replace_db_config'] = selection.replace_db_config
+
+    logger.info(f"Creating migration task with params: {task_params}")
     logger.info(f"Available environments: {env_ids}")
     logger.info(f"Source env: {selection.source_env_id}, Target env: {selection.target_env_id}")

     try:
-        task = await task_manager.create_task("superset-migration", selection.dict())
+        task = await task_manager.create_task("superset-migration", task_params)
         logger.info(f"Task created successfully: {task.id}")
         return {"task_id": task.id, "message": "Migration initiated"}
     except Exception as e:
         logger.error(f"Task creation failed: {e}")
         raise HTTPException(status_code=500, detail=f"Failed to create migration task: {str(e)}")
-# [/DEF:execute_migration]
+# [/DEF:execute_migration:Function]

-# [/DEF:backend.src.api.routes.migration]
+# [/DEF:backend.src.api.routes.migration:Module]
@@ -19,4 +19,4 @@ async def list_plugins(
     Retrieve a list of all available plugins.
     """
     return plugin_loader.get_all_plugin_configs()
-# [/DEF]
+# [/DEF:PluginsRouter:Module]
@@ -35,7 +35,7 @@ async def get_settings(config_manager: ConfigManager = Depends(get_config_manage
         if env.password:
             env.password = "********"
     return config
-# [/DEF:get_settings]
+# [/DEF:get_settings:Function]

 # [DEF:update_global_settings:Function]
 # @PURPOSE: Updates global application settings.
@@ -49,7 +49,7 @@ async def update_global_settings(
     logger.info("[update_global_settings][Entry] Updating global settings")
     config_manager.update_global_settings(settings)
     return settings
-# [/DEF:update_global_settings]
+# [/DEF:update_global_settings:Function]

 # [DEF:get_environments:Function]
 # @PURPOSE: Lists all configured Superset environments.
@@ -58,7 +58,7 @@ async def update_global_settings(
 async def get_environments(config_manager: ConfigManager = Depends(get_config_manager)):
     logger.info("[get_environments][Entry] Fetching environments")
     return config_manager.get_environments()
-# [/DEF:get_environments]
+# [/DEF:get_environments:Function]

 # [DEF:add_environment:Function]
 # @PURPOSE: Adds a new Superset environment.
@@ -91,7 +91,7 @@ async def add_environment(

     config_manager.add_environment(env)
     return env
-# [/DEF:add_environment]
+# [/DEF:add_environment:Function]

 # [DEF:update_environment:Function]
 # @PURPOSE: Updates an existing Superset environment.
@@ -134,7 +134,7 @@ async def update_environment(
     if config_manager.update_environment(id, env):
         return env
     raise HTTPException(status_code=404, detail=f"Environment {id} not found")
-# [/DEF:update_environment]
+# [/DEF:update_environment:Function]

 # [DEF:delete_environment:Function]
 # @PURPOSE: Deletes a Superset environment.
@@ -147,7 +147,7 @@ async def delete_environment(
     logger.info(f"[delete_environment][Entry] Deleting environment {id}")
     config_manager.delete_environment(id)
     return {"message": f"Environment {id} deleted"}
-# [/DEF:delete_environment]
+# [/DEF:delete_environment:Function]

 # [DEF:test_environment_connection:Function]
 # @PURPOSE: Tests the connection to a Superset environment.
@@ -190,7 +190,7 @@ async def test_environment_connection(
     except Exception as e:
         logger.error(f"[test_environment_connection][Coherence:Failed] Connection failed for {id}: {e}")
         return {"status": "error", "message": str(e)}
-# [/DEF:test_environment_connection]
+# [/DEF:test_environment_connection:Function]

 # [DEF:validate_backup_path:Function]
 # @PURPOSE: Validates if a backup path exists and is writable.
@@ -213,6 +213,6 @@ async def validate_backup_path(
         return {"status": "error", "message": message}

     return {"status": "success", "message": message}
-# [/DEF:validate_backup_path]
+# [/DEF:validate_backup_path:Function]

-# [/DEF:SettingsRouter]
+# [/DEF:SettingsRouter:Module]
@@ -22,7 +22,7 @@ class ResolveTaskRequest(BaseModel):
 class ResumeTaskRequest(BaseModel):
     passwords: Dict[str, str]

-@router.post("/", response_model=Task, status_code=status.HTTP_201_CREATED)
+@router.post("", response_model=Task, status_code=status.HTTP_201_CREATED)
 async def create_task(
     request: CreateTaskRequest,
     task_manager: TaskManager = Depends(get_task_manager)
@@ -39,14 +39,17 @@ async def create_task(
     except ValueError as e:
         raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e))

-@router.get("/", response_model=List[Task])
+@router.get("", response_model=List[Task])
 async def list_tasks(
+    limit: int = 10,
+    offset: int = 0,
+    status: Optional[TaskStatus] = None,
     task_manager: TaskManager = Depends(get_task_manager)
 ):
     """
-    Retrieve a list of all tasks.
+    Retrieve a list of tasks with pagination and optional status filter.
     """
-    return task_manager.get_all_tasks()
+    return task_manager.get_tasks(limit=limit, offset=offset, status=status)

 @router.get("/{task_id}", response_model=Task)
 async def get_task(
@@ -61,6 +64,19 @@ async def get_task(
         raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
     return task

+@router.get("/{task_id}/logs", response_model=List[LogEntry])
+async def get_task_logs(
+    task_id: str,
+    task_manager: TaskManager = Depends(get_task_manager)
+):
+    """
+    Retrieve logs for a specific task.
+    """
+    task = task_manager.get_task(task_id)
+    if not task:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
+    return task_manager.get_task_logs(task_id)
+
 @router.post("/{task_id}/resolve", response_model=Task)
 async def resolve_task(
     task_id: str,
@@ -90,4 +106,15 @@ async def resume_task(
         return task_manager.get_task(task_id)
     except ValueError as e:
         raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
-# [/DEF]
+
+@router.delete("", status_code=status.HTTP_204_NO_CONTENT)
+async def clear_tasks(
+    status: Optional[TaskStatus] = None,
+    task_manager: TaskManager = Depends(get_task_manager)
+):
+    """
+    Clear tasks matching the status filter. If no filter, clears all non-running tasks.
+    """
+    task_manager.clear_tasks(status)
+    return
+# [/DEF:TasksRouter:Module]
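Usage against the reworked task routes might look like this (a hedged sketch; the `/api/tasks` prefix comes from the `include_router` call in main.py below, and the host/port is assumed for illustration):

```python
# Sketch: exercise the reworked task endpoints (httpx is a pinned dependency).
import httpx

BASE = "http://localhost:8000/api/tasks"  # host/port assumed for illustration

# Paginated listing; an optional ?status=<TaskStatus> filter is also accepted.
tasks = httpx.get(BASE, params={"limit": 10, "offset": 0}).json()

# Buffered logs for a single task via the new /logs route.
if tasks:
    logs = httpx.get(f"{BASE}/{tasks[0]['id']}/logs").json()

# DELETE with no filter clears all non-running tasks (204 No Content).
httpx.delete(BASE).raise_for_status()
```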
@@ -11,21 +11,18 @@ from pathlib import Path
 project_root = Path(__file__).resolve().parent.parent.parent
 sys.path.append(str(project_root))

-from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends
+from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
 import asyncio
 import os

-from .dependencies import get_task_manager
+from .dependencies import get_task_manager, get_scheduler_service
 from .core.logger import logger
 from .api.routes import plugins, tasks, settings, environments, mappings, migration
 from .core.database import init_db

-# Initialize database
-init_db()
-
 # [DEF:App:Global]
 # @SEMANTICS: app, fastapi, instance
 # @PURPOSE: The global FastAPI application instance.
@@ -34,6 +31,19 @@ app = FastAPI(
     description="API for managing Superset automation tools and plugins.",
     version="1.0.0",
 )
+# [/DEF:App:Global]
+
+# Startup event
+@app.on_event("startup")
+async def startup_event():
+    scheduler = get_scheduler_service()
+    scheduler.start()
+
+# Shutdown event
+@app.on_event("shutdown")
+async def shutdown_event():
+    scheduler = get_scheduler_service()
+    scheduler.stop()

 # Configure CORS
 app.add_middleware(
@@ -45,11 +55,18 @@ app.add_middleware(
 )

+@app.middleware("http")
+async def log_requests(request: Request, call_next):
+    logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
+    response = await call_next(request)
+    logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
+    return response
+
 # Include API routes
 app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
-app.include_router(environments.router)
+app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
 app.include_router(mappings.router)
 app.include_router(migration.router)

@@ -63,16 +80,30 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
     task_manager = get_task_manager()
     queue = await task_manager.subscribe_logs(task_id)
     try:
-        # Send initial logs if any
+        # Stream new logs
+        logger.info(f"Starting log stream for task {task_id}")
+
+        # Send initial logs first to build context
         initial_logs = task_manager.get_task_logs(task_id)
         for log_entry in initial_logs:
-            # Convert datetime to string for JSON serialization
             log_dict = log_entry.dict()
             log_dict['timestamp'] = log_dict['timestamp'].isoformat()
             await websocket.send_json(log_dict)

-        # Stream new logs
-        logger.info(f"Starting log stream for task {task_id}")
+        # Force a check for AWAITING_INPUT status immediately upon connection
+        # This ensures that if the task is already waiting when the user connects, they get the prompt.
+        task = task_manager.get_task(task_id)
+        if task and task.status == "AWAITING_INPUT" and task.input_request:
+            # Construct a synthetic log entry to trigger the frontend handler
+            # This is a bit of a hack but avoids changing the websocket protocol significantly
+            synthetic_log = {
+                "timestamp": task.logs[-1].timestamp.isoformat() if task.logs else "2024-01-01T00:00:00",
+                "level": "INFO",
+                "message": "Task paused for user input (Connection Re-established)",
+                "context": {"input_request": task.input_request}
+            }
+            await websocket.send_json(synthetic_log)
+
         while True:
             log_entry = await queue.get()
             log_dict = log_entry.dict()
@@ -84,7 +115,9 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
             if "Task completed successfully" in log_entry.message or "Task failed" in log_entry.message:
                 # Wait a bit to ensure client receives the last message
                 await asyncio.sleep(2)
-                break
+                # DO NOT BREAK here - allow client to keep connection open if they want to review logs
+                # or until they disconnect. Breaking closes the socket immediately.
+                # break

     except WebSocketDisconnect:
         logger.info(f"WebSocket connection disconnected for task {task_id}")
@@ -92,8 +125,7 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):
         logger.error(f"WebSocket error for task {task_id}: {e}")
     finally:
         task_manager.unsubscribe_logs(task_id, queue)
-# [/DEF]
-
+# [/DEF:WebSocketEndpoint:Endpoint]

 # [DEF:StaticFiles:Mount]
 # @SEMANTICS: static, frontend, spa
@@ -117,4 +149,6 @@ else:
     @app.get("/")
     async def read_root():
         return {"message": "Superset Tools API is running (Frontend build not found)"}
-    # [/DEF]
+    # [/DEF:RootEndpoint:Endpoint]
+# [/DEF:StaticFiles:Mount]
+# [/DEF:AppModule:Module]
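Client-side, the log stream (including the synthetic AWAITING_INPUT entry and the no-break behaviour above) can be consumed roughly like this. A sketch only: the websocket route decorator is outside the hunks shown, so the URL here is assumed, not taken from the diff.

```python
# Sketch: follow a task's log stream. The URL is hypothetical; the actual
# route decorator is not part of the diff above. websockets is pinned.
import asyncio
import json
import websockets

async def follow(task_id: str) -> None:
    url = f"ws://localhost:8000/ws/logs/{task_id}"  # assumed path
    async with websockets.connect(url) as ws:
        async for raw in ws:
            entry = json.loads(raw)
            # The synthetic reconnect entry carries input_request in context
            # when the task was already AWAITING_INPUT (see hunk above).
            if (entry.get("context") or {}).get("input_request"):
                print("task awaits input:", entry["context"]["input_request"])
            print(entry["timestamp"], entry["level"], entry["message"])

asyncio.run(follow("some-task-id"))
```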
@@ -46,7 +46,7 @@ class ConfigManager:
         assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"

         logger.info(f"[ConfigManager][Exit] Initialized")
-    # [/DEF:__init__]
+    # [/DEF:__init__:Function]

     # [DEF:_load_config:Function]
     # @PURPOSE: Loads the configuration from disk or creates a default one.
@@ -72,11 +72,13 @@ class ConfigManager:
             return config
         except Exception as e:
             logger.error(f"[_load_config][Coherence:Failed] Error loading config: {e}")
+            # Fallback but try to preserve existing settings if possible?
+            # For now, return default to be safe, but log the error prominently.
             return AppConfig(
                 environments=[],
                 settings=GlobalSettings(backup_path="backups")
             )
-    # [/DEF:_load_config]
+    # [/DEF:_load_config:Function]

     # [DEF:_save_config_to_disk:Function]
     # @PURPOSE: Saves the provided configuration object to disk.
@@ -95,20 +97,20 @@ class ConfigManager:
             logger.info(f"[_save_config_to_disk][Action] Configuration saved")
         except Exception as e:
             logger.error(f"[_save_config_to_disk][Coherence:Failed] Failed to save: {e}")
-    # [/DEF:_save_config_to_disk]
+    # [/DEF:_save_config_to_disk:Function]

     # [DEF:save:Function]
     # @PURPOSE: Saves the current configuration state to disk.
     def save(self):
         self._save_config_to_disk(self.config)
-    # [/DEF:save]
+    # [/DEF:save:Function]

     # [DEF:get_config:Function]
     # @PURPOSE: Returns the current configuration.
     # @RETURN: AppConfig - The current configuration.
     def get_config(self) -> AppConfig:
         return self.config
-    # [/DEF:get_config]
+    # [/DEF:get_config:Function]

     # [DEF:update_global_settings:Function]
     # @PURPOSE: Updates the global settings and persists the change.
@@ -128,7 +130,7 @@ class ConfigManager:
         configure_logger(settings.logging)

         logger.info(f"[update_global_settings][Exit] Settings updated")
-    # [/DEF:update_global_settings]
+    # [/DEF:update_global_settings:Function]

     # [DEF:validate_path:Function]
     # @PURPOSE: Validates if a path exists and is writable.
@@ -146,21 +148,21 @@ class ConfigManager:
             return False, "Path is not writable"

         return True, "Path is valid and writable"
-    # [/DEF:validate_path]
+    # [/DEF:validate_path:Function]

     # [DEF:get_environments:Function]
     # @PURPOSE: Returns the list of configured environments.
     # @RETURN: List[Environment] - List of environments.
     def get_environments(self) -> List[Environment]:
         return self.config.environments
-    # [/DEF:get_environments]
+    # [/DEF:get_environments:Function]
|
||||||
|
|
||||||
# [DEF:has_environments:Function]
|
# [DEF:has_environments:Function]
|
||||||
# @PURPOSE: Checks if at least one environment is configured.
|
# @PURPOSE: Checks if at least one environment is configured.
|
||||||
# @RETURN: bool - True if at least one environment exists.
|
# @RETURN: bool - True if at least one environment exists.
|
||||||
def has_environments(self) -> bool:
|
def has_environments(self) -> bool:
|
||||||
return len(self.config.environments) > 0
|
return len(self.config.environments) > 0
|
||||||
# [/DEF:has_environments]
|
# [/DEF:has_environments:Function]
|
||||||
|
|
||||||
# [DEF:add_environment:Function]
|
# [DEF:add_environment:Function]
|
||||||
# @PURPOSE: Adds a new environment to the configuration.
|
# @PURPOSE: Adds a new environment to the configuration.
|
||||||
@@ -179,7 +181,7 @@ class ConfigManager:
|
|||||||
self.save()
|
self.save()
|
||||||
|
|
||||||
logger.info(f"[add_environment][Exit] Environment added")
|
logger.info(f"[add_environment][Exit] Environment added")
|
||||||
# [/DEF:add_environment]
|
# [/DEF:add_environment:Function]
|
||||||
|
|
||||||
# [DEF:update_environment:Function]
|
# [DEF:update_environment:Function]
|
||||||
# @PURPOSE: Updates an existing environment.
|
# @PURPOSE: Updates an existing environment.
|
||||||
@@ -208,7 +210,7 @@ class ConfigManager:
|
|||||||
|
|
||||||
logger.warning(f"[update_environment][Coherence:Failed] Environment {env_id} not found")
|
logger.warning(f"[update_environment][Coherence:Failed] Environment {env_id} not found")
|
||||||
return False
|
return False
|
||||||
# [/DEF:update_environment]
|
# [/DEF:update_environment:Function]
|
||||||
|
|
||||||
# [DEF:delete_environment:Function]
|
# [DEF:delete_environment:Function]
|
||||||
# @PURPOSE: Deletes an environment by ID.
|
# @PURPOSE: Deletes an environment by ID.
|
||||||
@@ -229,8 +231,8 @@ class ConfigManager:
|
|||||||
logger.info(f"[delete_environment][Action] Deleted {env_id}")
|
logger.info(f"[delete_environment][Action] Deleted {env_id}")
|
||||||
else:
|
else:
|
||||||
logger.warning(f"[delete_environment][Coherence:Failed] Environment {env_id} not found")
|
logger.warning(f"[delete_environment][Coherence:Failed] Environment {env_id} not found")
|
||||||
# [/DEF:delete_environment]
|
# [/DEF:delete_environment:Function]
|
||||||
|
|
||||||
# [/DEF:ConfigManager]
|
# [/DEF:ConfigManager:Class]
|
||||||
|
|
||||||
# [/DEF:ConfigManagerModule]
|
# [/DEF:ConfigManagerModule:Module]
|
||||||
|
|||||||
@@ -8,6 +8,13 @@
from pydantic import BaseModel, Field
from typing import List, Optional

+# [DEF:Schedule:DataClass]
+# @PURPOSE: Represents a backup schedule configuration.
+class Schedule(BaseModel):
+    enabled: bool = False
+    cron_expression: str = "0 0 * * *"  # Default: daily at midnight
+# [/DEF:Schedule:DataClass]
+
# [DEF:Environment:DataClass]
# @PURPOSE: Represents a Superset environment configuration.
class Environment(BaseModel):
@@ -17,7 +24,8 @@ class Environment(BaseModel):
    username: str
    password: str  # Will be masked in UI
    is_default: bool = False
-# [/DEF:Environment]
+    backup_schedule: Schedule = Field(default_factory=Schedule)
+# [/DEF:Environment:DataClass]

# [DEF:LoggingConfig:DataClass]
# @PURPOSE: Defines the configuration for the application's logging system.
@@ -27,7 +35,7 @@ class LoggingConfig(BaseModel):
    max_bytes: int = 10 * 1024 * 1024
    backup_count: int = 5
    enable_belief_state: bool = True
-# [/DEF:LoggingConfig]
+# [/DEF:LoggingConfig:DataClass]

# [DEF:GlobalSettings:DataClass]
# @PURPOSE: Represents global application settings.
@@ -35,13 +43,18 @@ class GlobalSettings(BaseModel):
    backup_path: str
    default_environment_id: Optional[str] = None
    logging: LoggingConfig = Field(default_factory=LoggingConfig)
-# [/DEF:GlobalSettings]
+
+    # Task retention settings
+    task_retention_days: int = 30
+    task_retention_limit: int = 100
+    pagination_limit: int = 10
+# [/DEF:GlobalSettings:DataClass]

# [DEF:AppConfig:DataClass]
# @PURPOSE: The root configuration model containing all application settings.
class AppConfig(BaseModel):
    environments: List[Environment] = []
    settings: GlobalSettings
-# [/DEF:AppConfig]
+# [/DEF:AppConfig:DataClass]

-# [/DEF:ConfigModels]
+# [/DEF:ConfigModels:Module]
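Note: with the new Schedule model, each environment entry in config.json gains a backup_schedule block with these defaults. A standalone sketch of the resulting serialization, assuming Pydantic v1-style .json() to match the .dict() calls used elsewhere in this codebase; only the two fields shown in the diff are real:

from pydantic import BaseModel

class Schedule(BaseModel):
    enabled: bool = False
    cron_expression: str = "0 0 * * *"  # daily at midnight

# Default schedule as it would appear inside an environment's config entry.
print(Schedule().json())
# -> {"enabled": false, "cron_expression": "0 0 * * *"}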
@@ -12,26 +12,43 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, Session
from backend.src.models.mapping import Base
+# Import TaskRecord to ensure it's registered with Base
+from backend.src.models.task import TaskRecord
import os
# [/SECTION]

# [DEF:DATABASE_URL:Constant]
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
-# [/DEF:DATABASE_URL]
+# [/DEF:DATABASE_URL:Constant]

+# [DEF:TASKS_DATABASE_URL:Constant]
+TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", "sqlite:///./tasks.db")
+# [/DEF:TASKS_DATABASE_URL:Constant]
+
# [DEF:engine:Variable]
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
-# [/DEF:engine]
+# [/DEF:engine:Variable]

+# [DEF:tasks_engine:Variable]
+tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
+# [/DEF:tasks_engine:Variable]
+
# [DEF:SessionLocal:Class]
+# @PURPOSE: A session factory for the main mappings database.
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
-# [/DEF:SessionLocal]
+# [/DEF:SessionLocal:Class]

+# [DEF:TasksSessionLocal:Class]
+# @PURPOSE: A session factory for the tasks execution database.
+TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)
+# [/DEF:TasksSessionLocal:Class]
+
# [DEF:init_db:Function]
# @PURPOSE: Initializes the database by creating all tables.
def init_db():
    Base.metadata.create_all(bind=engine)
-# [/DEF:init_db]
+    Base.metadata.create_all(bind=tasks_engine)
+# [/DEF:init_db:Function]

# [DEF:get_db:Function]
# @PURPOSE: Dependency for getting a database session.
@@ -43,6 +60,18 @@ def get_db():
        yield db
    finally:
        db.close()
-# [/DEF:get_db]
+# [/DEF:get_db:Function]

-# [/DEF:backend.src.core.database]
+# [DEF:get_tasks_db:Function]
+# @PURPOSE: Dependency for getting a tasks database session.
+# @POST: Session is closed after use.
+# @RETURN: Generator[Session, None, None]
+def get_tasks_db():
+    db = TasksSessionLocal()
+    try:
+        yield db
+    finally:
+        db.close()
+# [/DEF:get_tasks_db:Function]
+
+# [/DEF:backend.src.core.database:Module]
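Note: get_tasks_db follows the same generator pattern FastAPI expects from get_db, so it can be injected into routes directly. A hypothetical wiring sketch; the route path and response shape are assumptions, not part of this changeset:

from fastapi import APIRouter, Depends
from sqlalchemy.orm import Session
from backend.src.core.database import get_tasks_db
from backend.src.models.task import TaskRecord

router = APIRouter()

@router.get("/tasks/count")
def count_tasks(db: Session = Depends(get_tasks_db)):
    # The session is opened per request and closed by the dependency's finally block.
    return {"count": db.query(TaskRecord).count()}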
@@ -28,7 +28,7 @@ class BeliefFormatter(logging.Formatter):
        if anchor_id:
            msg = f"[{anchor_id}][Action] {msg}"
        return msg
-# [/DEF:BeliefFormatter]
+# [/DEF:BeliefFormatter:Class]

# Re-using LogEntry from task_manager for consistency
# [DEF:LogEntry:Class]
@@ -40,7 +40,7 @@ class LogEntry(BaseModel):
    message: str
    context: Optional[Dict[str, Any]] = None

-# [/DEF]
+# [/DEF:LogEntry:Class]

# [DEF:BeliefScope:Function]
# @PURPOSE: Context manager for structured Belief State logging.
@@ -71,7 +71,7 @@ def belief_scope(anchor_id: str, message: str = ""):
        # Restore old anchor
        _belief_state.anchor_id = old_anchor

-# [/DEF:BeliefScope]
+# [/DEF:BeliefScope:Function]

# [DEF:ConfigureLogger:Function]
# @PURPOSE: Configures the logger with the provided logging settings.
@@ -115,7 +115,7 @@ def configure_logger(config):
    handler.setFormatter(BeliefFormatter(
        '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
    ))
-# [/DEF:ConfigureLogger]
+# [/DEF:ConfigureLogger:Function]

# [DEF:WebSocketLogHandler:Class]
# @SEMANTICS: logging, handler, websocket, buffer
@@ -158,7 +158,7 @@ class WebSocketLogHandler(logging.Handler):
        """
        return list(self.log_buffer)

-# [/DEF]
+# [/DEF:WebSocketLogHandler:Class]

# [DEF:Logger:Global]
# @SEMANTICS: logger, global, instance
@@ -184,4 +184,5 @@ logger.addHandler(websocket_log_handler)
# Example usage:
# logger.info("Application started", extra={"context_key": "context_value"})
# logger.error("An error occurred", exc_info=True)
-# [/DEF]
+# [/DEF:Logger:Global]
+# [/DEF:LoggerModule:Module]
@@ -73,6 +73,7 @@ class MigrationEngine:
        except Exception as e:
            logger.error(f"[MigrationEngine.transform_zip][Coherence:Failed] Error transforming ZIP: {e}")
            return False
+    # [/DEF:MigrationEngine.transform_zip:Function]

    # [DEF:MigrationEngine._transform_yaml:Function]
    # @PURPOSE: Replaces database_uuid in a single YAML file.
@@ -90,8 +91,8 @@ class MigrationEngine:
        data['database_uuid'] = db_mapping[source_uuid]
        with open(file_path, 'w') as f:
            yaml.dump(data, f)
-    # [/DEF:MigrationEngine._transform_yaml]
+    # [/DEF:MigrationEngine._transform_yaml:Function]

-# [/DEF:MigrationEngine]
+# [/DEF:MigrationEngine:Class]

-# [/DEF:backend.src.core.migration_engine]
+# [/DEF:backend.src.core.migration_engine:Module]
@@ -54,7 +54,7 @@ class PluginBase(ABC):
        The `params` argument will be validated against the schema returned by `get_schema()`.
        """
        pass
-# [/DEF]
+# [/DEF:PluginBase:Class]

# [DEF:PluginConfig:Class]
# @SEMANTICS: plugin, config, schema, pydantic
@@ -68,4 +68,4 @@ class PluginConfig(BaseModel):
    description: str = Field(..., description="Brief description of what the plugin does")
    version: str = Field(..., description="Version of the plugin")
    input_schema: Dict[str, Any] = Field(..., description="JSON schema for input parameters", alias="schema")
-# [/DEF]
+# [/DEF:PluginConfig:Class]
@@ -16,12 +16,18 @@ class PluginLoader:
    that inherit from PluginBase.
    """

+    # [DEF:PluginLoader.__init__:Function]
+    # @PURPOSE: Initializes the PluginLoader with a directory to scan.
+    # @PARAM: plugin_dir (str) - The directory containing plugin modules.
    def __init__(self, plugin_dir: str):
        self.plugin_dir = plugin_dir
        self._plugins: Dict[str, PluginBase] = {}
        self._plugin_configs: Dict[str, PluginConfig] = {}
        self._load_plugins()
+    # [/DEF:PluginLoader.__init__:Function]

+    # [DEF:PluginLoader._load_plugins:Function]
+    # @PURPOSE: Scans the plugin directory and loads all valid plugins.
    def _load_plugins(self):
        """
        Scans the plugin directory, imports modules, and registers valid plugins.
@@ -41,7 +47,12 @@ class PluginLoader:
                module_name = filename[:-3]
                file_path = os.path.join(self.plugin_dir, filename)
                self._load_module(module_name, file_path)
+    # [/DEF:PluginLoader._load_plugins:Function]

+    # [DEF:PluginLoader._load_module:Function]
+    # @PURPOSE: Loads a single Python module and discovers PluginBase implementations.
+    # @PARAM: module_name (str) - The name of the module.
+    # @PARAM: file_path (str) - The path to the module file.
    def _load_module(self, module_name: str, file_path: str):
        """
        Loads a single Python module and extracts PluginBase subclasses.
@@ -83,7 +94,11 @@ class PluginLoader:
                    self._register_plugin(plugin_instance)
                except Exception as e:
                    print(f"Error instantiating plugin {attribute_name} in {module_name}: {e}")  # Replace with proper logging
+    # [/DEF:PluginLoader._load_module:Function]

+    # [DEF:PluginLoader._register_plugin:Function]
+    # @PURPOSE: Registers a PluginBase instance and its configuration.
+    # @PARAM: plugin_instance (PluginBase) - The plugin instance to register.
    def _register_plugin(self, plugin_instance: PluginBase):
        """
        Registers a valid plugin instance.
@@ -116,22 +131,39 @@ class PluginLoader:
        except Exception as e:
            from ..core.logger import logger
            logger.error(f"Error validating plugin '{plugin_instance.name}' (ID: {plugin_id}): {e}")
+    # [/DEF:PluginLoader._register_plugin:Function]

+    # [DEF:PluginLoader.get_plugin:Function]
+    # @PURPOSE: Retrieves a loaded plugin instance by its ID.
+    # @PARAM: plugin_id (str) - The unique identifier of the plugin.
+    # @RETURN: Optional[PluginBase] - The plugin instance if found, otherwise None.
    def get_plugin(self, plugin_id: str) -> Optional[PluginBase]:
        """
        Returns a loaded plugin instance by its ID.
        """
        return self._plugins.get(plugin_id)
+    # [/DEF:PluginLoader.get_plugin:Function]

+    # [DEF:PluginLoader.get_all_plugin_configs:Function]
+    # @PURPOSE: Returns a list of all registered plugin configurations.
+    # @RETURN: List[PluginConfig] - A list of plugin configurations.
    def get_all_plugin_configs(self) -> List[PluginConfig]:
        """
        Returns a list of all loaded plugin configurations.
        """
        return list(self._plugin_configs.values())
+    # [/DEF:PluginLoader.get_all_plugin_configs:Function]

+    # [DEF:PluginLoader.has_plugin:Function]
+    # @PURPOSE: Checks if a plugin with the given ID is registered.
+    # @PARAM: plugin_id (str) - The unique identifier of the plugin.
+    # @RETURN: bool - True if the plugin is registered, False otherwise.
    def has_plugin(self, plugin_id: str) -> bool:
        """
        Checks if a plugin with the given ID is loaded.
        """
        return plugin_id in self._plugins
+    # [/DEF:PluginLoader.has_plugin:Function]

+# [/DEF:PluginLoader:Class]
backend/src/core/scheduler.py (new file, 104 lines)
@@ -0,0 +1,104 @@
# [DEF:SchedulerModule:Module]
# @SEMANTICS: scheduler, apscheduler, cron, backup
# @PURPOSE: Manages scheduled tasks using APScheduler.
# @LAYER: Core
# @RELATION: Uses TaskManager to run scheduled backups.

# [SECTION: IMPORTS]
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
from .logger import logger, belief_scope
from .config_manager import ConfigManager
from typing import Optional
import asyncio
# [/SECTION]

# [DEF:SchedulerService:Class]
# @SEMANTICS: scheduler, service, apscheduler
# @PURPOSE: Provides a service to manage scheduled backup tasks.
class SchedulerService:
    def __init__(self, task_manager, config_manager: ConfigManager):
        with belief_scope("SchedulerService.__init__"):
            self.task_manager = task_manager
            self.config_manager = config_manager
            self.scheduler = BackgroundScheduler()
            self.loop = asyncio.get_event_loop()

    # [DEF:SchedulerService.start:Function]
    # @PURPOSE: Starts the background scheduler and loads initial schedules.
    def start(self):
        with belief_scope("SchedulerService.start"):
            if not self.scheduler.running:
                self.scheduler.start()
                logger.info("Scheduler started.")
            self.load_schedules()
    # [/DEF:SchedulerService.start:Function]

    # [DEF:SchedulerService.stop:Function]
    # @PURPOSE: Stops the background scheduler.
    def stop(self):
        with belief_scope("SchedulerService.stop"):
            if self.scheduler.running:
                self.scheduler.shutdown()
                logger.info("Scheduler stopped.")
    # [/DEF:SchedulerService.stop:Function]

    # [DEF:SchedulerService.load_schedules:Function]
    # @PURPOSE: Loads backup schedules from configuration and registers them.
    def load_schedules(self):
        with belief_scope("SchedulerService.load_schedules"):
            # Clear existing jobs
            self.scheduler.remove_all_jobs()

            config = self.config_manager.get_config()
            for env in config.environments:
                if env.backup_schedule and env.backup_schedule.enabled:
                    self.add_backup_job(env.id, env.backup_schedule.cron_expression)
    # [/DEF:SchedulerService.load_schedules:Function]

    # [DEF:SchedulerService.add_backup_job:Function]
    # @PURPOSE: Adds a scheduled backup job for an environment.
    # @PARAM: env_id (str) - The ID of the environment.
    # @PARAM: cron_expression (str) - The cron expression for the schedule.
    def add_backup_job(self, env_id: str, cron_expression: str):
        with belief_scope("SchedulerService.add_backup_job", f"env_id={env_id}, cron={cron_expression}"):
            job_id = f"backup_{env_id}"
            try:
                self.scheduler.add_job(
                    self._trigger_backup,
                    CronTrigger.from_crontab(cron_expression),
                    id=job_id,
                    args=[env_id],
                    replace_existing=True
                )
                logger.info(f"Scheduled backup job added for environment {env_id}: {cron_expression}")
            except Exception as e:
                logger.error(f"Failed to add backup job for environment {env_id}: {e}")
    # [/DEF:SchedulerService.add_backup_job:Function]

    # [DEF:SchedulerService._trigger_backup:Function]
    # @PURPOSE: Triggered by the scheduler to start a backup task.
    # @PARAM: env_id (str) - The ID of the environment.
    def _trigger_backup(self, env_id: str):
        with belief_scope("SchedulerService._trigger_backup", f"env_id={env_id}"):
            logger.info(f"Triggering scheduled backup for environment {env_id}")

            # Check if a backup is already running for this environment
            active_tasks = self.task_manager.get_tasks(limit=100)
            for task in active_tasks:
                if (task.plugin_id == "superset-backup" and
                        task.status in ["PENDING", "RUNNING"] and
                        task.params.get("environment_id") == env_id):
                    logger.warning(f"Backup already running for environment {env_id}. Skipping scheduled run.")
                    return

            # Run the backup task
            # We need to run this in the event loop since create_task is async
            asyncio.run_coroutine_threadsafe(
                self.task_manager.create_task("superset-backup", {"environment_id": env_id}),
                self.loop
            )
    # [/DEF:SchedulerService._trigger_backup:Function]

# [/DEF:SchedulerService:Class]
# [/DEF:SchedulerModule:Module]
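Note: add_backup_job only surfaces a malformed cron expression through logger.error after the fact. A small sketch of validating expressions up front with the same CronTrigger.from_crontab call, e.g. so an API layer could reject bad input before a job is registered; the helper itself is hypothetical, not part of this changeset:

from apscheduler.triggers.cron import CronTrigger

def is_valid_cron(expr: str) -> bool:
    # from_crontab raises ValueError for malformed five-field expressions
    try:
        CronTrigger.from_crontab(expr)
        return True
    except ValueError:
        return False

assert is_valid_cron("0 0 * * *")       # the Schedule default: daily at midnight
assert not is_valid_cron("not a cron")  # rejected instead of failing at schedule time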
@@ -35,7 +35,7 @@ class SupersetClient(BaseSupersetClient):
            db['engine'] = db.pop('backend', None)

        return databases
-    # [/DEF:SupersetClient.get_databases_summary]
+    # [/DEF:SupersetClient.get_databases_summary:Function]

    # [DEF:SupersetClient.get_database_by_uuid:Function]
    # @PURPOSE: Find a database by its UUID.
@@ -50,7 +50,7 @@ class SupersetClient(BaseSupersetClient):
        }
        _, databases = self.get_databases(query=query)
        return databases[0] if databases else None
-    # [/DEF:SupersetClient.get_database_by_uuid]
+    # [/DEF:SupersetClient.get_database_by_uuid:Function]

    # [DEF:SupersetClient.get_dashboards_summary:Function]
    # @PURPOSE: Fetches dashboard metadata optimized for the grid.
@@ -76,8 +76,8 @@ class SupersetClient(BaseSupersetClient):
                "status": "published" if dash.get("published") else "draft"
            })
        return result
-    # [/DEF:SupersetClient.get_dashboards_summary]
+    # [/DEF:SupersetClient.get_dashboards_summary:Function]

-# [/DEF:SupersetClient]
+# [/DEF:SupersetClient:Class]

-# [/DEF:backend.src.core.superset_client]
+# [/DEF:backend.src.core.superset_client:Module]
backend/src/core/task_manager/cleanup.py (new file, 40 lines)
@@ -0,0 +1,40 @@
# [DEF:TaskCleanupModule:Module]
# @SEMANTICS: task, cleanup, retention
# @PURPOSE: Implements task cleanup and retention policies.
# @LAYER: Core
# @RELATION: Uses TaskPersistenceService to delete old tasks.

from datetime import datetime, timedelta
from .persistence import TaskPersistenceService
from ..logger import logger, belief_scope
from ..config_manager import ConfigManager

# [DEF:TaskCleanupService:Class]
# @PURPOSE: Provides methods to clean up old task records.
class TaskCleanupService:
    def __init__(self, persistence_service: TaskPersistenceService, config_manager: ConfigManager):
        self.persistence_service = persistence_service
        self.config_manager = config_manager

    # [DEF:TaskCleanupService.run_cleanup:Function]
    # @PURPOSE: Deletes tasks older than the configured retention period.
    def run_cleanup(self):
        with belief_scope("TaskCleanupService.run_cleanup"):
            settings = self.config_manager.get_config().settings
            retention_days = settings.task_retention_days

            # This is a simplified implementation.
            # In a real scenario, we would query IDs of tasks older than retention_days.
            # For now, we'll log the action.
            logger.info(f"Cleaning up tasks older than {retention_days} days.")

            # Re-loading tasks to check for limit
            tasks = self.persistence_service.load_tasks(limit=1000)
            if len(tasks) > settings.task_retention_limit:
                to_delete = [t.id for t in tasks[settings.task_retention_limit:]]
                self.persistence_service.delete_tasks(to_delete)
                logger.info(f"Deleted {len(to_delete)} tasks exceeding limit of {settings.task_retention_limit}")
    # [/DEF:TaskCleanupService.run_cleanup:Function]

# [/DEF:TaskCleanupService:Class]
# [/DEF:TaskCleanupModule:Module]
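Note: the limit branch of run_cleanup keeps the newest task_retention_limit records and deletes the rest; load_tasks returns rows ordered by created_at descending, so a plain list slice is enough. The slice in isolation, with string stand-ins for Task objects:

# Standalone sketch of the retention slice, assuming newest-first ordering.
task_ids = [f"task-{i}" for i in range(120)]
retention_limit = 100

to_delete = task_ids[retention_limit:]  # everything beyond the newest 100
assert len(to_delete) == 20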
@@ -43,6 +43,9 @@ class TaskManager:
        except RuntimeError:
            self.loop = asyncio.get_event_loop()
        self.task_futures: Dict[str, asyncio.Future] = {}
+
+        # Load persisted tasks on startup
+        self.load_persisted_tasks()
    # [/DEF:TaskManager.__init__:Function]

    # [DEF:TaskManager.create_task:Function]
@@ -68,6 +71,7 @@ class TaskManager:

        task = Task(plugin_id=plugin_id, params=params, user_id=user_id)
        self.tasks[task.id] = task
+        self.persistence_service.persist_task(task)
        logger.info(f"Task {task.id} created and scheduled for execution")
        self.loop.create_task(self._run_task(task.id))  # Schedule task for execution
        return task
@@ -86,6 +90,7 @@ class TaskManager:
        logger.info(f"Starting execution of task {task_id} for plugin '{plugin.name}'")
        task.status = TaskStatus.RUNNING
        task.started_at = datetime.utcnow()
+        self.persistence_service.persist_task(task)
        self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'")

        try:
@@ -110,6 +115,7 @@ class TaskManager:
            self._add_log(task_id, "ERROR", f"Task failed: {e}", {"error_type": type(e).__name__})
        finally:
            task.finished_at = datetime.utcnow()
+            self.persistence_service.persist_task(task)
            logger.info(f"Task {task_id} execution finished with status: {task.status}")
    # [/DEF:TaskManager._run_task:Function]

@@ -129,6 +135,7 @@ class TaskManager:
        # Update task params with resolution
        task.params.update(resolution_params)
        task.status = TaskStatus.RUNNING
+        self.persistence_service.persist_task(task)
        self._add_log(task_id, "INFO", "Task resumed after mapping resolution.")

        # Signal the future to continue
@@ -147,6 +154,7 @@ class TaskManager:
        if not task: return

        task.status = TaskStatus.AWAITING_MAPPING
+        self.persistence_service.persist_task(task)
        self.task_futures[task_id] = self.loop.create_future()

        try:
@@ -232,6 +240,7 @@ class TaskManager:

        log_entry = LogEntry(level=level, message=message, context=context)
        task.logs.append(log_entry)
+        self.persistence_service.persist_task(task)

        # Notify subscribers
        if task_id in self.subscribers:
@@ -263,16 +272,10 @@ class TaskManager:
            del self.subscribers[task_id]
    # [/DEF:TaskManager.unsubscribe_logs:Function]

-    # [DEF:TaskManager.persist_awaiting_input_tasks:Function]
-    # @PURPOSE: Persist tasks in AWAITING_INPUT state using persistence service.
-    def persist_awaiting_input_tasks(self) -> None:
-        self.persistence_service.persist_tasks(list(self.tasks.values()))
-    # [/DEF:TaskManager.persist_awaiting_input_tasks:Function]
-
    # [DEF:TaskManager.load_persisted_tasks:Function]
    # @PURPOSE: Load persisted tasks using persistence service.
    def load_persisted_tasks(self) -> None:
-        loaded_tasks = self.persistence_service.load_tasks()
+        loaded_tasks = self.persistence_service.load_tasks(limit=100)
        for task in loaded_tasks:
            if task.id not in self.tasks:
                self.tasks[task.id] = task
@@ -296,9 +299,8 @@ class TaskManager:
        task.status = TaskStatus.AWAITING_INPUT
        task.input_required = True
        task.input_request = input_request
+        self.persistence_service.persist_task(task)
        self._add_log(task_id, "INFO", "Task paused for user input", {"input_request": input_request})
-
-        self.persist_awaiting_input_tasks()
    # [/DEF:TaskManager.await_input:Function]

    # [DEF:TaskManager.resume_task_with_password:Function]
@@ -323,13 +325,52 @@ class TaskManager:
        task.input_required = False
        task.input_request = None
        task.status = TaskStatus.RUNNING
+        self.persistence_service.persist_task(task)
        self._add_log(task_id, "INFO", "Task resumed with passwords", {"databases": list(passwords.keys())})

        if task_id in self.task_futures:
            self.task_futures[task_id].set_result(True)
-
-        self.persist_awaiting_input_tasks()
    # [/DEF:TaskManager.resume_task_with_password:Function]

+    # [DEF:TaskManager.clear_tasks:Function]
+    # @PURPOSE: Clears tasks based on status filter.
+    # @PARAM: status (Optional[TaskStatus]) - Filter by task status.
+    # @RETURN: int - Number of tasks cleared.
+    def clear_tasks(self, status: Optional[TaskStatus] = None) -> int:
+        with belief_scope("TaskManager.clear_tasks"):
+            tasks_to_remove = []
+            for task_id, task in list(self.tasks.items()):
+                # If status is provided, match it. If status is None, clear
+                # everything except active tasks: RUNNING means active execution,
+                # while AWAITING_INPUT and AWAITING_MAPPING are distinct paused
+                # statuses in the TaskStatus enum and must also survive.
+                should_remove = False
+                if status:
+                    if task.status == status:
+                        should_remove = True
+                else:
+                    # Clear all non-active tasks (keep RUNNING, AWAITING_INPUT, AWAITING_MAPPING)
+                    if task.status not in [TaskStatus.RUNNING, TaskStatus.AWAITING_INPUT, TaskStatus.AWAITING_MAPPING]:
+                        should_remove = True
+
+                if should_remove:
+                    tasks_to_remove.append(task_id)
+
+            for tid in tasks_to_remove:
+                # Cancel future if exists (e.g. for AWAITING_INPUT/MAPPING)
+                if tid in self.task_futures:
+                    self.task_futures[tid].cancel()
+                    del self.task_futures[tid]
+                del self.tasks[tid]

+            # Remove from persistence
+            self.persistence_service.delete_tasks(tasks_to_remove)
+
+            logger.info(f"Cleared {len(tasks_to_remove)} tasks.")
+            return len(tasks_to_remove)
+    # [/DEF:TaskManager.clear_tasks:Function]

# [/DEF:TaskManager:Class]
# [/DEF:TaskManagerModule:Module]
@@ -1,127 +1,141 @@
# [DEF:TaskPersistenceModule:Module]
-# @SEMANTICS: persistence, sqlite, task, storage
+# @SEMANTICS: persistence, sqlite, sqlalchemy, task, storage
-# @PURPOSE: Handles the persistence of tasks, specifically those awaiting user input, to a SQLite database.
+# @PURPOSE: Handles the persistence of tasks using SQLAlchemy and the tasks.db database.
# @LAYER: Core
# @RELATION: Used by TaskManager to save and load tasks.
-# @INVARIANT: Database schema must match the Task model structure.
+# @INVARIANT: Database schema must match the TaskRecord model structure.
-# @CONSTRAINT: Uses synchronous SQLite operations (blocking), should be used carefully.

# [SECTION: IMPORTS]
-import sqlite3
-import json
from datetime import datetime
-from pathlib import Path
-from typing import Dict, List, Optional, Any
+from typing import List, Optional, Dict, Any
+import json

-from .models import Task, TaskStatus
+from sqlalchemy.orm import Session
+from backend.src.models.task import TaskRecord
+from backend.src.core.database import TasksSessionLocal
+from .models import Task, TaskStatus, LogEntry
from ..logger import logger, belief_scope
# [/SECTION]

# [DEF:TaskPersistenceService:Class]
-# @SEMANTICS: persistence, service, database
+# @SEMANTICS: persistence, service, database, sqlalchemy
-# @PURPOSE: Provides methods to save and load tasks from a local SQLite database.
+# @PURPOSE: Provides methods to save and load tasks from the tasks.db database using SQLAlchemy.
class TaskPersistenceService:
-    def __init__(self, db_path: Optional[Path] = None):
-        if db_path is None:
-            self.db_path = Path(__file__).parent.parent.parent.parent / "migrations.db"
-        else:
-            self.db_path = db_path
-        self._ensure_db_exists()
+    def __init__(self):
+        # We use TasksSessionLocal from database.py
+        pass

-    # [DEF:TaskPersistenceService._ensure_db_exists:Function]
-    # @PURPOSE: Ensures the database directory and table exist.
-    # @PRE: None.
-    # @POST: Database file and table are created if they didn't exist.
-    def _ensure_db_exists(self) -> None:
-        with belief_scope("TaskPersistenceService._ensure_db_exists"):
-            self.db_path.parent.mkdir(parents=True, exist_ok=True)
-            conn = sqlite3.connect(str(self.db_path))
-            cursor = conn.cursor()
-            cursor.execute("""
-                CREATE TABLE IF NOT EXISTS persistent_tasks (
-                    id TEXT PRIMARY KEY,
-                    status TEXT NOT NULL,
-                    created_at TEXT NOT NULL,
-                    updated_at TEXT NOT NULL,
-                    input_request JSON,
-                    context JSON
-                )
-            """)
-            conn.commit()
-            conn.close()
-    # [/DEF:TaskPersistenceService._ensure_db_exists:Function]
+    # [DEF:TaskPersistenceService.persist_task:Function]
+    # @PURPOSE: Persists or updates a single task in the database.
+    # @PARAM: task (Task) - The task object to persist.
+    def persist_task(self, task: Task) -> None:
+        with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
+            session: Session = TasksSessionLocal()
+            try:
+                record = session.query(TaskRecord).filter(TaskRecord.id == task.id).first()
+                if not record:
+                    record = TaskRecord(id=task.id)
+                    session.add(record)
+
+                record.type = task.plugin_id
+                record.status = task.status.value
+                record.environment_id = task.params.get("environment_id") or task.params.get("source_env_id")
+                record.started_at = task.started_at
+                record.finished_at = task.finished_at
+                record.params = task.params
+
+                # Store logs as JSON, converting datetime to string
+                record.logs = []
+                for log in task.logs:
+                    log_dict = log.dict()
+                    if isinstance(log_dict.get('timestamp'), datetime):
+                        log_dict['timestamp'] = log_dict['timestamp'].isoformat()
+                    record.logs.append(log_dict)
+
+                # Extract error if failed
+                if task.status == TaskStatus.FAILED:
+                    for log in reversed(task.logs):
+                        if log.level == "ERROR":
+                            record.error = log.message
+                            break
+
+                session.commit()
+            except Exception as e:
+                session.rollback()
+                logger.error(f"Failed to persist task {task.id}: {e}")
+            finally:
+                session.close()
+    # [/DEF:TaskPersistenceService.persist_task:Function]

    # [DEF:TaskPersistenceService.persist_tasks:Function]
-    # @PURPOSE: Persists a list of tasks to the database.
-    # @PRE: Tasks list contains valid Task objects.
-    # @POST: Tasks matching the criteria (AWAITING_INPUT) are saved/updated in the DB.
-    # @PARAM: tasks (List[Task]) - The list of tasks to check and persist.
+    # @PURPOSE: Persists multiple tasks.
+    # @PARAM: tasks (List[Task]) - The list of tasks to persist.
    def persist_tasks(self, tasks: List[Task]) -> None:
-        with belief_scope("TaskPersistenceService.persist_tasks"):
-            conn = sqlite3.connect(str(self.db_path))
-            cursor = conn.cursor()
-
-            count = 0
-            for task in tasks:
-                if task.status == TaskStatus.AWAITING_INPUT:
-                    cursor.execute("""
-                        INSERT OR REPLACE INTO persistent_tasks
-                        (id, status, created_at, updated_at, input_request, context)
-                        VALUES (?, ?, ?, ?, ?, ?)
-                    """, (
-                        task.id,
-                        task.status.value,
-                        task.started_at.isoformat() if task.started_at else datetime.utcnow().isoformat(),
-                        datetime.utcnow().isoformat(),
-                        json.dumps(task.input_request) if task.input_request else None,
-                        json.dumps(task.params)
-                    ))
-                    count += 1
-
-            conn.commit()
-            conn.close()
-            logger.info(f"Persisted {count} tasks awaiting input.")
+        for task in tasks:
+            self.persist_task(task)
    # [/DEF:TaskPersistenceService.persist_tasks:Function]

    # [DEF:TaskPersistenceService.load_tasks:Function]
-    # @PURPOSE: Loads persisted tasks from the database.
-    # @PRE: Database exists.
-    # @POST: Returns a list of Task objects reconstructed from the DB.
+    # @PURPOSE: Loads tasks from the database.
+    # @PARAM: limit (int) - Max tasks to load.
+    # @PARAM: status (Optional[TaskStatus]) - Filter by status.
    # @RETURN: List[Task] - The loaded tasks.
-    def load_tasks(self) -> List[Task]:
+    def load_tasks(self, limit: int = 100, status: Optional[TaskStatus] = None) -> List[Task]:
        with belief_scope("TaskPersistenceService.load_tasks"):
-            if not self.db_path.exists():
-                return []
-
-            conn = sqlite3.connect(str(self.db_path))
-            cursor = conn.cursor()
-
-            cursor.execute("SELECT id, status, created_at, input_request, context FROM persistent_tasks")
-            rows = cursor.fetchall()
-
-            loaded_tasks = []
-            for row in rows:
-                task_id, status, created_at, input_request_json, context_json = row
-                try:
-                    task = Task(
-                        id=task_id,
-                        plugin_id="migration",  # Default, assumes migration context for now
-                        status=TaskStatus(status),
-                        started_at=datetime.fromisoformat(created_at),
-                        input_required=True,
-                        input_request=json.loads(input_request_json) if input_request_json else None,
-                        params=json.loads(context_json) if context_json else {}
-                    )
-                    loaded_tasks.append(task)
-                except Exception as e:
-                    logger.error(f"Failed to load task {task_id}: {e}")
-
-            conn.close()
-            return loaded_tasks
+            session: Session = TasksSessionLocal()
+            try:
+                query = session.query(TaskRecord)
+                if status:
+                    query = query.filter(TaskRecord.status == status.value)
+
+                records = query.order_by(TaskRecord.created_at.desc()).limit(limit).all()
+
+                loaded_tasks = []
+                for record in records:
+                    try:
+                        logs = []
+                        if record.logs:
+                            for log_data in record.logs:
+                                # Handle timestamp conversion if it's a string
+                                if isinstance(log_data.get('timestamp'), str):
+                                    log_data['timestamp'] = datetime.fromisoformat(log_data['timestamp'])
+                                logs.append(LogEntry(**log_data))
+
+                        task = Task(
+                            id=record.id,
+                            plugin_id=record.type,
+                            status=TaskStatus(record.status),
+                            started_at=record.started_at,
+                            finished_at=record.finished_at,
+                            params=record.params or {},
+                            logs=logs
+                        )
+                        loaded_tasks.append(task)
+                    except Exception as e:
+                        logger.error(f"Failed to reconstruct task {record.id}: {e}")
+
+                return loaded_tasks
+            finally:
+                session.close()
    # [/DEF:TaskPersistenceService.load_tasks:Function]

+    # [DEF:TaskPersistenceService.delete_tasks:Function]
+    # @PURPOSE: Deletes specific tasks from the database.
+    # @PARAM: task_ids (List[str]) - List of task IDs to delete.
+    def delete_tasks(self, task_ids: List[str]) -> None:
+        if not task_ids:
+            return
+        with belief_scope("TaskPersistenceService.delete_tasks"):
+            session: Session = TasksSessionLocal()
+            try:
+                session.query(TaskRecord).filter(TaskRecord.id.in_(task_ids)).delete(synchronize_session=False)
+                session.commit()
+            except Exception as e:
+                session.rollback()
+                logger.error(f"Failed to delete tasks: {e}")
+            finally:
+                session.close()
+    # [/DEF:TaskPersistenceService.delete_tasks:Function]

# [/DEF:TaskPersistenceService:Class]
# [/DEF:TaskPersistenceModule:Module]
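Note: the JSON logs column cannot hold datetime objects, so persist_task and load_tasks convert timestamps in opposite directions. The round-trip in isolation, as a standalone sketch:

from datetime import datetime

log_dict = {"timestamp": datetime.utcnow(), "level": "INFO", "message": "Task started"}

# persist_task direction: datetime -> ISO string before the dict enters the JSON column
if isinstance(log_dict.get("timestamp"), datetime):
    log_dict["timestamp"] = log_dict["timestamp"].isoformat()

# load_tasks direction: ISO string -> datetime when rebuilding LogEntry objects
if isinstance(log_dict.get("timestamp"), str):
    log_dict["timestamp"] = datetime.fromisoformat(log_dict["timestamp"])

assert isinstance(log_dict["timestamp"], datetime)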
@@ -48,6 +48,6 @@ def suggest_mappings(source_databases: List[Dict], target_databases: List[Dict],
        })

    return suggestions
-# [/DEF:suggest_mappings]
+# [/DEF:suggest_mappings:Function]

-# [/DEF:backend.src.core.utils.matching]
+# [/DEF:backend.src.core.utils.matching:Module]
@@ -8,6 +8,8 @@ from pathlib import Path
from .core.plugin_loader import PluginLoader
from .core.task_manager import TaskManager
from .core.config_manager import ConfigManager
+from .core.scheduler import SchedulerService
+from .core.database import init_db

# Initialize singletons
# Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -15,6 +17,9 @@ project_root = Path(__file__).parent.parent.parent
config_path = project_root / "config.json"
config_manager = ConfigManager(config_path=str(config_path))

+# Initialize database before any other services that might use it
+init_db()
+
def get_config_manager() -> ConfigManager:
    """Dependency injector for the ConfigManager."""
    return config_manager
@@ -28,6 +33,9 @@ logger.info(f"Available plugins: {[config.name for config in plugin_loader.get_a
task_manager = TaskManager(plugin_loader)
logger.info("TaskManager initialized")

+scheduler_service = SchedulerService(task_manager, config_manager)
+logger.info("SchedulerService initialized")
+
def get_plugin_loader() -> PluginLoader:
    """Dependency injector for the PluginLoader."""
    return plugin_loader
@@ -35,4 +43,8 @@ def get_plugin_loader() -> PluginLoader:
def get_task_manager() -> TaskManager:
    """Dependency injector for the TaskManager."""
    return task_manager
-# [/DEF]
+
+def get_scheduler_service() -> SchedulerService:
+    """Dependency injector for the SchedulerService."""
+    return scheduler_service
+# [/DEF:Dependencies:Module]
@@ -14,7 +14,7 @@ class DashboardMetadata(BaseModel):
    title: str
    last_modified: str
    status: str
-# [/DEF:DashboardMetadata]
+# [/DEF:DashboardMetadata:Class]

# [DEF:DashboardSelection:Class]
# @PURPOSE: Represents the user's selection of dashboards to migrate.
@@ -22,6 +22,7 @@ class DashboardSelection(BaseModel):
    selected_ids: List[int]
    source_env_id: str
    target_env_id: str
-# [/DEF:DashboardSelection]
+    replace_db_config: bool = False
+# [/DEF:DashboardSelection:Class]

-# [/DEF:backend.src.models.dashboard]
+# [/DEF:backend.src.models.dashboard:Module]
@@ -26,7 +26,7 @@ class MigrationStatus(enum.Enum):
     COMPLETED = "COMPLETED"
     FAILED = "FAILED"
     AWAITING_MAPPING = "AWAITING_MAPPING"
-# [/DEF:MigrationStatus]
+# [/DEF:MigrationStatus:Class]
 
 # [DEF:Environment:Class]
 # @PURPOSE: Represents a Superset instance environment.
@@ -37,7 +37,7 @@ class Environment(Base):
     name = Column(String, nullable=False)
     url = Column(String, nullable=False)
     credentials_id = Column(String, nullable=False)
-# [/DEF:Environment]
+# [/DEF:Environment:Class]
 
 # [DEF:DatabaseMapping:Class]
 # @PURPOSE: Represents a mapping between source and target databases.
@@ -52,7 +52,7 @@ class DatabaseMapping(Base):
     source_db_name = Column(String, nullable=False)
     target_db_name = Column(String, nullable=False)
     engine = Column(String, nullable=True)
-# [/DEF:DatabaseMapping]
+# [/DEF:DatabaseMapping:Class]
 
 # [DEF:MigrationJob:Class]
 # @PURPOSE: Represents a single migration execution job.
@@ -65,6 +65,6 @@ class MigrationJob(Base):
     status = Column(SQLEnum(MigrationStatus), default=MigrationStatus.PENDING)
     replace_db = Column(Boolean, default=False)
     created_at = Column(DateTime(timezone=True), server_default=func.now())
-# [/DEF:MigrationJob]
+# [/DEF:MigrationJob:Class]
 
-# [/DEF:backend.src.models.mapping]
+# [/DEF:backend.src.models.mapping:Module]
backend/src/models/task.py (new file, 34 lines)
@@ -0,0 +1,34 @@
+# [DEF:backend.src.models.task:Module]
+#
+# @SEMANTICS: database, task, record, sqlalchemy, sqlite
+# @PURPOSE: Defines the database schema for task execution records.
+# @LAYER: Domain
+# @RELATION: DEPENDS_ON -> sqlalchemy
+#
+# @INVARIANT: All primary keys are UUID strings.
+
+# [SECTION: IMPORTS]
+from sqlalchemy import Column, String, DateTime, JSON, ForeignKey
+from sqlalchemy.sql import func
+from .mapping import Base
+import uuid
+# [/SECTION]
+
+# [DEF:TaskRecord:Class]
+# @PURPOSE: Represents a persistent record of a task execution.
+class TaskRecord(Base):
+    __tablename__ = "task_records"
+
+    id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
+    type = Column(String, nullable=False)  # e.g., "backup", "migration"
+    status = Column(String, nullable=False)  # Enum: "PENDING", "RUNNING", "SUCCESS", "FAILED"
+    environment_id = Column(String, ForeignKey("environments.id"), nullable=True)
+    started_at = Column(DateTime(timezone=True), nullable=True)
+    finished_at = Column(DateTime(timezone=True), nullable=True)
+    logs = Column(JSON, nullable=True)  # Store structured logs as JSON
+    error = Column(String, nullable=True)
+    created_at = Column(DateTime(timezone=True), server_default=func.now())
+    params = Column(JSON, nullable=True)
+# [/DEF:TaskRecord:Class]
+
+# [/DEF:backend.src.models.task:Module]
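A sketch of how a service might persist one of these records through the `SessionLocal` factory imported elsewhere in this diff; the import paths and helper name are assumptions:

```python
from datetime import datetime, timezone
from typing import Any, Dict, Optional

from backend.src.core.database import SessionLocal  # session factory, path assumed
from backend.src.models.task import TaskRecord

def start_task_record(task_type: str, environment_id: Optional[str], params: Dict[str, Any]) -> str:
    """Insert a RUNNING TaskRecord and return its generated UUID string."""
    session = SessionLocal()
    try:
        record = TaskRecord(
            type=task_type,                    # e.g. "backup"
            status="RUNNING",                  # one of the documented status strings
            environment_id=environment_id,
            started_at=datetime.now(timezone.utc),
            params=params,                     # stored as JSON
        )
        session.add(record)
        session.commit()
        return record.id                       # UUID string from the column default
    finally:
        session.close()
```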
@@ -25,6 +25,8 @@ from superset_tool.utils.fileio import (
 from superset_tool.utils.init_clients import setup_clients
 from ..dependencies import get_config_manager
 
+# [DEF:BackupPlugin:Class]
+# @PURPOSE: Implementation of the backup plugin logic.
 class BackupPlugin(PluginBase):
     """
     A plugin to back up Superset dashboards.
@@ -71,8 +73,21 @@
         }
 
     async def execute(self, params: Dict[str, Any]):
-        env = params["env"]
-        backup_path = Path(params["backup_path"])
+        config_manager = get_config_manager()
+        env_id = params.get("environment_id")
+
+        # Resolve environment name if environment_id is provided
+        if env_id:
+            env_config = next((e for e in config_manager.get_environments() if e.id == env_id), None)
+            if env_config:
+                params["env"] = env_config.name
+
+        env = params.get("env")
+        if not env:
+            raise KeyError("env")
+
+        backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
+        backup_path = Path(backup_path_str)
 
         logger = SupersetLogger(log_dir=backup_path / "Logs", console=True)
         logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")
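The new resolution order: an explicit `environment_id` resolves to the environment's name, a missing `env` still raises `KeyError("env")` (preserving the old contract), and `backup_path` falls back to the configured default. The same precedence isolated as a pure function for clarity (a sketch; the `config_manager` accessors are the ones used above):

```python
from pathlib import Path
from typing import Any, Dict, Tuple

def resolve_backup_target(params: Dict[str, Any], config_manager) -> Tuple[str, Path]:
    """Mirror BackupPlugin.execute's resolution: environment_id -> env name,
    explicit backup_path -> configured default."""
    env_id = params.get("environment_id")
    if env_id:
        env_config = next(
            (e for e in config_manager.get_environments() if e.id == env_id), None
        )
        if env_config:
            params["env"] = env_config.name

    env = params.get("env")
    if not env:
        raise KeyError("env")  # same contract as the plugin

    backup_path = Path(
        params.get("backup_path") or config_manager.get_config().settings.backup_path
    )
    return env, backup_path
```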
@@ -130,4 +145,5 @@
         except (RequestException, IOError, KeyError) as e:
             logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
             raise e
-# [/DEF:BackupPlugin]
+# [/DEF:BackupPlugin:Class]
+# [/DEF:BackupPlugin:Module]
@@ -21,6 +21,8 @@ from ..core.migration_engine import MigrationEngine
 from ..core.database import SessionLocal
 from ..models.mapping import DatabaseMapping, Environment
 
+# [DEF:MigrationPlugin:Class]
+# @PURPOSE: Implementation of the migration plugin logic.
 class MigrationPlugin(PluginBase):
     """
     A plugin to migrate Superset dashboards between environments.
@@ -100,7 +102,31 @@
         from_db_id = params.get("from_db_id")
         to_db_id = params.get("to_db_id")
 
-        logger = SupersetLogger(log_dir=Path.cwd() / "logs", console=True)
+        # [DEF:MigrationPlugin.execute:Action]
+        # @PURPOSE: Execute the migration logic with proper task logging.
+        task_id = params.get("_task_id")
+        from ..dependencies import get_task_manager
+        tm = get_task_manager()
+
+        class TaskLoggerProxy(SupersetLogger):
+            def __init__(self):
+                # Initialize parent with dummy values since we override methods
+                super().__init__(console=False)
+
+            def debug(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "DEBUG", msg, extra or {})
+            def info(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "INFO", msg, extra or {})
+            def warning(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "WARNING", msg, extra or {})
+            def error(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
+            def critical(self, msg, *args, extra=None, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
+            def exception(self, msg, *args, **kwargs):
+                if task_id: tm._add_log(task_id, "ERROR", msg, {"exception": True})
+
+        logger = TaskLoggerProxy()
         logger.info(f"[MigrationPlugin][Entry] Starting migration task.")
         logger.info(f"[MigrationPlugin][Action] Params: {params}")
 
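Design note: `TaskLoggerProxy` subclasses `SupersetLogger` purely for type compatibility; every level method is overridden to forward into the TaskManager's per-task log store instead of a log file. A stand-in sketch of that forwarding (only the `_add_log(task_id, level, msg, extra)` signature is taken from the diff; the fake manager is illustrative):

```python
class FakeTaskManager:
    """Stand-in that collects logs the way TaskManager._add_log does."""
    def __init__(self):
        self.logs = []

    def _add_log(self, task_id, level, msg, extra):
        self.logs.append({"task": task_id, "level": level, "msg": msg, **extra})

tm = FakeTaskManager()
task_id = "task-123"  # would come from params["_task_id"]

# Equivalent of TaskLoggerProxy.info(...) without the SupersetLogger base:
def proxy_info(msg, extra=None):
    if task_id:
        tm._add_log(task_id, "INFO", msg, extra or {})

proxy_info("[MigrationPlugin][Entry] Starting migration task.")
assert tm.logs[0]["level"] == "INFO"
```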
@@ -188,10 +214,7 @@
 
         if not success and replace_db_config:
             # Signal missing mapping and wait (only if we care about mappings)
-            task_id = params.get("_task_id")
             if task_id:
-                from ..dependencies import get_task_manager
-                tm = get_task_manager()
                 logger.info(f"[MigrationPlugin][Action] Pausing for missing mapping in task {task_id}")
                 # In a real scenario, we'd pass the missing DB info to the frontend
                 # For this task, we'll just simulate the wait
@@ -220,16 +243,25 @@
             except Exception as exc:
                 # Check for password error
                 error_msg = str(exc)
-                if "Must provide a password for the database" in error_msg:
-                    # Extract database name (assuming format: "Must provide a password for the database 'PostgreSQL'")
-                    import re
-                    match = re.search(r"database '([^']+)'", error_msg)
-                    db_name = match.group(1) if match else "unknown"
+                # The error message from Superset is often a JSON string inside a string.
+                # We need to robustly detect the password requirement.
+                # Typical error: "Error importing dashboard: databases/PostgreSQL.yaml: {'_schema': ['Must provide a password for the database']}"
 
-                    # Get task manager
-                    from ..dependencies import get_task_manager
-                    tm = get_task_manager()
-                    task_id = params.get("_task_id")
+                if "Must provide a password for the database" in error_msg:
+                    # Extract database name
+                    # Try to find "databases/DBNAME.yaml" pattern
+                    import re
+                    db_name = "unknown"
+                    match = re.search(r"databases/([^.]+)\.yaml", error_msg)
+                    if match:
+                        db_name = match.group(1)
+                    else:
+                        # Fallback: try to find 'database 'NAME'' pattern
+                        match_alt = re.search(r"database '([^']+)'", error_msg)
+                        if match_alt:
+                            db_name = match_alt.group(1)
+
+                    logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")
 
                     if task_id:
                         input_request = {
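The primary pattern and its fallback can be exercised directly against the error shape quoted in the comment above; a quick sketch:

```python
import re

error_msg = (
    "Error importing dashboard: databases/PostgreSQL.yaml: "
    "{'_schema': ['Must provide a password for the database']}"
)

db_name = "unknown"
match = re.search(r"databases/([^.]+)\.yaml", error_msg)
if match:
    db_name = match.group(1)
else:
    # Fallback covers the older shape:
    # "Must provide a password for the database 'PostgreSQL'"
    match_alt = re.search(r"database '([^']+)'", error_msg)
    if match_alt:
        db_name = match_alt.group(1)

print(db_name)  # -> PostgreSQL
```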
@@ -251,13 +283,18 @@
                         logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
                         to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug, passwords=passwords)
                         logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
+                        # Clear passwords from params after use for security
+                        if "passwords" in task.params:
+                            del task.params["passwords"]
                         continue
 
                 logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)
+        # [/DEF:MigrationPlugin.execute:Action]
 
         logger.info("[MigrationPlugin][Exit] Migration finished.")
 
     except Exception as e:
         logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
         raise e
-# [/DEF:MigrationPlugin]
+# [/DEF:MigrationPlugin:Class]
+# [/DEF:MigrationPlugin:Module]
@@ -20,8 +20,10 @@ from superset_tool.models import SupersetConfig
 class MappingService:
 
     # [DEF:MappingService.__init__:Function]
+    # @PURPOSE: Initializes the mapping service with a config manager.
     def __init__(self, config_manager):
         self.config_manager = config_manager
+    # [/DEF:MappingService.__init__:Function]
 
     # [DEF:MappingService._get_client:Function]
     # @PURPOSE: Helper to get an initialized SupersetClient for an environment.
@@ -42,6 +44,7 @@ class MappingService:
             }
         )
         return SupersetClient(superset_config)
+    # [/DEF:MappingService._get_client:Function]
 
     # [DEF:MappingService.get_suggestions:Function]
     # @PURPOSE: Fetches databases from both environments and returns fuzzy matching suggestions.
@@ -59,8 +62,8 @@
         target_dbs = target_client.get_databases_summary()
 
         return suggest_mappings(source_dbs, target_dbs)
-    # [/DEF:MappingService.get_suggestions]
+    # [/DEF:MappingService.get_suggestions:Function]
 
-# [/DEF:MappingService]
+# [/DEF:MappingService:Class]
 
-# [/DEF:backend.src.services.mapping_service]
+# [/DEF:backend.src.services.mapping_service:Module]
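`suggest_mappings` itself is defined outside this diff. For intuition only, here is a stand-in of the kind of fuzzy name matching it performs, using the standard library (`database_name` as the field name is an assumption, not the project's implementation):

```python
from difflib import SequenceMatcher

def suggest_mappings(source_dbs, target_dbs, threshold=0.6):
    """Pair each source database with its closest target by name similarity."""
    suggestions = []
    for src in source_dbs:
        best, best_score = None, 0.0
        for tgt in target_dbs:
            score = SequenceMatcher(
                None, src["database_name"].lower(), tgt["database_name"].lower()
            ).ratio()
            if score > best_score:
                best, best_score = tgt, score
        if best and best_score >= threshold:
            suggestions.append({"source": src, "target": best, "score": round(best_score, 2)})
    return suggestions

# Example:
# suggest_mappings([{"database_name": "Postgres Prod"}], [{"database_name": "postgres-prod"}])
```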
BIN backend/tasks.db (new file) — binary file not shown
@@ -36,7 +36,7 @@ class BackupConfig:
     rotate_archive: bool = True
     clean_folders: bool = True
     retention_policy: RetentionPolicy = field(default_factory=RetentionPolicy)
-# [/DEF:BackupConfig]
+# [/DEF:BackupConfig:DataClass]
 
 # [DEF:backup_dashboards:Function]
 # @PURPOSE: Backs up all available dashboards for the given client and environment, skipping export errors.
@@ -111,7 +111,7 @@ def backup_dashboards(
     except (RequestException, IOError) as e:
         logger.critical(f"[backup_dashboards][Failure] Fatal error during backup for {env_name}: {e}", exc_info=True)
         return False
-# [/DEF:backup_dashboards]
+# [/DEF:backup_dashboards:Function]
 
 # [DEF:main:Function]
 # @PURPOSE: Main entry point for running the backup process.
@@ -155,9 +155,9 @@ def main() -> int:
 
     logger.info("[main][Exit] Superset backup process finished.")
     return exit_code
-# [/DEF:main]
+# [/DEF:main:Function]
 
 if __name__ == "__main__":
     sys.exit(main())
 
-# [/DEF:backup_script]
+# [/DEF:backup_script:Module]
@@ -71,9 +71,9 @@ def debug_database_api():
         print(f"Error while testing the API: {e}")
         import traceback
         traceback.print_exc()
-# [/DEF:debug_database_api]
+# [/DEF:debug_database_api:Function]
 
 if __name__ == "__main__":
     debug_database_api()
 
-# [/DEF:debug_db_api]
+# [/DEF:debug_db_api:Module]
|
|||||||
242
frontend/.svelte-kit/ambient.d.ts
vendored
242
frontend/.svelte-kit/ambient.d.ts
vendored
@@ -1,242 +0,0 @@
|
|||||||
|
|
||||||
// this file is generated — do not edit it
|
|
||||||
|
|
||||||
|
|
||||||
/// <reference types="@sveltejs/kit" />
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Environment variables [loaded by Vite](https://vitejs.dev/guide/env-and-mode.html#env-files) from `.env` files and `process.env`. Like [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), this module cannot be imported into client-side code. This module only includes variables that _do not_ begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) _and do_ start with [`config.kit.env.privatePrefix`](https://svelte.dev/docs/kit/configuration#env) (if configured).
|
|
||||||
*
|
|
||||||
* _Unlike_ [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), the values exported from this module are statically injected into your bundle at build time, enabling optimisations like dead code elimination.
|
|
||||||
*
|
|
||||||
* ```ts
|
|
||||||
* import { API_KEY } from '$env/static/private';
|
|
||||||
* ```
|
|
||||||
*
|
|
||||||
* Note that all environment variables referenced in your code should be declared (for example in an `.env` file), even if they don't have a value until the app is deployed:
|
|
||||||
*
|
|
||||||
* ```
|
|
||||||
* MY_FEATURE_FLAG=""
|
|
||||||
* ```
|
|
||||||
*
|
|
||||||
* You can override `.env` values from the command line like so:
|
|
||||||
*
|
|
||||||
* ```sh
|
|
||||||
* MY_FEATURE_FLAG="enabled" npm run dev
|
|
||||||
* ```
|
|
||||||
*/
|
|
||||||
declare module '$env/static/private' {
|
|
||||||
export const USER: string;
|
|
||||||
export const npm_config_user_agent: string;
|
|
||||||
export const XDG_SESSION_TYPE: string;
|
|
||||||
export const npm_node_execpath: string;
|
|
||||||
export const SHLVL: string;
|
|
||||||
export const npm_config_noproxy: string;
|
|
||||||
export const LESS: string;
|
|
||||||
export const HOME: string;
|
|
||||||
export const OLDPWD: string;
|
|
||||||
export const DESKTOP_SESSION: string;
|
|
||||||
export const npm_package_json: string;
|
|
||||||
export const LSCOLORS: string;
|
|
||||||
export const ZSH: string;
|
|
||||||
export const GNOME_SHELL_SESSION_MODE: string;
|
|
||||||
export const GTK_MODULES: string;
|
|
||||||
export const PAGER: string;
|
|
||||||
export const PS1: string;
|
|
||||||
export const npm_config_userconfig: string;
|
|
||||||
export const npm_config_local_prefix: string;
|
|
||||||
export const SYSTEMD_EXEC_PID: string;
|
|
||||||
export const DBUS_SESSION_BUS_ADDRESS: string;
|
|
||||||
export const COLORTERM: string;
|
|
||||||
export const COLOR: string;
|
|
||||||
export const npm_config_metrics_registry: string;
|
|
||||||
export const WAYLAND_DISPLAY: string;
|
|
||||||
export const LOGNAME: string;
|
|
||||||
export const SDKMAN_CANDIDATES_API: string;
|
|
||||||
export const _: string;
|
|
||||||
export const npm_config_prefix: string;
|
|
||||||
export const MEMORY_PRESSURE_WATCH: string;
|
|
||||||
export const XDG_SESSION_CLASS: string;
|
|
||||||
export const USERNAME: string;
|
|
||||||
export const TERM: string;
|
|
||||||
export const npm_config_cache: string;
|
|
||||||
export const GNOME_DESKTOP_SESSION_ID: string;
|
|
||||||
export const npm_config_node_gyp: string;
|
|
||||||
export const PATH: string;
|
|
||||||
export const SDKMAN_CANDIDATES_DIR: string;
|
|
||||||
export const NODE: string;
|
|
||||||
export const npm_package_name: string;
|
|
||||||
export const XDG_MENU_PREFIX: string;
|
|
||||||
export const SDKMAN_BROKER_API: string;
|
|
||||||
export const GNOME_TERMINAL_SCREEN: string;
|
|
||||||
export const GNOME_SETUP_DISPLAY: string;
|
|
||||||
export const XDG_RUNTIME_DIR: string;
|
|
||||||
export const DISPLAY: string;
|
|
||||||
export const LANG: string;
|
|
||||||
export const XDG_CURRENT_DESKTOP: string;
|
|
||||||
export const VIRTUAL_ENV_PROMPT: string;
|
|
||||||
export const XMODIFIERS: string;
|
|
||||||
export const XDG_SESSION_DESKTOP: string;
|
|
||||||
export const XAUTHORITY: string;
|
|
||||||
export const LS_COLORS: string;
|
|
||||||
export const GNOME_TERMINAL_SERVICE: string;
|
|
||||||
export const SDKMAN_DIR: string;
|
|
||||||
export const SDKMAN_PLATFORM: string;
|
|
||||||
export const npm_lifecycle_script: string;
|
|
||||||
export const SSH_AUTH_SOCK: string;
|
|
||||||
export const SHELL: string;
|
|
||||||
export const npm_package_version: string;
|
|
||||||
export const npm_lifecycle_event: string;
|
|
||||||
export const QT_ACCESSIBILITY: string;
|
|
||||||
export const GDMSESSION: string;
|
|
||||||
export const GOOGLE_CLOUD_PROJECT: string;
|
|
||||||
export const GPG_AGENT_INFO: string;
|
|
||||||
export const VIRTUAL_ENV: string;
|
|
||||||
export const QT_IM_MODULE: string;
|
|
||||||
export const npm_config_globalconfig: string;
|
|
||||||
export const npm_config_init_module: string;
|
|
||||||
export const JAVA_HOME: string;
|
|
||||||
export const PWD: string;
|
|
||||||
export const npm_config_globalignorefile: string;
|
|
||||||
export const npm_execpath: string;
|
|
||||||
export const XDG_DATA_DIRS: string;
|
|
||||||
export const npm_config_global_prefix: string;
|
|
||||||
export const npm_command: string;
|
|
||||||
export const QT_IM_MODULES: string;
|
|
||||||
export const MEMORY_PRESSURE_WRITE: string;
|
|
||||||
export const VTE_VERSION: string;
|
|
||||||
export const INIT_CWD: string;
|
|
||||||
export const EDITOR: string;
|
|
||||||
export const NODE_ENV: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Similar to [`$env/static/private`](https://svelte.dev/docs/kit/$env-static-private), except that it only includes environment variables that begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) (which defaults to `PUBLIC_`), and can therefore safely be exposed to client-side code.
|
|
||||||
*
|
|
||||||
* Values are replaced statically at build time.
|
|
||||||
*
|
|
||||||
* ```ts
|
|
||||||
* import { PUBLIC_BASE_URL } from '$env/static/public';
|
|
||||||
* ```
|
|
||||||
*/
|
|
||||||
declare module '$env/static/public' {
|
|
||||||
export const PUBLIC_WS_URL: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* This module provides access to runtime environment variables, as defined by the platform you're running on. For example if you're using [`adapter-node`](https://github.com/sveltejs/kit/tree/main/packages/adapter-node) (or running [`vite preview`](https://svelte.dev/docs/kit/cli)), this is equivalent to `process.env`. This module only includes variables that _do not_ begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) _and do_ start with [`config.kit.env.privatePrefix`](https://svelte.dev/docs/kit/configuration#env) (if configured).
|
|
||||||
*
|
|
||||||
* This module cannot be imported into client-side code.
|
|
||||||
*
|
|
||||||
* ```ts
|
|
||||||
* import { env } from '$env/dynamic/private';
|
|
||||||
* console.log(env.DEPLOYMENT_SPECIFIC_VARIABLE);
|
|
||||||
* ```
|
|
||||||
*
|
|
||||||
* > [!NOTE] In `dev`, `$env/dynamic` always includes environment variables from `.env`. In `prod`, this behavior will depend on your adapter.
|
|
||||||
*/
|
|
||||||
declare module '$env/dynamic/private' {
|
|
||||||
export const env: {
|
|
||||||
USER: string;
|
|
||||||
npm_config_user_agent: string;
|
|
||||||
XDG_SESSION_TYPE: string;
|
|
||||||
npm_node_execpath: string;
|
|
||||||
SHLVL: string;
|
|
||||||
npm_config_noproxy: string;
|
|
||||||
LESS: string;
|
|
||||||
HOME: string;
|
|
||||||
OLDPWD: string;
|
|
||||||
DESKTOP_SESSION: string;
|
|
||||||
npm_package_json: string;
|
|
||||||
LSCOLORS: string;
|
|
||||||
ZSH: string;
|
|
||||||
GNOME_SHELL_SESSION_MODE: string;
|
|
||||||
GTK_MODULES: string;
|
|
||||||
PAGER: string;
|
|
||||||
PS1: string;
|
|
||||||
npm_config_userconfig: string;
|
|
||||||
npm_config_local_prefix: string;
|
|
||||||
SYSTEMD_EXEC_PID: string;
|
|
||||||
DBUS_SESSION_BUS_ADDRESS: string;
|
|
||||||
COLORTERM: string;
|
|
||||||
COLOR: string;
|
|
||||||
npm_config_metrics_registry: string;
|
|
||||||
WAYLAND_DISPLAY: string;
|
|
||||||
LOGNAME: string;
|
|
||||||
SDKMAN_CANDIDATES_API: string;
|
|
||||||
_: string;
|
|
||||||
npm_config_prefix: string;
|
|
||||||
MEMORY_PRESSURE_WATCH: string;
|
|
||||||
XDG_SESSION_CLASS: string;
|
|
||||||
USERNAME: string;
|
|
||||||
TERM: string;
|
|
||||||
npm_config_cache: string;
|
|
||||||
GNOME_DESKTOP_SESSION_ID: string;
|
|
||||||
npm_config_node_gyp: string;
|
|
||||||
PATH: string;
|
|
||||||
SDKMAN_CANDIDATES_DIR: string;
|
|
||||||
NODE: string;
|
|
||||||
npm_package_name: string;
|
|
||||||
XDG_MENU_PREFIX: string;
|
|
||||||
SDKMAN_BROKER_API: string;
|
|
||||||
GNOME_TERMINAL_SCREEN: string;
|
|
||||||
GNOME_SETUP_DISPLAY: string;
|
|
||||||
XDG_RUNTIME_DIR: string;
|
|
||||||
DISPLAY: string;
|
|
||||||
LANG: string;
|
|
||||||
XDG_CURRENT_DESKTOP: string;
|
|
||||||
VIRTUAL_ENV_PROMPT: string;
|
|
||||||
XMODIFIERS: string;
|
|
||||||
XDG_SESSION_DESKTOP: string;
|
|
||||||
XAUTHORITY: string;
|
|
||||||
LS_COLORS: string;
|
|
||||||
GNOME_TERMINAL_SERVICE: string;
|
|
||||||
SDKMAN_DIR: string;
|
|
||||||
SDKMAN_PLATFORM: string;
|
|
||||||
npm_lifecycle_script: string;
|
|
||||||
SSH_AUTH_SOCK: string;
|
|
||||||
SHELL: string;
|
|
||||||
npm_package_version: string;
|
|
||||||
npm_lifecycle_event: string;
|
|
||||||
QT_ACCESSIBILITY: string;
|
|
||||||
GDMSESSION: string;
|
|
||||||
GOOGLE_CLOUD_PROJECT: string;
|
|
||||||
GPG_AGENT_INFO: string;
|
|
||||||
VIRTUAL_ENV: string;
|
|
||||||
QT_IM_MODULE: string;
|
|
||||||
npm_config_globalconfig: string;
|
|
||||||
npm_config_init_module: string;
|
|
||||||
JAVA_HOME: string;
|
|
||||||
PWD: string;
|
|
||||||
npm_config_globalignorefile: string;
|
|
||||||
npm_execpath: string;
|
|
||||||
XDG_DATA_DIRS: string;
|
|
||||||
npm_config_global_prefix: string;
|
|
||||||
npm_command: string;
|
|
||||||
QT_IM_MODULES: string;
|
|
||||||
MEMORY_PRESSURE_WRITE: string;
|
|
||||||
VTE_VERSION: string;
|
|
||||||
INIT_CWD: string;
|
|
||||||
EDITOR: string;
|
|
||||||
NODE_ENV: string;
|
|
||||||
[key: `PUBLIC_${string}`]: undefined;
|
|
||||||
[key: `${string}`]: string | undefined;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Similar to [`$env/dynamic/private`](https://svelte.dev/docs/kit/$env-dynamic-private), but only includes variables that begin with [`config.kit.env.publicPrefix`](https://svelte.dev/docs/kit/configuration#env) (which defaults to `PUBLIC_`), and can therefore safely be exposed to client-side code.
|
|
||||||
*
|
|
||||||
* Note that public dynamic environment variables must all be sent from the server to the client, causing larger network requests — when possible, use `$env/static/public` instead.
|
|
||||||
*
|
|
||||||
* ```ts
|
|
||||||
* import { env } from '$env/dynamic/public';
|
|
||||||
* console.log(env.PUBLIC_DEPLOYMENT_SPECIFIC_VARIABLE);
|
|
||||||
* ```
|
|
||||||
*/
|
|
||||||
declare module '$env/dynamic/public' {
|
|
||||||
export const env: {
|
|
||||||
PUBLIC_WS_URL: string;
|
|
||||||
[key: `PUBLIC_${string}`]: string | undefined;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1,31 +0,0 @@
|
|||||||
export { matchers } from './matchers.js';
|
|
||||||
|
|
||||||
export const nodes = [
|
|
||||||
() => import('./nodes/0'),
|
|
||||||
() => import('./nodes/1'),
|
|
||||||
() => import('./nodes/2'),
|
|
||||||
() => import('./nodes/3')
|
|
||||||
];
|
|
||||||
|
|
||||||
export const server_loads = [];
|
|
||||||
|
|
||||||
export const dictionary = {
|
|
||||||
"/": [2],
|
|
||||||
"/settings": [3]
|
|
||||||
};
|
|
||||||
|
|
||||||
export const hooks = {
|
|
||||||
handleError: (({ error }) => { console.error(error) }),
|
|
||||||
|
|
||||||
reroute: (() => {}),
|
|
||||||
transport: {}
|
|
||||||
};
|
|
||||||
|
|
||||||
export const decoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.decode]));
|
|
||||||
export const encoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.encode]));
|
|
||||||
|
|
||||||
export const hash = false;
|
|
||||||
|
|
||||||
export const decode = (type, value) => decoders[type](value);
|
|
||||||
|
|
||||||
export { default as root } from '../root.js';
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export const matchers = {};
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import * as universal from "../../../../src/routes/+layout.ts";
|
|
||||||
export { universal };
|
|
||||||
export { default as component } from "../../../../src/routes/+layout.svelte";
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export { default as component } from "../../../../src/routes/+error.svelte";
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import * as universal from "../../../../src/routes/+page.ts";
|
|
||||||
export { universal };
|
|
||||||
export { default as component } from "../../../../src/routes/+page.svelte";
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import * as universal from "../../../../src/routes/settings/+page.ts";
|
|
||||||
export { universal };
|
|
||||||
export { default as component } from "../../../../src/routes/settings/+page.svelte";
|
|
||||||
@@ -1,35 +0,0 @@
|
|||||||
export { matchers } from './matchers.js';
|
|
||||||
|
|
||||||
export const nodes = [
|
|
||||||
() => import('./nodes/0'),
|
|
||||||
() => import('./nodes/1'),
|
|
||||||
() => import('./nodes/2'),
|
|
||||||
() => import('./nodes/3'),
|
|
||||||
() => import('./nodes/4'),
|
|
||||||
() => import('./nodes/5')
|
|
||||||
];
|
|
||||||
|
|
||||||
export const server_loads = [];
|
|
||||||
|
|
||||||
export const dictionary = {
|
|
||||||
"/": [2],
|
|
||||||
"/migration": [3],
|
|
||||||
"/migration/mappings": [4],
|
|
||||||
"/settings": [5]
|
|
||||||
};
|
|
||||||
|
|
||||||
export const hooks = {
|
|
||||||
handleError: (({ error }) => { console.error(error) }),
|
|
||||||
|
|
||||||
reroute: (() => {}),
|
|
||||||
transport: {}
|
|
||||||
};
|
|
||||||
|
|
||||||
export const decoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.decode]));
|
|
||||||
export const encoders = Object.fromEntries(Object.entries(hooks.transport).map(([k, v]) => [k, v.encode]));
|
|
||||||
|
|
||||||
export const hash = false;
|
|
||||||
|
|
||||||
export const decode = (type, value) => decoders[type](value);
|
|
||||||
|
|
||||||
export { default as root } from '../root.js';
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export const matchers = {};
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import * as universal from "../../../../src/routes/+layout.ts";
|
|
||||||
export { universal };
|
|
||||||
export { default as component } from "../../../../src/routes/+layout.svelte";
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export { default as component } from "../../../../src/routes/+error.svelte";
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import * as universal from "../../../../src/routes/+page.ts";
|
|
||||||
export { universal };
|
|
||||||
export { default as component } from "../../../../src/routes/+page.svelte";
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export { default as component } from "../../../../src/routes/migration/+page.svelte";
|
|
||||||
@@ -1,3 +0,0 @@
|
|||||||
import { asClassComponent } from 'svelte/legacy';
|
|
||||||
import Root from './root.svelte';
|
|
||||||
export default asClassComponent(Root);
|
|
||||||
@@ -1,68 +0,0 @@
|
|||||||
<!-- This file is generated by @sveltejs/kit — do not edit it! -->
|
|
||||||
<svelte:options runes={true} />
|
|
||||||
<script>
|
|
||||||
import { setContext, onMount, tick } from 'svelte';
|
|
||||||
import { browser } from '$app/environment';
|
|
||||||
|
|
||||||
// stores
|
|
||||||
let { stores, page, constructors, components = [], form, data_0 = null, data_1 = null } = $props();
|
|
||||||
|
|
||||||
if (!browser) {
|
|
||||||
// svelte-ignore state_referenced_locally
|
|
||||||
setContext('__svelte__', stores);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (browser) {
|
|
||||||
$effect.pre(() => stores.page.set(page));
|
|
||||||
} else {
|
|
||||||
// svelte-ignore state_referenced_locally
|
|
||||||
stores.page.set(page);
|
|
||||||
}
|
|
||||||
$effect(() => {
|
|
||||||
stores;page;constructors;components;form;data_0;data_1;
|
|
||||||
stores.page.notify();
|
|
||||||
});
|
|
||||||
|
|
||||||
let mounted = $state(false);
|
|
||||||
let navigated = $state(false);
|
|
||||||
let title = $state(null);
|
|
||||||
|
|
||||||
onMount(() => {
|
|
||||||
const unsubscribe = stores.page.subscribe(() => {
|
|
||||||
if (mounted) {
|
|
||||||
navigated = true;
|
|
||||||
tick().then(() => {
|
|
||||||
title = document.title || 'untitled page';
|
|
||||||
});
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
mounted = true;
|
|
||||||
return unsubscribe;
|
|
||||||
});
|
|
||||||
|
|
||||||
const Pyramid_1=$derived(constructors[1])
|
|
||||||
</script>
|
|
||||||
|
|
||||||
{#if constructors[1]}
|
|
||||||
{@const Pyramid_0 = constructors[0]}
|
|
||||||
<!-- svelte-ignore binding_property_non_reactive -->
|
|
||||||
<Pyramid_0 bind:this={components[0]} data={data_0} {form} params={page.params}>
|
|
||||||
<!-- svelte-ignore binding_property_non_reactive -->
|
|
||||||
<Pyramid_1 bind:this={components[1]} data={data_1} {form} params={page.params} />
|
|
||||||
</Pyramid_0>
|
|
||||||
|
|
||||||
{:else}
|
|
||||||
{@const Pyramid_0 = constructors[0]}
|
|
||||||
<!-- svelte-ignore binding_property_non_reactive -->
|
|
||||||
<Pyramid_0 bind:this={components[0]} data={data_0} {form} params={page.params} />
|
|
||||||
|
|
||||||
{/if}
|
|
||||||
|
|
||||||
{#if mounted}
|
|
||||||
<div id="svelte-announcer" aria-live="assertive" aria-atomic="true" style="position: absolute; left: 0; top: 0; clip: rect(0 0 0 0); clip-path: inset(50%); overflow: hidden; white-space: nowrap; width: 1px; height: 1px">
|
|
||||||
{#if navigated}
|
|
||||||
{title}
|
|
||||||
{/if}
|
|
||||||
</div>
|
|
||||||
{/if}
|
|
||||||
@@ -1,53 +0,0 @@
|
|||||||
|
|
||||||
import root from '../root.js';
|
|
||||||
import { set_building, set_prerendering } from '__sveltekit/environment';
|
|
||||||
import { set_assets } from '$app/paths/internal/server';
|
|
||||||
import { set_manifest, set_read_implementation } from '__sveltekit/server';
|
|
||||||
import { set_private_env, set_public_env } from '../../../node_modules/@sveltejs/kit/src/runtime/shared-server.js';
|
|
||||||
|
|
||||||
export const options = {
|
|
||||||
app_template_contains_nonce: false,
|
|
||||||
async: false,
|
|
||||||
csp: {"mode":"auto","directives":{"upgrade-insecure-requests":false,"block-all-mixed-content":false},"reportOnly":{"upgrade-insecure-requests":false,"block-all-mixed-content":false}},
|
|
||||||
csrf_check_origin: true,
|
|
||||||
csrf_trusted_origins: [],
|
|
||||||
embedded: false,
|
|
||||||
env_public_prefix: 'PUBLIC_',
|
|
||||||
env_private_prefix: '',
|
|
||||||
hash_routing: false,
|
|
||||||
hooks: null, // added lazily, via `get_hooks`
|
|
||||||
preload_strategy: "modulepreload",
|
|
||||||
root,
|
|
||||||
service_worker: false,
|
|
||||||
service_worker_options: undefined,
|
|
||||||
templates: {
|
|
||||||
app: ({ head, body, assets, nonce, env }) => "<!DOCTYPE html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<link rel=\"icon\" href=\"" + assets + "/favicon.png\" />\n\t\t<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\" />\n\t\t" + head + "\n\t</head>\n\t<body data-sveltekit-preload-data=\"hover\">\n\t\t<div style=\"display: contents\">" + body + "</div>\n\t</body>\n</html>\n",
|
|
||||||
error: ({ status, message }) => "<!doctype html>\n<html lang=\"en\">\n\t<head>\n\t\t<meta charset=\"utf-8\" />\n\t\t<title>" + message + "</title>\n\n\t\t<style>\n\t\t\tbody {\n\t\t\t\t--bg: white;\n\t\t\t\t--fg: #222;\n\t\t\t\t--divider: #ccc;\n\t\t\t\tbackground: var(--bg);\n\t\t\t\tcolor: var(--fg);\n\t\t\t\tfont-family:\n\t\t\t\t\tsystem-ui,\n\t\t\t\t\t-apple-system,\n\t\t\t\t\tBlinkMacSystemFont,\n\t\t\t\t\t'Segoe UI',\n\t\t\t\t\tRoboto,\n\t\t\t\t\tOxygen,\n\t\t\t\t\tUbuntu,\n\t\t\t\t\tCantarell,\n\t\t\t\t\t'Open Sans',\n\t\t\t\t\t'Helvetica Neue',\n\t\t\t\t\tsans-serif;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tjustify-content: center;\n\t\t\t\theight: 100vh;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t.error {\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t\tmax-width: 32rem;\n\t\t\t\tmargin: 0 1rem;\n\t\t\t}\n\n\t\t\t.status {\n\t\t\t\tfont-weight: 200;\n\t\t\t\tfont-size: 3rem;\n\t\t\t\tline-height: 1;\n\t\t\t\tposition: relative;\n\t\t\t\ttop: -0.05rem;\n\t\t\t}\n\n\t\t\t.message {\n\t\t\t\tborder-left: 1px solid var(--divider);\n\t\t\t\tpadding: 0 0 0 1rem;\n\t\t\t\tmargin: 0 0 0 1rem;\n\t\t\t\tmin-height: 2.5rem;\n\t\t\t\tdisplay: flex;\n\t\t\t\talign-items: center;\n\t\t\t}\n\n\t\t\t.message h1 {\n\t\t\t\tfont-weight: 400;\n\t\t\t\tfont-size: 1em;\n\t\t\t\tmargin: 0;\n\t\t\t}\n\n\t\t\t@media (prefers-color-scheme: dark) {\n\t\t\t\tbody {\n\t\t\t\t\t--bg: #222;\n\t\t\t\t\t--fg: #ddd;\n\t\t\t\t\t--divider: #666;\n\t\t\t\t}\n\t\t\t}\n\t\t</style>\n\t</head>\n\t<body>\n\t\t<div class=\"error\">\n\t\t\t<span class=\"status\">" + status + "</span>\n\t\t\t<div class=\"message\">\n\t\t\t\t<h1>" + message + "</h1>\n\t\t\t</div>\n\t\t</div>\n\t</body>\n</html>\n"
|
|
||||||
},
|
|
||||||
version_hash: "oj9twc"
|
|
||||||
};
|
|
||||||
|
|
||||||
export async function get_hooks() {
|
|
||||||
let handle;
|
|
||||||
let handleFetch;
|
|
||||||
let handleError;
|
|
||||||
let handleValidationError;
|
|
||||||
let init;
|
|
||||||
|
|
||||||
|
|
||||||
let reroute;
|
|
||||||
let transport;
|
|
||||||
|
|
||||||
|
|
||||||
return {
|
|
||||||
handle,
|
|
||||||
handleFetch,
|
|
||||||
handleError,
|
|
||||||
handleValidationError,
|
|
||||||
init,
|
|
||||||
reroute,
|
|
||||||
transport
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
export { set_assets, set_building, set_manifest, set_prerendering, set_private_env, set_public_env, set_read_implementation };
|
|
||||||
44
frontend/.svelte-kit/non-ambient.d.ts
vendored
44
frontend/.svelte-kit/non-ambient.d.ts
vendored
@@ -1,44 +0,0 @@
|
|||||||
|
|
||||||
// this file is generated — do not edit it
|
|
||||||
|
|
||||||
|
|
||||||
declare module "svelte/elements" {
|
|
||||||
export interface HTMLAttributes<T> {
|
|
||||||
'data-sveltekit-keepfocus'?: true | '' | 'off' | undefined | null;
|
|
||||||
'data-sveltekit-noscroll'?: true | '' | 'off' | undefined | null;
|
|
||||||
'data-sveltekit-preload-code'?:
|
|
||||||
| true
|
|
||||||
| ''
|
|
||||||
| 'eager'
|
|
||||||
| 'viewport'
|
|
||||||
| 'hover'
|
|
||||||
| 'tap'
|
|
||||||
| 'off'
|
|
||||||
| undefined
|
|
||||||
| null;
|
|
||||||
'data-sveltekit-preload-data'?: true | '' | 'hover' | 'tap' | 'off' | undefined | null;
|
|
||||||
'data-sveltekit-reload'?: true | '' | 'off' | undefined | null;
|
|
||||||
'data-sveltekit-replacestate'?: true | '' | 'off' | undefined | null;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
export {};
|
|
||||||
|
|
||||||
|
|
||||||
declare module "$app/types" {
|
|
||||||
export interface AppTypes {
|
|
||||||
RouteId(): "/" | "/migration" | "/migration/mappings" | "/settings";
|
|
||||||
RouteParams(): {
|
|
||||||
|
|
||||||
};
|
|
||||||
LayoutParams(): {
|
|
||||||
"/": Record<string, never>;
|
|
||||||
"/migration": Record<string, never>;
|
|
||||||
"/migration/mappings": Record<string, never>;
|
|
||||||
"/settings": Record<string, never>
|
|
||||||
};
|
|
||||||
Pathname(): "/" | "/migration" | "/migration/" | "/migration/mappings" | "/migration/mappings/" | "/settings" | "/settings/";
|
|
||||||
ResolvedPathname(): `${"" | `/${string}`}${ReturnType<AppTypes['Pathname']>}`;
|
|
||||||
Asset(): string & {};
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1,162 +0,0 @@
|
|||||||
{
|
|
||||||
".svelte-kit/generated/client-optimized/app.js": {
|
|
||||||
"file": "_app/immutable/entry/app.BXnpILpp.js",
|
|
||||||
"name": "entry/app",
|
|
||||||
"src": ".svelte-kit/generated/client-optimized/app.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_cv2LK44M.js",
|
|
||||||
"_BxZpmA7Z.js",
|
|
||||||
"_vVxDbqKK.js"
|
|
||||||
],
|
|
||||||
"dynamicImports": [
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/0.js",
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/1.js",
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/2.js",
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/3.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/0.js": {
|
|
||||||
"file": "_app/immutable/nodes/0.DZdF_zz-.js",
|
|
||||||
"name": "nodes/0",
|
|
||||||
"src": ".svelte-kit/generated/client-optimized/nodes/0.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"isDynamicEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_cv2LK44M.js",
|
|
||||||
"_CRLlKr96.js",
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_xdjHc-A2.js",
|
|
||||||
"_DXE57cnx.js",
|
|
||||||
"_Dbod7Wv8.js"
|
|
||||||
],
|
|
||||||
"css": [
|
|
||||||
"_app/immutable/assets/0.RZHRvmcL.css"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/1.js": {
|
|
||||||
"file": "_app/immutable/nodes/1.Bh-fCbID.js",
|
|
||||||
"name": "nodes/1",
|
|
||||||
"src": ".svelte-kit/generated/client-optimized/nodes/1.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"isDynamicEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_cv2LK44M.js",
|
|
||||||
"_CRLlKr96.js",
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_DXE57cnx.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/2.js": {
|
|
||||||
"file": "_app/immutable/nodes/2.BmiXdPHI.js",
|
|
||||||
"name": "nodes/2",
|
|
||||||
"src": ".svelte-kit/generated/client-optimized/nodes/2.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"isDynamicEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_DyPeVqDG.js",
|
|
||||||
"_cv2LK44M.js",
|
|
||||||
"_CRLlKr96.js",
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_vVxDbqKK.js",
|
|
||||||
"_Dbod7Wv8.js",
|
|
||||||
"_BxZpmA7Z.js",
|
|
||||||
"_xdjHc-A2.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
".svelte-kit/generated/client-optimized/nodes/3.js": {
|
|
||||||
"file": "_app/immutable/nodes/3.guWMyWpk.js",
|
|
||||||
"name": "nodes/3",
|
|
||||||
"src": ".svelte-kit/generated/client-optimized/nodes/3.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"isDynamicEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_DyPeVqDG.js",
|
|
||||||
"_cv2LK44M.js",
|
|
||||||
"_CRLlKr96.js",
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_vVxDbqKK.js",
|
|
||||||
"_Dbod7Wv8.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_BtL0wB3H.js": {
|
|
||||||
"file": "_app/immutable/chunks/BtL0wB3H.js",
|
|
||||||
"name": "index"
|
|
||||||
},
|
|
||||||
"_BxZpmA7Z.js": {
|
|
||||||
"file": "_app/immutable/chunks/BxZpmA7Z.js",
|
|
||||||
"name": "index-client",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_CRLlKr96.js": {
|
|
||||||
"file": "_app/immutable/chunks/CRLlKr96.js",
|
|
||||||
"name": "legacy",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_D0iaTcAo.js": {
|
|
||||||
"file": "_app/immutable/chunks/D0iaTcAo.js",
|
|
||||||
"name": "entry",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_BxZpmA7Z.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_DXE57cnx.js": {
|
|
||||||
"file": "_app/immutable/chunks/DXE57cnx.js",
|
|
||||||
"name": "stores",
|
|
||||||
"imports": [
|
|
||||||
"_D0iaTcAo.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_Dbod7Wv8.js": {
|
|
||||||
"file": "_app/immutable/chunks/Dbod7Wv8.js",
|
|
||||||
"name": "toasts",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_DyPeVqDG.js": {
|
|
||||||
"file": "_app/immutable/chunks/DyPeVqDG.js",
|
|
||||||
"name": "api",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_Dbod7Wv8.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_cv2LK44M.js": {
|
|
||||||
"file": "_app/immutable/chunks/cv2LK44M.js",
|
|
||||||
"name": "disclose-version",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_vVxDbqKK.js": {
|
|
||||||
"file": "_app/immutable/chunks/vVxDbqKK.js",
|
|
||||||
"name": "props",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js",
|
|
||||||
"_cv2LK44M.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_xdjHc-A2.js": {
|
|
||||||
"file": "_app/immutable/chunks/xdjHc-A2.js",
|
|
||||||
"name": "class",
|
|
||||||
"imports": [
|
|
||||||
"_BtL0wB3H.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"node_modules/@sveltejs/kit/src/runtime/client/entry.js": {
|
|
||||||
"file": "_app/immutable/entry/start.BHAeOrfR.js",
|
|
||||||
"name": "entry/start",
|
|
||||||
"src": "node_modules/@sveltejs/kit/src/runtime/client/entry.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_D0iaTcAo.js"
|
|
||||||
]
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
{"version":"1766262590857"}
|
|
||||||
@@ -1 +0,0 @@
|
|||||||
export const env={"PUBLIC_WS_URL":"ws://localhost:8000"}
|
|
||||||
@@ -1,180 +0,0 @@
|
|||||||
{
|
|
||||||
".svelte-kit/generated/server/internal.js": {
|
|
||||||
"file": "internal.js",
|
|
||||||
"name": "internal",
|
|
||||||
"src": ".svelte-kit/generated/server/internal.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_internal.js",
|
|
||||||
"_environment.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_api.js": {
|
|
||||||
"file": "chunks/api.js",
|
|
||||||
"name": "api",
|
|
||||||
"imports": [
|
|
||||||
"_toasts.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_environment.js": {
|
|
||||||
"file": "chunks/environment.js",
|
|
||||||
"name": "environment"
|
|
||||||
},
|
|
||||||
"_equality.js": {
|
|
||||||
"file": "chunks/equality.js",
|
|
||||||
"name": "equality"
|
|
||||||
},
|
|
||||||
"_exports.js": {
|
|
||||||
"file": "chunks/exports.js",
|
|
||||||
"name": "exports"
|
|
||||||
},
|
|
||||||
"_false.js": {
|
|
||||||
"file": "chunks/false.js",
|
|
||||||
"name": "false"
|
|
||||||
},
|
|
||||||
"_index.js": {
|
|
||||||
"file": "chunks/index.js",
|
|
||||||
"name": "index",
|
|
||||||
"imports": [
|
|
||||||
"_equality.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_index2.js": {
|
|
||||||
"file": "chunks/index2.js",
|
|
||||||
"name": "index",
|
|
||||||
"imports": [
|
|
||||||
"_false.js",
|
|
||||||
"_equality.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_internal.js": {
|
|
||||||
"file": "chunks/internal.js",
|
|
||||||
"name": "internal",
|
|
||||||
"imports": [
|
|
||||||
"_index2.js",
|
|
||||||
"_equality.js",
|
|
||||||
"_environment.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_shared.js": {
|
|
||||||
"file": "chunks/shared.js",
|
|
||||||
"name": "shared",
|
|
||||||
"imports": [
|
|
||||||
"_utils.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_stores.js": {
|
|
||||||
"file": "chunks/stores.js",
|
|
||||||
"name": "stores",
|
|
||||||
"imports": [
|
|
||||||
"_index2.js",
|
|
||||||
"_exports.js",
|
|
||||||
"_utils.js",
|
|
||||||
"_equality.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_toasts.js": {
|
|
||||||
"file": "chunks/toasts.js",
|
|
||||||
"name": "toasts",
|
|
||||||
"imports": [
|
|
||||||
"_index.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"_utils.js": {
|
|
||||||
"file": "chunks/utils.js",
|
|
||||||
"name": "utils"
|
|
||||||
},
|
|
||||||
"node_modules/@sveltejs/kit/src/runtime/app/server/remote/index.js": {
|
|
||||||
"file": "remote-entry.js",
|
|
||||||
"name": "remote-entry",
|
|
||||||
"src": "node_modules/@sveltejs/kit/src/runtime/app/server/remote/index.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_shared.js",
|
|
||||||
"_false.js",
|
|
||||||
"_environment.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"node_modules/@sveltejs/kit/src/runtime/server/index.js": {
|
|
||||||
"file": "index.js",
|
|
||||||
"name": "index",
|
|
||||||
"src": "node_modules/@sveltejs/kit/src/runtime/server/index.js",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_false.js",
|
|
||||||
"_environment.js",
|
|
||||||
"_shared.js",
|
|
||||||
"_exports.js",
|
|
||||||
"_utils.js",
|
|
||||||
"_index.js",
|
|
||||||
"_internal.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/+error.svelte": {
|
|
||||||
"file": "entries/pages/_error.svelte.js",
|
|
||||||
"name": "entries/pages/_error.svelte",
|
|
||||||
"src": "src/routes/+error.svelte",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_index2.js",
|
|
||||||
"_stores.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/+layout.svelte": {
|
|
||||||
"file": "entries/pages/_layout.svelte.js",
|
|
||||||
"name": "entries/pages/_layout.svelte",
|
|
||||||
"src": "src/routes/+layout.svelte",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_index2.js",
|
|
||||||
"_stores.js",
|
|
||||||
"_toasts.js"
|
|
||||||
],
|
|
||||||
"css": [
|
|
||||||
"_app/immutable/assets/_layout.RZHRvmcL.css"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/+layout.ts": {
|
|
||||||
"file": "entries/pages/_layout.ts.js",
|
|
||||||
"name": "entries/pages/_layout.ts",
|
|
||||||
"src": "src/routes/+layout.ts",
|
|
||||||
"isEntry": true
|
|
||||||
},
|
|
||||||
"src/routes/+page.svelte": {
|
|
||||||
"file": "entries/pages/_page.svelte.js",
|
|
||||||
"name": "entries/pages/_page.svelte",
|
|
||||||
"src": "src/routes/+page.svelte",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_index2.js",
|
|
||||||
"_index.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/+page.ts": {
|
|
||||||
"file": "entries/pages/_page.ts.js",
|
|
||||||
"name": "entries/pages/_page.ts",
|
|
||||||
"src": "src/routes/+page.ts",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_api.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/settings/+page.svelte": {
|
|
||||||
"file": "entries/pages/settings/_page.svelte.js",
|
|
||||||
"name": "entries/pages/settings/_page.svelte",
|
|
||||||
"src": "src/routes/settings/+page.svelte",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_index2.js"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"src/routes/settings/+page.ts": {
|
|
||||||
"file": "entries/pages/settings/_page.ts.js",
|
|
||||||
"name": "entries/pages/settings/_page.ts",
|
|
||||||
"src": "src/routes/settings/+page.ts",
|
|
||||||
"isEntry": true,
|
|
||||||
"imports": [
|
|
||||||
"_api.js"
|
|
||||||
]
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1,77 +0,0 @@
|
|||||||
import { a as addToast } from "./toasts.js";
|
|
||||||
const API_BASE_URL = "/api";
|
|
||||||
async function fetchApi(endpoint) {
|
|
||||||
try {
|
|
||||||
console.log(`[api.fetchApi][Action] Fetching from context={{'endpoint': '${endpoint}'}}`);
|
|
||||||
const response = await fetch(`${API_BASE_URL}${endpoint}`);
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error(`API request failed with status ${response.status}`);
|
|
||||||
}
|
|
||||||
return await response.json();
|
|
||||||
} catch (error) {
|
|
||||||
console.error(`[api.fetchApi][Coherence:Failed] Error fetching from ${endpoint}:`, error);
|
|
||||||
addToast(error.message, "error");
|
|
||||||
throw error;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
async function postApi(endpoint, body) {
|
|
||||||
try {
|
|
||||||
console.log(`[api.postApi][Action] Posting to context={{'endpoint': '${endpoint}'}}`);
|
|
||||||
const response = await fetch(`${API_BASE_URL}${endpoint}`, {
|
|
||||||
method: "POST",
|
|
||||||
headers: {
|
|
||||||
"Content-Type": "application/json"
|
|
||||||
},
|
|
||||||
body: JSON.stringify(body)
|
|
||||||
});
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error(`API request failed with status ${response.status}`);
|
|
||||||
}
|
|
||||||
return await response.json();
|
|
||||||
} catch (error) {
|
|
||||||
console.error(`[api.postApi][Coherence:Failed] Error posting to ${endpoint}:`, error);
|
|
||||||
addToast(error.message, "error");
|
|
||||||
throw error;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
async function requestApi(endpoint, method = "GET", body = null) {
|
|
||||||
try {
|
|
||||||
console.log(`[api.requestApi][Action] ${method} to context={{'endpoint': '${endpoint}'}}`);
|
|
||||||
const options = {
|
|
||||||
method,
|
|
||||||
headers: {
|
|
||||||
"Content-Type": "application/json"
|
|
||||||
}
|
|
||||||
};
|
|
||||||
if (body) {
|
|
||||||
options.body = JSON.stringify(body);
|
|
||||||
}
|
|
||||||
const response = await fetch(`${API_BASE_URL}${endpoint}`, options);
|
|
||||||
if (!response.ok) {
|
|
||||||
const errorData = await response.json().catch(() => ({}));
|
|
||||||
throw new Error(errorData.detail || `API request failed with status ${response.status}`);
|
|
||||||
}
|
|
||||||
return await response.json();
|
|
||||||
} catch (error) {
|
|
||||||
console.error(`[api.requestApi][Coherence:Failed] Error ${method} to ${endpoint}:`, error);
|
|
||||||
addToast(error.message, "error");
|
|
||||||
throw error;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
const api = {
|
|
||||||
getPlugins: () => fetchApi("/plugins/"),
|
|
||||||
getTasks: () => fetchApi("/tasks/"),
|
|
||||||
getTask: (taskId) => fetchApi(`/tasks/${taskId}`),
|
|
||||||
createTask: (pluginId, params) => postApi("/tasks/", { plugin_id: pluginId, params }),
|
|
||||||
// Settings
|
|
||||||
getSettings: () => fetchApi("/settings/"),
|
|
||||||
updateGlobalSettings: (settings) => requestApi("/settings/global", "PATCH", settings),
|
|
||||||
getEnvironments: () => fetchApi("/settings/environments"),
|
|
||||||
addEnvironment: (env) => postApi("/settings/environments", env),
|
|
||||||
updateEnvironment: (id, env) => requestApi(`/settings/environments/${id}`, "PUT", env),
|
|
||||||
deleteEnvironment: (id) => requestApi(`/settings/environments/${id}`, "DELETE"),
|
|
||||||
testEnvironmentConnection: (id) => postApi(`/settings/environments/${id}/test`, {})
|
|
||||||
};
|
|
||||||
export {
|
|
||||||
api as a
|
|
||||||
};
|
|
||||||
@@ -1,34 +0,0 @@
|
|||||||
let base = "";
|
|
||||||
let assets = base;
|
|
||||||
const app_dir = "_app";
|
|
||||||
const relative = true;
|
|
||||||
const initial = { base, assets };
|
|
||||||
function override(paths) {
|
|
||||||
base = paths.base;
|
|
||||||
assets = paths.assets;
|
|
||||||
}
|
|
||||||
function reset() {
|
|
||||||
base = initial.base;
|
|
||||||
assets = initial.assets;
|
|
||||||
}
|
|
||||||
function set_assets(path) {
|
|
||||||
assets = initial.assets = path;
|
|
||||||
}
|
|
||||||
let prerendering = false;
|
|
||||||
function set_building() {
|
|
||||||
}
|
|
||||||
function set_prerendering() {
|
|
||||||
prerendering = true;
|
|
||||||
}
|
|
||||||
export {
|
|
||||||
assets as a,
|
|
||||||
base as b,
|
|
||||||
app_dir as c,
|
|
||||||
reset as d,
|
|
||||||
set_building as e,
|
|
||||||
set_prerendering as f,
|
|
||||||
override as o,
|
|
||||||
prerendering as p,
|
|
||||||
relative as r,
|
|
||||||
set_assets as s
|
|
||||||
};
|
|
||||||
@@ -1,51 +0,0 @@
var is_array = Array.isArray;
var index_of = Array.prototype.indexOf;
var array_from = Array.from;
var define_property = Object.defineProperty;
var get_descriptor = Object.getOwnPropertyDescriptor;
var object_prototype = Object.prototype;
var array_prototype = Array.prototype;
var get_prototype_of = Object.getPrototypeOf;
var is_extensible = Object.isExtensible;
const noop = () => {
};
function run_all(arr) {
  for (var i = 0; i < arr.length; i++) {
    arr[i]();
  }
}
function deferred() {
  var resolve;
  var reject;
  var promise = new Promise((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}
function equals(value) {
  return value === this.v;
}
function safe_not_equal(a, b) {
  return a != a ? b == b : a !== b || a !== null && typeof a === "object" || typeof a === "function";
}
function safe_equals(value) {
  return !safe_not_equal(value, this.v);
}
export {
  array_from as a,
  deferred as b,
  array_prototype as c,
  define_property as d,
  equals as e,
  get_prototype_of as f,
  get_descriptor as g,
  is_extensible as h,
  is_array as i,
  index_of as j,
  safe_not_equal as k,
  noop as n,
  object_prototype as o,
  run_all as r,
  safe_equals as s
};
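The equality helpers above drive store change detection; a few illustrative checks follow (values are examples, not from the file). Primitives compare by value, NaN counts as equal to itself, and objects or functions are always treated as changed so in-place mutations still propagate.

// Illustrative behavior of safe_not_equal:
const obj = { a: 1 };
safe_not_equal(1, 1);       // false: identical primitives
safe_not_equal(NaN, NaN);   // false: NaN is considered unchanged here
safe_not_equal(obj, obj);   // true: any object counts as changed
safe_not_equal(noop, noop); // true: functions likewise always count as changed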
@@ -1,174 +0,0 @@
const SCHEME = /^[a-z][a-z\d+\-.]+:/i;
const internal = new URL("sveltekit-internal://");
function resolve(base, path) {
  if (path[0] === "/" && path[1] === "/") return path;
  let url = new URL(base, internal);
  url = new URL(path, url);
  return url.protocol === internal.protocol ? url.pathname + url.search + url.hash : url.href;
}
function normalize_path(path, trailing_slash) {
  if (path === "/" || trailing_slash === "ignore") return path;
  if (trailing_slash === "never") {
    return path.endsWith("/") ? path.slice(0, -1) : path;
  } else if (trailing_slash === "always" && !path.endsWith("/")) {
    return path + "/";
  }
  return path;
}
function decode_pathname(pathname) {
  return pathname.split("%25").map(decodeURI).join("%25");
}
function decode_params(params) {
  for (const key in params) {
    params[key] = decodeURIComponent(params[key]);
  }
  return params;
}
function make_trackable(url, callback, search_params_callback, allow_hash = false) {
  const tracked = new URL(url);
  Object.defineProperty(tracked, "searchParams", {
    value: new Proxy(tracked.searchParams, {
      get(obj, key) {
        if (key === "get" || key === "getAll" || key === "has") {
          return (param) => {
            search_params_callback(param);
            return obj[key](param);
          };
        }
        callback();
        const value = Reflect.get(obj, key);
        return typeof value === "function" ? value.bind(obj) : value;
      }
    }),
    enumerable: true,
    configurable: true
  });
  const tracked_url_properties = ["href", "pathname", "search", "toString", "toJSON"];
  if (allow_hash) tracked_url_properties.push("hash");
  for (const property of tracked_url_properties) {
    Object.defineProperty(tracked, property, {
      get() {
        callback();
        return url[property];
      },
      enumerable: true,
      configurable: true
    });
  }
  {
    tracked[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
      return inspect(url, opts);
    };
    tracked.searchParams[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
      return inspect(url.searchParams, opts);
    };
  }
  if (!allow_hash) {
    disable_hash(tracked);
  }
  return tracked;
}
function disable_hash(url) {
  allow_nodejs_console_log(url);
  Object.defineProperty(url, "hash", {
    get() {
      throw new Error(
        "Cannot access event.url.hash. Consider using `page.url.hash` inside a component instead"
      );
    }
  });
}
function disable_search(url) {
  allow_nodejs_console_log(url);
  for (const property of ["search", "searchParams"]) {
    Object.defineProperty(url, property, {
      get() {
        throw new Error(`Cannot access url.${property} on a page with prerendering enabled`);
      }
    });
  }
}
function allow_nodejs_console_log(url) {
  {
    url[/* @__PURE__ */ Symbol.for("nodejs.util.inspect.custom")] = (depth, opts, inspect) => {
      return inspect(new URL(url), opts);
    };
  }
}
function validator(expected) {
  function validate(module, file) {
    if (!module) return;
    for (const key in module) {
      if (key[0] === "_" || expected.has(key)) continue;
      const values = [...expected.values()];
      const hint = hint_for_supported_files(key, file?.slice(file.lastIndexOf("."))) ?? `valid exports are ${values.join(", ")}, or anything with a '_' prefix`;
      throw new Error(`Invalid export '${key}'${file ? ` in ${file}` : ""} (${hint})`);
    }
  }
  return validate;
}
function hint_for_supported_files(key, ext = ".js") {
  const supported_files = [];
  if (valid_layout_exports.has(key)) {
    supported_files.push(`+layout${ext}`);
  }
  if (valid_page_exports.has(key)) {
    supported_files.push(`+page${ext}`);
  }
  if (valid_layout_server_exports.has(key)) {
    supported_files.push(`+layout.server${ext}`);
  }
  if (valid_page_server_exports.has(key)) {
    supported_files.push(`+page.server${ext}`);
  }
  if (valid_server_exports.has(key)) {
    supported_files.push(`+server${ext}`);
  }
  if (supported_files.length > 0) {
    return `'${key}' is a valid export in ${supported_files.slice(0, -1).join(", ")}${supported_files.length > 1 ? " or " : ""}${supported_files.at(-1)}`;
  }
}
const valid_layout_exports = /* @__PURE__ */ new Set([
  "load",
  "prerender",
  "csr",
  "ssr",
  "trailingSlash",
  "config"
]);
const valid_page_exports = /* @__PURE__ */ new Set([...valid_layout_exports, "entries"]);
const valid_layout_server_exports = /* @__PURE__ */ new Set([...valid_layout_exports]);
const valid_page_server_exports = /* @__PURE__ */ new Set([...valid_layout_server_exports, "actions", "entries"]);
const valid_server_exports = /* @__PURE__ */ new Set([
  "GET",
  "POST",
  "PATCH",
  "PUT",
  "DELETE",
  "OPTIONS",
  "HEAD",
  "fallback",
  "prerender",
  "trailingSlash",
  "config",
  "entries"
]);
const validate_layout_exports = validator(valid_layout_exports);
const validate_page_exports = validator(valid_page_exports);
const validate_layout_server_exports = validator(valid_layout_server_exports);
const validate_page_server_exports = validator(valid_page_server_exports);
const validate_server_exports = validator(valid_server_exports);
export {
  SCHEME as S,
  decode_params as a,
  validate_layout_exports as b,
  validate_page_server_exports as c,
  disable_search as d,
  validate_page_exports as e,
  decode_pathname as f,
  validate_server_exports as g,
  make_trackable as m,
  normalize_path as n,
  resolve as r,
  validate_layout_server_exports as v
};
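A few illustrative calls for the deleted URL helpers, with results worked out from the logic above (assumed, not quoted from documentation):

resolve("/docs/intro", "../api");      // "/api": resolved against the internal scheme, so only path + search + hash survive
normalize_path("/blog/", "never");     // "/blog"
normalize_path("/blog", "always");     // "/blog/"
decode_pathname("/caf%C3%A9%25sign");  // "/café%25sign": decodes URI escapes but preserves literal "%25"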
@@ -1,4 +0,0 @@
const BROWSER = false;
export {
  BROWSER as B
};
@@ -1,59 +0,0 @@
import { n as noop, k as safe_not_equal } from "./equality.js";
import "clsx";
const subscriber_queue = [];
function readable(value, start) {
  return {
    subscribe: writable(value, start).subscribe
  };
}
function writable(value, start = noop) {
  let stop = null;
  const subscribers = /* @__PURE__ */ new Set();
  function set(new_value) {
    if (safe_not_equal(value, new_value)) {
      value = new_value;
      if (stop) {
        const run_queue = !subscriber_queue.length;
        for (const subscriber of subscribers) {
          subscriber[1]();
          subscriber_queue.push(subscriber, value);
        }
        if (run_queue) {
          for (let i = 0; i < subscriber_queue.length; i += 2) {
            subscriber_queue[i][0](subscriber_queue[i + 1]);
          }
          subscriber_queue.length = 0;
        }
      }
    }
  }
  function update(fn) {
    set(fn(
      /** @type {T} */
      value
    ));
  }
  function subscribe(run, invalidate = noop) {
    const subscriber = [run, invalidate];
    subscribers.add(subscriber);
    if (subscribers.size === 1) {
      stop = start(set, update) || noop;
    }
    run(
      /** @type {T} */
      value
    );
    return () => {
      subscribers.delete(subscriber);
      if (subscribers.size === 0 && stop) {
        stop();
        stop = null;
      }
    };
  }
  return { set, update, subscribe };
}
export {
  readable as r,
  writable as w
};
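A minimal subscriber sketch for the store implementation above; the behavior is inferred from the code (the callback runs once on subscribe and again for every value that passes safe_not_equal):

const count = writable(0);
const unsubscribe = count.subscribe((value) => {
  console.log("count is now", value); // fires immediately with 0, then on each accepted change
});
count.set(1);               // notifies: 0 !== 1
count.update((n) => n + 1); // notifies with 2
count.set(2);               // no notification: same primitive value
unsubscribe();              // last subscriber removed, so the optional start/stop callback is stopped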
File diff suppressed because it is too large
@@ -1,982 +0,0 @@
import { H as HYDRATION_ERROR, C as COMMENT_NODE, a as HYDRATION_END, g as get_next_sibling, b as HYDRATION_START, c as HYDRATION_START_ELSE, e as effect_tracking, d as get, s as source, r as render_effect, u as untrack, i as increment, q as queue_micro_task, f as active_effect, h as block, j as branch, B as Batch, p as pause_effect, k as create_text, l as set_active_effect, m as set_active_reaction, n as set_component_context, o as handle_error, t as active_reaction, v as component_context, w as move_effect, x as internal_set, y as destroy_effect, z as invoke_error_boundary, A as svelte_boundary_reset_onerror, E as EFFECT_TRANSPARENT, D as EFFECT_PRESERVED, F as BOUNDARY_EFFECT, G as init_operations, I as get_first_child, J as hydration_failed, K as clear_text_content, L as component_root, M as is_passive_event, N as push, O as pop, P as set, Q as LEGACY_PROPS, R as flushSync, S as mutable_source, T as render, U as setContext } from "./index2.js";
import { d as define_property, a as array_from } from "./equality.js";
import "clsx";
import "./environment.js";
let public_env = {};
function set_private_env(environment) {
}
function set_public_env(environment) {
  public_env = environment;
}
function hydration_mismatch(location) {
  {
    console.warn(`https://svelte.dev/e/hydration_mismatch`);
  }
}
function svelte_boundary_reset_noop() {
  {
    console.warn(`https://svelte.dev/e/svelte_boundary_reset_noop`);
  }
}
let hydrating = false;
function set_hydrating(value) {
  hydrating = value;
}
let hydrate_node;
function set_hydrate_node(node) {
  if (node === null) {
    hydration_mismatch();
    throw HYDRATION_ERROR;
  }
  return hydrate_node = node;
}
function hydrate_next() {
  return set_hydrate_node(get_next_sibling(hydrate_node));
}
function next(count = 1) {
  if (hydrating) {
    var i = count;
    var node = hydrate_node;
    while (i--) {
      node = /** @type {TemplateNode} */
      get_next_sibling(node);
    }
    hydrate_node = node;
  }
}
function skip_nodes(remove = true) {
  var depth = 0;
  var node = hydrate_node;
  while (true) {
    if (node.nodeType === COMMENT_NODE) {
      var data = (
        /** @type {Comment} */
        node.data
      );
      if (data === HYDRATION_END) {
        if (depth === 0) return node;
        depth -= 1;
      } else if (data === HYDRATION_START || data === HYDRATION_START_ELSE) {
        depth += 1;
      }
    }
    var next2 = (
      /** @type {TemplateNode} */
      get_next_sibling(node)
    );
    if (remove) node.remove();
    node = next2;
  }
}
function createSubscriber(start) {
  let subscribers = 0;
  let version = source(0);
  let stop;
  return () => {
    if (effect_tracking()) {
      get(version);
      render_effect(() => {
        if (subscribers === 0) {
          stop = untrack(() => start(() => increment(version)));
        }
        subscribers += 1;
        return () => {
          queue_micro_task(() => {
            subscribers -= 1;
            if (subscribers === 0) {
              stop?.();
              stop = void 0;
              increment(version);
            }
          });
        };
      });
    }
  };
}
var flags = EFFECT_TRANSPARENT | EFFECT_PRESERVED | BOUNDARY_EFFECT;
function boundary(node, props, children) {
  new Boundary(node, props, children);
}
class Boundary {
  /** @type {Boundary | null} */
  parent;
  #pending = false;
  /** @type {TemplateNode} */
  #anchor;
  /** @type {TemplateNode | null} */
  #hydrate_open = hydrating ? hydrate_node : null;
  /** @type {BoundaryProps} */
  #props;
  /** @type {((anchor: Node) => void)} */
  #children;
  /** @type {Effect} */
  #effect;
  /** @type {Effect | null} */
  #main_effect = null;
  /** @type {Effect | null} */
  #pending_effect = null;
  /** @type {Effect | null} */
  #failed_effect = null;
  /** @type {DocumentFragment | null} */
  #offscreen_fragment = null;
  /** @type {TemplateNode | null} */
  #pending_anchor = null;
  #local_pending_count = 0;
  #pending_count = 0;
  #is_creating_fallback = false;
  /**
   * A source containing the number of pending async deriveds/expressions.
   * Only created if `$effect.pending()` is used inside the boundary,
   * otherwise updating the source results in needless `Batch.ensure()`
   * calls followed by no-op flushes
   * @type {Source<number> | null}
   */
  #effect_pending = null;
  #effect_pending_subscriber = createSubscriber(() => {
    this.#effect_pending = source(this.#local_pending_count);
    return () => {
      this.#effect_pending = null;
    };
  });
  /**
   * @param {TemplateNode} node
   * @param {BoundaryProps} props
   * @param {((anchor: Node) => void)} children
   */
  constructor(node, props, children) {
    this.#anchor = node;
    this.#props = props;
    this.#children = children;
    this.parent = /** @type {Effect} */
    active_effect.b;
    this.#pending = !!this.#props.pending;
    this.#effect = block(() => {
      active_effect.b = this;
      if (hydrating) {
        const comment = this.#hydrate_open;
        hydrate_next();
        const server_rendered_pending = (
          /** @type {Comment} */
          comment.nodeType === COMMENT_NODE && /** @type {Comment} */
          comment.data === HYDRATION_START_ELSE
        );
        if (server_rendered_pending) {
          this.#hydrate_pending_content();
        } else {
          this.#hydrate_resolved_content();
        }
      } else {
        var anchor = this.#get_anchor();
        try {
          this.#main_effect = branch(() => children(anchor));
        } catch (error) {
          this.error(error);
        }
        if (this.#pending_count > 0) {
          this.#show_pending_snippet();
        } else {
          this.#pending = false;
        }
      }
      return () => {
        this.#pending_anchor?.remove();
      };
    }, flags);
    if (hydrating) {
      this.#anchor = hydrate_node;
    }
  }
  #hydrate_resolved_content() {
    try {
      this.#main_effect = branch(() => this.#children(this.#anchor));
    } catch (error) {
      this.error(error);
    }
    this.#pending = false;
  }
  #hydrate_pending_content() {
    const pending = this.#props.pending;
    if (!pending) {
      return;
    }
    this.#pending_effect = branch(() => pending(this.#anchor));
    Batch.enqueue(() => {
      var anchor = this.#get_anchor();
      this.#main_effect = this.#run(() => {
        Batch.ensure();
        return branch(() => this.#children(anchor));
      });
      if (this.#pending_count > 0) {
        this.#show_pending_snippet();
      } else {
        pause_effect(
          /** @type {Effect} */
          this.#pending_effect,
          () => {
            this.#pending_effect = null;
          }
        );
        this.#pending = false;
      }
    });
  }
  #get_anchor() {
    var anchor = this.#anchor;
    if (this.#pending) {
      this.#pending_anchor = create_text();
      this.#anchor.before(this.#pending_anchor);
      anchor = this.#pending_anchor;
    }
    return anchor;
  }
  /**
   * Returns `true` if the effect exists inside a boundary whose pending snippet is shown
   * @returns {boolean}
   */
  is_pending() {
    return this.#pending || !!this.parent && this.parent.is_pending();
  }
  has_pending_snippet() {
    return !!this.#props.pending;
  }
  /**
   * @param {() => Effect | null} fn
   */
  #run(fn) {
    var previous_effect = active_effect;
    var previous_reaction = active_reaction;
    var previous_ctx = component_context;
    set_active_effect(this.#effect);
    set_active_reaction(this.#effect);
    set_component_context(this.#effect.ctx);
    try {
      return fn();
    } catch (e) {
      handle_error(e);
      return null;
    } finally {
      set_active_effect(previous_effect);
      set_active_reaction(previous_reaction);
      set_component_context(previous_ctx);
    }
  }
  #show_pending_snippet() {
    const pending = (
      /** @type {(anchor: Node) => void} */
      this.#props.pending
    );
    if (this.#main_effect !== null) {
      this.#offscreen_fragment = document.createDocumentFragment();
      this.#offscreen_fragment.append(
        /** @type {TemplateNode} */
        this.#pending_anchor
      );
      move_effect(this.#main_effect, this.#offscreen_fragment);
    }
    if (this.#pending_effect === null) {
      this.#pending_effect = branch(() => pending(this.#anchor));
    }
  }
  /**
   * Updates the pending count associated with the currently visible pending snippet,
   * if any, such that we can replace the snippet with content once work is done
   * @param {1 | -1} d
   */
  #update_pending_count(d) {
    if (!this.has_pending_snippet()) {
      if (this.parent) {
        this.parent.#update_pending_count(d);
      }
      return;
    }
    this.#pending_count += d;
    if (this.#pending_count === 0) {
      this.#pending = false;
      if (this.#pending_effect) {
        pause_effect(this.#pending_effect, () => {
          this.#pending_effect = null;
        });
      }
      if (this.#offscreen_fragment) {
        this.#anchor.before(this.#offscreen_fragment);
        this.#offscreen_fragment = null;
      }
    }
  }
  /**
   * Update the source that powers `$effect.pending()` inside this boundary,
   * and controls when the current `pending` snippet (if any) is removed.
   * Do not call from inside the class
   * @param {1 | -1} d
   */
  update_pending_count(d) {
    this.#update_pending_count(d);
    this.#local_pending_count += d;
    if (this.#effect_pending) {
      internal_set(this.#effect_pending, this.#local_pending_count);
    }
  }
  get_effect_pending() {
    this.#effect_pending_subscriber();
    return get(
      /** @type {Source<number>} */
      this.#effect_pending
    );
  }
  /** @param {unknown} error */
  error(error) {
    var onerror = this.#props.onerror;
    let failed = this.#props.failed;
    if (this.#is_creating_fallback || !onerror && !failed) {
      throw error;
    }
    if (this.#main_effect) {
      destroy_effect(this.#main_effect);
      this.#main_effect = null;
    }
    if (this.#pending_effect) {
      destroy_effect(this.#pending_effect);
      this.#pending_effect = null;
    }
    if (this.#failed_effect) {
      destroy_effect(this.#failed_effect);
      this.#failed_effect = null;
    }
    if (hydrating) {
      set_hydrate_node(
        /** @type {TemplateNode} */
        this.#hydrate_open
      );
      next();
      set_hydrate_node(skip_nodes());
    }
    var did_reset = false;
    var calling_on_error = false;
    const reset = () => {
      if (did_reset) {
        svelte_boundary_reset_noop();
        return;
      }
      did_reset = true;
      if (calling_on_error) {
        svelte_boundary_reset_onerror();
      }
      Batch.ensure();
      this.#local_pending_count = 0;
      if (this.#failed_effect !== null) {
        pause_effect(this.#failed_effect, () => {
          this.#failed_effect = null;
        });
      }
      this.#pending = this.has_pending_snippet();
      this.#main_effect = this.#run(() => {
        this.#is_creating_fallback = false;
        return branch(() => this.#children(this.#anchor));
      });
      if (this.#pending_count > 0) {
        this.#show_pending_snippet();
      } else {
        this.#pending = false;
      }
    };
    var previous_reaction = active_reaction;
    try {
      set_active_reaction(null);
      calling_on_error = true;
      onerror?.(error, reset);
      calling_on_error = false;
    } catch (error2) {
      invoke_error_boundary(error2, this.#effect && this.#effect.parent);
    } finally {
      set_active_reaction(previous_reaction);
    }
    if (failed) {
      queue_micro_task(() => {
        this.#failed_effect = this.#run(() => {
          Batch.ensure();
          this.#is_creating_fallback = true;
          try {
            return branch(() => {
              failed(
                this.#anchor,
                () => error,
                () => reset
              );
            });
          } catch (error2) {
            invoke_error_boundary(
              error2,
              /** @type {Effect} */
              this.#effect.parent
            );
            return null;
          } finally {
            this.#is_creating_fallback = false;
          }
        });
      });
    }
  }
}
const all_registered_events = /* @__PURE__ */ new Set();
const root_event_handles = /* @__PURE__ */ new Set();
let last_propagated_event = null;
function handle_event_propagation(event) {
  var handler_element = this;
  var owner_document = (
    /** @type {Node} */
    handler_element.ownerDocument
  );
  var event_name = event.type;
  var path = event.composedPath?.() || [];
  var current_target = (
    /** @type {null | Element} */
    path[0] || event.target
  );
  last_propagated_event = event;
  var path_idx = 0;
  var handled_at = last_propagated_event === event && event.__root;
  if (handled_at) {
    var at_idx = path.indexOf(handled_at);
    if (at_idx !== -1 && (handler_element === document || handler_element === /** @type {any} */
    window)) {
      event.__root = handler_element;
      return;
    }
    var handler_idx = path.indexOf(handler_element);
    if (handler_idx === -1) {
      return;
    }
    if (at_idx <= handler_idx) {
      path_idx = at_idx;
    }
  }
  current_target = /** @type {Element} */
  path[path_idx] || event.target;
  if (current_target === handler_element) return;
  define_property(event, "currentTarget", {
    configurable: true,
    get() {
      return current_target || owner_document;
    }
  });
  var previous_reaction = active_reaction;
  var previous_effect = active_effect;
  set_active_reaction(null);
  set_active_effect(null);
  try {
    var throw_error;
    var other_errors = [];
    while (current_target !== null) {
      var parent_element = current_target.assignedSlot || current_target.parentNode || /** @type {any} */
      current_target.host || null;
      try {
        var delegated = current_target["__" + event_name];
        if (delegated != null && (!/** @type {any} */
        current_target.disabled || // DOM could've been updated already by the time this is reached, so we check this as well
        // -> the target could not have been disabled because it emits the event in the first place
        event.target === current_target)) {
          delegated.call(current_target, event);
        }
      } catch (error) {
        if (throw_error) {
          other_errors.push(error);
        } else {
          throw_error = error;
        }
      }
      if (event.cancelBubble || parent_element === handler_element || parent_element === null) {
        break;
      }
      current_target = parent_element;
    }
    if (throw_error) {
      for (let error of other_errors) {
        queueMicrotask(() => {
          throw error;
        });
      }
      throw throw_error;
    }
  } finally {
    event.__root = handler_element;
    delete event.currentTarget;
    set_active_reaction(previous_reaction);
    set_active_effect(previous_effect);
  }
}
function assign_nodes(start, end) {
  var effect = (
    /** @type {Effect} */
    active_effect
  );
  if (effect.nodes === null) {
    effect.nodes = { start, end, a: null, t: null };
  }
}
function mount(component, options2) {
  return _mount(component, options2);
}
function hydrate(component, options2) {
  init_operations();
  options2.intro = options2.intro ?? false;
  const target = options2.target;
  const was_hydrating = hydrating;
  const previous_hydrate_node = hydrate_node;
  try {
    var anchor = get_first_child(target);
    while (anchor && (anchor.nodeType !== COMMENT_NODE || /** @type {Comment} */
    anchor.data !== HYDRATION_START)) {
      anchor = get_next_sibling(anchor);
    }
    if (!anchor) {
      throw HYDRATION_ERROR;
    }
    set_hydrating(true);
    set_hydrate_node(
      /** @type {Comment} */
      anchor
    );
    const instance = _mount(component, { ...options2, anchor });
    set_hydrating(false);
    return (
      /** @type {Exports} */
      instance
    );
  } catch (error) {
    if (error instanceof Error && error.message.split("\n").some((line) => line.startsWith("https://svelte.dev/e/"))) {
      throw error;
    }
    if (error !== HYDRATION_ERROR) {
      console.warn("Failed to hydrate: ", error);
    }
    if (options2.recover === false) {
      hydration_failed();
    }
    init_operations();
    clear_text_content(target);
    set_hydrating(false);
    return mount(component, options2);
  } finally {
    set_hydrating(was_hydrating);
    set_hydrate_node(previous_hydrate_node);
  }
}
const document_listeners = /* @__PURE__ */ new Map();
function _mount(Component, { target, anchor, props = {}, events, context, intro = true }) {
  init_operations();
  var registered_events = /* @__PURE__ */ new Set();
  var event_handle = (events2) => {
    for (var i = 0; i < events2.length; i++) {
      var event_name = events2[i];
      if (registered_events.has(event_name)) continue;
      registered_events.add(event_name);
      var passive = is_passive_event(event_name);
      target.addEventListener(event_name, handle_event_propagation, { passive });
      var n = document_listeners.get(event_name);
      if (n === void 0) {
        document.addEventListener(event_name, handle_event_propagation, { passive });
        document_listeners.set(event_name, 1);
      } else {
        document_listeners.set(event_name, n + 1);
      }
    }
  };
  event_handle(array_from(all_registered_events));
  root_event_handles.add(event_handle);
  var component = void 0;
  var unmount2 = component_root(() => {
    var anchor_node = anchor ?? target.appendChild(create_text());
    boundary(
      /** @type {TemplateNode} */
      anchor_node,
      {
        pending: () => {
        }
      },
      (anchor_node2) => {
        if (context) {
          push({});
          var ctx = (
            /** @type {ComponentContext} */
            component_context
          );
          ctx.c = context;
        }
        if (events) {
          props.$$events = events;
        }
        if (hydrating) {
          assign_nodes(
            /** @type {TemplateNode} */
            anchor_node2,
            null
          );
        }
        component = Component(anchor_node2, props) || {};
        if (hydrating) {
          active_effect.nodes.end = hydrate_node;
          if (hydrate_node === null || hydrate_node.nodeType !== COMMENT_NODE || /** @type {Comment} */
          hydrate_node.data !== HYDRATION_END) {
            hydration_mismatch();
            throw HYDRATION_ERROR;
          }
        }
        if (context) {
          pop();
        }
      }
    );
    return () => {
      for (var event_name of registered_events) {
        target.removeEventListener(event_name, handle_event_propagation);
        var n = (
          /** @type {number} */
          document_listeners.get(event_name)
        );
        if (--n === 0) {
          document.removeEventListener(event_name, handle_event_propagation);
          document_listeners.delete(event_name);
        } else {
          document_listeners.set(event_name, n);
        }
      }
      root_event_handles.delete(event_handle);
      if (anchor_node !== anchor) {
        anchor_node.parentNode?.removeChild(anchor_node);
      }
    };
  });
  mounted_components.set(component, unmount2);
  return component;
}
let mounted_components = /* @__PURE__ */ new WeakMap();
function unmount(component, options2) {
  const fn = mounted_components.get(component);
  if (fn) {
    mounted_components.delete(component);
    return fn(options2);
  }
  return Promise.resolve();
}
function asClassComponent$1(component) {
  return class extends Svelte4Component {
    /** @param {any} options */
    constructor(options2) {
      super({
        component,
        ...options2
      });
    }
  };
}
class Svelte4Component {
  /** @type {any} */
  #events;
  /** @type {Record<string, any>} */
  #instance;
  /**
   * @param {ComponentConstructorOptions & {
   *  component: any;
   * }} options
   */
  constructor(options2) {
    var sources = /* @__PURE__ */ new Map();
    var add_source = (key, value) => {
      var s = mutable_source(value, false, false);
      sources.set(key, s);
      return s;
    };
    const props = new Proxy(
      { ...options2.props || {}, $$events: {} },
      {
        get(target, prop) {
          return get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop)));
        },
        has(target, prop) {
          if (prop === LEGACY_PROPS) return true;
          get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop)));
          return Reflect.has(target, prop);
        },
        set(target, prop, value) {
          set(sources.get(prop) ?? add_source(prop, value), value);
          return Reflect.set(target, prop, value);
        }
      }
    );
    this.#instance = (options2.hydrate ? hydrate : mount)(options2.component, {
      target: options2.target,
      anchor: options2.anchor,
      props,
      context: options2.context,
      intro: options2.intro ?? false,
      recover: options2.recover
    });
    if (!options2?.props?.$$host || options2.sync === false) {
      flushSync();
    }
    this.#events = props.$$events;
    for (const key of Object.keys(this.#instance)) {
      if (key === "$set" || key === "$destroy" || key === "$on") continue;
      define_property(this, key, {
        get() {
          return this.#instance[key];
        },
        /** @param {any} value */
        set(value) {
          this.#instance[key] = value;
        },
        enumerable: true
      });
    }
    this.#instance.$set = /** @param {Record<string, any>} next */
    (next2) => {
      Object.assign(props, next2);
    };
    this.#instance.$destroy = () => {
      unmount(this.#instance);
    };
  }
  /** @param {Record<string, any>} props */
  $set(props) {
    this.#instance.$set(props);
  }
  /**
   * @param {string} event
   * @param {(...args: any[]) => any} callback
   * @returns {any}
   */
  $on(event, callback) {
    this.#events[event] = this.#events[event] || [];
    const cb = (...args) => callback.call(this, ...args);
    this.#events[event].push(cb);
    return () => {
      this.#events[event] = this.#events[event].filter(
        /** @param {any} fn */
        (fn) => fn !== cb
      );
    };
  }
  $destroy() {
    this.#instance.$destroy();
  }
}
let read_implementation = null;
function set_read_implementation(fn) {
  read_implementation = fn;
}
function set_manifest(_) {
}
function asClassComponent(component) {
  const component_constructor = asClassComponent$1(component);
  const _render = (props, { context, csp } = {}) => {
    const result = render(component, { props, context, csp });
    const munged = Object.defineProperties(
      /** @type {LegacyRenderResult & PromiseLike<LegacyRenderResult>} */
      {},
      {
        css: {
          value: { code: "", map: null }
        },
        head: {
          get: () => result.head
        },
        html: {
          get: () => result.body
        },
        then: {
          /**
           * this is not type-safe, but honestly it's the best I can do right now, and it's a straightforward function.
           *
           * @template TResult1
           * @template [TResult2=never]
           * @param { (value: LegacyRenderResult) => TResult1 } onfulfilled
           * @param { (reason: unknown) => TResult2 } onrejected
           */
          value: (onfulfilled, onrejected) => {
            {
              const user_result = onfulfilled({
                css: munged.css,
                head: munged.head,
                html: munged.html
              });
              return Promise.resolve(user_result);
            }
          }
        }
      }
    );
    return munged;
  };
  component_constructor.render = _render;
  return component_constructor;
}
function Root($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    let {
      stores,
      page,
      constructors,
      components = [],
      form,
      data_0 = null,
      data_1 = null
    } = $$props;
    {
      setContext("__svelte__", stores);
    }
    {
      stores.page.set(page);
    }
    const Pyramid_1 = constructors[1];
    if (constructors[1]) {
      $$renderer2.push("<!--[-->");
      const Pyramid_0 = constructors[0];
      $$renderer2.push(`<!---->`);
      Pyramid_0($$renderer2, {
        data: data_0,
        form,
        params: page.params,
        children: ($$renderer3) => {
          $$renderer3.push(`<!---->`);
          Pyramid_1($$renderer3, { data: data_1, form, params: page.params });
          $$renderer3.push(`<!---->`);
        },
        $$slots: { default: true }
      });
      $$renderer2.push(`<!---->`);
    } else {
      $$renderer2.push("<!--[!-->");
      const Pyramid_0 = constructors[0];
      $$renderer2.push(`<!---->`);
      Pyramid_0($$renderer2, { data: data_0, form, params: page.params });
      $$renderer2.push(`<!---->`);
    }
    $$renderer2.push(`<!--]--> `);
    {
      $$renderer2.push("<!--[!-->");
    }
    $$renderer2.push(`<!--]-->`);
  });
}
const root = asClassComponent(Root);
const options = {
  app_template_contains_nonce: false,
  async: false,
  csp: { "mode": "auto", "directives": { "upgrade-insecure-requests": false, "block-all-mixed-content": false }, "reportOnly": { "upgrade-insecure-requests": false, "block-all-mixed-content": false } },
  csrf_check_origin: true,
  csrf_trusted_origins: [],
  embedded: false,
  env_public_prefix: "PUBLIC_",
  env_private_prefix: "",
  hash_routing: false,
  hooks: null,
  // added lazily, via `get_hooks`
  preload_strategy: "modulepreload",
  root,
  service_worker: false,
  service_worker_options: void 0,
  templates: {
    app: ({ head, body, assets, nonce, env }) => '<!DOCTYPE html>\n<html lang="en">\n <head>\n <meta charset="utf-8" />\n <link rel="icon" href="' + assets + '/favicon.png" />\n <meta name="viewport" content="width=device-width, initial-scale=1" />\n ' + head + '\n </head>\n <body data-sveltekit-preload-data="hover">\n <div style="display: contents">' + body + "</div>\n </body>\n</html>\n",
    error: ({ status, message }) => '<!doctype html>\n<html lang="en">\n <head>\n <meta charset="utf-8" />\n <title>' + message + `</title>

      <style>
        body {
          --bg: white;
          --fg: #222;
          --divider: #ccc;
          background: var(--bg);
          color: var(--fg);
          font-family:
            system-ui,
            -apple-system,
            BlinkMacSystemFont,
            'Segoe UI',
            Roboto,
            Oxygen,
            Ubuntu,
            Cantarell,
            'Open Sans',
            'Helvetica Neue',
            sans-serif;
          display: flex;
          align-items: center;
          justify-content: center;
          height: 100vh;
          margin: 0;
        }

        .error {
          display: flex;
          align-items: center;
          max-width: 32rem;
          margin: 0 1rem;
        }

        .status {
          font-weight: 200;
          font-size: 3rem;
          line-height: 1;
          position: relative;
          top: -0.05rem;
        }

        .message {
          border-left: 1px solid var(--divider);
          padding: 0 0 0 1rem;
          margin: 0 0 0 1rem;
          min-height: 2.5rem;
          display: flex;
          align-items: center;
        }

        .message h1 {
          font-weight: 400;
          font-size: 1em;
          margin: 0;
        }

        @media (prefers-color-scheme: dark) {
          body {
            --bg: #222;
            --fg: #ddd;
            --divider: #666;
          }
        }
      </style>
    </head>
    <body>
      <div class="error">
        <span class="status">` + status + '</span>\n <div class="message">\n <h1>' + message + "</h1>\n </div>\n </div>\n </body>\n</html>\n"
  },
  version_hash: "1ootf77"
};
async function get_hooks() {
  let handle;
  let handleFetch;
  let handleError;
  let handleValidationError;
  let init;
  let reroute;
  let transport;
  return {
    handle,
    handleFetch,
    handleError,
    handleValidationError,
    init,
    reroute,
    transport
  };
}
export {
  set_public_env as a,
  set_read_implementation as b,
  set_manifest as c,
  get_hooks as g,
  options as o,
  public_env as p,
  read_implementation as r,
  set_private_env as s
};
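For orientation, a hedged sketch of the public entry points this deleted chunk backs. App and the target element are placeholders, not names from this repository; mount/unmount mirror the Svelte 5 client API.

import { mount, unmount } from "svelte"; // public API; this chunk holds the server-bundle copy
import App from "./App.svelte";          // hypothetical root component
const app = mount(App, {
  target: document.getElementById("app"),
  props: { initial: 1 }
});
// later: tears down effects and releases the delegated event listeners registered in _mount
unmount(app);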
@@ -1,522 +0,0 @@
import * as devalue from "devalue";
import { t as text_decoder, b as base64_encode, c as base64_decode } from "./utils.js";
function set_nested_value(object, path_string, value) {
  if (path_string.startsWith("n:")) {
    path_string = path_string.slice(2);
    value = value === "" ? void 0 : parseFloat(value);
  } else if (path_string.startsWith("b:")) {
    path_string = path_string.slice(2);
    value = value === "on";
  }
  deep_set(object, split_path(path_string), value);
}
function convert_formdata(data) {
  const result = {};
  for (let key of data.keys()) {
    const is_array = key.endsWith("[]");
    let values = data.getAll(key);
    if (is_array) key = key.slice(0, -2);
    if (values.length > 1 && !is_array) {
      throw new Error(`Form cannot contain duplicated keys — "${key}" has ${values.length} values`);
    }
    values = values.filter(
      (entry) => typeof entry === "string" || entry.name !== "" || entry.size > 0
    );
    if (key.startsWith("n:")) {
      key = key.slice(2);
      values = values.map((v) => v === "" ? void 0 : parseFloat(
        /** @type {string} */
        v
      ));
    } else if (key.startsWith("b:")) {
      key = key.slice(2);
      values = values.map((v) => v === "on");
    }
    set_nested_value(result, key, is_array ? values : values[0]);
  }
  return result;
}
const BINARY_FORM_CONTENT_TYPE = "application/x-sveltekit-formdata";
const BINARY_FORM_VERSION = 0;
async function deserialize_binary_form(request) {
  if (request.headers.get("content-type") !== BINARY_FORM_CONTENT_TYPE) {
    const form_data = await request.formData();
    return { data: convert_formdata(form_data), meta: {}, form_data };
  }
  if (!request.body) {
    throw new Error("Could not deserialize binary form: no body");
  }
  const reader = request.body.getReader();
  const chunks = [];
  async function get_chunk(index) {
    if (index in chunks) return chunks[index];
    let i = chunks.length;
    while (i <= index) {
      chunks[i] = reader.read().then((chunk) => chunk.value);
      i++;
    }
    return chunks[index];
  }
  async function get_buffer(offset, length) {
    let start_chunk;
    let chunk_start = 0;
    let chunk_index;
    for (chunk_index = 0; ; chunk_index++) {
      const chunk = await get_chunk(chunk_index);
      if (!chunk) return null;
      const chunk_end = chunk_start + chunk.byteLength;
      if (offset >= chunk_start && offset < chunk_end) {
        start_chunk = chunk;
        break;
      }
      chunk_start = chunk_end;
    }
    if (offset + length <= chunk_start + start_chunk.byteLength) {
      return start_chunk.subarray(offset - chunk_start, offset + length - chunk_start);
    }
    const buffer = new Uint8Array(length);
    buffer.set(start_chunk.subarray(offset - chunk_start));
    let cursor = start_chunk.byteLength - offset + chunk_start;
    while (cursor < length) {
      chunk_index++;
      let chunk = await get_chunk(chunk_index);
      if (!chunk) return null;
      if (chunk.byteLength > length - cursor) {
        chunk = chunk.subarray(0, length - cursor);
      }
      buffer.set(chunk, cursor);
      cursor += chunk.byteLength;
    }
    return buffer;
  }
  const header = await get_buffer(0, 1 + 4 + 2);
  if (!header) throw new Error("Could not deserialize binary form: too short");
  if (header[0] !== BINARY_FORM_VERSION) {
    throw new Error(
      `Could not deserialize binary form: got version ${header[0]}, expected version ${BINARY_FORM_VERSION}`
    );
  }
  const header_view = new DataView(header.buffer, header.byteOffset, header.byteLength);
  const data_length = header_view.getUint32(1, true);
  const file_offsets_length = header_view.getUint16(5, true);
  const data_buffer = await get_buffer(1 + 4 + 2, data_length);
  if (!data_buffer) throw new Error("Could not deserialize binary form: data too short");
  let file_offsets;
  let files_start_offset;
  if (file_offsets_length > 0) {
    const file_offsets_buffer = await get_buffer(1 + 4 + 2 + data_length, file_offsets_length);
    if (!file_offsets_buffer)
      throw new Error("Could not deserialize binary form: file offset table too short");
    file_offsets = /** @type {Array<number>} */
    JSON.parse(text_decoder.decode(file_offsets_buffer));
    files_start_offset = 1 + 4 + 2 + data_length + file_offsets_length;
  }
  const [data, meta] = devalue.parse(text_decoder.decode(data_buffer), {
    File: ([name, type, size, last_modified, index]) => {
      return new Proxy(
        new LazyFile(
          name,
          type,
          size,
          last_modified,
          get_chunk,
          files_start_offset + file_offsets[index]
        ),
        {
          getPrototypeOf() {
            return File.prototype;
          }
        }
      );
    }
  });
  void (async () => {
    let has_more = true;
    while (has_more) {
      const chunk = await get_chunk(chunks.length);
      has_more = !!chunk;
    }
  })();
  return { data, meta, form_data: null };
}
class LazyFile {
  /** @type {(index: number) => Promise<Uint8Array<ArrayBuffer> | undefined>} */
  #get_chunk;
  /** @type {number} */
  #offset;
  /**
   * @param {string} name
   * @param {string} type
   * @param {number} size
   * @param {number} last_modified
   * @param {(index: number) => Promise<Uint8Array<ArrayBuffer> | undefined>} get_chunk
   * @param {number} offset
   */
  constructor(name, type, size, last_modified, get_chunk, offset) {
    this.name = name;
    this.type = type;
    this.size = size;
    this.lastModified = last_modified;
    this.webkitRelativePath = "";
    this.#get_chunk = get_chunk;
    this.#offset = offset;
    this.arrayBuffer = this.arrayBuffer.bind(this);
    this.bytes = this.bytes.bind(this);
    this.slice = this.slice.bind(this);
    this.stream = this.stream.bind(this);
    this.text = this.text.bind(this);
  }
  /** @type {ArrayBuffer | undefined} */
  #buffer;
  async arrayBuffer() {
    this.#buffer ??= await new Response(this.stream()).arrayBuffer();
    return this.#buffer;
  }
  async bytes() {
    return new Uint8Array(await this.arrayBuffer());
  }
  /**
   * @param {number=} start
   * @param {number=} end
   * @param {string=} contentType
   */
  slice(start = 0, end = this.size, contentType = this.type) {
    if (start < 0) {
      start = Math.max(this.size + start, 0);
    } else {
      start = Math.min(start, this.size);
    }
    if (end < 0) {
      end = Math.max(this.size + end, 0);
    } else {
      end = Math.min(end, this.size);
    }
    const size = Math.max(end - start, 0);
    const file = new LazyFile(
      this.name,
      contentType,
      size,
      this.lastModified,
      this.#get_chunk,
      this.#offset + start
    );
    return file;
  }
  stream() {
    let cursor = 0;
    let chunk_index = 0;
    return new ReadableStream({
      start: async (controller) => {
        let chunk_start = 0;
        let start_chunk = null;
        for (chunk_index = 0; ; chunk_index++) {
          const chunk = await this.#get_chunk(chunk_index);
          if (!chunk) return null;
          const chunk_end = chunk_start + chunk.byteLength;
          if (this.#offset >= chunk_start && this.#offset < chunk_end) {
            start_chunk = chunk;
            break;
          }
          chunk_start = chunk_end;
        }
        if (this.#offset + this.size <= chunk_start + start_chunk.byteLength) {
          controller.enqueue(
            start_chunk.subarray(this.#offset - chunk_start, this.#offset + this.size - chunk_start)
          );
          controller.close();
        } else {
          controller.enqueue(start_chunk.subarray(this.#offset - chunk_start));
          cursor = start_chunk.byteLength - this.#offset + chunk_start;
        }
      },
      pull: async (controller) => {
        chunk_index++;
        let chunk = await this.#get_chunk(chunk_index);
        if (!chunk) {
          controller.error("Could not deserialize binary form: incomplete file data");
          controller.close();
          return;
        }
        if (chunk.byteLength > this.size - cursor) {
          chunk = chunk.subarray(0, this.size - cursor);
        }
        controller.enqueue(chunk);
        cursor += chunk.byteLength;
        if (cursor >= this.size) {
          controller.close();
        }
      }
    });
  }
  async text() {
    return text_decoder.decode(await this.arrayBuffer());
  }
}
const path_regex = /^[a-zA-Z_$]\w*(\.[a-zA-Z_$]\w*|\[\d+\])*$/;
function split_path(path) {
  if (!path_regex.test(path)) {
    throw new Error(`Invalid path ${path}`);
  }
  return path.split(/\.|\[|\]/).filter(Boolean);
}
function check_prototype_pollution(key) {
  if (key === "__proto__" || key === "constructor" || key === "prototype") {
    throw new Error(
      `Invalid key "${key}"`
    );
  }
}
function deep_set(object, keys, value) {
  let current = object;
  for (let i = 0; i < keys.length - 1; i += 1) {
    const key = keys[i];
    check_prototype_pollution(key);
    const is_array = /^\d+$/.test(keys[i + 1]);
    const exists = key in current;
    const inner = current[key];
    if (exists && is_array !== Array.isArray(inner)) {
      throw new Error(`Invalid array key ${keys[i + 1]}`);
    }
    if (!exists) {
      current[key] = is_array ? [] : {};
    }
    current = current[key];
  }
  const final_key = keys[keys.length - 1];
  check_prototype_pollution(final_key);
  current[final_key] = value;
}
function normalize_issue(issue, server = false) {
  const normalized = { name: "", path: [], message: issue.message, server };
  if (issue.path !== void 0) {
    let name = "";
    for (const segment of issue.path) {
      const key = (
        /** @type {string | number} */
        typeof segment === "object" ? segment.key : segment
      );
      normalized.path.push(key);
      if (typeof key === "number") {
        name += `[${key}]`;
      } else if (typeof key === "string") {
        name += name === "" ? key : "." + key;
      }
    }
    normalized.name = name;
  }
  return normalized;
}
function flatten_issues(issues) {
  const result = {};
  for (const issue of issues) {
    (result.$ ??= []).push(issue);
    let name = "";
    if (issue.path !== void 0) {
      for (const key of issue.path) {
        if (typeof key === "number") {
          name += `[${key}]`;
        } else if (typeof key === "string") {
          name += name === "" ? key : "." + key;
        }
        (result[name] ??= []).push(issue);
      }
    }
  }
  return result;
}
function deep_get(object, path) {
  let current = object;
  for (const key of path) {
    if (current == null || typeof current !== "object") {
      return current;
    }
    current = current[key];
  }
  return current;
}
function create_field_proxy(target, get_input, set_input, get_issues, path = []) {
  const get_value = () => {
    return deep_get(get_input(), path);
  };
  return new Proxy(target, {
    get(target2, prop) {
      if (typeof prop === "symbol") return target2[prop];
      if (/^\d+$/.test(prop)) {
        return create_field_proxy({}, get_input, set_input, get_issues, [
          ...path,
          parseInt(prop, 10)
        ]);
      }
      const key = build_path_string(path);
      if (prop === "set") {
        const set_func = function(newValue) {
          set_input(path, newValue);
          return newValue;
        };
        return create_field_proxy(set_func, get_input, set_input, get_issues, [...path, prop]);
      }
      if (prop === "value") {
        return create_field_proxy(get_value, get_input, set_input, get_issues, [...path, prop]);
      }
      if (prop === "issues" || prop === "allIssues") {
        const issues_func = () => {
          const all_issues = get_issues()[key === "" ? "$" : key];
          if (prop === "allIssues") {
            return all_issues?.map((issue) => ({
              path: issue.path,
              message: issue.message
            }));
          }
          return all_issues?.filter((issue) => issue.name === key)?.map((issue) => ({
            path: issue.path,
            message: issue.message
          }));
        };
        return create_field_proxy(issues_func, get_input, set_input, get_issues, [...path, prop]);
      }
      if (prop === "as") {
        const as_func = (type, input_value) => {
          const is_array = type === "file multiple" || type === "select multiple" || type === "checkbox" && typeof input_value === "string";
          const prefix = type === "number" || type === "range" ? "n:" : type === "checkbox" && !is_array ? "b:" : "";
          const base_props = {
            name: prefix + key + (is_array ? "[]" : ""),
            get "aria-invalid"() {
              const issues = get_issues();
              return key in issues ? "true" : void 0;
            }
          };
          if (type !== "text" && type !== "select" && type !== "select multiple") {
            base_props.type = type === "file multiple" ? "file" : type;
          }
          if (type === "submit" || type === "hidden") {
            return Object.defineProperties(base_props, {
              value: { value: input_value, enumerable: true }
            });
          }
          if (type === "select" || type === "select multiple") {
            return Object.defineProperties(base_props, {
              multiple: { value: is_array, enumerable: true },
              value: {
                enumerable: true,
                get() {
                  return get_value();
                }
              }
            });
          }
          if (type === "checkbox" || type === "radio") {
            return Object.defineProperties(base_props, {
              value: { value: input_value ?? "on", enumerable: true },
              checked: {
                enumerable: true,
|
|
||||||
get() {
|
|
||||||
const value = get_value();
|
|
||||||
if (type === "radio") {
|
|
||||||
return value === input_value;
|
|
||||||
}
|
|
||||||
if (is_array) {
|
|
||||||
return (value ?? []).includes(input_value);
|
|
||||||
}
|
|
||||||
return value;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (type === "file" || type === "file multiple") {
|
|
||||||
return Object.defineProperties(base_props, {
|
|
||||||
multiple: { value: is_array, enumerable: true },
|
|
||||||
files: {
|
|
||||||
enumerable: true,
|
|
||||||
get() {
|
|
||||||
const value = get_value();
|
|
||||||
if (value instanceof File) {
|
|
||||||
if (typeof DataTransfer !== "undefined") {
|
|
||||||
const fileList = new DataTransfer();
|
|
||||||
fileList.items.add(value);
|
|
||||||
return fileList.files;
|
|
||||||
}
|
|
||||||
return { 0: value, length: 1 };
|
|
||||||
}
|
|
||||||
if (Array.isArray(value) && value.every((f) => f instanceof File)) {
|
|
||||||
if (typeof DataTransfer !== "undefined") {
|
|
||||||
const fileList = new DataTransfer();
|
|
||||||
value.forEach((file) => fileList.items.add(file));
|
|
||||||
return fileList.files;
|
|
||||||
}
|
|
||||||
const fileListLike = { length: value.length };
|
|
||||||
value.forEach((file, index) => {
|
|
||||||
fileListLike[index] = file;
|
|
||||||
});
|
|
||||||
return fileListLike;
|
|
||||||
}
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
return Object.defineProperties(base_props, {
|
|
||||||
value: {
|
|
||||||
enumerable: true,
|
|
||||||
get() {
|
|
||||||
const value = get_value();
|
|
||||||
return value != null ? String(value) : "";
|
|
||||||
}
|
|
||||||
}
|
|
||||||
});
|
|
||||||
};
|
|
||||||
return create_field_proxy(as_func, get_input, set_input, get_issues, [...path, "as"]);
|
|
||||||
}
|
|
||||||
return create_field_proxy({}, get_input, set_input, get_issues, [...path, prop]);
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
function build_path_string(path) {
|
|
||||||
let result = "";
|
|
||||||
for (const segment of path) {
|
|
||||||
if (typeof segment === "number") {
|
|
||||||
result += `[${segment}]`;
|
|
||||||
} else {
|
|
||||||
result += result === "" ? segment : "." + segment;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return result;
|
|
||||||
}
|
|
||||||
const INVALIDATED_PARAM = "x-sveltekit-invalidated";
|
|
||||||
const TRAILING_SLASH_PARAM = "x-sveltekit-trailing-slash";
|
|
||||||
function stringify(data, transport) {
|
|
||||||
const encoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.encode]));
|
|
||||||
return devalue.stringify(data, encoders);
|
|
||||||
}
|
|
||||||
function stringify_remote_arg(value, transport) {
|
|
||||||
if (value === void 0) return "";
|
|
||||||
const json_string = stringify(value, transport);
|
|
||||||
const bytes = new TextEncoder().encode(json_string);
|
|
||||||
return base64_encode(bytes).replaceAll("=", "").replaceAll("+", "-").replaceAll("/", "_");
|
|
||||||
}
|
|
||||||
function parse_remote_arg(string, transport) {
|
|
||||||
if (!string) return void 0;
|
|
||||||
const json_string = text_decoder.decode(
|
|
||||||
// no need to add back `=` characters, atob can handle it
|
|
||||||
base64_decode(string.replaceAll("-", "+").replaceAll("_", "/"))
|
|
||||||
);
|
|
||||||
const decoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.decode]));
|
|
||||||
return devalue.parse(json_string, decoders);
|
|
||||||
}
|
|
||||||
function create_remote_key(id, payload) {
|
|
||||||
return id + "/" + payload;
|
|
||||||
}
|
|
||||||
export {
|
|
||||||
BINARY_FORM_CONTENT_TYPE as B,
|
|
||||||
INVALIDATED_PARAM as I,
|
|
||||||
TRAILING_SLASH_PARAM as T,
|
|
||||||
stringify_remote_arg as a,
|
|
||||||
create_field_proxy as b,
|
|
||||||
create_remote_key as c,
|
|
||||||
deserialize_binary_form as d,
|
|
||||||
set_nested_value as e,
|
|
||||||
flatten_issues as f,
|
|
||||||
deep_set as g,
|
|
||||||
normalize_issue as n,
|
|
||||||
parse_remote_arg as p,
|
|
||||||
stringify as s
|
|
||||||
};
|
|
||||||
@@ -1,44 +0,0 @@
import { a0 as getContext } from "./index2.js";
import "clsx";
import "@sveltejs/kit/internal";
import "./exports.js";
import "./utils.js";
import "@sveltejs/kit/internal/server";
import { n as noop } from "./equality.js";
const is_legacy = noop.toString().includes("$$") || /function \w+\(\) \{\}/.test(noop.toString());
if (is_legacy) {
  ({
    data: {},
    form: null,
    error: null,
    params: {},
    route: { id: null },
    state: {},
    status: -1,
    url: new URL("https://example.com")
  });
}
const getStores = () => {
  const stores = getContext("__svelte__");
  return {
    /** @type {typeof page} */
    page: {
      subscribe: stores.page.subscribe
    },
    /** @type {typeof navigating} */
    navigating: {
      subscribe: stores.navigating.subscribe
    },
    /** @type {typeof updated} */
    updated: stores.updated
  };
};
const page = {
  subscribe(fn) {
    const store = getStores().page;
    return store.subscribe(fn);
  }
};
export {
  page as p
};
@@ -1,16 +0,0 @@
import { w as writable } from "./index.js";
const toasts = writable([]);
function addToast(message, type = "info", duration = 3e3) {
  const id = Math.random().toString(36).substr(2, 9);
  console.log(`[toasts.addToast][Action] Adding toast context={{'id': '${id}', 'type': '${type}', 'message': '${message}'}}`);
  toasts.update((all) => [...all, { id, message, type }]);
  setTimeout(() => removeToast(id), duration);
}
function removeToast(id) {
  console.log(`[toasts.removeToast][Action] Removing toast context={{'id': '${id}'}}`);
  toasts.update((all) => all.filter((t) => t.id !== id));
}
export {
  addToast as a,
  toasts as t
};
@@ -1,43 +0,0 @@
const text_encoder = new TextEncoder();
const text_decoder = new TextDecoder();
function get_relative_path(from, to) {
  const from_parts = from.split(/[/\\]/);
  const to_parts = to.split(/[/\\]/);
  from_parts.pop();
  while (from_parts[0] === to_parts[0]) {
    from_parts.shift();
    to_parts.shift();
  }
  let i = from_parts.length;
  while (i--) from_parts[i] = "..";
  return from_parts.concat(to_parts).join("/");
}
function base64_encode(bytes) {
  if (globalThis.Buffer) {
    return globalThis.Buffer.from(bytes).toString("base64");
  }
  let binary = "";
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}
function base64_decode(encoded) {
  if (globalThis.Buffer) {
    const buffer = globalThis.Buffer.from(encoded, "base64");
    return new Uint8Array(buffer);
  }
  const binary = atob(encoded);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}
export {
  text_encoder as a,
  base64_encode as b,
  base64_decode as c,
  get_relative_path as g,
  text_decoder as t
};
@@ -1,12 +0,0 @@
import { _ as escape_html, X as store_get, Y as unsubscribe_stores } from "../../chunks/index2.js";
import { p as page } from "../../chunks/stores.js";
function _error($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    var $$store_subs;
    $$renderer2.push(`<div class="container mx-auto p-4 text-center mt-20"><h1 class="text-6xl font-bold text-gray-800 mb-4">${escape_html(store_get($$store_subs ??= {}, "$page", page).status)}</h1> <p class="text-2xl text-gray-600 mb-8">${escape_html(store_get($$store_subs ??= {}, "$page", page).error?.message || "Page not found")}</p> <a href="/" class="bg-blue-500 text-white px-6 py-3 rounded-lg hover:bg-blue-600 transition-colors">Back to Dashboard</a></div>`);
    if ($$store_subs) unsubscribe_stores($$store_subs);
  });
}
export {
  _error as default
};
@@ -1,38 +0,0 @@
import { V as attr_class, W as stringify, X as store_get, Y as unsubscribe_stores, Z as ensure_array_like, _ as escape_html, $ as slot } from "../../chunks/index2.js";
import { p as page } from "../../chunks/stores.js";
import "clsx";
import { t as toasts } from "../../chunks/toasts.js";
function Navbar($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    var $$store_subs;
    $$renderer2.push(`<header class="bg-white shadow-md p-4 flex justify-between items-center"><a href="/" class="text-3xl font-bold text-gray-800 focus:outline-none">Superset Tools</a> <nav class="space-x-4"><a href="/"${attr_class(`text-gray-600 hover:text-blue-600 font-medium ${stringify(store_get($$store_subs ??= {}, "$page", page).url.pathname === "/" ? "text-blue-600 border-b-2 border-blue-600" : "")}`)}>Dashboard</a> <a href="/settings"${attr_class(`text-gray-600 hover:text-blue-600 font-medium ${stringify(store_get($$store_subs ??= {}, "$page", page).url.pathname === "/settings" ? "text-blue-600 border-b-2 border-blue-600" : "")}`)}>Settings</a></nav></header>`);
    if ($$store_subs) unsubscribe_stores($$store_subs);
  });
}
function Footer($$renderer) {
  $$renderer.push(`<footer class="bg-white border-t p-4 mt-8 text-center text-gray-500 text-sm">© 2025 Superset Tools. All rights reserved.</footer>`);
}
function Toast($$renderer) {
  var $$store_subs;
  $$renderer.push(`<div class="fixed bottom-0 right-0 p-4 space-y-2"><!--[-->`);
  const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$toasts", toasts));
  for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
    let toast = each_array[$$index];
    $$renderer.push(`<div${attr_class(`p-4 rounded-md shadow-lg text-white ${stringify(toast.type === "info" && "bg-blue-500")} ${stringify(toast.type === "success" && "bg-green-500")} ${stringify(toast.type === "error" && "bg-red-500")} `)}>${escape_html(toast.message)}</div>`);
  }
  $$renderer.push(`<!--]--></div>`);
  if ($$store_subs) unsubscribe_stores($$store_subs);
}
function _layout($$renderer, $$props) {
  Toast($$renderer);
  $$renderer.push(`<!----> <main class="bg-gray-50 min-h-screen flex flex-col">`);
  Navbar($$renderer);
  $$renderer.push(`<!----> <div class="p-4 flex-grow"><!--[-->`);
  slot($$renderer, $$props, "default", {});
  $$renderer.push(`<!--]--></div> `);
  Footer($$renderer);
  $$renderer.push(`<!----></main>`);
}
export {
  _layout as default
};
@@ -1,6 +0,0 @@
const ssr = false;
const prerender = false;
export {
  prerender,
  ssr
};
@@ -1,132 +0,0 @@
import { a1 as ssr_context, X as store_get, _ as escape_html, Z as ensure_array_like, V as attr_class, Y as unsubscribe_stores, a2 as attr, a3 as bind_props } from "../../chunks/index2.js";
import { w as writable } from "../../chunks/index.js";
import "clsx";
function onDestroy(fn) {
  /** @type {SSRContext} */
  ssr_context.r.on_destroy(fn);
}
const plugins = writable([]);
const selectedPlugin = writable(null);
const selectedTask = writable(null);
const taskLogs = writable([]);
function TaskRunner($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    var $$store_subs;
    onDestroy(() => {
    });
    $$renderer2.push(`<div class="p-4 border rounded-lg bg-white shadow-md">`);
    if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) {
      $$renderer2.push("<!--[-->");
      $$renderer2.push(`<h2 class="text-xl font-semibold mb-2">Task: ${escape_html(store_get($$store_subs ??= {}, "$selectedTask", selectedTask).plugin_id)}</h2> <div class="bg-gray-900 text-white font-mono text-sm p-4 rounded-md h-96 overflow-y-auto"><!--[-->`);
      const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$taskLogs", taskLogs));
      for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
        let log = each_array[$$index];
        $$renderer2.push(`<div><span class="text-gray-400">${escape_html(new Date(log.timestamp).toLocaleTimeString())}</span> <span${attr_class(log.level === "ERROR" ? "text-red-500" : "text-green-400")}>[${escape_html(log.level)}]</span> <span>${escape_html(log.message)}</span></div>`);
      }
      $$renderer2.push(`<!--]--></div>`);
    } else {
      $$renderer2.push("<!--[!-->");
      $$renderer2.push(`<p>No task selected.</p>`);
    }
    $$renderer2.push(`<!--]--></div>`);
    if ($$store_subs) unsubscribe_stores($$store_subs);
  });
}
function DynamicForm($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    let schema = $$props["schema"];
    let formData = {};
    function initializeForm() {
      if (schema && schema.properties) {
        for (const key in schema.properties) {
          formData[key] = schema.properties[key].default || "";
        }
      }
    }
    initializeForm();
    $$renderer2.push(`<form class="space-y-4">`);
    if (schema && schema.properties) {
      $$renderer2.push("<!--[-->");
      $$renderer2.push(`<!--[-->`);
      const each_array = ensure_array_like(Object.entries(schema.properties));
      for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
        let [key, prop] = each_array[$$index];
        $$renderer2.push(`<div class="flex flex-col"><label${attr("for", key)} class="mb-1 font-semibold text-gray-700">${escape_html(prop.title || key)}</label> `);
        if (prop.type === "string") {
          $$renderer2.push("<!--[-->");
          $$renderer2.push(`<input type="text"${attr("id", key)}${attr("value", formData[key])}${attr("placeholder", prop.description || "")} class="p-2 border rounded-md"/>`);
        } else {
          $$renderer2.push("<!--[!-->");
          if (prop.type === "number" || prop.type === "integer") {
            $$renderer2.push("<!--[-->");
            $$renderer2.push(`<input type="number"${attr("id", key)}${attr("value", formData[key])}${attr("placeholder", prop.description || "")} class="p-2 border rounded-md"/>`);
          } else {
            $$renderer2.push("<!--[!-->");
            if (prop.type === "boolean") {
              $$renderer2.push("<!--[-->");
              $$renderer2.push(`<input type="checkbox"${attr("id", key)}${attr("checked", formData[key], true)} class="h-5 w-5"/>`);
            } else {
              $$renderer2.push("<!--[!-->");
            }
            $$renderer2.push(`<!--]-->`);
          }
          $$renderer2.push(`<!--]-->`);
        }
        $$renderer2.push(`<!--]--></div>`);
      }
      $$renderer2.push(`<!--]--> <button type="submit" class="w-full bg-green-500 text-white p-2 rounded-md hover:bg-green-600">Run Task</button>`);
    } else {
      $$renderer2.push("<!--[!-->");
    }
    $$renderer2.push(`<!--]--></form>`);
    bind_props($$props, { schema });
  });
}
function _page($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    var $$store_subs;
    let data = $$props["data"];
    if (data.plugins) {
      plugins.set(data.plugins);
    }
    $$renderer2.push(`<div class="container mx-auto p-4">`);
    if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) {
      $$renderer2.push("<!--[-->");
      TaskRunner($$renderer2);
      $$renderer2.push(`<!----> <button class="mt-4 bg-blue-500 text-white p-2 rounded">Back to Task List</button>`);
    } else {
      $$renderer2.push("<!--[!-->");
      if (store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin)) {
        $$renderer2.push("<!--[-->");
        $$renderer2.push(`<h2 class="text-2xl font-bold mb-4">${escape_html(store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).name)}</h2> `);
        DynamicForm($$renderer2, {
          schema: store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).schema
        });
        $$renderer2.push(`<!----> <button class="mt-4 bg-gray-500 text-white p-2 rounded">Back to Dashboard</button>`);
      } else {
        $$renderer2.push("<!--[!-->");
        $$renderer2.push(`<h1 class="text-2xl font-bold mb-4">Available Tools</h1> `);
        if (data.error) {
          $$renderer2.push("<!--[-->");
          $$renderer2.push(`<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">${escape_html(data.error)}</div>`);
        } else {
          $$renderer2.push("<!--[!-->");
        }
        $$renderer2.push(`<!--]--> <div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4"><!--[-->`);
        const each_array = ensure_array_like(data.plugins);
        for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
          let plugin = each_array[$$index];
          $$renderer2.push(`<div class="border rounded-lg p-4 cursor-pointer hover:bg-gray-100" role="button" tabindex="0"><h2 class="text-xl font-semibold">${escape_html(plugin.name)}</h2> <p class="text-gray-600">${escape_html(plugin.description)}</p> <span class="text-sm text-gray-400">v${escape_html(plugin.version)}</span></div>`);
        }
        $$renderer2.push(`<!--]--></div>`);
      }
      $$renderer2.push(`<!--]-->`);
    }
    $$renderer2.push(`<!--]--></div>`);
    if ($$store_subs) unsubscribe_stores($$store_subs);
    bind_props($$props, { data });
  });
}
export {
  _page as default
};
@@ -1,18 +0,0 @@
import { a as api } from "../../chunks/api.js";
async function load() {
  try {
    const plugins = await api.getPlugins();
    return {
      plugins
    };
  } catch (error) {
    console.error("Failed to load plugins:", error);
    return {
      plugins: [],
      error: "Failed to load plugins"
    };
  }
}
export {
  load
};
@@ -1,45 +0,0 @@
import { _ as escape_html, a2 as attr, Z as ensure_array_like, a3 as bind_props } from "../../../chunks/index2.js";
function _page($$renderer, $$props) {
  $$renderer.component(($$renderer2) => {
    let data = $$props["data"];
    let settings = data.settings;
    let newEnv = {
      id: "",
      name: "",
      url: "",
      username: "",
      password: "",
      is_default: false
    };
    settings = data.settings;
    $$renderer2.push(`<div class="container mx-auto p-4"><h1 class="text-2xl font-bold mb-6">Settings</h1> `);
    if (data.error) {
      $$renderer2.push("<!--[-->");
      $$renderer2.push(`<div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">${escape_html(data.error)}</div>`);
    } else {
      $$renderer2.push("<!--[!-->");
    }
    $$renderer2.push(`<!--]--> <section class="mb-8 bg-white p-6 rounded shadow"><h2 class="text-xl font-semibold mb-4">Global Settings</h2> <div class="grid grid-cols-1 gap-4"><div><label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label> <input type="text" id="backup_path"${attr("value", settings.settings.backup_path)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <button class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max">Save Global Settings</button></div></section> <section class="mb-8 bg-white p-6 rounded shadow"><h2 class="text-xl font-semibold mb-4">Superset Environments</h2> `);
    if (settings.environments.length === 0) {
      $$renderer2.push("<!--[-->");
      $$renderer2.push(`<div class="mb-4 p-4 bg-yellow-100 border-l-4 border-yellow-500 text-yellow-700"><p class="font-bold">Warning</p> <p>No Superset environments configured. You must add at least one environment to perform backups or migrations.</p></div>`);
    } else {
      $$renderer2.push("<!--[!-->");
    }
    $$renderer2.push(`<!--]--> <div class="mb-6 overflow-x-auto"><table class="min-w-full divide-y divide-gray-200"><thead class="bg-gray-50"><tr><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Name</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">URL</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Username</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Default</th><th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Actions</th></tr></thead><tbody class="bg-white divide-y divide-gray-200"><!--[-->`);
    const each_array = ensure_array_like(settings.environments);
    for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
      let env = each_array[$$index];
      $$renderer2.push(`<tr><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.name)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.url)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.username)}</td><td class="px-6 py-4 whitespace-nowrap">${escape_html(env.is_default ? "Yes" : "No")}</td><td class="px-6 py-4 whitespace-nowrap"><button class="text-green-600 hover:text-green-900 mr-4">Test</button> <button class="text-indigo-600 hover:text-indigo-900 mr-4">Edit</button> <button class="text-red-600 hover:text-red-900">Delete</button></td></tr>`);
    }
    $$renderer2.push(`<!--]--></tbody></table></div> <div class="bg-gray-50 p-4 rounded"><h3 class="text-lg font-medium mb-4">${escape_html("Add")} Environment</h3> <div class="grid grid-cols-1 md:grid-cols-2 gap-4"><div><label for="env_id" class="block text-sm font-medium text-gray-700">ID</label> <input type="text" id="env_id"${attr("value", newEnv.id)}${attr("disabled", false, true)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_name" class="block text-sm font-medium text-gray-700">Name</label> <input type="text" id="env_name"${attr("value", newEnv.name)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_url" class="block text-sm font-medium text-gray-700">URL</label> <input type="text" id="env_url"${attr("value", newEnv.url)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_user" class="block text-sm font-medium text-gray-700">Username</label> <input type="text" id="env_user"${attr("value", newEnv.username)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div><label for="env_pass" class="block text-sm font-medium text-gray-700">Password</label> <input type="password" id="env_pass"${attr("value", newEnv.password)} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2"/></div> <div class="flex items-center"><input type="checkbox" id="env_default"${attr("checked", newEnv.is_default, true)} class="h-4 w-4 text-blue-600 border-gray-300 rounded"/> <label for="env_default" class="ml-2 block text-sm text-gray-900">Default Environment</label></div></div> <div class="mt-4 flex gap-2"><button class="bg-green-500 text-white px-4 py-2 rounded hover:bg-green-600">${escape_html("Add")} Environment</button> `);
    {
      $$renderer2.push("<!--[!-->");
    }
    $$renderer2.push(`<!--]--></div></div></section></div>`);
    bind_props($$props, { data });
  });
}
export {
  _page as default
};
@@ -1,24 +0,0 @@
import { a as api } from "../../../chunks/api.js";
async function load() {
  try {
    const settings = await api.getSettings();
    return {
      settings
    };
  } catch (error) {
    console.error("Failed to load settings:", error);
    return {
      settings: {
        environments: [],
        settings: {
          backup_path: "",
          default_environment_id: null
        }
      },
      error: "Failed to load settings"
    };
  }
}
export {
  load
};
File diff suppressed because it is too large
@@ -1,13 +0,0 @@
import { g, o, c, s, a, b } from "./chunks/internal.js";
import { s as s2, e, f } from "./chunks/environment.js";
export {
  g as get_hooks,
  o as options,
  s2 as set_assets,
  e as set_building,
  c as set_manifest,
  f as set_prerendering,
  s as set_private_env,
  a as set_public_env,
  b as set_read_implementation
};
@@ -1,47 +0,0 @@
export const manifest = (() => {
  function __memo(fn) {
    let value;
    return () => value ??= (value = fn());
  }

  return {
    appDir: "_app",
    appPath: "_app",
    assets: new Set([]),
    mimeTypes: {},
    _: {
      client: {start:"_app/immutable/entry/start.BHAeOrfR.js",app:"_app/immutable/entry/app.BXnpILpp.js",imports:["_app/immutable/entry/start.BHAeOrfR.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/entry/app.BXnpILpp.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/vVxDbqKK.js"],stylesheets:[],fonts:[],uses_env_dynamic_public:false},
      nodes: [
        __memo(() => import('./nodes/0.js')),
        __memo(() => import('./nodes/1.js')),
        __memo(() => import('./nodes/2.js')),
        __memo(() => import('./nodes/3.js'))
      ],
      remotes: {

      },
      routes: [
        {
          id: "/",
          pattern: /^\/$/,
          params: [],
          page: { layouts: [0,], errors: [1,], leaf: 2 },
          endpoint: null
        },
        {
          id: "/settings",
          pattern: /^\/settings\/?$/,
          params: [],
          page: { layouts: [0,], errors: [1,], leaf: 3 },
          endpoint: null
        }
      ],
      prerendered_routes: new Set([]),
      matchers: async () => {

        return { };
      },
      server_assets: {}
    }
  }
})();
@@ -1,47 +0,0 @@
export const manifest = (() => {
  function __memo(fn) {
    let value;
    return () => value ??= (value = fn());
  }

  return {
    appDir: "_app",
    appPath: "_app",
    assets: new Set([]),
    mimeTypes: {},
    _: {
      client: {start:"_app/immutable/entry/start.BHAeOrfR.js",app:"_app/immutable/entry/app.BXnpILpp.js",imports:["_app/immutable/entry/start.BHAeOrfR.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/entry/app.BXnpILpp.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/vVxDbqKK.js"],stylesheets:[],fonts:[],uses_env_dynamic_public:false},
      nodes: [
        __memo(() => import('./nodes/0.js')),
        __memo(() => import('./nodes/1.js')),
        __memo(() => import('./nodes/2.js')),
        __memo(() => import('./nodes/3.js'))
      ],
      remotes: {

      },
      routes: [
        {
          id: "/",
          pattern: /^\/$/,
          params: [],
          page: { layouts: [0,], errors: [1,], leaf: 2 },
          endpoint: null
        },
        {
          id: "/settings",
          pattern: /^\/settings\/?$/,
          params: [],
          page: { layouts: [0,], errors: [1,], leaf: 3 },
          endpoint: null
        }
      ],
      prerendered_routes: new Set([]),
      matchers: async () => {

        return { };
      },
      server_assets: {}
    }
  }
})();
@@ -1,13 +0,0 @@


export const index = 0;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_layout.svelte.js')).default;
export const universal = {
  "ssr": false,
  "prerender": false
};
export const universal_id = "src/routes/+layout.ts";
export const imports = ["_app/immutable/nodes/0.DZdF_zz-.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/xdjHc-A2.js","_app/immutable/chunks/DXE57cnx.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/Dbod7Wv8.js"];
export const stylesheets = ["_app/immutable/assets/0.RZHRvmcL.css"];
export const fonts = [];
@@ -1,8 +0,0 @@


export const index = 1;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_error.svelte.js')).default;
export const imports = ["_app/immutable/nodes/1.Bh-fCbID.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/DXE57cnx.js","_app/immutable/chunks/D0iaTcAo.js","_app/immutable/chunks/BxZpmA7Z.js"];
export const stylesheets = [];
export const fonts = [];
@@ -1,14 +0,0 @@


export const index = 2;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/_page.svelte.js')).default;
export const universal = {
  "ssr": false,
  "prerender": false,
  "load": null
};
export const universal_id = "src/routes/+page.ts";
export const imports = ["_app/immutable/nodes/2.BmiXdPHI.js","_app/immutable/chunks/DyPeVqDG.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/Dbod7Wv8.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/vVxDbqKK.js","_app/immutable/chunks/BxZpmA7Z.js","_app/immutable/chunks/xdjHc-A2.js"];
export const stylesheets = [];
export const fonts = [];
@@ -1,14 +0,0 @@


export const index = 3;
let component_cache;
export const component = async () => component_cache ??= (await import('../entries/pages/settings/_page.svelte.js')).default;
export const universal = {
  "ssr": false,
  "prerender": false,
  "load": null
};
export const universal_id = "src/routes/settings/+page.ts";
export const imports = ["_app/immutable/nodes/3.guWMyWpk.js","_app/immutable/chunks/DyPeVqDG.js","_app/immutable/chunks/BtL0wB3H.js","_app/immutable/chunks/Dbod7Wv8.js","_app/immutable/chunks/cv2LK44M.js","_app/immutable/chunks/CRLlKr96.js","_app/immutable/chunks/vVxDbqKK.js"];
export const stylesheets = [];
export const fonts = [];
@@ -1,562 +0,0 @@
|
|||||||
import { get_request_store, with_request_store } from "@sveltejs/kit/internal/server";
|
|
||||||
import { parse } from "devalue";
|
|
||||||
import { error, json } from "@sveltejs/kit";
|
|
||||||
import { a as stringify_remote_arg, f as flatten_issues, b as create_field_proxy, n as normalize_issue, e as set_nested_value, g as deep_set, s as stringify, c as create_remote_key } from "./chunks/shared.js";
|
|
||||||
import { ValidationError } from "@sveltejs/kit/internal";
|
|
||||||
import { B as BROWSER } from "./chunks/false.js";
|
|
||||||
import { b as base, c as app_dir, p as prerendering } from "./chunks/environment.js";
|
|
||||||
function create_validator(validate_or_fn, maybe_fn) {
|
|
||||||
if (!maybe_fn) {
|
|
||||||
return (arg) => {
|
|
||||||
if (arg !== void 0) {
|
|
||||||
error(400, "Bad Request");
|
|
||||||
}
|
|
||||||
};
|
|
||||||
}
|
|
||||||
if (validate_or_fn === "unchecked") {
|
|
||||||
return (arg) => arg;
|
|
||||||
}
|
|
||||||
if ("~standard" in validate_or_fn) {
|
|
||||||
return async (arg) => {
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
const result = await validate_or_fn["~standard"].validate(arg);
|
|
||||||
if (result.issues) {
|
|
||||||
error(
|
|
||||||
400,
|
|
||||||
await state.handleValidationError({
|
|
||||||
issues: result.issues,
|
|
||||||
event
|
|
||||||
})
|
|
||||||
);
|
|
||||||
}
|
|
||||||
return result.value;
|
|
||||||
};
|
|
||||||
}
|
|
||||||
throw new Error(
|
|
||||||
'Invalid validator passed to remote function. Expected "unchecked" or a Standard Schema (https://standardschema.dev)'
|
|
||||||
);
|
|
||||||
}
|
|
||||||
async function get_response(info, arg, state, get_result) {
|
|
||||||
await 0;
|
|
||||||
const cache = get_cache(info, state);
|
|
||||||
return cache[stringify_remote_arg(arg, state.transport)] ??= get_result();
|
|
||||||
}
|
|
||||||
function parse_remote_response(data, transport) {
|
|
||||||
const revivers = {};
|
|
||||||
for (const key in transport) {
|
|
||||||
revivers[key] = transport[key].decode;
|
|
||||||
}
|
|
||||||
return parse(data, revivers);
|
|
||||||
}
|
|
||||||
async function run_remote_function(event, state, allow_cookies, arg, validate, fn) {
|
|
||||||
const store = {
|
|
||||||
event: {
|
|
||||||
...event,
|
|
||||||
setHeaders: () => {
|
|
||||||
throw new Error("setHeaders is not allowed in remote functions");
|
|
||||||
},
|
|
||||||
cookies: {
|
|
||||||
...event.cookies,
|
|
||||||
set: (name, value, opts) => {
|
|
||||||
if (!allow_cookies) {
|
|
||||||
throw new Error("Cannot set cookies in `query` or `prerender` functions");
|
|
||||||
}
|
|
||||||
if (opts.path && !opts.path.startsWith("/")) {
|
|
||||||
throw new Error("Cookies set in remote functions must have an absolute path");
|
|
||||||
}
|
|
||||||
return event.cookies.set(name, value, opts);
|
|
||||||
},
|
|
||||||
delete: (name, opts) => {
|
|
||||||
if (!allow_cookies) {
|
|
||||||
throw new Error("Cannot delete cookies in `query` or `prerender` functions");
|
|
||||||
}
|
|
||||||
if (opts.path && !opts.path.startsWith("/")) {
|
|
||||||
throw new Error("Cookies deleted in remote functions must have an absolute path");
|
|
||||||
}
|
|
||||||
return event.cookies.delete(name, opts);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
},
|
|
||||||
state: {
|
|
||||||
...state,
|
|
||||||
is_in_remote_function: true
|
|
||||||
}
|
|
||||||
};
|
|
||||||
const validated = await with_request_store(store, () => validate(arg));
|
|
||||||
return with_request_store(store, () => fn(validated));
|
|
||||||
}
|
|
||||||
function get_cache(info, state = get_request_store().state) {
|
|
||||||
let cache = state.remote_data?.get(info);
|
|
||||||
if (cache === void 0) {
|
|
||||||
cache = {};
|
|
||||||
(state.remote_data ??= /* @__PURE__ */ new Map()).set(info, cache);
|
|
||||||
}
|
|
||||||
return cache;
|
|
||||||
}
|
|
||||||
// @__NO_SIDE_EFFECTS__
|
|
||||||
function command(validate_or_fn, maybe_fn) {
|
|
||||||
const fn = maybe_fn ?? validate_or_fn;
|
|
||||||
const validate = create_validator(validate_or_fn, maybe_fn);
|
|
||||||
const __ = { type: "command", id: "", name: "" };
|
|
||||||
const wrapper = (arg) => {
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
if (state.is_endpoint_request) {
|
|
||||||
if (!["POST", "PUT", "PATCH", "DELETE"].includes(event.request.method)) {
|
|
||||||
throw new Error(
|
|
||||||
`Cannot call a command (\`${__.name}(${maybe_fn ? "..." : ""})\`) from a ${event.request.method} handler`
|
|
||||||
);
|
|
||||||
}
|
|
||||||
} else if (!event.isRemoteRequest) {
|
|
||||||
throw new Error(
|
|
||||||
`Cannot call a command (\`${__.name}(${maybe_fn ? "..." : ""})\`) during server-side rendering`
|
|
||||||
);
|
|
||||||
}
|
|
||||||
state.refreshes ??= {};
|
|
||||||
const promise = Promise.resolve(run_remote_function(event, state, true, arg, validate, fn));
|
|
||||||
promise.updates = () => {
|
|
||||||
throw new Error(`Cannot call '${__.name}(...).updates(...)' on the server`);
|
|
||||||
};
|
|
||||||
return (
|
|
||||||
/** @type {ReturnType<RemoteCommand<Input, Output>>} */
|
|
||||||
promise
|
|
||||||
);
|
|
||||||
};
|
|
||||||
Object.defineProperty(wrapper, "__", { value: __ });
|
|
||||||
Object.defineProperty(wrapper, "pending", {
|
|
||||||
get: () => 0
|
|
||||||
});
|
|
||||||
return wrapper;
|
|
||||||
}
|
|
||||||
// @__NO_SIDE_EFFECTS__
|
|
||||||
function form(validate_or_fn, maybe_fn) {
|
|
||||||
const fn = maybe_fn ?? validate_or_fn;
|
|
||||||
const schema = !maybe_fn || validate_or_fn === "unchecked" ? null : (
|
|
||||||
/** @type {any} */
|
|
||||||
validate_or_fn
|
|
||||||
);
|
|
||||||
function create_instance(key) {
|
|
||||||
const instance = {};
|
|
||||||
instance.method = "POST";
|
|
||||||
Object.defineProperty(instance, "enhance", {
|
|
||||||
value: () => {
|
|
||||||
return { action: instance.action, method: instance.method };
|
|
||||||
}
|
|
||||||
});
|
|
||||||
const button_props = {
|
|
||||||
type: "submit",
|
|
||||||
onclick: () => {
|
|
||||||
}
|
|
||||||
};
|
|
||||||
Object.defineProperty(button_props, "enhance", {
|
|
||||||
value: () => {
|
|
||||||
return { type: "submit", formaction: instance.buttonProps.formaction, onclick: () => {
|
|
||||||
} };
|
|
||||||
}
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "buttonProps", {
|
|
||||||
value: button_props
|
|
||||||
});
|
|
||||||
const __ = {
|
|
||||||
type: "form",
|
|
||||||
name: "",
|
|
||||||
id: "",
|
|
||||||
fn: async (data, meta, form_data) => {
|
|
||||||
const output = {};
|
|
||||||
output.submission = true;
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
const validated = await schema?.["~standard"].validate(data);
|
|
||||||
if (meta.validate_only) {
|
|
||||||
return validated?.issues?.map((issue) => normalize_issue(issue, true)) ?? [];
|
|
||||||
}
|
|
||||||
if (validated?.issues !== void 0) {
|
|
||||||
handle_issues(output, validated.issues, form_data);
|
|
||||||
} else {
|
|
||||||
if (validated !== void 0) {
|
|
||||||
data = validated.value;
|
|
||||||
}
|
|
||||||
state.refreshes ??= {};
|
|
||||||
const issue = create_issues();
|
|
||||||
try {
|
|
||||||
output.result = await run_remote_function(
|
|
||||||
event,
|
|
||||||
state,
|
|
||||||
true,
|
|
||||||
data,
|
|
||||||
(d) => d,
|
|
||||||
(data2) => !maybe_fn ? fn() : fn(data2, issue)
|
|
||||||
);
|
|
||||||
} catch (e) {
|
|
||||||
if (e instanceof ValidationError) {
|
|
||||||
handle_issues(output, e.issues, form_data);
|
|
||||||
} else {
|
|
||||||
throw e;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if (!event.isRemoteRequest) {
|
|
||||||
get_cache(__, state)[""] ??= output;
|
|
||||||
}
|
|
||||||
return output;
|
|
||||||
}
|
|
||||||
};
|
|
||||||
Object.defineProperty(instance, "__", { value: __ });
|
|
||||||
Object.defineProperty(instance, "action", {
|
|
||||||
get: () => `?/remote=${__.id}`,
|
|
||||||
enumerable: true
|
|
||||||
});
|
|
||||||
Object.defineProperty(button_props, "formaction", {
|
|
||||||
get: () => `?/remote=${__.id}`,
|
|
||||||
enumerable: true
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "fields", {
|
|
||||||
get() {
|
|
||||||
const data = get_cache(__)?.[""];
|
|
||||||
const issues = flatten_issues(data?.issues ?? []);
|
|
||||||
return create_field_proxy(
|
|
||||||
{},
|
|
||||||
() => data?.input ?? {},
|
|
||||||
(path, value) => {
|
|
||||||
if (data?.submission) {
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
const input = path.length === 0 ? value : deep_set(data?.input ?? {}, path.map(String), value);
|
|
||||||
(get_cache(__)[""] ??= {}).input = input;
|
|
||||||
},
|
|
||||||
() => issues
|
|
||||||
);
|
|
||||||
}
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "result", {
|
|
||||||
get() {
|
|
||||||
try {
|
|
||||||
return get_cache(__)?.[""]?.result;
|
|
||||||
} catch {
|
|
||||||
return void 0;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "pending", {
|
|
||||||
get: () => 0
|
|
||||||
});
|
|
||||||
Object.defineProperty(button_props, "pending", {
|
|
||||||
get: () => 0
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "preflight", {
|
|
||||||
// preflight is a noop on the server
|
|
||||||
value: () => instance
|
|
||||||
});
|
|
||||||
Object.defineProperty(instance, "validate", {
|
|
||||||
value: () => {
|
|
||||||
throw new Error("Cannot call validate() on the server");
|
|
||||||
}
|
|
||||||
});
|
|
||||||
if (key == void 0) {
|
|
||||||
Object.defineProperty(instance, "for", {
|
|
||||||
/** @type {RemoteForm<any, any>['for']} */
|
|
||||||
value: (key2) => {
|
|
||||||
const { state } = get_request_store();
|
|
||||||
const cache_key = __.id + "|" + JSON.stringify(key2);
|
|
||||||
let instance2 = (state.form_instances ??= /* @__PURE__ */ new Map()).get(cache_key);
|
|
||||||
if (!instance2) {
|
|
||||||
instance2 = create_instance(key2);
|
|
||||||
instance2.__.id = `${__.id}/${encodeURIComponent(JSON.stringify(key2))}`;
|
|
||||||
instance2.__.name = __.name;
|
|
||||||
state.form_instances.set(cache_key, instance2);
|
|
||||||
}
|
|
||||||
return instance2;
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
return instance;
|
|
||||||
}
|
|
||||||
return create_instance();
|
|
||||||
}
|
|
||||||
function handle_issues(output, issues, form_data) {
|
|
||||||
output.issues = issues.map((issue) => normalize_issue(issue, true));
|
|
||||||
if (form_data) {
|
|
||||||
output.input = {};
|
|
||||||
for (let key of form_data.keys()) {
|
|
||||||
if (/^[.\]]?_/.test(key)) continue;
|
|
||||||
const is_array = key.endsWith("[]");
|
|
||||||
const values = form_data.getAll(key).filter((value) => typeof value === "string");
|
|
||||||
if (is_array) key = key.slice(0, -2);
|
|
||||||
set_nested_value(
|
|
||||||
/** @type {Record<string, any>} */
|
|
||||||
output.input,
|
|
||||||
key,
|
|
||||||
is_array ? values : values[0]
|
|
||||||
);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
function create_issues() {
|
|
||||||
return (
|
|
||||||
/** @type {InvalidField<any>} */
|
|
||||||
new Proxy(
|
|
||||||
/** @param {string} message */
|
|
||||||
(message) => {
|
|
||||||
if (typeof message !== "string") {
|
|
||||||
throw new Error(
|
|
||||||
"`invalid` should now be imported from `@sveltejs/kit` to throw validation issues. The second parameter provided to the form function (renamed to `issue`) is still used to construct issues, e.g. `invalid(issue.field('message'))`. For more info see https://github.com/sveltejs/kit/pulls/14768"
|
|
||||||
);
|
|
||||||
}
|
|
||||||
return create_issue(message);
|
|
||||||
},
|
|
||||||
{
|
|
||||||
get(target, prop) {
|
|
||||||
if (typeof prop === "symbol") return (
|
|
||||||
/** @type {any} */
|
|
||||||
target[prop]
|
|
||||||
);
|
|
||||||
return create_issue_proxy(prop, []);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
)
|
|
||||||
);
|
|
||||||
function create_issue(message, path = []) {
|
|
||||||
return {
|
|
||||||
message,
|
|
||||||
path
|
|
||||||
};
|
|
||||||
}
|
|
||||||
function create_issue_proxy(key, path) {
|
|
||||||
const new_path = [...path, key];
|
|
||||||
const issue_func = (message) => create_issue(message, new_path);
|
|
||||||
return new Proxy(issue_func, {
|
|
||||||
get(target, prop) {
|
|
||||||
if (typeof prop === "symbol") return (
|
|
||||||
/** @type {any} */
|
|
||||||
target[prop]
|
|
||||||
);
|
|
||||||
if (/^\d+$/.test(prop)) {
|
|
||||||
return create_issue_proxy(parseInt(prop, 10), new_path);
|
|
||||||
}
|
|
||||||
return create_issue_proxy(prop, new_path);
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
|
||||||
// @__NO_SIDE_EFFECTS__
|
|
||||||
function prerender(validate_or_fn, fn_or_options, maybe_options) {
|
|
||||||
const maybe_fn = typeof fn_or_options === "function" ? fn_or_options : void 0;
|
|
||||||
const options = maybe_options ?? (maybe_fn ? void 0 : fn_or_options);
|
|
||||||
const fn = maybe_fn ?? validate_or_fn;
|
|
||||||
const validate = create_validator(validate_or_fn, maybe_fn);
|
|
||||||
const __ = {
|
|
||||||
type: "prerender",
|
|
||||||
id: "",
|
|
||||||
name: "",
|
|
||||||
has_arg: !!maybe_fn,
|
|
||||||
inputs: options?.inputs,
|
|
||||||
dynamic: options?.dynamic
|
|
||||||
};
|
|
||||||
const wrapper = (arg) => {
|
|
||||||
const promise = (async () => {
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
const payload = stringify_remote_arg(arg, state.transport);
|
|
||||||
const id = __.id;
|
|
||||||
const url = `${base}/${app_dir}/remote/${id}${payload ? `/${payload}` : ""}`;
|
|
||||||
if (!state.prerendering && !BROWSER && !event.isRemoteRequest) {
|
|
||||||
try {
|
|
||||||
return await get_response(__, arg, state, async () => {
|
|
||||||
const key = stringify_remote_arg(arg, state.transport);
|
|
||||||
const cache = get_cache(__, state);
|
|
||||||
const promise3 = cache[key] ??= fetch(new URL(url, event.url.origin).href).then(
|
|
||||||
async (response) => {
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Prerendered response not found");
|
|
||||||
}
|
|
||||||
const prerendered = await response.json();
|
|
||||||
if (prerendered.type === "error") {
|
|
||||||
error(prerendered.status, prerendered.error);
|
|
||||||
}
|
|
||||||
return prerendered.result;
|
|
||||||
}
|
|
||||||
);
|
|
||||||
return parse_remote_response(await promise3, state.transport);
|
|
||||||
});
|
|
||||||
} catch {
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if (state.prerendering?.remote_responses.has(url)) {
|
|
||||||
return (
|
|
||||||
/** @type {Promise<any>} */
|
|
||||||
state.prerendering.remote_responses.get(url)
|
|
||||||
);
|
|
||||||
}
|
|
||||||
const promise2 = get_response(
|
|
||||||
__,
|
|
||||||
arg,
|
|
||||||
state,
|
|
||||||
() => run_remote_function(event, state, false, arg, validate, fn)
|
|
||||||
);
|
|
||||||
if (state.prerendering) {
|
|
||||||
state.prerendering.remote_responses.set(url, promise2);
|
|
||||||
}
|
|
||||||
const result = await promise2;
|
|
||||||
if (state.prerendering) {
|
|
||||||
const body = { type: "result", result: stringify(result, state.transport) };
|
|
||||||
state.prerendering.dependencies.set(url, {
|
|
||||||
body: JSON.stringify(body),
|
|
||||||
response: json(body)
|
|
||||||
});
|
|
||||||
}
|
|
||||||
return result;
|
|
||||||
})();
|
|
||||||
promise.catch(() => {
|
|
||||||
});
|
|
||||||
return (
|
|
||||||
/** @type {RemoteResource<Output>} */
|
|
||||||
promise
|
|
||||||
);
|
|
||||||
};
|
|
||||||
Object.defineProperty(wrapper, "__", { value: __ });
|
|
||||||
return wrapper;
|
|
||||||
}
|
|
||||||
// @__NO_SIDE_EFFECTS__
|
|
||||||
function query(validate_or_fn, maybe_fn) {
|
|
||||||
const fn = maybe_fn ?? validate_or_fn;
|
|
||||||
const validate = create_validator(validate_or_fn, maybe_fn);
|
|
||||||
const __ = { type: "query", id: "", name: "" };
|
|
||||||
const wrapper = (arg) => {
|
|
||||||
if (prerendering) {
|
|
||||||
throw new Error(
|
|
||||||
`Cannot call query '${__.name}' while prerendering, as prerendered pages need static data. Use 'prerender' from $app/server instead`
|
|
||||||
);
|
|
||||||
}
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
const get_remote_function_result = () => run_remote_function(event, state, false, arg, validate, fn);
|
|
||||||
const promise = get_response(__, arg, state, get_remote_function_result);
|
|
||||||
promise.catch(() => {
|
|
||||||
});
|
|
||||||
promise.set = (value) => update_refresh_value(get_refresh_context(__, "set", arg), value);
|
|
||||||
promise.refresh = () => {
|
|
||||||
const refresh_context = get_refresh_context(__, "refresh", arg);
|
|
||||||
const is_immediate_refresh = !refresh_context.cache[refresh_context.cache_key];
|
|
||||||
const value = is_immediate_refresh ? promise : get_remote_function_result();
|
|
||||||
return update_refresh_value(refresh_context, value, is_immediate_refresh);
|
|
||||||
};
|
|
||||||
promise.withOverride = () => {
|
|
||||||
throw new Error(`Cannot call '${__.name}.withOverride()' on the server`);
|
|
||||||
};
|
|
||||||
return (
|
|
||||||
/** @type {RemoteQuery<Output>} */
|
|
||||||
promise
|
|
||||||
);
|
|
||||||
};
|
|
||||||
Object.defineProperty(wrapper, "__", { value: __ });
|
|
||||||
return wrapper;
|
|
||||||
}
|
|
||||||
// @__NO_SIDE_EFFECTS__
|
|
||||||
function batch(validate_or_fn, maybe_fn) {
|
|
||||||
const fn = maybe_fn ?? validate_or_fn;
|
|
||||||
const validate = create_validator(validate_or_fn, maybe_fn);
|
|
||||||
const __ = {
|
|
||||||
type: "query_batch",
|
|
||||||
id: "",
|
|
||||||
name: "",
|
|
||||||
run: (args) => {
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
return run_remote_function(
|
|
||||||
event,
|
|
||||||
state,
|
|
||||||
false,
|
|
||||||
args,
|
|
||||||
(array) => Promise.all(array.map(validate)),
|
|
||||||
fn
|
|
||||||
);
|
|
||||||
}
|
|
||||||
};
|
|
||||||
let batching = { args: [], resolvers: [] };
|
|
||||||
const wrapper = (arg) => {
|
|
||||||
if (prerendering) {
|
|
||||||
throw new Error(
|
|
||||||
`Cannot call query.batch '${__.name}' while prerendering, as prerendered pages need static data. Use 'prerender' from $app/server instead`
|
|
||||||
);
|
|
||||||
}
|
|
||||||
const { event, state } = get_request_store();
|
|
||||||
const get_remote_function_result = () => {
|
|
||||||
return new Promise((resolve, reject) => {
|
|
||||||
batching.args.push(arg);
|
|
||||||
batching.resolvers.push({ resolve, reject });
|
|
||||||
if (batching.args.length > 1) return;
|
|
||||||
setTimeout(async () => {
|
|
||||||
const batched = batching;
|
|
||||||
batching = { args: [], resolvers: [] };
|
|
||||||
try {
|
|
||||||
const get_result = await run_remote_function(
|
|
||||||
event,
|
|
||||||
state,
|
|
||||||
false,
|
|
||||||
batched.args,
|
|
||||||
(array) => Promise.all(array.map(validate)),
|
|
||||||
fn
|
|
||||||
);
|
|
||||||
for (let i = 0; i < batched.resolvers.length; i++) {
|
|
||||||
try {
|
|
||||||
batched.resolvers[i].resolve(get_result(batched.args[i], i));
|
|
||||||
} catch (error2) {
|
|
||||||
batched.resolvers[i].reject(error2);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (error2) {
|
|
||||||
for (const resolver of batched.resolvers) {
|
|
||||||
resolver.reject(error2);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}, 0);
|
|
||||||
});
|
|
||||||
};
|
|
||||||
		const promise = get_response(__, arg, state, get_remote_function_result);
		promise.catch(() => {});
		promise.set = (value) => update_refresh_value(get_refresh_context(__, "set", arg), value);
		promise.refresh = () => {
			const refresh_context = get_refresh_context(__, "refresh", arg);
			const is_immediate_refresh = !refresh_context.cache[refresh_context.cache_key];
			const value = is_immediate_refresh ? promise : get_remote_function_result();
			return update_refresh_value(refresh_context, value, is_immediate_refresh);
		};
		promise.withOverride = () => {
			throw new Error(`Cannot call '${__.name}.withOverride()' on the server`);
		};
		return (
			/** @type {RemoteQuery<Output>} */
			promise
		);
	};
	Object.defineProperty(wrapper, "__", { value: __ });
	return wrapper;
}
Object.defineProperty(query, "batch", { value: batch, enumerable: true });
function get_refresh_context(__, action, arg) {
	const { state } = get_request_store();
	const { refreshes } = state;
	if (!refreshes) {
		const name = __.type === "query_batch" ? `query.batch '${__.name}'` : `query '${__.name}'`;
		throw new Error(
			`Cannot call ${action} on ${name} because it is not executed in the context of a command/form remote function`
		);
	}
	const cache = get_cache(__, state);
	const cache_key = stringify_remote_arg(arg, state.transport);
	const refreshes_key = create_remote_key(__.id, cache_key);
	return { __, state, refreshes, refreshes_key, cache, cache_key };
}
function update_refresh_value({ __, refreshes, refreshes_key, cache, cache_key }, value, is_immediate_refresh = false) {
	const promise = Promise.resolve(value);
	if (!is_immediate_refresh) {
		cache[cache_key] = promise;
	}
	if (__.id) {
		refreshes[refreshes_key] = promise;
	}
	return promise.then(() => {});
}
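// Reading of the two helpers above (a gloss, not part of the bundled source):
// `set(value)` stores a resolved promise in both the per-request cache and the
// `refreshes` map so the client picks up the new value, while `refresh()`
// re-runs the remote function, except when no cached entry exists yet, in
// which case the in-flight promise itself is reused.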
export {
	command,
	form,
	prerender,
	query
};
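For orientation, a minimal usage sketch of the batching API above (hypothetical app code, not part of this diff; the valibot validator and the `$lib/server/db` helper are assumptions):

import { query } from '$app/server';
import * as v from 'valibot';
import { getWeatherForCities } from '$lib/server/db'; // hypothetical helper

export const getWeather = query.batch(v.string(), async (cities) => {
	// All calls made in the same tick arrive here as one array...
	const byCity = await getWeatherForCities(cities);
	// ...and the returned lookup function resolves each caller individually,
	// mirroring `get_result(batched.args[i], i)` in the bundle above.
	return (city) => byCity.get(city);
});

Three calls in the same tick, e.g. getWeather('Oslo'), getWeather('Kyiv') and getWeather('Lima'), then share a single server round-trip.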
@@ -1,52 +0,0 @@
|
|||||||
{
|
|
||||||
"compilerOptions": {
|
|
||||||
"paths": {
|
|
||||||
"$lib": [
|
|
||||||
"../src/lib"
|
|
||||||
],
|
|
||||||
"$lib/*": [
|
|
||||||
"../src/lib/*"
|
|
||||||
],
|
|
||||||
"$app/types": [
|
|
||||||
"./types/index.d.ts"
|
|
||||||
]
|
|
||||||
},
|
|
||||||
"rootDirs": [
|
|
||||||
"..",
|
|
||||||
"./types"
|
|
||||||
],
|
|
||||||
"verbatimModuleSyntax": true,
|
|
||||||
"isolatedModules": true,
|
|
||||||
"lib": [
|
|
||||||
"esnext",
|
|
||||||
"DOM",
|
|
||||||
"DOM.Iterable"
|
|
||||||
],
|
|
||||||
"moduleResolution": "bundler",
|
|
||||||
"module": "esnext",
|
|
||||||
"noEmit": true,
|
|
||||||
"target": "esnext"
|
|
||||||
},
|
|
||||||
"include": [
|
|
||||||
"ambient.d.ts",
|
|
||||||
"non-ambient.d.ts",
|
|
||||||
"./types/**/$types.d.ts",
|
|
||||||
"../vite.config.js",
|
|
||||||
"../vite.config.ts",
|
|
||||||
"../src/**/*.js",
|
|
||||||
"../src/**/*.ts",
|
|
||||||
"../src/**/*.svelte",
|
|
||||||
"../tests/**/*.js",
|
|
||||||
"../tests/**/*.ts",
|
|
||||||
"../tests/**/*.svelte"
|
|
||||||
],
|
|
||||||
"exclude": [
|
|
||||||
"../node_modules/**",
|
|
||||||
"../src/service-worker.js",
|
|
||||||
"../src/service-worker/**/*.js",
|
|
||||||
"../src/service-worker.ts",
|
|
||||||
"../src/service-worker/**/*.ts",
|
|
||||||
"../src/service-worker.d.ts",
|
|
||||||
"../src/service-worker/**/*.d.ts"
|
|
||||||
]
|
|
||||||
}
|
|
||||||
frontend/package-lock.json (generated, 13 lines changed)
@@ -7,6 +7,9 @@
 	"": {
 		"name": "frontend",
 		"version": "0.0.0",
+		"dependencies": {
+			"date-fns": "^4.1.0"
+		},
 		"devDependencies": {
 			"@sveltejs/adapter-static": "^3.0.10",
 			"@sveltejs/kit": "^2.49.2",
@@ -1279,6 +1282,16 @@
 			"node": ">=4"
 		}
 	},
+	"node_modules/date-fns": {
+		"version": "4.1.0",
+		"resolved": "https://registry.npmjs.org/date-fns/-/date-fns-4.1.0.tgz",
+		"integrity": "sha512-Ukq0owbQXxa/U3EGtsdVBkR1w7KOQ5gIBqdH2hkvknzZPYvBxb/aa6E8L7tmjFtkwZBu3UXBbjIgPo/Ez4xaNg==",
+		"license": "MIT",
+		"funding": {
+			"type": "github",
+			"url": "https://github.com/sponsors/kossnocorp"
+		}
+	},
 	"node_modules/debug": {
 		"version": "4.4.3",
 		"resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",
frontend/package.json
@@ -17,5 +17,8 @@
 		"svelte": "^5.43.8",
 		"tailwindcss": "^3.0.0",
 		"vite": "^7.2.4"
+	},
+	"dependencies": {
+		"date-fns": "^4.1.0"
 	}
 }
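The only runtime dependency this changeset adds is date-fns. A minimal sketch of the formatting it is typically used for (hypothetical component code, not taken from this diff; `task` and its fields are assumptions):

import { format, formatDistanceToNow } from 'date-fns';

// Absolute timestamp, e.g. "2025-12-19 14:30"
const nextRun = format(new Date(task.next_run_at), 'yyyy-MM-dd HH:mm');
// Relative timestamp, e.g. "about 2 hours ago"
const lastRun = formatDistanceToNow(new Date(task.last_run_at), { addSuffix: true });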
App.svelte
@@ -39,7 +39,7 @@
 			console.error(`[App.handleFormSubmit][Coherence:Failed] Task creation failed error=${error}`);
 		}
 	}
-	// [/DEF:handleFormSubmit]
+	// [/DEF:handleFormSubmit:Function]
 
 	// [DEF:navigate:Function]
 	/**
@@ -56,7 +56,7 @@
 		// Then set page
 		currentPage.set(page);
 	}
-	// [/DEF:navigate]
+	// [/DEF:navigate:Function]
 </script>
 
 <!-- [SECTION: TEMPLATE] -->
@@ -110,4 +110,4 @@
 </main>
 <!-- [/SECTION] -->
 
-<!-- [/DEF:App] -->
+<!-- [/DEF:App:Component] -->
DashboardGrid.svelte
@@ -69,7 +69,7 @@
 			sortDirection = "asc";
 		}
 	}
-	// [/DEF:handleSort]
+	// [/DEF:handleSort:Function]
 
 	// [DEF:handleSelectionChange:Function]
 	// @PURPOSE: Handles individual checkbox changes.
@@ -83,7 +83,7 @@
 		selectedIds = newSelected;
 		dispatch('selectionChanged', newSelected);
 	}
-	// [/DEF:handleSelectionChange]
+	// [/DEF:handleSelectionChange:Function]
 
 	// [DEF:handleSelectAll:Function]
 	// @PURPOSE: Handles select all checkbox.
@@ -101,7 +101,7 @@
 		selectedIds = newSelected;
 		dispatch('selectionChanged', newSelected);
 	}
-	// [/DEF:handleSelectAll]
+	// [/DEF:handleSelectAll:Function]
 
 	// [DEF:goToPage:Function]
 	// @PURPOSE: Changes current page.
@@ -110,7 +110,7 @@
 			currentPage = page;
 		}
 	}
-	// [/DEF:goToPage]
+	// [/DEF:goToPage:Function]
 
 </script>
 
@@ -202,4 +202,4 @@
 /* Component styles */
 </style>
 
-<!-- [/DEF:DashboardGrid] -->
+<!-- [/DEF:DashboardGrid:Component] -->
DynamicForm.svelte
@@ -28,7 +28,7 @@
 		console.log("[DynamicForm][Action] Submitting form data.", { formData });
 		dispatch('submit', formData);
 	}
-	// [/DEF:handleSubmit]
+	// [/DEF:handleSubmit:Function]
 
 	// [DEF:initializeForm:Function]
 	/**
@@ -41,7 +41,7 @@
 			}
 		}
 	}
-	// [/DEF:initializeForm]
+	// [/DEF:initializeForm:Function]
 
 	initializeForm();
 </script>
@@ -85,4 +85,4 @@
 </form>
 <!-- [/SECTION] -->
 
-<!-- [/DEF:DynamicForm] -->
+<!-- [/DEF:DynamicForm:Component] -->
EnvSelector.svelte
@@ -31,7 +31,7 @@
 		selectedId = target.value;
 		dispatch('change', { id: selectedId });
 	}
-	// [/DEF:handleSelect]
+	// [/DEF:handleSelect:Function]
 </script>
 
 <!-- [SECTION: TEMPLATE] -->
@@ -55,4 +55,4 @@
 /* Component specific styles */
 </style>
 
-<!-- [/DEF:EnvSelector] -->
+<!-- [/DEF:EnvSelector:Component] -->
Footer.svelte
@@ -1,3 +1,10 @@
+<!-- [DEF:Footer:Component] -->
+<!--
+@SEMANTICS: footer, layout, copyright
+@PURPOSE: Displays the application footer with copyright information.
+@LAYER: UI
+-->
 <footer class="bg-white border-t p-4 mt-8 text-center text-gray-500 text-sm">
 	© 2025 Superset Tools. All rights reserved.
 </footer>
+<!-- [/DEF:Footer:Component] -->
MappingTable.svelte
@@ -29,7 +29,7 @@
 	function updateMapping(sourceUuid: string, targetUuid: string) {
 		dispatch('update', { sourceUuid, targetUuid });
 	}
-	// [/DEF:updateMapping]
+	// [/DEF:updateMapping:Function]
 
 	// [DEF:getSuggestion:Function]
 	/**
@@ -38,7 +38,7 @@
 	function getSuggestion(sourceUuid: string) {
 		return suggestions.find(s => s.source_db_uuid === sourceUuid);
 	}
-	// [/DEF:getSuggestion]
+	// [/DEF:getSuggestion:Function]
 </script>
 
 <!-- [SECTION: TEMPLATE] -->
@@ -91,4 +91,4 @@
 /* Component specific styles */
 </style>
 
-<!-- [/DEF:MappingTable] -->
+<!-- [/DEF:MappingTable:Component] -->
MissingMappingModal.svelte
@@ -24,6 +24,7 @@
 	const dispatch = createEventDispatcher();
 
 	// [DEF:resolve:Function]
+	// @PURPOSE: Dispatches the resolution event with the selected mapping.
 	function resolve() {
 		if (!selectedTargetUuid) return;
 		dispatch('resolve', {
@@ -33,14 +34,15 @@
 		});
 		show = false;
 	}
-	// [/DEF:resolve]
+	// [/DEF:resolve:Function]
 
 	// [DEF:cancel:Function]
+	// @PURPOSE: Cancels the mapping resolution modal.
 	function cancel() {
 		dispatch('cancel');
 		show = false;
 	}
-	// [/DEF:cancel]
+	// [/DEF:cancel:Function]
 </script>
 
 <!-- [SECTION: TEMPLATE] -->
@@ -109,4 +111,4 @@
 /* Modal specific styles */
 </style>
 
-<!-- [/DEF:MissingMappingModal] -->
+<!-- [/DEF:MissingMappingModal:Component] -->
Navbar.svelte
@@ -1,3 +1,10 @@
+<!-- [DEF:Navbar:Component] -->
+<!--
+@SEMANTICS: navbar, navigation, header, layout
+@PURPOSE: Main navigation bar for the application.
+@LAYER: UI
+@RELATION: USES -> $app/stores
+-->
 <script>
 	import { page } from '$app/stores';
 </script>
@@ -22,6 +29,12 @@
 		>
 			Migration
 		</a>
+		<a
+			href="/tasks"
+			class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/tasks') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
+		>
+			Tasks
+		</a>
 		<a
 			href="/settings"
 			class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
@@ -30,3 +43,4 @@
 		</a>
 	</nav>
 </header>
+<!-- [/DEF:Navbar:Component] -->
PasswordPrompt.svelte
@@ -18,6 +18,8 @@
 	let passwords = {};
 	let submitting = false;
 
+	// [DEF:handleSubmit:Function]
+	// @PURPOSE: Validates and dispatches the passwords to resume the task.
 	function handleSubmit() {
 		if (submitting) return;
 
@@ -32,11 +34,15 @@
 		dispatch('resume', { passwords });
 		// Reset submitting state is handled by parent or on close
 	}
+	// [/DEF:handleSubmit:Function]
 
+	// [DEF:handleCancel:Function]
+	// @PURPOSE: Cancels the password prompt.
 	function handleCancel() {
 		dispatch('cancel');
 		show = false;
 	}
+	// [/DEF:handleCancel:Function]
 
 	// Reset passwords when modal opens/closes
 	$: if (!show) {
@@ -120,4 +126,4 @@
 		</div>
 	</div>
 {/if}
-<!-- [/DEF:PasswordPrompt] -->
+<!-- [/DEF:PasswordPrompt:Component] -->
Some files were not shown because too many files have changed in this diff.