Compare commits

2 commits: 45c077b928 ... a747a163c8
- a747a163c8
- fce0941e98
@@ -16,6 +16,8 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
 - Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
 - SQLite (optional for job history), existing database for mappings (008-migration-ui-improvements)
 - Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
+- Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS (009-backup-scheduler)
+- SQLite (`tasks.db`), JSON (`config.json`) (009-backup-scheduler)
 
 - Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
 
@@ -36,9 +38,9 @@ cd src; pytest; ruff check .
 Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
 
 ## Recent Changes
-- 008-migration-ui-improvements: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
-- 008-migration-ui-improvements: Added Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
-- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
+- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
+- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
+- 009-backup-scheduler: Added [if applicable, e.g., PostgreSQL, CoreData, files or N/A]
 
 
 <!-- MANUAL ADDITIONS START -->
check-prerequisites.sh

@@ -9,8 +9,8 @@
 #
 # OPTIONS:
 #   --json              Output in JSON format
-#   --require-tasks     Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
-#   --include-tasks     Include task files in AVAILABLE_DOCS list
+#   --require-tasks     Require tasks.md to exist (for implementation phase)
+#   --include-tasks     Include tasks.md in AVAILABLE_DOCS list
 #   --paths-only        Only output path variables (no validation)
 #   --help, -h          Show help message
 #
@@ -49,8 +49,8 @@ Consolidated prerequisite checking for Spec-Driven Development workflow.
 
 OPTIONS:
   --json              Output in JSON format
-  --require-tasks     Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
-  --include-tasks     Include task files in AVAILABLE_DOCS list
+  --require-tasks     Require tasks.md to exist (for implementation phase)
+  --include-tasks     Include tasks.md in AVAILABLE_DOCS list
   --paths-only        Only output path variables (no prerequisite validation)
   --help, -h          Show this help message
 
@@ -58,7 +58,7 @@ EXAMPLES:
   # Check task prerequisites (plan.md required)
   ./check-prerequisites.sh --json
 
-  # Check implementation prerequisites (plan.md + task files required)
+  # Check implementation prerequisites (plan.md + tasks.md required)
   ./check-prerequisites.sh --json --require-tasks --include-tasks
 
   # Get feature paths only (no validation)
@@ -86,16 +86,15 @@ check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1
 if $PATHS_ONLY; then
     if $JSON_MODE; then
         # Minimal JSON paths payload (no validation performed)
-        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS_ARCH":"%s","TASKS_DEV":"%s"}\n' \
-            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS_ARCH" "$TASKS_DEV"
+        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
+            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
     else
         echo "REPO_ROOT: $REPO_ROOT"
         echo "BRANCH: $CURRENT_BRANCH"
         echo "FEATURE_DIR: $FEATURE_DIR"
         echo "FEATURE_SPEC: $FEATURE_SPEC"
         echo "IMPL_PLAN: $IMPL_PLAN"
-        echo "TASKS_ARCH: $TASKS_ARCH"
-        echo "TASKS_DEV: $TASKS_DEV"
+        echo "TASKS: $TASKS"
    fi
    exit 0
 fi
@@ -113,20 +112,11 @@ if [[ ! -f "$IMPL_PLAN" ]]; then
     exit 1
 fi
 
-# Check for task files if required
-if $REQUIRE_TASKS; then
-    # Check for split tasks first
-    if [[ -f "$TASKS_ARCH" ]] && [[ -f "$TASKS_DEV" ]]; then
-        : # Split tasks exist, proceed
-    # Fallback to unified tasks.md
-    elif [[ -f "$TASKS" ]]; then
-        : # Unified tasks exist, proceed
-    else
-        echo "ERROR: No valid task files found in $FEATURE_DIR" >&2
-        echo "Expected 'tasks-arch.md' AND 'tasks-dev.md' (split) OR 'tasks.md' (unified)" >&2
-        echo "Run /speckit.tasks first to create the task lists." >&2
-        exit 1
-    fi
+# Check for tasks.md if required
+if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
+    echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
+    echo "Run /speckit.tasks first to create the task list." >&2
+    exit 1
 fi
 
 # Build list of available documents
@@ -143,14 +133,9 @@ fi
 
 [[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")
 
-# Include task files if requested and they exist
-if $INCLUDE_TASKS; then
-    if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
-        [[ -f "$TASKS_ARCH" ]] && docs+=("tasks-arch.md")
-        [[ -f "$TASKS_DEV" ]] && docs+=("tasks-dev.md")
-    elif [[ -f "$TASKS" ]]; then
-        docs+=("tasks.md")
-    fi
+# Include tasks.md if requested and it exists
+if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
+    docs+=("tasks.md")
 fi
 
 # Output results
@@ -176,11 +161,6 @@ else
     check_file "$QUICKSTART" "quickstart.md"
 
     if $INCLUDE_TASKS; then
-        if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
-            check_file "$TASKS_ARCH" "tasks-arch.md"
-            check_file "$TASKS_DEV" "tasks-dev.md"
-        else
-            check_file "$TASKS" "tasks.md"
-        fi
+        check_file "$TASKS" "tasks.md"
     fi
 fi

@@ -143,9 +143,7 @@ HAS_GIT='$has_git_repo'
 FEATURE_DIR='$feature_dir'
 FEATURE_SPEC='$feature_dir/spec.md'
 IMPL_PLAN='$feature_dir/plan.md'
-TASKS_ARCH='$feature_dir/tasks-arch.md'
-TASKS_DEV='$feature_dir/tasks-dev.md'
-TASKS='$feature_dir/tasks.md' # Deprecated
+TASKS='$feature_dir/tasks.md'
 RESEARCH='$feature_dir/research.md'
 DATA_MODEL='$feature_dir/data-model.md'
 QUICKSTART='$feature_dir/quickstart.md'
.specify/templates/tasks-template.md (new file, 251 lines)

---

description: "Task list template for feature implementation"
---

# Tasks: [FEATURE NAME]

**Input**: Design documents from `/specs/[###-feature-name]/`
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/

**Tests**: The examples below include test tasks. Tests are OPTIONAL - only include them if explicitly requested in the feature specification.

**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.

## Format: `[ID] [P?] [Story] Description`

- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions

## Path Conventions

- **Single project**: `src/`, `tests/` at repository root
- **Web app**: `backend/src/`, `frontend/src/`
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
- Paths shown below assume single project - adjust based on plan.md structure

<!--
============================================================================
IMPORTANT: The tasks below are SAMPLE TASKS for illustration purposes only.

The /speckit.tasks command MUST replace these with actual tasks based on:
- User stories from spec.md (with their priorities P1, P2, P3...)
- Feature requirements from plan.md
- Entities from data-model.md
- Endpoints from contracts/

Tasks MUST be organized by user story so each story can be:
- Implemented independently
- Tested independently
- Delivered as an MVP increment

DO NOT keep these sample tasks in the generated tasks.md file.
============================================================================
-->

## Phase 1: Setup (Shared Infrastructure)

**Purpose**: Project initialization and basic structure

- [ ] T001 Create project structure per implementation plan
- [ ] T002 Initialize [language] project with [framework] dependencies
- [ ] T003 [P] Configure linting and formatting tools

---

## Phase 2: Foundational (Blocking Prerequisites)

**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented

**⚠️ CRITICAL**: No user story work can begin until this phase is complete

Examples of foundational tasks (adjust based on your project):

- [ ] T004 Setup database schema and migrations framework
- [ ] T005 [P] Implement authentication/authorization framework
- [ ] T006 [P] Setup API routing and middleware structure
- [ ] T007 Create base models/entities that all stories depend on
- [ ] T008 Configure error handling and logging infrastructure
- [ ] T009 Setup environment configuration management

**Checkpoint**: Foundation ready - user story implementation can now begin in parallel

---

## Phase 3: User Story 1 - [Title] (Priority: P1) 🎯 MVP

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 1 (OPTIONAL - only if tests requested) ⚠️

> **NOTE: Write these tests FIRST, ensure they FAIL before implementation**

- [ ] T010 [P] [US1] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T011 [P] [US1] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 1

- [ ] T012 [P] [US1] Create [Entity1] model in src/models/[entity1].py
- [ ] T013 [P] [US1] Create [Entity2] model in src/models/[entity2].py
- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T016 [US1] Add validation and error handling
- [ ] T017 [US1] Add logging for user story 1 operations

**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently

---

## Phase 4: User Story 2 - [Title] (Priority: P2)

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 2 (OPTIONAL - only if tests requested) ⚠️

- [ ] T018 [P] [US2] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T019 [P] [US2] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 2

- [ ] T020 [P] [US2] Create [Entity] model in src/models/[entity].py
- [ ] T021 [US2] Implement [Service] in src/services/[service].py
- [ ] T022 [US2] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T023 [US2] Integrate with User Story 1 components (if needed)

**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently

---

## Phase 5: User Story 3 - [Title] (Priority: P3)

**Goal**: [Brief description of what this story delivers]

**Independent Test**: [How to verify this story works on its own]

### Tests for User Story 3 (OPTIONAL - only if tests requested) ⚠️

- [ ] T024 [P] [US3] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T025 [P] [US3] Integration test for [user journey] in tests/integration/test_[name].py

### Implementation for User Story 3

- [ ] T026 [P] [US3] Create [Entity] model in src/models/[entity].py
- [ ] T027 [US3] Implement [Service] in src/services/[service].py
- [ ] T028 [US3] Implement [endpoint/feature] in src/[location]/[file].py

**Checkpoint**: All user stories should now be independently functional

---

[Add more user story phases as needed, following the same pattern]

---

## Phase N: Polish & Cross-Cutting Concerns

**Purpose**: Improvements that affect multiple user stories

- [ ] TXXX [P] Documentation updates in docs/
- [ ] TXXX Code cleanup and refactoring
- [ ] TXXX Performance optimization across all stories
- [ ] TXXX [P] Additional unit tests (if requested) in tests/unit/
- [ ] TXXX Security hardening
- [ ] TXXX Run quickstart.md validation

---

## Dependencies & Execution Order

### Phase Dependencies

- **Setup (Phase 1)**: No dependencies - can start immediately
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
  - User stories can then proceed in parallel (if staffed)
  - Or sequentially in priority order (P1 → P2 → P3)
- **Polish (Final Phase)**: Depends on all desired user stories being complete

### User Story Dependencies

- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable

### Within Each User Story

- Tests (if included) MUST be written and FAIL before implementation
- Models before services
- Services before endpoints
- Core implementation before integration
- Story complete before moving to next priority

### Parallel Opportunities

- All Setup tasks marked [P] can run in parallel
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
- Once Foundational phase completes, all user stories can start in parallel (if team capacity allows)
- All tests for a user story marked [P] can run in parallel
- Models within a story marked [P] can run in parallel
- Different user stories can be worked on in parallel by different team members

---

## Parallel Example: User Story 1

```bash
# Launch all tests for User Story 1 together (if tests requested):
Task: "Contract test for [endpoint] in tests/contract/test_[name].py"
Task: "Integration test for [user journey] in tests/integration/test_[name].py"

# Launch all models for User Story 1 together:
Task: "Create [Entity1] model in src/models/[entity1].py"
Task: "Create [Entity2] model in src/models/[entity2].py"
```

---

## Implementation Strategy

### MVP First (User Story 1 Only)

1. Complete Phase 1: Setup
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
3. Complete Phase 3: User Story 1
4. **STOP and VALIDATE**: Test User Story 1 independently
5. Deploy/demo if ready

### Incremental Delivery

1. Complete Setup + Foundational → Foundation ready
2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
3. Add User Story 2 → Test independently → Deploy/Demo
4. Add User Story 3 → Test independently → Deploy/Demo
5. Each story adds value without breaking previous stories

### Parallel Team Strategy

With multiple developers:

1. Team completes Setup + Foundational together
2. Once Foundational is done:
   - Developer A: User Story 1
   - Developer B: User Story 2
   - Developer C: User Story 3
3. Stories complete and integrate independently

---

## Notes

- [P] tasks = different files, no dependencies
- [Story] label maps task to specific user story for traceability
- Each user story should be independently completable and testable
- Verify tests fail before implementing
- Commit after each task or logical group
- Stop at any checkpoint to validate story independently
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
Binary file not shown.
@@ -1,14 +1,43 @@
-fastapi
-uvicorn
-pydantic
-authlib
-python-multipart
-starlette
-jsonschema
-requests
-keyring
-httpx
-PyYAML
-websockets
-rapidfuzz
-sqlalchemy
+annotated-doc==0.0.4
+annotated-types==0.7.0
+anyio==4.12.0
+APScheduler==3.11.2
+attrs==25.4.0
+Authlib==1.6.6
+certifi==2025.11.12
+cffi==2.0.0
+charset-normalizer==3.4.4
+click==8.3.1
+cryptography==46.0.3
+fastapi==0.126.0
+greenlet==3.3.0
+h11==0.16.0
+httpcore==1.0.9
+httpx==0.28.1
+idna==3.11
+jaraco.classes==3.4.0
+jaraco.context==6.0.1
+jaraco.functools==4.3.0
+jeepney==0.9.0
+jsonschema==4.25.1
+jsonschema-specifications==2025.9.1
+keyring==25.7.0
+more-itertools==10.8.0
+pycparser==2.23
+pydantic==2.12.5
+pydantic_core==2.41.5
+python-multipart==0.0.21
+PyYAML==6.0.3
+RapidFuzz==3.14.3
+referencing==0.37.0
+requests==2.32.5
+rpds-py==0.30.0
+SecretStorage==3.5.0
+SQLAlchemy==2.0.45
+starlette==0.50.0
+typing-inspection==0.4.2
+typing_extensions==4.15.0
+tzlocal==5.3.1
+urllib3==2.6.2
+uvicorn==0.38.0
+websockets==15.0.1
@@ -11,19 +11,27 @@
 # [SECTION: IMPORTS]
 from fastapi import APIRouter, Depends, HTTPException
 from typing import List, Dict, Optional
-from backend.src.dependencies import get_config_manager
+from backend.src.dependencies import get_config_manager, get_scheduler_service
 from backend.src.core.superset_client import SupersetClient
 from superset_tool.models import SupersetConfig
-from pydantic import BaseModel
+from pydantic import BaseModel, Field
+from backend.src.core.config_models import Environment as EnvModel
 # [/SECTION]
 
-router = APIRouter(prefix="/api/environments", tags=["environments"])
+router = APIRouter()
 
+# [DEF:ScheduleSchema:DataClass]
+class ScheduleSchema(BaseModel):
+    enabled: bool = False
+    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
+# [/DEF:ScheduleSchema]
+
 # [DEF:EnvironmentResponse:DataClass]
 class EnvironmentResponse(BaseModel):
     id: str
     name: str
     url: str
+    backup_schedule: Optional[ScheduleSchema] = None
 # [/DEF:EnvironmentResponse]
 
 # [DEF:DatabaseResponse:DataClass]
@@ -42,9 +50,47 @@ async def get_environments(config_manager=Depends(get_config_manager)):
     # Ensure envs is a list
     if not isinstance(envs, list):
         envs = []
-    return [EnvironmentResponse(id=e.id, name=e.name, url=e.url) for e in envs]
+    return [
+        EnvironmentResponse(
+            id=e.id,
+            name=e.name,
+            url=e.url,
+            backup_schedule=ScheduleSchema(
+                enabled=e.backup_schedule.enabled,
+                cron_expression=e.backup_schedule.cron_expression
+            ) if e.backup_schedule else None
+        ) for e in envs
+    ]
 # [/DEF:get_environments]
 
+# [DEF:update_environment_schedule:Function]
+# @PURPOSE: Update backup schedule for an environment.
+# @PARAM: id (str) - The environment ID.
+# @PARAM: schedule (ScheduleSchema) - The new schedule.
+@router.put("/{id}/schedule")
+async def update_environment_schedule(
+    id: str,
+    schedule: ScheduleSchema,
+    config_manager=Depends(get_config_manager),
+    scheduler_service=Depends(get_scheduler_service)
+):
+    envs = config_manager.get_environments()
+    env = next((e for e in envs if e.id == id), None)
+    if not env:
+        raise HTTPException(status_code=404, detail="Environment not found")
+
+    # Update environment config
+    env.backup_schedule.enabled = schedule.enabled
+    env.backup_schedule.cron_expression = schedule.cron_expression
+
+    config_manager.update_environment(id, env)
+
+    # Refresh scheduler
+    scheduler_service.load_schedules()
+
+    return {"message": "Schedule updated successfully"}
+# [/DEF:update_environment_schedule]
+
 # [DEF:get_environment_databases:Function]
 # @PURPOSE: Fetch the list of databases from a specific environment.
 # @PARAM: id (str) - The environment ID.
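As a quick sanity check on the new `ScheduleSchema`, here is a minimal standalone sketch of how the `Field(pattern=...)` cron constraint behaves; it copies the model from the diff for illustration and assumes Pydantic v2 semantics (consistent with the `pydantic==2.12.5` pin above).

```python
# Illustrative only: mirrors ScheduleSchema from the diff to show the
# Field(pattern=...) cron validation in isolation.
from pydantic import BaseModel, Field, ValidationError

CRON_PATTERN = r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$'

class ScheduleSchema(BaseModel):
    enabled: bool = False
    cron_expression: str = Field(..., pattern=CRON_PATTERN)

print(ScheduleSchema(enabled=True, cron_expression="0 0 * * *"))  # accepted: daily at midnight
print(ScheduleSchema(cron_expression="@daily"))                   # accepted: @-shortcut form

try:
    ScheduleSchema(cron_expression="every five minutes")          # rejected by the pattern
except ValidationError as e:
    print(e.errors()[0]["type"])  # 'string_pattern_mismatch'
```

Note the pattern checks shape, not field ranges: something like `99 99 * * *` passes the regex, which is why the scheduler still wraps `CronTrigger.from_crontab` in a try/except.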
@@ -18,14 +18,11 @@ from fastapi.responses import FileResponse
 import asyncio
 import os
 
-from .dependencies import get_task_manager
+from .dependencies import get_task_manager, get_scheduler_service
 from .core.logger import logger
 from .api.routes import plugins, tasks, settings, environments, mappings, migration
 from .core.database import init_db
 
-# Initialize database
-init_db()
-
 # [DEF:App:Global]
 # @SEMANTICS: app, fastapi, instance
 # @PURPOSE: The global FastAPI application instance.
@@ -35,6 +32,18 @@ app = FastAPI(
     version="1.0.0",
 )
 
+# Startup event
+@app.on_event("startup")
+async def startup_event():
+    scheduler = get_scheduler_service()
+    scheduler.start()
+
+# Shutdown event
+@app.on_event("shutdown")
+async def shutdown_event():
+    scheduler = get_scheduler_service()
+    scheduler.stop()
+
 # Configure CORS
 app.add_middleware(
     CORSMiddleware,
@@ -56,7 +65,7 @@ async def log_requests(request: Request, call_next):
 app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
-app.include_router(environments.router)
+app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
 app.include_router(mappings.router)
 app.include_router(migration.router)
 
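The startup/shutdown hooks added above use `@app.on_event`, which current FastAPI versions still accept but mark as deprecated in favor of a lifespan handler. For reference, a sketch of the equivalent wiring (the stub scheduler stands in for `SchedulerService` so the example runs on its own):

```python
# Illustrative sketch: the same scheduler start/stop wiring via FastAPI's
# lifespan hook instead of the deprecated @app.on_event decorators.
from contextlib import asynccontextmanager
from fastapi import FastAPI

class _StubScheduler:
    # Stand-in for SchedulerService, so the sketch is self-contained.
    def start(self): print("scheduler started")
    def stop(self): print("scheduler stopped")

def get_scheduler_service() -> _StubScheduler:
    # In the real app this is the injector from .dependencies.
    return _StubScheduler()

@asynccontextmanager
async def lifespan(app: FastAPI):
    scheduler = get_scheduler_service()
    scheduler.start()   # runs once at startup, like startup_event above
    yield
    scheduler.stop()    # runs once at shutdown, like shutdown_event above

app = FastAPI(version="1.0.0", lifespan=lifespan)
```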
@@ -8,6 +8,13 @@
 from pydantic import BaseModel, Field
 from typing import List, Optional
 
+# [DEF:Schedule:DataClass]
+# @PURPOSE: Represents a backup schedule configuration.
+class Schedule(BaseModel):
+    enabled: bool = False
+    cron_expression: str = "0 0 * * *"  # Default: daily at midnight
+# [/DEF:Schedule]
+
 # [DEF:Environment:DataClass]
 # @PURPOSE: Represents a Superset environment configuration.
 class Environment(BaseModel):
@@ -17,6 +24,7 @@ class Environment(BaseModel):
     username: str
     password: str  # Will be masked in UI
     is_default: bool = False
+    backup_schedule: Schedule = Field(default_factory=Schedule)
 # [/DEF:Environment]
 
 # [DEF:LoggingConfig:DataClass]
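A short sketch of what `default_factory=Schedule` buys here (condensed copies of the models above, for illustration): environments persisted before this change still validate without a `backup_schedule` key, and each instance gets its own `Schedule`, so mutating one environment's schedule cannot leak into another's.

```python
# Illustrative only: condensed copies of Schedule/Environment from the diff.
from pydantic import BaseModel, Field

class Schedule(BaseModel):
    enabled: bool = False
    cron_expression: str = "0 0 * * *"  # Default: daily at midnight

class Environment(BaseModel):
    username: str
    password: str
    is_default: bool = False
    backup_schedule: Schedule = Field(default_factory=Schedule)

a = Environment(username="u", password="p")
b = Environment(username="v", password="q")
a.backup_schedule.enabled = True   # mutate only a's schedule
print(b.backup_schedule.enabled)   # False: b holds its own Schedule instance
```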
@@ -12,6 +12,8 @@
 from sqlalchemy import create_engine
 from sqlalchemy.orm import sessionmaker, Session
 from backend.src.models.mapping import Base
+# Import TaskRecord to ensure it's registered with Base
+from backend.src.models.task import TaskRecord
 import os
 # [/SECTION]
 
@@ -19,18 +21,31 @@ import os
 DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
 # [/DEF:DATABASE_URL]
 
+# [DEF:TASKS_DATABASE_URL:Constant]
+TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", "sqlite:///./tasks.db")
+# [/DEF:TASKS_DATABASE_URL]
+
 # [DEF:engine:Variable]
 engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
 # [/DEF:engine]
 
+# [DEF:tasks_engine:Variable]
+tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
+# [/DEF:tasks_engine]
+
 # [DEF:SessionLocal:Class]
 SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
 # [/DEF:SessionLocal]
 
+# [DEF:TasksSessionLocal:Class]
+TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)
+# [/DEF:TasksSessionLocal]
+
 # [DEF:init_db:Function]
 # @PURPOSE: Initializes the database by creating all tables.
 def init_db():
     Base.metadata.create_all(bind=engine)
+    Base.metadata.create_all(bind=tasks_engine)
 # [/DEF:init_db]
 
 # [DEF:get_db:Function]
@@ -45,4 +60,16 @@ def get_db():
         db.close()
 # [/DEF:get_db]
 
+# [DEF:get_tasks_db:Function]
+# @PURPOSE: Dependency for getting a tasks database session.
+# @POST: Session is closed after use.
+# @RETURN: Generator[Session, None, None]
+def get_tasks_db():
+    db = TasksSessionLocal()
+    try:
+        yield db
+    finally:
+        db.close()
+# [/DEF:get_tasks_db]
+
 # [/DEF:backend.src.core.database]
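A self-contained sketch of the two-database pattern introduced here (the model is a hypothetical stand-in for `Base`/`TaskRecord`):

```python
# Illustrative only: two engines and session factories over one shared Base.
from sqlalchemy import create_engine, Column, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Record(Base):                      # hypothetical stand-in model
    __tablename__ = "records"
    id = Column(String, primary_key=True)

main_engine = create_engine("sqlite:///./mappings.db", connect_args={"check_same_thread": False})
tasks_engine = create_engine("sqlite:///./tasks.db", connect_args={"check_same_thread": False})

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=main_engine)
TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)

def init_db():
    # create_all on a shared Base creates every registered table in BOTH files
    Base.metadata.create_all(bind=main_engine)
    Base.metadata.create_all(bind=tasks_engine)

def get_tasks_db():
    # FastAPI-style generator dependency, mirroring get_tasks_db in the diff
    db = TasksSessionLocal()
    try:
        yield db
    finally:
        db.close()
```

One consequence of this design: because both `create_all` calls run against the same shared `Base`, each SQLite file ends up with every registered table; the separation between mappings and tasks is purely by which session factory a service uses.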
backend/src/core/scheduler.py (new file, 99 lines)

# [DEF:SchedulerModule:Module]
# @SEMANTICS: scheduler, apscheduler, cron, backup
# @PURPOSE: Manages scheduled tasks using APScheduler.
# @LAYER: Core
# @RELATION: Uses TaskManager to run scheduled backups.

# [SECTION: IMPORTS]
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
from .logger import logger, belief_scope
from .config_manager import ConfigManager
from typing import Optional
import asyncio
# [/SECTION]

# [DEF:SchedulerService:Class]
# @SEMANTICS: scheduler, service, apscheduler
# @PURPOSE: Provides a service to manage scheduled backup tasks.
class SchedulerService:
    def __init__(self, task_manager, config_manager: ConfigManager):
        with belief_scope("SchedulerService.__init__"):
            self.task_manager = task_manager
            self.config_manager = config_manager
            self.scheduler = BackgroundScheduler()
            self.loop = asyncio.get_event_loop()

    # [DEF:SchedulerService.start:Function]
    # @PURPOSE: Starts the background scheduler and loads initial schedules.
    def start(self):
        with belief_scope("SchedulerService.start"):
            if not self.scheduler.running:
                self.scheduler.start()
                logger.info("Scheduler started.")
            self.load_schedules()

    # [DEF:SchedulerService.stop:Function]
    # @PURPOSE: Stops the background scheduler.
    def stop(self):
        with belief_scope("SchedulerService.stop"):
            if self.scheduler.running:
                self.scheduler.shutdown()
                logger.info("Scheduler stopped.")

    # [DEF:SchedulerService.load_schedules:Function]
    # @PURPOSE: Loads backup schedules from configuration and registers them.
    def load_schedules(self):
        with belief_scope("SchedulerService.load_schedules"):
            # Clear existing jobs
            self.scheduler.remove_all_jobs()

            config = self.config_manager.get_config()
            for env in config.environments:
                if env.backup_schedule and env.backup_schedule.enabled:
                    self.add_backup_job(env.id, env.backup_schedule.cron_expression)

    # [DEF:SchedulerService.add_backup_job:Function]
    # @PURPOSE: Adds a scheduled backup job for an environment.
    # @PARAM: env_id (str) - The ID of the environment.
    # @PARAM: cron_expression (str) - The cron expression for the schedule.
    def add_backup_job(self, env_id: str, cron_expression: str):
        with belief_scope("SchedulerService.add_backup_job", f"env_id={env_id}, cron={cron_expression}"):
            job_id = f"backup_{env_id}"
            try:
                self.scheduler.add_job(
                    self._trigger_backup,
                    CronTrigger.from_crontab(cron_expression),
                    id=job_id,
                    args=[env_id],
                    replace_existing=True
                )
                logger.info(f"Scheduled backup job added for environment {env_id}: {cron_expression}")
            except Exception as e:
                logger.error(f"Failed to add backup job for environment {env_id}: {e}")

    # [DEF:SchedulerService._trigger_backup:Function]
    # @PURPOSE: Triggered by the scheduler to start a backup task.
    # @PARAM: env_id (str) - The ID of the environment.
    def _trigger_backup(self, env_id: str):
        with belief_scope("SchedulerService._trigger_backup", f"env_id={env_id}"):
            logger.info(f"Triggering scheduled backup for environment {env_id}")

            # Check if a backup is already running for this environment
            active_tasks = self.task_manager.get_tasks(limit=100)
            for task in active_tasks:
                if (task.plugin_id == "superset-backup" and
                    task.status in ["PENDING", "RUNNING"] and
                    task.params.get("environment_id") == env_id):
                    logger.warning(f"Backup already running for environment {env_id}. Skipping scheduled run.")
                    return

            # Run the backup task
            # We need to run this in the event loop since create_task is async
            asyncio.run_coroutine_threadsafe(
                self.task_manager.create_task("superset-backup", {"environment_id": env_id}),
                self.loop
            )

# [/DEF:SchedulerService:Class]
# [/DEF:SchedulerModule:Module]
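The subtle part of this new service is the thread boundary: `BackgroundScheduler` fires `_trigger_backup` on a worker thread, while `TaskManager.create_task` is a coroutine bound to the asyncio loop, hence `asyncio.run_coroutine_threadsafe`. A minimal self-contained sketch of that bridge (the job id and one-minute cron are illustrative):

```python
# Illustrative only: the BackgroundScheduler -> asyncio bridge used above.
import asyncio
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger

async def create_task(env_id: str):
    print(f"backup task created for {env_id} on the event loop")

def trigger(loop: asyncio.AbstractEventLoop, env_id: str):
    # Runs on APScheduler's worker thread; hand the coroutine to the loop.
    asyncio.run_coroutine_threadsafe(create_task(env_id), loop)

async def main():
    loop = asyncio.get_running_loop()
    scheduler = BackgroundScheduler()
    scheduler.add_job(trigger, CronTrigger.from_crontab("* * * * *"),
                      args=[loop, "prod"], id="backup_prod", replace_existing=True)
    scheduler.start()
    await asyncio.sleep(61)  # let one cron tick fire, then shut down
    scheduler.shutdown()

asyncio.run(main())
```

One caveat worth noting: the service captures its loop with `asyncio.get_event_loop()` in `__init__`, so it depends on whichever loop exists when `dependencies.py` constructs it at import time.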
backend/src/core/task_manager/cleanup.py (new file, 38 lines)

# [DEF:TaskCleanupModule:Module]
# @SEMANTICS: task, cleanup, retention
# @PURPOSE: Implements task cleanup and retention policies.
# @LAYER: Core
# @RELATION: Uses TaskPersistenceService to delete old tasks.

from datetime import datetime, timedelta
from .persistence import TaskPersistenceService
from ..logger import logger, belief_scope
from ..config_manager import ConfigManager

# [DEF:TaskCleanupService:Class]
# @PURPOSE: Provides methods to clean up old task records.
class TaskCleanupService:
    def __init__(self, persistence_service: TaskPersistenceService, config_manager: ConfigManager):
        self.persistence_service = persistence_service
        self.config_manager = config_manager

    # [DEF:TaskCleanupService.run_cleanup:Function]
    # @PURPOSE: Deletes tasks older than the configured retention period.
    def run_cleanup(self):
        with belief_scope("TaskCleanupService.run_cleanup"):
            settings = self.config_manager.get_config().settings
            retention_days = settings.task_retention_days

            # This is a simplified implementation.
            # In a real scenario, we would query IDs of tasks older than retention_days.
            # For now, we'll log the action.
            logger.info(f"Cleaning up tasks older than {retention_days} days.")

            # Re-loading tasks to check for limit
            tasks = self.persistence_service.load_tasks(limit=1000)
            if len(tasks) > settings.task_retention_limit:
                to_delete = [t.id for t in tasks[settings.task_retention_limit:]]
                self.persistence_service.delete_tasks(to_delete)
                logger.info(f"Deleted {len(to_delete)} tasks exceeding limit of {settings.task_retention_limit}")

# [/DEF:TaskCleanupService]
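Because `load_tasks` returns records ordered by `created_at` descending, the slice in `run_cleanup` keeps the newest `task_retention_limit` records and deletes the tail; a toy illustration (values hypothetical):

```python
# Illustrative only: the keep-newest/delete-oldest slicing from run_cleanup.
ids_newest_first = ["t5", "t4", "t3", "t2", "t1"]  # shape of load_tasks() output
retention_limit = 3
to_delete = ids_newest_first[retention_limit:]     # everything past the limit is oldest
assert to_delete == ["t2", "t1"]
```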
@@ -71,6 +71,7 @@ class TaskManager:
 
         task = Task(plugin_id=plugin_id, params=params, user_id=user_id)
         self.tasks[task.id] = task
+        self.persistence_service.persist_task(task)
         logger.info(f"Task {task.id} created and scheduled for execution")
         self.loop.create_task(self._run_task(task.id))  # Schedule task for execution
         return task
@@ -89,6 +90,7 @@ class TaskManager:
         logger.info(f"Starting execution of task {task_id} for plugin '{plugin.name}'")
         task.status = TaskStatus.RUNNING
         task.started_at = datetime.utcnow()
+        self.persistence_service.persist_task(task)
         self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'")
 
         try:
@@ -113,6 +115,7 @@ class TaskManager:
             self._add_log(task_id, "ERROR", f"Task failed: {e}", {"error_type": type(e).__name__})
         finally:
             task.finished_at = datetime.utcnow()
+            self.persistence_service.persist_task(task)
             logger.info(f"Task {task_id} execution finished with status: {task.status}")
 # [/DEF:TaskManager._run_task:Function]
 
@@ -132,6 +135,7 @@ class TaskManager:
         # Update task params with resolution
         task.params.update(resolution_params)
         task.status = TaskStatus.RUNNING
+        self.persistence_service.persist_task(task)
         self._add_log(task_id, "INFO", "Task resumed after mapping resolution.")
 
         # Signal the future to continue
@@ -150,6 +154,7 @@ class TaskManager:
         if not task: return
 
         task.status = TaskStatus.AWAITING_MAPPING
+        self.persistence_service.persist_task(task)
         self.task_futures[task_id] = self.loop.create_future()
 
         try:
@@ -235,6 +240,7 @@ class TaskManager:
 
         log_entry = LogEntry(level=level, message=message, context=context)
         task.logs.append(log_entry)
+        self.persistence_service.persist_task(task)
 
         # Notify subscribers
         if task_id in self.subscribers:
@@ -266,16 +272,10 @@ class TaskManager:
             del self.subscribers[task_id]
     # [/DEF:TaskManager.unsubscribe_logs:Function]
 
-    # [DEF:TaskManager.persist_awaiting_input_tasks:Function]
-    # @PURPOSE: Persist tasks in AWAITING_INPUT state using persistence service.
-    def persist_awaiting_input_tasks(self) -> None:
-        self.persistence_service.persist_tasks(list(self.tasks.values()))
-    # [/DEF:TaskManager.persist_awaiting_input_tasks:Function]
-
     # [DEF:TaskManager.load_persisted_tasks:Function]
     # @PURPOSE: Load persisted tasks using persistence service.
     def load_persisted_tasks(self) -> None:
-        loaded_tasks = self.persistence_service.load_tasks()
+        loaded_tasks = self.persistence_service.load_tasks(limit=100)
         for task in loaded_tasks:
             if task.id not in self.tasks:
                 self.tasks[task.id] = task
@@ -299,9 +299,8 @@ class TaskManager:
         task.status = TaskStatus.AWAITING_INPUT
         task.input_required = True
         task.input_request = input_request
+        self.persistence_service.persist_task(task)
         self._add_log(task_id, "INFO", "Task paused for user input", {"input_request": input_request})
-
-        self.persist_awaiting_input_tasks()
     # [/DEF:TaskManager.await_input:Function]
 
     # [DEF:TaskManager.resume_task_with_password:Function]
@@ -326,13 +325,11 @@ class TaskManager:
         task.input_required = False
         task.input_request = None
         task.status = TaskStatus.RUNNING
+        self.persistence_service.persist_task(task)
         self._add_log(task_id, "INFO", "Task resumed with passwords", {"databases": list(passwords.keys())})
 
         if task_id in self.task_futures:
             self.task_futures[task_id].set_result(True)
-
-        # Remove from persistence as it's no longer awaiting input
-        self.persistence_service.delete_tasks([task_id])
     # [/DEF:TaskManager.resume_task_with_password:Function]
 
     # [DEF:TaskManager.clear_tasks:Function]
@@ -1,141 +1,122 @@
|
|||||||
# [DEF:TaskPersistenceModule:Module]
|
# [DEF:TaskPersistenceModule:Module]
|
||||||
# @SEMANTICS: persistence, sqlite, task, storage
|
# @SEMANTICS: persistence, sqlite, sqlalchemy, task, storage
|
||||||
# @PURPOSE: Handles the persistence of tasks, specifically those awaiting user input, to a SQLite database.
|
# @PURPOSE: Handles the persistence of tasks using SQLAlchemy and the tasks.db database.
|
||||||
# @LAYER: Core
|
# @LAYER: Core
|
||||||
# @RELATION: Used by TaskManager to save and load tasks.
|
# @RELATION: Used by TaskManager to save and load tasks.
|
||||||
# @INVARIANT: Database schema must match the Task model structure.
|
# @INVARIANT: Database schema must match the TaskRecord model structure.
|
||||||
# @CONSTRAINT: Uses synchronous SQLite operations (blocking), should be used carefully.
|
|
||||||
|
|
||||||
# [SECTION: IMPORTS]
|
# [SECTION: IMPORTS]
|
||||||
import sqlite3
|
|
||||||
import json
|
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from pathlib import Path
|
from typing import List, Optional, Dict, Any
|
||||||
from typing import Dict, List, Optional, Any
|
import json
|
||||||
|
|
||||||
from .models import Task, TaskStatus
|
from sqlalchemy.orm import Session
|
||||||
|
from backend.src.models.task import TaskRecord
|
||||||
|
from backend.src.core.database import TasksSessionLocal
|
||||||
|
from .models import Task, TaskStatus, LogEntry
|
||||||
from ..logger import logger, belief_scope
|
from ..logger import logger, belief_scope
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:TaskPersistenceService:Class]
|
# [DEF:TaskPersistenceService:Class]
|
||||||
# @SEMANTICS: persistence, service, database
|
# @SEMANTICS: persistence, service, database, sqlalchemy
|
||||||
# @PURPOSE: Provides methods to save and load tasks from a local SQLite database.
|
# @PURPOSE: Provides methods to save and load tasks from the tasks.db database using SQLAlchemy.
|
||||||
class TaskPersistenceService:
|
class TaskPersistenceService:
|
||||||
def __init__(self, db_path: Optional[Path] = None):
|
def __init__(self):
|
||||||
if db_path is None:
|
# We use TasksSessionLocal from database.py
|
||||||
self.db_path = Path(__file__).parent.parent.parent.parent / "migrations.db"
|
pass
|
||||||
else:
|
|
||||||
self.db_path = db_path
|
|
||||||
self._ensure_db_exists()
|
|
||||||
|
|
||||||
# [DEF:TaskPersistenceService._ensure_db_exists:Function]
|
# [DEF:TaskPersistenceService.persist_task:Function]
|
||||||
# @PURPOSE: Ensures the database directory and table exist.
|
# @PURPOSE: Persists or updates a single task in the database.
|
||||||
# @PRE: None.
|
# @PARAM: task (Task) - The task object to persist.
|
||||||
# @POST: Database file and table are created if they didn't exist.
|
def persist_task(self, task: Task) -> None:
|
||||||
def _ensure_db_exists(self) -> None:
|
with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
|
||||||
with belief_scope("TaskPersistenceService._ensure_db_exists"):
|
session: Session = TasksSessionLocal()
|
||||||
self.db_path.parent.mkdir(parents=True, exist_ok=True)
|
try:
|
||||||
|
record = session.query(TaskRecord).filter(TaskRecord.id == task.id).first()
|
||||||
|
if not record:
|
||||||
|
record = TaskRecord(id=task.id)
|
||||||
|
session.add(record)
|
||||||
|
|
||||||
conn = sqlite3.connect(str(self.db_path))
|
record.type = task.plugin_id
|
||||||
cursor = conn.cursor()
|
record.status = task.status.value
|
||||||
|
record.environment_id = task.params.get("environment_id") or task.params.get("source_env_id")
|
||||||
|
record.started_at = task.started_at
|
||||||
|
record.finished_at = task.finished_at
|
||||||
|
record.params = task.params
|
||||||
|
|
||||||
cursor.execute("""
|
# Store logs as JSON, converting datetime to string
|
||||||
CREATE TABLE IF NOT EXISTS persistent_tasks (
|
record.logs = []
|
||||||
id TEXT PRIMARY KEY,
|
for log in task.logs:
|
||||||
plugin_id TEXT NOT NULL,
|
log_dict = log.dict()
|
||||||
status TEXT NOT NULL,
|
if isinstance(log_dict.get('timestamp'), datetime):
|
||||||
created_at TEXT NOT NULL,
|
log_dict['timestamp'] = log_dict['timestamp'].isoformat()
|
||||||
updated_at TEXT NOT NULL,
|
record.logs.append(log_dict)
|
||||||
input_request JSON,
|
|
||||||
context JSON
|
# Extract error if failed
|
||||||
)
|
if task.status == TaskStatus.FAILED:
|
||||||
""")
|
for log in reversed(task.logs):
|
||||||
conn.commit()
|
if log.level == "ERROR":
|
||||||
conn.close()
|
record.error = log.message
|
||||||
# [/DEF:TaskPersistenceService._ensure_db_exists:Function]
|
break
|
||||||
|
|
||||||
|
session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
session.rollback()
|
||||||
|
logger.error(f"Failed to persist task {task.id}: {e}")
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:TaskPersistenceService.persist_task:Function]
|
||||||
|
|
||||||
# [DEF:TaskPersistenceService.persist_tasks:Function]
|
# [DEF:TaskPersistenceService.persist_tasks:Function]
|
||||||
# @PURPOSE: Persists a list of tasks to the database.
|
# @PURPOSE: Persists multiple tasks.
|
||||||
# @PRE: Tasks list contains valid Task objects.
|
# @PARAM: tasks (List[Task]) - The list of tasks to persist.
|
||||||
# @POST: Tasks matching the criteria (AWAITING_INPUT) are saved/updated in the DB.
|
|
||||||
# @PARAM: tasks (List[Task]) - The list of tasks to check and persist.
|
|
||||||
def persist_tasks(self, tasks: List[Task]) -> None:
|
def persist_tasks(self, tasks: List[Task]) -> None:
|
||||||
with belief_scope("TaskPersistenceService.persist_tasks"):
|
for task in tasks:
|
||||||
conn = sqlite3.connect(str(self.db_path))
|
self.persist_task(task)
|
||||||
cursor = conn.cursor()
|
|
||||||
|
|
||||||
count = 0
|
|
||||||
for task in tasks:
|
|
||||||
if task.status == TaskStatus.AWAITING_INPUT:
|
|
||||||
cursor.execute("""
|
|
||||||
INSERT OR REPLACE INTO persistent_tasks
|
|
||||||
(id, plugin_id, status, created_at, updated_at, input_request, context)
|
|
||||||
VALUES (?, ?, ?, ?, ?, ?, ?)
|
|
||||||
""", (
|
|
||||||
task.id,
|
|
||||||
task.plugin_id,
|
|
||||||
task.status.value,
|
|
||||||
task.started_at.isoformat() if task.started_at else datetime.utcnow().isoformat(),
|
|
||||||
datetime.utcnow().isoformat(),
|
|
||||||
json.dumps(task.input_request) if task.input_request else None,
|
|
||||||
json.dumps(task.params)
|
|
||||||
))
|
|
||||||
count += 1
|
|
||||||
|
|
||||||
conn.commit()
|
|
||||||
conn.close()
|
|
||||||
logger.info(f"Persisted {count} tasks awaiting input.")
|
|
||||||
# [/DEF:TaskPersistenceService.persist_tasks:Function]
|
# [/DEF:TaskPersistenceService.persist_tasks:Function]
|
||||||
|
|
||||||
# [DEF:TaskPersistenceService.load_tasks:Function]
|
# [DEF:TaskPersistenceService.load_tasks:Function]
|
||||||
# @PURPOSE: Loads persisted tasks from the database.
|
# @PURPOSE: Loads tasks from the database.
|
||||||
# @PRE: Database exists.
|
# @PARAM: limit (int) - Max tasks to load.
|
||||||
# @POST: Returns a list of Task objects reconstructed from the DB.
|
# @PARAM: status (Optional[TaskStatus]) - Filter by status.
|
||||||
# @RETURN: List[Task] - The loaded tasks.
|
# @RETURN: List[Task] - The loaded tasks.
|
||||||
def load_tasks(self) -> List[Task]:
|
def load_tasks(self, limit: int = 100, status: Optional[TaskStatus] = None) -> List[Task]:
|
||||||
with belief_scope("TaskPersistenceService.load_tasks"):
|
with belief_scope("TaskPersistenceService.load_tasks"):
|
||||||
if not self.db_path.exists():
|
session: Session = TasksSessionLocal()
|
||||||
return []
|
try:
|
||||||
|
query = session.query(TaskRecord)
|
||||||
|
if status:
|
||||||
|
query = query.filter(TaskRecord.status == status.value)
|
||||||
|
|
||||||
conn = sqlite3.connect(str(self.db_path))
|
records = query.order_by(TaskRecord.created_at.desc()).limit(limit).all()
|
||||||
cursor = conn.cursor()
|
|
||||||
|
|
||||||
# Check if plugin_id column exists (migration for existing db)
|
loaded_tasks = []
|
||||||
cursor.execute("PRAGMA table_info(persistent_tasks)")
|
for record in records:
|
||||||
columns = [info[1] for info in cursor.fetchall()]
|
try:
|
||||||
has_plugin_id = "plugin_id" in columns
|
logs = []
|
||||||
|
if record.logs:
|
||||||
|
for log_data in record.logs:
|
||||||
|
# Handle timestamp conversion if it's a string
|
||||||
|
if isinstance(log_data.get('timestamp'), str):
|
||||||
|
log_data['timestamp'] = datetime.fromisoformat(log_data['timestamp'])
|
||||||
|
logs.append(LogEntry(**log_data))
|
||||||
|
|
||||||
if has_plugin_id:
|
task = Task(
|
||||||
cursor.execute("SELECT id, plugin_id, status, created_at, input_request, context FROM persistent_tasks")
|
id=record.id,
|
||||||
else:
|
plugin_id=record.type,
|
||||||
cursor.execute("SELECT id, status, created_at, input_request, context FROM persistent_tasks")
|
status=TaskStatus(record.status),
|
||||||
|
started_at=record.started_at,
|
||||||
|
finished_at=record.finished_at,
|
||||||
|
params=record.params or {},
|
||||||
|
logs=logs
|
||||||
|
)
|
||||||
|
loaded_tasks.append(task)
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to reconstruct task {record.id}: {e}")
|
||||||
|
|
||||||
rows = cursor.fetchall()
|
return loaded_tasks
|
||||||
|
finally:
|
||||||
loaded_tasks = []
|
session.close()
|
||||||
for row in rows:
|
|
||||||
if has_plugin_id:
|
|
||||||
task_id, plugin_id, status, created_at, input_request_json, context_json = row
|
|
||||||
else:
|
|
||||||
task_id, status, created_at, input_request_json, context_json = row
|
|
||||||
plugin_id = "superset-migration" # Default fallback
|
|
||||||
|
|
||||||
try:
|
|
||||||
task = Task(
|
|
||||||
id=task_id,
|
|
||||||
plugin_id=plugin_id,
|
|
||||||
status=TaskStatus(status),
|
|
||||||
started_at=datetime.fromisoformat(created_at),
|
|
||||||
input_required=True,
|
|
||||||
input_request=json.loads(input_request_json) if input_request_json else None,
|
|
||||||
params=json.loads(context_json) if context_json else {}
|
|
||||||
)
|
|
||||||
loaded_tasks.append(task)
|
|
||||||
except Exception as e:
|
|
||||||
logger.error(f"Failed to load task {task_id}: {e}")
|
|
||||||
|
|
||||||
conn.close()
|
|
||||||
return loaded_tasks
|
|
||||||
# [/DEF:TaskPersistenceService.load_tasks:Function]
|
# [/DEF:TaskPersistenceService.load_tasks:Function]
|
||||||
|
|
||||||
# [DEF:TaskPersistenceService.delete_tasks:Function]
|
# [DEF:TaskPersistenceService.delete_tasks:Function]
|
||||||
@@ -145,14 +126,16 @@ class TaskPersistenceService:
         if not task_ids:
             return
         with belief_scope("TaskPersistenceService.delete_tasks"):
-            conn = sqlite3.connect(str(self.db_path))
-            cursor = conn.cursor()
-            placeholders = ', '.join('?' for _ in task_ids)
-            cursor.execute(f"DELETE FROM persistent_tasks WHERE id IN ({placeholders})", task_ids)
-            conn.commit()
-            conn.close()
+            session: Session = TasksSessionLocal()
+            try:
+                session.query(TaskRecord).filter(TaskRecord.id.in_(task_ids)).delete(synchronize_session=False)
+                session.commit()
+            except Exception as e:
+                session.rollback()
+                logger.error(f"Failed to delete tasks: {e}")
+            finally:
+                session.close()
     # [/DEF:TaskPersistenceService.delete_tasks:Function]

 # [/DEF:TaskPersistenceService:Class]

 # [/DEF:TaskPersistenceModule:Module]
@@ -8,6 +8,8 @@ from pathlib import Path
 from .core.plugin_loader import PluginLoader
 from .core.task_manager import TaskManager
 from .core.config_manager import ConfigManager
+from .core.scheduler import SchedulerService
+from .core.database import init_db

 # Initialize singletons
 # Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -15,6 +17,9 @@ project_root = Path(__file__).parent.parent.parent
 config_path = project_root / "config.json"
 config_manager = ConfigManager(config_path=str(config_path))

+# Initialize database before any other services that might use it
+init_db()
+
 def get_config_manager() -> ConfigManager:
     """Dependency injector for the ConfigManager."""
     return config_manager
@@ -28,6 +33,9 @@ logger.info(f"Available plugins: {[config.name for config in plugin_loader.get_a
 task_manager = TaskManager(plugin_loader)
 logger.info("TaskManager initialized")

+scheduler_service = SchedulerService(task_manager, config_manager)
+logger.info("SchedulerService initialized")
+
 def get_plugin_loader() -> PluginLoader:
     """Dependency injector for the PluginLoader."""
     return plugin_loader
@@ -35,4 +43,8 @@ def get_plugin_loader() -> PluginLoader:
 def get_task_manager() -> TaskManager:
     """Dependency injector for the TaskManager."""
     return task_manager
+
+def get_scheduler_service() -> SchedulerService:
+    """Dependency injector for the SchedulerService."""
+    return scheduler_service
 # [/DEF]
backend/src/models/task.py (Normal file, +34)
@@ -0,0 +1,34 @@
# [DEF:backend.src.models.task:Module]
#
# @SEMANTICS: database, task, record, sqlalchemy, sqlite
# @PURPOSE: Defines the database schema for task execution records.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> sqlalchemy
#
# @INVARIANT: All primary keys are UUID strings.

# [SECTION: IMPORTS]
from sqlalchemy import Column, String, DateTime, JSON, ForeignKey
from sqlalchemy.sql import func
from .mapping import Base
import uuid
# [/SECTION]

# [DEF:TaskRecord:Class]
# @PURPOSE: Represents a persistent record of a task execution.
class TaskRecord(Base):
    __tablename__ = "task_records"

    id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
    type = Column(String, nullable=False)  # e.g., "backup", "migration"
    status = Column(String, nullable=False)  # Enum: "PENDING", "RUNNING", "SUCCESS", "FAILED"
    environment_id = Column(String, ForeignKey("environments.id"), nullable=True)
    started_at = Column(DateTime(timezone=True), nullable=True)
    finished_at = Column(DateTime(timezone=True), nullable=True)
    logs = Column(JSON, nullable=True)  # Store structured logs as JSON
    error = Column(String, nullable=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
    params = Column(JSON, nullable=True)
# [/DEF:TaskRecord]

# [/DEF:backend.src.models.task]
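The new model imports `Base` from `.mapping`, and `app.py` imports `init_db` and the persistence code uses `TasksSessionLocal`, both from `backend/src/core/database.py`, which this diff does not show. A minimal sketch of what that module plausibly contains, assuming it only has to create `tasks.db` and expose a session factory (the body is an assumption, not the committed file):

```python
# Hypothetical sketch of backend/src/core/database.py (not part of this diff).
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from ..models.mapping import Base  # declarative Base shared with TaskRecord
from ..models import task          # noqa: F401  importing registers task_records

# check_same_thread=False lets the scheduler thread and request handlers
# share the same SQLite engine.
engine = create_engine(
    "sqlite:///tasks.db", connect_args={"check_same_thread": False}
)
TasksSessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False)


def init_db() -> None:
    """Create task_records (and any other mapped tables) if missing."""
    Base.metadata.create_all(bind=engine)
```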
@@ -71,8 +71,21 @@ class BackupPlugin(PluginBase):
         }

     async def execute(self, params: Dict[str, Any]):
-        env = params["env"]
-        backup_path = Path(params["backup_path"])
+        config_manager = get_config_manager()
+        env_id = params.get("environment_id")
+
+        # Resolve environment name if environment_id is provided
+        if env_id:
+            env_config = next((e for e in config_manager.get_environments() if e.id == env_id), None)
+            if env_config:
+                params["env"] = env_config.name
+
+        env = params.get("env")
+        if not env:
+            raise KeyError("env")
+
+        backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
+        backup_path = Path(backup_path_str)

         logger = SupersetLogger(log_dir=backup_path / "Logs", console=True)
         logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")
backend/tasks.db (Normal file, BIN): Binary file not shown.
frontend/package-lock.json (generated, +13)
@@ -7,6 +7,9 @@
     "": {
       "name": "frontend",
       "version": "0.0.0",
+      "dependencies": {
+        "date-fns": "^4.1.0"
+      },
       "devDependencies": {
         "@sveltejs/adapter-static": "^3.0.10",
         "@sveltejs/kit": "^2.49.2",
@@ -1279,6 +1282,16 @@
         "node": ">=4"
       }
     },
+    "node_modules/date-fns": {
+      "version": "4.1.0",
+      "resolved": "https://registry.npmjs.org/date-fns/-/date-fns-4.1.0.tgz",
+      "integrity": "sha512-Ukq0owbQXxa/U3EGtsdVBkR1w7KOQ5gIBqdH2hkvknzZPYvBxb/aa6E8L7tmjFtkwZBu3UXBbjIgPo/Ez4xaNg==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/kossnocorp"
+      }
+    },
     "node_modules/debug": {
       "version": "4.4.3",
       "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",
@@ -17,5 +17,8 @@
     "svelte": "^5.43.8",
     "tailwindcss": "^3.0.0",
     "vite": "^7.2.4"
+  },
+  "dependencies": {
+    "date-fns": "^4.1.0"
   }
 }
@@ -22,6 +22,12 @@
   >
     Migration
   </a>
+  <a
+    href="/tasks"
+    class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/tasks') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
+  >
+    Tasks
+  </a>
   <a
     href="/settings"
     class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
frontend/src/components/TaskList.svelte (Normal file, +94)
@@ -0,0 +1,94 @@
<!-- [DEF:TaskList:Component] -->
<!--
@SEMANTICS: tasks, list, status, history
@PURPOSE: Displays a list of tasks with their status and execution details.
@LAYER: Component
@RELATION: USES -> api.js
-->

<script lang="ts">
  import { createEventDispatcher } from 'svelte';
  import { formatDistanceToNow } from 'date-fns';

  export let tasks: Array<any> = [];
  export let loading: boolean = false;

  const dispatch = createEventDispatcher();

  function getStatusColor(status: string) {
    switch (status) {
      case 'SUCCESS': return 'bg-green-100 text-green-800';
      case 'FAILED': return 'bg-red-100 text-red-800';
      case 'RUNNING': return 'bg-blue-100 text-blue-800 animate-pulse';
      case 'PENDING': return 'bg-gray-100 text-gray-800';
      case 'AWAITING_INPUT':
      case 'AWAITING_MAPPING': return 'bg-yellow-100 text-yellow-800';
      default: return 'bg-gray-100 text-gray-800';
    }
  }

  function formatTime(dateStr: string | null) {
    if (!dateStr) return 'N/A';
    try {
      return formatDistanceToNow(new Date(dateStr), { addSuffix: true });
    } catch (e) {
      return 'Invalid date';
    }
  }

  function handleTaskClick(taskId: string) {
    dispatch('select', { id: taskId });
  }
</script>

<div class="bg-white shadow overflow-hidden sm:rounded-md">
  {#if loading && tasks.length === 0}
    <div class="p-4 text-center text-gray-500">Loading tasks...</div>
  {:else if tasks.length === 0}
    <div class="p-4 text-center text-gray-500">No tasks found.</div>
  {:else}
    <ul class="divide-y divide-gray-200">
      {#each tasks as task (task.id)}
        <li>
          <button
            class="block hover:bg-gray-50 w-full text-left transition duration-150 ease-in-out focus:outline-none"
            on:click={() => handleTaskClick(task.id)}
          >
            <div class="px-4 py-4 sm:px-6">
              <div class="flex items-center justify-between">
                <div class="text-sm font-medium text-blue-600 truncate">
                  {task.plugin_id.toUpperCase()}
                  <span class="ml-2 text-xs text-gray-400 font-mono">{task.id.substring(0, 8)}</span>
                </div>
                <div class="ml-2 flex-shrink-0 flex">
                  <p class="px-2 inline-flex text-xs leading-5 font-semibold rounded-full {getStatusColor(task.status)}">
                    {task.status}
                  </p>
                </div>
              </div>
              <div class="mt-2 sm:flex sm:justify-between">
                <div class="sm:flex">
                  <p class="flex items-center text-sm text-gray-500">
                    {#if task.params?.environment_id || task.params?.source_env_id}
                      <span class="mr-2">Env: {task.params.environment_id || task.params.source_env_id}</span>
                    {/if}
                  </p>
                </div>
                <div class="mt-2 flex items-center text-sm text-gray-500 sm:mt-0">
                  <svg class="flex-shrink-0 mr-1.5 h-5 w-5 text-gray-400" fill="currentColor" viewBox="0 0 20 20">
                    <path fill-rule="evenodd" d="M6 2a1 1 0 00-1 1v1H4a2 2 0 00-2 2v10a2 2 0 002 2h12a2 2 0 002-2V6a2 2 0 00-2-2h-1V3a1 1 0 10-2 0v1H7V3a1 1 0 00-1-1zm0 5a1 1 0 000 2h8a1 1 0 100-2H6z" clip-rule="evenodd" />
                  </svg>
                  <p>
                    Started {formatTime(task.started_at)}
                  </p>
                </div>
              </div>
            </div>
          </button>
        </li>
      {/each}
    </ul>
  {/if}
</div>

<!-- [/DEF:TaskList] -->
@@ -100,19 +100,21 @@ async function requestApi(endpoint, method = 'GET', body = null) {
 // [DEF:api:Data]
 // @PURPOSE: API client object with specific methods.
 export const api = {
-  getPlugins: () => fetchApi('/plugins/'),
-  getTasks: () => fetchApi('/tasks/'),
+  getPlugins: () => fetchApi('/plugins'),
+  getTasks: () => fetchApi('/tasks'),
   getTask: (taskId) => fetchApi(`/tasks/${taskId}`),
-  createTask: (pluginId, params) => postApi('/tasks/', { plugin_id: pluginId, params }),
+  createTask: (pluginId, params) => postApi('/tasks', { plugin_id: pluginId, params }),

   // Settings
-  getSettings: () => fetchApi('/settings/'),
+  getSettings: () => fetchApi('/settings'),
   updateGlobalSettings: (settings) => requestApi('/settings/global', 'PATCH', settings),
   getEnvironments: () => fetchApi('/settings/environments'),
   addEnvironment: (env) => postApi('/settings/environments', env),
   updateEnvironment: (id, env) => requestApi(`/settings/environments/${id}`, 'PUT', env),
   deleteEnvironment: (id) => requestApi(`/settings/environments/${id}`, 'DELETE'),
   testEnvironmentConnection: (id) => postApi(`/settings/environments/${id}/test`, {}),
+  updateEnvironmentSchedule: (id, schedule) => requestApi(`/environments/${id}/schedule`, 'PUT', schedule),
+  getEnvironmentsList: () => fetchApi('/environments'),
 };
 // [/DEF:api_module]

@@ -128,3 +130,5 @@ export const addEnvironment = api.addEnvironment;
 export const updateEnvironment = api.updateEnvironment;
 export const deleteEnvironment = api.deleteEnvironment;
 export const testEnvironmentConnection = api.testEnvironmentConnection;
+export const updateEnvironmentSchedule = api.updateEnvironmentSchedule;
+export const getEnvironmentsList = api.getEnvironmentsList;
@@ -13,7 +13,7 @@
 <script>
   // [SECTION: IMPORTS]
   import { onMount } from 'svelte';
-  import { getSettings, updateGlobalSettings, getEnvironments, addEnvironment, updateEnvironment, deleteEnvironment, testEnvironmentConnection } from '../lib/api';
+  import { getSettings, updateGlobalSettings, getEnvironments, addEnvironment, updateEnvironment, deleteEnvironment, testEnvironmentConnection, updateEnvironmentSchedule } from '../lib/api';
   import { addToast } from '../lib/toasts';
   // [/SECTION]

@@ -38,7 +38,11 @@
     url: '',
     username: '',
     password: '',
-    is_default: false
+    is_default: false,
+    backup_schedule: {
+      enabled: false,
+      cron_expression: '0 0 * * *'
+    }
   };

   let editingEnvId = null;
@@ -167,7 +171,11 @@
     url: '',
     username: '',
     password: '',
-    is_default: false
+    is_default: false,
+    backup_schedule: {
+      enabled: false,
+      cron_expression: '0 0 * * *'
+    }
   };
   editingEnvId = null;
 }
@@ -293,7 +301,21 @@
       <label for="env_default" class="ml-2 block text-sm text-gray-900">Default Environment</label>
     </div>
   </div>
-  <div class="mt-4 flex gap-2">
+
+  <h3 class="text-lg font-medium mb-4 mt-6">Backup Schedule</h3>
+  <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
+    <div class="flex items-center">
+      <input type="checkbox" id="backup_enabled" bind:checked={newEnv.backup_schedule.enabled} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
+      <label for="backup_enabled" class="ml-2 block text-sm text-gray-900">Enable Automatic Backups</label>
+    </div>
+    <div>
+      <label for="cron_expression" class="block text-sm font-medium text-gray-700">Cron Expression</label>
+      <input type="text" id="cron_expression" bind:value={newEnv.backup_schedule.cron_expression} placeholder="0 0 * * *" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
+      <p class="text-xs text-gray-500 mt-1">Example: 0 0 * * * (daily at midnight), */5 * * * * (every 5 minutes)</p>
+    </div>
+  </div>
+
+  <div class="mt-6 flex gap-2">
   <button on:click={handleAddOrUpdateEnv} class="bg-green-500 text-white px-4 py-2 rounded hover:bg-green-600">
     {editingEnvId ? 'Update' : 'Add'} Environment
   </button>
frontend/src/routes/tasks/+page.svelte (Normal file, +140)
@@ -0,0 +1,140 @@
<script>
  import { onMount, onDestroy } from 'svelte';
  import { getTasks, createTask, getEnvironmentsList } from '../../lib/api';
  import { addToast } from '../../lib/toasts';
  import TaskList from '../../components/TaskList.svelte';
  import TaskLogViewer from '../../components/TaskLogViewer.svelte';

  let tasks = [];
  let environments = [];
  let loading = true;
  let selectedTaskId = null;
  let pollInterval;
  let showBackupModal = false;
  let selectedEnvId = '';

  async function loadInitialData() {
    try {
      loading = true;
      const [tasksData, envsData] = await Promise.all([
        getTasks(),
        getEnvironmentsList()
      ]);
      tasks = tasksData;
      environments = envsData;
    } catch (error) {
      console.error('Failed to load tasks data:', error);
    } finally {
      loading = false;
    }
  }

  async function refreshTasks() {
    try {
      const data = await getTasks();
      // Ensure we don't try to parse HTML as JSON if the route returns 404
      if (Array.isArray(data)) {
        tasks = data;
      }
    } catch (error) {
      console.error('Failed to refresh tasks:', error);
    }
  }

  function handleSelectTask(event) {
    selectedTaskId = event.detail.id;
  }

  async function handleRunBackup() {
    if (!selectedEnvId) {
      addToast('Please select an environment', 'error');
      return;
    }

    try {
      const task = await createTask('superset-backup', { environment_id: selectedEnvId });
      addToast('Backup task started', 'success');
      showBackupModal = false;
      selectedTaskId = task.id;
      await refreshTasks();
    } catch (error) {
      console.error('Failed to start backup:', error);
    }
  }

  onMount(() => {
    loadInitialData();
    pollInterval = setInterval(refreshTasks, 3000);
  });

  onDestroy(() => {
    if (pollInterval) clearInterval(pollInterval);
  });
</script>

<div class="container mx-auto p-4 max-w-6xl">
  <div class="flex justify-between items-center mb-6">
    <h1 class="text-2xl font-bold text-gray-800">Task Management</h1>
    <button
      on:click={() => showBackupModal = true}
      class="bg-blue-600 hover:bg-blue-700 text-white px-4 py-2 rounded-md shadow-sm transition duration-150 font-medium"
    >
      Run Backup
    </button>
  </div>

  <div class="grid grid-cols-1 lg:grid-cols-3 gap-6">
    <div class="lg:col-span-1">
      <h2 class="text-lg font-semibold mb-3 text-gray-700">Recent Tasks</h2>
      <TaskList {tasks} {loading} on:select={handleSelectTask} />
    </div>

    <div class="lg:col-span-2">
      <h2 class="text-lg font-semibold mb-3 text-gray-700">Task Details & Logs</h2>
      {#if selectedTaskId}
        <div class="bg-white rounded-lg shadow-lg h-[600px] flex flex-col">
          <TaskLogViewer taskId={selectedTaskId} />
        </div>
      {:else}
        <div class="bg-gray-50 border-2 border-dashed border-gray-300 rounded-lg h-[600px] flex items-center justify-center text-gray-500">
          <p>Select a task to view logs and details</p>
        </div>
      {/if}
    </div>
  </div>
</div>

{#if showBackupModal}
  <div class="fixed inset-0 z-50 flex items-center justify-center bg-black bg-opacity-50">
    <div class="bg-white rounded-lg shadow-xl p-6 w-full max-w-md">
      <h3 class="text-xl font-bold mb-4">Run Manual Backup</h3>
      <div class="mb-4">
        <label for="env-select" class="block text-sm font-medium text-gray-700 mb-1">Target Environment</label>
        <select
          id="env-select"
          bind:value={selectedEnvId}
          class="w-full border-gray-300 rounded-md shadow-sm focus:ring-blue-500 focus:border-blue-500 p-2 border"
        >
          <option value="" disabled>-- Select Environment --</option>
          {#each environments as env}
            <option value={env.id}>{env.name}</option>
          {/each}
        </select>
      </div>
      <div class="flex justify-end space-x-3">
        <button
          on:click={() => showBackupModal = false}
          class="px-4 py-2 text-gray-700 hover:bg-gray-100 rounded-md transition"
        >
          Cancel
        </button>
        <button
          on:click={handleRunBackup}
          class="px-4 py-2 bg-blue-600 text-white rounded-md hover:bg-blue-700 transition"
        >
          Start Backup
        </button>
      </div>
    </div>
  </div>
{/if}
specs/009-backup-scheduler/checklists/requirements.md (Normal file, +34)
@@ -0,0 +1,34 @@
# Specification Quality Checklist: Backup Scheduler & Unified Task UI

**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2025-12-30
**Feature**: [Link to spec.md](../spec.md)

## Content Quality

- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed

## Requirement Completeness

- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified

## Feature Readiness

- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification

## Notes

- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`
specs/009-backup-scheduler/contracts/api.yaml (Normal file, +154)
@@ -0,0 +1,154 @@
openapi: 3.0.0
info:
  title: Backup Scheduler & Task API
  version: 1.0.0
paths:
  /tasks:
    get:
      summary: List all tasks
      parameters:
        - name: limit
          in: query
          schema:
            type: integer
            default: 50
        - name: type
          in: query
          schema:
            type: string
            enum: [backup, migration]
        - name: status
          in: query
          schema:
            type: string
            enum: [running, success, failed, pending]
      responses:
        '200':
          description: List of tasks
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: '#/components/schemas/Task'

  /tasks/backup:
    post:
      summary: Manually trigger a backup
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required:
                - environment_id
              properties:
                environment_id:
                  type: string
                  format: uuid
      responses:
        '202':
          description: Backup task started
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Task'

  /tasks/{id}:
    get:
      summary: Get task details
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
            format: uuid
      responses:
        '200':
          description: Task details
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Task'

  /environments/{id}/schedule:
    get:
      summary: Get backup schedule for environment
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
            format: uuid
      responses:
        '200':
          description: Schedule configuration
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Schedule'
    put:
      summary: Update backup schedule
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
            format: uuid
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Schedule'
      responses:
        '200':
          description: Schedule updated
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Schedule'

components:
  schemas:
    Task:
      type: object
      properties:
        id:
          type: string
          format: uuid
        type:
          type: string
          enum: [backup, migration]
        status:
          type: string
          enum: [pending, running, success, failed]
        environment_id:
          type: string
          format: uuid
        started_at:
          type: string
          format: date-time
        finished_at:
          type: string
          format: date-time
        created_at:
          type: string
          format: date-time
        error:
          type: string
        logs:
          type: string

    Schedule:
      type: object
      properties:
        enabled:
          type: boolean
        cron_expression:
          type: string
          example: "0 0 * * *"
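To exercise this contract by hand once the backend is running, something like the following works from a Python shell (a sketch: the base URL, `/api` prefix, and the environment id are assumptions, not values taken from this change):

```python
# Smoke test against the contract above; assumes `requests` is installed
# and the API is served under /api on localhost:8000.
import requests

BASE = "http://localhost:8000/api"                    # assumed host/prefix
ENV_ID = "00000000-0000-0000-0000-000000000000"       # placeholder environment id

# POST /tasks/backup -> 202 with the created Task
task = requests.post(f"{BASE}/tasks/backup", json={"environment_id": ENV_ID}).json()
print(task["id"], task["status"])

# GET /tasks/{id} -> task details including logs/error
detail = requests.get(f"{BASE}/tasks/{task['id']}").json()
print(detail["status"], detail.get("error"))

# GET /tasks?type=backup&status=failed -> filtered list
failed = requests.get(f"{BASE}/tasks", params={"type": "backup", "status": "failed"}).json()
print(len(failed), "failed backups")
```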
specs/009-backup-scheduler/data-model.md (Normal file, +34)
@@ -0,0 +1,34 @@
# Data Model: Backup Scheduler & Unified Task UI

## Entities

### Task
Represents a background operation (Backup, Migration) managed by the system.

| Field | Type | Description | Constraints |
|-------|------|-------------|-------------|
| `id` | UUID | Unique identifier | Primary Key |
| `type` | String | Type of task | Enum: "backup", "migration" |
| `status` | String | Current execution state | Enum: "pending", "running", "success", "failed" |
| `environment_id` | UUID | Target environment (if applicable) | Foreign Key (Environments), Nullable |
| `started_at` | DateTime | When the task began | Nullable |
| `finished_at` | DateTime | When the task completed | Nullable |
| `logs` | Text | Execution logs | |
| `error` | Text | Error message if failed | Nullable |
| `created_at` | DateTime | When task was queued | Default: Now |

### Schedule
Configuration for automatic task execution. Nested within Environment config.

| Field | Type | Description | Constraints |
|-------|------|-------------|-------------|
| `environment_id` | UUID | Target environment | Foreign Key (Environments) |
| `enabled` | Boolean | Is schedule active? | Default: false |
| `cron_expression` | String | Frequency definition | Valid Cron string (e.g., "0 0 * * *") |
| `last_run_at` | DateTime | Last execution time | Nullable |
| `next_run_at` | DateTime | Calculated next run | Nullable |

## Storage Strategy

- **Tasks**: Stored in `tasks.db` (SQLite) via SQLAlchemy.
- **Schedules**: Stored in `config.json` as part of the Environment model.
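The `Schedule` entity above is persisted in `config.json` through `config_models.py` (updated by T003 in tasks.md but not shown in this diff). Judging from the settings form added earlier in this change, it plausibly reduces to a small Pydantic model nested in `Environment`, roughly:

```python
# Hypothetical shape of the Schedule model described above; the actual
# config_models.py change is not included in this diff.
from typing import Optional

from pydantic import BaseModel


class Schedule(BaseModel):
    enabled: bool = False
    cron_expression: str = "0 0 * * *"  # daily at midnight by default


class Environment(BaseModel):
    id: str
    name: str
    url: str
    username: str
    password: str
    is_default: bool = False
    backup_schedule: Optional[Schedule] = None  # nested, persisted in config.json
```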
specs/009-backup-scheduler/plan.md (Normal file, +84)
@@ -0,0 +1,84 @@
# Implementation Plan: Backup Scheduler & Unified Task UI

**Branch**: `009-backup-scheduler` | **Date**: 2025-12-30 | **Spec**: [link](spec.md)
**Input**: Feature specification from `/specs/009-backup-scheduler/spec.md`

**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.

## Summary

Implement a robust backup scheduling system using `APScheduler` and a unified "Tasks" UI in SvelteKit to manage and monitor all background operations (backups, migrations).

## Technical Context

**Language/Version**: Python 3.9+, Node.js 18+
**Primary Dependencies**: FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
**Storage**: SQLite (`tasks.db`), JSON (`config.json`)
**Testing**: pytest
**Target Platform**: Linux server
**Project Type**: Web application
**Performance Goals**: UI latency < 200ms, backup trigger < 1s
**Constraints**: Minimal resource footprint for the background scheduler
**Scale/Scope**: ~10 environments, ~1000 historical tasks

## Constitution Check

*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*

- **Library-First**: N/A (feature integration)
- **CLI Interface**: N/A (Web UI focus)
- **Test-First**: Mandatory for scheduler logic and API endpoints. PASS.
- **Integration Testing**: Required for the Scheduler -> TaskManager interaction. PASS.

**Result**: PASS

## Project Structure

### Documentation (this feature)

```text
specs/009-backup-scheduler/
├── plan.md              # This file (/speckit.plan command output)
├── research.md          # Phase 0 output (/speckit.plan command)
├── data-model.md        # Phase 1 output (/speckit.plan command)
├── quickstart.md        # Phase 1 output (/speckit.plan command)
├── contracts/           # Phase 1 output (/speckit.plan command)
└── tasks.md             # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
```

### Source Code (repository root)

```text
backend/
├── src/
│   ├── api/
│   │   └── routes/
│   │       └── tasks.py          # NEW: Task management endpoints
│   ├── core/
│   │   ├── scheduler.py          # NEW: APScheduler integration
│   │   └── task_manager/         # EXISTING: Updates for DB persistence
│   ├── models/
│   │   └── task.py               # NEW: SQLAlchemy model
│   └── services/
└── tests/

frontend/
├── src/
│   ├── components/
│   │   ├── TaskList.svelte       # NEW: Task display component
│   │   └── TaskLogViewer.svelte  # NEW: Detailed log view
│   ├── routes/
│   │   └── tasks/                # NEW: Tasks page
│   │       └── +page.svelte
│   └── types/
```

**Structure Decision**: Standard FastAPI + SvelteKit structure.

## Complexity Tracking

> **Fill ONLY if Constitution Check has violations that must be justified**

| Violation | Why Needed | Simpler Alternative Rejected Because |
|-----------|------------|-------------------------------------|
| | | |
specs/009-backup-scheduler/quickstart.md (Normal file, +28)
@@ -0,0 +1,28 @@
# Quickstart: Backup Scheduler & Unified Task UI

## Prerequisites
- Backend running: `cd backend && uvicorn src.app:app --reload`
- Frontend running: `cd frontend && npm run dev`

## Usage Guide

### 1. View Tasks
1. Navigate to the new **Tasks** tab in the main navigation bar.
2. Observe the list of recent tasks (Backups, Migrations).
3. Click on any task row to view detailed logs.

### 2. Configure Scheduled Backups
1. Go to **Settings**.
2. Edit an existing Environment (or create a new one).
3. Scroll to the **Backup Schedule** section.
4. Enable the "Automatic Backups" toggle.
5. Enter a valid Cron expression (e.g., `*/5 * * * *` for every 5 minutes).
6. Save the environment.
7. Wait for the scheduled time and verify a new Backup task appears in the **Tasks** tab.

### 3. Manual Backup Trigger
1. Go to the **Tasks** tab.
2. Click the **Run Backup** button (top right).
3. Select the target environment from the dropdown.
4. Click **Start Backup**.
5. Watch the new task appear with "Running" status.
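Step 5 accepts any 5-field Cron expression, and T012 in tasks.md adds validation for these strings. On the backend, the cheapest check is to let APScheduler parse the expression; a sketch of such a validator (not necessarily the shipped one):

```python
# Sketch of a backend-side cron validator; CronTrigger.from_crontab raises
# ValueError for malformed 5-field expressions.
from apscheduler.triggers.cron import CronTrigger


def is_valid_cron(expression: str) -> bool:
    try:
        CronTrigger.from_crontab(expression)
        return True
    except ValueError:
        return False


assert is_valid_cron("0 0 * * *")      # daily at midnight
assert is_valid_cron("*/5 * * * *")    # every 5 minutes
assert not is_valid_cron("every day")  # rejected
```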
specs/009-backup-scheduler/research.md (Normal file, +29)
@@ -0,0 +1,29 @@
# Research: Backup Scheduler & Unified Task UI

## Decisions

### 1. Scheduler Implementation
- **Decision**: Use `APScheduler` (Advanced Python Scheduler) with `BackgroundScheduler`.
- **Rationale**: `APScheduler` is the industry standard for Python scheduling. It supports Cron-style scheduling (required by FR-001), runs in the background (FR-003), and handles thread management. It integrates well with FastAPI lifecycles.
- **Alternatives Considered**:
  - `cron` (system level): Harder to manage from within the app, requires OS access.
  - `schedule` library: Simpler but lacks advanced Cron features and persistence robustness.
  - Custom thread loop: Error-prone and reinvents the wheel.

### 2. Task History Database
- **Decision**: SQLite (`tasks.db`) accessed via `SQLAlchemy` (AsyncIO).
- **Rationale**: The spec explicitly requests `tasks.db` (FR-009). SQLAlchemy provides a robust ORM for the `Task` entity. Using AsyncIO ensures non-blocking database operations within the FastAPI event loop, even if the actual backup tasks run in threads.
- **Alternatives Considered**:
  - `JSON` files: Poor performance for filtering/sorting logs (FR-006).
  - `PostgreSQL`: Overkill for a local tool configuration.

### 3. Concurrency Handling
- **Decision**: Skip scheduled backups if a backup is already running for the *same* environment. Allow concurrent backups for *different* environments.
- **Rationale**: Prevents resource contention and potential corruption of the same target.
- **Implementation**: The `SchedulerService` will check `TaskManager` for active jobs with the same `environment_id` before triggering.

### 4. Frontend Polling vs WebSockets
- **Decision**: Polling (every 2-5 seconds) for the "Tasks" tab.
- **Rationale**: Simpler to implement than WebSockets for this scale. The requirement is "near real-time" (SC-002: latency < 5s), which polling satisfies easily.
- **Alternatives Considered**:
  - WebSockets: Better real-time, but higher complexity for connection management and state.
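Decisions 1 and 3 together imply a `SchedulerService` along these lines. The committed `backend/src/core/scheduler.py` is not shown in this diff, so this is a sketch: the `has_active_task`/`create_task` helpers on `TaskManager` and the `get_environments` accessor are assumed names, while the APScheduler calls are real API.

```python
# Hypothetical sketch of the SchedulerService decided on above.
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger


class SchedulerService:
    def __init__(self, task_manager, config_manager):
        self.task_manager = task_manager
        self.config_manager = config_manager
        self.scheduler = BackgroundScheduler()

    def start(self) -> None:
        # Register one cron job per environment with an enabled schedule.
        for env in self.config_manager.get_environments():
            schedule = getattr(env, "backup_schedule", None)
            if schedule and schedule.enabled:
                self.scheduler.add_job(
                    self._run_backup,
                    CronTrigger.from_crontab(schedule.cron_expression),
                    args=[env.id],
                    id=f"backup-{env.id}",  # stable id so reloads replace the job
                    replace_existing=True,
                )
        self.scheduler.start()

    def _run_backup(self, environment_id: str) -> None:
        # Decision 3: skip if a backup is already running for this environment
        # (assumed TaskManager helpers; the real method names may differ).
        if self.task_manager.has_active_task("superset-backup", environment_id):
            return
        self.task_manager.create_task("superset-backup", {"environment_id": environment_id})
```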
specs/009-backup-scheduler/spec.md (Normal file, +115)
@@ -0,0 +1,115 @@
# Feature Specification: Backup Scheduler & Unified Task UI

**Feature Branch**: `009-backup-scheduler`
**Created**: 2025-12-30
**Status**: Draft
**Input**: User description (originally in Russian): "I want to improve the backup mechanism. It should be able to run on a schedule, and jobs and their statuses should go through the TaskManager and be available in the shared log. I think all tasks should be moved out into a separate tab: migrations, backups, and any other tasks we add in the future."

## User Scenarios & Testing *(mandatory)*

### User Story 1 - Scheduled Backups (Priority: P1)

As an Administrator, I want to configure automatic backup schedules for my Superset environments so that my data is preserved regularly without manual intervention.

**Why this priority**: Automation is the core request. It ensures data safety and reduces manual toil.

**Independent Test**: Configure a schedule (e.g., every minute for testing), wait, and verify a backup task is created and executed automatically.

**Acceptance Scenarios**:

1. **Given** an environment configuration, **When** I enable scheduled backups with a specific interval (e.g., daily), **Then** the system automatically triggers a backup task at the specified time.
2. **Given** a scheduled backup runs, **When** it completes, **Then** a new backup archive is present in the storage and a success log is recorded.

---

### User Story 2 - Unified Task Management UI (Priority: P1)

As an Administrator, I want a dedicated "Tasks" tab where I can see and manage all background operations (backups, migrations) in one place.

**Why this priority**: Centralizes visibility and control, improving usability as the number of background tasks grows.

**Independent Test**: Navigate to the new "Tasks" tab and verify it lists both manual and scheduled tasks with their current status.

**Acceptance Scenarios**:

1. **Given** the application is open, **When** I click the "Tasks" tab, **Then** I see a list of recent tasks including their type (Backup, Migration), status, and timestamp.
2. **Given** a running task, **When** I view the Tasks tab, **Then** I see the task status update in real-time (or near real-time).

---

### User Story 3 - Manual Backup Trigger (Priority: P2)

As an Administrator, I want to manually trigger a backup from the Tasks UI immediately, for example, before a major change.

**Why this priority**: Ad-hoc backups are necessary for operational safety before maintenance.

**Independent Test**: Click "Run Backup" in the UI and verify a new task starts immediately.

**Acceptance Scenarios**:

1. **Given** the Tasks tab is open, **When** I select an environment and click "Run Backup", **Then** a new backup task appears in the list with "Running" status.

---

### User Story 4 - Task History & Logs (Priority: P2)

As an Administrator, I want to view the detailed logs of any executed task to troubleshoot failures or verify success.

**Why this priority**: Essential for debugging and auditability.

**Independent Test**: Click on a completed task and verify the log output is displayed.

**Acceptance Scenarios**:

1. **Given** a list of tasks, **When** I click on a "Failed" task, **Then** I can see the error logs explaining why it failed.
2. **Given** a "Success" task, **When** I view logs, **Then** I see confirmation of the execution steps.

### Edge Cases

- **Concurrent Backups**: What happens if a scheduled backup triggers while a manual backup is already running for the same environment? (System should likely queue or skip.)
- **Storage Full**: How does the system handle backup failures due to insufficient disk space? (Should fail gracefully and log the error.)
- **Superset Offline**: What happens if the Superset environment is unreachable when a backup is triggered? (Task fails with a connection error.)

### Assumptions

- The backend server is running continuously to process scheduled tasks.
- Users have configured valid credentials for Superset environments.
- There is sufficient storage space for backup archives.

## Requirements *(mandatory)*

### Functional Requirements

- **FR-001**: The system MUST allow users to configure a backup schedule using **Cron expressions** for each defined Superset environment via the **Settings** page.
- **FR-002**: The system MUST persist schedule configurations nested within the Environment configuration in `config.json`.
- **FR-003**: The system MUST include a `SchedulerService` running in a background thread that triggers backup tasks via the `TaskManager`.
- **FR-004**: The system MUST provide a dedicated "Tasks" page in the frontend.
- **FR-005**: The "Tasks" page MUST display a unified list of all `TaskManager` jobs, including Migrations and Backups.
- **FR-006**: The "Tasks" page MUST allow users to filter tasks by status (Running, Success, Failed) and type.
- **FR-007**: The system MUST allow users to manually trigger a backup for a specific environment from the "Tasks" page.
- **FR-008**: All backup operations (scheduled or manual) MUST be executed as `TaskManager` tasks and generate standard logs.
- **FR-009**: The system MUST retain a history of task executions in a dedicated SQLite database (`tasks.db`) for long-term review.
- **FR-010**: The system MUST automatically clean up task history older than 30 days to prevent unbounded database growth.

### Key Entities

- **Task**: Represents a unit of work (Backup, Migration) managed by TaskManager. Attributes: ID, Type, Status, StartedAt, FinishedAt, Logs.
- **Schedule**: Configuration for when to run a task. Attributes: EnvironmentID, Frequency, NextRunTime, Enabled.

## Success Criteria *(mandatory)*

### Measurable Outcomes

- **SC-001**: Users can configure a backup schedule that persists and triggers automatically within 1 minute of the target time.
- **SC-002**: The "Tasks" UI displays the status of running tasks with a latency of no more than 5 seconds.
- **SC-003**: 100% of triggered backups (manual or scheduled) are recorded in the TaskManager history.
- **SC-004**: Users can access logs for any task executed in the last 7 days (or the configured retention period).

## Clarifications

### Session 2025-12-30
- Q: Where should the backup schedule configuration UI be located? → A: In the **Settings** tab, inside each Environment's edit form.
- Q: How should schedule configurations be persisted? → A: Add a `Schedule` model in `config_models.py` and nest it under `Environment`.
- Q: What format should be used for defining schedule frequency? → A: Cron-style strings (e.g., "0 0 * * *").
- Q: How should the scheduling mechanism be implemented? → A: Create a dedicated `SchedulerService` in `backend/src/core/scheduler.py` that runs in a background thread.
- Q: Where should task history be stored for long-term retention? → A: Add a `tasks.db` SQLite database using SQLAlchemy.
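FR-010's 30-day retention maps to T022 in tasks.md below. Against the `TaskRecord` model added in this change it reduces to a single delete query; a sketch, assuming it runs periodically with a session from `TasksSessionLocal` (where and how it is scheduled is not shown here):

```python
# Sketch of the FR-010 retention cleanup; import paths assume it lives next
# to the database module, which this diff does not confirm.
from datetime import datetime, timedelta, timezone

from .database import TasksSessionLocal
from ..models.task import TaskRecord


def cleanup_old_tasks(retention_days: int = 30) -> int:
    """Delete task records older than the retention window; return the count."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    session = TasksSessionLocal()
    try:
        deleted = (
            session.query(TaskRecord)
            .filter(TaskRecord.created_at < cutoff)
            .delete(synchronize_session=False)
        )
        session.commit()
        return deleted
    finally:
        session.close()
```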
specs/009-backup-scheduler/tasks.md (Normal file, +51)
@@ -0,0 +1,51 @@
# Tasks: Backup Scheduler & Unified Task UI

## Phase 1: Setup
- [x] T001 Initialize SQLite database `tasks.db` and SQLAlchemy engine in `backend/src/core/database.py`
- [x] T002 Create SQLAlchemy model for `TaskRecord` in `backend/src/models/task.py`
- [x] T003 Update `backend/src/core/config_models.py` to include `Schedule` and update the `Environment` model
- [x] T004 Create database migrations or an initialization script for `tasks.db`

## Phase 2: Foundational
- [x] T005 [P] Implement `TaskPersistence` layer in `backend/src/core/task_manager/persistence.py`
- [x] T006 Update `TaskManager` in `backend/src/core/task_manager/manager.py` to use persistence for all jobs
- [x] T007 Implement `SchedulerService` using `APScheduler` in `backend/src/core/scheduler.py`
- [x] T008 Integrate `SchedulerService` into the main FastAPI application startup in `backend/src/app.py`

## Phase 3: [US1] Scheduled Backups
- [x] T009 [US1] Implement schedule loading and registration logic in `SchedulerService`
- [x] T010 [US1] Update the `Environment` settings API to handle `backup_schedule` updates in `backend/src/api/routes/environments.py`
- [x] T011 [P] [US1] Add schedule configuration fields to the Environment edit form in `frontend/src/components/EnvSelector.svelte` (or the appropriate component)
- [x] T012 [US1] Implement validation for Cron expressions in the backend and frontend

## Phase 4: [US2] Unified Task Management UI
- [x] T013 [US2] Implement the `/api/tasks` endpoint to list and filter tasks in `backend/src/api/routes/tasks.py`
- [x] T014 [US2] Create the new Tasks page in `frontend/src/routes/tasks/+page.svelte`
- [x] T015 [P] [US2] Implement the `TaskList` component in `frontend/src/components/TaskList.svelte`
- [x] T016 [US2] Add a "Tasks" link to the main navigation in `frontend/src/components/Navbar.svelte`

## Phase 5: [US3] Manual Backup Trigger
- [x] T017 [US3] Implement the `/api/tasks/backup` POST endpoint in `backend/src/api/routes/tasks.py`
- [x] T018 [US3] Add a "Run Backup" button and environment selection to the Tasks page in `frontend/src/routes/tasks/+page.svelte`

## Phase 6: [US4] Task History & Logs
- [x] T019 [US4] Implement the `/api/tasks/{task_id}` GET endpoint for detailed task info and logs in `backend/src/api/routes/tasks.py`
- [x] T020 [US4] Implement the `TaskLogViewer` component in `frontend/src/components/TaskLogViewer.svelte`
- [x] T021 [US4] Integrate the log viewer into TaskList or as a separate modal/page

## Final Phase: Polish & Cross-cutting Concerns
- [x] T022 Implement a task cleanup/retention policy (e.g., delete tasks older than 30 days)
- [ ] T023 Add real-time updates for task status using WebSockets (optional/refinement)
- [x] T024 Ensure consistent error handling and logging across the scheduler and task manager

## Dependencies
- US1 depends on Phases 1 & 2
- US2 depends on Phases 1 & 2
- US3 depends on US2
- US4 depends on US2

## Implementation Strategy
1. **Infrastructure First**: Set up the database and basic task persistence.
2. **Backend Logic**: Implement the scheduler and update the task manager.
3. **API & UI**: Build the unified tasks view.
4. **Feature Integration**: Add the scheduling UI and manual triggers.