docs ready

2025-12-30 21:30:37 +03:00
parent 45c077b928
commit fce0941e98
12 changed files with 803 additions and 43 deletions


@@ -16,6 +16,8 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
- Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
- SQLite (optional for job history), existing database for mappings (008-migration-ui-improvements)
- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API (008-migration-ui-improvements)
- Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS (009-backup-scheduler)
- SQLite (`tasks.db`), JSON (`config.json`) (009-backup-scheduler)
- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
@@ -36,9 +38,9 @@ cd src; pytest; ruff check .
Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
## Recent Changes
- 008-migration-ui-improvements: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
- 008-migration-ui-improvements: Added Python 3.9+ (backend), Node.js 18+ (frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, Superset API
- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
- 009-backup-scheduler: Added [if applicable, e.g., PostgreSQL, CoreData, files or N/A]
<!-- MANUAL ADDITIONS START -->


@@ -9,8 +9,8 @@
#
# OPTIONS:
# --json Output in JSON format
# --require-tasks Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
# --include-tasks Include task files in AVAILABLE_DOCS list
# --require-tasks Require tasks.md to exist (for implementation phase)
# --include-tasks Include tasks.md in AVAILABLE_DOCS list
# --paths-only Only output path variables (no validation)
# --help, -h Show help message
#
@@ -49,8 +49,8 @@ Consolidated prerequisite checking for Spec-Driven Development workflow.
OPTIONS:
--json Output in JSON format
--require-tasks Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
--include-tasks Include task files in AVAILABLE_DOCS list
--require-tasks Require tasks.md to exist (for implementation phase)
--include-tasks Include tasks.md in AVAILABLE_DOCS list
--paths-only Only output path variables (no prerequisite validation)
--help, -h Show this help message
@@ -58,7 +58,7 @@ EXAMPLES:
# Check task prerequisites (plan.md required)
./check-prerequisites.sh --json
# Check implementation prerequisites (plan.md + task files required)
# Check implementation prerequisites (plan.md + tasks.md required)
./check-prerequisites.sh --json --require-tasks --include-tasks
# Get feature paths only (no validation)
@@ -86,16 +86,15 @@ check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1
if $PATHS_ONLY; then
if $JSON_MODE; then
# Minimal JSON paths payload (no validation performed)
printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS_ARCH":"%s","TASKS_DEV":"%s"}\n' \
"$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS_ARCH" "$TASKS_DEV"
printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
"$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
else
echo "REPO_ROOT: $REPO_ROOT"
echo "BRANCH: $CURRENT_BRANCH"
echo "FEATURE_DIR: $FEATURE_DIR"
echo "FEATURE_SPEC: $FEATURE_SPEC"
echo "IMPL_PLAN: $IMPL_PLAN"
echo "TASKS_ARCH: $TASKS_ARCH"
echo "TASKS_DEV: $TASKS_DEV"
echo "TASKS: $TASKS"
fi
exit 0
fi
@@ -113,21 +112,12 @@ if [[ ! -f "$IMPL_PLAN" ]]; then
exit 1
fi
# Check for task files if required
if $REQUIRE_TASKS; then
# Check for split tasks first
if [[ -f "$TASKS_ARCH" ]] && [[ -f "$TASKS_DEV" ]]; then
: # Split tasks exist, proceed
# Fallback to unified tasks.md
elif [[ -f "$TASKS" ]]; then
: # Unified tasks exist, proceed
else
echo "ERROR: No valid task files found in $FEATURE_DIR" >&2
echo "Expected 'tasks-arch.md' AND 'tasks-dev.md' (split) OR 'tasks.md' (unified)" >&2
echo "Run /speckit.tasks first to create the task lists." >&2
# Check for tasks.md if required
if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
echo "Run /speckit.tasks first to create the task list." >&2
exit 1
fi
fi
# Build list of available documents
docs=()
@@ -143,15 +133,10 @@ fi
[[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")
# Include task files if requested and they exist
if $INCLUDE_TASKS; then
if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
[[ -f "$TASKS_ARCH" ]] && docs+=("tasks-arch.md")
[[ -f "$TASKS_DEV" ]] && docs+=("tasks-dev.md")
elif [[ -f "$TASKS" ]]; then
# Include tasks.md if requested and it exists
if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
docs+=("tasks.md")
fi
fi
# Output results
if $JSON_MODE; then
@@ -176,11 +161,6 @@ else
check_file "$QUICKSTART" "quickstart.md"
if $INCLUDE_TASKS; then
if [[ -f "$TASKS_ARCH" ]] || [[ -f "$TASKS_DEV" ]]; then
check_file "$TASKS_ARCH" "tasks-arch.md"
check_file "$TASKS_DEV" "tasks-dev.md"
else
check_file "$TASKS" "tasks.md"
fi
fi
fi


@@ -143,9 +143,7 @@ HAS_GIT='$has_git_repo'
FEATURE_DIR='$feature_dir'
FEATURE_SPEC='$feature_dir/spec.md'
IMPL_PLAN='$feature_dir/plan.md'
TASKS_ARCH='$feature_dir/tasks-arch.md'
TASKS_DEV='$feature_dir/tasks-dev.md'
TASKS='$feature_dir/tasks.md' # Deprecated
TASKS='$feature_dir/tasks.md'
RESEARCH='$feature_dir/research.md'
DATA_MODEL='$feature_dir/data-model.md'
QUICKSTART='$feature_dir/quickstart.md'


@@ -0,0 +1,251 @@
---
description: "Task list template for feature implementation"
---
# Tasks: [FEATURE NAME]
**Input**: Design documents from `/specs/[###-feature-name]/`
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/
**Tests**: The examples below include test tasks. Tests are OPTIONAL - only include them if explicitly requested in the feature specification.
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
## Format: `[ID] [P?] [Story] Description`
- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions
## Path Conventions
- **Single project**: `src/`, `tests/` at repository root
- **Web app**: `backend/src/`, `frontend/src/`
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
- Paths shown below assume single project - adjust based on plan.md structure
<!--
============================================================================
IMPORTANT: The tasks below are SAMPLE TASKS for illustration purposes only.
The /speckit.tasks command MUST replace these with actual tasks based on:
- User stories from spec.md (with their priorities P1, P2, P3...)
- Feature requirements from plan.md
- Entities from data-model.md
- Endpoints from contracts/
Tasks MUST be organized by user story so each story can be:
- Implemented independently
- Tested independently
- Delivered as an MVP increment
DO NOT keep these sample tasks in the generated tasks.md file.
============================================================================
-->
## Phase 1: Setup (Shared Infrastructure)
**Purpose**: Project initialization and basic structure
- [ ] T001 Create project structure per implementation plan
- [ ] T002 Initialize [language] project with [framework] dependencies
- [ ] T003 [P] Configure linting and formatting tools
---
## Phase 2: Foundational (Blocking Prerequisites)
**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
**⚠️ CRITICAL**: No user story work can begin until this phase is complete
Examples of foundational tasks (adjust based on your project):
- [ ] T004 Setup database schema and migrations framework
- [ ] T005 [P] Implement authentication/authorization framework
- [ ] T006 [P] Setup API routing and middleware structure
- [ ] T007 Create base models/entities that all stories depend on
- [ ] T008 Configure error handling and logging infrastructure
- [ ] T009 Setup environment configuration management
**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
---
## Phase 3: User Story 1 - [Title] (Priority: P1) 🎯 MVP
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 1 (OPTIONAL - only if tests requested) ⚠️
> **NOTE: Write these tests FIRST, ensure they FAIL before implementation**
- [ ] T010 [P] [US1] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T011 [P] [US1] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 1
- [ ] T012 [P] [US1] Create [Entity1] model in src/models/[entity1].py
- [ ] T013 [P] [US1] Create [Entity2] model in src/models/[entity2].py
- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T016 [US1] Add validation and error handling
- [ ] T017 [US1] Add logging for user story 1 operations
**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently
---
## Phase 4: User Story 2 - [Title] (Priority: P2)
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 2 (OPTIONAL - only if tests requested) ⚠️
- [ ] T018 [P] [US2] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T019 [P] [US2] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 2
- [ ] T020 [P] [US2] Create [Entity] model in src/models/[entity].py
- [ ] T021 [US2] Implement [Service] in src/services/[service].py
- [ ] T022 [US2] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T023 [US2] Integrate with User Story 1 components (if needed)
**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently
---
## Phase 5: User Story 3 - [Title] (Priority: P3)
**Goal**: [Brief description of what this story delivers]
**Independent Test**: [How to verify this story works on its own]
### Tests for User Story 3 (OPTIONAL - only if tests requested) ⚠️
- [ ] T024 [P] [US3] Contract test for [endpoint] in tests/contract/test_[name].py
- [ ] T025 [P] [US3] Integration test for [user journey] in tests/integration/test_[name].py
### Implementation for User Story 3
- [ ] T026 [P] [US3] Create [Entity] model in src/models/[entity].py
- [ ] T027 [US3] Implement [Service] in src/services/[service].py
- [ ] T028 [US3] Implement [endpoint/feature] in src/[location]/[file].py
**Checkpoint**: All user stories should now be independently functional
---
[Add more user story phases as needed, following the same pattern]
---
## Phase N: Polish & Cross-Cutting Concerns
**Purpose**: Improvements that affect multiple user stories
- [ ] TXXX [P] Documentation updates in docs/
- [ ] TXXX Code cleanup and refactoring
- [ ] TXXX Performance optimization across all stories
- [ ] TXXX [P] Additional unit tests (if requested) in tests/unit/
- [ ] TXXX Security hardening
- [ ] TXXX Run quickstart.md validation
---
## Dependencies & Execution Order
### Phase Dependencies
- **Setup (Phase 1)**: No dependencies - can start immediately
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
- User stories can then proceed in parallel (if staffed)
- Or sequentially in priority order (P1 → P2 → P3)
- **Polish (Final Phase)**: Depends on all desired user stories being complete
### User Story Dependencies
- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable
### Within Each User Story
- Tests (if included) MUST be written and FAIL before implementation
- Models before services
- Services before endpoints
- Core implementation before integration
- Story complete before moving to next priority
### Parallel Opportunities
- All Setup tasks marked [P] can run in parallel
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
- Once Foundational phase completes, all user stories can start in parallel (if team capacity allows)
- All tests for a user story marked [P] can run in parallel
- Models within a story marked [P] can run in parallel
- Different user stories can be worked on in parallel by different team members
---
## Parallel Example: User Story 1
```bash
# Launch all tests for User Story 1 together (if tests requested):
Task: "Contract test for [endpoint] in tests/contract/test_[name].py"
Task: "Integration test for [user journey] in tests/integration/test_[name].py"
# Launch all models for User Story 1 together:
Task: "Create [Entity1] model in src/models/[entity1].py"
Task: "Create [Entity2] model in src/models/[entity2].py"
```
---
## Implementation Strategy
### MVP First (User Story 1 Only)
1. Complete Phase 1: Setup
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
3. Complete Phase 3: User Story 1
4. **STOP and VALIDATE**: Test User Story 1 independently
5. Deploy/demo if ready
### Incremental Delivery
1. Complete Setup + Foundational → Foundation ready
2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
3. Add User Story 2 → Test independently → Deploy/Demo
4. Add User Story 3 → Test independently → Deploy/Demo
5. Each story adds value without breaking previous stories
### Parallel Team Strategy
With multiple developers:
1. Team completes Setup + Foundational together
2. Once Foundational is done:
- Developer A: User Story 1
- Developer B: User Story 2
- Developer C: User Story 3
3. Stories complete and integrate independently
---
## Notes
- [P] tasks = different files, no dependencies
- [Story] label maps task to specific user story for traceability
- Each user story should be independently completable and testable
- Verify tests fail before implementing
- Commit after each task or logical group
- Stop at any checkpoint to validate story independently
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence


@@ -0,0 +1,34 @@
# Specification Quality Checklist: Backup Scheduler & Unified Task UI
**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2025-12-30
**Feature**: [Link to spec.md](../spec.md)
## Content Quality
- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed
## Requirement Completeness
- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified
## Feature Readiness
- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification
## Notes
- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`


@@ -0,0 +1,154 @@
openapi: 3.0.0
info:
title: Backup Scheduler & Task API
version: 1.0.0
paths:
/tasks:
get:
summary: List all tasks
parameters:
- name: limit
in: query
schema:
type: integer
default: 50
- name: type
in: query
schema:
type: string
enum: [backup, migration]
- name: status
in: query
schema:
type: string
enum: [running, success, failed, pending]
responses:
'200':
description: List of tasks
content:
application/json:
schema:
type: array
items:
$ref: '#/components/schemas/Task'
/tasks/backup:
post:
summary: Manually trigger a backup
requestBody:
required: true
content:
application/json:
schema:
type: object
required:
- environment_id
properties:
environment_id:
type: string
format: uuid
responses:
'202':
description: Backup task started
content:
application/json:
schema:
$ref: '#/components/schemas/Task'
/tasks/{id}:
get:
summary: Get task details
parameters:
- name: id
in: path
required: true
schema:
type: string
format: uuid
responses:
'200':
description: Task details
content:
application/json:
schema:
$ref: '#/components/schemas/Task'
/environments/{id}/schedule:
get:
summary: Get backup schedule for environment
parameters:
- name: id
in: path
required: true
schema:
type: string
format: uuid
responses:
'200':
description: Schedule configuration
content:
application/json:
schema:
$ref: '#/components/schemas/Schedule'
put:
summary: Update backup schedule
parameters:
- name: id
in: path
required: true
schema:
type: string
format: uuid
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/Schedule'
responses:
'200':
description: Schedule updated
content:
application/json:
schema:
$ref: '#/components/schemas/Schedule'
components:
schemas:
Task:
type: object
properties:
id:
type: string
format: uuid
type:
type: string
enum: [backup, migration]
status:
type: string
enum: [pending, running, success, failed]
environment_id:
type: string
format: uuid
started_at:
type: string
format: date-time
finished_at:
type: string
format: date-time
created_at:
type: string
format: date-time
error:
type: string
logs:
type: string
Schedule:
type: object
properties:
enabled:
type: boolean
cron_expression:
type: string
example: "0 0 * * *"
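On the FastAPI side, the `Task` schema above maps naturally onto a Pydantic model. A minimal sketch (field and enum names follow the contract; the class itself is illustrative, not the committed implementation):

```python
from datetime import datetime
from enum import Enum
from typing import Optional
from uuid import UUID

from pydantic import BaseModel


class TaskType(str, Enum):
    backup = "backup"
    migration = "migration"


class TaskStatus(str, Enum):
    pending = "pending"
    running = "running"
    success = "success"
    failed = "failed"


class Task(BaseModel):
    # Mirrors #/components/schemas/Task; the datetime fields are optional
    # because a pending task has no started_at/finished_at yet.
    id: UUID
    type: TaskType
    status: TaskStatus
    environment_id: Optional[UUID] = None
    started_at: Optional[datetime] = None
    finished_at: Optional[datetime] = None
    created_at: Optional[datetime] = None
    error: Optional[str] = None
    logs: Optional[str] = None
```

Keeping the enums in one place lets the same definitions drive both response serialization and the `type`/`status` query-parameter validation on `GET /tasks`.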


@@ -0,0 +1,34 @@
# Data Model: Backup Scheduler & Unified Task UI
## Entities
### Task
Represents a background operation (Backup, Migration) managed by the system.
| Field | Type | Description | Constraints |
|-------|------|-------------|-------------|
| `id` | UUID | Unique identifier | Primary Key |
| `type` | String | Type of task | Enum: "backup", "migration" |
| `status` | String | Current execution state | Enum: "pending", "running", "success", "failed" |
| `environment_id` | UUID | Target environment (if applicable) | Foreign Key (Environments), Nullable |
| `started_at` | DateTime | When the task began | Nullable |
| `finished_at` | DateTime | When the task completed | Nullable |
| `logs` | Text | Execution logs | |
| `error` | Text | Error message if failed | Nullable |
| `created_at` | DateTime | When task was queued | Default: Now |
### Schedule
Configuration for automatic task execution. Nested within Environment config.
| Field | Type | Description | Constraints |
|-------|------|-------------|-------------|
| `environment_id` | UUID | Target environment | Foreign Key (Environments) |
| `enabled` | Boolean | Is schedule active? | Default: false |
| `cron_expression` | String | Frequency definition | Valid Cron string (e.g., "0 0 * * *") |
| `last_run_at` | DateTime | Last execution time | Nullable |
| `next_run_at` | DateTime | Calculated next run | Nullable |
## Storage Strategy
- **Tasks**: Stored in `tasks.db` (SQLite) via SQLAlchemy.
- **Schedules**: Stored in `config.json` as part of the Environment model.
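A sketch of how the Task entity might map to SQLAlchemy for `tasks.db` (assumptions: UUIDs stored as 36-character strings since SQLite has no native UUID type; class name and defaults are illustrative):

```python
import uuid
from datetime import datetime

from sqlalchemy import Column, DateTime, String, Text, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class TaskRecord(Base):
    """Task entity from the table above, persisted in tasks.db."""
    __tablename__ = "tasks"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    type = Column(String(16), nullable=False)          # "backup" | "migration"
    status = Column(String(16), nullable=False, default="pending")
    environment_id = Column(String(36), nullable=True)  # FK to Environments
    started_at = Column(DateTime, nullable=True)
    finished_at = Column(DateTime, nullable=True)
    logs = Column(Text, default="")
    error = Column(Text, nullable=True)
    created_at = Column(DateTime, default=datetime.utcnow)


# In-memory engine for illustration; the real path would be sqlite:///tasks.db
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
```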


@@ -0,0 +1,84 @@
# Implementation Plan: Backup Scheduler & Unified Task UI
**Branch**: `009-backup-scheduler` | **Date**: 2025-12-30 | **Spec**: [link](spec.md)
**Input**: Feature specification from `/specs/009-backup-scheduler/spec.md`
**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.
## Summary
Implement a robust backup scheduling system using `APScheduler` and a unified "Tasks" UI in SvelteKit to manage and monitor all background operations (backups, migrations).
## Technical Context
**Language/Version**: Python 3.9+, Node.js 18+
**Primary Dependencies**: FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
**Storage**: SQLite (`tasks.db`), JSON (`config.json`)
**Testing**: pytest
**Target Platform**: Linux server
**Project Type**: Web application
**Performance Goals**: UI latency < 200ms, Backup trigger < 1s
**Constraints**: Minimal resource footprint for background scheduler
**Scale/Scope**: ~10 environments, ~1000 historical tasks
## Constitution Check
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
- **Library-First**: N/A (Feature integration)
- **CLI Interface**: N/A (Web UI focus)
- **Test-First**: Mandatory for Scheduler logic and API endpoints. PASS.
- **Integration Testing**: Required for Scheduler -> TaskManager interaction. PASS.
**Result**: PASS
## Project Structure
### Documentation (this feature)
```text
specs/009-backup-scheduler/
├── plan.md # This file (/speckit.plan command output)
├── research.md # Phase 0 output (/speckit.plan command)
├── data-model.md # Phase 1 output (/speckit.plan command)
├── quickstart.md # Phase 1 output (/speckit.plan command)
├── contracts/ # Phase 1 output (/speckit.plan command)
└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
```
### Source Code (repository root)
```text
backend/
├── src/
│ ├── api/
│ │ └── routes/
│ │ └── tasks.py # NEW: Task management endpoints
│ ├── core/
│ │ ├── scheduler.py # NEW: APScheduler integration
│ │ └── task_manager/ # EXISTING: Updates for DB persistence
│ ├── models/
│ │ └── task.py # NEW: SQLAlchemy model
│ └── services/
└── tests/
frontend/
├── src/
│ ├── components/
│ │ ├── TaskList.svelte # NEW: Task display component
│ │ └── TaskLogViewer.svelte # NEW: Detailed log view
│ ├── routes/
│ │ └── tasks/ # NEW: Tasks page
│ │     └── +page.svelte
│ └── types/
```
**Structure Decision**: Standard FastAPI + SvelteKit structure.
## Complexity Tracking
> **Fill ONLY if Constitution Check has violations that must be justified**
| Violation | Why Needed | Simpler Alternative Rejected Because |
|-----------|------------|-------------------------------------|
| | | |


@@ -0,0 +1,28 @@
# Quickstart: Backup Scheduler & Unified Task UI
## Prerequisites
- Backend running: `cd backend && uvicorn src.app:app --reload`
- Frontend running: `cd frontend && npm run dev`
## Usage Guide
### 1. View Tasks
1. Navigate to the new **Tasks** tab in the main navigation bar.
2. Observe the list of recent tasks (Backups, Migrations).
3. Click on any task row to view detailed logs.
### 2. Configure Scheduled Backups
1. Go to **Settings**.
2. Edit an existing Environment (or create a new one).
3. Scroll to the **Backup Schedule** section.
4. Enable the "Automatic Backups" toggle.
5. Enter a valid Cron expression (e.g., `*/5 * * * *` for every 5 minutes).
6. Save the environment.
7. Wait for the scheduled time and verify a new Backup task appears in the **Tasks** tab.
### 3. Manual Backup Trigger
1. Go to the **Tasks** tab.
2. Click the **Run Backup** button (top right).
3. Select the target environment from the dropdown.
4. Click **Start**.
5. Watch the new task appear with "Running" status.


@@ -0,0 +1,29 @@
# Research: Backup Scheduler & Unified Task UI
## Decisions
### 1. Scheduler Implementation
- **Decision**: Use `APScheduler` (Advanced Python Scheduler) with `BackgroundScheduler`.
- **Rationale**: `APScheduler` is the industry standard for Python scheduling. It supports Cron-style scheduling (required by FR-001), runs in the background (FR-003), and handles thread management. It integrates well with FastAPI lifecycles.
- **Alternatives Considered**:
- `cron` (system level): Harder to manage from within the app, requires OS access.
- `schedule` library: Simpler but lacks advanced Cron features and persistence robustness.
- Custom thread loop: Error-prone and reinvents the wheel.
### 2. Task History Database
- **Decision**: SQLite (`tasks.db`) accessed via `SQLAlchemy` (AsyncIO).
- **Rationale**: The spec explicitly requests `tasks.db` (FR-009). SQLAlchemy provides a robust ORM for the `Task` entity. Using AsyncIO ensures non-blocking database operations within the FastAPI event loop, even if the actual backup tasks run in threads.
- **Alternatives Considered**:
- `JSON` files: Poor performance for filtering/sorting logs (FR-006).
- `PostgreSQL`: Overkill for a local tool configuration.
### 3. Concurrency Handling
- **Decision**: Skip scheduled backups if a backup is already running for the *same* environment. Allow concurrent backups for *different* environments.
- **Rationale**: Prevents resource contention and potential corruption of the same target.
- **Implementation**: The `SchedulerService` will check `TaskManager` for active jobs with the same `environment_id` before triggering.
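The skip rule reduces to a pure function. A sketch (`ActiveTask` and the function name are illustrative stand-ins for whatever shape `TaskManager` actually exposes):

```python
from dataclasses import dataclass


@dataclass
class ActiveTask:
    type: str
    status: str
    environment_id: str


def should_trigger_backup(environment_id: str, active_tasks: list) -> bool:
    # Skip if a backup is already running for the SAME environment;
    # backups for other environments may proceed in parallel.
    return not any(
        t.type == "backup"
        and t.status == "running"
        and t.environment_id == environment_id
        for t in active_tasks
    )
```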
### 4. Frontend Polling vs WebSockets
- **Decision**: Polling (every 2-5 seconds) for the "Tasks" tab.
- **Rationale**: Simpler to implement than WebSockets for this scale. The requirement is "near real-time" (SC-002: latency < 5s), which polling satisfies easily.
- **Alternatives Considered**:
- WebSockets: Better real-time, but higher complexity for connection management and state.


@@ -0,0 +1,115 @@
# Feature Specification: Backup Scheduler & Unified Task UI
**Feature Branch**: `009-backup-scheduler`
**Created**: 2025-12-30
**Status**: Draft
**Input**: User description: "I want to improve the backup mechanism. It should be able to run on a schedule; jobs and their status should use the TaskManager and be available in the shared log. I think all tasks should be moved to a separate tab: migrations, backups, and other tasks we will add in the future."
## User Scenarios & Testing *(mandatory)*
### User Story 1 - Scheduled Backups (Priority: P1)
As an Administrator, I want to configure automatic backup schedules for my Superset environments so that my data is preserved regularly without manual intervention.
**Why this priority**: Automation is the core request. It ensures data safety and reduces manual toil.
**Independent Test**: Configure a schedule (e.g., every minute for testing), wait, and verify a backup task is created and executed automatically.
**Acceptance Scenarios**:
1. **Given** an environment configuration, **When** I enable scheduled backups with a specific interval (e.g., daily), **Then** the system automatically triggers a backup task at the specified time.
2. **Given** a scheduled backup runs, **When** it completes, **Then** a new backup archive is present in the storage and a success log is recorded.
---
### User Story 2 - Unified Task Management UI (Priority: P1)
As an Administrator, I want a dedicated "Tasks" tab where I can see and manage all background operations (backups, migrations) in one place.
**Why this priority**: Centralizes visibility and control, improving usability as the number of background tasks grows.
**Independent Test**: Navigate to the new "Tasks" tab and verify it lists both manual and scheduled tasks with their current status.
**Acceptance Scenarios**:
1. **Given** the application is open, **When** I click the "Tasks" tab, **Then** I see a list of recent tasks including their type (Backup, Migration), status, and timestamp.
2. **Given** a running task, **When** I view the Tasks tab, **Then** I see the task status update in real-time (or near real-time).
---
### User Story 3 - Manual Backup Trigger (Priority: P2)
As an Administrator, I want to manually trigger a backup from the Tasks UI immediately, for example, before a major change.
**Why this priority**: Ad-hoc backups are necessary for operational safety before maintenance.
**Independent Test**: Click "Run Backup" in the UI and verify a new task starts immediately.
**Acceptance Scenarios**:
1. **Given** the Tasks tab is open, **When** I select an environment and click "Run Backup", **Then** a new backup task appears in the list with "Running" status.
---
### User Story 4 - Task History & Logs (Priority: P2)
As an Administrator, I want to view the detailed logs of any executed task to troubleshoot failures or verify success.
**Why this priority**: Essential for debugging and auditability.
**Independent Test**: Click on a completed task and verify the log output is displayed.
**Acceptance Scenarios**:
1. **Given** a list of tasks, **When** I click on a "Failed" task, **Then** I can see the error logs explaining why it failed.
2. **Given** a "Success" task, **When** I view logs, **Then** I see the execution steps confirmation.
### Edge Cases
- **Concurrent Backups**: What happens if a scheduled backup triggers while a manual backup is already running for the same environment? (System should likely queue or skip).
- **Storage Full**: How does the system handle backup failures due to insufficient disk space? (Should fail gracefully and log error).
- **Superset Offline**: What happens if the Superset environment is unreachable when a backup is triggered? (Task fails with connection error).
### Assumptions
- The backend server is running continuously to process scheduled tasks.
- Users have configured valid credentials for Superset environments.
- There is sufficient storage space for backup archives.
## Requirements *(mandatory)*
### Functional Requirements
- **FR-001**: The system MUST allow users to configure a backup schedule using **Cron expressions** for each defined Superset environment via the **Settings** page.
- **FR-002**: The system MUST persist schedule configurations nested within the Environment configuration in `config.json`.
- **FR-003**: The system MUST include a `SchedulerService` running in a background thread that triggers backup tasks via the `TaskManager`.
- **FR-004**: The system MUST provide a dedicated "Tasks" page in the frontend.
- **FR-005**: The "Tasks" page MUST display a unified list of all `TaskManager` jobs, including Migrations and Backups.
- **FR-006**: The "Tasks" page MUST allow users to filter tasks by status (Running, Success, Failed) and type.
- **FR-007**: The system MUST allow users to manually trigger a backup for a specific environment from the "Tasks" page.
- **FR-008**: All backup operations (scheduled or manual) MUST be executed as `TaskManager` tasks and generate standard logs.
- **FR-009**: The system MUST retain a history of task executions in a dedicated SQLite database (`tasks.db`) for long-term review.
- **FR-010**: The system MUST automatically clean up task history older than 30 days to prevent unbounded database growth.
### Key Entities
- **Task**: Represents a unit of work (Backup, Migration) managed by TaskManager. Attributes: ID, Type, Status, StartedAt, FinishedAt, Logs.
- **Schedule**: Configuration for when to run a task. Attributes: EnvironmentID, Frequency, NextRunTime, Enabled.
## Success Criteria *(mandatory)*
### Measurable Outcomes
- **SC-001**: Users can configure a backup schedule that persists and triggers automatically within 1 minute of the target time.
- **SC-002**: The "Tasks" UI displays the status of running tasks with a latency of no more than 5 seconds.
- **SC-003**: 100% of triggered backups (manual or scheduled) are recorded in the TaskManager history.
- **SC-004**: Users can access logs for any task executed in the last 7 days (or configured retention period).
## Clarifications
### Session 2025-12-30
- Q: Where should the backup schedule configuration UI be located? → A: In the **Settings** tab, inside each Environment's edit form.
- Q: How should schedule configurations be persisted? → A: Add a `Schedule` model in `config_models.py` and nest it under `Environment`.
- Q: What format should be used for defining schedule frequency? → A: Cron-style strings (e.g., "0 0 * * *").
- Q: How should the scheduling mechanism be implemented? → A: Create a dedicated `SchedulerService` in `backend/src/core/scheduler.py` that runs in a background thread.
- Q: Where should task history be stored for long-term retention? → A: Add a `tasks.db` SQLite database using SQLAlchemy.


@@ -0,0 +1,51 @@
# Tasks: Backup Scheduler & Unified Task UI
## Phase 1: Setup
- [ ] T001 Initialize SQLite database `tasks.db` and SQLAlchemy engine in `backend/src/core/database.py`
- [ ] T002 Create SQLAlchemy model for `TaskRecord` in `backend/src/models/task.py`
- [ ] T003 Update `backend/src/core/config_models.py` to include `Schedule` and update `Environment` model
- [ ] T004 Create database migrations or initialization script for `tasks.db`
## Phase 2: Foundational
- [ ] T005 [P] Implement `TaskPersistence` layer in `backend/src/core/task_manager/persistence.py`
- [ ] T006 Update `TaskManager` in `backend/src/core/task_manager/manager.py` to use persistence for all jobs
- [ ] T007 Implement `SchedulerService` using `APScheduler` in `backend/src/core/scheduler.py`
- [ ] T008 Integrate `SchedulerService` into main FastAPI application startup in `backend/src/app.py`
## Phase 3: [US1] Scheduled Backups
- [ ] T009 [US1] Implement schedule loading and registration logic in `SchedulerService`
- [ ] T010 [US1] Update `Environment` settings API to handle `backup_schedule` updates in `backend/src/api/routes/environments.py`
- [ ] T011 [P] [US1] Add schedule configuration fields to Environment edit form in `frontend/src/components/EnvSelector.svelte` (or appropriate component)
- [ ] T012 [US1] Implement validation for Cron expressions in backend and frontend
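For T012, a stdlib-only pre-check of the five-field Cron format might look like the sketch below. It only guards the settings form against obviously malformed input; full parsing is assumed to stay with APScheduler, and named fields such as `mon` are deliberately out of scope.

```python
import re

# One field: "*", a number, an optional range, an optional step,
# and comma-separated lists thereof (e.g. "*/5", "1-5", "0,30").
FIELD = r"(\*|\d+)(-\d+)?(/\d+)?"
CRON_RE = re.compile(rf"^{FIELD}(,{FIELD})*$")


def is_valid_cron(expr: str) -> bool:
    """Return True if expr looks like a five-field cron string."""
    fields = expr.split()
    if len(fields) != 5:
        return False
    return all(CRON_RE.fullmatch(f) for f in fields)
```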
## Phase 4: [US2] Unified Task Management UI
- [ ] T013 [US2] Implement `/api/tasks` endpoint to list and filter tasks in `backend/src/api/routes/tasks.py`
- [ ] T014 [US2] Create new Tasks page in `frontend/src/routes/tasks/+page.svelte`
- [ ] T015 [P] [US2] Implement `TaskList` component in `frontend/src/components/TaskList.svelte`
- [ ] T016 [US2] Add "Tasks" link to main navigation in `frontend/src/components/Navbar.svelte`
## Phase 5: [US3] Manual Backup Trigger
- [ ] T017 [US3] Implement `/api/tasks/backup` POST endpoint in `backend/src/api/routes/tasks.py`
- [ ] T018 [US3] Add "Run Backup" button and environment selection to Tasks page in `frontend/src/routes/tasks/+page.svelte`
## Phase 6: [US4] Task History & Logs
- [ ] T019 [US4] Implement `/api/tasks/{task_id}` GET endpoint for detailed task info and logs in `backend/src/api/routes/tasks.py`
- [ ] T020 [US4] Implement `TaskLogViewer` component in `frontend/src/components/TaskLogViewer.svelte`
- [ ] T021 [US4] Integrate log viewer into TaskList or as a separate modal/page
## Final Phase: Polish & Cross-cutting concerns
- [ ] T022 Implement task cleanup/retention policy (e.g., delete tasks older than 30 days)
- [ ] T023 Add real-time updates for task status using WebSockets (optional/refinement)
- [ ] T024 Ensure consistent error handling and logging across scheduler and task manager
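For T022, the 30-day retention rule (FR-010) reduces to a date comparison. A sketch of the selection logic (the actual deletion is assumed to be a single SQL DELETE in the persistence layer):

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 30  # FR-010: prune task history older than 30 days


def select_expired(tasks, now=None):
    # `tasks` is any iterable of records with a `created_at` datetime.
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [t for t in tasks if t.created_at < cutoff]
```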
## Dependencies
- US1 depends on Phase 1 & 2
- US2 depends on Phase 1 & 2
- US3 depends on US2
- US4 depends on US2
## Implementation Strategy
1. **Infrastructure First**: Setup database and basic task persistence.
2. **Backend Logic**: Implement scheduler and update task manager.
3. **API & UI**: Build the unified tasks view.
4. **Feature Integration**: Add scheduling UI and manual triggers.