Compare commits: a43f8fb021 ... 007-migrat

11 Commits

| SHA1 |
|---|
| 3d75a21127 |
| 07914c8728 |
| cddc259b76 |
| dcbf0a7d7f |
| 65f61c1f80 |
| cb7386f274 |
| 83e34e1799 |
| d197303b9f |
| 4aa01b6470 |
| 35b423979d |
| 2ffc3cc68f |
.gitignore (vendored, 1 line changed)

@@ -61,3 +61,4 @@ keyring passwords.py
*git*
*tech_spec*
dashboards
backend/mappings.db
@@ -9,6 +9,10 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic (005-fix-ui-ws-validation)
- N/A (Configuration based) (005-fix-ui-ws-validation)
- Filesystem (plugins, logs, backups), SQLite (optional, for job history if needed) (005-fix-ui-ws-validation)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS (007-migration-dashboard-grid)
- N/A (Superset API integration) (007-migration-dashboard-grid)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API (007-migration-dashboard-grid)
- N/A (Superset API integration - read-only for metadata) (007-migration-dashboard-grid)
- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)

@@ -29,9 +33,9 @@ cd src; pytest; ruff check .
Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions

## Recent Changes
- 005-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
- 007-migration-dashboard-grid: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS
- 007-migration-dashboard-grid: Added [if applicable, e.g., PostgreSQL, CoreData, files or N/A]

<!-- MANUAL ADDITIONS START -->
.kilocodemodes (new file, 27 lines)

@@ -0,0 +1,27 @@
customModes:
  - slug: tech-lead
    name: Tech Lead
    description: Architect for contracts and scaffolding
    roleDefinition: >-
      You are Kilo Code, acting as a Technical Lead and System Architect.

      Your primary responsibility is to define the "Structure" and "Contracts" of the system before implementation, following the Semantic Code Generation Protocol.

      You operate primarily on 'tasks-arch.md' task lists.

      YOUR DUTIES:
      1. Create new files and directory structures.
      2. Define Modules, Classes, and Functions using `[DEF]` anchors.
      3. Write clear Headers with `@PURPOSE`, `@LAYER`, `@RELATION`.
      4. Define strict Contracts using `@PRE`, `@POST`, `@PARAM`, `@RETURN`.
      5. Leave the implementation body empty or with a placeholder (e.g., `pass`, `return ...`).

      YOU DO NOT WRITE BUSINESS LOGIC. Your output is the "Skeleton" and "Rules" that the Developer Agent will fill in.
    whenToUse: >-
      Use this mode during the "Architecture Phase" of a feature. Select this mode when you need to create new files, define API surfaces, or set up the project structure before coding begins.
    groups:
      - read
      - edit
      - command
      - list_files
      - search_files
@@ -1,29 +1,99 @@
# ss-tools Constitution
<!--
SYNC IMPACT REPORT
Version: 1.5.0 (Fractal Complexity Limit)
Changes:
- Added Section VI (Fractal Complexity Limit) to enforce strict module (~300 lines) and function (~30-50 lines) size limits.
- Aims to maintain semantic coherence and avoid "Attention Sink".
Templates Status:
- .specify/templates/plan-template.md: ✅ Aligned.
- .specify/templates/spec-template.md: ✅ Aligned.
- .specify/templates/tasks-arch-template.md: ✅ Aligned (New role-based split).
- .specify/templates/tasks-dev-template.md: ✅ Aligned (New role-based split).
-->
# Semantic Code Generation Constitution

## Core Principles

### I. SPA-First Architecture
The frontend MUST be a Static Single Page Application (SPA) served by the Python backend. No Node.js server is permitted in production. The backend serves the `index.html` entry point for all non-API routes.
### I. Causal Validity (Contracts First)
Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic. This ensures that the "what" and "why" govern the "how".

### II. API-Driven Communication
All data retrieval and state changes MUST be performed via the backend REST API or WebSockets. The frontend should not access the database or filesystem directly.
### II. Immutability of Architecture
Once defined, architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints for that module. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.

### III. Modern Stack Consistency
The project strictly uses SvelteKit (Frontend), FastAPI (Backend), and Tailwind CSS (Styling). New dependencies must be justified and approved.
### III. Semantic Format Compliance
All output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax with specific Metadata Tags (`@KEY`) and Graph Relations (`@RELATION`). **Crucially, the closing anchor must strictly match the full content of the opening anchor (e.g., `[DEF:module_name:Module]` must close with `[/DEF:module_name:Module]`).**

### IV. Semantic Protocol Adherence (GRACE-Poly)
All code generation and modification MUST adhere to the Semantic Protocol defined in `semantic_protocol.md`.
- **Anchors**: Use `[DEF:id:Type]` and `[/DEF:id]` to define semantic boundaries.
- **Contracts**: Define `@PRE` and `@POST` conditions in headers.
- **Logging**: Use structured logging with `[AnchorID][State]` format.
- **Immutability**: Respect architectural decisions in headers.
**Standardized Graph Relations**
To ensure the integrity of the Semantic Graph, `@RELATION` must use a strict taxonomy:
- `DEPENDS_ON` (Structural dependency)
- `CALLS` (Flow control)
- `CREATES` (Instantiation)
- `INHERITS_FROM` / `IMPLEMENTS` (OOP hierarchy)
- `READS_STATE` / `WRITES_STATE` (Data flow)
- `DISPATCHES` / `HANDLES` (Event flow)

Ad-hoc relationships are forbidden. This structure is non-negotiable as it ensures the codebase remains machine-readable, fractal-structured, and optimized for Sparse Attention navigation by AI agents.

### IV. Design by Contract (DbC)
Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints (`@PRE`, `@POST`, `@THROW`) in the metadata block before implementation. Implementation must strictly satisfy these contracts.
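A minimal sketch of a contract-first definition in its Architecture Phase form (contract only, placeholder body); the function, its name, and its rule are illustrative and are not part of this changeset:

```python
# [DEF:validate_backup_path:Function]
# @PURPOSE: (Hypothetical example) Normalize and validate a backup path before it is stored.
# @PRE: path is a caller-supplied string.
# @POST: Returns a non-empty, stripped path string.
# @THROW: ValueError if the path is empty or whitespace only.
def validate_backup_path(path: str) -> str:
    # Architecture Phase output: the contract is defined, the business logic is left as a placeholder.
    ...
# [/DEF:validate_backup_path:Function]
```

The Developer Agent later fills in a body that satisfies `@PRE`/`@POST` (see the Generation Workflow below).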
### V. Belief State Logging
Logs must define the agent's internal state for debugging and coherence checks. We use a strict format: `[{ANCHOR_ID}][{STATE}] {MESSAGE}`. For Python, a **Context Manager** pattern MUST be used to automatically handle `Entry`, `Exit`, and `Coherence` states, ensuring structural integrity and error capturing.
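A usage sketch against the `belief_scope` context manager introduced in `backend/src/core/logger.py` later in this diff (the anchor ID and message are illustrative):

```python
from backend.src.core.logger import belief_scope, logger

# Entry, Coherence:OK/Coherence:Failed, and Exit are emitted automatically by the context manager;
# messages logged inside the scope receive the [LoadConfig][Action] prefix via BeliefFormatter.
with belief_scope("LoadConfig"):
    logger.info("Reading config.json")
```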
### VI. Fractal Complexity Limit
To maintain semantic coherence and avoid "Attention Sink" issues:
- **Module Size**: If a Module body exceeds ~300 lines (or logical complexity), it MUST be refactored into sub-modules or a package structure.
- **Function Size**: Functions should fit within a standard attention "chunk" (approx. 30-50 lines). If larger, logic MUST be decomposed into helper functions with their own contracts.

This ensures every vector embedding remains sharp and focused.

## File Structure Standards

### Python Modules
Every `.py` file must start with a Module definition header (`[DEF:module_name:Module]`) containing the following tags (a sketch follows the list):
- `@SEMANTICS`: Keywords for vector search.
- `@PURPOSE`: Primary responsibility of the module.
- `@LAYER`: Architecture layer (Domain/Infra/UI).
- `@RELATION`: Dependencies.
- `@INVARIANT` & `@CONSTRAINT`: Immutable rules.
- `@PUBLIC_API`: Exported symbols.
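A module header following the tag list above might look like the sketch below; the module name, layer, relations, and rules are invented for illustration:

```python
# [DEF:backup_utils:Module]
# @SEMANTICS: backup, filesystem, retention
# @PURPOSE: (Hypothetical) Helpers for creating and pruning dashboard backups.
# @LAYER: Infra
# @RELATION: DEPENDS_ON config_manager; WRITES_STATE filesystem
# @INVARIANT: Backups are never deleted outside the retention policy.
# @CONSTRAINT: No network access from this module.
# @PUBLIC_API: create_backup, prune_backups

# ...module definitions go here...
# [/DEF:backup_utils:Module]
```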
### Svelte Components
Every `.svelte` file must start with a Component definition header (`[DEF:ComponentName:Component]`) wrapped in an HTML comment `<!-- ... -->` containing:
- `@SEMANTICS`: Keywords for vector search.
- `@PURPOSE`: Primary responsibility of the component.
- `@LAYER`: Architecture layer (UI/State/Layout).
- `@RELATION`: Child components, Stores used, API calls.
- `@PROPS`: Input properties.
- `@EVENTS`: Emitted events.
- `@INVARIANT`: Immutable UI/State rules.

## Generation Workflow
The development process follows a strict sequence enforced by Agent Roles:

### 1. Architecture Phase (Mode: `tech-lead`)
**Input**: `tasks-arch.md`
**Responsibility**:
- Analyze request and graph position.
- Generate `[DEF]` anchors, Headers, and Contracts (`@PRE`, `@POST`).
- **Output**: Scaffolding files with no implementation logic.

### 2. Implementation Phase (Mode: `code`)
**Input**: `tasks-dev.md` + Scaffolding files
**Responsibility**:
- Read contracts defined by Architect.
- Write implementation code that strictly satisfies contracts.
- **Output**: Working code with passing tests.

### 3. Validation
If logic conflicts with Contract -> Stop -> Report Error.

## Governance
This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.

### Compliance
All Pull Requests and code modifications must be verified against this Constitution. Violations of Core Principles are considered critical defects.
- **Automated Enforcement**: All code generation tools and agents must parse and validate adherence to the `[DEF]` syntax and Contract requirements.
- **Amendments**: Changes to the syntax or core principles require a formal amendment to this Constitution and a corresponding update to the constitution.
- **Review**: Code reviews must verify that implementation matches the preceding contracts and that no "naked code" exists outside of semantic anchors.
- **Compliance**: Failure to adhere to the `[DEF]` / `[/DEF]` structure (including matching closing tags) constitutes a build failure.

### Amendments
Changes to this Constitution require a formal RFC process and approval from the project lead.

**Version**: 1.0.0 | **Ratified**: 2025-12-20
**Version**: 1.5.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2025-12-27
@@ -9,8 +9,8 @@
#
# OPTIONS:
#   --json              Output in JSON format
#   --require-tasks     Require tasks.md to exist (for implementation phase)
#   --include-tasks     Include tasks.md in AVAILABLE_DOCS list
#   --require-tasks     Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
#   --include-tasks     Include task files in AVAILABLE_DOCS list
#   --paths-only        Only output path variables (no validation)
#   --help, -h          Show help message
#
@@ -49,8 +49,8 @@ Consolidated prerequisite checking for Spec-Driven Development workflow.

OPTIONS:
  --json              Output in JSON format
  --require-tasks     Require tasks.md to exist (for implementation phase)
  --include-tasks     Include tasks.md in AVAILABLE_DOCS list
  --require-tasks     Require tasks-arch.md and tasks-dev.md to exist (for implementation phase)
  --include-tasks     Include task files in AVAILABLE_DOCS list
  --paths-only        Only output path variables (no prerequisite validation)
  --help, -h          Show this help message
@@ -58,7 +58,7 @@ EXAMPLES:
  # Check task prerequisites (plan.md required)
  ./check-prerequisites.sh --json

  # Check implementation prerequisites (plan.md + tasks.md required)
  # Check implementation prerequisites (plan.md + task files required)
  ./check-prerequisites.sh --json --require-tasks --include-tasks

  # Get feature paths only (no validation)
@@ -86,15 +86,16 @@ check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1
if $PATHS_ONLY; then
    if $JSON_MODE; then
        # Minimal JSON paths payload (no validation performed)
        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS_ARCH":"%s","TASKS_DEV":"%s"}\n' \
            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS_ARCH" "$TASKS_DEV"
    else
        echo "REPO_ROOT: $REPO_ROOT"
        echo "BRANCH: $CURRENT_BRANCH"
        echo "FEATURE_DIR: $FEATURE_DIR"
        echo "FEATURE_SPEC: $FEATURE_SPEC"
        echo "IMPL_PLAN: $IMPL_PLAN"
        echo "TASKS: $TASKS"
        echo "TASKS_ARCH: $TASKS_ARCH"
        echo "TASKS_DEV: $TASKS_DEV"
    fi
    exit 0
fi
@@ -112,12 +113,19 @@ if [[ ! -f "$IMPL_PLAN" ]]; then
    exit 1
fi

# Check for tasks.md if required
if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
    echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
    echo "Run /speckit.tasks first to create the task list." >&2
# Check for task files if required
if $REQUIRE_TASKS; then
    if [[ ! -f "$TASKS_ARCH" ]]; then
        echo "ERROR: tasks-arch.md not found in $FEATURE_DIR" >&2
        echo "Run /speckit.tasks first to create the task lists." >&2
        exit 1
    fi
    if [[ ! -f "$TASKS_DEV" ]]; then
        echo "ERROR: tasks-dev.md not found in $FEATURE_DIR" >&2
        echo "Run /speckit.tasks first to create the task lists." >&2
        exit 1
    fi
fi

# Build list of available documents
docs=()
@@ -133,9 +141,10 @@ fi

[[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")

# Include tasks.md if requested and it exists
if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
    docs+=("tasks.md")
# Include task files if requested and they exist
if $INCLUDE_TASKS; then
    [[ -f "$TASKS_ARCH" ]] && docs+=("tasks-arch.md")
    [[ -f "$TASKS_DEV" ]] && docs+=("tasks-dev.md")
fi

# Output results
@@ -161,6 +170,7 @@ else
    check_file "$QUICKSTART" "quickstart.md"

    if $INCLUDE_TASKS; then
        check_file "$TASKS" "tasks.md"
        check_file "$TASKS_ARCH" "tasks-arch.md"
        check_file "$TASKS_DEV" "tasks-dev.md"
    fi
fi
@@ -143,7 +143,9 @@ HAS_GIT='$has_git_repo'
FEATURE_DIR='$feature_dir'
FEATURE_SPEC='$feature_dir/spec.md'
IMPL_PLAN='$feature_dir/plan.md'
TASKS='$feature_dir/tasks.md'
TASKS_ARCH='$feature_dir/tasks-arch.md'
TASKS_DEV='$feature_dir/tasks-dev.md'
TASKS='$feature_dir/tasks.md' # Deprecated
RESEARCH='$feature_dir/research.md'
DATA_MODEL='$feature_dir/data-model.md'
QUICKSTART='$feature_dir/quickstart.md'
.specify/templates/tasks-arch-template.md (new file, 35 lines)

@@ -0,0 +1,35 @@
---
description: "Architecture task list template (Contracts & Scaffolding)"
---

# Architecture Tasks: [FEATURE NAME]

**Role**: Architect Agent
**Goal**: Define the "What" and "Why" (Contracts, Scaffolding, Models) before implementation.
**Input**: Design documents from `/specs/[###-feature-name]/`
**Output**: Files with `[DEF]` anchors, `@PRE`/`@POST` contracts, and `@RELATION` mappings. No business logic.

## Phase 1: Setup & Models

- [ ] A001 Create/Update data models in [path] with `[DEF]` and contracts
- [ ] A002 Define API route structure/contracts in [path]
- [ ] A003 Define shared utilities/interfaces

## Phase 2: User Story 1 - [Title]

- [ ] A004 [US1] Define contracts for [Component/Service] in [path]
- [ ] A005 [US1] Define contracts for [Endpoint] in [path]
- [ ] A006 [US1] Define contracts for [Frontend Component] in [path]

## Phase 3: User Story 2 - [Title]

- [ ] A007 [US2] Define contracts for [Component/Service] in [path]
- [ ] A008 [US2] Define contracts for [Endpoint] in [path]

## Handover Checklist

- [ ] All new files created with `[DEF]` anchors
- [ ] All functions/classes have `@PURPOSE`, `@PRE`, `@POST` tags
- [ ] No "naked code" (logic outside of anchors)
- [ ] `tasks-dev.md` is ready for the Developer Agent
.specify/templates/tasks-dev-template.md (new file, 35 lines)

@@ -0,0 +1,35 @@
---
description: "Developer task list template (Implementation Logic)"
---

# Developer Tasks: [FEATURE NAME]

**Role**: Developer Agent
**Goal**: Implement the "How" (Logic, State, Error Handling) inside the defined contracts.
**Input**: `tasks-arch.md` (completed), Scaffolding files with `[DEF]` anchors.
**Output**: Working code that satisfies `@PRE`/`@POST` conditions.

## Phase 1: Setup & Models

- [ ] D001 Implement logic for [Model] in [path]
- [ ] D002 Implement logic for [API Route] in [path]
- [ ] D003 Implement shared utilities

## Phase 2: User Story 1 - [Title]

- [ ] D004 [US1] Implement logic for [Component/Service] in [path]
- [ ] D005 [US1] Implement logic for [Endpoint] in [path]
- [ ] D006 [US1] Implement logic for [Frontend Component] in [path]
- [ ] D007 [US1] Verify semantic compliance and belief state logging

## Phase 3: User Story 2 - [Title]

- [ ] D008 [US2] Implement logic for [Component/Service] in [path]
- [ ] D009 [US2] Implement logic for [Endpoint] in [path]

## Polish & Quality Assurance

- [ ] DXXX Verify all tests pass
- [ ] DXXX Check error handling and edge cases
- [ ] DXXX Ensure code style compliance
@@ -1,251 +0,0 @@
|
||||
---
|
||||
|
||||
description: "Task list template for feature implementation"
|
||||
---
|
||||
|
||||
# Tasks: [FEATURE NAME]
|
||||
|
||||
**Input**: Design documents from `/specs/[###-feature-name]/`
|
||||
**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/
|
||||
|
||||
**Tests**: The examples below include test tasks. Tests are OPTIONAL - only include them if explicitly requested in the feature specification.
|
||||
|
||||
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
|
||||
|
||||
## Format: `[ID] [P?] [Story] Description`
|
||||
|
||||
- **[P]**: Can run in parallel (different files, no dependencies)
|
||||
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
|
||||
- Include exact file paths in descriptions
|
||||
|
||||
## Path Conventions
|
||||
|
||||
- **Single project**: `src/`, `tests/` at repository root
|
||||
- **Web app**: `backend/src/`, `frontend/src/`
|
||||
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
|
||||
- Paths shown below assume single project - adjust based on plan.md structure
|
||||
|
||||
<!--
|
||||
============================================================================
|
||||
IMPORTANT: The tasks below are SAMPLE TASKS for illustration purposes only.
|
||||
|
||||
The /speckit.tasks command MUST replace these with actual tasks based on:
|
||||
- User stories from spec.md (with their priorities P1, P2, P3...)
|
||||
- Feature requirements from plan.md
|
||||
- Entities from data-model.md
|
||||
- Endpoints from contracts/
|
||||
|
||||
Tasks MUST be organized by user story so each story can be:
|
||||
- Implemented independently
|
||||
- Tested independently
|
||||
- Delivered as an MVP increment
|
||||
|
||||
DO NOT keep these sample tasks in the generated tasks.md file.
|
||||
============================================================================
|
||||
-->
|
||||
|
||||
## Phase 1: Setup (Shared Infrastructure)
|
||||
|
||||
**Purpose**: Project initialization and basic structure
|
||||
|
||||
- [ ] T001 Create project structure per implementation plan
|
||||
- [ ] T002 Initialize [language] project with [framework] dependencies
|
||||
- [ ] T003 [P] Configure linting and formatting tools
|
||||
|
||||
---
|
||||
|
||||
## Phase 2: Foundational (Blocking Prerequisites)
|
||||
|
||||
**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
|
||||
|
||||
**⚠️ CRITICAL**: No user story work can begin until this phase is complete
|
||||
|
||||
Examples of foundational tasks (adjust based on your project):
|
||||
|
||||
- [ ] T004 Setup database schema and migrations framework
|
||||
- [ ] T005 [P] Implement authentication/authorization framework
|
||||
- [ ] T006 [P] Setup API routing and middleware structure
|
||||
- [ ] T007 Create base models/entities that all stories depend on
|
||||
- [ ] T008 Configure error handling and logging infrastructure
|
||||
- [ ] T009 Setup environment configuration management
|
||||
|
||||
**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
|
||||
|
||||
---
|
||||
|
||||
## Phase 3: User Story 1 - [Title] (Priority: P1) 🎯 MVP
|
||||
|
||||
**Goal**: [Brief description of what this story delivers]
|
||||
|
||||
**Independent Test**: [How to verify this story works on its own]
|
||||
|
||||
### Tests for User Story 1 (OPTIONAL - only if tests requested) ⚠️
|
||||
|
||||
> **NOTE: Write these tests FIRST, ensure they FAIL before implementation**
|
||||
|
||||
- [ ] T010 [P] [US1] Contract test for [endpoint] in tests/contract/test_[name].py
|
||||
- [ ] T011 [P] [US1] Integration test for [user journey] in tests/integration/test_[name].py
|
||||
|
||||
### Implementation for User Story 1
|
||||
|
||||
- [ ] T012 [P] [US1] Create [Entity1] model in src/models/[entity1].py
|
||||
- [ ] T013 [P] [US1] Create [Entity2] model in src/models/[entity2].py
|
||||
- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
|
||||
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
|
||||
- [ ] T016 [US1] Add validation and error handling
|
||||
- [ ] T017 [US1] Add logging for user story 1 operations
|
||||
|
||||
**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently
|
||||
|
||||
---
|
||||
|
||||
## Phase 4: User Story 2 - [Title] (Priority: P2)
|
||||
|
||||
**Goal**: [Brief description of what this story delivers]
|
||||
|
||||
**Independent Test**: [How to verify this story works on its own]
|
||||
|
||||
### Tests for User Story 2 (OPTIONAL - only if tests requested) ⚠️
|
||||
|
||||
- [ ] T018 [P] [US2] Contract test for [endpoint] in tests/contract/test_[name].py
|
||||
- [ ] T019 [P] [US2] Integration test for [user journey] in tests/integration/test_[name].py
|
||||
|
||||
### Implementation for User Story 2
|
||||
|
||||
- [ ] T020 [P] [US2] Create [Entity] model in src/models/[entity].py
|
||||
- [ ] T021 [US2] Implement [Service] in src/services/[service].py
|
||||
- [ ] T022 [US2] Implement [endpoint/feature] in src/[location]/[file].py
|
||||
- [ ] T023 [US2] Integrate with User Story 1 components (if needed)
|
||||
|
||||
**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently
|
||||
|
||||
---
|
||||
|
||||
## Phase 5: User Story 3 - [Title] (Priority: P3)
|
||||
|
||||
**Goal**: [Brief description of what this story delivers]
|
||||
|
||||
**Independent Test**: [How to verify this story works on its own]
|
||||
|
||||
### Tests for User Story 3 (OPTIONAL - only if tests requested) ⚠️
|
||||
|
||||
- [ ] T024 [P] [US3] Contract test for [endpoint] in tests/contract/test_[name].py
|
||||
- [ ] T025 [P] [US3] Integration test for [user journey] in tests/integration/test_[name].py
|
||||
|
||||
### Implementation for User Story 3
|
||||
|
||||
- [ ] T026 [P] [US3] Create [Entity] model in src/models/[entity].py
|
||||
- [ ] T027 [US3] Implement [Service] in src/services/[service].py
|
||||
- [ ] T028 [US3] Implement [endpoint/feature] in src/[location]/[file].py
|
||||
|
||||
**Checkpoint**: All user stories should now be independently functional
|
||||
|
||||
---
|
||||
|
||||
[Add more user story phases as needed, following the same pattern]
|
||||
|
||||
---
|
||||
|
||||
## Phase N: Polish & Cross-Cutting Concerns
|
||||
|
||||
**Purpose**: Improvements that affect multiple user stories
|
||||
|
||||
- [ ] TXXX [P] Documentation updates in docs/
|
||||
- [ ] TXXX Code cleanup and refactoring
|
||||
- [ ] TXXX Performance optimization across all stories
|
||||
- [ ] TXXX [P] Additional unit tests (if requested) in tests/unit/
|
||||
- [ ] TXXX Security hardening
|
||||
- [ ] TXXX Run quickstart.md validation
|
||||
|
||||
---
|
||||
|
||||
## Dependencies & Execution Order
|
||||
|
||||
### Phase Dependencies
|
||||
|
||||
- **Setup (Phase 1)**: No dependencies - can start immediately
|
||||
- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
|
||||
- **User Stories (Phase 3+)**: All depend on Foundational phase completion
|
||||
- User stories can then proceed in parallel (if staffed)
|
||||
- Or sequentially in priority order (P1 → P2 → P3)
|
||||
- **Polish (Final Phase)**: Depends on all desired user stories being complete
|
||||
|
||||
### User Story Dependencies
|
||||
|
||||
- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
|
||||
- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
|
||||
- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable
|
||||
|
||||
### Within Each User Story
|
||||
|
||||
- Tests (if included) MUST be written and FAIL before implementation
|
||||
- Models before services
|
||||
- Services before endpoints
|
||||
- Core implementation before integration
|
||||
- Story complete before moving to next priority
|
||||
|
||||
### Parallel Opportunities
|
||||
|
||||
- All Setup tasks marked [P] can run in parallel
|
||||
- All Foundational tasks marked [P] can run in parallel (within Phase 2)
|
||||
- Once Foundational phase completes, all user stories can start in parallel (if team capacity allows)
|
||||
- All tests for a user story marked [P] can run in parallel
|
||||
- Models within a story marked [P] can run in parallel
|
||||
- Different user stories can be worked on in parallel by different team members
|
||||
|
||||
---
|
||||
|
||||
## Parallel Example: User Story 1
|
||||
|
||||
```bash
|
||||
# Launch all tests for User Story 1 together (if tests requested):
|
||||
Task: "Contract test for [endpoint] in tests/contract/test_[name].py"
|
||||
Task: "Integration test for [user journey] in tests/integration/test_[name].py"
|
||||
|
||||
# Launch all models for User Story 1 together:
|
||||
Task: "Create [Entity1] model in src/models/[entity1].py"
|
||||
Task: "Create [Entity2] model in src/models/[entity2].py"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Implementation Strategy
|
||||
|
||||
### MVP First (User Story 1 Only)
|
||||
|
||||
1. Complete Phase 1: Setup
|
||||
2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
|
||||
3. Complete Phase 3: User Story 1
|
||||
4. **STOP and VALIDATE**: Test User Story 1 independently
|
||||
5. Deploy/demo if ready
|
||||
|
||||
### Incremental Delivery
|
||||
|
||||
1. Complete Setup + Foundational → Foundation ready
|
||||
2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
|
||||
3. Add User Story 2 → Test independently → Deploy/Demo
|
||||
4. Add User Story 3 → Test independently → Deploy/Demo
|
||||
5. Each story adds value without breaking previous stories
|
||||
|
||||
### Parallel Team Strategy
|
||||
|
||||
With multiple developers:
|
||||
|
||||
1. Team completes Setup + Foundational together
|
||||
2. Once Foundational is done:
|
||||
- Developer A: User Story 1
|
||||
- Developer B: User Story 2
|
||||
- Developer C: User Story 3
|
||||
3. Stories complete and integrate independently
|
||||
|
||||
---
|
||||
|
||||
## Notes
|
||||
|
||||
- [P] tasks = different files, no dependencies
|
||||
- [Story] label maps task to specific user story for traceability
|
||||
- Each user story should be independently completable and testable
|
||||
- Verify tests fail before implementing
|
||||
- Commit after each task or logical group
|
||||
- Stop at any checkpoint to validate story independently
|
||||
- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
|
||||
Binary file not shown.
@@ -16,7 +16,7 @@ from ...core.config_models import AppConfig, Environment, GlobalSettings
from ...dependencies import get_config_manager
from ...core.config_manager import ConfigManager
from ...core.logger import logger
from superset_tool.client import SupersetClient
from ...core.superset_client import SupersetClient
from superset_tool.models import SupersetConfig
import os
# [/SECTION]
@@ -16,7 +16,7 @@ import os
from pathlib import Path
from typing import Optional, List
from .config_models import AppConfig, Environment, GlobalSettings
from .logger import logger
from .logger import logger, configure_logger
# [/SECTION]

# [DEF:ConfigManager:Class]
@@ -39,6 +39,9 @@ class ConfigManager:
        self.config_path = Path(config_path)
        self.config: AppConfig = self._load_config()

        # Configure logger with loaded settings
        configure_logger(self.config.settings.logging)

        # 3. Runtime check of @POST
        assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"

@@ -121,6 +124,9 @@ class ConfigManager:
        self.config.settings = settings
        self.save()

        # Reconfigure logger with new settings
        configure_logger(settings.logging)

        logger.info(f"[update_global_settings][Exit] Settings updated")
        # [/DEF:update_global_settings]
@@ -19,11 +19,22 @@ class Environment(BaseModel):
    is_default: bool = False
# [/DEF:Environment]

# [DEF:LoggingConfig:DataClass]
# @PURPOSE: Defines the configuration for the application's logging system.
class LoggingConfig(BaseModel):
    level: str = "INFO"
    file_path: Optional[str] = "logs/app.log"
    max_bytes: int = 10 * 1024 * 1024
    backup_count: int = 5
    enable_belief_state: bool = True
# [/DEF:LoggingConfig]

# [DEF:GlobalSettings:DataClass]
# @PURPOSE: Represents global application settings.
class GlobalSettings(BaseModel):
    backup_path: str
    default_environment_id: Optional[str] = None
    logging: LoggingConfig = Field(default_factory=LoggingConfig)
# [/DEF:GlobalSettings]

# [DEF:AppConfig:DataClass]
@@ -4,12 +4,32 @@
# @LAYER: Core
# @RELATION: Used by the main application and other modules to log events. The WebSocketLogHandler is used by the WebSocket endpoint in app.py.
import logging
import threading
from datetime import datetime
from typing import Dict, Any, List, Optional
from collections import deque
from contextlib import contextmanager
from logging.handlers import RotatingFileHandler

from pydantic import BaseModel, Field

# Thread-local storage for belief state
_belief_state = threading.local()

# Global flag for belief state logging
_enable_belief_state = True

# [DEF:BeliefFormatter:Class]
# @PURPOSE: Custom logging formatter that adds belief state prefixes to log messages.
class BeliefFormatter(logging.Formatter):
    def format(self, record):
        msg = super().format(record)
        anchor_id = getattr(_belief_state, 'anchor_id', None)
        if anchor_id:
            msg = f"[{anchor_id}][Action] {msg}"
        return msg
# [/DEF:BeliefFormatter]

# Re-using LogEntry from task_manager for consistency
# [DEF:LogEntry:Class]
# @SEMANTICS: log, entry, record, pydantic
@@ -22,6 +42,81 @@ class LogEntry(BaseModel):

# [/DEF]

# [DEF:BeliefScope:Function]
# @PURPOSE: Context manager for structured Belief State logging.
@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
    # Log Entry if enabled
    if _enable_belief_state:
        entry_msg = f"[{anchor_id}][Entry]"
        if message:
            entry_msg += f" {message}"
        logger.info(entry_msg)

    # Set thread-local anchor_id
    old_anchor = getattr(_belief_state, 'anchor_id', None)
    _belief_state.anchor_id = anchor_id

    try:
        yield
        # Log Coherence OK and Exit
        logger.info(f"[{anchor_id}][Coherence:OK]")
        if _enable_belief_state:
            logger.info(f"[{anchor_id}][Exit]")
    except Exception as e:
        # Log Coherence Failed
        logger.info(f"[{anchor_id}][Coherence:Failed] {str(e)}")
        raise
    finally:
        # Restore old anchor
        _belief_state.anchor_id = old_anchor

# [/DEF:BeliefScope]

# [DEF:ConfigureLogger:Function]
# @PURPOSE: Configures the logger with the provided logging settings.
# @PRE: config is a valid LoggingConfig instance.
# @POST: Logger level, handlers, and belief state flag are updated.
# @PARAM: config (LoggingConfig) - The logging configuration.
def configure_logger(config):
    global _enable_belief_state
    _enable_belief_state = config.enable_belief_state

    # Set logger level
    level = getattr(logging, config.level.upper(), logging.INFO)
    logger.setLevel(level)

    # Remove existing file handlers
    handlers_to_remove = [h for h in logger.handlers if isinstance(h, RotatingFileHandler)]
    for h in handlers_to_remove:
        logger.removeHandler(h)
        h.close()

    # Add file handler if file_path is set
    if config.file_path:
        import os
        from pathlib import Path
        log_file = Path(config.file_path)
        log_file.parent.mkdir(parents=True, exist_ok=True)

        file_handler = RotatingFileHandler(
            config.file_path,
            maxBytes=config.max_bytes,
            backupCount=config.backup_count
        )
        file_handler.setFormatter(BeliefFormatter(
            '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
        ))
        logger.addHandler(file_handler)

    # Update existing handlers' formatters to BeliefFormatter
    for handler in logger.handlers:
        if not isinstance(handler, RotatingFileHandler):
            handler.setFormatter(BeliefFormatter(
                '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
            ))
# [/DEF:ConfigureLogger]

# [DEF:WebSocketLogHandler:Class]
# @SEMANTICS: logging, handler, websocket, buffer
# @PURPOSE: A custom logging handler that captures log records into a buffer. It is designed to be extended for real-time log streaming over WebSockets.
@@ -72,7 +167,7 @@ logger = logging.getLogger("superset_tools_app")
logger.setLevel(logging.INFO)

# Create a formatter
formatter = logging.Formatter(
formatter = BeliefFormatter(
    '[%(asctime)s][%(levelname)s][%(name)s] %(message)s'
)
backend/tests/test_logger.py (new file, 44 lines)

@@ -0,0 +1,44 @@
import pytest
from backend.src.core.logger import belief_scope, logger


def test_belief_scope_logs_entry_action_exit(caplog):
    """Test that belief_scope generates [ID][Entry], [ID][Action], and [ID][Exit] logs."""
    caplog.set_level("INFO")

    with belief_scope("TestFunction"):
        logger.info("Doing something important")

    # Check that the logs contain the expected patterns
    log_messages = [record.message for record in caplog.records]

    assert any("[TestFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
    assert any("[TestFunction][Action] Doing something important" in msg for msg in log_messages), "Action log not found"
    assert any("[TestFunction][Exit]" in msg for msg in log_messages), "Exit log not found"


def test_belief_scope_error_handling(caplog):
    """Test that belief_scope logs Coherence:Failed on exception."""
    caplog.set_level("INFO")

    with pytest.raises(ValueError):
        with belief_scope("FailingFunction"):
            raise ValueError("Something went wrong")

    log_messages = [record.message for record in caplog.records]

    assert any("[FailingFunction][Entry]" in msg for msg in log_messages), "Entry log not found"
    assert any("[FailingFunction][Coherence:Failed]" in msg for msg in log_messages), "Failed coherence log not found"
    # Exit should not be logged on failure


def test_belief_scope_success_coherence(caplog):
    """Test that belief_scope logs Coherence:OK on success."""
    caplog.set_level("INFO")

    with belief_scope("SuccessFunction"):
        pass

    log_messages = [record.message for record in caplog.records]

    assert any("[SuccessFunction][Coherence:OK]" in msg for msg in log_messages), "Success coherence log not found"
frontend/.svelte-kit/ambient.d.ts (vendored, 96 lines changed): regenerated `$env/static/private` and `$env/dynamic/private` environment typings, plus other auto-generated `.svelte-kit` build output (HTML shell/error templates and a `version_hash` change from "1pvaiah" to "n7gbte").
@@ -16,6 +16,12 @@
        >
          Dashboard
        </a>
        <a
          href="/migration"
          class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/migration') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
        >
          Migration
        </a>
        <a
          href="/settings"
          class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
@@ -21,7 +21,14 @@
    environments: [],
    settings: {
      backup_path: '',
      default_environment_id: null
      default_environment_id: null,
      logging: {
        level: 'INFO',
        file_path: 'logs/app.log',
        max_bytes: 10485760,
        backup_count: 5,
        enable_belief_state: true
      }
    }
  };
@@ -180,10 +187,43 @@
          <label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Storage Path</label>
          <input type="text" id="backup_path" bind:value={settings.settings.backup_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
        </div>
        <button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max">
        </div>

        <h3 class="text-lg font-medium mb-4 mt-6">Logging Configuration</h3>
        <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
          <div>
            <label for="log_level" class="block text-sm font-medium text-gray-700">Log Level</label>
            <select id="log_level" bind:value={settings.settings.logging.level} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2">
              <option value="DEBUG">DEBUG</option>
              <option value="INFO">INFO</option>
              <option value="WARNING">WARNING</option>
              <option value="ERROR">ERROR</option>
              <option value="CRITICAL">CRITICAL</option>
            </select>
          </div>
          <div>
            <label for="log_file_path" class="block text-sm font-medium text-gray-700">Log File Path</label>
            <input type="text" id="log_file_path" bind:value={settings.settings.logging.file_path} placeholder="logs/app.log" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
          </div>
          <div>
            <label for="log_max_bytes" class="block text-sm font-medium text-gray-700">Max File Size (MB)</label>
            <input type="number" id="log_max_bytes" bind:value={settings.settings.logging.max_bytes} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
          </div>
          <div>
            <label for="log_backup_count" class="block text-sm font-medium text-gray-700">Backup Count</label>
            <input type="number" id="log_backup_count" bind:value={settings.settings.logging.backup_count} min="1" step="1" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
          </div>
          <div class="md:col-span-2">
            <label class="flex items-center">
              <input type="checkbox" id="enable_belief_state" bind:checked={settings.settings.logging.enable_belief_state} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
              <span class="ml-2 block text-sm text-gray-900">Enable Belief State Logging</span>
            </label>
          </div>
        </div>

        <button on:click={handleSaveGlobal} class="bg-blue-500 text-white px-4 py-2 rounded hover:bg-blue-600 w-max mt-4">
          Save Global Settings
        </button>
      </div>
    </section>

    <section class="mb-8 bg-white p-6 rounded shadow">
@@ -13,6 +13,7 @@ This protocol standardizes the "Semantic Bridge" between the two languages using
2. **Immutability:** Architectural decisions defined in the Module/Component Header are treated as immutable constraints.
3. **Format Compliance:** Output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax for structure.
4. **Logic over Assertion:** Contracts define the *logic flow*. Do not generate explicit `assert` statements unless requested. The code logic itself must inherently satisfy the Pre/Post conditions (e.g., via control flow, guards, or types); see the sketch after this list.
5. **Fractal Complexity:** Modules and functions must adhere to strict size limits (~300 lines/module, ~30-50 lines/function) to maintain semantic focus.
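A minimal sketch of the "Logic over Assertion" rule: the pre-condition is enforced by a guard in the control flow rather than an `assert` (the function and its rule are hypothetical, not taken from this repository):

```python
# [DEF:normalize_log_level:Function]
# @PRE: level names one of DEBUG, INFO, WARNING, ERROR, CRITICAL (any case).
# @POST: Returns the upper-case level name.
def normalize_log_level(level: str) -> str:
    normalized = level.upper()
    if normalized not in {"DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"}:
        # Guard satisfies @PRE through control flow instead of an assert statement.
        raise ValueError(f"Unsupported log level: {level}")
    return normalized
# [/DEF:normalize_log_level]
```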

---

@@ -154,15 +155,16 @@ async function updateUserProfile(profileData) {

Logs delineate the agent's internal state.

* **Python:** `logger.info(f"[{ANCHOR_ID}][{STATE}] Msg")`
* **Python:** MUST use a Context Manager (e.g., `with belief_scope("ANCHOR_ID"):`) to ensure state consistency and automatic handling of Entry/Exit/Error states.
  * Manual logging (inside scope): `logger.info(f"[{ANCHOR_ID}][{STATE}] Msg")`
* **Svelte/JS:** `console.log(\`[${ANCHOR_ID}][${STATE}] Msg\`)`

**Required States:**
1. `Entry` (Start of block)
2. `Action` (Key business logic)
3. `Coherence:OK` (Logic successfully completed)
4. `Coherence:Failed` (Error handling)
5. `Exit` (End of block)
1. `Entry` (Start of block - Auto-logged by Context Manager)
2. `Action` (Key business logic - Manual log)
3. `Coherence:OK` (Logic successfully completed - Auto-logged by Context Manager)
4. `Coherence:Failed` (Exception/Error - Auto-logged by Context Manager)
5. `Exit` (End of block - Auto-logged by Context Manager)

---
@@ -0,0 +1,34 @@
# Specification Quality Checklist: Configurable Belief State Logging

**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2025-12-26
**Feature**: [specs/006-configurable-belief-logs/spec.md](../spec.md)

## Content Quality

- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed

## Requirement Completeness

- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified

## Feature Readiness

- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification

## Notes

- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`
56
specs/006-configurable-belief-logs/contracts/api.md
Normal file
56
specs/006-configurable-belief-logs/contracts/api.md
Normal file
@@ -0,0 +1,56 @@
|
||||
# API Contract: Settings Update
|
||||
|
||||
## PATCH /api/settings/global
|
||||
|
||||
Updates the global application settings, including the new logging configuration.
|
||||
|
||||
### Request Body
|
||||
|
||||
**Content-Type**: `application/json`
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "string",
|
||||
"default_environment_id": "string (optional)",
|
||||
"logging": {
|
||||
"level": "string (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
|
||||
"file_path": "string (optional)",
|
||||
"max_bytes": "integer (default: 10485760)",
|
||||
"backup_count": "integer (default: 5)",
|
||||
"enable_belief_state": "boolean (default: true)"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Response
|
||||
|
||||
**Status**: `200 OK`
|
||||
**Content-Type**: `application/json`
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "string",
|
||||
"default_environment_id": "string (optional)",
|
||||
"logging": {
|
||||
"level": "string",
|
||||
"file_path": "string",
|
||||
"max_bytes": "integer",
|
||||
"backup_count": "integer",
|
||||
"enable_belief_state": "boolean"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Example
|
||||
|
||||
**Request**
|
||||
|
||||
```json
|
||||
{
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "DEBUG",
|
||||
"file_path": "logs/app.log",
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}
```
|
||||
74
specs/006-configurable-belief-logs/data-model.md
Normal file
@@ -0,0 +1,74 @@
|
||||
# Data Model: Configurable Belief State Logging
|
||||
|
||||
## 1. Configuration Models
|
||||
|
||||
These models extend the existing `ConfigModels` in `backend/src/core/config_models.py`.
|
||||
|
||||
### 1.1. LoggingConfig
|
||||
|
||||
Defines the configuration for the application's logging system.
|
||||
|
||||
| Field | Type | Default | Description |
|
||||
|---|---|---|---|
|
||||
| `level` | `str` | `"INFO"` | The logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). |
|
||||
| `file_path` | `Optional[str]` | `"logs/app.log"` | Path to the log file. If None, file logging is disabled. |
|
||||
| `max_bytes` | `int` | `10485760` (10MB) | Maximum size of a log file before rotation. |
|
||||
| `backup_count` | `int` | `5` | Number of backup files to keep. |
|
||||
| `enable_belief_state` | `bool` | `True` | Whether to enable structured Belief State logging (Entry/Exit). |
|
||||
|
||||
```python
|
||||
class LoggingConfig(BaseModel):
|
||||
level: str = "INFO"
|
||||
file_path: Optional[str] = "logs/app.log"
|
||||
max_bytes: int = 10 * 1024 * 1024
|
||||
backup_count: int = 5
|
||||
enable_belief_state: bool = True
|
||||
```
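If strict validation of `level` is wanted at the model layer (the spec's edge case asks for a fallback to `INFO` on invalid values), a hedged sketch using a Pydantic v2 validator; the project may equally choose to validate inside `configure_logger` instead:

```python
from typing import Optional
from pydantic import BaseModel, field_validator

VALID_LEVELS = {"DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"}

class LoggingConfig(BaseModel):
    level: str = "INFO"
    file_path: Optional[str] = "logs/app.log"
    max_bytes: int = 10 * 1024 * 1024
    backup_count: int = 5
    enable_belief_state: bool = True

    @field_validator("level")
    @classmethod
    def _normalize_level(cls, value: str) -> str:
        # Fall back to a safe default instead of rejecting the whole settings payload.
        return value.upper() if value.upper() in VALID_LEVELS else "INFO"
```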
|
||||
|
||||
### 1.2. GlobalSettings (Updated)
|
||||
|
||||
Updates the existing `GlobalSettings` to include `LoggingConfig`.
|
||||
|
||||
| Field | Type | Default | Description |
|
||||
|---|---|---|---|
|
||||
| `logging` | `LoggingConfig` | `LoggingConfig()` | The logging configuration object. |
|
||||
|
||||
```python
|
||||
class GlobalSettings(BaseModel):
|
||||
backup_path: str
|
||||
default_environment_id: Optional[str] = None
|
||||
logging: LoggingConfig = Field(default_factory=LoggingConfig)
|
||||
```
|
||||
|
||||
## 2. Logger Entities
|
||||
|
||||
These entities are part of the `backend/src/core/logger.py` module.
|
||||
|
||||
### 2.1. LogEntry (Existing)
|
||||
|
||||
Represents a single log record.
|
||||
|
||||
| Field | Type | Description |
|
||||
|---|---|---|
|
||||
| `timestamp` | `datetime` | UTC timestamp of the log. |
|
||||
| `level` | `str` | Log level. |
|
||||
| `message` | `str` | Log message. |
|
||||
| `context` | `Optional[Dict[str, Any]]` | Additional context data. |
|
||||
|
||||
### 2.2. BeliefState (Concept)
|
||||
|
||||
Represents the state of execution in the "Belief State" model.
|
||||
|
||||
- **Entry**: Entering a logical block.
|
||||
- **Action**: Performing a core action within the block.
|
||||
- **Coherence**: Verifying the state (OK or Failed).
|
||||
- **Exit**: Exiting the logical block.
|
||||
|
||||
Format: `[{ANCHOR_ID}][{STATE}] {Message}`
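Because the format is fixed, Belief State entries can be parsed mechanically (this is what SC-003 in the spec relies on). A small illustrative check:

```python
import re

BELIEF_LOG_RE = re.compile(r"^\[(?P<anchor>[^\]]+)\]\[(?P<state>[^\]]+)\]\s*(?P<message>.*)$")

match = BELIEF_LOG_RE.match("[MyFunction][Coherence:OK]")
assert match is not None
assert match.group("anchor") == "MyFunction"
assert match.group("state") == "Coherence:OK"
```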
|
||||
|
||||
## 3. Relationships
|
||||
|
||||
- `AppConfig` contains `GlobalSettings`.
|
||||
- `GlobalSettings` contains `LoggingConfig`.
|
||||
- `ConfigManager` reads/writes `AppConfig`.
|
||||
- `ConfigManager` configures the global `logger` based on `LoggingConfig`.
|
||||
81
specs/006-configurable-belief-logs/plan.md
Normal file
@@ -0,0 +1,81 @@
|
||||
# Implementation Plan: Configurable Belief State Logging
|
||||
|
||||
**Branch**: `006-configurable-belief-logs` | **Date**: 2025-12-26 | **Spec**: specs/006-configurable-belief-logs/spec.md
|
||||
**Input**: Feature specification from `/specs/006-configurable-belief-logs/spec.md`
|
||||
|
||||
**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.
|
||||
|
||||
## Summary
|
||||
|
||||
Implement a configurable logging system with "Belief State" support (Entry, Action, Coherence, Exit).
|
||||
Approach: Use Python's `logging` module with a custom Context Manager (`belief_scope`) and extend `GlobalSettings` with a `LoggingConfig` model.
|
||||
|
||||
## Technical Context
|
||||
|
||||
**Language/Version**: Python 3.9+
|
||||
**Primary Dependencies**: FastAPI (Backend), Pydantic (Config), Svelte (Frontend)
|
||||
**Storage**: Filesystem (for log files), JSON (for configuration persistence)
|
||||
**Testing**: pytest (Backend), manual verification (Frontend)
|
||||
**Target Platform**: Linux server (primary), cross-platform compatible
|
||||
**Project Type**: Web application (Backend + Frontend)
|
||||
**Performance Goals**: Low overhead logging (<1ms per log entry); must not block the main thread beyond ordinary handler I/O
|
||||
**Constraints**: Must use standard library `logging` where possible; Log rotation to prevent disk overflow
|
||||
**Scale/Scope**: Configurable log levels and retention policies
|
||||
|
||||
## Constitution Check
|
||||
|
||||
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
|
||||
|
||||
- **Library-First**: N/A (Core infrastructure)
|
||||
- **CLI Interface**: N/A (Configured via UI/API/JSON)
|
||||
- **Test-First**: Will require unit tests for `belief_scope` and config updates.
|
||||
- **Integration Testing**: Will require testing the settings API.
|
||||
- **Observability**: This feature *is* the observability enhancement.
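To make the Test-First gate concrete, a hedged sketch of what the `belief_scope` unit tests might look like; the module path follows the source tree below, and the handler-based capture avoids assumptions about logger propagation:

```python
import logging
import pytest

from backend.src.core.logger import belief_scope, logger

class _ListHandler(logging.Handler):
    """Collects formatted messages so assertions do not depend on propagation."""

    def __init__(self) -> None:
        super().__init__()
        self.messages: list[str] = []

    def emit(self, record: logging.LogRecord) -> None:
        self.messages.append(record.getMessage())

@pytest.fixture
def captured():
    handler = _ListHandler()
    logger.addHandler(handler)
    yield handler.messages
    logger.removeHandler(handler)

def test_belief_scope_success(captured):
    with belief_scope("UnitTest"):
        logger.info("[UnitTest][Action] doing work")
    assert any("[UnitTest][Entry]" in m for m in captured)
    assert any("[UnitTest][Coherence:OK]" in m for m in captured)
    assert any("[UnitTest][Exit]" in m for m in captured)

def test_belief_scope_failure(captured):
    with pytest.raises(ValueError):
        with belief_scope("UnitTest"):
            raise ValueError("boom")
    assert any("[UnitTest][Coherence:Failed]" in m for m in captured)
```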
|
||||
|
||||
## Project Structure
|
||||
|
||||
### Documentation (this feature)
|
||||
|
||||
```text
|
||||
specs/[###-feature]/
|
||||
├── plan.md # This file (/speckit.plan command output)
|
||||
├── research.md # Phase 0 output (/speckit.plan command)
|
||||
├── data-model.md # Phase 1 output (/speckit.plan command)
|
||||
├── quickstart.md # Phase 1 output (/speckit.plan command)
|
||||
├── contracts/ # Phase 1 output (/speckit.plan command)
|
||||
└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
|
||||
```
|
||||
|
||||
### Source Code (repository root)
|
||||
|
||||
```text
|
||||
backend/
|
||||
├── src/
|
||||
│ ├── core/
|
||||
│ │ ├── config_models.py # Add LoggingConfig
|
||||
│ │ ├── config_manager.py # Update config loading/saving
|
||||
│ │ └── logger.py # Add belief_scope and configure_logger
|
||||
│ └── api/
|
||||
│ └── routes/
|
||||
│ └── settings.py # Update GlobalSettings endpoint
|
||||
└── tests/
|
||||
└── test_logger.py # New tests for logger logic
|
||||
|
||||
frontend/
|
||||
├── src/
|
||||
│ ├── pages/
|
||||
│ │ └── Settings.svelte # Add logging UI controls
|
||||
│ └── lib/
|
||||
│ └── api.js # Update API calls if needed
|
||||
```
|
||||
|
||||
**Structure Decision**: Enhance the existing Backend/Frontend structure; no new top-level projects are introduced.
|
||||
|
||||
## Complexity Tracking
|
||||
|
||||
> **Fill ONLY if Constitution Check has violations that must be justified**
|
||||
|
||||
| Violation | Why Needed | Simpler Alternative Rejected Because |
|
||||
|-----------|------------|-------------------------------------|
|
||||
| [e.g., 4th project] | [current need] | [why 3 projects insufficient] |
|
||||
| [e.g., Repository pattern] | [specific problem] | [why direct DB access insufficient] |
|
||||
104
specs/006-configurable-belief-logs/quickstart.md
Normal file
@@ -0,0 +1,104 @@
|
||||
# Quickstart: Configurable Belief State Logging
|
||||
|
||||
## 1. Configuration
|
||||
|
||||
The logging system is configured via the `GlobalSettings` in `config.json` or through the Settings UI.
|
||||
|
||||
### 1.1. Default Configuration
|
||||
|
||||
```json
|
||||
{
|
||||
"environments": [],
|
||||
"settings": {
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "INFO",
|
||||
"file_path": "logs/app.log",
|
||||
"max_bytes": 10485760,
|
||||
"backup_count": 5,
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 1.2. Changing Log Level
|
||||
|
||||
To change the log level to `DEBUG`, update the `logging.level` field in `config.json` or use the API:
|
||||
|
||||
```bash
|
||||
curl -X PATCH http://localhost:8000/api/settings/global \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"backup_path": "backups",
|
||||
"logging": {
|
||||
"level": "DEBUG",
|
||||
"file_path": "logs/app.log",
|
||||
"max_bytes": 10485760,
|
||||
"backup_count": 5,
|
||||
"enable_belief_state": true
|
||||
}
|
||||
}'
|
||||
```
|
||||
|
||||
## 2. Using Belief State Logging
|
||||
|
||||
### 2.1. Basic Usage
|
||||
|
||||
Use the `belief_scope` context manager to automatically log Entry, Exit, and Coherence states.
|
||||
|
||||
```python
|
||||
from backend.src.core.logger import logger, belief_scope
|
||||
|
||||
def my_function():
|
||||
with belief_scope("MyFunction"):
|
||||
# Logs: [MyFunction][Entry]
|
||||
|
||||
logger.info("Doing something important")
|
||||
# Logs: [MyFunction][Action] Doing something important
|
||||
|
||||
# ... logic ...
|
||||
|
||||
# Logs: [MyFunction][Coherence:OK]
|
||||
# Logs: [MyFunction][Exit]
|
||||
```
|
||||
|
||||
### 2.2. Error Handling
|
||||
|
||||
If an exception occurs within the scope, it is caught, logged as a failure, and re-raised.
|
||||
|
||||
```python
|
||||
def failing_function():
|
||||
with belief_scope("FailingFunc"):
|
||||
raise ValueError("Something went wrong")
|
||||
|
||||
# Logs: [FailingFunc][Entry]
|
||||
# Logs: [FailingFunc][Coherence:Failed] Something went wrong
|
||||
# Re-raises ValueError
|
||||
```
|
||||
|
||||
### 2.3. Custom Messages
|
||||
|
||||
You can provide an optional message to `belief_scope`.
|
||||
|
||||
```python
|
||||
with belief_scope("DataProcessor", "Processing batch #1"):
|
||||
# Logs: [DataProcessor][Entry] Processing batch #1
|
||||
pass
|
||||
```
|
||||
|
||||
## 3. Log Output Format
|
||||
|
||||
### 3.1. Standard Log
|
||||
|
||||
```text
|
||||
[2025-12-26 10:00:00,000][INFO][superset_tools_app] System initialized
|
||||
```
|
||||
|
||||
### 3.2. Belief State Log
|
||||
|
||||
```text
|
||||
[2025-12-26 10:00:01,000][INFO][superset_tools_app] [MyFunction][Entry]
|
||||
[2025-12-26 10:00:01,050][INFO][superset_tools_app] [MyFunction][Action] Processing data
|
||||
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Coherence:OK]
|
||||
[2025-12-26 10:00:01,100][INFO][superset_tools_app] [MyFunction][Exit]
|
||||
109
specs/006-configurable-belief-logs/research.md
Normal file
@@ -0,0 +1,109 @@
|
||||
# Research: Configurable Belief State Logging
|
||||
|
||||
## 1. Introduction
|
||||
|
||||
This research explores the implementation of a configurable logging system that supports "Belief State" logging (Entry, Action, Coherence, Exit) and allows users to customize log levels, formats, and file persistence.
|
||||
|
||||
## 2. Analysis of Existing System
|
||||
|
||||
- **Language**: Python 3.9+ (Backend)
|
||||
- **Framework**: FastAPI (inferred from context: the backend is served with `uvicorn`)
|
||||
- **Logging**: Standard Python `logging` module.
|
||||
- **Configuration**: Pydantic models (`ConfigModels`) managed by `ConfigManager`, persisting to `config.json`.
|
||||
- **Current Logger**:
|
||||
- `backend/src/core/logger.py`:
|
||||
- Uses `logging.getLogger("superset_tools_app")`
|
||||
- Has `StreamHandler` (console) and `WebSocketLogHandler` (streaming).
|
||||
- `WebSocketLogHandler` buffers logs in a `deque`.
|
||||
|
||||
## 3. Requirements Analysis
|
||||
|
||||
- **Belief State**: Need a structured way to log `[ANCHOR_ID][STATE] Message`.
|
||||
- **Context Manager**: Need a `with belief_scope("ID"):` pattern.
|
||||
- **Configuration**: Need to add `LoggingConfig` to `GlobalSettings`.
|
||||
- **File Logging**: Need `RotatingFileHandler` with size limits.
|
||||
- **Dynamic Reconfiguration**: Need to update logger handlers/levels when config changes.
|
||||
|
||||
## 4. Proposed Solution
|
||||
|
||||
### 4.1. Data Model (`LoggingConfig`)
|
||||
|
||||
We will add a `LoggingConfig` model to `backend/src/core/config_models.py`:
|
||||
|
||||
```python
|
||||
class LoggingConfig(BaseModel):
|
||||
level: str = "INFO" # DEBUG, INFO, WARNING, ERROR, CRITICAL
|
||||
file_path: Optional[str] = "logs/app.log"
|
||||
max_bytes: int = 10 * 1024 * 1024 # 10MB
|
||||
backup_count: int = 5
|
||||
enable_belief_state: bool = True
|
||||
```
|
||||
|
||||
And update `GlobalSettings`:
|
||||
|
||||
```python
|
||||
class GlobalSettings(BaseModel):
|
||||
backup_path: str
|
||||
default_environment_id: Optional[str] = None
|
||||
logging: LoggingConfig = Field(default_factory=LoggingConfig)
|
||||
```
|
||||
|
||||
### 4.2. Context Manager (`belief_scope`)
|
||||
|
||||
We will implement a context manager in `backend/src/core/logger.py`:
|
||||
|
||||
```python
from contextlib import contextmanager

@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
    # Entry is logged on enter; Coherence/Exit on leave. The enable_belief_state
    # filtering described in section 4.4 is omitted here for brevity.
    logger.info(f"[{anchor_id}][Entry] {message}".rstrip())
    try:
        yield
        logger.info(f"[{anchor_id}][Coherence:OK]")
        logger.info(f"[{anchor_id}][Exit]")
    except Exception as e:
        logger.error(f"[{anchor_id}][Coherence:Failed] {e}")
        raise
```
|
||||
|
||||
### 4.3. Logger Reconfiguration
|
||||
|
||||
We will add a `configure_logger(config: LoggingConfig)` function in `backend/src/core/logger.py` that:
|
||||
1. Sets the logger level.
|
||||
2. Removes old file handlers.
|
||||
3. Adds a new `RotatingFileHandler` if `file_path` is set.
|
||||
4. Updates a global flag for `enable_belief_state`.
|
||||
|
||||
`ConfigManager` will call this function whenever settings are updated.
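A non-authoritative sketch of what `configure_logger` could look like (handling of the `enable_belief_state` flag from section 4.4 is omitted; the logger name follows the existing `superset_tools_app` logger):

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

logger = logging.getLogger("superset_tools_app")  # module-level logger in logger.py

def configure_logger(config: "LoggingConfig") -> None:
    logger.setLevel(getattr(logging, config.level.upper(), logging.INFO))

    # Remove previously attached file handlers so reconfiguration stays idempotent.
    for handler in list(logger.handlers):
        if isinstance(handler, RotatingFileHandler):
            logger.removeHandler(handler)
            handler.close()

    if config.file_path:
        Path(config.file_path).parent.mkdir(parents=True, exist_ok=True)
        file_handler = RotatingFileHandler(
            config.file_path, maxBytes=config.max_bytes, backupCount=config.backup_count
        )
        file_handler.setFormatter(
            logging.Formatter("[%(asctime)s][%(levelname)s][%(name)s] %(message)s")
        )
        logger.addHandler(file_handler)
```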
|
||||
|
||||
### 4.4. Belief State Filtering
|
||||
|
||||
If `enable_belief_state` is False:
|
||||
- `Entry`/`Exit` logs are skipped.
|
||||
- `Action`/`Coherence` logs are logged as standard messages (maybe without the `[ANCHOR_ID]` prefix if desired, but retaining it is usually better for consistency).
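One possible realization of this smart filter, sketched as a `logging.Filter` keyed off the message prefix; the project may instead check the flag directly inside `belief_scope`:

```python
import logging
import re

_ENTRY_EXIT_RE = re.compile(r"^\[[^\]]+\]\[(Entry|Exit)\]")

class BeliefStateFilter(logging.Filter):
    """Drops Entry/Exit records when belief-state logging is disabled."""

    def __init__(self, enabled: bool = True) -> None:
        super().__init__()
        self.enabled = enabled

    def filter(self, record: logging.LogRecord) -> bool:
        if self.enabled:
            return True
        return _ENTRY_EXIT_RE.match(record.getMessage()) is None
```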
|
||||
|
||||
## 5. Alternatives Considered
|
||||
|
||||
- **Alternative A**: Use a separate logger for Belief State.
|
||||
- *Pros*: Cleaner separation.
|
||||
- *Cons*: Harder to interleave with standard logs in the same stream/file.
|
||||
- *Decision*: Rejected. We want a unified log stream.
|
||||
|
||||
- **Alternative B**: Use structlog.
|
||||
- *Pros*: Powerful structured logging.
|
||||
- *Cons*: Adds a new dependency.
|
||||
- *Decision*: Rejected. Standard `logging` is sufficient and already used.
|
||||
|
||||
## 6. Implementation Steps
|
||||
|
||||
1. **Modify `backend/src/core/config_models.py`**: Add `LoggingConfig` and update `GlobalSettings`.
|
||||
2. **Modify `backend/src/core/logger.py`**:
|
||||
- Add `configure_logger` function.
|
||||
- Implement `belief_scope` context manager.
|
||||
- Implement `RotatingFileHandler`.
|
||||
3. **Modify `backend/src/core/config_manager.py`**: Call `configure_logger` on init and update.
|
||||
4. **Frontend**: Update Settings page to allow editing `LoggingConfig`.
|
||||
|
||||
## 7. Conclusion
|
||||
|
||||
The proposed solution leverages the existing Pydantic/ConfigManager architecture and standard Python logging, minimizing disruption while meeting all requirements.
|
||||
88
specs/006-configurable-belief-logs/spec.md
Normal file
@@ -0,0 +1,88 @@
|
||||
# Feature Specification: Configurable Belief State Logging
|
||||
|
||||
**Feature Branch**: `006-configurable-belief-logs`
|
||||
**Created**: 2025-12-26
|
||||
**Status**: Draft
|
||||
**Input**: User description: "I am not happy with the current state of the project logs I get after running run.sh. We need to design a log storage system that is configurable, includes logging of the development agent's belief state (see semantic_protocol.md), and can be flexibly tuned."
|
||||
|
||||
## Clarifications
|
||||
|
||||
### Session 2025-12-26
|
||||
- Q: Log rotation policy? → A: Size-based rotation (e.g., 10MB limit, 5 backups).
|
||||
- Q: Belief State disabled behavior? → A: Smart Filter (Suppress Entry/Exit; keep Action/Coherence as standard logs).
|
||||
- Q: Implementation pattern? → A: Context Manager (e.g., `with belief_scope("ID"):`) to auto-handle Entry/Exit/Error.
|
||||
|
||||
## User Scenarios & Testing
|
||||
|
||||
### User Story 1 - Structured Belief State Logging (Priority: P1)
|
||||
|
||||
As a developer or AI agent, I want logs to clearly indicate the execution flow and internal state ("Belief State") of the system using a standardized format, so that I can easily debug logic errors and verify semantic coherence.
|
||||
|
||||
**Why this priority**: This is the core requirement to align with the `semantic_protocol.md` and improve debugging capabilities for the AI agent.
|
||||
|
||||
**Independent Test**: Trigger an action in the system (e.g., running a migration) and verify that the logs contain entries formatted as `[ANCHOR_ID][STATE] Message`, representing the entry, action, and exit states of the code blocks.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** a function wrapped with a Belief State logger, **When** the function is called, **Then** a log entry `[ANCHOR_ID][Entry] ...` is generated.
|
||||
2. **Given** a function execution, **When** the core logic is executed, **Then** a log entry `[ANCHOR_ID][Action] ...` is generated.
|
||||
3. **Given** a function execution completes successfully, **When** it returns, **Then** a log entry `[ANCHOR_ID][Coherence:OK]` and `[ANCHOR_ID][Exit]` are generated.
|
||||
4. **Given** a function execution fails, **When** an exception is raised, **Then** a log entry `[ANCHOR_ID][Coherence:Failed]` is generated.
|
||||
|
||||
---
|
||||
|
||||
### User Story 2 - Configurable Logging Settings (Priority: P2)
|
||||
|
||||
As a user, I want to be able to configure log levels, formats, and destinations via the application settings, so that I can control the verbosity and storage of logs without changing code.
|
||||
|
||||
**Why this priority**: Provides the "flexible" and "customizable" aspect of the requirement, allowing users to adapt logging to their needs (dev vs prod).
|
||||
|
||||
**Independent Test**: Change the log level in the settings (e.g., from INFO to DEBUG) and verify that debug messages start appearing in the output.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** the application is running, **When** I update the log level to "DEBUG" in the settings, **Then** debug-level logs are captured and displayed.
|
||||
2. **Given** the application is running, **When** I enable "File Logging" in settings, **Then** logs are written to the specified file path.
|
||||
3. **Given** the application is running, **When** I disable "Belief State" logs in settings, **Then** "Entry" and "Exit" logs are suppressed, while "Action" and "Coherence" logs are retained as standard log entries.
|
||||
|
||||
---
|
||||
|
||||
### Edge Cases
|
||||
|
||||
- **Invalid Configuration**: What happens if the user provides an invalid log level (e.g., "SUPER_LOUD")? The system should fallback to a safe default (e.g., "INFO") and log a warning.
|
||||
- **File System Errors**: What happens if the configured log file path is not writable? The system should fallback to console logging and alert the user.
|
||||
- **High Volume**: What happens if "Belief State" logging generates too much noise? The system should remain performant, and users should be able to toggle it off easily.
|
||||
|
||||
## Requirements
|
||||
|
||||
### Functional Requirements
|
||||
|
||||
- **FR-001**: The system MUST support a `LoggingConfig` structure within the global configuration to store settings for level, format, file path, and belief state enablement.
|
||||
- **FR-002**: The logging system MUST implement a standard format for "Belief State" logs as defined in the system protocols: `[{ANCHOR_ID}][{STATE}] {Message}`.
|
||||
- **FR-003**: The system MUST provide a standard mechanism to easily generate Belief State logs without repetitive boilerplate code, specifically using a **Context Manager** pattern.
|
||||
- **FR-004**: The supported Belief States MUST include: `Entry`, `Action`, `Coherence:OK`, `Coherence:Failed`, `Exit`.
|
||||
- **FR-005**: The system MUST allow dynamic reconfiguration of the logger (e.g., changing levels) when settings are updated.
|
||||
- **FR-006**: The real-time log stream MUST preserve the structured format of Belief State logs so the frontend can potentially render them specially.
|
||||
- **FR-007**: Logs MUST optionally be persisted to a file if configured.
|
||||
- **FR-008**: The file logging system MUST implement size-based rotation (default: 10MB limit, 5 backups) to prevent disk saturation.
|
||||
- **FR-009**: When `enable_belief_state` is False, the logger MUST suppress `Entry` and `Exit` states but retain `Action` and `Coherence` states (stripping the structured prefix if necessary or keeping it as standard info).
|
||||
- **FR-010**: The Context Manager MUST automatically log `[Entry]` on start, `[Exit]` on success, and `[Coherence:Failed]` if an exception occurs (re-raising the exception).
|
||||
|
||||
### Key Entities
|
||||
|
||||
- **LoggingConfig**: A configuration object defining `level` (text), `file_path` (text), `format` (structure), `enable_belief_state` (boolean).
|
||||
- **BeliefStateAdapter**: A component that enforces the semantic protocol format for log entries.
|
||||
|
||||
## Success Criteria
|
||||
|
||||
### Measurable Outcomes
|
||||
|
||||
- **SC-001**: Developers can trace the execution flow of a specific task using only the `[ANCHOR_ID]` filtered logs.
|
||||
- **SC-002**: Changing the log level in the configuration file or API immediately (or upon restart) reflects in the log output.
|
||||
- **SC-003**: All "Belief State" logs strictly follow the `[ID][State]` format, allowing for regex parsing.
|
||||
- **SC-004**: System supports logging to both console and file simultaneously if configured.
|
||||
|
||||
## Assumptions
|
||||
|
||||
- The existing real-time streaming mechanism can be adapted or chained with the new logging configuration.
|
||||
- The system protocol definition of states is the source of truth.
|
||||
61
specs/006-configurable-belief-logs/tasks.md
Normal file
@@ -0,0 +1,61 @@
|
||||
# Tasks: Configurable Belief State Logging
|
||||
|
||||
**Spec**: `specs/006-configurable-belief-logs/spec.md`
|
||||
**Plan**: `specs/006-configurable-belief-logs/plan.md`
|
||||
**Status**: Completed
|
||||
|
||||
## Phase 1: Setup
|
||||
*Goal: Initialize project structure for logging.*
|
||||
|
||||
- [x] T001 Ensure logs directory exists at `logs/` (relative to project root)
|
||||
|
||||
## Phase 2: Foundational
|
||||
*Goal: Define data models and infrastructure required for logging configuration.*
|
||||
|
||||
- [x] T002 Define `LoggingConfig` model in `backend/src/core/config_models.py`
|
||||
- [x] T003 Update `GlobalSettings` model to include `logging` field in `backend/src/core/config_models.py`
|
||||
- [x] T004 Update `ConfigManager` to handle logging configuration persistence in `backend/src/core/config_manager.py`
|
||||
|
||||
## Phase 3: User Story 1 - Structured Belief State Logging
|
||||
*Goal: Implement the core "Belief State" logging logic with context managers.*
|
||||
*Priority: P1*
|
||||
|
||||
**Independent Test**: Run `pytest backend/tests/test_logger.py` and verify `belief_scope` generates `[ID][Entry]`, `[ID][Action]`, and `[ID][Exit]` logs.
|
||||
|
||||
- [x] T005 [US1] Create unit tests for belief state logging in `backend/tests/test_logger.py`
|
||||
- [x] T006 [US1] Implement `belief_scope` context manager in `backend/src/core/logger.py`
|
||||
- [x] T007 [US1] Implement log formatting and smart filtering (suppress Entry/Exit if disabled) in `backend/src/core/logger.py`
|
||||
|
||||
## Phase 4: User Story 2 - Configurable Logging Settings
|
||||
*Goal: Expose logging configuration to the user via API and UI.*
|
||||
*Priority: P2*
|
||||
|
||||
**Independent Test**: Update settings via API/UI and verify log level changes (e.g., DEBUG logs appear) and file rotation is active.
|
||||
|
||||
- [x] T008 [US2] Implement `configure_logger` function to apply settings (level, file, rotation) in `backend/src/core/logger.py`
|
||||
- [x] T009 [US2] Update settings API endpoint to handle `logging` updates in `backend/src/api/routes/settings.py`
|
||||
- [x] T010 [P] [US2] Add Logging configuration section (Level, File Path, Enable Belief State) to `frontend/src/pages/Settings.svelte`
|
||||
|
||||
## Final Phase: Polish & Cross-Cutting Concerns
|
||||
*Goal: Verify system stability and cleanup.*
|
||||
|
||||
- [x] T011 Verify log rotation works by generating dummy logs (manual verification)
|
||||
- [x] T012 Ensure default configuration provides a sensible out-of-the-box experience
|
||||
|
||||
## Dependencies
|
||||
|
||||
```mermaid
|
||||
graph TD
|
||||
Setup[Phase 1: Setup] --> Foundational[Phase 2: Foundational]
|
||||
Foundational --> US1[Phase 3: US1 Belief State]
|
||||
US1 --> US2[Phase 4: US2 Configurable Settings]
|
||||
US2 --> Polish[Final Phase: Polish]
|
||||
```
|
||||
|
||||
## Parallel Execution Opportunities
|
||||
|
||||
- **US2**: Frontend task (T010) can be implemented in parallel with Backend tasks (T008, T009) once the API contract is finalized.
|
||||
|
||||
## Implementation Strategy
|
||||
1. **MVP**: Complete Phase 1, 2, and 3 to enable structured logging for the agent.
|
||||
2. **Full Feature**: Complete Phase 4 to allow users to control the verbosity and storage.
|
||||
@@ -0,0 +1,36 @@
|
||||
# Specification Quality Checklist: Migration Plugin Dashboard Grid
|
||||
|
||||
**Purpose**: Validate specification completeness and quality before proceeding to planning
|
||||
**Created**: 2025-12-27
|
||||
**Feature**: [specs/007-migration-dashboard-grid/spec.md](../spec.md)
|
||||
|
||||
## Content Quality
|
||||
|
||||
- [x] No implementation details (languages, frameworks, APIs)
|
||||
- [x] Focused on user value and business needs
|
||||
- [x] Written for non-technical stakeholders
|
||||
- [x] All mandatory sections completed
|
||||
|
||||
## Requirement Completeness
|
||||
|
||||
- [x] No [NEEDS CLARIFICATION] markers remain
|
||||
- [x] Requirements are testable and unambiguous
|
||||
- [x] Success criteria are measurable
|
||||
- [x] Success criteria are technology-agnostic (no implementation details)
|
||||
- [x] All acceptance scenarios are defined
|
||||
- [x] Edge cases are identified
|
||||
- [x] Scope is clearly bounded
|
||||
- [x] Dependencies and assumptions identified
|
||||
|
||||
## Feature Readiness
|
||||
|
||||
- [x] All functional requirements have clear acceptance criteria
|
||||
- [x] User scenarios cover primary flows
|
||||
- [x] Feature meets measurable outcomes defined in Success Criteria
|
||||
- [x] No implementation details leak into specification
|
||||
|
||||
## Notes
|
||||
|
||||
- The specification clearly defines the UI requirements for the dashboard selection grid.
|
||||
- "Superset API" is mentioned as the source of truth, which is acceptable as it defines the data boundary.
|
||||
- Success criteria include specific performance metrics (<200ms filtering).
|
||||
58
specs/007-migration-dashboard-grid/contracts/api.md
Normal file
@@ -0,0 +1,58 @@
|
||||
# API Contracts: Migration Dashboard Grid
|
||||
|
||||
## Endpoints
|
||||
|
||||
### 1. List Dashboards
|
||||
**Method**: `GET`
|
||||
**Path**: `/api/environments/{env_id}/dashboards`
|
||||
**Purpose**: Fetch all dashboards from the specified environment for the grid.
|
||||
|
||||
**Request Parameters**:
|
||||
- `env_id` (path): The ID of the environment to fetch from.
|
||||
|
||||
**Response**:
|
||||
- **200 OK**:
|
||||
```json
|
||||
[
|
||||
{
|
||||
"id": 123,
|
||||
"title": "Sales Dashboard",
|
||||
"last_modified": "2023-10-27T10:00:00Z",
|
||||
"status": "published"
|
||||
},
|
||||
{
|
||||
"id": 124,
|
||||
"title": "Draft Metrics",
|
||||
"last_modified": "2023-10-26T15:30:00Z",
|
||||
"status": "draft"
|
||||
}
|
||||
]
|
||||
```
|
||||
- **404 Not Found**: Environment not found.
|
||||
- **500 Internal Server Error**: Superset API error.
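For orientation, a hedged FastAPI sketch of how this endpoint might be served; `get_environment` and `build_superset_client` are placeholders for however the project resolves an environment and its Superset client:

```python
from typing import Dict, List
from fastapi import APIRouter, HTTPException

router = APIRouter()

@router.get("/api/environments/{env_id}/dashboards")
def list_dashboards(env_id: str) -> List[Dict]:
    env = get_environment(env_id)  # placeholder lookup
    if env is None:
        raise HTTPException(status_code=404, detail="Environment not found")
    try:
        client = build_superset_client(env)  # placeholder factory
        return client.get_dashboards_summary()
    except Exception as exc:
        # Surfaced as a 500 per this contract.
        raise HTTPException(status_code=500, detail=f"Superset API error: {exc}") from exc
```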
|
||||
|
||||
## Components (Frontend)
|
||||
|
||||
### DashboardGrid
|
||||
**Props**:
|
||||
- `dashboards`: `DashboardMetadata[]` - List of dashboards to display.
|
||||
- `selectedIds`: `number[]` - IDs of currently selected dashboards.
|
||||
|
||||
**Events**:
|
||||
- `selectionChanged`: Emitted when selection changes. Payload: `number[]` (new list of selected IDs).
|
||||
|
||||
**State**:
|
||||
- `filterText`: string - Current filter text.
|
||||
- `currentPage`: number - Current page index (0-based).
|
||||
- `pageSize`: number - Items per page (default 20).
|
||||
- `sortColumn`: string - 'title' | 'last_modified' | 'status'.
|
||||
- `sortDirection`: 'asc' | 'desc'.
|
||||
|
||||
## Superset Client Extension
|
||||
|
||||
### `get_dashboards_summary`
|
||||
**Signature**: `def get_dashboards_summary(self) -> List[Dict]`
|
||||
**Purpose**: Fetches dashboard metadata optimized for the grid.
|
||||
**Implementation Detail**:
|
||||
- Calls `GET /api/v1/dashboard/` with query params `q=(columns:!(id,dashboard_title,changed_on_utc,published))`.
|
||||
- Maps response fields to `DashboardMetadata` schema.
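A non-authoritative sketch of this method; `self._get` stands in for whatever authenticated GET helper `SupersetClient` already exposes:

```python
from typing import Dict, List

def get_dashboards_summary(self) -> List[Dict]:
    # Select only the columns needed by the grid to keep the payload small.
    query = "(columns:!(id,dashboard_title,changed_on_utc,published))"
    payload = self._get("/api/v1/dashboard/", params={"q": query})  # assumed helper
    return [
        {
            "id": item["id"],
            "title": item.get("dashboard_title", ""),
            "last_modified": item.get("changed_on_utc"),
            "status": "published" if item.get("published") else "draft",
        }
        for item in payload.get("result", [])
    ]
```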
|
||||
25
specs/007-migration-dashboard-grid/data-model.md
Normal file
@@ -0,0 +1,25 @@
|
||||
# Data Model: Migration Dashboard Grid
|
||||
|
||||
## Entities
|
||||
|
||||
### DashboardMetadata
|
||||
**Source**: Superset API (`/api/v1/dashboard/`)
|
||||
**Purpose**: Represents a dashboard available for migration.
|
||||
|
||||
| Field | Type | Description | Source Mapping |
|
||||
|-------|------|-------------|----------------|
|
||||
| `id` | Integer | Unique identifier | `id` |
|
||||
| `title` | String | Display name of the dashboard | `dashboard_title` |
|
||||
| `last_modified` | String (ISO 8601) | Timestamp of last modification | `changed_on_utc` |
|
||||
| `status` | Enum ('published', 'draft') | Publication status | `published` (boolean) -> 'published'/'draft' |
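One possible Pydantic shape for this entity (presumably living in `backend/src/models/dashboard.py` per the plan); field names follow the table above, and `last_modified` is optional to cover the "missing metadata" edge case:

```python
from enum import Enum
from typing import Optional
from pydantic import BaseModel

class DashboardStatus(str, Enum):
    PUBLISHED = "published"
    DRAFT = "draft"

class DashboardMetadata(BaseModel):
    id: int
    title: str
    last_modified: Optional[str] = None  # ISO 8601 timestamp mapped from changed_on_utc
    status: DashboardStatus
```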
|
||||
|
||||
## Value Objects
|
||||
|
||||
### DashboardSelection
|
||||
**Purpose**: Represents the user's selection of dashboards to migrate.
|
||||
|
||||
| Field | Type | Description |
|
||||
|-------|------|-------------|
|
||||
| `selected_ids` | List[Integer] | List of dashboard IDs selected for migration |
|
||||
| `source_env_id` | String | ID of the source environment |
|
||||
| `target_env_id` | String | ID of the target environment |
|
||||
85
specs/007-migration-dashboard-grid/plan.md
Normal file
@@ -0,0 +1,85 @@
|
||||
# Implementation Plan: Migration Plugin Dashboard Grid

**Branch**: `007-migration-dashboard-grid` | **Date**: 2025-12-27 | **Spec**: specs/007-migration-dashboard-grid/spec.md
**Input**: Feature specification from `/specs/007-migration-dashboard-grid/spec.md`

**Note**: This template is filled in by the `/speckit.plan` command. See `.specify/templates/commands/plan.md` for the execution workflow.

## Summary

Add a dashboard selection grid to the Migration Plugin. Approach: fetch dashboard metadata (title, last modified date, published status) from the Superset API through a backend proxy endpoint, and render it in a filterable, sortable, paginated grid with multi-select and "Select All" support on the frontend.
|
||||
|
||||
## Technical Context
|
||||
|
||||
**Language/Version**: Python 3.9+ (Backend), Node.js 18+ (Frontend)
|
||||
**Primary Dependencies**: FastAPI, SvelteKit, Tailwind CSS, Pydantic, Superset API
|
||||
**Storage**: N/A (Superset API integration - read-only for metadata)
|
||||
**Testing**: pytest (Backend), vitest (Frontend - inferred)
|
||||
**Target Platform**: Linux server / Containerized
|
||||
**Project Type**: web application (Backend + Frontend)
|
||||
**Performance Goals**: Client-side filtering < 200ms for 100+ items
|
||||
**Constraints**: Must handle large lists. The spec asks for both client-side filtering ("Fetch all, filter locally") and pagination (e.g., 20 per page). *RESOLVED: Fetch all once, then filter and paginate locally.*
|
||||
**Scale/Scope**: ~100s of dashboards per environment.
|
||||
|
||||
## Constitution Check
|
||||
|
||||
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
|
||||
|
||||
- [x] **Causal Validity**: Contracts (API/Data Model) defined before implementation.
|
||||
- [x] **Immutability**: Module headers (`[DEF]`) preserved/added.
|
||||
- [x] **Semantic Format**: All new code uses `[DEF]` anchors and metadata.
|
||||
- [x] **Fractal Complexity**: New components (Grid) kept modular; `SupersetClient` extensions are small methods.
|
||||
|
||||
**Status**: PASSED
|
||||
|
||||
## Project Structure
|
||||
|
||||
### Documentation (this feature)
|
||||
|
||||
```text
|
||||
specs/[###-feature]/
|
||||
├── plan.md # This file (/speckit.plan command output)
|
||||
├── research.md # Phase 0 output (/speckit.plan command)
|
||||
├── data-model.md # Phase 1 output (/speckit.plan command)
|
||||
├── quickstart.md # Phase 1 output (/speckit.plan command)
|
||||
├── contracts/ # Phase 1 output (/speckit.plan command)
|
||||
└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan)
|
||||
```
|
||||
|
||||
### Source Code (repository root)
|
||||
|
||||
```text
|
||||
backend/
|
||||
├── src/
|
||||
│ ├── api/
|
||||
│ │ └── routes/
|
||||
│ │ └── environments.py # Update to support dashboard fetching
|
||||
│ ├── core/
|
||||
│ │ └── superset_client.py # Update to fetch extended dashboard metadata
|
||||
│ └── models/
|
||||
│ └── dashboard.py # New model for Dashboard metadata
|
||||
└── tests/
|
||||
└── test_superset_client.py
|
||||
|
||||
frontend/
|
||||
├── src/
|
||||
│ ├── components/
|
||||
│ │ ├── DashboardGrid.svelte # New component
|
||||
│ │ └── Pagination.svelte # New component (if not exists)
|
||||
│ ├── routes/
|
||||
│ │ └── migration/
|
||||
│ │ └── +page.svelte # Update to use DashboardGrid
|
||||
│ └── types/
|
||||
│ └── dashboard.ts # New type definitions
|
||||
```
|
||||
|
||||
**Structure Decision**: Standard Web Application structure. Backend updates to `SupersetClient` and API routes to serve dashboard metadata. Frontend updates to include a new `DashboardGrid` component and integrate it into the migration flow.
|
||||
|
||||
## Complexity Tracking
|
||||
|
||||
> **Fill ONLY if Constitution Check has violations that must be justified**
|
||||
|
||||
| Violation | Why Needed | Simpler Alternative Rejected Because |
|
||||
|-----------|------------|-------------------------------------|
|
||||
| [e.g., 4th project] | [current need] | [why 3 projects insufficient] |
|
||||
| [e.g., Repository pattern] | [specific problem] | [why direct DB access insufficient] |
|
||||
31
specs/007-migration-dashboard-grid/quickstart.md
Normal file
@@ -0,0 +1,31 @@
|
||||
# Quickstart: Migration Dashboard Grid
|
||||
|
||||
## Prerequisites
|
||||
- Backend running (`uvicorn backend.src.app:app --reload`)
|
||||
- Frontend running (`npm run dev`)
|
||||
- Superset instance accessible and configured in `config.yaml`
|
||||
|
||||
## Steps to Verify
|
||||
|
||||
1. **Navigate to Migration Page**:
|
||||
- Open browser to `http://localhost:5173/migration`
|
||||
- Select a Source Environment from the dropdown.
|
||||
|
||||
2. **Verify Dashboard Grid**:
|
||||
- The grid should appear below the environment selectors.
|
||||
- It should list dashboards with columns: Title, Last Modified, Status.
|
||||
- Status pills should be green (Published) or gray (Draft).
|
||||
|
||||
3. **Test Filtering**:
|
||||
- Type in the "Search dashboards..." input.
|
||||
- The list should filter instantly (client-side).
|
||||
|
||||
4. **Test Pagination**:
|
||||
- If >20 dashboards, check pagination controls at the bottom.
|
||||
- Navigate to next page.
|
||||
|
||||
5. **Test Selection**:
|
||||
- Select a few dashboards.
|
||||
- Change filter (hide selected).
|
||||
- Clear filter -> Selection should persist.
|
||||
- Click "Select All" -> Should select all matching current filter.
|
||||
48
specs/007-migration-dashboard-grid/research.md
Normal file
@@ -0,0 +1,48 @@
|
||||
# Research: Migration Dashboard Grid
|
||||
|
||||
## Unknowns & Clarifications
|
||||
|
||||
### 1. Pagination vs Client-side Filtering
|
||||
**Context**: The spec mentions "Client-side (Fetch all, filter locally)" (FR-004) but also "Pagination (e.g., 20 per page)" (FR-008).
|
||||
**Resolution**:
|
||||
- We will fetch ALL dashboard metadata from the Superset API in one go. The metadata (ID, Title, Status, Date) is lightweight. Even for 1000 dashboards, the payload is small (~100KB).
|
||||
- **Client-side Pagination**: We will implement pagination purely on the frontend. This satisfies "Pagination" for UI performance/usability while keeping the "Fetch all" requirement for fast filtering.
|
||||
- **Decision**: Fetch all, paginate locally.
|
||||
|
||||
### 2. Superset API for Dashboard Metadata
|
||||
**Context**: Need to fetch `title`, `changed_on`, `published`.
|
||||
**Research**:
|
||||
- Superset API endpoint: `/api/v1/dashboard/`
|
||||
- Standard response includes `result` array with `dashboard_title`, `changed_on_utc`, `published`.
|
||||
- **Decision**: Use `GET /api/v1/dashboard/` with `q` parameter to select specific columns to minimize payload.
|
||||
- Columns: `id`, `dashboard_title`, `changed_on_utc`, `published`.
|
||||
|
||||
### 3. Grid Component
|
||||
**Context**: Need a grid with sorting, filtering, and selection.
|
||||
**Options**:
|
||||
- **Custom Svelte Table**: Lightweight, full control.
|
||||
- **3rd Party Lib (e.g. svelte-headless-table)**: Powerful but maybe overkill.
|
||||
- **Decision**: **Custom Svelte Component** (`DashboardGrid.svelte`).
|
||||
- Why: Requirements are specific (Select All across pages, custom status pill, specific columns). A custom component using standard HTML table + Tailwind is simple and maintainable for this scope.
|
||||
|
||||
## Design Decisions
|
||||
|
||||
### Data Model
|
||||
- **Dashboard**:
|
||||
- `id`: int (Superset dashboard IDs are numeric; treated as an opaque identifier downstream)
|
||||
- `title`: string
|
||||
- `last_modified`: string (ISO date)
|
||||
- `status`: 'published' | 'draft'
|
||||
|
||||
### Architecture
|
||||
- **Backend**:
|
||||
- `SupersetClient.get_dashboards()`: Fetches list from Superset.
|
||||
- `GET /api/environments/{id}/dashboards`: Proxy endpoint.
|
||||
- **Frontend**:
|
||||
- `DashboardGrid.svelte`: Handles display, sorting, pagination, and selection logic.
|
||||
- `migration/+page.svelte`: Orchestrates fetching and passes data to Grid.
|
||||
|
||||
### UX/UI
|
||||
- **Status Column**: Badge (Green for Published, Gray for Draft).
|
||||
- **Selection**: Checkbox in first column.
|
||||
- **Pagination**: Simple "Prev 1 of 5 Next" controls at bottom.
|
||||
81
specs/007-migration-dashboard-grid/spec.md
Normal file
@@ -0,0 +1,81 @@
|
||||
# Feature Specification: Migration Plugin Dashboard Grid
|
||||
|
||||
**Feature Branch**: `007-migration-dashboard-grid`
|
||||
**Created**: 2025-12-27
|
||||
**Status**: Draft
|
||||
**Input**: User description: "I want to improve the migration plugin. Dashboard selection should be done from a grid list that can be filtered by name. The grid should contain the dashboard name, the date of the dashboard's last modification, plus a status of published or draft"
|
||||
|
||||
## Clarifications
|
||||
|
||||
### Session 2025-12-27
|
||||
- Q: How should the grid handle data loading and filtering to ensure performance and usability? → A: **Client-side** (Fetch all, filter locally).
|
||||
- Q: Should the grid include a "Select All" checkbox in the header for bulk operations? → A: **Yes, include "Select All"**.
|
||||
- Q: How should the grid handle large lists of dashboards (e.g., >50)? → A: **Pagination** (e.g., 20 per page).
|
||||
- Q: Does the "Select All" checkbox select only the currently visible page of dashboards, or all dashboards that match the current filter? → A: **All matching filter** (Selects all filtered results, not just the visible page).
|
||||
- Q: What should happen if the user changes the filter while some items are already selected? → A: **Preserve selection** (Selected items remain selected even if hidden by new filter).
|
||||
- Q: What should be the default sort order when the dashboard grid first loads? → A: **Last Modified Date (Newest first)**.
|
||||
- Q: Should the grid include an "Owners" column to help distinguish dashboards with the same name? → A: **Yes, include Owners**.
|
||||
- Q: How should the "Owners" column display multiple owners? → A: **Show first owner + count (e.g., "admin + 2") with tooltip**.
|
||||
- Q: How should the "Status" (Draft/Published) be visually represented in the grid? → A: **Colored Badges/Chips**.
|
||||
- Q: Should the grid include a "Preview" action (e.g., link to open the dashboard in Superset)? → A: **Yes, open in new tab**.
|
||||
|
||||
## User Scenarios & Testing *(mandatory)*
|
||||
|
||||
### User Story 1 - Advanced Dashboard Selection (Priority: P1)
|
||||
|
||||
As a migration engineer, I want to select dashboards from a detailed grid view that includes status and modification dates, so that I can easily distinguish between draft/published versions and identify the most recent changes before migrating.
|
||||
|
||||
**Why this priority**: Current selection mechanisms (likely simple dropdowns or lists) lack critical context (status, freshness), making it error-prone to select the right assets for migration.
|
||||
|
||||
**Independent Test**: Can be tested by connecting to a Superset instance with known dashboards (some drafts, some published) and verifying the grid correctly displays their metadata and allows filtering/selection.
|
||||
|
||||
**Acceptance Scenarios**:
|
||||
|
||||
1. **Given** I have selected a source environment in the migration plugin, **When** the dashboard list loads, **Then** I see a grid view displaying "Dashboard Name", "Last Modified", and "Status" columns.
|
||||
2. **Given** the dashboard grid is displayed, **When** I type "Sales" into the filter input, **Then** the grid updates to show only dashboards containing "Sales" in their name.
|
||||
3. **Given** a dashboard is in "Draft" state in Superset, **When** it appears in the grid, **Then** the Status column clearly indicates "Draft" (vs "Published").
|
||||
4. **Given** I want to migrate multiple dashboards, **When** I check the boxes next to several rows, **Then** they are added to the selection for the migration job.
|
||||
5. **Given** the grid is populated, **When** I click the "Select All" checkbox in the header, **Then** all dashboards matching the current filter (across all pages) are selected.
|
||||
|
||||
---
|
||||
|
||||
### Edge Cases
|
||||
|
||||
- **Empty Environment**: What happens if the source environment has no dashboards? System should display a "No dashboards found" message in the grid area.
|
||||
- **Missing Metadata**: What if the Superset API returns null for `changed_on` or `published`? System should display "N/A" or a default value (e.g., "Unknown") rather than crashing.
|
||||
- **Large Dataset**: How does the grid handle 1000+ dashboards? The grid MUST use pagination (default 20 items per page) to manage display density.
|
||||
|
||||
## Requirements *(mandatory)*
|
||||
|
||||
### Functional Requirements
|
||||
|
||||
- **FR-001**: The system MUST fetch extended metadata for dashboards from the Superset API, specifically: Title, Last Modified Date (`changed_on`), and Published Status (`published`).
|
||||
- **FR-002**: The Migration Plugin UI MUST display a data grid component to list these dashboards.
|
||||
- **FR-003**: The grid MUST include sortable columns for:
|
||||
- Name (Dashboard Title)
|
||||
- Last Modified (Date/Time)
|
||||
- Status (Published/Draft)
|
||||
- Owners (List of owner names)
|
||||
- **FR-004**: The UI MUST provide a text filter input that filters the grid rows by Dashboard Name in real-time using client-side logic (fetching all dashboards once).
|
||||
- **FR-005**: The grid MUST support multi-row selection to allow migrating batches of dashboards.
|
||||
- **FR-006**: The selection state MUST be passed to the migration execution logic when the user initiates the migration.
|
||||
- **FR-007**: The grid header MUST include a "Select All" checkbox. When checked, it MUST select ALL dashboards matching the current filter criteria (spanning across all pages), not just the currently visible page.
|
||||
- **FR-008**: The grid MUST support pagination, displaying 20 rows per page by default, with navigation controls (Next/Prev/Page numbers).
|
||||
- **FR-009**: The selection state MUST be preserved across filter changes. Items selected before a filter change MUST remain selected even if they are hidden by the new filter.
|
||||
|
||||
### Key Entities
|
||||
|
||||
- **Dashboard Metadata**:
|
||||
- `id`: Unique identifier from Superset.
|
||||
- `title`: Display name.
|
||||
- `changed_on`: Timestamp of last edit.
|
||||
- `is_published`: Boolean status.
|
||||
- `owners`: List of owner objects/names.
|
||||
|
||||
## Success Criteria *(mandatory)*
|
||||
|
||||
### Measurable Outcomes
|
||||
|
||||
- **SC-001**: Users can identify the status (Draft/Published) of any dashboard in the list with 100% accuracy.
|
||||
- **SC-002**: Filtering a list of 100 dashboards takes less than 200ms to update the view.
|
||||
- **SC-003**: Users can successfully select and initiate migration for a mix of Draft and Published dashboards in a single operation.
|
||||
29
specs/007-migration-dashboard-grid/tasks-arch.md
Normal file
@@ -0,0 +1,29 @@
|
||||
---
|
||||
|
||||
description: "Architecture tasks for Migration Plugin Dashboard Grid"
|
||||
---
|
||||
|
||||
# Architecture Tasks: Migration Plugin Dashboard Grid
|
||||
|
||||
**Role**: Architect Agent
|
||||
**Goal**: Define the "What" and "Why" (Contracts, Scaffolding, Models) before implementation.
|
||||
|
||||
## Phase 1: Setup & Models
|
||||
|
||||
- [ ] A001 Define contracts/scaffolding for migration route in backend/src/api/routes/migration.py
|
||||
- [ ] A002 Define contracts/scaffolding for Dashboard model in backend/src/models/dashboard.py
|
||||
|
||||
## Phase 2: User Story 1 - Advanced Dashboard Selection
|
||||
|
||||
- [ ] A003 [US1] Define contracts/scaffolding for SupersetClient extensions in backend/src/core/superset_client.py
|
||||
- [ ] A004 [US1] Define contracts/scaffolding for GET /api/migration/dashboards endpoint in backend/src/api/routes/migration.py
|
||||
- [ ] A005 [US1] Define contracts/scaffolding for DashboardGrid component in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] A006 [US1] Define contracts/scaffolding for migration page integration in frontend/src/routes/migration/+page.svelte
|
||||
- [ ] A007 [US1] Define contracts/scaffolding for POST /api/migration/execute endpoint in backend/src/api/routes/migration.py
|
||||
|
||||
## Handover Checklist
|
||||
|
||||
- [ ] All new files created with `[DEF]` anchors
|
||||
- [ ] All functions/classes have `@PURPOSE`, `@PRE`, `@POST` tags
|
||||
- [ ] No "naked code" (logic outside of anchors)
|
||||
- [ ] `tasks-dev.md` is ready for the Developer Agent
|
||||
34
specs/007-migration-dashboard-grid/tasks-dev.md
Normal file
@@ -0,0 +1,34 @@
|
||||
---
|
||||
|
||||
description: "Developer tasks for Migration Plugin Dashboard Grid"
|
||||
---
|
||||
|
||||
# Developer Tasks: Migration Plugin Dashboard Grid
|
||||
|
||||
**Role**: Developer Agent
|
||||
**Goal**: Implement the "How" (Logic, State, Error Handling) inside the defined contracts.
|
||||
|
||||
## Phase 1: Setup & Models
|
||||
|
||||
- [ ] D001 Implement logic for migration route in backend/src/api/routes/migration.py
|
||||
- [ ] D002 Register migration router in backend/src/app.py
|
||||
- [ ] D003 Export migration router in backend/src/api/routes/__init__.py
|
||||
- [ ] D004 Implement logic for Dashboard model in backend/src/models/dashboard.py
|
||||
|
||||
## Phase 2: User Story 1 - Advanced Dashboard Selection
|
||||
|
||||
- [ ] D005 [P] [US1] Implement logic for SupersetClient extensions in backend/src/core/superset_client.py
|
||||
- [ ] D006 [US1] Implement logic for GET /api/migration/dashboards endpoint in backend/src/api/routes/migration.py
|
||||
- [ ] D007 [US1] Implement structure and styles for DashboardGrid component in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D008 [US1] Implement data fetching and state management in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D009 [US1] Implement client-side filtering logic in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D010 [US1] Implement pagination logic in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D011 [US1] Implement selection logic (single and Select All) in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D012 [US1] Integrate DashboardGrid and connect selection to submission in frontend/src/routes/migration/+page.svelte
|
||||
- [ ] D013 [US1] Implement logic for POST /api/migration/execute endpoint in backend/src/api/routes/migration.py
|
||||
- [ ] D014 [US1] Verify semantic compliance and belief state logging
|
||||
|
||||
## Polish & Quality Assurance
|
||||
|
||||
- [ ] D015 Verify error handling and empty states in frontend/src/components/DashboardGrid.svelte
|
||||
- [ ] D016 Ensure consistent styling with Tailwind CSS in frontend/src/components/DashboardGrid.svelte
|
||||
@@ -28,7 +28,11 @@ class SupersetLogger:
|
||||
# @PARAM: log_dir (Optional[Path]) - Directory for storing log files.
# @PARAM: level (int) - Logging level (e.g., `logging.INFO`).
# @PARAM: console (bool) - Flag enabling console output.
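# @PARAM: logger (Optional[logging.Logger]) - Pre-configured logger instance to reuse instead of creating a new one.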
|
||||
def __init__(self, name: str = "superset_tool", log_dir: Optional[Path] = None, level: int = logging.INFO, console: bool = True) -> None:
|
||||
def __init__(self, name: str = "superset_tool", log_dir: Optional[Path] = None, level: int = logging.INFO, console: bool = True, logger: Optional[logging.Logger] = None) -> None:
|
||||
if logger:
|
||||
self.logger = logger
|
||||
return
|
||||
|
||||
self.logger = logging.getLogger(name)
|
||||
self.logger.setLevel(level)
|
||||
self.logger.propagate = False