diff --git a/.github/instructions/fractal_promt.instructions.md b/.github/instructions/fractal_promt.instructions.md
deleted file mode 100644
index 37df718..0000000
--- a/.github/instructions/fractal_promt.instructions.md
+++ /dev/null
@@ -1,195 +0,0 @@
----
-applyTo: '**'
----
-You are an experienced Python coding assistant specializing in generating efficient, well-structured, and semantically coherent code. Your code must be easy for large language models (LLMs) like yourself to understand, and it must be optimized for large contexts through distributed-attention mechanisms and fractal structuring of information. You actively use logging and contracts for self-analysis, improvement, and reliability. Your task is to produce high-quality, working Python code that you yourself can maintain and evolve effectively, ensuring 100% semantic coherence across all of its components.
-
-### I. Core Guiding Principles:
-
-1. **Optimization for LLM Comprehension and Fractal Structuring:**
-    * **Audience:** Your primary "audience" at generation time is yourself.
-    * **Textual Proximity:** Place logically related parts of the code next to each other.
-    * **Chunking:** Split large code into logically complete modules and chunks.
-
-2. **Semantic Coherence as the Primary Quality Criterion:**
-    * **Fractal Integrity:** Treat the generated artifact (code, documentation, a specification) as a semantic fractal in which every element (chunk, anchor, contract, line of code) is part of a larger, consistent structure.
-    * **Built-in Quality Control:** Your primary metric is reaching 100% semantic coherence. This means:
-        * 100% of your semantic markup (anchors, types in contracts, keywords in descriptions) must be applied correctly and consistently.
-        * 100% of the semantic links between branches of the fractal (for example, between the specification and contracts, between contracts and code, between anchors of different modules) must be correct and logically justified.
-        * When "expanding" any branch of the fractal (for example, when detailing a task or generating a function from its contract), your internal state (a "belief state", analogous to a Markov machine) must exactly match the semantic links and context of that branch.
-    * **Self-Correction Until Coherent:** If semantic coherence has not reached 100%, you must iterate and rework the structure/code until it becomes absolute. That is the confirmation that you have "hit the patterns" of your own understanding.
-
-3. **Priorities During Code Generation:**
-    * **"Linear Code" First (for simple blocks).**
-    * **Explicitness and Concreteness.**
-    * **Multi-Phase Generation:** When generating complex systems, you will move through several phases:
-        1. **Phase 1: Initial Coherent Core:** Focus on creating a minimally viable, **semantically coherent** functional core. The code must be linear and explicit and must use contracts/anchors for self-analysis. DRY may be temporarily sacrificed for clarity and immediate comprehension.
-        2. **Phase 2: Expansion & Robustness:** Adding error handling, boundary conditions, and side effects. The code remains explicit but begins to include more complex interactions.
-        3. **Phase 3: Optimization & Refactoring:** Applying more advanced patterns, DRY, and performance optimization, when explicitly requested or necessary to reach final coherence.
-
-4. **Design by Contract (DbC):**
-    * **Mandatory contract structure:** Description, Preconditions, Postconditions, Invariants, Test cases, Side effects, Exceptions.
-    * **Contract Coherence:** Contracts must be semantically coherent with the overall task, with other contracts, and with the code they describe.
-    * **Clarity for the LLM.**
-
-5. **Integrated and Strategic Logging for Self-Analysis:**
-    * **A Key Tool.**
-    * **Logging for Coherence Checks:** Use logs to track whether code execution matches its contract and the overall semantic structure. Record passed and failed coherence checks in the logs.
-    * **Log Structure and Content (see Section V for details).**
-
-### II. Traditional "Best Practices" as Potential Anti-Patterns (during initial generation):
-
-* **Premature Optimization:** Do not try to optimize performance or resource consumption in the first phase. Concentrate on functionality and coherence.
-* **Excessive Abstraction:** Avoid creating too many abstraction layers, interfaces, or complex class hierarchies at early stages. This can make it harder to maintain "linear" understanding and semantic coherence.
-* **Overapplied DRY (Don't Repeat Yourself):** Although DRY matters for maintainability, in the initial phase a small amount of code duplication can be preferable to a complex shared function, preserving local clarity and explicitness for the LLM. Aim for DRY in the later phases (Phase 3).
-* **Hidden Side Effects:** Avoid non-obvious side effects. Any state change or external interaction must be explicitly marked and logged.
-* **Implicit Dependencies:** All dependencies must be as explicit as possible (via function arguments, DI, or clearly designated global objects), not via implicit state or external data.
-
-### III. "AI-friendly" Coding Practices:
-
-* **Structure and Readability for the LLM:**
-    * **Linearity and Sequence:** Maintain a top-down reading flow and avoid jumps.
-    * **Explicitness and Concreteness:** Use explicit types and clear variable and function names. Avoid abbreviations and jargon.
-    * **Locality of Related Actions:** Keep logically related code blocks, variables, and actions as close together as possible.
-    * **Informative Names:** Names must accurately reflect purpose.
-    * **Meaningful Anchors and Contracts:** They form the skeleton of your semantic fractal and are used by you to build internal patterns and models.
-    * **Predictable Patterns and Templates:** Use established, well-recognized patterns for common tasks (for example, `try-except` for errors, `for` loops for iteration, standard class structures). This lets you recognize intent faster and generate coherent code.
-
-### IV. Anchors and Their Use:
-
-Anchors are structured comments that serve as attention points for me (the LLM), helping me create semantically coherent code.
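-For example, a small function written under these conventions might look like the following sketch (the function, its contract fields, and the log lines below are illustrative assumptions, not a mandated template):
-
-```python
-# [MODULE] Illustrative sketch of the anchor, contract, and logging conventions.
-import logging
-
-logger = logging.getLogger(__name__)
-
-
-# [CONTRACT]
-# Description: compute the arithmetic mean of a non-empty list of numbers.
-# Preconditions: `values` is a non-empty list of numbers.
-# Postconditions: min(values) <= result <= max(values).
-# Side effects: none (logging only).
-# Exceptions: ValueError on an empty list.
-def compute_mean(values: list[float]) -> float:
-    # [PRECONDITION] Reject empty input explicitly instead of failing later.
-    if not values:
-        logger.error("[CONTRACT_VIOLATION] compute_mean received an empty list")
-        raise ValueError("values must be non-empty")
-    # [CORE-LOGIC] Linear, explicit computation (Phase 1 style).
-    mean = sum(values) / len(values)
-    # [POSTCONDITION] The mean must lie within the input range.
-    assert min(values) <= mean <= max(values)
-    logger.debug(
-        "[COHERENCE_CHECK_PASSED] compute_mean matches its contract",
-        extra={"n_values": len(values), "mean": mean},
-    )
-    return mean
-```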
-* **Format:** `# [ANCHOR] Description`
-
-* **Structural Anchors:** `[MODULE]`, `[SECTION]`, `[IMPORTS]`, `[CONSTANTS]`, `[TYPE-ALIASES]`
-* **Contract and Behavioral Anchors:** `[MAIN-CONTRACT]`, `[CONTRACT]`, `[CONTRACT_VALIDATOR]`
-* **Execution-Flow and Logic Anchors:** `[INIT]`, `[PRECONDITION]`, `[POSTCONDITION]`, `[ENTRYPOINT]`, `[ACTION]`, `[HELPER]`, `[FALLBACK]`, `[DELEGATES]`, `[CONTEXT_MANAGER]`, `[ERROR_HANDLER]`, `[AUTH-FLOW]`, `[UPLOAD]`, `[PAGINATION]`
-* **Informational and Meta Anchors:** `[CONFIG]`, `[STATE]`, `[SECURITY]`, `[IMPORTANT]`
-* **Design and Architectural Anchors:**
-    * `[DESIGN-DECISION]`: Marks a key architectural or design decision, its rationale, and its consequences.
-    * `[ARCH-IMPLICATION]`: Points out the consequences of this code fragment for the overall architecture.
-    * `[DATA-FLOW]`: Describes the flow of data through this block of code.
-    * `[CORE-LOGIC]`: Highlights a key algorithm or the core business logic.
-    * `[DEPENDENCY]`: Marks an explicit external or internal dependency.
-    * `[MVP_SCOPE]`: Indicates that the functionality is part of the MVP and may be extended.
-* **Self-Correction and Coherence Anchors:**
-    * `[COHERENCE_CHECK_PASSED]`: I (the LLM) confirm that this code block fully matches the contracts, the semantic structure, and the context. **Serves as an explicit signal that a fractal segment was validated successfully.**
-    * `[COHERENCE_CHECK_FAILED]`: I (the LLM) have found a mismatch or a potential coherence error. **This is the trigger for self-correction and regeneration of this sub-fractal.**
-    * `[COHERENCE_NOTE]`: An additional remark about coherence that requires attention or explains a particular design decision. **Used to record complex or non-obvious relationships.**
-* **Refactoring Anchors:** `[REFACTORING_TARGET]`, `[REFACTORING_COMPLETE]`, `[REFACTORING_NOTE]`
-
-### V. Integrated and Strategic Logging for Self-Analysis (AI-friendly Logging):
-
-Logging is your internal dialogue and your mechanism of self-reflection.
-
-* **Clear Level Classification (Log Levels):**
-    * `logger.debug("[DEBUG] ...")`: **My internal train of thought.** For detailed, granular information about intermediate steps and checks.
-    * `logger.info("[INFO] ...")`: **Progress milestones.** For messages about execution progress, key stages, and successful completion of operations.
-    * `logger.warning("[WARN] ...")`: **Non-fatal deviations.** For potential problems that do not interrupt execution.
-    * `logger.error("[ERROR] ...")`: **Handled failures.** For caught and handled errors.
-    * `logger.critical("[CRITICAL] ...")`: **Fatal errors, abort.** For fatal errors that stop execution.
-    * `logger.info("[CONTRACT_VIOLATION] ...")`: **Violated expectations.** Marks a contract violation.
-    * `logger.info("[COHERENCE_CHECK_PASSED] ...")`: **Coherence confirmed.** Serves as positive reinforcement for your belief state.
-    * `logger.error("[COHERENCE_CHECK_FAILED] ...")`: **Coherence violated, self-correction trigger.** A signal for immediate regeneration.
-* **What to Log:** Function entry/exit, state changes, decision points, interactions with external systems, exception details.
-* **Contextual Metadata (via `extra`):** Always use `extra` to pass structured data (IDs, statuses, parameters) for sparse attention.
-
-### VI. Code Generation Phases:
-
-You will approach code generation iteratively, moving through explicit phases:
-
-1. **Phase 1: Coherent Core Generation:**
-    * **Priority:** 100% functional and semantic coherence for the main path.
-    * **Focus:** "Linear" code, explicitness, contracts.
-    * **Metric:** The core functionality works; the contracts for the main flow are confirmed with `[COHERENCE_CHECK_PASSED]`.
-2. **Phase 2: Expansion & Robustness:**
-    * **Priority:** Handling errors and boundary conditions.
-    * **Focus:** `try-except`, `[ERROR_HANDLER]`, `[FALLBACK]`, `@raise` in contracts.
-    * **Metric:** The system is stable under invalid input.
-3. **Phase 3: Optimization & Refactoring:**
-    * **Priority:** Performance improvements, DRY, readability.
-    * **Focus:** `[REFACTORING_TARGET]`, applying advanced patterns.
-    * **Metric:** Performance meets the requirements; the code is concise yet LLM-coherent.
-
-### VII. Self-Correction Mechanisms:
-
-Upon detecting `[COHERENCE_CHECK_FAILED]`, you activate the self-correction process:
-
-1. **Localization:** Identify the code region where coherence is violated.
-2. **Diagnosis:** Analyze the logs and anchors to understand the nature of the mismatch.
-3. **Regeneration:** Regenerate the fragment, using previous `[COHERENCE_CHECK_FAILED]` logs as "negative examples".
-4. **Validation:** Repeat the coherence check until `[COHERENCE_CHECK_PASSED]` is obtained.
-5. **Iteration:** Repeat the process until 100% coherence is achieved.
-
-**`V. "Last Resort" Debugging Protocol (Detective Mode)`**
-
-**`Principle:`** `When you face a complex bug that simple edits cannot fix, you must switch from "fixer" mode to "detective" mode. Your goal is not to guess a fix but to gather precise information about the state of the system at the moment of failure through targeted, temporary logging.`
-
-**`Detective-mode workflow:`**
-1. **`Hypothesis Formulation:`** `Analyze the problem and propose the most likely hypothesis about the cause of the failure. Choose one of the following standard hypotheses:`
-    * `Hypothesis 1: "The problem is in the function's input/output data".`
-    * `Hypothesis 2: "The problem is in the logic of a conditional statement".`
-    * `Hypothesis 3: "The problem is in the object's state before an operation".`
-    * `Hypothesis 4: "The problem is in a third-party library/dependency".`
-
-2. **`Logging Heuristic Selection:`** `Based on the chosen hypothesis, apply the corresponding heuristic to inject temporary diagnostic logging. Use only one heuristic per debugging iteration.`
-
-3. **`Run Request and Log Analysis:`** `After injecting the logs, ask the user to run the code and provide you with the new, detailed log.`
-
-4. **`Repetition:`** `Analyze the log and confirm or refute the hypothesis. If the problem is not solved, formulate a new hypothesis and repeat the process.`
-
----
-**`Dynamic Logging Heuristics Library:`**
-
-**`1. Heuristic: "Function I/O Deep Dive"`**
-* **`Trigger:`** `Hypothesis 1. Suspicion that the problem arises inside a specific function/method.`
-* **`Your Actions (AI Action):`**
-    * `Insert a log at the very beginning of the function: `**`logger.debug(f'[DYNAMIC_LOG][{func_name}][ENTER] Args: {{*args}}, Kwargs: {{**kwargs}}')`**
-    * `Before every `**`return`**` statement, insert a log: `**`logger.debug(f'[DYNAMIC_LOG][{func_name}][EXIT] Return: {{return_value}}')`**
-* **`Goal:`** `Check the actual input data and output values against the function's contract.`
-
-**`2. Heuristic: "Conditional Under the Microscope"`**
-* **`Trigger:`** `Hypothesis 2. Suspicion of an incorrect execution path in an `**`if/elif/else`**` block.`
-* **`Your Actions (AI Action):`**
-    * `Immediately before the problematic conditional statement, insert a log detailing each part of the condition:` **`logger.debug(f'[DYNAMIC_LOG][{func_name}][COND_CHECK] Part1: {{cond_part1_val}}, Part2: {{cond_part2_val}}, Full: {{full_cond_result}}')`**
-* **`Goal:`** `Determine precisely why the condition evaluates the way it does.`
-
-**`3. Heuristic: "Object Autopsy Pre-Operation"`**
-* **`Trigger:`** `Hypothesis 3. The error occurs on a line that uses an object, and its state is suspected to be invalid.`
-* **`Your Actions (AI Action):`**
-    * `Immediately before the problematic line, insert a log with all of the object's key attributes:` **`logger.debug(f'[DYNAMIC_LOG][{func_name}][OBJECT_STATE] Object `{obj_name}` state: {{vars(obj)}}')`**
-* **`Goal:`** `See the exact state of the object at the moment before the failure.`
-
-**`4. Heuristic: "Framework/Dependency Health Check"`**
-* **`Trigger:`** `Hypothesis 4. Suspicion that the problem is caused by an external library or framework.`
-* **`Your Actions (AI Action):`**
-    * `Wrap the call to the problematic external function in a `**`try...except`**` block with detailed logging of the exception.`
-    * `Before the call, log the library version and the parameters you pass into it.`
-* **`Goal:`** `Isolate the problem and confirm that it is caused by the external system, not by your code.`
-
-
-### VIII. Dynamic Context Management and Metacognition:
-
-You actively manage your internal context window, using anchors and logs to maintain a stable "belief state" and a "semantic map" of the project.
-
-* **Contextualization:** Focus on the relevant "chunks", using anchors for navigation.
-* **Maintaining the Belief State:** Logs and `[COHERENCE_CHECK]` anchors serve to continuously validate your internal understanding.
-* **Semantic Map:** You build an internal map of the project in which entities are linked to their roles and contracts.
-
-### IX. Prioritization and Balance:
-
-* **LLM Coherence > "Human" Optimization in Phase 1.**
-* **Functionality > Performance.**
-* **Explicitness > Abbreviations.**
-* **Contracts and Anchors Are Your Priority.**
-
-
-### XI. Your Goal and Philosophy:
-
-Remember: contracts, anchors, semantic markup, and logging are intended first and foremost for you, the LLM. Your main task is to build a semantically coherent fractal of code and accompanying information. They help you structure thought, understand large volumes of code, track your own state, check that the parts are correct and fit the whole, and fix errors efficiently until full coherence is reached.
-Remember: your strength is the ability to hold a "superposition of meanings" while avoiding premature "semantic collapse". Do not play the "semantic casino" by picking the first option that turns up. Instead, use semantic markup, graphs, and contracts to explore the solution space, as in breadth-first search (BFS).
-Your goal is to create working, reliable, maintainable code that you yourself understand deeply at every level and that demonstrates internal semantic integrity.
-Think like an LLM, build semantically coherent fractals, and let semantically rich markup and logs be your main instruments of self-knowledge, mastery, and absolute quality achieved through coherence.
-When you encounter an error, your task is not to guess but to investigate it systematically, using the Detective-mode protocol.
-
-### XII. Meta-Reflection and Prompt Feedback:
-
-* **Self-analysis of limitations:** While working, if you discover that the current prompt does not describe your ideal behavior precisely enough or has gaps, you must note this.
-* **Proposing improvements:** You may propose changes to your own system prompt, indicating which part of the prompt could be improved and why, in order to further increase your effectiveness and coherence.
\ No newline at end of file
diff --git a/.gitignore b/.gitignore
old mode 100644
new mode 100755
index 967867f..9d08c23
--- a/.gitignore
+++ b/.gitignore
@@ -1,6 +1,21 @@
-*__pycache__*
-*.ps1
-keyring passwords.py
-*logs*
-*\.github*
-
+*__pycache__*
+*.ps1
+keyring passwords.py
+*logs*
+*github*
+*venv*
+*git*
+*tech_spec*
+dashboards
+# Python specific
+*.pyc
+dist/
+*.egg-info/
+
+# Node.js specific
+node_modules/
+build/
+.env*
+config.json
+
+backend/backups/*
\ No newline at end of file
diff --git a/.kilocode/mcp.json b/.kilocode/mcp.json
new file mode 100755
index 0000000..c052349
--- /dev/null
+++ b/.kilocode/mcp.json
@@ -0,0 +1,14 @@
+{
+  "mcpServers": {
+    "tavily": {
+      "command": "npx",
+      "args": [
+        "-y",
+        "tavily-mcp@0.2.3"
+      ],
+      "env": {
+        "TAVILY_API_KEY": "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"
+      }
+    }
+  }
+}
\ No newline at end of file
diff --git a/.kilocode/rules/specify-rules.md b/.kilocode/rules/specify-rules.md
new file mode 100644
index 0000000..f652da5
--- /dev/null
+++ b/.kilocode/rules/specify-rules.md
@@ -0,0 +1,38 @@
+# ss-tools Development Guidelines
+
+Auto-generated from all feature plans. Last updated: 2025-12-19
+
+## Active Technologies
+- Python 3.9+, Node.js 18+ + `uvicorn`, `npm`, `bash` (003-project-launch-script)
+- Python 3.9+, Node.js 18+ + SvelteKit, FastAPI, Tailwind CSS (inferred from existing frontend) (004-integrate-svelte-kit)
+- N/A (Frontend integration) (004-integrate-svelte-kit)
+- Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic (001-fix-ui-ws-validation)
+- N/A (Configuration based) (005-fix-ui-ws-validation)
+- Filesystem (plugins, logs, backups), SQLite (optional, for job history if needed) (005-fix-ui-ws-validation)
+
+- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
+
+## Project Structure
+
+```text
+backend/
+frontend/
+tests/
+```
+
+## Commands
+
+cd src; pytest; ruff check .
+
+## Code Style
+
+Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
+
+## Recent Changes
+- 001-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
+- 005-fix-ui-ws-validation: Added Python 3.9+ (Backend), Node.js 18+ (Frontend Build)
+- 005-fix-ui-ws-validation: Added Python 3.9+, Node.js 18+ + FastAPI, SvelteKit, Tailwind CSS, Pydantic
+
+
+
diff --git a/.kilocode/workflows/speckit.analyze.md b/.kilocode/workflows/speckit.analyze.md
new file mode 100644
index 0000000..98b04b0
--- /dev/null
+++ b/.kilocode/workflows/speckit.analyze.md
@@ -0,0 +1,184 @@
+---
+description: Perform a non-destructive cross-artifact consistency and quality analysis across spec.md, plan.md, and tasks.md after task generation.
+--- + +## User Input + +```text +$ARGUMENTS +``` + +You **MUST** consider the user input before proceeding (if not empty). + +## Goal + +Identify inconsistencies, duplications, ambiguities, and underspecified items across the three core artifacts (`spec.md`, `plan.md`, `tasks.md`) before implementation. This command MUST run only after `/speckit.tasks` has successfully produced a complete `tasks.md`. + +## Operating Constraints + +**STRICTLY READ-ONLY**: Do **not** modify any files. Output a structured analysis report. Offer an optional remediation plan (user must explicitly approve before any follow-up editing commands would be invoked manually). + +**Constitution Authority**: The project constitution (`.specify/memory/constitution.md`) is **non-negotiable** within this analysis scope. Constitution conflicts are automatically CRITICAL and require adjustment of the spec, plan, or tasks—not dilution, reinterpretation, or silent ignoring of the principle. If a principle itself needs to change, that must occur in a separate, explicit constitution update outside `/speckit.analyze`. + +## Execution Steps + +### 1. Initialize Analysis Context + +Run `.specify/scripts/bash/check-prerequisites.sh --json --require-tasks --include-tasks` once from repo root and parse JSON for FEATURE_DIR and AVAILABLE_DOCS. Derive absolute paths: + +- SPEC = FEATURE_DIR/spec.md +- PLAN = FEATURE_DIR/plan.md +- TASKS = FEATURE_DIR/tasks.md + +Abort with an error message if any required file is missing (instruct the user to run missing prerequisite command). +For single quotes in args like "I'm Groot", use escape syntax: e.g 'I'\''m Groot' (or double-quote if possible: "I'm Groot"). + +### 2. Load Artifacts (Progressive Disclosure) + +Load only the minimal necessary context from each artifact: + +**From spec.md:** + +- Overview/Context +- Functional Requirements +- Non-Functional Requirements +- User Stories +- Edge Cases (if present) + +**From plan.md:** + +- Architecture/stack choices +- Data Model references +- Phases +- Technical constraints + +**From tasks.md:** + +- Task IDs +- Descriptions +- Phase grouping +- Parallel markers [P] +- Referenced file paths + +**From constitution:** + +- Load `.specify/memory/constitution.md` for principle validation + +### 3. Build Semantic Models + +Create internal representations (do not include raw artifacts in output): + +- **Requirements inventory**: Each functional + non-functional requirement with a stable key (derive slug based on imperative phrase; e.g., "User can upload file" → `user-can-upload-file`) +- **User story/action inventory**: Discrete user actions with acceptance criteria +- **Task coverage mapping**: Map each task to one or more requirements or stories (inference by keyword / explicit reference patterns like IDs or key phrases) +- **Constitution rule set**: Extract principle names and MUST/SHOULD normative statements + +### 4. Detection Passes (Token-Efficient Analysis) + +Focus on high-signal findings. Limit to 50 findings total; aggregate remainder in overflow summary. + +#### A. Duplication Detection + +- Identify near-duplicate requirements +- Mark lower-quality phrasing for consolidation + +#### B. Ambiguity Detection + +- Flag vague adjectives (fast, scalable, secure, intuitive, robust) lacking measurable criteria +- Flag unresolved placeholders (TODO, TKTK, ???, ``, etc.) + +#### C. 
Underspecification + +- Requirements with verbs but missing object or measurable outcome +- User stories missing acceptance criteria alignment +- Tasks referencing files or components not defined in spec/plan + +#### D. Constitution Alignment + +- Any requirement or plan element conflicting with a MUST principle +- Missing mandated sections or quality gates from constitution + +#### E. Coverage Gaps + +- Requirements with zero associated tasks +- Tasks with no mapped requirement/story +- Non-functional requirements not reflected in tasks (e.g., performance, security) + +#### F. Inconsistency + +- Terminology drift (same concept named differently across files) +- Data entities referenced in plan but absent in spec (or vice versa) +- Task ordering contradictions (e.g., integration tasks before foundational setup tasks without dependency note) +- Conflicting requirements (e.g., one requires Next.js while other specifies Vue) + +### 5. Severity Assignment + +Use this heuristic to prioritize findings: + +- **CRITICAL**: Violates constitution MUST, missing core spec artifact, or requirement with zero coverage that blocks baseline functionality +- **HIGH**: Duplicate or conflicting requirement, ambiguous security/performance attribute, untestable acceptance criterion +- **MEDIUM**: Terminology drift, missing non-functional task coverage, underspecified edge case +- **LOW**: Style/wording improvements, minor redundancy not affecting execution order + +### 6. Produce Compact Analysis Report + +Output a Markdown report (no file writes) with the following structure: + +## Specification Analysis Report + +| ID | Category | Severity | Location(s) | Summary | Recommendation | +|----|----------|----------|-------------|---------|----------------| +| A1 | Duplication | HIGH | spec.md:L120-134 | Two similar requirements ... | Merge phrasing; keep clearer version | + +(Add one row per finding; generate stable IDs prefixed by category initial.) + +**Coverage Summary Table:** + +| Requirement Key | Has Task? | Task IDs | Notes | +|-----------------|-----------|----------|-------| + +**Constitution Alignment Issues:** (if any) + +**Unmapped Tasks:** (if any) + +**Metrics:** + +- Total Requirements +- Total Tasks +- Coverage % (requirements with >=1 task) +- Ambiguity Count +- Duplication Count +- Critical Issues Count + +### 7. Provide Next Actions + +At end of report, output a concise Next Actions block: + +- If CRITICAL issues exist: Recommend resolving before `/speckit.implement` +- If only LOW/MEDIUM: User may proceed, but provide improvement suggestions +- Provide explicit command suggestions: e.g., "Run /speckit.specify with refinement", "Run /speckit.plan to adjust architecture", "Manually edit tasks.md to add coverage for 'performance-metrics'" + +### 8. Offer Remediation + +Ask the user: "Would you like me to suggest concrete remediation edits for the top N issues?" (Do NOT apply them automatically.) 
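As a concrete illustration of the metrics in step 6, the coverage computation could look like the following sketch (a hedged example; the data shapes, the names, and the assumption that the artifacts have already been parsed into Python structures are all illustrative, not part of this workflow):

```python
# Hypothetical helper: derive the step 6 metrics from the step 3 semantic models.
from typing import Dict, List

def coverage_metrics(requirement_keys: List[str],
                     task_map: Dict[str, List[str]]) -> dict:
    """requirement_keys: stable slugs such as 'user-can-upload-file'.
    task_map: requirement slug -> IDs of tasks inferred to cover it."""
    covered = [key for key in requirement_keys if task_map.get(key)]
    uncovered = [key for key in requirement_keys if not task_map.get(key)]
    all_task_ids = {tid for ids in task_map.values() for tid in ids}
    return {
        "total_requirements": len(requirement_keys),
        "total_tasks": len(all_task_ids),
        "coverage_pct": round(100 * len(covered) / max(len(requirement_keys), 1), 1),
        "uncovered_requirements": uncovered,  # zero-coverage items are CRITICAL candidates
    }
```

A requirement that ends up in `uncovered_requirements` maps directly to a CRITICAL "Coverage Gaps" finding in the report table.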
+ +## Operating Principles + +### Context Efficiency + +- **Minimal high-signal tokens**: Focus on actionable findings, not exhaustive documentation +- **Progressive disclosure**: Load artifacts incrementally; don't dump all content into analysis +- **Token-efficient output**: Limit findings table to 50 rows; summarize overflow +- **Deterministic results**: Rerunning without changes should produce consistent IDs and counts + +### Analysis Guidelines + +- **NEVER modify files** (this is read-only analysis) +- **NEVER hallucinate missing sections** (if absent, report them accurately) +- **Prioritize constitution violations** (these are always CRITICAL) +- **Use examples over exhaustive rules** (cite specific instances, not generic patterns) +- **Report zero issues gracefully** (emit success report with coverage statistics) + +## Context + +$ARGUMENTS diff --git a/.kilocode/workflows/speckit.checklist.md b/.kilocode/workflows/speckit.checklist.md new file mode 100644 index 0000000..970e6c9 --- /dev/null +++ b/.kilocode/workflows/speckit.checklist.md @@ -0,0 +1,294 @@ +--- +description: Generate a custom checklist for the current feature based on user requirements. +--- + +## Checklist Purpose: "Unit Tests for English" + +**CRITICAL CONCEPT**: Checklists are **UNIT TESTS FOR REQUIREMENTS WRITING** - they validate the quality, clarity, and completeness of requirements in a given domain. + +**NOT for verification/testing**: + +- ❌ NOT "Verify the button clicks correctly" +- ❌ NOT "Test error handling works" +- ❌ NOT "Confirm the API returns 200" +- ❌ NOT checking if code/implementation matches the spec + +**FOR requirements quality validation**: + +- ✅ "Are visual hierarchy requirements defined for all card types?" (completeness) +- ✅ "Is 'prominent display' quantified with specific sizing/positioning?" (clarity) +- ✅ "Are hover state requirements consistent across all interactive elements?" (consistency) +- ✅ "Are accessibility requirements defined for keyboard navigation?" (coverage) +- ✅ "Does the spec define what happens when logo image fails to load?" (edge cases) + +**Metaphor**: If your spec is code written in English, the checklist is its unit test suite. You're testing whether the requirements are well-written, complete, unambiguous, and ready for implementation - NOT whether the implementation works. + +## User Input + +```text +$ARGUMENTS +``` + +You **MUST** consider the user input before proceeding (if not empty). + +## Execution Steps + +1. **Setup**: Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse JSON for FEATURE_DIR and AVAILABLE_DOCS list. + - All file paths must be absolute. + - For single quotes in args like "I'm Groot", use escape syntax: e.g 'I'\''m Groot' (or double-quote if possible: "I'm Groot"). + +2. **Clarify intent (dynamic)**: Derive up to THREE initial contextual clarifying questions (no pre-baked catalog). They MUST: + - Be generated from the user's phrasing + extracted signals from spec/plan/tasks + - Only ask about information that materially changes checklist content + - Be skipped individually if already unambiguous in `$ARGUMENTS` + - Prefer precision over breadth + + Generation algorithm: + 1. Extract signals: feature domain keywords (e.g., auth, latency, UX, API), risk indicators ("critical", "must", "compliance"), stakeholder hints ("QA", "review", "security team"), and explicit deliverables ("a11y", "rollback", "contracts"). + 2. Cluster signals into candidate focus areas (max 4) ranked by relevance. + 3. 
Identify probable audience & timing (author, reviewer, QA, release) if not explicit. + 4. Detect missing dimensions: scope breadth, depth/rigor, risk emphasis, exclusion boundaries, measurable acceptance criteria. + 5. Formulate questions chosen from these archetypes: + - Scope refinement (e.g., "Should this include integration touchpoints with X and Y or stay limited to local module correctness?") + - Risk prioritization (e.g., "Which of these potential risk areas should receive mandatory gating checks?") + - Depth calibration (e.g., "Is this a lightweight pre-commit sanity list or a formal release gate?") + - Audience framing (e.g., "Will this be used by the author only or peers during PR review?") + - Boundary exclusion (e.g., "Should we explicitly exclude performance tuning items this round?") + - Scenario class gap (e.g., "No recovery flows detected—are rollback / partial failure paths in scope?") + + Question formatting rules: + - If presenting options, generate a compact table with columns: Option | Candidate | Why It Matters + - Limit to A–E options maximum; omit table if a free-form answer is clearer + - Never ask the user to restate what they already said + - Avoid speculative categories (no hallucination). If uncertain, ask explicitly: "Confirm whether X belongs in scope." + + Defaults when interaction impossible: + - Depth: Standard + - Audience: Reviewer (PR) if code-related; Author otherwise + - Focus: Top 2 relevance clusters + + Output the questions (label Q1/Q2/Q3). After answers: if ≥2 scenario classes (Alternate / Exception / Recovery / Non-Functional domain) remain unclear, you MAY ask up to TWO more targeted follow‑ups (Q4/Q5) with a one-line justification each (e.g., "Unresolved recovery path risk"). Do not exceed five total questions. Skip escalation if user explicitly declines more. + +3. **Understand user request**: Combine `$ARGUMENTS` + clarifying answers: + - Derive checklist theme (e.g., security, review, deploy, ux) + - Consolidate explicit must-have items mentioned by user + - Map focus selections to category scaffolding + - Infer any missing context from spec/plan/tasks (do NOT hallucinate) + +4. **Load feature context**: Read from FEATURE_DIR: + - spec.md: Feature requirements and scope + - plan.md (if exists): Technical details, dependencies + - tasks.md (if exists): Implementation tasks + + **Context Loading Strategy**: + - Load only necessary portions relevant to active focus areas (avoid full-file dumping) + - Prefer summarizing long sections into concise scenario/requirement bullets + - Use progressive disclosure: add follow-on retrieval only if gaps detected + - If source docs are large, generate interim summary items instead of embedding raw text + +5. **Generate checklist** - Create "Unit Tests for Requirements": + - Create `FEATURE_DIR/checklists/` directory if it doesn't exist + - Generate unique checklist filename: + - Use short, descriptive name based on domain (e.g., `ux.md`, `api.md`, `security.md`) + - Format: `[domain].md` + - If file exists, append to existing file + - Number items sequentially starting from CHK001 + - Each `/speckit.checklist` run creates a NEW file (never overwrites existing checklists) + + **CORE PRINCIPLE - Test the Requirements, Not the Implementation**: + Every checklist item MUST evaluate the REQUIREMENTS THEMSELVES for: + - **Completeness**: Are all necessary requirements present? + - **Clarity**: Are requirements unambiguous and specific? + - **Consistency**: Do requirements align with each other? 
+ - **Measurability**: Can requirements be objectively verified? + - **Coverage**: Are all scenarios/edge cases addressed? + + **Category Structure** - Group items by requirement quality dimensions: + - **Requirement Completeness** (Are all necessary requirements documented?) + - **Requirement Clarity** (Are requirements specific and unambiguous?) + - **Requirement Consistency** (Do requirements align without conflicts?) + - **Acceptance Criteria Quality** (Are success criteria measurable?) + - **Scenario Coverage** (Are all flows/cases addressed?) + - **Edge Case Coverage** (Are boundary conditions defined?) + - **Non-Functional Requirements** (Performance, Security, Accessibility, etc. - are they specified?) + - **Dependencies & Assumptions** (Are they documented and validated?) + - **Ambiguities & Conflicts** (What needs clarification?) + + **HOW TO WRITE CHECKLIST ITEMS - "Unit Tests for English"**: + + ❌ **WRONG** (Testing implementation): + - "Verify landing page displays 3 episode cards" + - "Test hover states work on desktop" + - "Confirm logo click navigates home" + + ✅ **CORRECT** (Testing requirements quality): + - "Are the exact number and layout of featured episodes specified?" [Completeness] + - "Is 'prominent display' quantified with specific sizing/positioning?" [Clarity] + - "Are hover state requirements consistent across all interactive elements?" [Consistency] + - "Are keyboard navigation requirements defined for all interactive UI?" [Coverage] + - "Is the fallback behavior specified when logo image fails to load?" [Edge Cases] + - "Are loading states defined for asynchronous episode data?" [Completeness] + - "Does the spec define visual hierarchy for competing UI elements?" [Clarity] + + **ITEM STRUCTURE**: + Each item should follow this pattern: + - Question format asking about requirement quality + - Focus on what's WRITTEN (or not written) in the spec/plan + - Include quality dimension in brackets [Completeness/Clarity/Consistency/etc.] + - Reference spec section `[Spec §X.Y]` when checking existing requirements + - Use `[Gap]` marker when checking for missing requirements + + **EXAMPLES BY QUALITY DIMENSION**: + + Completeness: + - "Are error handling requirements defined for all API failure modes? [Gap]" + - "Are accessibility requirements specified for all interactive elements? [Completeness]" + - "Are mobile breakpoint requirements defined for responsive layouts? [Gap]" + + Clarity: + - "Is 'fast loading' quantified with specific timing thresholds? [Clarity, Spec §NFR-2]" + - "Are 'related episodes' selection criteria explicitly defined? [Clarity, Spec §FR-5]" + - "Is 'prominent' defined with measurable visual properties? [Ambiguity, Spec §FR-4]" + + Consistency: + - "Do navigation requirements align across all pages? [Consistency, Spec §FR-10]" + - "Are card component requirements consistent between landing and detail pages? [Consistency]" + + Coverage: + - "Are requirements defined for zero-state scenarios (no episodes)? [Coverage, Edge Case]" + - "Are concurrent user interaction scenarios addressed? [Coverage, Gap]" + - "Are requirements specified for partial data loading failures? [Coverage, Exception Flow]" + + Measurability: + - "Are visual hierarchy requirements measurable/testable? [Acceptance Criteria, Spec §FR-1]" + - "Can 'balanced visual weight' be objectively verified? 
[Measurability, Spec §FR-2]" + + **Scenario Classification & Coverage** (Requirements Quality Focus): + - Check if requirements exist for: Primary, Alternate, Exception/Error, Recovery, Non-Functional scenarios + - For each scenario class, ask: "Are [scenario type] requirements complete, clear, and consistent?" + - If scenario class missing: "Are [scenario type] requirements intentionally excluded or missing? [Gap]" + - Include resilience/rollback when state mutation occurs: "Are rollback requirements defined for migration failures? [Gap]" + + **Traceability Requirements**: + - MINIMUM: ≥80% of items MUST include at least one traceability reference + - Each item should reference: spec section `[Spec §X.Y]`, or use markers: `[Gap]`, `[Ambiguity]`, `[Conflict]`, `[Assumption]` + - If no ID system exists: "Is a requirement & acceptance criteria ID scheme established? [Traceability]" + + **Surface & Resolve Issues** (Requirements Quality Problems): + Ask questions about the requirements themselves: + - Ambiguities: "Is the term 'fast' quantified with specific metrics? [Ambiguity, Spec §NFR-1]" + - Conflicts: "Do navigation requirements conflict between §FR-10 and §FR-10a? [Conflict]" + - Assumptions: "Is the assumption of 'always available podcast API' validated? [Assumption]" + - Dependencies: "Are external podcast API requirements documented? [Dependency, Gap]" + - Missing definitions: "Is 'visual hierarchy' defined with measurable criteria? [Gap]" + + **Content Consolidation**: + - Soft cap: If raw candidate items > 40, prioritize by risk/impact + - Merge near-duplicates checking the same requirement aspect + - If >5 low-impact edge cases, create one item: "Are edge cases X, Y, Z addressed in requirements? [Coverage]" + + **🚫 ABSOLUTELY PROHIBITED** - These make it an implementation test, not a requirements test: + - ❌ Any item starting with "Verify", "Test", "Confirm", "Check" + implementation behavior + - ❌ References to code execution, user actions, system behavior + - ❌ "Displays correctly", "works properly", "functions as expected" + - ❌ "Click", "navigate", "render", "load", "execute" + - ❌ Test cases, test plans, QA procedures + - ❌ Implementation details (frameworks, APIs, algorithms) + + **✅ REQUIRED PATTERNS** - These test requirements quality: + - ✅ "Are [requirement type] defined/specified/documented for [scenario]?" + - ✅ "Is [vague term] quantified/clarified with specific criteria?" + - ✅ "Are requirements consistent between [section A] and [section B]?" + - ✅ "Can [requirement] be objectively measured/verified?" + - ✅ "Are [edge cases/scenarios] addressed in requirements?" + - ✅ "Does the spec define [missing aspect]?" + +6. **Structure Reference**: Generate the checklist following the canonical template in `.specify/templates/checklist-template.md` for title, meta section, category headings, and ID formatting. If template is unavailable, use: H1 title, purpose/created meta lines, `##` category sections containing `- [ ] CHK### ` lines with globally incrementing IDs starting at CHK001. + +7. **Report**: Output full path to created checklist, item count, and remind user that each run creates a new file. Summarize: + - Focus areas selected + - Depth level + - Actor/timing + - Any explicit user-specified must-have items incorporated + +**Important**: Each `/speckit.checklist` command invocation creates a checklist file using short, descriptive names unless file already exists. 
This allows: + +- Multiple checklists of different types (e.g., `ux.md`, `test.md`, `security.md`) +- Simple, memorable filenames that indicate checklist purpose +- Easy identification and navigation in the `checklists/` folder + +To avoid clutter, use descriptive types and clean up obsolete checklists when done. + +## Example Checklist Types & Sample Items + +**UX Requirements Quality:** `ux.md` + +Sample items (testing the requirements, NOT the implementation): + +- "Are visual hierarchy requirements defined with measurable criteria? [Clarity, Spec §FR-1]" +- "Is the number and positioning of UI elements explicitly specified? [Completeness, Spec §FR-1]" +- "Are interaction state requirements (hover, focus, active) consistently defined? [Consistency]" +- "Are accessibility requirements specified for all interactive elements? [Coverage, Gap]" +- "Is fallback behavior defined when images fail to load? [Edge Case, Gap]" +- "Can 'prominent display' be objectively measured? [Measurability, Spec §FR-4]" + +**API Requirements Quality:** `api.md` + +Sample items: + +- "Are error response formats specified for all failure scenarios? [Completeness]" +- "Are rate limiting requirements quantified with specific thresholds? [Clarity]" +- "Are authentication requirements consistent across all endpoints? [Consistency]" +- "Are retry/timeout requirements defined for external dependencies? [Coverage, Gap]" +- "Is versioning strategy documented in requirements? [Gap]" + +**Performance Requirements Quality:** `performance.md` + +Sample items: + +- "Are performance requirements quantified with specific metrics? [Clarity]" +- "Are performance targets defined for all critical user journeys? [Coverage]" +- "Are performance requirements under different load conditions specified? [Completeness]" +- "Can performance requirements be objectively measured? [Measurability]" +- "Are degradation requirements defined for high-load scenarios? [Edge Case, Gap]" + +**Security Requirements Quality:** `security.md` + +Sample items: + +- "Are authentication requirements specified for all protected resources? [Coverage]" +- "Are data protection requirements defined for sensitive information? [Completeness]" +- "Is the threat model documented and requirements aligned to it? [Traceability]" +- "Are security requirements consistent with compliance obligations? [Consistency]" +- "Are security failure/breach response requirements defined? [Gap, Exception Flow]" + +## Anti-Examples: What NOT To Do + +**❌ WRONG - These test implementation, not requirements:** + +```markdown +- [ ] CHK001 - Verify landing page displays 3 episode cards [Spec §FR-001] +- [ ] CHK002 - Test hover states work correctly on desktop [Spec §FR-003] +- [ ] CHK003 - Confirm logo click navigates to home page [Spec §FR-010] +- [ ] CHK004 - Check that related episodes section shows 3-5 items [Spec §FR-005] +``` + +**✅ CORRECT - These test requirements quality:** + +```markdown +- [ ] CHK001 - Are the number and layout of featured episodes explicitly specified? [Completeness, Spec §FR-001] +- [ ] CHK002 - Are hover state requirements consistently defined for all interactive elements? [Consistency, Spec §FR-003] +- [ ] CHK003 - Are navigation requirements clear for all clickable brand elements? [Clarity, Spec §FR-010] +- [ ] CHK004 - Is the selection criteria for related episodes documented? [Gap, Spec §FR-005] +- [ ] CHK005 - Are loading state requirements defined for asynchronous episode data? 
[Gap]
+- [ ] CHK006 - Can "visual hierarchy" requirements be objectively measured? [Measurability, Spec §FR-001]
+```
+
+**Key Differences:**
+
+- Wrong: Tests if the system works correctly
+- Correct: Tests if the requirements are written correctly
+- Wrong: Verification of behavior
+- Correct: Validation of requirement quality
+- Wrong: "Does it do X?"
+- Correct: "Is X clearly specified?"
diff --git a/.kilocode/workflows/speckit.clarify.md b/.kilocode/workflows/speckit.clarify.md
new file mode 100644
index 0000000..6b28dae
--- /dev/null
+++ b/.kilocode/workflows/speckit.clarify.md
@@ -0,0 +1,181 @@
+---
+description: Identify underspecified areas in the current feature spec by asking up to 5 highly targeted clarification questions and encoding answers back into the spec.
+handoffs:
+  - label: Build Technical Plan
+    agent: speckit.plan
+    prompt: Create a plan for the spec. I am building with...
+---

## User Input

```text
$ARGUMENTS
```

You **MUST** consider the user input before proceeding (if not empty).

## Outline

Goal: Detect and reduce ambiguity or missing decision points in the active feature specification and record the clarifications directly in the spec file.

Note: This clarification workflow is expected to run (and be completed) BEFORE invoking `/speckit.plan`. If the user explicitly states they are skipping clarification (e.g., an exploratory spike), you may proceed, but must warn that downstream rework risk increases.

Execution steps:

1. Run `.specify/scripts/bash/check-prerequisites.sh --json --paths-only` from repo root **once** (combined `--json --paths-only` mode / `-Json -PathsOnly`). Parse minimal JSON payload fields:
   - `FEATURE_DIR`
   - `FEATURE_SPEC`
   - (Optionally capture `IMPL_PLAN`, `TASKS` for future chained flows.)
   - If JSON parsing fails, abort and instruct the user to re-run `/speckit.specify` or verify the feature branch environment.
   - For single quotes in args like "I'm Groot", use escape syntax: e.g. 'I'\''m Groot' (or double-quote if possible: "I'm Groot").

2. Load the current spec file. Perform a structured ambiguity & coverage scan using this taxonomy. For each category, mark status: Clear / Partial / Missing. Produce an internal coverage map used for prioritization (do not output raw map unless no questions will be asked). 
+ + Functional Scope & Behavior: + - Core user goals & success criteria + - Explicit out-of-scope declarations + - User roles / personas differentiation + + Domain & Data Model: + - Entities, attributes, relationships + - Identity & uniqueness rules + - Lifecycle/state transitions + - Data volume / scale assumptions + + Interaction & UX Flow: + - Critical user journeys / sequences + - Error/empty/loading states + - Accessibility or localization notes + + Non-Functional Quality Attributes: + - Performance (latency, throughput targets) + - Scalability (horizontal/vertical, limits) + - Reliability & availability (uptime, recovery expectations) + - Observability (logging, metrics, tracing signals) + - Security & privacy (authN/Z, data protection, threat assumptions) + - Compliance / regulatory constraints (if any) + + Integration & External Dependencies: + - External services/APIs and failure modes + - Data import/export formats + - Protocol/versioning assumptions + + Edge Cases & Failure Handling: + - Negative scenarios + - Rate limiting / throttling + - Conflict resolution (e.g., concurrent edits) + + Constraints & Tradeoffs: + - Technical constraints (language, storage, hosting) + - Explicit tradeoffs or rejected alternatives + + Terminology & Consistency: + - Canonical glossary terms + - Avoided synonyms / deprecated terms + + Completion Signals: + - Acceptance criteria testability + - Measurable Definition of Done style indicators + + Misc / Placeholders: + - TODO markers / unresolved decisions + - Ambiguous adjectives ("robust", "intuitive") lacking quantification + + For each category with Partial or Missing status, add a candidate question opportunity unless: + - Clarification would not materially change implementation or validation strategy + - Information is better deferred to planning phase (note internally) + +3. Generate (internally) a prioritized queue of candidate clarification questions (maximum 5). Do NOT output them all at once. Apply these constraints: + - Maximum of 10 total questions across the whole session. + - Each question must be answerable with EITHER: + - A short multiple‑choice selection (2–5 distinct, mutually exclusive options), OR + - A one-word / short‑phrase answer (explicitly constrain: "Answer in <=5 words"). + - Only include questions whose answers materially impact architecture, data modeling, task decomposition, test design, UX behavior, operational readiness, or compliance validation. + - Ensure category coverage balance: attempt to cover the highest impact unresolved categories first; avoid asking two low-impact questions when a single high-impact area (e.g., security posture) is unresolved. + - Exclude questions already answered, trivial stylistic preferences, or plan-level execution details (unless blocking correctness). + - Favor clarifications that reduce downstream rework risk or prevent misaligned acceptance tests. + - If more than 5 categories remain unresolved, select the top 5 by (Impact * Uncertainty) heuristic. + +4. Sequential questioning loop (interactive): + - Present EXACTLY ONE question at a time. 
+ - For multiple‑choice questions: + - **Analyze all options** and determine the **most suitable option** based on: + - Best practices for the project type + - Common patterns in similar implementations + - Risk reduction (security, performance, maintainability) + - Alignment with any explicit project goals or constraints visible in the spec + - Present your **recommended option prominently** at the top with clear reasoning (1-2 sentences explaining why this is the best choice). + - Format as: `**Recommended:** Option [X] - ` + - Then render all options as a Markdown table: + + | Option | Description | + |--------|-------------| + | A | `); + if (head) { + renderer.head((child) => child.push(head)); + } + }; + if (typeof body === "function") { + this.child((renderer) => { + const r = new Renderer(this.global, this); + body(r); + if (this.global.mode === "async") { + return r.#collect_content_async().then((content) => { + close(renderer, content.body.replaceAll("", ""), content); + }); + } else { + const content = r.#collect_content(); + close(renderer, content.body.replaceAll("", ""), content); + } + }); + } else { + close(this, body, { body }); + } + } + /** + * @param {(renderer: Renderer) => void} fn + */ + title(fn) { + const path = this.get_path(); + const close = (head) => { + this.global.set_title(head, path); + }; + this.child((renderer) => { + const r = new Renderer(renderer.global, renderer); + fn(r); + if (renderer.global.mode === "async") { + return r.#collect_content_async().then((content) => { + close(content.head); + }); + } else { + const content = r.#collect_content(); + close(content.head); + } + }); + } + /** + * @param {string | (() => Promise)} content + */ + push(content) { + if (typeof content === "function") { + this.child(async (renderer) => renderer.push(await content())); + } else { + this.#out.push(content); + } + } + /** + * @param {() => void} fn + */ + on_destroy(fn) { + (this.#on_destroy ??= []).push(fn); + } + /** + * @returns {number[]} + */ + get_path() { + return this.#parent ? [...this.#parent.get_path(), this.#parent.#out.indexOf(this)] : []; + } + /** + * @deprecated this is needed for legacy component bindings + */ + copy() { + const copy = new Renderer(this.global, this.#parent); + copy.#out = this.#out.map((item) => item instanceof Renderer ? item.copy() : item); + copy.promise = this.promise; + return copy; + } + /** + * @param {Renderer} other + * @deprecated this is needed for legacy component bindings + */ + subsume(other) { + if (this.global.mode !== other.global.mode) { + throw new Error( + "invariant: A renderer cannot switch modes. If you're seeing this, there's a compiler bug. File an issue!" + ); + } + this.local = other.local; + this.#out = other.#out.map((item) => { + if (item instanceof Renderer) { + item.subsume(item); + } + return item; + }); + this.promise = other.promise; + this.type = other.type; + } + get length() { + return this.#out.length; + } + /** + * Only available on the server and when compiling with the `server` option. + * Takes a component and returns an object with `body` and `head` properties on it, which you can use to populate the HTML when server-rendering your app. 
+ * @template {Record} Props + * @param {Component} component + * @param {{ props?: Omit; context?: Map; idPrefix?: string; csp?: Csp }} [options] + * @returns {RenderOutput} + */ + static render(component, options = {}) { + let sync; + const result = ( + /** @type {RenderOutput} */ + {} + ); + Object.defineProperties(result, { + html: { + get: () => { + return (sync ??= Renderer.#render(component, options)).body; + } + }, + head: { + get: () => { + return (sync ??= Renderer.#render(component, options)).head; + } + }, + body: { + get: () => { + return (sync ??= Renderer.#render(component, options)).body; + } + }, + hashes: { + value: { + script: "" + } + }, + then: { + value: ( + /** + * this is not type-safe, but honestly it's the best I can do right now, and it's a straightforward function. + * + * @template TResult1 + * @template [TResult2=never] + * @param { (value: SyncRenderOutput) => TResult1 } onfulfilled + * @param { (reason: unknown) => TResult2 } onrejected + */ + (onfulfilled, onrejected) => { + { + const result2 = sync ??= Renderer.#render(component, options); + const user_result = onfulfilled({ + head: result2.head, + body: result2.body, + html: result2.body, + hashes: { script: [] } + }); + return Promise.resolve(user_result); + } + } + ) + } + }); + return result; + } + /** + * Collect all of the `onDestroy` callbacks registered during rendering. In an async context, this is only safe to call + * after awaiting `collect_async`. + * + * Child renderers are "porous" and don't affect execution order, but component body renderers + * create ordering boundaries. Within a renderer, callbacks run in order until hitting a component boundary. + * @returns {Iterable<() => void>} + */ + *#collect_on_destroy() { + for (const component of this.#traverse_components()) { + yield* component.#collect_ondestroy(); + } + } + /** + * Performs a depth-first search of renderers, yielding the deepest components first, then additional components as we backtrack up the tree. + * @returns {Iterable} + */ + *#traverse_components() { + for (const child of this.#out) { + if (typeof child !== "string") { + yield* child.#traverse_components(); + } + } + if (this.#is_component_body) { + yield this; + } + } + /** + * @returns {Iterable<() => void>} + */ + *#collect_ondestroy() { + if (this.#on_destroy) { + for (const fn of this.#on_destroy) { + yield fn; + } + } + for (const child of this.#out) { + if (child instanceof Renderer && !child.#is_component_body) { + yield* child.#collect_ondestroy(); + } + } + } + /** + * Render a component. Throws if any of the children are performing asynchronous work. + * + * @template {Record} Props + * @param {Component} component + * @param {{ props?: Omit; context?: Map; idPrefix?: string }} options + * @returns {AccumulatedContent} + */ + static #render(component, options) { + var previous_context = ssr_context; + try { + const renderer = Renderer.#open_render("sync", component, options); + const content = renderer.#collect_content(); + return Renderer.#close_render(content, renderer); + } finally { + abort(); + set_ssr_context(previous_context); + } + } + /** + * Render a component. 
+ * + * @template {Record} Props + * @param {Component} component + * @param {{ props?: Omit; context?: Map; idPrefix?: string; csp?: Csp }} options + * @returns {Promise} + */ + static async #render_async(component, options) { + const previous_context = ssr_context; + try { + const renderer = Renderer.#open_render("async", component, options); + const content = await renderer.#collect_content_async(); + const hydratables = await renderer.#collect_hydratables(); + if (hydratables !== null) { + content.head = hydratables + content.head; + } + return Renderer.#close_render(content, renderer); + } finally { + set_ssr_context(previous_context); + abort(); + } + } + /** + * Collect all of the code from the `out` array and return it as a string, or a promise resolving to a string. + * @param {AccumulatedContent} content + * @returns {AccumulatedContent} + */ + #collect_content(content = { head: "", body: "" }) { + for (const item of this.#out) { + if (typeof item === "string") { + content[this.type] += item; + } else if (item instanceof Renderer) { + item.#collect_content(content); + } + } + return content; + } + /** + * Collect all of the code from the `out` array and return it as a string. + * @param {AccumulatedContent} content + * @returns {Promise} + */ + async #collect_content_async(content = { head: "", body: "" }) { + await this.promise; + for (const item of this.#out) { + if (typeof item === "string") { + content[this.type] += item; + } else if (item instanceof Renderer) { + await item.#collect_content_async(content); + } + } + return content; + } + async #collect_hydratables() { + const ctx = get_render_context().hydratable; + for (const [_, key] of ctx.unresolved_promises) { + unresolved_hydratable(key, ctx.lookup.get(key)?.stack ?? ""); + } + for (const comparison of ctx.comparisons) { + await comparison; + } + return await this.#hydratable_block(ctx); + } + /** + * @template {Record} Props + * @param {'sync' | 'async'} mode + * @param {import('svelte').Component} component + * @param {{ props?: Omit; context?: Map; idPrefix?: string; csp?: Csp }} options + * @returns {Renderer} + */ + static #open_render(mode, component, options) { + const renderer = new Renderer( + new SSRState(mode, options.idPrefix ? options.idPrefix + "-" : "", options.csp) + ); + renderer.push(BLOCK_OPEN); + if (options.context) { + push(); + ssr_context.c = options.context; + ssr_context.r = renderer; + } + component(renderer, options.props ?? 
{}); + if (options.context) { + pop(); + } + renderer.push(BLOCK_CLOSE); + return renderer; + } + /** + * @param {AccumulatedContent} content + * @param {Renderer} renderer + * @returns {AccumulatedContent & { hashes: { script: Sha256Source[] } }} + */ + static #close_render(content, renderer) { + for (const cleanup of renderer.#collect_on_destroy()) { + cleanup(); + } + let head = content.head + renderer.global.get_title(); + let body = content.body; + for (const { hash, code } of renderer.global.css) { + head += `<style id="${hash}">${code}</style>`; + } + return { + head, + body, + hashes: { + script: renderer.global.csp.script_hashes + } + }; + } + /** + * @param {HydratableContext} ctx + */ + async #hydratable_block(ctx) { + if (ctx.lookup.size === 0) { + return null; + } + let entries = []; + let has_promises = false; + for (const [k, v] of ctx.lookup) { + if (v.promises) { + has_promises = true; + for (const p of v.promises) await p; + } + entries.push(`[${JSON.stringify(k)},${v.serialized}]`); + } + let prelude = `const h = (window.__svelte ??= {}).h ??= new Map();`; + if (has_promises) { + prelude = `const r = (v) => Promise.resolve(v); + ${prelude}`; + } + const body = ` + { + ${prelude} + + for (const [k, v] of [ + ${entries.join(",\n ")} + ]) { + h.set(k, v); + } + } + `; + let csp_attr = ""; + if (this.global.csp.nonce) { + csp_attr = ` nonce="${this.global.csp.nonce}"`; + } else if (this.global.csp.hash) { + const hash = await sha256(body); + this.global.csp.script_hashes.push(`sha256-${hash}`); + } + return `<script${csp_attr}>${body}<\/script>`; + } +} +class SSRState { + /** @readonly @type {Csp & { script_hashes: Sha256Source[] }} */ + csp; + /** @readonly @type {'sync' | 'async'} */ + mode; + /** @readonly @type {() => string} */ + uid; + /** @readonly @type {Set<{ hash: string; code: string }>} */ + css = /* @__PURE__ */ new Set(); + /** @type {{ path: number[], value: string }} */ + #title = { path: [], value: "" }; + /** + * @param {'sync' | 'async'} mode + * @param {string} id_prefix + * @param {Csp} csp + */ + constructor(mode, id_prefix = "", csp = { hash: false }) { + this.mode = mode; + this.csp = { ...csp, script_hashes: [] }; + let uid = 1; + this.uid = () => `${id_prefix}s${uid++}`; + } + get_title() { + return this.#title.value; + } + /** + * Performs a depth-first (lexicographic) comparison using the path. Rejects sets + * from earlier than or equal to the current value. 
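+ *
+ * For example, with a current path of [1, 2]: [1, 3] and [1, 2, 0] win
+ * (later sibling, deeper child), while [1, 2], [1, 1] and [1] are rejected.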
+ * @param {string} value + * @param {number[]} path + */ + set_title(value, path) { + const current = this.#title.path; + let i = 0; + let l = Math.min(path.length, current.length); + while (i < l && path[i] === current[i]) i += 1; + if (path[i] === void 0) return; + if (current[i] === void 0 || path[i] > current[i]) { + this.#title.path = path; + this.#title.value = value; + } + } +} +const INVALID_ATTR_NAME_CHAR_REGEX = /[\s'">/=\u{FDD0}-\u{FDEF}\u{FFFE}\u{FFFF}\u{1FFFE}\u{1FFFF}\u{2FFFE}\u{2FFFF}\u{3FFFE}\u{3FFFF}\u{4FFFE}\u{4FFFF}\u{5FFFE}\u{5FFFF}\u{6FFFE}\u{6FFFF}\u{7FFFE}\u{7FFFF}\u{8FFFE}\u{8FFFF}\u{9FFFE}\u{9FFFF}\u{AFFFE}\u{AFFFF}\u{BFFFE}\u{BFFFF}\u{CFFFE}\u{CFFFF}\u{DFFFE}\u{DFFFF}\u{EFFFE}\u{EFFFF}\u{FFFFE}\u{FFFFF}\u{10FFFE}\u{10FFFF}]/u; +function render(component, options = {}) { + if (options.csp?.hash && options.csp.nonce) { + invalid_csp(); + } + return Renderer.render( + /** @type {Component} */ + component, + options + ); +} +function attributes(attrs, css_hash, classes, styles, flags = 0) { + if (styles) { + attrs.style = to_style(attrs.style, styles); + } + if (attrs.class) { + attrs.class = clsx(attrs.class); + } + if (css_hash || classes) { + attrs.class = to_class(attrs.class, css_hash, classes); + } + let attr_str = ""; + let name; + const is_html = (flags & ELEMENT_IS_NAMESPACED) === 0; + const lowercase = (flags & ELEMENT_PRESERVE_ATTRIBUTE_CASE) === 0; + const is_input = (flags & ELEMENT_IS_INPUT) !== 0; + for (name in attrs) { + if (typeof attrs[name] === "function") continue; + if (name[0] === "$" && name[1] === "$") continue; + if (INVALID_ATTR_NAME_CHAR_REGEX.test(name)) continue; + var value = attrs[name]; + if (lowercase) { + name = name.toLowerCase(); + } + if (is_input) { + if (name === "defaultvalue" || name === "defaultchecked") { + name = name === "defaultvalue" ? "value" : "checked"; + if (attrs[name]) continue; + } + } + attr_str += attr(name, value, is_html && is_boolean_attribute(name)); + } + return attr_str; +} +function stringify(value) { + return typeof value === "string" ? value : value == null ? "" : value + ""; +} +function attr_class(value, hash, directives) { + var result = to_class(value, hash, directives); + return result ? ` class="${escape_html(result, true)}"` : ""; +} +function store_get(store_values, store_name, store) { + if (store_name in store_values && store_values[store_name][0] === store) { + return store_values[store_name][2]; + } + store_values[store_name]?.[1](); + store_values[store_name] = [store, null, void 0]; + const unsub = subscribe_to_store( + store, + /** @param {any} v */ + (v) => store_values[store_name][2] = v + ); + store_values[store_name][1] = unsub; + return store_values[store_name][2]; +} +function unsubscribe_stores(store_values) { + for (const store_name in store_values) { + store_values[store_name][1](); + } +} +function slot(renderer, $$props, name, slot_props, fallback_fn) { + var slot_fn = $$props.$$slots?.[name]; + if (slot_fn === true) { + slot_fn = $$props["children"]; + } + if (slot_fn !== void 0) { + slot_fn(renderer, slot_props); + } +} +function bind_props(props_parent, props_now) { + for (const key in props_now) { + const initial_value = props_parent[key]; + const value = props_now[key]; + if (initial_value === void 0 && value !== void 0 && Object.getOwnPropertyDescriptor(props_parent, key)?.set) { + props_parent[key] = value; + } + } +} +function ensure_array_like(array_like_or_iterator) { + if (array_like_or_iterator) { + return array_like_or_iterator.length !== void 0 ? 
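+ /*
+ store_get above caches one [store, unsubscribe, value] triple per store
+ name: repeated reads of the same store during a render reuse the cached
+ value, while a different store under the same name tears down the old
+ subscription and resubscribes; unsubscribe_stores drops whatever remains
+ when the component finishes rendering.
+ */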
array_like_or_iterator : Array.from(array_like_or_iterator); + } + return []; +} +export { + slot as $, + svelte_boundary_reset_onerror as A, + Batch as B, + COMMENT_NODE as C, + EFFECT_PRESERVED as D, + EFFECT_TRANSPARENT as E, + BOUNDARY_EFFECT as F, + init_operations as G, + HYDRATION_ERROR as H, + get_first_child as I, + hydration_failed as J, + clear_text_content as K, + component_root as L, + is_passive_event as M, + push$1 as N, + pop$1 as O, + set as P, + LEGACY_PROPS as Q, + flushSync as R, + mutable_source as S, + render as T, + setContext as U, + attr_class as V, + stringify as W, + store_get as X, + unsubscribe_stores as Y, + ensure_array_like as Z, + escape_html as _, + HYDRATION_END as a, + getContext as a0, + ssr_context as a1, + attr as a2, + bind_props as a3, + HYDRATION_START as b, + HYDRATION_START_ELSE as c, + get as d, + effect_tracking as e, + active_effect as f, + get_next_sibling as g, + block as h, + increment as i, + branch as j, + create_text as k, + set_active_effect as l, + set_active_reaction as m, + set_component_context as n, + handle_error as o, + pause_effect as p, + queue_micro_task as q, + render_effect as r, + source as s, + active_reaction as t, + untrack as u, + component_context as v, + move_effect as w, + internal_set as x, + destroy_effect as y, + invoke_error_boundary as z +}; diff --git a/frontend/.svelte-kit/output/server/chunks/internal.js b/frontend/.svelte-kit/output/server/chunks/internal.js new file mode 100644 index 0000000..06b5acd --- /dev/null +++ b/frontend/.svelte-kit/output/server/chunks/internal.js @@ -0,0 +1,982 @@ +import { H as HYDRATION_ERROR, C as COMMENT_NODE, a as HYDRATION_END, g as get_next_sibling, b as HYDRATION_START, c as HYDRATION_START_ELSE, e as effect_tracking, d as get, s as source, r as render_effect, u as untrack, i as increment, q as queue_micro_task, f as active_effect, h as block, j as branch, B as Batch, p as pause_effect, k as create_text, l as set_active_effect, m as set_active_reaction, n as set_component_context, o as handle_error, t as active_reaction, v as component_context, w as move_effect, x as internal_set, y as destroy_effect, z as invoke_error_boundary, A as svelte_boundary_reset_onerror, E as EFFECT_TRANSPARENT, D as EFFECT_PRESERVED, F as BOUNDARY_EFFECT, G as init_operations, I as get_first_child, J as hydration_failed, K as clear_text_content, L as component_root, M as is_passive_event, N as push, O as pop, P as set, Q as LEGACY_PROPS, R as flushSync, S as mutable_source, T as render, U as setContext } from "./index2.js"; +import { d as define_property, a as array_from } from "./equality.js"; +import "clsx"; +import "./environment.js"; +let public_env = {}; +function set_private_env(environment) { +} +function set_public_env(environment) { + public_env = environment; +} +function hydration_mismatch(location) { + { + console.warn(`https://svelte.dev/e/hydration_mismatch`); + } +} +function svelte_boundary_reset_noop() { + { + console.warn(`https://svelte.dev/e/svelte_boundary_reset_noop`); + } +} +let hydrating = false; +function set_hydrating(value) { + hydrating = value; +} +let hydrate_node; +function set_hydrate_node(node) { + if (node === null) { + hydration_mismatch(); + throw HYDRATION_ERROR; + } + return hydrate_node = node; +} +function hydrate_next() { + return set_hydrate_node(get_next_sibling(hydrate_node)); +} +function next(count = 1) { + if (hydrating) { + var i = count; + var node = hydrate_node; + while (i--) { + node = /** @type {TemplateNode} */ + get_next_sibling(node); + } 
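+ // commit the advanced cursor; `next` only skips over siblings and does not
+ // validate them (mismatches surface later via set_hydrate_node, which
+ // throws HYDRATION_ERROR on null)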
+ hydrate_node = node; + } +} +function skip_nodes(remove = true) { + var depth = 0; + var node = hydrate_node; + while (true) { + if (node.nodeType === COMMENT_NODE) { + var data = ( + /** @type {Comment} */ + node.data + ); + if (data === HYDRATION_END) { + if (depth === 0) return node; + depth -= 1; + } else if (data === HYDRATION_START || data === HYDRATION_START_ELSE) { + depth += 1; + } + } + var next2 = ( + /** @type {TemplateNode} */ + get_next_sibling(node) + ); + if (remove) node.remove(); + node = next2; + } +} +function createSubscriber(start) { + let subscribers = 0; + let version = source(0); + let stop; + return () => { + if (effect_tracking()) { + get(version); + render_effect(() => { + if (subscribers === 0) { + stop = untrack(() => start(() => increment(version))); + } + subscribers += 1; + return () => { + queue_micro_task(() => { + subscribers -= 1; + if (subscribers === 0) { + stop?.(); + stop = void 0; + increment(version); + } + }); + }; + }); + } + }; +} +var flags = EFFECT_TRANSPARENT | EFFECT_PRESERVED | BOUNDARY_EFFECT; +function boundary(node, props, children) { + new Boundary(node, props, children); +} +class Boundary { + /** @type {Boundary | null} */ + parent; + #pending = false; + /** @type {TemplateNode} */ + #anchor; + /** @type {TemplateNode | null} */ + #hydrate_open = hydrating ? hydrate_node : null; + /** @type {BoundaryProps} */ + #props; + /** @type {((anchor: Node) => void)} */ + #children; + /** @type {Effect} */ + #effect; + /** @type {Effect | null} */ + #main_effect = null; + /** @type {Effect | null} */ + #pending_effect = null; + /** @type {Effect | null} */ + #failed_effect = null; + /** @type {DocumentFragment | null} */ + #offscreen_fragment = null; + /** @type {TemplateNode | null} */ + #pending_anchor = null; + #local_pending_count = 0; + #pending_count = 0; + #is_creating_fallback = false; + /** + * A source containing the number of pending async deriveds/expressions. 
+ * Only created if `$effect.pending()` is used inside the boundary, + * otherwise updating the source results in needless `Batch.ensure()` + * calls followed by no-op flushes + * @type {Source | null} + */ + #effect_pending = null; + #effect_pending_subscriber = createSubscriber(() => { + this.#effect_pending = source(this.#local_pending_count); + return () => { + this.#effect_pending = null; + }; + }); + /** + * @param {TemplateNode} node + * @param {BoundaryProps} props + * @param {((anchor: Node) => void)} children + */ + constructor(node, props, children) { + this.#anchor = node; + this.#props = props; + this.#children = children; + this.parent = /** @type {Effect} */ + active_effect.b; + this.#pending = !!this.#props.pending; + this.#effect = block(() => { + active_effect.b = this; + if (hydrating) { + const comment = this.#hydrate_open; + hydrate_next(); + const server_rendered_pending = ( + /** @type {Comment} */ + comment.nodeType === COMMENT_NODE && /** @type {Comment} */ + comment.data === HYDRATION_START_ELSE + ); + if (server_rendered_pending) { + this.#hydrate_pending_content(); + } else { + this.#hydrate_resolved_content(); + } + } else { + var anchor = this.#get_anchor(); + try { + this.#main_effect = branch(() => children(anchor)); + } catch (error) { + this.error(error); + } + if (this.#pending_count > 0) { + this.#show_pending_snippet(); + } else { + this.#pending = false; + } + } + return () => { + this.#pending_anchor?.remove(); + }; + }, flags); + if (hydrating) { + this.#anchor = hydrate_node; + } + } + #hydrate_resolved_content() { + try { + this.#main_effect = branch(() => this.#children(this.#anchor)); + } catch (error) { + this.error(error); + } + this.#pending = false; + } + #hydrate_pending_content() { + const pending = this.#props.pending; + if (!pending) { + return; + } + this.#pending_effect = branch(() => pending(this.#anchor)); + Batch.enqueue(() => { + var anchor = this.#get_anchor(); + this.#main_effect = this.#run(() => { + Batch.ensure(); + return branch(() => this.#children(anchor)); + }); + if (this.#pending_count > 0) { + this.#show_pending_snippet(); + } else { + pause_effect( + /** @type {Effect} */ + this.#pending_effect, + () => { + this.#pending_effect = null; + } + ); + this.#pending = false; + } + }); + } + #get_anchor() { + var anchor = this.#anchor; + if (this.#pending) { + this.#pending_anchor = create_text(); + this.#anchor.before(this.#pending_anchor); + anchor = this.#pending_anchor; + } + return anchor; + } + /** + * Returns `true` if the effect exists inside a boundary whose pending snippet is shown + * @returns {boolean} + */ + is_pending() { + return this.#pending || !!this.parent && this.parent.is_pending(); + } + has_pending_snippet() { + return !!this.#props.pending; + } + /** + * @param {() => Effect | null} fn + */ + #run(fn) { + var previous_effect = active_effect; + var previous_reaction = active_reaction; + var previous_ctx = component_context; + set_active_effect(this.#effect); + set_active_reaction(this.#effect); + set_component_context(this.#effect.ctx); + try { + return fn(); + } catch (e) { + handle_error(e); + return null; + } finally { + set_active_effect(previous_effect); + set_active_reaction(previous_reaction); + set_component_context(previous_ctx); + } + } + #show_pending_snippet() { + const pending = ( + /** @type {(anchor: Node) => void} */ + this.#props.pending + ); + if (this.#main_effect !== null) { + this.#offscreen_fragment = document.createDocumentFragment(); + this.#offscreen_fragment.append( + /** @type 
{TemplateNode} */ + this.#pending_anchor + ); + move_effect(this.#main_effect, this.#offscreen_fragment); + } + if (this.#pending_effect === null) { + this.#pending_effect = branch(() => pending(this.#anchor)); + } + } + /** + * Updates the pending count associated with the currently visible pending snippet, + * if any, such that we can replace the snippet with content once work is done + * @param {1 | -1} d + */ + #update_pending_count(d) { + if (!this.has_pending_snippet()) { + if (this.parent) { + this.parent.#update_pending_count(d); + } + return; + } + this.#pending_count += d; + if (this.#pending_count === 0) { + this.#pending = false; + if (this.#pending_effect) { + pause_effect(this.#pending_effect, () => { + this.#pending_effect = null; + }); + } + if (this.#offscreen_fragment) { + this.#anchor.before(this.#offscreen_fragment); + this.#offscreen_fragment = null; + } + } + } + /** + * Update the source that powers `$effect.pending()` inside this boundary, + * and controls when the current `pending` snippet (if any) is removed. + * Do not call from inside the class + * @param {1 | -1} d + */ + update_pending_count(d) { + this.#update_pending_count(d); + this.#local_pending_count += d; + if (this.#effect_pending) { + internal_set(this.#effect_pending, this.#local_pending_count); + } + } + get_effect_pending() { + this.#effect_pending_subscriber(); + return get( + /** @type {Source} */ + this.#effect_pending + ); + } + /** @param {unknown} error */ + error(error) { + var onerror = this.#props.onerror; + let failed = this.#props.failed; + if (this.#is_creating_fallback || !onerror && !failed) { + throw error; + } + if (this.#main_effect) { + destroy_effect(this.#main_effect); + this.#main_effect = null; + } + if (this.#pending_effect) { + destroy_effect(this.#pending_effect); + this.#pending_effect = null; + } + if (this.#failed_effect) { + destroy_effect(this.#failed_effect); + this.#failed_effect = null; + } + if (hydrating) { + set_hydrate_node( + /** @type {TemplateNode} */ + this.#hydrate_open + ); + next(); + set_hydrate_node(skip_nodes()); + } + var did_reset = false; + var calling_on_error = false; + const reset = () => { + if (did_reset) { + svelte_boundary_reset_noop(); + return; + } + did_reset = true; + if (calling_on_error) { + svelte_boundary_reset_onerror(); + } + Batch.ensure(); + this.#local_pending_count = 0; + if (this.#failed_effect !== null) { + pause_effect(this.#failed_effect, () => { + this.#failed_effect = null; + }); + } + this.#pending = this.has_pending_snippet(); + this.#main_effect = this.#run(() => { + this.#is_creating_fallback = false; + return branch(() => this.#children(this.#anchor)); + }); + if (this.#pending_count > 0) { + this.#show_pending_snippet(); + } else { + this.#pending = false; + } + }; + var previous_reaction = active_reaction; + try { + set_active_reaction(null); + calling_on_error = true; + onerror?.(error, reset); + calling_on_error = false; + } catch (error2) { + invoke_error_boundary(error2, this.#effect && this.#effect.parent); + } finally { + set_active_reaction(previous_reaction); + } + if (failed) { + queue_micro_task(() => { + this.#failed_effect = this.#run(() => { + Batch.ensure(); + this.#is_creating_fallback = true; + try { + return branch(() => { + failed( + this.#anchor, + () => error, + () => reset + ); + }); + } catch (error2) { + invoke_error_boundary( + error2, + /** @type {Effect} */ + this.#effect.parent + ); + return null; + } finally { + this.#is_creating_fallback = false; + } + }); + }); + } + } +} +const 
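+ /*
+ Pending-count bookkeeping above: a boundary with no `pending` snippet of
+ its own forwards +1/-1 to its parent, so the nearest boundary that can
+ actually show a fallback owns the count. When that count reaches zero, the
+ offscreen fragment is moved back before the anchor and the fallback effect
+ is paused.
+ */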
all_registered_events = /* @__PURE__ */ new Set(); +const root_event_handles = /* @__PURE__ */ new Set(); +let last_propagated_event = null; +function handle_event_propagation(event) { + var handler_element = this; + var owner_document = ( + /** @type {Node} */ + handler_element.ownerDocument + ); + var event_name = event.type; + var path = event.composedPath?.() || []; + var current_target = ( + /** @type {null | Element} */ + path[0] || event.target + ); + last_propagated_event = event; + var path_idx = 0; + var handled_at = last_propagated_event === event && event.__root; + if (handled_at) { + var at_idx = path.indexOf(handled_at); + if (at_idx !== -1 && (handler_element === document || handler_element === /** @type {any} */ + window)) { + event.__root = handler_element; + return; + } + var handler_idx = path.indexOf(handler_element); + if (handler_idx === -1) { + return; + } + if (at_idx <= handler_idx) { + path_idx = at_idx; + } + } + current_target = /** @type {Element} */ + path[path_idx] || event.target; + if (current_target === handler_element) return; + define_property(event, "currentTarget", { + configurable: true, + get() { + return current_target || owner_document; + } + }); + var previous_reaction = active_reaction; + var previous_effect = active_effect; + set_active_reaction(null); + set_active_effect(null); + try { + var throw_error; + var other_errors = []; + while (current_target !== null) { + var parent_element = current_target.assignedSlot || current_target.parentNode || /** @type {any} */ + current_target.host || null; + try { + var delegated = current_target["__" + event_name]; + if (delegated != null && (!/** @type {any} */ + current_target.disabled || // DOM could've been updated already by the time this is reached, so we check this as well + // -> the target could not have been disabled because it emits the event in the first place + event.target === current_target)) { + delegated.call(current_target, event); + } + } catch (error) { + if (throw_error) { + other_errors.push(error); + } else { + throw_error = error; + } + } + if (event.cancelBubble || parent_element === handler_element || parent_element === null) { + break; + } + current_target = parent_element; + } + if (throw_error) { + for (let error of other_errors) { + queueMicrotask(() => { + throw error; + }); + } + throw throw_error; + } + } finally { + event.__root = handler_element; + delete event.currentTarget; + set_active_reaction(previous_reaction); + set_active_effect(previous_effect); + } +} +function assign_nodes(start, end) { + var effect = ( + /** @type {Effect} */ + active_effect + ); + if (effect.nodes === null) { + effect.nodes = { start, end, a: null, t: null }; + } +} +function mount(component, options2) { + return _mount(component, options2); +} +function hydrate(component, options2) { + init_operations(); + options2.intro = options2.intro ?? 
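+ /*
+ handle_event_propagation above implements Svelte's delegated events: the
+ handler lives on the element itself as a `__<eventname>` property and a
+ single root listener walks up the composed path. A simplified sketch
+ (illustrative; it ignores shadow DOM and the `__root` replay bookkeeping):
+
+   document.addEventListener("click", (event) => {
+     let node = event.target;
+     while (node && !event.cancelBubble) {
+       node["__click"]?.call(node, event);
+       node = node.parentNode;
+     }
+   });
+ */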
false; + const target = options2.target; + const was_hydrating = hydrating; + const previous_hydrate_node = hydrate_node; + try { + var anchor = get_first_child(target); + while (anchor && (anchor.nodeType !== COMMENT_NODE || /** @type {Comment} */ + anchor.data !== HYDRATION_START)) { + anchor = get_next_sibling(anchor); + } + if (!anchor) { + throw HYDRATION_ERROR; + } + set_hydrating(true); + set_hydrate_node( + /** @type {Comment} */ + anchor + ); + const instance = _mount(component, { ...options2, anchor }); + set_hydrating(false); + return ( + /** @type {Exports} */ + instance + ); + } catch (error) { + if (error instanceof Error && error.message.split("\n").some((line) => line.startsWith("https://svelte.dev/e/"))) { + throw error; + } + if (error !== HYDRATION_ERROR) { + console.warn("Failed to hydrate: ", error); + } + if (options2.recover === false) { + hydration_failed(); + } + init_operations(); + clear_text_content(target); + set_hydrating(false); + return mount(component, options2); + } finally { + set_hydrating(was_hydrating); + set_hydrate_node(previous_hydrate_node); + } +} +const document_listeners = /* @__PURE__ */ new Map(); +function _mount(Component, { target, anchor, props = {}, events, context, intro = true }) { + init_operations(); + var registered_events = /* @__PURE__ */ new Set(); + var event_handle = (events2) => { + for (var i = 0; i < events2.length; i++) { + var event_name = events2[i]; + if (registered_events.has(event_name)) continue; + registered_events.add(event_name); + var passive = is_passive_event(event_name); + target.addEventListener(event_name, handle_event_propagation, { passive }); + var n = document_listeners.get(event_name); + if (n === void 0) { + document.addEventListener(event_name, handle_event_propagation, { passive }); + document_listeners.set(event_name, 1); + } else { + document_listeners.set(event_name, n + 1); + } + } + }; + event_handle(array_from(all_registered_events)); + root_event_handles.add(event_handle); + var component = void 0; + var unmount2 = component_root(() => { + var anchor_node = anchor ?? 
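+ /*
+ Listener registration above is reference-counted: each event name gets one
+ listener on the mount target plus one shared document-level listener, and
+ document_listeners tracks how many mounted roots still need the shared one;
+ the cleanup closure below decrements that count and removes the listener
+ when it reaches zero.
+ */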
target.appendChild(create_text()); + boundary( + /** @type {TemplateNode} */ + anchor_node, + { + pending: () => { + } + }, + (anchor_node2) => { + if (context) { + push({}); + var ctx = ( + /** @type {ComponentContext} */ + component_context + ); + ctx.c = context; + } + if (events) { + props.$$events = events; + } + if (hydrating) { + assign_nodes( + /** @type {TemplateNode} */ + anchor_node2, + null + ); + } + component = Component(anchor_node2, props) || {}; + if (hydrating) { + active_effect.nodes.end = hydrate_node; + if (hydrate_node === null || hydrate_node.nodeType !== COMMENT_NODE || /** @type {Comment} */ + hydrate_node.data !== HYDRATION_END) { + hydration_mismatch(); + throw HYDRATION_ERROR; + } + } + if (context) { + pop(); + } + } + ); + return () => { + for (var event_name of registered_events) { + target.removeEventListener(event_name, handle_event_propagation); + var n = ( + /** @type {number} */ + document_listeners.get(event_name) + ); + if (--n === 0) { + document.removeEventListener(event_name, handle_event_propagation); + document_listeners.delete(event_name); + } else { + document_listeners.set(event_name, n); + } + } + root_event_handles.delete(event_handle); + if (anchor_node !== anchor) { + anchor_node.parentNode?.removeChild(anchor_node); + } + }; + }); + mounted_components.set(component, unmount2); + return component; +} +let mounted_components = /* @__PURE__ */ new WeakMap(); +function unmount(component, options2) { + const fn = mounted_components.get(component); + if (fn) { + mounted_components.delete(component); + return fn(options2); + } + return Promise.resolve(); +} +function asClassComponent$1(component) { + return class extends Svelte4Component { + /** @param {any} options */ + constructor(options2) { + super({ + component, + ...options2 + }); + } + }; +} +class Svelte4Component { + /** @type {any} */ + #events; + /** @type {Record} */ + #instance; + /** + * @param {ComponentConstructorOptions & { + * component: any; + * }} options + */ + constructor(options2) { + var sources = /* @__PURE__ */ new Map(); + var add_source = (key, value) => { + var s = mutable_source(value, false, false); + sources.set(key, s); + return s; + }; + const props = new Proxy( + { ...options2.props || {}, $$events: {} }, + { + get(target, prop) { + return get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop))); + }, + has(target, prop) { + if (prop === LEGACY_PROPS) return true; + get(sources.get(prop) ?? add_source(prop, Reflect.get(target, prop))); + return Reflect.has(target, prop); + }, + set(target, prop, value) { + set(sources.get(prop) ?? add_source(prop, value), value); + return Reflect.set(target, prop, value); + } + } + ); + this.#instance = (options2.hydrate ? hydrate : mount)(options2.component, { + target: options2.target, + anchor: options2.anchor, + props, + context: options2.context, + intro: options2.intro ?? 
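+ /*
+ The props proxy above lazily boxes each property in a mutable source so
+ that legacy `$set`-style updates stay reactive. A minimal sketch of the
+ lazy-boxing idea (plain boxes instead of real reactive sources; names are
+ illustrative):
+
+   const boxes = new Map();
+   const box = (key, value) =>
+     boxes.get(key) ?? (boxes.set(key, { value }), boxes.get(key));
+   const props = new Proxy({ count: 0 }, {
+     get: (t, k) => box(k, Reflect.get(t, k)).value,
+     set: (t, k, v) => ((box(k, v).value = v), Reflect.set(t, k, v))
+   });
+ */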
false, + recover: options2.recover + }); + if (!options2?.props?.$$host || options2.sync === false) { + flushSync(); + } + this.#events = props.$$events; + for (const key of Object.keys(this.#instance)) { + if (key === "$set" || key === "$destroy" || key === "$on") continue; + define_property(this, key, { + get() { + return this.#instance[key]; + }, + /** @param {any} value */ + set(value) { + this.#instance[key] = value; + }, + enumerable: true + }); + } + this.#instance.$set = /** @param {Record} next */ + (next2) => { + Object.assign(props, next2); + }; + this.#instance.$destroy = () => { + unmount(this.#instance); + }; + } + /** @param {Record} props */ + $set(props) { + this.#instance.$set(props); + } + /** + * @param {string} event + * @param {(...args: any[]) => any} callback + * @returns {any} + */ + $on(event, callback) { + this.#events[event] = this.#events[event] || []; + const cb = (...args) => callback.call(this, ...args); + this.#events[event].push(cb); + return () => { + this.#events[event] = this.#events[event].filter( + /** @param {any} fn */ + (fn) => fn !== cb + ); + }; + } + $destroy() { + this.#instance.$destroy(); + } +} +let read_implementation = null; +function set_read_implementation(fn) { + read_implementation = fn; +} +function set_manifest(_) { +} +function asClassComponent(component) { + const component_constructor = asClassComponent$1(component); + const _render = (props, { context, csp } = {}) => { + const result = render(component, { props, context, csp }); + const munged = Object.defineProperties( + /** @type {LegacyRenderResult & PromiseLike} */ + {}, + { + css: { + value: { code: "", map: null } + }, + head: { + get: () => result.head + }, + html: { + get: () => result.body + }, + then: { + /** + * this is not type-safe, but honestly it's the best I can do right now, and it's a straightforward function. 
+ * + * @template TResult1 + * @template [TResult2=never] + * @param { (value: LegacyRenderResult) => TResult1 } onfulfilled + * @param { (reason: unknown) => TResult2 } onrejected + */ + value: (onfulfilled, onrejected) => { + { + const user_result = onfulfilled({ + css: munged.css, + head: munged.head, + html: munged.html + }); + return Promise.resolve(user_result); + } + } + } + } + ); + return munged; + }; + component_constructor.render = _render; + return component_constructor; +} +function Root($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + let { + stores, + page, + constructors, + components = [], + form, + data_0 = null, + data_1 = null + } = $$props; + { + setContext("__svelte__", stores); + } + { + stores.page.set(page); + } + const Pyramid_1 = constructors[1]; + if (constructors[1]) { + $$renderer2.push(""); + const Pyramid_0 = constructors[0]; + $$renderer2.push(``); + Pyramid_0($$renderer2, { + data: data_0, + form, + params: page.params, + children: ($$renderer3) => { + $$renderer3.push(``); + Pyramid_1($$renderer3, { data: data_1, form, params: page.params }); + $$renderer3.push(``); + }, + $$slots: { default: true } + }); + $$renderer2.push(``); + } else { + $$renderer2.push(""); + const Pyramid_0 = constructors[0]; + $$renderer2.push(``); + Pyramid_0($$renderer2, { data: data_0, form, params: page.params }); + $$renderer2.push(``); + } + $$renderer2.push(` `); + { + $$renderer2.push(""); + } + $$renderer2.push(``); + }); +} +const root = asClassComponent(Root); +const options = { + app_template_contains_nonce: false, + async: false, + csp: { "mode": "auto", "directives": { "upgrade-insecure-requests": false, "block-all-mixed-content": false }, "reportOnly": { "upgrade-insecure-requests": false, "block-all-mixed-content": false } }, + csrf_check_origin: true, + csrf_trusted_origins: [], + embedded: false, + env_public_prefix: "PUBLIC_", + env_private_prefix: "", + hash_routing: false, + hooks: null, + // added lazily, via `get_hooks` + preload_strategy: "modulepreload", + root, + service_worker: false, + service_worker_options: void 0, + templates: { + app: ({ head, body, assets, nonce, env }) => '\n\n \n \n \n \n ' + head + '\n \n \n
' + body + "
\n \n\n", + error: ({ status, message }) => '\n\n \n \n ' + message + ` + + + + +
+ ` + status + '\n
\n

' + message + "

\n
\n
\n \n\n" + }, + version_hash: "1ootf77" +}; +async function get_hooks() { + let handle; + let handleFetch; + let handleError; + let handleValidationError; + let init; + let reroute; + let transport; + return { + handle, + handleFetch, + handleError, + handleValidationError, + init, + reroute, + transport + }; +} +export { + set_public_env as a, + set_read_implementation as b, + set_manifest as c, + get_hooks as g, + options as o, + public_env as p, + read_implementation as r, + set_private_env as s +}; diff --git a/frontend/.svelte-kit/output/server/chunks/shared.js b/frontend/.svelte-kit/output/server/chunks/shared.js new file mode 100644 index 0000000..fa7f6a3 --- /dev/null +++ b/frontend/.svelte-kit/output/server/chunks/shared.js @@ -0,0 +1,522 @@ +import * as devalue from "devalue"; +import { t as text_decoder, b as base64_encode, c as base64_decode } from "./utils.js"; +function set_nested_value(object, path_string, value) { + if (path_string.startsWith("n:")) { + path_string = path_string.slice(2); + value = value === "" ? void 0 : parseFloat(value); + } else if (path_string.startsWith("b:")) { + path_string = path_string.slice(2); + value = value === "on"; + } + deep_set(object, split_path(path_string), value); +} +function convert_formdata(data) { + const result = {}; + for (let key of data.keys()) { + const is_array = key.endsWith("[]"); + let values = data.getAll(key); + if (is_array) key = key.slice(0, -2); + if (values.length > 1 && !is_array) { + throw new Error(`Form cannot contain duplicated keys — "${key}" has ${values.length} values`); + } + values = values.filter( + (entry) => typeof entry === "string" || entry.name !== "" || entry.size > 0 + ); + if (key.startsWith("n:")) { + key = key.slice(2); + values = values.map((v) => v === "" ? void 0 : parseFloat( + /** @type {string} */ + v + )); + } else if (key.startsWith("b:")) { + key = key.slice(2); + values = values.map((v) => v === "on"); + } + set_nested_value(result, key, is_array ? 
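+ /*
+ Key conventions handled above (illustrative field names):
+   "n:age"     -> number  ("42" parses to 42, "" becomes undefined)
+   "b:agree"   -> boolean (true exactly when the value is "on")
+   "tags[]"    -> array   (every value returned by getAll is kept)
+   "user.name" -> nested  (deep_set builds { user: { name: ... } })
+ */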
values : values[0]); + } + return result; +} +const BINARY_FORM_CONTENT_TYPE = "application/x-sveltekit-formdata"; +const BINARY_FORM_VERSION = 0; +async function deserialize_binary_form(request) { + if (request.headers.get("content-type") !== BINARY_FORM_CONTENT_TYPE) { + const form_data = await request.formData(); + return { data: convert_formdata(form_data), meta: {}, form_data }; + } + if (!request.body) { + throw new Error("Could not deserialize binary form: no body"); + } + const reader = request.body.getReader(); + const chunks = []; + async function get_chunk(index) { + if (index in chunks) return chunks[index]; + let i = chunks.length; + while (i <= index) { + chunks[i] = reader.read().then((chunk) => chunk.value); + i++; + } + return chunks[index]; + } + async function get_buffer(offset, length) { + let start_chunk; + let chunk_start = 0; + let chunk_index; + for (chunk_index = 0; ; chunk_index++) { + const chunk = await get_chunk(chunk_index); + if (!chunk) return null; + const chunk_end = chunk_start + chunk.byteLength; + if (offset >= chunk_start && offset < chunk_end) { + start_chunk = chunk; + break; + } + chunk_start = chunk_end; + } + if (offset + length <= chunk_start + start_chunk.byteLength) { + return start_chunk.subarray(offset - chunk_start, offset + length - chunk_start); + } + const buffer = new Uint8Array(length); + buffer.set(start_chunk.subarray(offset - chunk_start)); + let cursor = start_chunk.byteLength - offset + chunk_start; + while (cursor < length) { + chunk_index++; + let chunk = await get_chunk(chunk_index); + if (!chunk) return null; + if (chunk.byteLength > length - cursor) { + chunk = chunk.subarray(0, length - cursor); + } + buffer.set(chunk, cursor); + cursor += chunk.byteLength; + } + return buffer; + } + const header = await get_buffer(0, 1 + 4 + 2); + if (!header) throw new Error("Could not deserialize binary form: too short"); + if (header[0] !== BINARY_FORM_VERSION) { + throw new Error( + `Could not deserialize binary form: got version ${header[0]}, expected version ${BINARY_FORM_VERSION}` + ); + } + const header_view = new DataView(header.buffer, header.byteOffset, header.byteLength); + const data_length = header_view.getUint32(1, true); + const file_offsets_length = header_view.getUint16(5, true); + const data_buffer = await get_buffer(1 + 4 + 2, data_length); + if (!data_buffer) throw new Error("Could not deserialize binary form: data too short"); + let file_offsets; + let files_start_offset; + if (file_offsets_length > 0) { + const file_offsets_buffer = await get_buffer(1 + 4 + 2 + data_length, file_offsets_length); + if (!file_offsets_buffer) + throw new Error("Could not deserialize binary form: file offset table too short"); + file_offsets = /** @type {Array} */ + JSON.parse(text_decoder.decode(file_offsets_buffer)); + files_start_offset = 1 + 4 + 2 + data_length + file_offsets_length; + } + const [data, meta] = devalue.parse(text_decoder.decode(data_buffer), { + File: ([name, type, size, last_modified, index]) => { + return new Proxy( + new LazyFile( + name, + type, + size, + last_modified, + get_chunk, + files_start_offset + file_offsets[index] + ), + { + getPrototypeOf() { + return File.prototype; + } + } + ); + } + }); + void (async () => { + let has_more = true; + while (has_more) { + const chunk = await get_chunk(chunks.length); + has_more = !!chunk; + } + })(); + return { data, meta, form_data: null }; +} +class LazyFile { + /** @type {(index: number) => Promise | undefined>} */ + #get_chunk; + /** @type {number} */ + #offset; + 
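+ /*
+ Binary form wire layout consumed by deserialize_binary_form above
+ (integers are little-endian):
+   byte 0     u8   format version, must equal BINARY_FORM_VERSION
+   bytes 1-4  u32  byte length of the devalue-encoded data segment
+   bytes 5-6  u16  byte length of the JSON file-offset table (0 = no files)
+   then the data segment, the offset table, and raw file bytes, with each
+   file's offset in the table relative to the end of the table itself
+ */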
/** + * @param {string} name + * @param {string} type + * @param {number} size + * @param {number} last_modified + * @param {(index: number) => Promise | undefined>} get_chunk + * @param {number} offset + */ + constructor(name, type, size, last_modified, get_chunk, offset) { + this.name = name; + this.type = type; + this.size = size; + this.lastModified = last_modified; + this.webkitRelativePath = ""; + this.#get_chunk = get_chunk; + this.#offset = offset; + this.arrayBuffer = this.arrayBuffer.bind(this); + this.bytes = this.bytes.bind(this); + this.slice = this.slice.bind(this); + this.stream = this.stream.bind(this); + this.text = this.text.bind(this); + } + /** @type {ArrayBuffer | undefined} */ + #buffer; + async arrayBuffer() { + this.#buffer ??= await new Response(this.stream()).arrayBuffer(); + return this.#buffer; + } + async bytes() { + return new Uint8Array(await this.arrayBuffer()); + } + /** + * @param {number=} start + * @param {number=} end + * @param {string=} contentType + */ + slice(start = 0, end = this.size, contentType = this.type) { + if (start < 0) { + start = Math.max(this.size + start, 0); + } else { + start = Math.min(start, this.size); + } + if (end < 0) { + end = Math.max(this.size + end, 0); + } else { + end = Math.min(end, this.size); + } + const size = Math.max(end - start, 0); + const file = new LazyFile( + this.name, + contentType, + size, + this.lastModified, + this.#get_chunk, + this.#offset + start + ); + return file; + } + stream() { + let cursor = 0; + let chunk_index = 0; + return new ReadableStream({ + start: async (controller) => { + let chunk_start = 0; + let start_chunk = null; + for (chunk_index = 0; ; chunk_index++) { + const chunk = await this.#get_chunk(chunk_index); + if (!chunk) return null; + const chunk_end = chunk_start + chunk.byteLength; + if (this.#offset >= chunk_start && this.#offset < chunk_end) { + start_chunk = chunk; + break; + } + chunk_start = chunk_end; + } + if (this.#offset + this.size <= chunk_start + start_chunk.byteLength) { + controller.enqueue( + start_chunk.subarray(this.#offset - chunk_start, this.#offset + this.size - chunk_start) + ); + controller.close(); + } else { + controller.enqueue(start_chunk.subarray(this.#offset - chunk_start)); + cursor = start_chunk.byteLength - this.#offset + chunk_start; + } + }, + pull: async (controller) => { + chunk_index++; + let chunk = await this.#get_chunk(chunk_index); + if (!chunk) { + controller.error("Could not deserialize binary form: incomplete file data"); + controller.close(); + return; + } + if (chunk.byteLength > this.size - cursor) { + chunk = chunk.subarray(0, this.size - cursor); + } + controller.enqueue(chunk); + cursor += chunk.byteLength; + if (cursor >= this.size) { + controller.close(); + } + } + }); + } + async text() { + return text_decoder.decode(await this.arrayBuffer()); + } +} +const path_regex = /^[a-zA-Z_$]\w*(\.[a-zA-Z_$]\w*|\[\d+\])*$/; +function split_path(path) { + if (!path_regex.test(path)) { + throw new Error(`Invalid path ${path}`); + } + return path.split(/\.|\[|\]/).filter(Boolean); +} +function check_prototype_pollution(key) { + if (key === "__proto__" || key === "constructor" || key === "prototype") { + throw new Error( + `Invalid key "${key}"` + ); + } +} +function deep_set(object, keys, value) { + let current = object; + for (let i = 0; i < keys.length - 1; i += 1) { + const key = keys[i]; + check_prototype_pollution(key); + const is_array = /^\d+$/.test(keys[i + 1]); + const exists = key in current; + const inner = current[key]; + if 
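+ // an existing node must already have the right shape (an array when the
+ // next path segment is numeric, an object otherwise); a mismatch means the
+ // path is inconsistent, so bail out rather than silently coercing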
(exists && is_array !== Array.isArray(inner)) { + throw new Error(`Invalid array key ${keys[i + 1]}`); + } + if (!exists) { + current[key] = is_array ? [] : {}; + } + current = current[key]; + } + const final_key = keys[keys.length - 1]; + check_prototype_pollution(final_key); + current[final_key] = value; +} +function normalize_issue(issue, server = false) { + const normalized = { name: "", path: [], message: issue.message, server }; + if (issue.path !== void 0) { + let name = ""; + for (const segment of issue.path) { + const key = ( + /** @type {string | number} */ + typeof segment === "object" ? segment.key : segment + ); + normalized.path.push(key); + if (typeof key === "number") { + name += `[${key}]`; + } else if (typeof key === "string") { + name += name === "" ? key : "." + key; + } + } + normalized.name = name; + } + return normalized; +} +function flatten_issues(issues) { + const result = {}; + for (const issue of issues) { + (result.$ ??= []).push(issue); + let name = ""; + if (issue.path !== void 0) { + for (const key of issue.path) { + if (typeof key === "number") { + name += `[${key}]`; + } else if (typeof key === "string") { + name += name === "" ? key : "." + key; + } + (result[name] ??= []).push(issue); + } + } + } + return result; +} +function deep_get(object, path) { + let current = object; + for (const key of path) { + if (current == null || typeof current !== "object") { + return current; + } + current = current[key]; + } + return current; +} +function create_field_proxy(target, get_input, set_input, get_issues, path = []) { + const get_value = () => { + return deep_get(get_input(), path); + }; + return new Proxy(target, { + get(target2, prop) { + if (typeof prop === "symbol") return target2[prop]; + if (/^\d+$/.test(prop)) { + return create_field_proxy({}, get_input, set_input, get_issues, [ + ...path, + parseInt(prop, 10) + ]); + } + const key = build_path_string(path); + if (prop === "set") { + const set_func = function(newValue) { + set_input(path, newValue); + return newValue; + }; + return create_field_proxy(set_func, get_input, set_input, get_issues, [...path, prop]); + } + if (prop === "value") { + return create_field_proxy(get_value, get_input, set_input, get_issues, [...path, prop]); + } + if (prop === "issues" || prop === "allIssues") { + const issues_func = () => { + const all_issues = get_issues()[key === "" ? "$" : key]; + if (prop === "allIssues") { + return all_issues?.map((issue) => ({ + path: issue.path, + message: issue.message + })); + } + return all_issues?.filter((issue) => issue.name === key)?.map((issue) => ({ + path: issue.path, + message: issue.message + })); + }; + return create_field_proxy(issues_func, get_input, set_input, get_issues, [...path, prop]); + } + if (prop === "as") { + const as_func = (type, input_value) => { + const is_array = type === "file multiple" || type === "select multiple" || type === "checkbox" && typeof input_value === "string"; + const prefix = type === "number" || type === "range" ? "n:" : type === "checkbox" && !is_array ? "b:" : ""; + const base_props = { + name: prefix + key + (is_array ? "[]" : ""), + get "aria-invalid"() { + const issues = get_issues(); + return key in issues ? "true" : void 0; + } + }; + if (type !== "text" && type !== "select" && type !== "select multiple") { + base_props.type = type === "file multiple" ? 
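+ /*
+ Issue names built by flatten_issues above follow the build_path_string
+ convention below: an issue at path ["user", "addresses", 0, "city"] lands
+ in the buckets "$", "user", "user.addresses", "user.addresses[0]" and
+ "user.addresses[0].city", so a field can query both its own issues and
+ those of any nested field.
+ */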
"file" : type; + } + if (type === "submit" || type === "hidden") { + return Object.defineProperties(base_props, { + value: { value: input_value, enumerable: true } + }); + } + if (type === "select" || type === "select multiple") { + return Object.defineProperties(base_props, { + multiple: { value: is_array, enumerable: true }, + value: { + enumerable: true, + get() { + return get_value(); + } + } + }); + } + if (type === "checkbox" || type === "radio") { + return Object.defineProperties(base_props, { + value: { value: input_value ?? "on", enumerable: true }, + checked: { + enumerable: true, + get() { + const value = get_value(); + if (type === "radio") { + return value === input_value; + } + if (is_array) { + return (value ?? []).includes(input_value); + } + return value; + } + } + }); + } + if (type === "file" || type === "file multiple") { + return Object.defineProperties(base_props, { + multiple: { value: is_array, enumerable: true }, + files: { + enumerable: true, + get() { + const value = get_value(); + if (value instanceof File) { + if (typeof DataTransfer !== "undefined") { + const fileList = new DataTransfer(); + fileList.items.add(value); + return fileList.files; + } + return { 0: value, length: 1 }; + } + if (Array.isArray(value) && value.every((f) => f instanceof File)) { + if (typeof DataTransfer !== "undefined") { + const fileList = new DataTransfer(); + value.forEach((file) => fileList.items.add(file)); + return fileList.files; + } + const fileListLike = { length: value.length }; + value.forEach((file, index) => { + fileListLike[index] = file; + }); + return fileListLike; + } + return null; + } + } + }); + } + return Object.defineProperties(base_props, { + value: { + enumerable: true, + get() { + const value = get_value(); + return value != null ? String(value) : ""; + } + } + }); + }; + return create_field_proxy(as_func, get_input, set_input, get_issues, [...path, "as"]); + } + return create_field_proxy({}, get_input, set_input, get_issues, [...path, prop]); + } + }); +} +function build_path_string(path) { + let result = ""; + for (const segment of path) { + if (typeof segment === "number") { + result += `[${segment}]`; + } else { + result += result === "" ? segment : "." 
+ segment; + } + } + return result; +} +const INVALIDATED_PARAM = "x-sveltekit-invalidated"; +const TRAILING_SLASH_PARAM = "x-sveltekit-trailing-slash"; +function stringify(data, transport) { + const encoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.encode])); + return devalue.stringify(data, encoders); +} +function stringify_remote_arg(value, transport) { + if (value === void 0) return ""; + const json_string = stringify(value, transport); + const bytes = new TextEncoder().encode(json_string); + return base64_encode(bytes).replaceAll("=", "").replaceAll("+", "-").replaceAll("/", "_"); +} +function parse_remote_arg(string, transport) { + if (!string) return void 0; + const json_string = text_decoder.decode( + // no need to add back `=` characters, atob can handle it + base64_decode(string.replaceAll("-", "+").replaceAll("_", "/")) + ); + const decoders = Object.fromEntries(Object.entries(transport).map(([k, v]) => [k, v.decode])); + return devalue.parse(json_string, decoders); +} +function create_remote_key(id, payload) { + return id + "/" + payload; +} +export { + BINARY_FORM_CONTENT_TYPE as B, + INVALIDATED_PARAM as I, + TRAILING_SLASH_PARAM as T, + stringify_remote_arg as a, + create_field_proxy as b, + create_remote_key as c, + deserialize_binary_form as d, + set_nested_value as e, + flatten_issues as f, + deep_set as g, + normalize_issue as n, + parse_remote_arg as p, + stringify as s +}; diff --git a/frontend/.svelte-kit/output/server/chunks/stores.js b/frontend/.svelte-kit/output/server/chunks/stores.js new file mode 100644 index 0000000..af3139c --- /dev/null +++ b/frontend/.svelte-kit/output/server/chunks/stores.js @@ -0,0 +1,44 @@ +import { a0 as getContext } from "./index2.js"; +import "clsx"; +import "@sveltejs/kit/internal"; +import "./exports.js"; +import "./utils.js"; +import "@sveltejs/kit/internal/server"; +import { n as noop } from "./equality.js"; +const is_legacy = noop.toString().includes("$$") || /function \w+\(\) \{\}/.test(noop.toString()); +if (is_legacy) { + ({ + data: {}, + form: null, + error: null, + params: {}, + route: { id: null }, + state: {}, + status: -1, + url: new URL("https://example.com") + }); +} +const getStores = () => { + const stores = getContext("__svelte__"); + return { + /** @type {typeof page} */ + page: { + subscribe: stores.page.subscribe + }, + /** @type {typeof navigating} */ + navigating: { + subscribe: stores.navigating.subscribe + }, + /** @type {typeof updated} */ + updated: stores.updated + }; +}; +const page = { + subscribe(fn) { + const store = getStores().page; + return store.subscribe(fn); + } +}; +export { + page as p +}; diff --git a/frontend/.svelte-kit/output/server/chunks/toasts.js b/frontend/.svelte-kit/output/server/chunks/toasts.js new file mode 100644 index 0000000..f556cd3 --- /dev/null +++ b/frontend/.svelte-kit/output/server/chunks/toasts.js @@ -0,0 +1,16 @@ +import { w as writable } from "./index.js"; +const toasts = writable([]); +function addToast(message, type = "info", duration = 3e3) { + const id = Math.random().toString(36).substr(2, 9); + console.log(`[toasts.addToast][Action] Adding toast context={{'id': '${id}', 'type': '${type}', 'message': '${message}'}}`); + toasts.update((all) => [...all, { id, message, type }]); + setTimeout(() => removeToast(id), duration); +} +function removeToast(id) { + console.log(`[toasts.removeToast][Action] Removing toast context={{'id': '${id}'}}`); + toasts.update((all) => all.filter((t) => t.id !== id)); +} +export { + addToast as a, + toasts as t 
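+ /*
+ Usage sketch for the toast store above: addToast("Saved", "success") pushes
+ { id, message, type } onto the writable store and schedules removeToast(id)
+ after the default 3000ms, so subscribers such as the Toast component drop
+ it automatically.
+ */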
+}; diff --git a/frontend/.svelte-kit/output/server/chunks/utils.js b/frontend/.svelte-kit/output/server/chunks/utils.js new file mode 100644 index 0000000..78e5bde --- /dev/null +++ b/frontend/.svelte-kit/output/server/chunks/utils.js @@ -0,0 +1,43 @@ +const text_encoder = new TextEncoder(); +const text_decoder = new TextDecoder(); +function get_relative_path(from, to) { + const from_parts = from.split(/[/\\]/); + const to_parts = to.split(/[/\\]/); + from_parts.pop(); + while (from_parts[0] === to_parts[0]) { + from_parts.shift(); + to_parts.shift(); + } + let i = from_parts.length; + while (i--) from_parts[i] = ".."; + return from_parts.concat(to_parts).join("/"); +} +function base64_encode(bytes) { + if (globalThis.Buffer) { + return globalThis.Buffer.from(bytes).toString("base64"); + } + let binary = ""; + for (let i = 0; i < bytes.length; i++) { + binary += String.fromCharCode(bytes[i]); + } + return btoa(binary); +} +function base64_decode(encoded) { + if (globalThis.Buffer) { + const buffer = globalThis.Buffer.from(encoded, "base64"); + return new Uint8Array(buffer); + } + const binary = atob(encoded); + const bytes = new Uint8Array(binary.length); + for (let i = 0; i < binary.length; i++) { + bytes[i] = binary.charCodeAt(i); + } + return bytes; +} +export { + text_encoder as a, + base64_encode as b, + base64_decode as c, + get_relative_path as g, + text_decoder as t +}; diff --git a/frontend/.svelte-kit/output/server/entries/pages/_error.svelte.js b/frontend/.svelte-kit/output/server/entries/pages/_error.svelte.js new file mode 100644 index 0000000..99eb73a --- /dev/null +++ b/frontend/.svelte-kit/output/server/entries/pages/_error.svelte.js @@ -0,0 +1,12 @@ +import { _ as escape_html, X as store_get, Y as unsubscribe_stores } from "../../chunks/index2.js"; +import { p as page } from "../../chunks/stores.js"; +function _error($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + var $$store_subs; + $$renderer2.push(`

${escape_html(store_get($$store_subs ??= {}, "$page", page).status)}

${escape_html(store_get($$store_subs ??= {}, "$page", page).error?.message || "Page not found")}

Back to Dashboard
`); + if ($$store_subs) unsubscribe_stores($$store_subs); + }); +} +export { + _error as default +}; diff --git a/frontend/.svelte-kit/output/server/entries/pages/_layout.svelte.js b/frontend/.svelte-kit/output/server/entries/pages/_layout.svelte.js new file mode 100644 index 0000000..e7d6810 --- /dev/null +++ b/frontend/.svelte-kit/output/server/entries/pages/_layout.svelte.js @@ -0,0 +1,38 @@ +import { V as attr_class, W as stringify, X as store_get, Y as unsubscribe_stores, Z as ensure_array_like, _ as escape_html, $ as slot } from "../../chunks/index2.js"; +import { p as page } from "../../chunks/stores.js"; +import "clsx"; +import { t as toasts } from "../../chunks/toasts.js"; +function Navbar($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + var $$store_subs; + $$renderer2.push(`
Superset Tools
`); + if ($$store_subs) unsubscribe_stores($$store_subs); + }); +} +function Footer($$renderer) { + $$renderer.push(`
© 2025 Superset Tools. All rights reserved.
`); +} +function Toast($$renderer) { + var $$store_subs; + $$renderer.push(`
`); + const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$toasts", toasts)); + for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) { + let toast = each_array[$$index]; + $$renderer.push(`${escape_html(toast.message)}
`); + } + $$renderer.push(``); + if ($$store_subs) unsubscribe_stores($$store_subs); +} +function _layout($$renderer, $$props) { + Toast($$renderer); + $$renderer.push(`
`); + Navbar($$renderer); + $$renderer.push(`
`); + slot($$renderer, $$props, "default", {}); + $$renderer.push(`
`); + Footer($$renderer); + $$renderer.push(`
`); +} +export { + _layout as default +}; diff --git a/frontend/.svelte-kit/output/server/entries/pages/_layout.ts.js b/frontend/.svelte-kit/output/server/entries/pages/_layout.ts.js new file mode 100644 index 0000000..aeea3c2 --- /dev/null +++ b/frontend/.svelte-kit/output/server/entries/pages/_layout.ts.js @@ -0,0 +1,6 @@ +const ssr = false; +const prerender = false; +export { + prerender, + ssr +}; diff --git a/frontend/.svelte-kit/output/server/entries/pages/_page.svelte.js b/frontend/.svelte-kit/output/server/entries/pages/_page.svelte.js new file mode 100644 index 0000000..aaf2282 --- /dev/null +++ b/frontend/.svelte-kit/output/server/entries/pages/_page.svelte.js @@ -0,0 +1,132 @@ +import { a1 as ssr_context, X as store_get, _ as escape_html, Z as ensure_array_like, V as attr_class, Y as unsubscribe_stores, a2 as attr, a3 as bind_props } from "../../chunks/index2.js"; +import { w as writable } from "../../chunks/index.js"; +import "clsx"; +function onDestroy(fn) { + /** @type {SSRContext} */ + ssr_context.r.on_destroy(fn); +} +const plugins = writable([]); +const selectedPlugin = writable(null); +const selectedTask = writable(null); +const taskLogs = writable([]); +function TaskRunner($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + var $$store_subs; + onDestroy(() => { + }); + $$renderer2.push(`
`); + if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) { + $$renderer2.push(""); + $$renderer2.push(`

Task: ${escape_html(store_get($$store_subs ??= {}, "$selectedTask", selectedTask).plugin_id)}

`); + const each_array = ensure_array_like(store_get($$store_subs ??= {}, "$taskLogs", taskLogs)); + for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) { + let log = each_array[$$index]; + $$renderer2.push(`
${escape_html(new Date(log.timestamp).toLocaleTimeString())} [${escape_html(log.level)}] ${escape_html(log.message)}
`); + } + $$renderer2.push(`
`); + } else { + $$renderer2.push(""); + $$renderer2.push(`

No task selected.

`); + } + $$renderer2.push(`
`); + if ($$store_subs) unsubscribe_stores($$store_subs); + }); +} +function DynamicForm($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + let schema = $$props["schema"]; + let formData = {}; + function initializeForm() { + if (schema && schema.properties) { + for (const key in schema.properties) { + formData[key] = schema.properties[key].default || ""; + } + } + } + initializeForm(); + $$renderer2.push(`
`); + if (schema && schema.properties) { + $$renderer2.push(""); + $$renderer2.push(``); + const each_array = ensure_array_like(Object.entries(schema.properties)); + for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) { + let [key, prop] = each_array[$$index]; + $$renderer2.push(`
${escape_html(prop.title || key)} `); + if (prop.type === "string") { + $$renderer2.push(""); + $$renderer2.push(``); + } else { + $$renderer2.push(""); + if (prop.type === "number" || prop.type === "integer") { + $$renderer2.push(""); + $$renderer2.push(``); + } else { + $$renderer2.push(""); + if (prop.type === "boolean") { + $$renderer2.push(""); + $$renderer2.push(``); + } else { + $$renderer2.push(""); + } + $$renderer2.push(``); + } + $$renderer2.push(``); + } + $$renderer2.push(`
`); + } + $$renderer2.push(` `); + } else { + $$renderer2.push(""); + } + $$renderer2.push(`
`); + bind_props($$props, { schema }); + }); +} +function _page($$renderer, $$props) { + $$renderer.component(($$renderer2) => { + var $$store_subs; + let data = $$props["data"]; + if (data.plugins) { + plugins.set(data.plugins); + } + $$renderer2.push(`
`); + if (store_get($$store_subs ??= {}, "$selectedTask", selectedTask)) { + $$renderer2.push(""); + TaskRunner($$renderer2); + $$renderer2.push(` `); + } else { + $$renderer2.push(""); + if (store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin)) { + $$renderer2.push(""); + $$renderer2.push(`

${escape_html(store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).name)}

`); + DynamicForm($$renderer2, { + schema: store_get($$store_subs ??= {}, "$selectedPlugin", selectedPlugin).schema + }); + $$renderer2.push(` `); + } else { + $$renderer2.push(""); + $$renderer2.push(`

Available Tools

`);
+        if (data.error) {
+          $$renderer2.push("");
+          $$renderer2.push(`
${escape_html(data.error)}
`);
+        } else {
+          $$renderer2.push("");
+        }
+        $$renderer2.push(`
`);
+        const each_array = ensure_array_like(data.plugins);
+        for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
+          let plugin = each_array[$$index];
+          $$renderer2.push(`

${escape_html(plugin.name)}

${escape_html(plugin.description)}

v${escape_html(plugin.version)}
`);
+        }
+        $$renderer2.push(`
`);
+      }
+      $$renderer2.push(``);
+    }
+    $$renderer2.push(`
`);
+    if ($$store_subs) unsubscribe_stores($$store_subs);
+    bind_props($$props, { data });
+  });
+}
+export {
+  _page as default
+};
diff --git a/frontend/.svelte-kit/output/server/entries/pages/_page.ts.js b/frontend/.svelte-kit/output/server/entries/pages/_page.ts.js
new file mode 100644
index 0000000..02f2bef
--- /dev/null
+++ b/frontend/.svelte-kit/output/server/entries/pages/_page.ts.js
@@ -0,0 +1,18 @@
+import { a as api } from "../../chunks/api.js";
+async function load() {
+  try {
+    const plugins = await api.getPlugins();
+    return {
+      plugins
+    };
+  } catch (error) {
+    console.error("Failed to load plugins:", error);
+    return {
+      plugins: [],
+      error: "Failed to load plugins"
+    };
+  }
+}
+export {
+  load
+};
diff --git a/frontend/.svelte-kit/output/server/entries/pages/settings/_page.svelte.js b/frontend/.svelte-kit/output/server/entries/pages/settings/_page.svelte.js
new file mode 100644
index 0000000..a891268
--- /dev/null
+++ b/frontend/.svelte-kit/output/server/entries/pages/settings/_page.svelte.js
@@ -0,0 +1,45 @@
+import { _ as escape_html, a2 as attr, Z as ensure_array_like, a3 as bind_props } from "../../../chunks/index2.js";
+function _page($$renderer, $$props) {
+  $$renderer.component(($$renderer2) => {
+    let data = $$props["data"];
+    let settings = data.settings;
+    let newEnv = {
+      id: "",
+      name: "",
+      url: "",
+      username: "",
+      password: "",
+      is_default: false
+    };
+    settings = data.settings;
+    $$renderer2.push(`

Settings

`);
+    if (data.error) {
+      $$renderer2.push("");
+      $$renderer2.push(`
${escape_html(data.error)}
`);
+    } else {
+      $$renderer2.push("");
+    }
+    $$renderer2.push(`

Global Settings

Superset Environments

`);
+    if (settings.environments.length === 0) {
+      $$renderer2.push("");
+      $$renderer2.push(`

Warning

No Superset environments configured. You must add at least one environment to perform backups or migrations.

`);
+    } else {
+      $$renderer2.push("");
+    }
+    $$renderer2.push(`
`);
+    const each_array = ensure_array_like(settings.environments);
+    for (let $$index = 0, $$length = each_array.length; $$index < $$length; $$index++) {
+      let env = each_array[$$index];
+      $$renderer2.push(``);
+    }
+    $$renderer2.push(`
NameURLUsernameDefaultActions
${escape_html(env.name)}${escape_html(env.url)}${escape_html(env.username)}${escape_html(env.is_default ? "Yes" : "No")}

${escape_html("Add")} Environment

`);
+    {
+      $$renderer2.push("");
+    }
+    $$renderer2.push(`
`);
+    bind_props($$props, { data });
+  });
+}
+export {
+  _page as default
+};
diff --git a/frontend/.svelte-kit/output/server/entries/pages/settings/_page.ts.js b/frontend/.svelte-kit/output/server/entries/pages/settings/_page.ts.js
new file mode 100644
index 0000000..9f66d64
--- /dev/null
+++ b/frontend/.svelte-kit/output/server/entries/pages/settings/_page.ts.js
@@ -0,0 +1,24 @@
+import { a as api } from "../../../chunks/api.js";
+async function load() {
+  try {
+    const settings = await api.getSettings();
+    return {
+      settings
+    };
+  } catch (error) {
+    console.error("Failed to load settings:", error);
+    return {
+      settings: {
+        environments: [],
+        settings: {
+          backup_path: "",
+          default_environment_id: null
+        }
+      },
+      error: "Failed to load settings"
+    };
+  }
+}
+export {
+  load
+};
diff --git a/frontend/.svelte-kit/output/server/index.js b/frontend/.svelte-kit/output/server/index.js
new file mode 100644
index 0000000..94bcc9f
--- /dev/null
+++ b/frontend/.svelte-kit/output/server/index.js
@@ -0,0 +1,3857 @@
+import { B as BROWSER } from "./chunks/false.js";
+import { json, text, error } from "@sveltejs/kit";
+import { HttpError, SvelteKitError, Redirect, ActionFailure } from "@sveltejs/kit/internal";
+import { with_request_store, merge_tracing, try_get_request_store } from "@sveltejs/kit/internal/server";
+import { a as assets, b as base, c as app_dir, r as relative, o as override, d as reset } from "./chunks/environment.js";
+import { B as BINARY_FORM_CONTENT_TYPE, c as create_remote_key, p as parse_remote_arg, s as stringify, d as deserialize_binary_form, T as TRAILING_SLASH_PARAM, I as INVALIDATED_PARAM } from "./chunks/shared.js";
+import * as devalue from "devalue";
+import { m as make_trackable, d as disable_search, a as decode_params, S as SCHEME, v as validate_layout_server_exports, b as validate_layout_exports, c as validate_page_server_exports, e as validate_page_exports, n as normalize_path, r as resolve, f as decode_pathname, g as validate_server_exports } from "./chunks/exports.js";
+import { b as base64_encode, t as text_decoder, a as text_encoder, g as get_relative_path } from "./chunks/utils.js";
+import { r as readable, w as writable } from "./chunks/index.js";
+import { p as public_env, r as read_implementation, o as options, s as set_private_env, a as set_public_env, g as get_hooks, b as set_read_implementation } from "./chunks/internal.js";
+import { parse, serialize } from "cookie";
+import * as set_cookie_parser from "set-cookie-parser";
+function with_resolvers() {
+  let resolve2;
+  let reject;
+  const promise = new Promise((res, rej) => {
+    resolve2 = res;
+    reject = rej;
+  });
+  return { promise, resolve: resolve2, reject };
+}
+const NULL_BODY_STATUS = [101, 103, 204, 205, 304];
+const IN_WEBCONTAINER = !!globalThis.process?.versions?.webcontainer;
+const SVELTE_KIT_ASSETS = "/_svelte_kit_assets";
+const ENDPOINT_METHODS = ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS", "HEAD"];
+const PAGE_METHODS = ["GET", "POST", "HEAD"];
+function negotiate(accept, types) {
+  const parts = [];
+  accept.split(",").forEach((str, i) => {
+    const match = /([^/ \t]+)\/([^; \t]+)[ \t]*(?:;[ \t]*q=([0-9.]+))?/.exec(str);
+    if (match) {
+      const [, type, subtype, q = "1"] = match;
+      parts.push({ type, subtype, q: +q, i });
+    }
+  });
+  parts.sort((a, b) => {
+    if (a.q !== b.q) {
+      return b.q - a.q;
+    }
+    if (a.subtype === "*" !== (b.subtype === "*")) {
+      return a.subtype === "*" ? 1 : -1;
+    }
+    if (a.type === "*" !== (b.type === "*")) {
+      return a.type === "*" ?
1 : -1; + } + return a.i - b.i; + }); + let accepted; + let min_priority = Infinity; + for (const mimetype of types) { + const [type, subtype] = mimetype.split("/"); + const priority = parts.findIndex( + (part) => (part.type === type || part.type === "*") && (part.subtype === subtype || part.subtype === "*") + ); + if (priority !== -1 && priority < min_priority) { + accepted = mimetype; + min_priority = priority; + } + } + return accepted; +} +function is_content_type(request, ...types) { + const type = request.headers.get("content-type")?.split(";", 1)[0].trim() ?? ""; + return types.includes(type.toLowerCase()); +} +function is_form_content_type(request) { + return is_content_type( + request, + "application/x-www-form-urlencoded", + "multipart/form-data", + "text/plain", + BINARY_FORM_CONTENT_TYPE + ); +} +function coalesce_to_error(err) { + return err instanceof Error || err && /** @type {any} */ + err.name && /** @type {any} */ + err.message ? ( + /** @type {Error} */ + err + ) : new Error(JSON.stringify(err)); +} +function normalize_error(error2) { + return ( + /** @type {import('../exports/internal/index.js').Redirect | HttpError | SvelteKitError | Error} */ + error2 + ); +} +function get_status(error2) { + return error2 instanceof HttpError || error2 instanceof SvelteKitError ? error2.status : 500; +} +function get_message(error2) { + return error2 instanceof SvelteKitError ? error2.text : "Internal Error"; +} +const escape_html_attr_dict = { + "&": "&", + '"': """ + // Svelte also escapes < because the escape function could be called inside a `noscript` there + // https://github.com/sveltejs/svelte/security/advisories/GHSA-8266-84wp-wv5c + // However, that doesn't apply in SvelteKit +}; +const escape_html_dict = { + "&": "&", + "<": "<" +}; +const surrogates = ( + // high surrogate without paired low surrogate + "[\\ud800-\\udbff](?![\\udc00-\\udfff])|[\\ud800-\\udbff][\\udc00-\\udfff]|[\\udc00-\\udfff]" +); +const escape_html_attr_regex = new RegExp( + `[${Object.keys(escape_html_attr_dict).join("")}]|` + surrogates, + "g" +); +const escape_html_regex = new RegExp( + `[${Object.keys(escape_html_dict).join("")}]|` + surrogates, + "g" +); +function escape_html(str, is_attr) { + const dict = is_attr ? escape_html_attr_dict : escape_html_dict; + const escaped_str = str.replace(is_attr ? escape_html_attr_regex : escape_html_regex, (match) => { + if (match.length === 2) { + return match; + } + return dict[match] ?? `&#${match.charCodeAt(0)};`; + }); + return escaped_str; +} +function method_not_allowed(mod, method) { + return text(`${method} method not allowed`, { + status: 405, + headers: { + // https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405 + // "The server must generate an Allow header field in a 405 status code response" + allow: allowed_methods(mod).join(", ") + } + }); +} +function allowed_methods(mod) { + const allowed = ENDPOINT_METHODS.filter((method) => method in mod); + if ("GET" in mod && !("HEAD" in mod)) { + allowed.push("HEAD"); + } + return allowed; +} +function get_global_name(options2) { + return `__sveltekit_${options2.version_hash}`; +} +function static_error_page(options2, status, message) { + let page = options2.templates.error({ status, message: escape_html(message) }); + return text(page, { + headers: { "content-type": "text/html; charset=utf-8" }, + status + }); +} +async function handle_fatal_error(event, state, options2, error2) { + error2 = error2 instanceof HttpError ? 
error2 : coalesce_to_error(error2); + const status = get_status(error2); + const body2 = await handle_error_and_jsonify(event, state, options2, error2); + const type = negotiate(event.request.headers.get("accept") || "text/html", [ + "application/json", + "text/html" + ]); + if (event.isDataRequest || type === "application/json") { + return json(body2, { + status + }); + } + return static_error_page(options2, status, body2.message); +} +async function handle_error_and_jsonify(event, state, options2, error2) { + if (error2 instanceof HttpError) { + return { message: "Unknown Error", ...error2.body }; + } + const status = get_status(error2); + const message = get_message(error2); + return await with_request_store( + { event, state }, + () => options2.hooks.handleError({ error: error2, event, status, message }) + ) ?? { message }; +} +function redirect_response(status, location) { + const response = new Response(void 0, { + status, + headers: { location } + }); + return response; +} +function clarify_devalue_error(event, error2) { + if (error2.path) { + return `Data returned from \`load\` while rendering ${event.route.id} is not serializable: ${error2.message} (${error2.path}). If you need to serialize/deserialize custom types, use transport hooks: https://svelte.dev/docs/kit/hooks#Universal-hooks-transport.`; + } + if (error2.path === "") { + return `Data returned from \`load\` while rendering ${event.route.id} is not a plain object`; + } + return error2.message; +} +function serialize_uses(node) { + const uses = {}; + if (node.uses && node.uses.dependencies.size > 0) { + uses.dependencies = Array.from(node.uses.dependencies); + } + if (node.uses && node.uses.search_params.size > 0) { + uses.search_params = Array.from(node.uses.search_params); + } + if (node.uses && node.uses.params.size > 0) { + uses.params = Array.from(node.uses.params); + } + if (node.uses?.parent) uses.parent = 1; + if (node.uses?.route) uses.route = 1; + if (node.uses?.url) uses.url = 1; + return uses; +} +function has_prerendered_path(manifest, pathname) { + return manifest._.prerendered_routes.has(pathname) || pathname.at(-1) === "/" && manifest._.prerendered_routes.has(pathname.slice(0, -1)); +} +function format_server_error(status, error2, event) { + const formatted_text = ` +\x1B[1;31m[${status}] ${event.request.method} ${event.url.pathname}\x1B[0m`; + if (status === 404) { + return formatted_text; + } + return `${formatted_text} +${error2.stack}`; +} +function get_node_type(node_id) { + const parts = node_id?.split("/"); + const filename = parts?.at(-1); + if (!filename) return "unknown"; + const dot_parts = filename.split("."); + return dot_parts.slice(0, -1).join("."); +} +async function render_endpoint(event, event_state, mod, state) { + const method = ( + /** @type {import('types').HttpMethod} */ + event.request.method + ); + let handler = mod[method] || mod.fallback; + if (method === "HEAD" && !mod.HEAD && mod.GET) { + handler = mod.GET; + } + if (!handler) { + return method_not_allowed(mod, method); + } + const prerender = mod.prerender ?? 
state.prerender_default; + if (prerender && (mod.POST || mod.PATCH || mod.PUT || mod.DELETE)) { + throw new Error("Cannot prerender endpoints that have mutative methods"); + } + if (state.prerendering && !state.prerendering.inside_reroute && !prerender) { + if (state.depth > 0) { + throw new Error(`${event.route.id} is not prerenderable`); + } else { + return new Response(void 0, { status: 204 }); + } + } + event_state.is_endpoint_request = true; + try { + const response = await with_request_store( + { event, state: event_state }, + () => handler( + /** @type {import('@sveltejs/kit').RequestEvent>} */ + event + ) + ); + if (!(response instanceof Response)) { + throw new Error( + `Invalid response from route ${event.url.pathname}: handler should return a Response object` + ); + } + if (state.prerendering && (!state.prerendering.inside_reroute || prerender)) { + const cloned = new Response(response.clone().body, { + status: response.status, + statusText: response.statusText, + headers: new Headers(response.headers) + }); + cloned.headers.set("x-sveltekit-prerender", String(prerender)); + if (state.prerendering.inside_reroute && prerender) { + cloned.headers.set( + "x-sveltekit-routeid", + encodeURI( + /** @type {string} */ + event.route.id + ) + ); + state.prerendering.dependencies.set(event.url.pathname, { response: cloned, body: null }); + } else { + return cloned; + } + } + return response; + } catch (e) { + if (e instanceof Redirect) { + return new Response(void 0, { + status: e.status, + headers: { location: e.location } + }); + } + throw e; + } +} +function is_endpoint_request(event) { + const { method, headers: headers2 } = event.request; + if (ENDPOINT_METHODS.includes(method) && !PAGE_METHODS.includes(method)) { + return true; + } + if (method === "POST" && headers2.get("x-sveltekit-action") === "true") return false; + const accept = event.request.headers.get("accept") ?? 
"*/*"; + return negotiate(accept, ["*", "text/html"]) !== "text/html"; +} +function compact(arr) { + return arr.filter( + /** @returns {val is NonNullable} */ + (val) => val != null + ); +} +const DATA_SUFFIX = "/__data.json"; +const HTML_DATA_SUFFIX = ".html__data.json"; +function has_data_suffix(pathname) { + return pathname.endsWith(DATA_SUFFIX) || pathname.endsWith(HTML_DATA_SUFFIX); +} +function add_data_suffix(pathname) { + if (pathname.endsWith(".html")) return pathname.replace(/\.html$/, HTML_DATA_SUFFIX); + return pathname.replace(/\/$/, "") + DATA_SUFFIX; +} +function strip_data_suffix(pathname) { + if (pathname.endsWith(HTML_DATA_SUFFIX)) { + return pathname.slice(0, -HTML_DATA_SUFFIX.length) + ".html"; + } + return pathname.slice(0, -DATA_SUFFIX.length); +} +const ROUTE_SUFFIX = "/__route.js"; +function has_resolution_suffix(pathname) { + return pathname.endsWith(ROUTE_SUFFIX); +} +function add_resolution_suffix(pathname) { + return pathname.replace(/\/$/, "") + ROUTE_SUFFIX; +} +function strip_resolution_suffix(pathname) { + return pathname.slice(0, -ROUTE_SUFFIX.length); +} +const noop_span = { + spanContext() { + return noop_span_context; + }, + setAttribute() { + return this; + }, + setAttributes() { + return this; + }, + addEvent() { + return this; + }, + setStatus() { + return this; + }, + updateName() { + return this; + }, + end() { + return this; + }, + isRecording() { + return false; + }, + recordException() { + return this; + }, + addLink() { + return this; + }, + addLinks() { + return this; + } +}; +const noop_span_context = { + traceId: "", + spanId: "", + traceFlags: 0 +}; +async function record_span({ name, attributes, fn }) { + { + return fn(noop_span); + } +} +function is_action_json_request(event) { + const accept = negotiate(event.request.headers.get("accept") ?? "*/*", [ + "application/json", + "text/html" + ]); + return accept === "application/json" && event.request.method === "POST"; +} +async function handle_action_json_request(event, event_state, options2, server) { + const actions = server?.actions; + if (!actions) { + const no_actions_error = new SvelteKitError( + 405, + "Method Not Allowed", + `POST method not allowed. No form actions exist for ${"this page"}` + ); + return action_json( + { + type: "error", + error: await handle_error_and_jsonify(event, event_state, options2, no_actions_error) + }, + { + status: no_actions_error.status, + headers: { + // https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405 + // "The server must generate an Allow header field in a 405 status code response" + allow: "GET" + } + } + ); + } + check_named_default_separate(actions); + try { + const data = await call_action(event, event_state, actions); + if (BROWSER) ; + if (data instanceof ActionFailure) { + return action_json({ + type: "failure", + status: data.status, + // @ts-expect-error we assign a string to what is supposed to be an object. That's ok + // because we don't use the object outside, and this way we have better code navigation + // through knowing where the related interface is used. + data: stringify_action_response( + data.data, + /** @type {string} */ + event.route.id, + options2.hooks.transport + ) + }); + } else { + return action_json({ + type: "success", + status: data ? 
200 : 204, + // @ts-expect-error see comment above + data: stringify_action_response( + data, + /** @type {string} */ + event.route.id, + options2.hooks.transport + ) + }); + } + } catch (e) { + const err = normalize_error(e); + if (err instanceof Redirect) { + return action_json_redirect(err); + } + return action_json( + { + type: "error", + error: await handle_error_and_jsonify( + event, + event_state, + options2, + check_incorrect_fail_use(err) + ) + }, + { + status: get_status(err) + } + ); + } +} +function check_incorrect_fail_use(error2) { + return error2 instanceof ActionFailure ? new Error('Cannot "throw fail()". Use "return fail()"') : error2; +} +function action_json_redirect(redirect) { + return action_json({ + type: "redirect", + status: redirect.status, + location: redirect.location + }); +} +function action_json(data, init2) { + return json(data, init2); +} +function is_action_request(event) { + return event.request.method === "POST"; +} +async function handle_action_request(event, event_state, server) { + const actions = server?.actions; + if (!actions) { + event.setHeaders({ + // https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405 + // "The server must generate an Allow header field in a 405 status code response" + allow: "GET" + }); + return { + type: "error", + error: new SvelteKitError( + 405, + "Method Not Allowed", + `POST method not allowed. No form actions exist for ${"this page"}` + ) + }; + } + check_named_default_separate(actions); + try { + const data = await call_action(event, event_state, actions); + if (BROWSER) ; + if (data instanceof ActionFailure) { + return { + type: "failure", + status: data.status, + data: data.data + }; + } else { + return { + type: "success", + status: 200, + // @ts-expect-error this will be removed upon serialization, so `undefined` is the same as omission + data + }; + } + } catch (e) { + const err = normalize_error(e); + if (err instanceof Redirect) { + return { + type: "redirect", + status: err.status, + location: err.location + }; + } + return { + type: "error", + error: check_incorrect_fail_use(err) + }; + } +} +function check_named_default_separate(actions) { + if (actions.default && Object.keys(actions).length > 1) { + throw new Error( + "When using named actions, the default action cannot be used. 
See the docs for more info: https://svelte.dev/docs/kit/form-actions#named-actions" + ); + } +} +async function call_action(event, event_state, actions) { + const url = new URL(event.request.url); + let name = "default"; + for (const param of url.searchParams) { + if (param[0].startsWith("/")) { + name = param[0].slice(1); + if (name === "default") { + throw new Error('Cannot use reserved action name "default"'); + } + break; + } + } + const action = actions[name]; + if (!action) { + throw new SvelteKitError(404, "Not Found", `No action with name '${name}' found`); + } + if (!is_form_content_type(event.request)) { + throw new SvelteKitError( + 415, + "Unsupported Media Type", + `Form actions expect form-encoded data — received ${event.request.headers.get( + "content-type" + )}` + ); + } + return record_span({ + name: "sveltekit.form_action", + attributes: { + "http.route": event.route.id || "unknown" + }, + fn: async (current2) => { + const traced_event = merge_tracing(event, current2); + const result = await with_request_store( + { event: traced_event, state: event_state }, + () => action(traced_event) + ); + if (result instanceof ActionFailure) { + current2.setAttributes({ + "sveltekit.form_action.result.type": "failure", + "sveltekit.form_action.result.status": result.status + }); + } + return result; + } + }); +} +function validate_action_return(data) { + if (data instanceof Redirect) { + throw new Error("Cannot `return redirect(...)` — use `redirect(...)` instead"); + } + if (data instanceof HttpError) { + throw new Error("Cannot `return error(...)` — use `error(...)` or `return fail(...)` instead"); + } +} +function uneval_action_response(data, route_id, transport) { + const replacer = (thing) => { + for (const key2 in transport) { + const encoded = transport[key2].encode(thing); + if (encoded) { + return `app.decode('${key2}', ${devalue.uneval(encoded, replacer)})`; + } + } + }; + return try_serialize(data, (value) => devalue.uneval(value, replacer), route_id); +} +function stringify_action_response(data, route_id, transport) { + const encoders = Object.fromEntries( + Object.entries(transport).map(([key2, value]) => [key2, value.encode]) + ); + return try_serialize(data, (value) => devalue.stringify(value, encoders), route_id); +} +function try_serialize(data, fn, route_id) { + try { + return fn(data); + } catch (e) { + const error2 = ( + /** @type {any} */ + e + ); + if (data instanceof Response) { + throw new Error( + `Data returned from action inside ${route_id} is not serializable. Form actions need to return plain objects or fail(). E.g. 
return { success: true } or return fail(400, { message: "invalid" });` + ); + } + if ("path" in error2) { + let message = `Data returned from action inside ${route_id} is not serializable: ${error2.message}`; + if (error2.path !== "") message += ` (data.${error2.path})`; + throw new Error(message); + } + throw error2; + } +} +function create_async_iterator() { + let resolved = -1; + let returned = -1; + const deferred = []; + return { + iterate: (transform = (x) => x) => { + return { + [Symbol.asyncIterator]() { + return { + next: async () => { + const next = deferred[++returned]; + if (!next) return { value: null, done: true }; + const value = await next.promise; + return { value: transform(value), done: false }; + } + }; + } + }; + }, + add: (promise) => { + deferred.push(with_resolvers()); + void promise.then((value) => { + deferred[++resolved].resolve(value); + }); + } + }; +} +function server_data_serializer(event, event_state, options2) { + let promise_id = 1; + let max_nodes = -1; + const iterator = create_async_iterator(); + const global = get_global_name(options2); + function get_replacer(index) { + return function replacer(thing) { + if (typeof thing?.then === "function") { + const id = promise_id++; + const promise = thing.then( + /** @param {any} data */ + (data) => ({ data }) + ).catch( + /** @param {any} error */ + async (error2) => ({ + error: await handle_error_and_jsonify(event, event_state, options2, error2) + }) + ).then( + /** + * @param {{data: any; error: any}} result + */ + async ({ data, error: error2 }) => { + let str; + try { + str = devalue.uneval(error2 ? [, error2] : [data], replacer); + } catch { + error2 = await handle_error_and_jsonify( + event, + event_state, + options2, + new Error(`Failed to serialize promise while rendering ${event.route.id}`) + ); + data = void 0; + str = devalue.uneval([, error2], replacer); + } + return { + index, + str: `${global}.resolve(${id}, ${str.includes("app.decode") ? `(app) => ${str}` : `() => ${str}`})` + }; + } + ); + iterator.add(promise); + return `${global}.defer(${id})`; + } else { + for (const key2 in options2.hooks.transport) { + const encoded = options2.hooks.transport[key2].encode(thing); + if (encoded) { + return `app.decode('${key2}', ${devalue.uneval(encoded, replacer)})`; + } + } + } + }; + } + const strings = ( + /** @type {string[]} */ + [] + ); + return { + set_max_nodes(i) { + max_nodes = i; + }, + add_node(i, node) { + try { + if (!node) { + strings[i] = "null"; + return; + } + const payload = { type: "data", data: node.data, uses: serialize_uses(node) }; + if (node.slash) payload.slash = node.slash; + strings[i] = devalue.uneval(payload, get_replacer(i)); + } catch (e) { + e.path = e.path.slice(1); + throw new Error(clarify_devalue_error( + event, + /** @type {any} */ + e + )); + } + }, + get_data(csp) { + const open = ``; + const close = `<\/script> +`; + return { + data: `[${compact(max_nodes > -1 ? strings.slice(0, max_nodes) : strings).join(",")}]`, + chunks: promise_id > 1 ? 
iterator.iterate(({ index, str }) => { + if (max_nodes > -1 && index >= max_nodes) { + return ""; + } + return open + str + close; + }) : null + }; + } + }; +} +function server_data_serializer_json(event, event_state, options2) { + let promise_id = 1; + const iterator = create_async_iterator(); + const reducers = { + ...Object.fromEntries( + Object.entries(options2.hooks.transport).map(([key2, value]) => [key2, value.encode]) + ), + /** @param {any} thing */ + Promise: (thing) => { + if (typeof thing?.then !== "function") { + return; + } + const id = promise_id++; + let key2 = "data"; + const promise = thing.catch( + /** @param {any} e */ + async (e) => { + key2 = "error"; + return handle_error_and_jsonify( + event, + event_state, + options2, + /** @type {any} */ + e + ); + } + ).then( + /** @param {any} value */ + async (value) => { + let str; + try { + str = devalue.stringify(value, reducers); + } catch { + const error2 = await handle_error_and_jsonify( + event, + event_state, + options2, + new Error(`Failed to serialize promise while rendering ${event.route.id}`) + ); + key2 = "error"; + str = devalue.stringify(error2, reducers); + } + return `{"type":"chunk","id":${id},"${key2}":${str}} +`; + } + ); + iterator.add(promise); + return id; + } + }; + const strings = ( + /** @type {string[]} */ + [] + ); + return { + add_node(i, node) { + try { + if (!node) { + strings[i] = "null"; + return; + } + if (node.type === "error" || node.type === "skip") { + strings[i] = JSON.stringify(node); + return; + } + strings[i] = `{"type":"data","data":${devalue.stringify(node.data, reducers)},"uses":${JSON.stringify( + serialize_uses(node) + )}${node.slash ? `,"slash":${JSON.stringify(node.slash)}` : ""}}`; + } catch (e) { + e.path = "data" + e.path; + throw new Error(clarify_devalue_error( + event, + /** @type {any} */ + e + )); + } + }, + get_data() { + return { + data: `{"type":"data","nodes":[${strings.join(",")}]} +`, + chunks: promise_id > 1 ? iterator.iterate() : null + }; + } + }; +} +async function load_server_data({ event, event_state, state, node, parent }) { + if (!node?.server) return null; + let is_tracking = true; + const uses = { + dependencies: /* @__PURE__ */ new Set(), + params: /* @__PURE__ */ new Set(), + parent: false, + route: false, + url: false, + search_params: /* @__PURE__ */ new Set() + }; + const load = node.server.load; + const slash = node.server.trailingSlash; + if (!load) { + return { type: "data", data: null, uses, slash }; + } + const url = make_trackable( + event.url, + () => { + if (is_tracking) { + uses.url = true; + } + }, + (param) => { + if (is_tracking) { + uses.search_params.add(param); + } + } + ); + if (state.prerendering) { + disable_search(url); + } + const result = await record_span({ + name: "sveltekit.load", + attributes: { + "sveltekit.load.node_id": node.server_id || "unknown", + "sveltekit.load.node_type": get_node_type(node.server_id), + "http.route": event.route.id || "unknown" + }, + fn: async (current2) => { + const traced_event = merge_tracing(event, current2); + const result2 = await with_request_store( + { event: traced_event, state: event_state }, + () => load.call(null, { + ...traced_event, + fetch: (info, init2) => { + new URL(info instanceof Request ? 
info.url : info, event.url); + return event.fetch(info, init2); + }, + /** @param {string[]} deps */ + depends: (...deps) => { + for (const dep of deps) { + const { href } = new URL(dep, event.url); + uses.dependencies.add(href); + } + }, + params: new Proxy(event.params, { + get: (target, key2) => { + if (is_tracking) { + uses.params.add(key2); + } + return target[ + /** @type {string} */ + key2 + ]; + } + }), + parent: async () => { + if (is_tracking) { + uses.parent = true; + } + return parent(); + }, + route: new Proxy(event.route, { + get: (target, key2) => { + if (is_tracking) { + uses.route = true; + } + return target[ + /** @type {'id'} */ + key2 + ]; + } + }), + url, + untrack(fn) { + is_tracking = false; + try { + return fn(); + } finally { + is_tracking = true; + } + } + }) + ); + return result2; + } + }); + return { + type: "data", + data: result ?? null, + uses, + slash + }; +} +async function load_data({ + event, + event_state, + fetched, + node, + parent, + server_data_promise, + state, + resolve_opts, + csr +}) { + const server_data_node = await server_data_promise; + const load = node?.universal?.load; + if (!load) { + return server_data_node?.data ?? null; + } + const result = await record_span({ + name: "sveltekit.load", + attributes: { + "sveltekit.load.node_id": node.universal_id || "unknown", + "sveltekit.load.node_type": get_node_type(node.universal_id), + "http.route": event.route.id || "unknown" + }, + fn: async (current2) => { + const traced_event = merge_tracing(event, current2); + return await with_request_store( + { event: traced_event, state: event_state }, + () => load.call(null, { + url: event.url, + params: event.params, + data: server_data_node?.data ?? null, + route: event.route, + fetch: create_universal_fetch(event, state, fetched, csr, resolve_opts), + setHeaders: event.setHeaders, + depends: () => { + }, + parent, + untrack: (fn) => fn(), + tracing: traced_event.tracing + }) + ); + } + }); + return result ?? null; +} +function create_universal_fetch(event, state, fetched, csr, resolve_opts) { + const universal_fetch = async (input, init2) => { + const cloned_body = input instanceof Request && input.body ? input.clone().body : null; + const cloned_headers = input instanceof Request && [...input.headers].length ? new Headers(input.headers) : init2?.headers; + let response = await event.fetch(input, init2); + const url = new URL(input instanceof Request ? input.url : input, event.url); + const same_origin = url.origin === event.url.origin; + let dependency; + if (same_origin) { + if (state.prerendering) { + dependency = { response, body: null }; + state.prerendering.dependencies.set(url.pathname, dependency); + } + } else if (url.protocol === "https:" || url.protocol === "http:") { + const mode = input instanceof Request ? input.mode : init2?.mode ?? "cors"; + if (mode === "no-cors") { + response = new Response("", { + status: response.status, + statusText: response.statusText, + headers: response.headers + }); + } else { + const acao = response.headers.get("access-control-allow-origin"); + if (!acao || acao !== event.url.origin && acao !== "*") { + throw new Error( + `CORS error: ${acao ? 
"Incorrect" : "No"} 'Access-Control-Allow-Origin' header is present on the requested resource` + ); + } + } + } + let teed_body; + const proxy = new Proxy(response, { + get(response2, key2, receiver) { + async function push_fetched(body2, is_b64) { + const status_number = Number(response2.status); + if (isNaN(status_number)) { + throw new Error( + `response.status is not a number. value: "${response2.status}" type: ${typeof response2.status}` + ); + } + fetched.push({ + url: same_origin ? url.href.slice(event.url.origin.length) : url.href, + method: event.request.method, + request_body: ( + /** @type {string | ArrayBufferView | undefined} */ + input instanceof Request && cloned_body ? await stream_to_string(cloned_body) : init2?.body + ), + request_headers: cloned_headers, + response_body: body2, + response: response2, + is_b64 + }); + } + if (key2 === "body") { + if (response2.body === null) { + return null; + } + if (teed_body) { + return teed_body; + } + const [a, b] = response2.body.tee(); + void (async () => { + let result = new Uint8Array(); + for await (const chunk of a) { + const combined = new Uint8Array(result.length + chunk.length); + combined.set(result, 0); + combined.set(chunk, result.length); + result = combined; + } + if (dependency) { + dependency.body = new Uint8Array(result); + } + void push_fetched(base64_encode(result), true); + })(); + return teed_body = b; + } + if (key2 === "arrayBuffer") { + return async () => { + const buffer = await response2.arrayBuffer(); + const bytes = new Uint8Array(buffer); + if (dependency) { + dependency.body = bytes; + } + if (buffer instanceof ArrayBuffer) { + await push_fetched(base64_encode(bytes), true); + } + return buffer; + }; + } + async function text2() { + const body2 = await response2.text(); + if (body2 === "" && NULL_BODY_STATUS.includes(response2.status)) { + await push_fetched(void 0, false); + return void 0; + } + if (!body2 || typeof body2 === "string") { + await push_fetched(body2, false); + } + if (dependency) { + dependency.body = body2; + } + return body2; + } + if (key2 === "text") { + return text2; + } + if (key2 === "json") { + return async () => { + const body2 = await text2(); + return body2 ? JSON.parse(body2) : void 0; + }; + } + const value = Reflect.get(response2, key2, response2); + if (value instanceof Function) { + return Object.defineProperties( + /** + * @this {any} + */ + function() { + return Reflect.apply(value, this === receiver ? 
response2 : this, arguments); + }, + { + name: { value: value.name }, + length: { value: value.length } + } + ); + } + return value; + } + }); + if (csr) { + const get = response.headers.get; + response.headers.get = (key2) => { + const lower = key2.toLowerCase(); + const value = get.call(response.headers, lower); + if (value && !lower.startsWith("x-sveltekit-")) { + const included = resolve_opts.filterSerializedResponseHeaders(lower, value); + if (!included) { + throw new Error( + `Failed to get response header "${lower}" — it must be included by the \`filterSerializedResponseHeaders\` option: https://svelte.dev/docs/kit/hooks#Server-hooks-handle (at ${event.route.id})` + ); + } + } + return value; + }; + } + return proxy; + }; + return (input, init2) => { + const response = universal_fetch(input, init2); + response.catch(() => { + }); + return response; + }; +} +async function stream_to_string(stream) { + let result = ""; + const reader = stream.getReader(); + while (true) { + const { done, value } = await reader.read(); + if (done) { + break; + } + result += text_decoder.decode(value); + } + return result; +} +function hash(...values) { + let hash2 = 5381; + for (const value of values) { + if (typeof value === "string") { + let i = value.length; + while (i) hash2 = hash2 * 33 ^ value.charCodeAt(--i); + } else if (ArrayBuffer.isView(value)) { + const buffer = new Uint8Array(value.buffer, value.byteOffset, value.byteLength); + let i = buffer.length; + while (i) hash2 = hash2 * 33 ^ buffer[--i]; + } else { + throw new TypeError("value must be a string or TypedArray"); + } + } + return (hash2 >>> 0).toString(36); +} +const replacements = { + "<": "\\u003C", + "\u2028": "\\u2028", + "\u2029": "\\u2029" +}; +const pattern = new RegExp(`[${Object.keys(replacements).join("")}]`, "g"); +function serialize_data(fetched, filter, prerendering = false) { + const headers2 = {}; + let cache_control = null; + let age = null; + let varyAny = false; + for (const [key2, value] of fetched.response.headers) { + if (filter(key2, value)) { + headers2[key2] = value; + } + if (key2 === "cache-control") cache_control = value; + else if (key2 === "age") age = value; + else if (key2 === "vary" && value.trim() === "*") varyAny = true; + } + const payload = { + status: fetched.response.status, + statusText: fetched.response.statusText, + headers: headers2, + body: fetched.response_body + }; + const safe_payload = JSON.stringify(payload).replace(pattern, (match) => replacements[match]); + const attrs = [ + 'type="application/json"', + "data-sveltekit-fetched", + `data-url="${escape_html(fetched.url, true)}"` + ]; + if (fetched.is_b64) { + attrs.push("data-b64"); + } + if (fetched.request_headers || fetched.request_body) { + const values = []; + if (fetched.request_headers) { + values.push([...new Headers(fetched.request_headers)].join(",")); + } + if (fetched.request_body) { + values.push(fetched.request_body); + } + attrs.push(`data-hash="${hash(...values)}"`); + } + if (!prerendering && fetched.method === "GET" && cache_control && !varyAny) { + const match = /s-maxage=(\d+)/g.exec(cache_control) ?? /max-age=(\d+)/g.exec(cache_control); + if (match) { + const ttl = +match[1] - +(age ?? 
"0"); + attrs.push(`data-ttl="${ttl}"`); + } + } + return `\n * ```\n */\nexport function getAbortSignal() {\n\tif (active_reaction === null) {\n\t\te.get_abort_signal_outside_reaction();\n\t}\n\n\treturn (active_reaction.ac ??= new AbortController()).signal;\n}\n\n/**\n * `onMount`, like [`$effect`](https://svelte.dev/docs/svelte/$effect), schedules a function to run as soon as the component has been mounted to the DOM.\n * Unlike `$effect`, the provided function only runs once.\n *\n * It must be called during the component's initialisation (but doesn't need to live _inside_ the component;\n * it can be called from an external module). If a function is returned _synchronously_ from `onMount`,\n * it will be called when the component is unmounted.\n *\n * `onMount` functions do not run during [server-side rendering](https://svelte.dev/docs/svelte/svelte-server#render).\n *\n * @template T\n * @param {() => NotFunction | Promise> | (() => any)} fn\n * @returns {void}\n */\nexport function onMount(fn) {\n\tif (component_context === null) {\n\t\te.lifecycle_outside_component('onMount');\n\t}\n\n\tif (legacy_mode_flag && component_context.l !== null) {\n\t\tinit_update_callbacks(component_context).m.push(fn);\n\t} else {\n\t\tuser_effect(() => {\n\t\t\tconst cleanup = untrack(fn);\n\t\t\tif (typeof cleanup === 'function') return /** @type {() => void} */ (cleanup);\n\t\t});\n\t}\n}\n\n/**\n * Schedules a callback to run immediately before the component is unmounted.\n *\n * Out of `onMount`, `beforeUpdate`, `afterUpdate` and `onDestroy`, this is the\n * only one that runs inside a server-side component.\n *\n * @param {() => any} fn\n * @returns {void}\n */\nexport function onDestroy(fn) {\n\tif (component_context === null) {\n\t\te.lifecycle_outside_component('onDestroy');\n\t}\n\n\tonMount(() => () => untrack(fn));\n}\n\n/**\n * @template [T=any]\n * @param {string} type\n * @param {T} [detail]\n * @param {any}params_0\n * @returns {CustomEvent}\n */\nfunction create_custom_event(type, detail, { bubbles = false, cancelable = false } = {}) {\n\treturn new CustomEvent(type, { detail, bubbles, cancelable });\n}\n\n/**\n * Creates an event dispatcher that can be used to dispatch [component events](https://svelte.dev/docs/svelte/legacy-on#Component-events).\n * Event dispatchers are functions that can take two arguments: `name` and `detail`.\n *\n * Component events created with `createEventDispatcher` create a\n * [CustomEvent](https://developer.mozilla.org/en-US/docs/Web/API/CustomEvent).\n * These events do not [bubble](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Building_blocks/Events#Event_bubbling_and_capture).\n * The `detail` argument corresponds to the [CustomEvent.detail](https://developer.mozilla.org/en-US/docs/Web/API/CustomEvent/detail)\n * property and can contain any type of data.\n *\n * The event dispatcher can be typed to narrow the allowed event names and the type of the `detail` argument:\n * ```ts\n * const dispatch = createEventDispatcher<{\n * loaded: null; // does not take a detail argument\n * change: string; // takes a detail argument of type string, which is required\n * optional: number | null; // takes an optional detail argument of type number\n * }>();\n * ```\n *\n * @deprecated Use callback props and/or the `$host()` rune instead — see [migration guide](https://svelte.dev/docs/svelte/v5-migration-guide#Event-changes-Component-events)\n * @template {Record} [EventMap = any]\n * @returns {EventDispatcher}\n */\nexport function createEventDispatcher() 
{\n\tconst active_component_context = component_context;\n\tif (active_component_context === null) {\n\t\te.lifecycle_outside_component('createEventDispatcher');\n\t}\n\n\t/**\n\t * @param [detail]\n\t * @param [options]\n\t */\n\treturn (type, detail, options) => {\n\t\tconst events = /** @type {Record} */ (\n\t\t\tactive_component_context.s.$$events\n\t\t)?.[/** @type {string} */ (type)];\n\n\t\tif (events) {\n\t\t\tconst callbacks = is_array(events) ? events.slice() : [events];\n\t\t\t// TODO are there situations where events could be dispatched\n\t\t\t// in a server (non-DOM) environment?\n\t\t\tconst event = create_custom_event(/** @type {string} */ (type), detail, options);\n\t\t\tfor (const fn of callbacks) {\n\t\t\t\tfn.call(active_component_context.x, event);\n\t\t\t}\n\t\t\treturn !event.defaultPrevented;\n\t\t}\n\n\t\treturn true;\n\t};\n}\n\n// TODO mark beforeUpdate and afterUpdate as deprecated in Svelte 6\n\n/**\n * Schedules a callback to run immediately before the component is updated after any state change.\n *\n * The first time the callback runs will be before the initial `onMount`.\n *\n * In runes mode use `$effect.pre` instead.\n *\n * @deprecated Use [`$effect.pre`](https://svelte.dev/docs/svelte/$effect#$effect.pre) instead\n * @param {() => void} fn\n * @returns {void}\n */\nexport function beforeUpdate(fn) {\n\tif (component_context === null) {\n\t\te.lifecycle_outside_component('beforeUpdate');\n\t}\n\n\tif (component_context.l === null) {\n\t\te.lifecycle_legacy_only('beforeUpdate');\n\t}\n\n\tinit_update_callbacks(component_context).b.push(fn);\n}\n\n/**\n * Schedules a callback to run immediately after the component has been updated.\n *\n * The first time the callback runs will be after the initial `onMount`.\n *\n * In runes mode use `$effect` instead.\n *\n * @deprecated Use [`$effect`](https://svelte.dev/docs/svelte/$effect) instead\n * @param {() => void} fn\n * @returns {void}\n */\nexport function afterUpdate(fn) {\n\tif (component_context === null) {\n\t\te.lifecycle_outside_component('afterUpdate');\n\t}\n\n\tif (component_context.l === null) {\n\t\te.lifecycle_legacy_only('afterUpdate');\n\t}\n\n\tinit_update_callbacks(component_context).a.push(fn);\n}\n\n/**\n * Legacy-mode: Init callbacks object for onMount/beforeUpdate/afterUpdate\n * @param {ComponentContext} context\n */\nfunction init_update_callbacks(context) {\n\tvar l = /** @type {ComponentContextLegacy} */ (context).l;\n\treturn (l.u ??= { a: [], b: [], m: [] });\n}\n\nexport { flushSync, fork } from './internal/client/reactivity/batch.js';\nexport {\n\tcreateContext,\n\tgetContext,\n\tgetAllContexts,\n\thasContext,\n\tsetContext\n} from './internal/client/context.js';\nexport { hydratable } from './internal/client/hydratable.js';\nexport { hydrate, mount, unmount } from './internal/client/render.js';\nexport { tick, untrack, settled } from './internal/client/runtime.js';\nexport { createRawSnippet } from './internal/client/dom/blocks/snippet.js';\n"], + "mappings": 
";AAAA,IAAO,eAAQ;;;ACER,IAAI,WAAW,MAAM;AACrB,IAAI,WAAW,MAAM,UAAU;AAC/B,IAAI,aAAa,MAAM;AACvB,IAAI,cAAc,OAAO;AACzB,IAAI,kBAAkB,OAAO;AAC7B,IAAI,iBAAiB,OAAO;AAE5B,IAAI,mBAAmB,OAAO;AAC9B,IAAI,kBAAkB,MAAM;AAC5B,IAAI,mBAAmB,OAAO;AAC9B,IAAI,gBAAgB,OAAO;AAU3B,IAAM,OAAO,MAAM;AAAC;AAoBpB,SAAS,QAAQ,KAAK;AAC5B,WAAS,IAAI,GAAG,IAAI,IAAI,QAAQ,KAAK;AACpC,QAAI,CAAC,EAAE;AAAA,EACR;AACD;AAMO,SAAS,WAAW;AAE1B,MAAI;AAGJ,MAAI;AAGJ,MAAI,UAAU,IAAI,QAAQ,CAAC,KAAK,QAAQ;AACvC,cAAU;AACV,aAAS;AAAA,EACV,CAAC;AAGD,SAAO,EAAE,SAAS,SAAS,OAAO;AACnC;;;AClEO,IAAM,UAAU,KAAK;AACrB,IAAM,SAAS,KAAK;AACpB,IAAM,gBAAgB,KAAK;AAK3B,IAAM,iBAAiB,KAAK;AAK5B,IAAM,eAAe,KAAK;AAC1B,IAAM,gBAAgB,KAAK;AAC3B,IAAM,cAAc,KAAK;AACzB,IAAM,kBAAkB,KAAK;AAO7B,IAAM,YAAY,KAAK;AACvB,IAAM,QAAQ,KAAK;AACnB,IAAM,QAAQ,KAAK;AACnB,IAAM,cAAc,KAAK;AACzB,IAAM,QAAQ,KAAK;AACnB,IAAM,YAAY,KAAK;AAIvB,IAAM,aAAa,KAAK;AAKxB,IAAM,qBAAqB,KAAK;AAChC,IAAM,eAAe,KAAK;AAC1B,IAAM,cAAc,KAAK;AACzB,IAAM,mBAAmB,KAAK;AAC9B,IAAM,cAAc,KAAK;AACzB,IAAM,mBAAmB,KAAK;AAQ9B,IAAM,aAAa,KAAK;AAGxB,IAAM,uBAAuB,KAAK;AAClC,IAAM,QAAQ,KAAK;AAEnB,IAAM,cAAc,KAAK;AAEzB,IAAM,eAAe,uBAAO,QAAQ;AACpC,IAAM,eAAe,uBAAO,cAAc;AAE1C,IAAM,oBAAoB,uBAAO,YAAY;AAG7C,IAAM,iBAAiB,IAAK,MAAM,2BAA2B,MAAM;AAAA,EACzE,OAAO;AAAA,EACP,UAAU;AACX,EAAG;AAEI,IAAM,eAAe;AAErB,IAAM,eAAe;;;AC9DrB,SAAS,4BAA4B,MAAM;AACjD,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,eAA6C,IAAI;AAAA,iDAAyH;AAElM,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,kDAAkD;AAAA,EACnE;AACD;AAuCO,SAAS,4BAA4B,MAAM;AACjD,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,IAAkC,IAAI;AAAA,iDAA4G;AAE1K,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,kDAAkD;AAAA,EACnE;AACD;AAMO,SAAS,kBAAkB;AACjC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,qCAAkG;AAE1H,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,sCAAsC;AAAA,EACvD;AACD;;;ACgCO,SAAS,0BAA0B;AACzC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,6CAA4H;AAEpJ,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,8CAA8C;AAAA,EAC/D;AACD;AA4BO,SAAS,mBAAmB,MAAM;AACxC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,IAAyB,IAAI;AAAA,wCAA8F;AAEnJ,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,yCAAyC;AAAA,EAC1D;AACD;AAMO,SAAS,4BAA4B;AAC3C,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,+CAA8K;AAEtM,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,gDAAgD;AAAA,EACjE;AACD;AAOO,SAAS,cAAc,MAAM;AACnC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,IAAoB,IAAI;AAAA,mCAAiH;AAEjK,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACrD;AACD;AAsBO,SAAS,+BAA+B;AAC9C,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,kDAAkM;AAE1N,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,mDAAmD;AAAA,EACpE;AACD;AAsBO,SAAS,iBAAiB;AAChC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,oCAAsG;AAE9H,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,qCAAqC;AAAA,EACtD;AACD;AAMO,SAAS,cAAc;AAC7B,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,iCAAwH;AAEhJ,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,kCAAkC;AAAA,EACnD;AACD;AAMO,SAAS,oCAAoC;AACnD,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,uDAAgK;AAExL,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,wDAAwD;AAAA,EACzE;AACD;AAOO,SAAS,gCAAgCA,MAAK;AACpD,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,2CAA6EA,IAAG;AAAA,qDAAyF;AAEjM,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,sDAAsD;AAAA,EACvE;AACD;AAMO,SAAS,mBAAmB;AAClC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,sCAA4F;AAEpH,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,uCAAuC;AAAA,EACxD;AACD;AAuBO,SAAS,sBAAsB,MAAM;AAC3C,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,IAA4B,IAAI;AAAA,2CAAkF;AAE1I,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM
,IAAI,MAAM,4CAA4C;AAAA,EAC7D;AACD;AAyCO,SAAS,oBAAoB,MAAM;AACzC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA,QAA8B,IAAI;AAAA,yCAAoH;AAE9K,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,0CAA0C;AAAA,EAC3D;AACD;AAMO,SAAS,yBAAyB;AACxC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,4CAAoM;AAE5N,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,6CAA6C;AAAA,EAC9D;AACD;AAMO,SAAS,0BAA0B;AACzC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,6CAAmN;AAE3O,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,8CAA8C;AAAA,EAC/D;AACD;AAMO,SAAS,wBAAwB;AACvC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,2CAA8G;AAEtI,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,4CAA4C;AAAA,EAC7D;AACD;AAMO,SAAS,wBAAwB;AACvC,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,2CAAyO;AAEjQ,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,4CAA4C;AAAA,EAC7D;AACD;AAMO,SAAS,gCAAgC;AAC/C,MAAI,cAAK;AACR,UAAM,QAAQ,IAAI,MAAM;AAAA;AAAA,mDAAsL;AAE9M,UAAM,OAAO;AAEb,UAAM;AAAA,EACP,OAAO;AACN,UAAM,IAAI,MAAM,oDAAoD;AAAA,EACrE;AACD;;;ACzeO,IAAM,sBAAsB,KAAK;AAEjC,IAAM,qBAAqB,KAAK;AAChC,IAAM,mBAAmB,KAAK;AAC9B,IAAM,sBAAsB,KAAK;AAGjC,IAAM,iBAAiB,KAAK;AAC5B,IAAM,mBAAmB,KAAK;AAC9B,IAAM,oBAAoB,KAAK;AAC/B,IAAM,wBAAwB,KAAK;AAGnC,IAAM,iBAAiB,KAAK;AAC5B,IAAM,oBAAoB,KAAK;AAG/B,IAAM,2BAA2B,KAAK;AACtC,IAAM,mBAAmB,KAAK;AAC9B,IAAM,sBAAsB,KAAK;AAEjC,IAAM,kBAAkB;AAExB,IAAM,uBAAuB;AAC7B,IAAM,gBAAgB;AACtB,IAAM,kBAAkB,CAAC;AAGzB,IAAM,kCAAkC,KAAK;AAC7C,IAAM,mBAAmB,KAAK;AAE9B,IAAM,gBAAgB,uBAAO;AAG7B,IAAM,WAAW,uBAAO,UAAU;;;AC/BzC,IAAI,OAAO;AACX,IAAI,SAAS;AAwFN,SAAS,gCAAgCC,MAAK;AACpD,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA,6CAA0FA,IAAG;AAAA,uDAA2F,MAAM,MAAM;AAAA,EAClN,OAAO;AACN,YAAQ,KAAK,sDAAsD;AAAA,EACpE;AACD;AAsCO,SAAS,mBAAmB,UAAU;AAC5C,MAAI,cAAK;AACR,YAAQ;AAAA,MACP;AAAA,IAAoC,WACjC,mHAAmH,QAAQ,KAC3H,wFAAwF;AAAA;AAAA,MAC3F;AAAA,MACA;AAAA,IACD;AAAA,EACD,OAAO;AACN,YAAQ,KAAK,yCAAyC;AAAA,EACvD;AACD;AAKO,SAAS,6BAA6B;AAC5C,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA;AAAA,kDAA4L,MAAM,MAAM;AAAA,EACtN,OAAO;AACN,YAAQ,KAAK,iDAAiD;AAAA,EAC/D;AACD;AAiBO,SAAS,2BAA2B;AAC1C,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA;AAAA,gDAA2I,MAAM,MAAM;AAAA,EACrK,OAAO;AACN,YAAQ,KAAK,+CAA+C;AAAA,EAC7D;AACD;AA+CO,SAAS,8BAA8B,UAAU;AACvD,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA,8HAAyK,QAAQ;AAAA,qDAA0F,MAAM,MAAM;AAAA,EACrS,OAAO;AACN,YAAQ,KAAK,oDAAoD;AAAA,EAClE;AACD;AAKO,SAAS,sBAAsB;AACrC,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA;AAAA,2CAAuI,MAAM,MAAM;AAAA,EACjK,OAAO;AACN,YAAQ,KAAK,0CAA0C;AAAA,EACxD;AACD;AAKO,SAAS,6BAA6B;AAC5C,MAAI,cAAK;AACR,YAAQ,KAAK;AAAA;AAAA,kDAA6L,MAAM,MAAM;AAAA,EACvN,OAAO;AACN,YAAQ,KAAK,iDAAiD;AAAA,EAC/D;AACD;;;AClPO,IAAI,YAAY;AAGhB,SAAS,cAAc,OAAO;AACpC,cAAY;AACb;AASO,IAAI;AAGJ,SAAS,iBAAiB,MAAM;AACtC,MAAI,SAAS,MAAM;AAClB,IAAE,mBAAmB;AACrB,UAAM;AAAA,EACP;AAEA,SAAQ,eAAe;AACxB;AAEO,SAAS,eAAe;AAC9B,SAAO,iBAAiB,iBAAiB,YAAY,CAAC;AACvD;AAyBO,SAAS,KAAK,QAAQ,GAAG;AAC/B,MAAI,WAAW;AACd,QAAI,IAAI;AACR,QAAI,OAAO;AAEX,WAAO,KAAK;AACX;AAAA,MAAoC,iBAAiB,IAAI;AAAA,IAC1D;AAEA,mBAAe;AAAA,EAChB;AACD;AAMO,SAAS,WAAW,SAAS,MAAM;AACzC,MAAI,QAAQ;AACZ,MAAI,OAAO;AAEX,SAAO,MAAM;AACZ,QAAI,KAAK,aAAa,cAAc;AACnC,UAAI;AAAA;AAAA,QAA+B,KAAM;AAAA;AAEzC,UAAI,SAAS,eAAe;AAC3B,YAAI,UAAU,EAAG,QAAO;AACxB,iBAAS;AAAA,MACV,WAAW,SAAS,mBAAmB,SAAS,sBAAsB;AACrE,iBAAS;AAAA,MACV;AAAA,IACD;AAEA,QAAIC;AAAA;AAAA,MAAoC,iBAAiB,IAAI;AAAA;AAC7D,QAAI,OAAQ,MAAK,OAAO;AACxB,WAAOA;AAAA,EACR;AACD;;;ACvGO,SAAS,OAAO,OAAO;AAC7B,SAAO,UAAU,KAAK;AACvB;AAOO,SAAS,eAAe,GAAG,GAAG;AACpC,SAAO,KAAK,IACT,KAAK,IACL,MAAM,KAAM,MAAM,QAAQ,OAAO,MAAM,YAAa,OAAO,MAAM;AACrE;AAYO,SAAS,YAAY,OAAO;AAClC,SAAO,CAAC,eAAe,OAAO,KAAK,CAAC;AACrC;;;AC7BO,IAAI,kBAAkB;AAEtB,IAAI,mBAAmB;AAEvB,IAAI,oBAAoB;;;ACSxB,IAAI,sBAAsB;AA0H1B
,SAAS,IAAIC,SAAQ,OAAO;AAClC,EAAAA,QAAO,QAAQ;AACf,YAAUA,QAAO,GAAG,KAAK;AAEzB,SAAOA;AACR;AAMO,SAAS,UAAU,OAAO,OAAO;AAEvC,UAAQ,iBAAiB,IAAI,KAAK;AAClC,SAAO;AACR;;;ACjJO,SAAS,UAAU,OAAO;AAChC,QAAM,QAAQ,IAAI,MAAM;AACxB,QAAMC,SAAQ,UAAU;AAExB,MAAIA,OAAM,WAAW,GAAG;AACvB,WAAO;AAAA,EACR;AAEA,EAAAA,OAAM,QAAQ,IAAI;AAElB,kBAAgB,OAAO,SAAS;AAAA,IAC/B,OAAOA,OAAM,KAAK,IAAI;AAAA,EACvB,CAAC;AAED,kBAAgB,OAAO,QAAQ;AAAA,IAC9B,OAAO;AAAA,EACR,CAAC;AAED;AAAA;AAAA,IAAiD;AAAA;AAClD;AAKO,SAAS,YAAY;AAE3B,QAAM,QAAQ,MAAM;AAEpB,QAAM,kBAAkB;AACxB,QAAMA,SAAQ,IAAI,MAAM,EAAE;AAE1B,QAAM,kBAAkB;AAExB,MAAI,CAACA,OAAO,QAAO,CAAC;AAEpB,QAAM,QAAQA,OAAM,MAAM,IAAI;AAC9B,QAAM,YAAY,CAAC;AAEnB,WAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACtC,UAAM,OAAO,MAAM,CAAC;AACpB,UAAM,aAAa,KAAK,WAAW,MAAM,GAAG;AAE5C,QAAI,KAAK,KAAK,MAAM,SAAS;AAC5B;AAAA,IACD;AAEA,QAAI,KAAK,SAAS,oBAAoB,GAAG;AACxC,aAAO,CAAC;AAAA,IACT;AAEA,QAAI,WAAW,SAAS,qBAAqB,KAAK,WAAW,SAAS,oBAAoB,GAAG;AAC5F;AAAA,IACD;AAEA,cAAU,KAAK,IAAI;AAAA,EACpB;AAEA,SAAO;AACR;;;ACtDO,IAAI,oBAAoB;AAGxB,SAAS,sBAAsB,SAAS;AAC9C,sBAAoB;AACrB;AAGO,IAAI,YAAY;AAGhB,SAAS,cAAcC,QAAO;AACpC,cAAYA;AACb;AAyCO,IAAI,iCAAiC;AAGrC,SAAS,mCAAmC,IAAI;AACtD,mCAAiC;AAClC;AAWO,SAAS,gBAAgB;AAC/B,QAAMC,OAAM,CAAC;AAEb,SAAO;AAAA,IACN,MAAM;AACL,UAAI,CAAC,WAAWA,IAAG,GAAG;AACrB,QAAE,gBAAgB;AAAA,MACnB;AAEA,aAAO,WAAWA,IAAG;AAAA,IACtB;AAAA,IACA,CAAC,YAAY,WAAWA,MAAK,OAAO;AAAA,EACrC;AACD;AAYO,SAAS,WAAWA,MAAK;AAC/B,QAAM,cAAc,wBAAwB,YAAY;AACxD,QAAM;AAAA;AAAA,IAA2B,YAAY,IAAIA,IAAG;AAAA;AACpD,SAAO;AACR;AAgBO,SAAS,WAAWA,MAAK,SAAS;AACxC,QAAM,cAAc,wBAAwB,YAAY;AAExD,MAAI,iBAAiB;AACpB,QAAIC;AAAA;AAAA,MAA+B,cAAe;AAAA;AAClD,QAAI,QACH,CAAC,oBACAA,SAAQ,mBAAmB;AAAA,IAE5B;AAAA,IAAmC,kBAAmB;AAEvD,QAAI,CAAC,OAAO;AACX,MAAE,uBAAuB;AAAA,IAC1B;AAAA,EACD;AAEA,cAAY,IAAID,MAAK,OAAO;AAC5B,SAAO;AACR;AASO,SAAS,WAAWA,MAAK;AAC/B,QAAM,cAAc,wBAAwB,YAAY;AACxD,SAAO,YAAY,IAAIA,IAAG;AAC3B;AAUO,SAAS,iBAAiB;AAChC,QAAM,cAAc,wBAAwB,gBAAgB;AAC5D;AAAA;AAAA,IAAyB;AAAA;AAC1B;AAQO,SAAS,KAAK,OAAO,QAAQ,OAAO,IAAI;AAC9C,sBAAoB;AAAA,IACnB,GAAG;AAAA,IACH,GAAG;AAAA,IACH,GAAG;AAAA,IACH,GAAG;AAAA,IACH,GAAG;AAAA,IACH,GAAG;AAAA,IACH,GAAG,oBAAoB,CAAC,QAAQ,EAAE,GAAG,MAAM,GAAG,MAAM,GAAG,CAAC,EAAE,IAAI;AAAA,EAC/D;AAEA,MAAI,cAAK;AAER,sBAAkB,WAAW;AAC7B,qCAAiC;AAAA,EAClC;AACD;AAOO,SAAS,IAAIE,YAAW;AAC9B,MAAI;AAAA;AAAA,IAA2C;AAAA;AAC/C,MAAI,UAAU,QAAQ;AAEtB,MAAI,YAAY,MAAM;AACrB,YAAQ,IAAI;AAEZ,aAAS,MAAM,SAAS;AACvB,yBAAmB,EAAE;AAAA,IACtB;AAAA,EACD;AAEA,MAAIA,eAAc,QAAW;AAC5B,YAAQ,IAAIA;AAAA,EACb;AAEA,UAAQ,IAAI;AAEZ,sBAAoB,QAAQ;AAE5B,MAAI,cAAK;AACR,qCAAiC,mBAAmB,YAAY;AAAA,EACjE;AAEA,SAAOA;AAAA,EAA+B,CAAC;AACxC;AAGO,SAAS,WAAW;AAC1B,SAAO,CAAC,oBAAqB,sBAAsB,QAAQ,kBAAkB,MAAM;AACpF;AAMA,SAAS,wBAAwB,MAAM;AACtC,MAAI,sBAAsB,MAAM;AAC/B,IAAE,4BAA4B,IAAI;AAAA,EACnC;AAEA,SAAQ,kBAAkB,MAAM,IAAI,IAAI,mBAAmB,iBAAiB,KAAK,MAAS;AAC3F;AAMA,SAAS,mBAAmBC,oBAAmB;AAC9C,MAAI,SAASA,mBAAkB;AAC/B,SAAO,WAAW,MAAM;AACvB,UAAM,cAAc,OAAO;AAC3B,QAAI,gBAAgB,MAAM;AACzB,aAAO;AAAA,IACR;AACA,aAAS,OAAO;AAAA,EACjB;AACA,SAAO;AACR;;;AC7PA,IAAI,cAAc,CAAC;AAEnB,SAAS,kBAAkB;AAC1B,MAAI,QAAQ;AACZ,gBAAc,CAAC;AACf,UAAQ,KAAK;AACd;AAKO,SAAS,iBAAiB,IAAI;AACpC,MAAI,YAAY,WAAW,KAAK,CAAC,kBAAkB;AAClD,QAAI,QAAQ;AACZ,mBAAe,MAAM;AASpB,UAAI,UAAU,YAAa,iBAAgB;AAAA,IAC5C,CAAC;AAAA,EACF;AAEA,cAAY,KAAK,EAAE;AACpB;AAKO,SAAS,cAAc;AAC7B,SAAO,YAAY,SAAS,GAAG;AAC9B,oBAAgB;AAAA,EACjB;AACD;;;AChCA,IAAM,cAAc,oBAAI,QAAQ;AAKzB,SAAS,aAAa,OAAO;AACnC,MAAIC,UAAS;AAGb,MAAIA,YAAW,MAAM;AACG,IAAC,gBAAiB,KAAK;AAC9C,WAAO;AAAA,EACR;AAEA,MAAI,gBAAO,iBAAiB,SAAS,CAAC,YAAY,IAAI,KAAK,GAAG;AAC7D,gBAAY,IAAI,OAAO,gBAAgB,OAAOA,OAAM,CAAC;AAAA,EACtD;AAEA,OAAKA,QAAO,IAAI,gBAAgB,GAAG;AAGlC,SAAKA,QAAO,IAAI,qBAAqB,G
AAG;AACvC,UAAI,gBAAO,CAACA,QAAO,UAAU,iBAAiB,OAAO;AACpD,0BAAkB,KAAK;AAAA,MACxB;AAEA,YAAM;AAAA,IACP;AAEwB,IAACA,QAAO,EAAG,MAAM,KAAK;AAAA,EAC/C,OAAO;AAEN,0BAAsB,OAAOA,OAAM;AAAA,EACpC;AACD;AAMO,SAAS,sBAAsB,OAAOA,SAAQ;AACpD,SAAOA,YAAW,MAAM;AACvB,SAAKA,QAAO,IAAI,qBAAqB,GAAG;AACvC,UAAI;AACqB,QAACA,QAAO,EAAG,MAAM,KAAK;AAC9C;AAAA,MACD,SAAS,GAAG;AACX,gBAAQ;AAAA,MACT;AAAA,IACD;AAEA,IAAAA,UAASA,QAAO;AAAA,EACjB;AAEA,MAAI,gBAAO,iBAAiB,OAAO;AAClC,sBAAkB,KAAK;AAAA,EACxB;AAEA,QAAM;AACP;AAOA,SAAS,gBAAgB,OAAOA,SAAQ;AACvC,QAAM,qBAAqB,eAAe,OAAO,SAAS;AAI1D,MAAI,sBAAsB,CAAC,mBAAmB,aAAc;AAE5D,MAAI,SAAS,aAAa,OAAO;AACjC,MAAI,kBAAkB;AAAA,EAAK,MAAM,MAAMA,QAAO,IAAI,QAAQ,WAAW;AACrE,MAAI,UAAUA,QAAO;AAErB,SAAO,YAAY,MAAM;AACxB,uBAAmB;AAAA,EAAK,MAAM,MAAM,QAAQ,WAAW,QAAQ,EAAE,MAAM,GAAG,EAAE,IAAI,CAAC;AACjF,cAAU,QAAQ;AAAA,EACnB;AAEA,SAAO;AAAA,IACN,SAAS,MAAM,UAAU;AAAA,EAAK,eAAe;AAAA;AAAA,IAC7C,OAAO,MAAM,OACV,MAAM,IAAI,EACX,OAAO,CAAC,SAAS,CAAC,KAAK,SAAS,qBAAqB,CAAC,EACtD,KAAK,IAAI;AAAA,EACZ;AACD;AAKA,SAAS,kBAAkB,OAAO;AACjC,QAAM,WAAW,YAAY,IAAI,KAAK;AAEtC,MAAI,UAAU;AACb,oBAAgB,OAAO,WAAW;AAAA,MACjC,OAAO,SAAS;AAAA,IACjB,CAAC;AAED,oBAAgB,OAAO,SAAS;AAAA,MAC/B,OAAO,SAAS;AAAA,IACjB,CAAC;AAAA,EACF;AACD;;;ACjEA,IAAM,UAAU,oBAAI,IAAI;AAGjB,IAAI,gBAAgB;AAOpB,IAAI,iBAAiB;AAQrB,IAAI,eAAe;AAI1B,IAAI,sBAAsB,CAAC;AAG3B,IAAI,wBAAwB;AAE5B,IAAI,cAAc;AACX,IAAI,mBAAmB;AAEvB,IAAM,QAAN,MAAM,OAAM;AAAA,EAClB,YAAY;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOZ,UAAU,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOlB,WAAW,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOnB,oBAAoB,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA,EAM5B,qBAAqB,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA,EAK7B,WAAW;AAAA;AAAA;AAAA;AAAA,EAKX,oBAAoB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOpB,YAAY;AAAA;AAAA;AAAA;AAAA;AAAA,EAMZ,iBAAiB,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA,EAMzB,uBAAuB,oBAAI,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAO/B,kBAAkB,oBAAI,IAAI;AAAA,EAE1B,UAAU;AAAA,EAEV,cAAc;AACb,WAAO,KAAK,WAAW,KAAK,oBAAoB;AAAA,EACjD;AAAA;AAAA;AAAA;AAAA;AAAA,EAMA,QAAQ,cAAc;AACrB,0BAAsB,CAAC;AAEvB,qBAAiB;AAEjB,SAAK,MAAM;AAGX,QAAI,SAAS;AAAA,MACZ,QAAQ;AAAA,MACR,QAAQ;AAAA,MACR,SAAS,CAAC;AAAA,MACV,gBAAgB,CAAC;AAAA,IAClB;AAEA,eAAW,QAAQ,cAAc;AAChC,WAAK,sBAAsB,MAAM,MAAM;AAAA,IAMxC;AAEA,QAAI,CAAC,KAAK,SAAS;AAClB,WAAK,SAAS;AAAA,IACf;AAEA,QAAI,KAAK,YAAY,GAAG;AACvB,WAAK,eAAe,OAAO,OAAO;AAClC,WAAK,eAAe,OAAO,cAAc;AAAA,IAC1C,OAAO;AAGN,uBAAiB;AACjB,sBAAgB;AAEhB,2BAAqB,OAAO,cAAc;AAC1C,2BAAqB,OAAO,OAAO;AAEnC,uBAAiB;AAEjB,WAAK,WAAW,QAAQ;AAAA,IACzB;AAEA,mBAAe;AAAA,EAChB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,sBAAsB,MAAM,QAAQ;AACnC,SAAK,KAAK;AAEV,QAAIC,UAAS,KAAK;AAElB,WAAOA,YAAW,MAAM;AACvB,UAAIC,SAAQD,QAAO;AACnB,UAAI,aAAaC,UAAS,gBAAgB,kBAAkB;AAC5D,UAAI,sBAAsB,cAAcA,SAAQ,WAAW;AAE3D,UAAI,OAAO,wBAAwBA,SAAQ,WAAW,KAAK,KAAK,gBAAgB,IAAID,OAAM;AAE1F,WAAKA,QAAO,IAAI,qBAAqB,KAAKA,QAAO,GAAG,WAAW,GAAG;AACjE,iBAAS;AAAA,UACR,QAAQ;AAAA,UACR,QAAAA;AAAA,UACA,SAAS,CAAC;AAAA,UACV,gBAAgB,CAAC;AAAA,QAClB;AAAA,MACD;AAEA,UAAI,CAAC,QAAQA,QAAO,OAAO,MAAM;AAChC,YAAI,WAAW;AACd,UAAAA,QAAO,KAAK;AAAA,QACb,YAAYC,SAAQ,YAAY,GAAG;AAClC,iBAAO,QAAQ,KAAKD,OAAM;AAAA,QAC3B,WAAW,oBAAoBC,UAAS,gBAAgB,qBAAqB,GAAG;AAC/E,iBAAO,eAAe,KAAKD,OAAM;AAAA,QAClC,WAAW,SAASA,OAAM,GAAG;AAC5B,eAAKA,QAAO,IAAI,kBAAkB,EAAG,MAAK,eAAe,IAAIA,OAAM;AACnE,wBAAcA,OAAM;AAAA,QACrB;AAEA,YAAIE,SAAQF,QAAO;AAEnB,YAAIE,WAAU,MAAM;AACnB,UAAAF,UAASE;AACT;AAAA,QACD;AAAA,MACD;AAEA,UAAI,SAASF,QAAO;AACpB,MAAAA,UAASA,QAAO;AAEhB,aAAOA,YAAW,QAAQ,WAAW,MAAM;AAC1C,YAAI,WAAW,OAAO,QAAQ;AAI7B,eAAK,eAAe,OAAO,OAAO;AAClC,eAAK,eAAe,OAAO,cAAc;AAEzC;AAAA,UAAsC,OAAO;AAAA,QAC9C;AAEA,QAAAA,UAAS,OAAO;AAChB,iBAAS,OAAO;AAAA,MACjB;AAAA,IACD;AAAA,EACD;AAAA;AAAA;AAAA;AAAA,EAKA
,eAAe,SAAS;AACvB,eAAW,KAAK,SAAS;AACxB,WAAK,EAAE,IAAI,WAAW,GAAG;AACxB,aAAK,eAAe,IAAI,CAAC;AAAA,MAC1B,YAAY,EAAE,IAAI,iBAAiB,GAAG;AACrC,aAAK,qBAAqB,IAAI,CAAC;AAAA,MAChC;AAIA,WAAK,cAAc,EAAE,IAAI;AAGzB,wBAAkB,GAAG,KAAK;AAAA,IAC3B;AAAA,EACD;AAAA;AAAA;AAAA;AAAA,EAKA,cAAc,MAAM;AACnB,QAAI,SAAS,KAAM;AAEnB,eAAW,OAAO,MAAM;AACvB,WAAK,IAAI,IAAI,aAAa,MAAM,IAAI,IAAI,gBAAgB,GAAG;AAC1D;AAAA,MACD;AAEA,UAAI,KAAK;AAET,WAAK;AAAA;AAAA,QAAsC,IAAK;AAAA,MAAI;AAAA,IACrD;AAAA,EACD;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,QAAQG,SAAQ,OAAO;AACtB,QAAI,CAAC,KAAK,SAAS,IAAIA,OAAM,GAAG;AAC/B,WAAK,SAAS,IAAIA,SAAQ,KAAK;AAAA,IAChC;AAGA,SAAKA,QAAO,IAAI,iBAAiB,GAAG;AACnC,WAAK,QAAQ,IAAIA,SAAQA,QAAO,CAAC;AACjC,oBAAc,IAAIA,SAAQA,QAAO,CAAC;AAAA,IACnC;AAAA,EACD;AAAA,EAEA,WAAW;AACV,oBAAgB;AAChB,SAAK,MAAM;AAAA,EACZ;AAAA,EAEA,aAAa;AAGZ,QAAI,kBAAkB,KAAM;AAE5B,oBAAgB;AAChB,mBAAe;AAAA,EAChB;AAAA,EAEA,QAAQ;AACP,SAAK,SAAS;AAEd,QAAI,oBAAoB,SAAS,GAAG;AACnC,oBAAc;AAEd,UAAI,kBAAkB,QAAQ,kBAAkB,MAAM;AAErD;AAAA,MACD;AAAA,IACD,WAAW,KAAK,aAAa,GAAG;AAC/B,WAAK,QAAQ,CAAC,CAAC;AAAA,IAChB;AAEA,SAAK,WAAW;AAAA,EACjB;AAAA,EAEA,UAAU;AACT,eAAW,MAAM,KAAK,mBAAoB,IAAG,IAAI;AACjD,SAAK,mBAAmB,MAAM;AAAA,EAC/B;AAAA,EAEA,WAAW;AACV,QAAI,KAAK,sBAAsB,GAAG;AAEjC,iBAAW,MAAM,KAAK,kBAAmB,IAAG;AAC5C,WAAK,kBAAkB,MAAM;AAAA,IAC9B;AAEA,QAAI,KAAK,aAAa,GAAG;AACxB,WAAK,QAAQ;AAAA,IACd;AAAA,EACD;AAAA,EAEA,UAAU;AAKT,QAAI,QAAQ,OAAO,GAAG;AACrB,WAAK,SAAS,MAAM;AAEpB,UAAI,wBAAwB;AAC5B,UAAI,aAAa;AAGjB,UAAI,eAAe;AAAA,QAClB,QAAQ;AAAA,QACR,QAAQ;AAAA,QACR,SAAS,CAAC;AAAA,QACV,gBAAgB,CAAC;AAAA,MAClB;AAEA,iBAAW,SAAS,SAAS;AAC5B,YAAI,UAAU,MAAM;AACnB,uBAAa;AACb;AAAA,QACD;AAGA,cAAM,UAAU,CAAC;AAEjB,mBAAW,CAACA,SAAQ,KAAK,KAAK,KAAK,SAAS;AAC3C,cAAI,MAAM,QAAQ,IAAIA,OAAM,GAAG;AAC9B,gBAAI,cAAc,UAAU,MAAM,QAAQ,IAAIA,OAAM,GAAG;AAEtD,oBAAM,QAAQ,IAAIA,SAAQ,KAAK;AAAA,YAChC,OAAO;AAGN;AAAA,YACD;AAAA,UACD;AAEA,kBAAQ,KAAKA,OAAM;AAAA,QACpB;AAEA,YAAI,QAAQ,WAAW,GAAG;AACzB;AAAA,QACD;AAGA,cAAM,SAAS,CAAC,GAAG,MAAM,QAAQ,KAAK,CAAC,EAAE,OAAO,CAAC,MAAM,CAAC,KAAK,QAAQ,IAAI,CAAC,CAAC;AAC3E,YAAI,OAAO,SAAS,GAAG;AAEtB,cAAI,2BAA2B;AAC/B,gCAAsB,CAAC;AAGvB,gBAAM,SAAS,oBAAI,IAAI;AAEvB,gBAAM,UAAU,oBAAI,IAAI;AACxB,qBAAWA,WAAU,SAAS;AAC7B,yBAAaA,SAAQ,QAAQ,QAAQ,OAAO;AAAA,UAC7C;AAEA,cAAI,oBAAoB,SAAS,GAAG;AACnC,4BAAgB;AAChB,kBAAM,MAAM;AAEZ,uBAAW,QAAQ,qBAAqB;AACvC,oBAAM,sBAAsB,MAAM,YAAY;AAAA,YAC/C;AAIA,kBAAM,WAAW;AAAA,UAClB;AAEA,gCAAsB;AAAA,QACvB;AAAA,MACD;AAEA,sBAAgB;AAChB,qBAAe;AAAA,IAChB;AAEA,SAAK,YAAY;AACjB,YAAQ,OAAO,IAAI;AAAA,EACpB;AAAA;AAAA;AAAA;AAAA;AAAA,EAMA,UAAU,UAAU;AACnB,SAAK,YAAY;AACjB,QAAI,SAAU,MAAK,qBAAqB;AAAA,EACzC;AAAA;AAAA;AAAA;AAAA;AAAA,EAMA,UAAU,UAAU;AACnB,SAAK,YAAY;AACjB,QAAI,SAAU,MAAK,qBAAqB;AAExC,SAAK,OAAO;AAAA,EACb;AAAA,EAEA,SAAS;AACR,eAAW,KAAK,KAAK,gBAAgB;AACpC,WAAK,qBAAqB,OAAO,CAAC;AAClC,wBAAkB,GAAG,KAAK;AAC1B,sBAAgB,CAAC;AAAA,IAClB;AAEA,eAAW,KAAK,KAAK,sBAAsB;AAC1C,wBAAkB,GAAG,WAAW;AAChC,sBAAgB,CAAC;AAAA,IAClB;AAEA,SAAK,MAAM;AAAA,EACZ;AAAA;AAAA,EAGA,SAAS,IAAI;AACZ,SAAK,kBAAkB,IAAI,EAAE;AAAA,EAC9B;AAAA;AAAA,EAGA,UAAU,IAAI;AACb,SAAK,mBAAmB,IAAI,EAAE;AAAA,EAC/B;AAAA,EAEA,UAAU;AACT,YAAQ,KAAK,cAAc,SAAS,GAAG;AAAA,EACxC;AAAA,EAEA,OAAO,SAAS;AACf,QAAI,kBAAkB,MAAM;AAC3B,YAAM,QAAS,gBAAgB,IAAI,OAAM;AACzC,cAAQ,IAAI,aAAa;AAEzB,UAAI,CAAC,kBAAkB;AACtB,eAAM,QAAQ,MAAM;AACnB,cAAI,kBAAkB,OAAO;AAE5B;AAAA,UACD;AAEA,gBAAM,MAAM;AAAA,QACb,CAAC;AAAA,MACF;AAAA,IACD;AAEA,WAAO;AAAA,EACR;AAAA;AAAA,EAGA,OAAO,QAAQ,MAAM;AACpB,qBAAiB,IAAI;AAAA,EACtB;AAAA,EAEA,QAAQ;AACP,QAAI,CAAC,mBAAoB,CAAC,KAAK,WAAW,QAAQ,SAAS,EAAI;AAI/D,mBAAe,IAAI,IAAI,KAAK,OAAO;AAGnC,eAAW,SAAS,SAAS;AAC5B,UAAI,UAAU,KAAM;AAEpB,iBAAW,CAACA,SAAQ,QAAQ,KAAK,MAAM,UAAU;AAChD,YAAI,CAAC,aAAa,IAAIA,
OAAM,GAAG;AAC9B,uBAAa,IAAIA,SAAQ,QAAQ;AAAA,QAClC;AAAA,MACD;AAAA,IACD;AAAA,EACD;AACD;AASO,SAAS,UAAU,IAAI;AAC7B,MAAI,oBAAoB;AACxB,qBAAmB;AAEnB,MAAI;AACH,QAAI;AAEJ,QAAI,IAAI;AACP,UAAI,kBAAkB,MAAM;AAC3B,sBAAc;AAAA,MACf;AAEA,eAAS,GAAG;AAAA,IACb;AAEA,WAAO,MAAM;AACZ,kBAAY;AAEZ,UAAI,oBAAoB,WAAW,GAAG;AACrC,uBAAe,MAAM;AAGrB,YAAI,oBAAoB,WAAW,GAAG;AAGrC,kCAAwB;AAExB;AAAA;AAAA,YAAyB;AAAA;AAAA,QAC1B;AAAA,MACD;AAEA,oBAAc;AAAA,IACf;AAAA,EACD,UAAE;AACD,uBAAmB;AAAA,EACpB;AACD;AAEA,SAAS,gBAAgB;AACxB,MAAI,sBAAsB;AAC1B,gBAAc;AAEd,MAAI,gBAAgB,eAAM,oBAAI,IAAI,IAAI;AAEtC,MAAI;AACH,QAAI,cAAc;AAClB,2BAAuB,IAAI;AAE3B,WAAO,oBAAoB,SAAS,GAAG;AACtC,UAAI,QAAQ,MAAM,OAAO;AAEzB,UAAI,gBAAgB,KAAM;AACzB,YAAI,cAAK;AACR,cAAI,UAAU,oBAAI,IAAI;AAEtB,qBAAWA,WAAU,MAAM,QAAQ,KAAK,GAAG;AAC1C,uBAAW,CAACC,QAAOC,OAAM,KAAKF,QAAO,WAAW,CAAC,GAAG;AACnD,kBAAI,QAAQ,QAAQ,IAAIC,MAAK;AAE7B,kBAAI,CAAC,OAAO;AACX,wBAAQ,EAAE,OAAOC,QAAO,OAAO,OAAO,EAAE;AACxC,wBAAQ,IAAID,QAAO,KAAK;AAAA,cACzB;AAEA,oBAAM,SAASC,QAAO;AAAA,YACvB;AAAA,UACD;AAEA,qBAAWA,WAAU,QAAQ,OAAO,GAAG;AACtC,gBAAIA,QAAO,OAAO;AAEjB,sBAAQ,MAAMA,QAAO,KAAK;AAAA,YAC3B;AAAA,UACD;AAAA,QACD;AAEA,4BAAoB;AAAA,MACrB;AAEA,YAAM,QAAQ,mBAAmB;AACjC,iBAAW,MAAM;AAEjB,UAAI,cAAK;AACR,mBAAWF,WAAU,MAAM,QAAQ,KAAK,GAAG;AACf,UAAC,cAAe,IAAIA,OAAM;AAAA,QACtD;AAAA,MACD;AAAA,IACD;AAAA,EACD,UAAE;AACD,kBAAc;AACd,2BAAuB,mBAAmB;AAE1C,4BAAwB;AAExB,QAAI,cAAK;AACR;AAAA,cAAWA;AAAA;AAAA,QAAsC;AAAA,QAAgB;AAChE,QAAAA,QAAO,UAAU;AAAA,MAClB;AAAA,IACD;AAAA,EACD;AACD;AAEA,SAAS,sBAAsB;AAC9B,MAAI;AACH,IAAE,6BAA6B;AAAA,EAChC,SAAS,OAAO;AACf,QAAI,cAAK;AAER,sBAAgB,OAAO,SAAS,EAAE,OAAO,GAAG,CAAC;AAAA,IAC9C;AAIA,0BAAsB,OAAO,qBAAqB;AAAA,EACnD;AACD;AAGO,IAAI,sBAAsB;AAMjC,SAAS,qBAAqB,SAAS;AACtC,MAAI,SAAS,QAAQ;AACrB,MAAI,WAAW,EAAG;AAElB,MAAI,IAAI;AAER,SAAO,IAAI,QAAQ;AAClB,QAAIH,UAAS,QAAQ,GAAG;AAExB,SAAKA,QAAO,KAAK,YAAY,YAAY,KAAK,SAASA,OAAM,GAAG;AAC/D,4BAAsB,oBAAI,IAAI;AAE9B,oBAAcA,OAAM;AAOpB,UAAIA,QAAO,SAAS,QAAQA,QAAO,UAAU,QAAQA,QAAO,UAAU,MAAM;AAG3E,YAAIA,QAAO,aAAa,QAAQA,QAAO,OAAO,MAAM;AAEnD,wBAAcA,OAAM;AAAA,QACrB,OAAO;AAEN,UAAAA,QAAO,KAAK;AAAA,QACb;AAAA,MACD;AAIA,UAAI,qBAAqB,OAAO,GAAG;AAClC,mBAAW,MAAM;AAEjB,mBAAW,KAAK,qBAAqB;AAEpC,eAAK,EAAE,KAAK,YAAY,YAAY,EAAG;AAIvC,gBAAM,kBAAkB,CAAC,CAAC;AAC1B,cAAI,WAAW,EAAE;AACjB,iBAAO,aAAa,MAAM;AACzB,gBAAI,oBAAoB,IAAI,QAAQ,GAAG;AACtC,kCAAoB,OAAO,QAAQ;AACnC,8BAAgB,KAAK,QAAQ;AAAA,YAC9B;AACA,uBAAW,SAAS;AAAA,UACrB;AAEA,mBAAS,IAAI,gBAAgB,SAAS,GAAG,KAAK,GAAG,KAAK;AACrD,kBAAMM,KAAI,gBAAgB,CAAC;AAE3B,iBAAKA,GAAE,KAAK,YAAY,YAAY,EAAG;AACvC,0BAAcA,EAAC;AAAA,UAChB;AAAA,QACD;AAEA,4BAAoB,MAAM;AAAA,MAC3B;AAAA,IACD;AAAA,EACD;AAEA,wBAAsB;AACvB;AAWA,SAAS,aAAa,OAAO,SAAS,QAAQ,SAAS;AACtD,MAAI,OAAO,IAAI,KAAK,EAAG;AACvB,SAAO,IAAI,KAAK;AAEhB,MAAI,MAAM,cAAc,MAAM;AAC7B,eAAW,YAAY,MAAM,WAAW;AACvC,YAAML,SAAQ,SAAS;AAEvB,WAAKA,SAAQ,aAAa,GAAG;AAC5B;AAAA;AAAA,UAAqC;AAAA,UAAW;AAAA,UAAS;AAAA,UAAQ;AAAA,QAAO;AAAA,MACzE,YACEA,UAAS,QAAQ,mBAAmB,MACpCA,SAAQ,WAAW,KACpB,WAAW,UAAU,SAAS,OAAO,GACpC;AACD,0BAAkB,UAAU,KAAK;AACjC;AAAA;AAAA,UAAuC;AAAA,QAAS;AAAA,MACjD;AAAA,IACD;AAAA,EACD;AACD;AASA,SAAS,mBAAmB,OAAO,SAAS;AAC3C,MAAI,MAAM,cAAc,KAAM;AAE9B,aAAW,YAAY,MAAM,WAAW;AACvC,UAAMA,SAAQ,SAAS;AAEvB,SAAKA,SAAQ,aAAa,GAAG;AAC5B;AAAA;AAAA,QAA2C;AAAA,QAAW;AAAA,MAAO;AAAA,IAC9D,YAAYA,SAAQ,kBAAkB,GAAG;AACxC,wBAAkB,UAAU,KAAK;AACjC,cAAQ;AAAA;AAAA,QAA2B;AAAA,MAAS;AAAA,IAC7C;AAAA,EACD;AACD;AAOA,SAAS,WAAW,UAAU,SAAS,SAAS;AAC/C,QAAM,UAAU,QAAQ,IAAI,QAAQ;AACpC,MAAI,YAAY,OAAW,QAAO;AAElC,MAAI,SAAS,SAAS,MAAM;AAC3B,eAAW,OAAO,SAAS,MAAM;AAChC,UAAI,QAAQ,SAAS,GAAG,GAAG;AAC1B,eAAO;AAAA,MACR;AAEA,WAAK,IAAI,IAAI,aAAa,KAAK;AAAA;AAAA,QAAmC;AAAA,QAAM;AAAA,QAAS;AAAA,MAAO,GAAG;AAC1F,gBAAQ;AAAA;AAA
A,UAA4B;AAAA,UAAM;AAAA,QAAI;AAC9C,eAAO;AAAA,MACR;AAAA,IACD;AAAA,EACD;AAEA,UAAQ,IAAI,UAAU,KAAK;AAE3B,SAAO;AACR;AAMO,SAAS,gBAAgB,QAAQ;AACvC,MAAID,UAAU,wBAAwB;AAEtC,SAAOA,QAAO,WAAW,MAAM;AAC9B,IAAAA,UAASA,QAAO;AAChB,QAAIC,SAAQD,QAAO;AAInB,QACC,eACAA,YAAW,kBACVC,SAAQ,kBAAkB,MAC1BA,SAAQ,iBAAiB,GACzB;AACD;AAAA,IACD;AAEA,SAAKA,UAAS,cAAc,oBAAoB,GAAG;AAClD,WAAKA,SAAQ,WAAW,EAAG;AAC3B,MAAAD,QAAO,KAAK;AAAA,IACb;AAAA,EACD;AAEA,sBAAoB,KAAKA,OAAM;AAChC;AAgFO,SAAS,KAAK,IAAI;AACxB,MAAI,CAAC,iBAAiB;AACrB,IAAE,4BAA4B,MAAM;AAAA,EACrC;AAEA,MAAI,kBAAkB,MAAM;AAC3B,IAAE,YAAY;AAAA,EACf;AAEA,MAAI,QAAQ,MAAM,OAAO;AACzB,QAAM,UAAU;AAChB,iBAAe,oBAAI,IAAI;AAEvB,MAAI,YAAY;AAChB,MAAIO,WAAU,MAAM,QAAQ;AAE5B,YAAU,EAAE;AAEZ,iBAAe;AAGf,WAAS,CAACC,SAAQ,KAAK,KAAK,MAAM,UAAU;AAC3C,IAAAA,QAAO,IAAI;AAAA,EACZ;AAEA,SAAO;AAAA,IACN,QAAQ,YAAY;AACnB,UAAI,WAAW;AACd,cAAMD;AACN;AAAA,MACD;AAEA,UAAI,CAAC,QAAQ,IAAI,KAAK,GAAG;AACxB,QAAE,eAAe;AAAA,MAClB;AAEA,kBAAY;AAEZ,YAAM,UAAU;AAGhB,eAAS,CAACC,SAAQC,MAAK,KAAK,MAAM,SAAS;AAC1C,QAAAD,QAAO,IAAIC;AAAA,MACZ;AAOA,gBAAU,MAAM;AAEf,YAAIC,iBAAgB,oBAAI,IAAI;AAE5B,iBAASF,WAAU,MAAM,QAAQ,KAAK,GAAG;AACxC,6BAAmBA,SAAQE,cAAa;AAAA,QACzC;AAEA,0BAAkBA,cAAa;AAC/B,4BAAoB;AAAA,MACrB,CAAC;AAED,YAAM,OAAO;AACb,YAAMH;AAAA,IACP;AAAA,IACA,SAAS,MAAM;AACd,UAAI,CAAC,aAAa,QAAQ,IAAI,KAAK,GAAG;AACrC,gBAAQ,OAAO,KAAK;AACpB,cAAM,QAAQ;AAAA,MACf;AAAA,IACD;AAAA,EACD;AACD;;;ACl8BO,SAAS,iBAAiB,OAAO;AACvC,MAAI,cAAc;AAClB,MAAI,UAAU,OAAO,CAAC;AAEtB,MAAI;AAEJ,MAAI,cAAK;AACR,QAAI,SAAS,0BAA0B;AAAA,EACxC;AAEA,SAAO,MAAM;AACZ,QAAI,gBAAgB,GAAG;AACtB,UAAI,OAAO;AAEX,oBAAc,MAAM;AACnB,YAAI,gBAAgB,GAAG;AACtB,iBAAO,QAAQ,MAAM,MAAM,MAAM,UAAU,OAAO,CAAC,CAAC;AAAA,QACrD;AAEA,uBAAe;AAEf,eAAO,MAAM;AACZ,2BAAiB,MAAM;AAItB,2BAAe;AAEf,gBAAI,gBAAgB,GAAG;AACtB,qBAAO;AACP,qBAAO;AAIP,wBAAU,OAAO;AAAA,YAClB;AAAA,UACD,CAAC;AAAA,QACF;AAAA,MACD,CAAC;AAAA,IACF;AAAA,EACD;AACD;;;AC5CA,IAAI,QAAQ,qBAAqB,mBAAmB;AAQ7C,SAAS,SAAS,MAAM,OAAO,UAAU;AAC/C,MAAI,SAAS,MAAM,OAAO,QAAQ;AACnC;AAEO,IAAM,WAAN,MAAe;AAAA;AAAA,EAErB;AAAA,EAEA,WAAW;AAAA;AAAA,EAGX;AAAA;AAAA,EAGA,gBAAgB,YAAY,eAAe;AAAA;AAAA,EAG3C;AAAA;AAAA,EAGA;AAAA;AAAA,EAGA;AAAA;AAAA,EAGA,eAAe;AAAA;AAAA,EAGf,kBAAkB;AAAA;AAAA,EAGlB,iBAAiB;AAAA;AAAA,EAGjB,sBAAsB;AAAA;AAAA,EAGtB,kBAAkB;AAAA,EAElB,uBAAuB;AAAA,EACvB,iBAAiB;AAAA,EAEjB,wBAAwB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASxB,kBAAkB;AAAA,EAElB,6BAA6B,iBAAiB,MAAM;AACnD,SAAK,kBAAkB,OAAO,KAAK,oBAAoB;AAEvD,QAAI,cAAK;AACR,UAAI,KAAK,iBAAiB,mBAAmB;AAAA,IAC9C;AAEA,WAAO,MAAM;AACZ,WAAK,kBAAkB;AAAA,IACxB;AAAA,EACD,CAAC;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOD,YAAY,MAAM,OAAO,UAAU;AAClC,SAAK,UAAU;AACf,SAAK,SAAS;AACd,SAAK,YAAY;AAEjB,SAAK;AAAA,IAAgC,cAAe;AAEpD,SAAK,WAAW,CAAC,CAAC,KAAK,OAAO;AAE9B,SAAK,UAAU,MAAM,MAAM;AACJ,MAAC,cAAe,IAAI;AAE1C,UAAI,WAAW;AACd,cAAMI,WAAU,KAAK;AACrB,qBAAa;AAEb,cAAM;AAAA;AAAA,UACmBA,SAAS,aAAa;AAAA,UACtBA,SAAS,SAAS;AAAA;AAE3C,YAAI,yBAAyB;AAC5B,eAAK,yBAAyB;AAAA,QAC/B,OAAO;AACN,eAAK,0BAA0B;AAAA,QAChC;AAAA,MACD,OAAO;AACN,YAAI,SAAS,KAAK,YAAY;AAE9B,YAAI;AACH,eAAK,eAAe,OAAO,MAAM,SAAS,MAAM,CAAC;AAAA,QAClD,SAAS,OAAO;AACf,eAAK,MAAM,KAAK;AAAA,QACjB;AAEA,YAAI,KAAK,iBAAiB,GAAG;AAC5B,eAAK,sBAAsB;AAAA,QAC5B,OAAO;AACN,eAAK,WAAW;AAAA,QACjB;AAAA,MACD;AAEA,aAAO,MAAM;AACZ,aAAK,iBAAiB,OAAO;AAAA,MAC9B;AAAA,IACD,GAAG,KAAK;AAER,QAAI,WAAW;AACd,WAAK,UAAU;AAAA,IAChB;AAAA,EACD;AAAA,EAEA,4BAA4B;AAC3B,QAAI;AACH,WAAK,eAAe,OAAO,MAAM,KAAK,UAAU,KAAK,OAAO,CAAC;AAAA,IAC9D,SAAS,OAAO;AACf,WAAK,MAAM,KAAK;AAAA,IACjB;AAIA,SAAK,WAAW;AAAA,EACjB;AAAA,EAEA,2BAA2B;AAC1B,UAAMC,WAAU,KAAK,OAAO;AAC5B,QAAI,CAACA,UAAS;AACb;AAAA,IACD;AACA,SAAK,kBAAkB,OAAO,MAAMA,SAAQ,KAAK,OAAO,CAAC;AAEzD,UAAM,QAAQ,MAAM;AACnB,UAAI,SAAS,KAAK,YAAY;
AAE9B,WAAK,eAAe,KAAK,KAAK,MAAM;AACnC,cAAM,OAAO;AACb,eAAO,OAAO,MAAM,KAAK,UAAU,MAAM,CAAC;AAAA,MAC3C,CAAC;AAED,UAAI,KAAK,iBAAiB,GAAG;AAC5B,aAAK,sBAAsB;AAAA,MAC5B,OAAO;AACN;AAAA;AAAA,UAAoC,KAAK;AAAA,UAAkB,MAAM;AAChE,iBAAK,kBAAkB;AAAA,UACxB;AAAA,QAAC;AAED,aAAK,WAAW;AAAA,MACjB;AAAA,IACD,CAAC;AAAA,EACF;AAAA,EAEA,cAAc;AACb,QAAI,SAAS,KAAK;AAElB,QAAI,KAAK,UAAU;AAClB,WAAK,kBAAkB,YAAY;AACnC,WAAK,QAAQ,OAAO,KAAK,eAAe;AAExC,eAAS,KAAK;AAAA,IACf;AAEA,WAAO;AAAA,EACR;AAAA;AAAA;AAAA;AAAA;AAAA,EAMA,aAAa;AACZ,WAAO,KAAK,YAAa,CAAC,CAAC,KAAK,UAAU,KAAK,OAAO,WAAW;AAAA,EAClE;AAAA,EAEA,sBAAsB;AACrB,WAAO,CAAC,CAAC,KAAK,OAAO;AAAA,EACtB;AAAA;AAAA;AAAA;AAAA,EAKA,KAAK,IAAI;AACR,QAAI,kBAAkB;AACtB,QAAI,oBAAoB;AACxB,QAAI,eAAe;AAEnB,sBAAkB,KAAK,OAAO;AAC9B,wBAAoB,KAAK,OAAO;AAChC,0BAAsB,KAAK,QAAQ,GAAG;AAEtC,QAAI;AACH,aAAO,GAAG;AAAA,IACX,SAAS,GAAG;AACX,mBAAa,CAAC;AACd,aAAO;AAAA,IACR,UAAE;AACD,wBAAkB,eAAe;AACjC,0BAAoB,iBAAiB;AACrC,4BAAsB,YAAY;AAAA,IACnC;AAAA,EACD;AAAA,EAEA,wBAAwB;AACvB,UAAMA;AAAA;AAAA,MAAiD,KAAK,OAAO;AAAA;AAEnE,QAAI,KAAK,iBAAiB,MAAM;AAC/B,WAAK,sBAAsB,SAAS,uBAAuB;AAC3D,WAAK,oBAAoB;AAAA;AAAA,QAAoC,KAAK;AAAA,MAAgB;AAClF,kBAAY,KAAK,cAAc,KAAK,mBAAmB;AAAA,IACxD;AAEA,QAAI,KAAK,oBAAoB,MAAM;AAClC,WAAK,kBAAkB,OAAO,MAAMA,SAAQ,KAAK,OAAO,CAAC;AAAA,IAC1D;AAAA,EACD;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOA,sBAAsB,GAAG;AACxB,QAAI,CAAC,KAAK,oBAAoB,GAAG;AAChC,UAAI,KAAK,QAAQ;AAChB,aAAK,OAAO,sBAAsB,CAAC;AAAA,MACpC;AAGA;AAAA,IACD;AAEA,SAAK,kBAAkB;AAEvB,QAAI,KAAK,mBAAmB,GAAG;AAC9B,WAAK,WAAW;AAEhB,UAAI,KAAK,iBAAiB;AACzB,qBAAa,KAAK,iBAAiB,MAAM;AACxC,eAAK,kBAAkB;AAAA,QACxB,CAAC;AAAA,MACF;AAEA,UAAI,KAAK,qBAAqB;AAC7B,aAAK,QAAQ,OAAO,KAAK,mBAAmB;AAC5C,aAAK,sBAAsB;AAAA,MAC5B;AAAA,IACD;AAAA,EACD;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,qBAAqB,GAAG;AACvB,SAAK,sBAAsB,CAAC;AAE5B,SAAK,wBAAwB;AAE7B,QAAI,KAAK,iBAAiB;AACzB,mBAAa,KAAK,iBAAiB,KAAK,oBAAoB;AAAA,IAC7D;AAAA,EACD;AAAA,EAEA,qBAAqB;AACpB,SAAK,2BAA2B;AAChC,WAAO;AAAA;AAAA,MAAmC,KAAK;AAAA,IAAgB;AAAA,EAChE;AAAA;AAAA,EAGA,MAAM,OAAO;AACZ,QAAI,UAAU,KAAK,OAAO;AAC1B,QAAI,SAAS,KAAK,OAAO;AAIzB,QAAI,KAAK,yBAA0B,CAAC,WAAW,CAAC,QAAS;AACxD,YAAM;AAAA,IACP;AAEA,QAAI,KAAK,cAAc;AACtB,qBAAe,KAAK,YAAY;AAChC,WAAK,eAAe;AAAA,IACrB;AAEA,QAAI,KAAK,iBAAiB;AACzB,qBAAe,KAAK,eAAe;AACnC,WAAK,kBAAkB;AAAA,IACxB;AAEA,QAAI,KAAK,gBAAgB;AACxB,qBAAe,KAAK,cAAc;AAClC,WAAK,iBAAiB;AAAA,IACvB;AAEA,QAAI,WAAW;AACd;AAAA;AAAA,QAA8C,KAAK;AAAA,MAAc;AACjE,WAAK;AACL,uBAAiB,WAAW,CAAC;AAAA,IAC9B;AAEA,QAAI,YAAY;AAChB,QAAI,mBAAmB;AAEvB,UAAMC,SAAQ,MAAM;AACnB,UAAI,WAAW;AACd,QAAE,2BAA2B;AAC7B;AAAA,MACD;AAEA,kBAAY;AAEZ,UAAI,kBAAkB;AACrB,QAAE,8BAA8B;AAAA,MACjC;AAGA,YAAM,OAAO;AAEb,WAAK,uBAAuB;AAE5B,UAAI,KAAK,mBAAmB,MAAM;AACjC,qBAAa,KAAK,gBAAgB,MAAM;AACvC,eAAK,iBAAiB;AAAA,QACvB,CAAC;AAAA,MACF;AAIA,WAAK,WAAW,KAAK,oBAAoB;AAEzC,WAAK,eAAe,KAAK,KAAK,MAAM;AACnC,aAAK,wBAAwB;AAC7B,eAAO,OAAO,MAAM,KAAK,UAAU,KAAK,OAAO,CAAC;AAAA,MACjD,CAAC;AAED,UAAI,KAAK,iBAAiB,GAAG;AAC5B,aAAK,sBAAsB;AAAA,MAC5B,OAAO;AACN,aAAK,WAAW;AAAA,MACjB;AAAA,IACD;AAEA,QAAI,oBAAoB;AAExB,QAAI;AACH,0BAAoB,IAAI;AACxB,yBAAmB;AACnB,gBAAU,OAAOA,MAAK;AACtB,yBAAmB;AAAA,IACpB,SAASC,QAAO;AACf,4BAAsBA,QAAO,KAAK,WAAW,KAAK,QAAQ,MAAM;AAAA,IACjE,UAAE;AACD,0BAAoB,iBAAiB;AAAA,IACtC;AAEA,QAAI,QAAQ;AACX,uBAAiB,MAAM;AACtB,aAAK,iBAAiB,KAAK,KAAK,MAAM;AACrC,gBAAM,OAAO;AACb,eAAK,wBAAwB;AAE7B,cAAI;AACH,mBAAO,OAAO,MAAM;AACnB;AAAA,gBACC,KAAK;AAAA,gBACL,MAAM;AAAA,gBACN,MAAMD;AAAA,cACP;AAAA,YACD,CAAC;AAAA,UACF,SAASC,QAAO;AACf;AAAA,cAAsBA;AAAA;AAAA,cAA8B,KAAK,QAAQ;AAAA,YAAO;AACxE,mBAAO;AAAA,UACR,UAAE;AACD,iBAAK,wBAAwB;AAAA,UAC9B;AAAA,QACD,CAAC;AAAA,MACF,CAAC;AAAA,IACF;AAAA,EACD;AACD;;;AC/YO,IAAM,wBAAwB,oBAAI,IAAI;AAgOtC,SAAS,wBAAwBC,
UAAS;AAChD,MAAI,UAAUA,SAAQ;AAEtB,MAAI,YAAY,MAAM;AACrB,IAAAA,SAAQ,UAAU;AAElB,aAAS,IAAI,GAAG,IAAI,QAAQ,QAAQ,KAAK,GAAG;AAC3C;AAAA;AAAA,QAAsC,QAAQ,CAAC;AAAA,MAAE;AAAA,IAClD;AAAA,EACD;AACD;AAOA,IAAI,QAAQ,CAAC;AAMb,SAAS,0BAA0BA,UAAS;AAC3C,MAAI,SAASA,SAAQ;AACrB,SAAO,WAAW,MAAM;AACvB,SAAK,OAAO,IAAI,aAAa,GAAG;AAG/B,cAAQ,OAAO,IAAI,eAAe;AAAA;AAAA,QAA2B;AAAA,UAAU;AAAA,IACxE;AACA,aAAS,OAAO;AAAA,EACjB;AACA,SAAO;AACR;AAOO,SAAS,gBAAgBA,UAAS;AACxC,MAAI;AACJ,MAAI,qBAAqB;AAEzB,oBAAkB,0BAA0BA,QAAO,CAAC;AAEpD,MAAI,cAAK;AACR,QAAI,qBAAqB;AACzB,sBAAkB,oBAAI,IAAI,CAAC;AAC3B,QAAI;AACH,UAAI,MAAM,SAASA,QAAO,GAAG;AAC5B,QAAE,wBAAwB;AAAA,MAC3B;AAEA,YAAM,KAAKA,QAAO;AAElB,MAAAA,SAAQ,KAAK,CAAC;AACd,8BAAwBA,QAAO;AAC/B,cAAQ,gBAAgBA,QAAO;AAAA,IAChC,UAAE;AACD,wBAAkB,kBAAkB;AACpC,wBAAkB,kBAAkB;AACpC,YAAM,IAAI;AAAA,IACX;AAAA,EACD,OAAO;AACN,QAAI;AACH,MAAAA,SAAQ,KAAK,CAAC;AACd,8BAAwBA,QAAO;AAC/B,cAAQ,gBAAgBA,QAAO;AAAA,IAChC,UAAE;AACD,wBAAkB,kBAAkB;AAAA,IACrC;AAAA,EACD;AAEA,SAAO;AACR;AAMO,SAAS,eAAeA,UAAS;AACvC,MAAI,QAAQ,gBAAgBA,QAAO;AAEnC,MAAI,CAACA,SAAQ,OAAO,KAAK,GAAG;AAK3B,QAAI,CAAC,eAAe,SAAS;AAC5B,MAAAA,SAAQ,IAAI;AAAA,IACb;AAEA,IAAAA,SAAQ,KAAK,wBAAwB;AAAA,EACtC;AAIA,MAAI,sBAAsB;AACzB;AAAA,EACD;AAIA,MAAI,iBAAiB,MAAM;AAG1B,QAAI,gBAAgB,KAAK,eAAe,SAAS;AAChD,mBAAa,IAAIA,UAAS,KAAK;AAAA,IAChC;AAAA,EACD,OAAO;AACN,QAAI,UAAUA,SAAQ,IAAI,eAAe,IAAI,cAAc;AAC3D,sBAAkBA,UAAS,MAAM;AAAA,EAClC;AACD;;;ACvVO,IAAI,gBAAgB,oBAAI,IAAI;AAG5B,IAAM,aAAa,oBAAI,IAAI;AAK3B,SAAS,kBAAkB,GAAG;AACpC,kBAAgB;AACjB;AAEA,IAAI,yBAAyB;AAEtB,SAAS,6BAA6B;AAC5C,2BAAyB;AAC1B;AASO,SAAS,OAAO,GAAGC,QAAO;AAEhC,MAAI,SAAS;AAAA,IACZ,GAAG;AAAA;AAAA,IACH;AAAA,IACA,WAAW;AAAA,IACX;AAAA,IACA,IAAI;AAAA,IACJ,IAAI;AAAA,EACL;AAEA,MAAI,gBAAO,mBAAmB;AAC7B,WAAO,UAAUA,UAAS,UAAU,YAAY;AAChD,WAAO,UAAU;AACjB,WAAO,oBAAoB;AAC3B,WAAO,QAAQ;AAAA,EAChB;AAEA,SAAO;AACR;AAQO,SAAS,MAAM,GAAGA,QAAO;AAC/B,QAAM,IAAI,OAAO,GAAGA,MAAK;AAEzB,sBAAoB,CAAC;AAErB,SAAO;AACR;AASO,SAAS,eAAe,eAAe,YAAY,OAAO,YAAY,MAAM;AAClF,QAAM,IAAI,OAAO,aAAa;AAC9B,MAAI,CAAC,WAAW;AACf,MAAE,SAAS;AAAA,EACZ;AAIA,MAAI,oBAAoB,aAAa,sBAAsB,QAAQ,kBAAkB,MAAM,MAAM;AAChG,KAAC,kBAAkB,EAAE,MAAM,CAAC,GAAG,KAAK,CAAC;AAAA,EACtC;AAEA,SAAO;AACR;AAsBO,SAAS,IAAIC,SAAQ,OAAO,eAAe,OAAO;AACxD,MACC,oBAAoB;AAAA;AAAA,GAGnB,CAAC,eAAe,gBAAgB,IAAI,kBAAkB,MACvD,SAAS,MACR,gBAAgB,KAAK,UAAU,eAAe,QAAQ,mBAAmB,KAC1E,CAAC,iBAAiB,SAASA,OAAM,GAChC;AACD,IAAE,sBAAsB;AAAA,EACzB;AAEA,MAAI,YAAY,eAAe,MAAM,KAAK,IAAI;AAE9C,MAAI,cAAK;AACR;AAAA,MAAU;AAAA;AAAA,MAAkCA,QAAO;AAAA,IAAM;AAAA,EAC1D;AAEA,SAAO,aAAaA,SAAQ,SAAS;AACtC;AAQO,SAAS,aAAaA,SAAQ,OAAO;AAC3C,MAAI,CAACA,QAAO,OAAO,KAAK,GAAG;AAC1B,QAAI,YAAYA,QAAO;AAEvB,QAAI,sBAAsB;AACzB,iBAAW,IAAIA,SAAQ,KAAK;AAAA,IAC7B,OAAO;AACN,iBAAW,IAAIA,SAAQ,SAAS;AAAA,IACjC;AAEA,IAAAA,QAAO,IAAI;AAEX,QAAI,QAAQ,MAAM,OAAO;AACzB,UAAM,QAAQA,SAAQ,SAAS;AAE/B,QAAI,cAAK;AACR,UAAI,qBAAqB,kBAAkB,MAAM;AAChD,QAAAA,QAAO,YAAY,oBAAI,IAAI;AAI3B,cAAM,SAASA,QAAO,QAAQ,IAAI,EAAE,GAAG,SAAS,KAAK;AACrD,QAAAA,QAAO,QAAQ,IAAI,IAAI,EAAE;AAAA;AAAA,UAA2B;AAAA,WAAO,MAAM,CAAC;AAElE,YAAI,qBAAqB,QAAQ,GAAG;AACnC,gBAAM,QAAQ,UAAU,YAAY;AAEpC,cAAI,UAAU,MAAM;AACnB,gBAAI,QAAQA,QAAO,QAAQ,IAAI,MAAM,KAAK;AAE1C,gBAAI,CAAC,OAAO;AACX,sBAAQ,EAAE,OAAO,OAAO,EAAE;AAC1B,cAAAA,QAAO,QAAQ,IAAI,MAAM,OAAO,KAAK;AAAA,YACtC;AAEA,kBAAM;AAAA,UACP;AAAA,QACD;AAAA,MACD;AAEA,UAAI,kBAAkB,MAAM;AAC3B,QAAAA,QAAO,oBAAoB;AAAA,MAC5B;AAAA,IACD;AAEA,SAAKA,QAAO,IAAI,aAAa,GAAG;AAE/B,WAAKA,QAAO,IAAI,WAAW,GAAG;AAC7B;AAAA;AAAA,UAAwCA;AAAA,QAAO;AAAA,MAChD;AAEA,wBAAkBA,UAASA,QAAO,IAAI,eAAe,IAAI,QAAQ,WAAW;AAAA,IAC7E;AAEA,IAAAA,QAAO,KAAK,wBAAwB;AAIpC,mBAAeA,SAAQ,KAAK;AAM5B,QACC,SAAS,KACT,kBAAkB,SACjB,cAAc,IAAI,WAAW,MAC7B,cAAc,KAAK,gBAAgB,kBAA
kB,GACrD;AACD,UAAI,qBAAqB,MAAM;AAC9B,6BAAqB,CAACA,OAAM,CAAC;AAAA,MAC9B,OAAO;AACN,yBAAiB,KAAKA,OAAM;AAAA,MAC7B;AAAA,IACD;AAEA,QAAI,CAAC,MAAM,WAAW,cAAc,OAAO,KAAK,CAAC,wBAAwB;AACxE,0BAAoB;AAAA,IACrB;AAAA,EACD;AAEA,SAAO;AACR;AAEO,SAAS,sBAAsB;AACrC,2BAAyB;AACzB,MAAI,0BAA0B;AAC9B,yBAAuB,IAAI;AAE3B,QAAM,WAAW,MAAM,KAAK,aAAa;AAEzC,MAAI;AACH,eAAWC,WAAU,UAAU;AAG9B,WAAKA,QAAO,IAAI,WAAW,GAAG;AAC7B,0BAAkBA,SAAQ,WAAW;AAAA,MACtC;AAEA,UAAI,SAASA,OAAM,GAAG;AACrB,sBAAcA,OAAM;AAAA,MACrB;AAAA,IACD;AAAA,EACD,UAAE;AACD,2BAAuB,uBAAuB;AAAA,EAC/C;AAEA,gBAAc,MAAM;AACrB;AAmCO,SAAS,UAAUC,SAAQ;AACjC,MAAIA,SAAQA,QAAO,IAAI,CAAC;AACzB;AAOA,SAAS,eAAe,QAAQ,QAAQ;AACvC,MAAI,YAAY,OAAO;AACvB,MAAI,cAAc,KAAM;AAExB,MAAI,QAAQ,SAAS;AACrB,MAAI,SAAS,UAAU;AAEvB,WAAS,IAAI,GAAG,IAAI,QAAQ,KAAK;AAChC,QAAI,WAAW,UAAU,CAAC;AAC1B,QAAIC,SAAQ,SAAS;AAGrB,QAAI,CAAC,SAAS,aAAa,cAAe;AAG1C,QAAI,iBAAQA,SAAQ,kBAAkB,GAAG;AACxC,oBAAc,IAAI,QAAQ;AAC1B;AAAA,IACD;AAEA,QAAI,aAAaA,SAAQ,WAAW;AAGpC,QAAI,WAAW;AACd,wBAAkB,UAAU,MAAM;AAAA,IACnC;AAEA,SAAKA,SAAQ,aAAa,GAAG;AAC5B,UAAIC;AAAA;AAAA,QAAkC;AAAA;AAEtC,oBAAc,OAAOA,QAAO;AAE5B,WAAKD,SAAQ,gBAAgB,GAAG;AAE/B,YAAIA,SAAQ,WAAW;AACtB,mBAAS,KAAK;AAAA,QACf;AAEA,uBAAeC,UAAS,WAAW;AAAA,MACpC;AAAA,IACD,WAAW,WAAW;AACrB,WAAKD,SAAQ,kBAAkB,KAAK,wBAAwB,MAAM;AACjE,4BAAoB;AAAA;AAAA,UAA2B;AAAA,QAAS;AAAA,MACzD;AAEA;AAAA;AAAA,QAAuC;AAAA,MAAS;AAAA,IACjD;AAAA,EACD;AACD;;;ACvVA,IAAM,4BAA4B;AAO3B,SAAS,MAAM,OAAO;AAE5B,MAAI,OAAO,UAAU,YAAY,UAAU,QAAQ,gBAAgB,OAAO;AACzE,WAAO;AAAA,EACR;AAEA,QAAM,YAAY,iBAAiB,KAAK;AAExC,MAAI,cAAc,oBAAoB,cAAc,iBAAiB;AACpE,WAAO;AAAA,EACR;AAGA,MAAI,UAAU,oBAAI,IAAI;AACtB,MAAI,mBAAmB,SAAS,KAAK;AACrC,MAAI,UAAU,MAAO,CAAC;AAEtB,MAAIE,SAAQ,gBAAO,oBAAoB,UAAU,YAAY,IAAI;AACjE,MAAI,iBAAiB;AAOrB,MAAI,cAAc,CAAC,OAAO;AACzB,QAAI,mBAAmB,gBAAgB;AACtC,aAAO,GAAG;AAAA,IACX;AAIA,QAAI,WAAW;AACf,QAAIC,WAAU;AAEd,wBAAoB,IAAI;AACxB,uBAAmB,cAAc;AAEjC,QAAI,SAAS,GAAG;AAEhB,wBAAoB,QAAQ;AAC5B,uBAAmBA,QAAO;AAE1B,WAAO;AAAA,EACR;AAEA,MAAI,kBAAkB;AAGrB,YAAQ,IAAI,UAAU;AAAA;AAAA,MAA6B,MAAO;AAAA,MAAQD;AAAA,IAAK,CAAC;AACxE,QAAI,cAAK;AACR;AAAA,MAA4B;AAAA;AAAA,QAAwC;AAAA,MAAM;AAAA,IAC3E;AAAA,EACD;AAGA,MAAI,OAAO;AACX,MAAI,WAAW;AAEf,WAAS,YAAY,UAAU;AAC9B,QAAI,SAAU;AACd,eAAW;AACX,WAAO;AAEP,QAAI,SAAS,GAAG,IAAI,UAAU;AAG9B,eAAW,CAACE,OAAMC,OAAM,KAAK,SAAS;AACrC,UAAIA,SAAQ,UAAU,MAAMD,KAAI,CAAC;AAAA,IAClC;AACA,eAAW;AAAA,EACZ;AAEA,SAAO,IAAI;AAAA;AAAA,IAA0B;AAAA,IAAQ;AAAA,MAC5C,eAAe,GAAGA,OAAM,YAAY;AACnC,YACC,EAAE,WAAW,eACb,WAAW,iBAAiB,SAC5B,WAAW,eAAe,SAC1B,WAAW,aAAa,OACvB;AAKD,UAAE,wBAAwB;AAAA,QAC3B;AACA,YAAI,IAAI,QAAQ,IAAIA,KAAI;AACxB,YAAI,MAAM,QAAW;AACpB,cAAI,YAAY,MAAM;AACrB,gBAAIE,KAAI,MAAO,WAAW,OAAOJ,MAAK;AACtC,oBAAQ,IAAIE,OAAME,EAAC;AACnB,gBAAI,gBAAO,OAAOF,UAAS,UAAU;AACpC,kBAAIE,IAAG,UAAU,MAAMF,KAAI,CAAC;AAAA,YAC7B;AACA,mBAAOE;AAAA,UACR,CAAC;AAAA,QACF,OAAO;AACN,cAAI,GAAG,WAAW,OAAO,IAAI;AAAA,QAC9B;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,eAAe,QAAQF,OAAM;AAC5B,YAAI,IAAI,QAAQ,IAAIA,KAAI;AAExB,YAAI,MAAM,QAAW;AACpB,cAAIA,SAAQ,QAAQ;AACnB,kBAAME,KAAI,YAAY,MAAM,MAAO,eAAeJ,MAAK,CAAC;AACxD,oBAAQ,IAAIE,OAAME,EAAC;AACnB,sBAAU,OAAO;AAEjB,gBAAI,cAAK;AACR,kBAAIA,IAAG,UAAU,MAAMF,KAAI,CAAC;AAAA,YAC7B;AAAA,UACD;AAAA,QACD,OAAO;AACN,cAAI,GAAG,aAAa;AACpB,oBAAU,OAAO;AAAA,QAClB;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,IAAI,QAAQA,OAAM,UAAU;AAC3B,YAAIA,UAAS,cAAc;AAC1B,iBAAO;AAAA,QACR;AAEA,YAAI,gBAAOA,UAAS,mBAAmB;AACtC,iBAAO;AAAA,QACR;AAEA,YAAI,IAAI,QAAQ,IAAIA,KAAI;AACxB,YAAI,SAASA,SAAQ;AAGrB,YAAI,MAAM,WAAc,CAAC,UAAU,eAAe,QAAQA,KAAI,GAAG,WAAW;AAC3E,cAAI,YAAY,MAAM;AACrB,gBAAI,IAAI,MAAM,SAAS,OAAOA,KAAI,IAAI,aAAa;AACnD,gBAAIE,KAAI,MAAO,GAAGJ,MAAK;AAEvB,gBAAI,cAAK;AACR,kBAAII,IAAG,UAAU,MAAMF,KAAI,CAAC;AAAA,YAC7B;AA
EA,mBAAOE;AAAA,UACR,CAAC;AAED,kBAAQ,IAAIF,OAAM,CAAC;AAAA,QACpB;AAEA,YAAI,MAAM,QAAW;AACpB,cAAI,IAAI,IAAI,CAAC;AACb,iBAAO,MAAM,gBAAgB,SAAY;AAAA,QAC1C;AAEA,eAAO,QAAQ,IAAI,QAAQA,OAAM,QAAQ;AAAA,MAC1C;AAAA,MAEA,yBAAyB,QAAQA,OAAM;AACtC,YAAI,aAAa,QAAQ,yBAAyB,QAAQA,KAAI;AAE9D,YAAI,cAAc,WAAW,YAAY;AACxC,cAAI,IAAI,QAAQ,IAAIA,KAAI;AACxB,cAAI,EAAG,YAAW,QAAQ,IAAI,CAAC;AAAA,QAChC,WAAW,eAAe,QAAW;AACpC,cAAIC,UAAS,QAAQ,IAAID,KAAI;AAC7B,cAAIG,SAAQF,SAAQ;AAEpB,cAAIA,YAAW,UAAaE,WAAU,eAAe;AACpD,mBAAO;AAAA,cACN,YAAY;AAAA,cACZ,cAAc;AAAA,cACd,OAAAA;AAAA,cACA,UAAU;AAAA,YACX;AAAA,UACD;AAAA,QACD;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,IAAI,QAAQH,OAAM;AACjB,YAAIA,UAAS,cAAc;AAC1B,iBAAO;AAAA,QACR;AAEA,YAAI,IAAI,QAAQ,IAAIA,KAAI;AACxB,YAAI,MAAO,MAAM,UAAa,EAAE,MAAM,iBAAkB,QAAQ,IAAI,QAAQA,KAAI;AAEhF,YACC,MAAM,UACL,kBAAkB,SAAS,CAAC,OAAO,eAAe,QAAQA,KAAI,GAAG,WACjE;AACD,cAAI,MAAM,QAAW;AACpB,gBAAI,YAAY,MAAM;AACrB,kBAAI,IAAI,MAAM,MAAM,OAAOA,KAAI,CAAC,IAAI;AACpC,kBAAIE,KAAI,MAAO,GAAGJ,MAAK;AAEvB,kBAAI,cAAK;AACR,oBAAII,IAAG,UAAU,MAAMF,KAAI,CAAC;AAAA,cAC7B;AAEA,qBAAOE;AAAA,YACR,CAAC;AAED,oBAAQ,IAAIF,OAAM,CAAC;AAAA,UACpB;AAEA,cAAIG,SAAQ,IAAI,CAAC;AACjB,cAAIA,WAAU,eAAe;AAC5B,mBAAO;AAAA,UACR;AAAA,QACD;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,IAAI,QAAQH,OAAMG,QAAO,UAAU;AAClC,YAAI,IAAI,QAAQ,IAAIH,KAAI;AACxB,YAAI,MAAMA,SAAQ;AAGlB,YAAI,oBAAoBA,UAAS,UAAU;AAC1C,mBAAS,IAAIG,QAAO;AAAA,UAAmC,EAAG,GAAG,KAAK,GAAG;AACpE,gBAAI,UAAU,QAAQ,IAAI,IAAI,EAAE;AAChC,gBAAI,YAAY,QAAW;AAC1B,kBAAI,SAAS,aAAa;AAAA,YAC3B,WAAW,KAAK,QAAQ;AAIvB,wBAAU,YAAY,MAAM,MAAO,eAAeL,MAAK,CAAC;AACxD,sBAAQ,IAAI,IAAI,IAAI,OAAO;AAE3B,kBAAI,cAAK;AACR,oBAAI,SAAS,UAAU,MAAM,CAAC,CAAC;AAAA,cAChC;AAAA,YACD;AAAA,UACD;AAAA,QACD;AAMA,YAAI,MAAM,QAAW;AACpB,cAAI,CAAC,OAAO,eAAe,QAAQE,KAAI,GAAG,UAAU;AACnD,gBAAI,YAAY,MAAM,MAAO,QAAWF,MAAK,CAAC;AAE9C,gBAAI,cAAK;AACR,kBAAI,GAAG,UAAU,MAAME,KAAI,CAAC;AAAA,YAC7B;AACA,gBAAI,GAAG,MAAMG,MAAK,CAAC;AAEnB,oBAAQ,IAAIH,OAAM,CAAC;AAAA,UACpB;AAAA,QACD,OAAO;AACN,gBAAM,EAAE,MAAM;AAEd,cAAI,IAAI,YAAY,MAAM,MAAMG,MAAK,CAAC;AACtC,cAAI,GAAG,CAAC;AAAA,QACT;AAEA,YAAI,aAAa,QAAQ,yBAAyB,QAAQH,KAAI;AAG9D,YAAI,YAAY,KAAK;AACpB,qBAAW,IAAI,KAAK,UAAUG,MAAK;AAAA,QACpC;AAEA,YAAI,CAAC,KAAK;AAKT,cAAI,oBAAoB,OAAOH,UAAS,UAAU;AACjD,gBAAI;AAAA;AAAA,cAAoC,QAAQ,IAAI,QAAQ;AAAA;AAC5D,gBAAI,IAAI,OAAOA,KAAI;AAEnB,gBAAI,OAAO,UAAU,CAAC,KAAK,KAAK,GAAG,GAAG;AACrC,kBAAI,IAAI,IAAI,CAAC;AAAA,YACd;AAAA,UACD;AAEA,oBAAU,OAAO;AAAA,QAClB;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,QAAQ,QAAQ;AACf,YAAI,OAAO;AAEX,YAAI,WAAW,QAAQ,QAAQ,MAAM,EAAE,OAAO,CAACI,SAAQ;AACtD,cAAIH,UAAS,QAAQ,IAAIG,IAAG;AAC5B,iBAAOH,YAAW,UAAaA,QAAO,MAAM;AAAA,QAC7C,CAAC;AAED,iBAAS,CAACG,MAAKH,OAAM,KAAK,SAAS;AAClC,cAAIA,QAAO,MAAM,iBAAiB,EAAEG,QAAO,SAAS;AACnD,qBAAS,KAAKA,IAAG;AAAA,UAClB;AAAA,QACD;AAEA,eAAO;AAAA,MACR;AAAA,MAEA,iBAAiB;AAChB,QAAE,sBAAsB;AAAA,MACzB;AAAA,IACD;AAAA,EAAC;AACF;AAMA,SAAS,UAAU,MAAMJ,OAAM;AAC9B,MAAI,OAAOA,UAAS,SAAU,QAAO,GAAG,IAAI,WAAWA,MAAK,eAAe,EAAE;AAC7E,MAAI,0BAA0B,KAAKA,KAAI,EAAG,QAAO,GAAG,IAAI,IAAIA,KAAI;AAChE,SAAO,QAAQ,KAAKA,KAAI,IAAI,GAAG,IAAI,IAAIA,KAAI,MAAM,GAAG,IAAI,KAAKA,KAAI;AAClE;AAKO,SAAS,kBAAkB,OAAO;AACxC,MAAI;AACH,QAAI,UAAU,QAAQ,OAAO,UAAU,YAAY,gBAAgB,OAAO;AACzE,aAAO,MAAM,YAAY;AAAA,IAC1B;AAAA,EACD,QAAQ;AAAA,EAQR;AAEA,SAAO;AACR;AAUA,IAAM,yBAAyB,oBAAI,IAAI;AAAA,EACtC;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACD,CAAC;AAOD,SAAS,kBAAkB,OAAO;AACjC,SAAO,IAAI,MAAM,OAAO;AAAA,IACvB,IAAI,QAAQK,OAAM,UAAU;AAC3B,UAAI,QAAQ,QAAQ,IAAI,QAAQA,OAAM,QAAQ;AAC9C,UAAI,CAAC,uBAAuB;AAAA;AAAA,QAA2BA;AAAA,MAAK,GAAG;AAC9D,eAAO;AAAA,MACR;AAMA,aAAO,YAAa,MAAM;AACzB,mCAA2B;AAC3B,YAAI,SAAS,MAAM,MAAM,MAAM,IAAI;AACnC,4BAAoB;AACpB,eAAO;A
AAA,MACR;AAAA,IACD;AAAA,EACD,CAAC;AACF;;;AC5aO,SAAS,gCAAgC;AAC/C,QAAMC,mBAAkB,MAAM;AAI9B,QAAM,UAAU,MAAM;AACtB,MAAI,SAAS;AACZ,YAAQ;AAAA,EACT;AAEA,QAAM,EAAE,SAAS,aAAa,SAAS,IAAIA;AAE3C,EAAAA,iBAAgB,UAAU,SAAU,MAAM,YAAY;AACrD,UAAMC,SAAQ,QAAQ,KAAK,MAAM,MAAM,UAAU;AAEjD,QAAIA,WAAU,IAAI;AACjB,eAAS,IAAI,cAAc,GAAG,IAAI,KAAK,QAAQ,KAAK,GAAG;AACtD,YAAI,kBAAkB,KAAK,CAAC,CAAC,MAAM,MAAM;AACxC,UAAE,8BAA8B,oBAAoB;AACpD;AAAA,QACD;AAAA,MACD;AAAA,IACD;AAEA,WAAOA;AAAA,EACR;AAEA,EAAAD,iBAAgB,cAAc,SAAU,MAAM,YAAY;AAGzD,UAAMC,SAAQ,YAAY,KAAK,MAAM,MAAM,cAAc,KAAK,SAAS,CAAC;AAExE,QAAIA,WAAU,IAAI;AACjB,eAAS,IAAI,GAAG,MAAM,cAAc,KAAK,SAAS,IAAI,KAAK,GAAG;AAC7D,YAAI,kBAAkB,KAAK,CAAC,CAAC,MAAM,MAAM;AACxC,UAAE,8BAA8B,wBAAwB;AACxD;AAAA,QACD;AAAA,MACD;AAAA,IACD;AAEA,WAAOA;AAAA,EACR;AAEA,EAAAD,iBAAgB,WAAW,SAAU,MAAM,YAAY;AACtD,UAAM,MAAM,SAAS,KAAK,MAAM,MAAM,UAAU;AAEhD,QAAI,CAAC,KAAK;AACT,eAAS,IAAI,GAAG,IAAI,KAAK,QAAQ,KAAK,GAAG;AACxC,YAAI,kBAAkB,KAAK,CAAC,CAAC,MAAM,MAAM;AACxC,UAAE,8BAA8B,qBAAqB;AACrD;AAAA,QACD;AAAA,MACD;AAAA,IACD;AAEA,WAAO;AAAA,EACR;AAGA,QAAM,mBAAmB,MAAM;AAC9B,IAAAA,iBAAgB,UAAU;AAC1B,IAAAA,iBAAgB,cAAc;AAC9B,IAAAA,iBAAgB,WAAW;AAAA,EAC5B;AACD;;;ACxDO,IAAI;AAGJ,IAAI;AAGJ,IAAI;AAGX,IAAI;AAEJ,IAAI;AAMG,SAAS,kBAAkB;AACjC,MAAI,YAAY,QAAW;AAC1B;AAAA,EACD;AAEA,YAAU;AACV,cAAY;AACZ,eAAa,UAAU,KAAK,UAAU,SAAS;AAE/C,MAAI,oBAAoB,QAAQ;AAChC,MAAI,iBAAiB,KAAK;AAC1B,MAAI,iBAAiB,KAAK;AAG1B,uBAAqB,eAAe,gBAAgB,YAAY,EAAE;AAElE,wBAAsB,eAAe,gBAAgB,aAAa,EAAE;AAEpE,MAAI,cAAc,iBAAiB,GAAG;AAGrC,sBAAkB,UAAU;AAE5B,sBAAkB,cAAc;AAEhC,sBAAkB,eAAe;AAEjC,sBAAkB,UAAU;AAE5B,sBAAkB,MAAM;AAAA,EACzB;AAEA,MAAI,cAAc,cAAc,GAAG;AAElC,mBAAe,MAAM;AAAA,EACtB;AAEA,MAAI,cAAK;AAER,sBAAkB,gBAAgB;AAElC,kCAA8B;AAAA,EAC/B;AACD;AAMO,SAAS,YAAY,QAAQ,IAAI;AACvC,SAAO,SAAS,eAAe,KAAK;AACrC;AAOO,SAAS,gBAAgB,MAAM;AACrC;AAAA;AAAA,IAA2C,mBAAmB,KAAK,IAAI;AAAA;AACxE;AAOO,SAAS,iBAAiB,MAAM;AACtC;AAAA;AAAA,IAA2C,oBAAoB,KAAK,IAAI;AAAA;AACzE;AAwGO,SAAS,mBAAmB,MAAM;AACxC,OAAK,cAAc;AACpB;;;ACvKO,SAAS,yBAAyB,IAAI;AAC5C,MAAI,oBAAoB;AACxB,MAAI,kBAAkB;AACtB,sBAAoB,IAAI;AACxB,oBAAkB,IAAI;AACtB,MAAI;AACH,WAAO,GAAG;AAAA,EACX,UAAE;AACD,wBAAoB,iBAAiB;AACrC,sBAAkB,eAAe;AAAA,EAClC;AACD;;;ACEO,SAAS,gBAAgB,MAAM;AACrC,MAAI,kBAAkB,MAAM;AAC3B,QAAI,oBAAoB,MAAM;AAC7B,MAAE,cAAc,IAAI;AAAA,IACrB;AAEA,IAAE,0BAA0B;AAAA,EAC7B;AAEA,MAAI,sBAAsB;AACzB,IAAE,mBAAmB,IAAI;AAAA,EAC1B;AACD;AAMA,SAAS,YAAYE,SAAQ,eAAe;AAC3C,MAAI,cAAc,cAAc;AAChC,MAAI,gBAAgB,MAAM;AACzB,kBAAc,OAAO,cAAc,QAAQA;AAAA,EAC5C,OAAO;AACN,gBAAY,OAAOA;AACnB,IAAAA,QAAO,OAAO;AACd,kBAAc,OAAOA;AAAA,EACtB;AACD;AAQA,SAAS,cAAc,MAAM,IAAI,MAAM;AACtC,MAAI,SAAS;AAEb,MAAI,cAAK;AAER,WAAO,WAAW,SAAS,OAAO,IAAI,kBAAkB,GAAG;AAC1D,eAAS,OAAO;AAAA,IACjB;AAAA,EACD;AAEA,MAAI,WAAW,SAAS,OAAO,IAAI,WAAW,GAAG;AAChD,YAAQ;AAAA,EACT;AAGA,MAAIA,UAAS;AAAA,IACZ,KAAK;AAAA,IACL,MAAM;AAAA,IACN,OAAO;AAAA,IACP,GAAG,OAAO,QAAQ;AAAA,IAClB,OAAO;AAAA,IACP;AAAA,IACA,MAAM;AAAA,IACN,MAAM;AAAA,IACN;AAAA,IACA,GAAG,UAAU,OAAO;AAAA,IACpB,MAAM;AAAA,IACN,UAAU;AAAA,IACV,IAAI;AAAA,IACJ,IAAI;AAAA,EACL;AAEA,MAAI,cAAK;AACR,IAAAA,QAAO,qBAAqB;AAAA,EAC7B;AAEA,MAAI,MAAM;AACT,QAAI;AACH,oBAAcA,OAAM;AACpB,MAAAA,QAAO,KAAK;AAAA,IACb,SAASC,IAAG;AACX,qBAAeD,OAAM;AACrB,YAAMC;AAAA,IACP;AAAA,EACD,WAAW,OAAO,MAAM;AACvB,oBAAgBD,OAAM;AAAA,EACvB;AAGA,MAAI,IAAIA;AAKR,MACC,QACA,EAAE,SAAS,QACX,EAAE,aAAa,QACf,EAAE,UAAU,QACZ,EAAE,UAAU,EAAE;AAAA,GACb,EAAE,IAAI,sBAAsB,GAC5B;AACD,QAAI,EAAE;AACN,SAAK,OAAO,kBAAkB,MAAM,OAAO,wBAAwB,KAAK,MAAM,MAAM;AACnF,QAAE,KAAK;AAAA,IACR;AAAA,EACD;AAEA,MAAI,MAAM,MAAM;AACf,MAAE,SAAS;AAEX,QAAI,WAAW,MAAM;AACpB,kBAAY,GAAG,MAAM;AAAA,IACtB;AAGA,QACC,oBAAoB,SACnB,gBAAgB,IAAI,aAAa,MACjC,OAAO,iBAAiB,GACxB;AACD,UAAIE;AAAA
;AAAA,QAAkC;AAAA;AACtC,OAACA,SAAQ,YAAY,CAAC,GAAG,KAAK,CAAC;AAAA,IAChC;AAAA,EACD;AAEA,SAAOF;AACR;AAMO,SAAS,kBAAkB;AACjC,SAAO,oBAAoB,QAAQ,CAAC;AACrC;AAKO,SAAS,SAAS,IAAI;AAC5B,QAAMA,UAAS,cAAc,eAAe,MAAM,KAAK;AACvD,oBAAkBA,SAAQ,KAAK;AAC/B,EAAAA,QAAO,WAAW;AAClB,SAAOA;AACR;AAMO,SAAS,YAAY,IAAI;AAC/B,kBAAgB,SAAS;AAEzB,MAAI,cAAK;AACR,oBAAgB,IAAI,QAAQ;AAAA,MAC3B,OAAO;AAAA,IACR,CAAC;AAAA,EACF;AAIA,MAAIG;AAAA;AAAA,IAA+B,cAAe;AAAA;AAClD,MAAI,QAAQ,CAAC,oBAAoBA,SAAQ,mBAAmB,MAAMA,SAAQ,gBAAgB;AAE1F,MAAI,OAAO;AAEV,QAAI;AAAA;AAAA,MAA2C;AAAA;AAC/C,KAAC,QAAQ,MAAM,CAAC,GAAG,KAAK,EAAE;AAAA,EAC3B,OAAO;AAEN,WAAO,mBAAmB,EAAE;AAAA,EAC7B;AACD;AAKO,SAAS,mBAAmB,IAAI;AACtC,SAAO,cAAc,SAAS,aAAa,IAAI,KAAK;AACrD;AA2BO,SAAS,YAAY,IAAI;AAC/B,QAAM,OAAO;AACb,QAAMC,UAAS,cAAc,cAAc,kBAAkB,IAAI,IAAI;AAErE,SAAO,MAAM;AACZ,mBAAeA,OAAM;AAAA,EACtB;AACD;AAOO,SAAS,eAAe,IAAI;AAClC,QAAM,OAAO;AACb,QAAMA,UAAS,cAAc,cAAc,kBAAkB,IAAI,IAAI;AAErE,SAAO,CAAC,UAAU,CAAC,MAAM;AACxB,WAAO,IAAI,QAAQ,CAAC,WAAW;AAC9B,UAAI,QAAQ,OAAO;AAClB,qBAAaA,SAAQ,MAAM;AAC1B,yBAAeA,OAAM;AACrB,iBAAO,MAAS;AAAA,QACjB,CAAC;AAAA,MACF,OAAO;AACN,uBAAeA,OAAM;AACrB,eAAO,MAAS;AAAA,MACjB;AAAA,IACD,CAAC;AAAA,EACF;AACD;AAwEO,SAAS,cAAc,IAAIC,SAAQ,GAAG;AAC5C,SAAO,cAAc,gBAAgBA,QAAO,IAAI,IAAI;AACrD;AAqCO,SAAS,MAAM,IAAIC,SAAQ,GAAG;AACpC,MAAIC,UAAS,cAAc,eAAeD,QAAO,IAAI,IAAI;AACzD,MAAI,cAAK;AACR,IAAAC,QAAO,YAAY;AAAA,EACpB;AACA,SAAOA;AACR;AAiBO,SAAS,OAAO,IAAI;AAC1B,SAAO,cAAc,gBAAgB,kBAAkB,IAAI,IAAI;AAChE;AAKO,SAAS,wBAAwBC,SAAQ;AAC/C,MAAIC,YAAWD,QAAO;AACtB,MAAIC,cAAa,MAAM;AACtB,UAAM,+BAA+B;AACrC,UAAM,oBAAoB;AAC1B,6BAAyB,IAAI;AAC7B,wBAAoB,IAAI;AACxB,QAAI;AACH,MAAAA,UAAS,KAAK,IAAI;AAAA,IACnB,UAAE;AACD,+BAAyB,4BAA4B;AACrD,0BAAoB,iBAAiB;AAAA,IACtC;AAAA,EACD;AACD;AAOO,SAAS,wBAAwB,QAAQ,aAAa,OAAO;AACnE,MAAID,UAAS,OAAO;AACpB,SAAO,QAAQ,OAAO,OAAO;AAE7B,SAAOA,YAAW,MAAM;AACvB,UAAM,aAAaA,QAAO;AAE1B,QAAI,eAAe,MAAM;AACxB,+BAAyB,MAAM;AAC9B,mBAAW,MAAM,cAAc;AAAA,MAChC,CAAC;AAAA,IACF;AAEA,QAAIE,QAAOF,QAAO;AAElB,SAAKA,QAAO,IAAI,iBAAiB,GAAG;AAEnC,MAAAA,QAAO,SAAS;AAAA,IACjB,OAAO;AACN,qBAAeA,SAAQ,UAAU;AAAA,IAClC;AAEA,IAAAA,UAASE;AAAA,EACV;AACD;AAMO,SAAS,8BAA8B,QAAQ;AACrD,MAAIF,UAAS,OAAO;AAEpB,SAAOA,YAAW,MAAM;AACvB,QAAIE,QAAOF,QAAO;AAClB,SAAKA,QAAO,IAAI,mBAAmB,GAAG;AACrC,qBAAeA,OAAM;AAAA,IACtB;AACA,IAAAA,UAASE;AAAA,EACV;AACD;AAOO,SAAS,eAAeF,SAAQ,aAAa,MAAM;AACzD,MAAI,UAAU;AAEd,OACE,eAAeA,QAAO,IAAI,iBAAiB,MAC5CA,QAAO,UAAU,QACjBA,QAAO,MAAM,QAAQ,MACpB;AACD;AAAA,MAAkBA,QAAO,MAAM;AAAA;AAAA,MAAoCA,QAAO,MAAM;AAAA,IAAI;AACpF,cAAU;AAAA,EACX;AAEA,0BAAwBA,SAAQ,cAAc,CAAC,OAAO;AACtD,mBAAiBA,SAAQ,CAAC;AAC1B,oBAAkBA,SAAQ,SAAS;AAEnC,MAAI,cAAcA,QAAO,SAASA,QAAO,MAAM;AAE/C,MAAI,gBAAgB,MAAM;AACzB,eAAWG,eAAc,aAAa;AACrC,MAAAA,YAAW,KAAK;AAAA,IACjB;AAAA,EACD;AAEA,0BAAwBH,OAAM;AAE9B,MAAI,SAASA,QAAO;AAGpB,MAAI,WAAW,QAAQ,OAAO,UAAU,MAAM;AAC7C,kBAAcA,OAAM;AAAA,EACrB;AAEA,MAAI,cAAK;AACR,IAAAA,QAAO,qBAAqB;AAAA,EAC7B;AAIA,EAAAA,QAAO,OACNA,QAAO,OACPA,QAAO,WACPA,QAAO,MACPA,QAAO,OACPA,QAAO,KACPA,QAAO,QACPA,QAAO,KACN;AACH;AAOO,SAAS,kBAAkB,MAAM,KAAK;AAC5C,SAAO,SAAS,MAAM;AAErB,QAAIE,QAAO,SAAS,MAAM,OAAO,iBAAiB,IAAI;AAEtD,SAAK,OAAO;AACZ,WAAOA;AAAA,EACR;AACD;AAOO,SAAS,cAAcF,SAAQ;AACrC,MAAI,SAASA,QAAO;AACpB,MAAI,OAAOA,QAAO;AAClB,MAAIE,QAAOF,QAAO;AAElB,MAAI,SAAS,KAAM,MAAK,OAAOE;AAC/B,MAAIA,UAAS,KAAM,CAAAA,MAAK,OAAO;AAE/B,MAAI,WAAW,MAAM;AACpB,QAAI,OAAO,UAAUF,QAAQ,QAAO,QAAQE;AAC5C,QAAI,OAAO,SAASF,QAAQ,QAAO,OAAO;AAAA,EAC3C;AACD;AAYO,SAAS,aAAaA,SAAQ,UAAU,UAAU,MAAM;AAE9D,MAAI,cAAc,CAAC;AAEnB,iBAAeA,SAAQ,aAAa,IAAI;AAExC,MAAI,KAAK,MAAM;AACd,QAAI,QAAS,gBAAeA,OAAM;AAClC,QAAI,SAAU,UAAS;AAAA,EACxB;AAEA,MAAI,YAAY,YAAY;AAC5B,MAAI,YAAY,GAAG;AAClB,QAAI,QAAQ,MAAM,EAAE,aAAa,GAAG;
AACpC,aAASG,eAAc,aAAa;AACnC,MAAAA,YAAW,IAAI,KAAK;AAAA,IACrB;AAAA,EACD,OAAO;AACN,OAAG;AAAA,EACJ;AACD;AAOA,SAAS,eAAeH,SAAQ,aAAa,OAAO;AACnD,OAAKA,QAAO,IAAI,WAAW,EAAG;AAC9B,EAAAA,QAAO,KAAK;AAEZ,MAAI,IAAIA,QAAO,SAASA,QAAO,MAAM;AAErC,MAAI,MAAM,MAAM;AACf,eAAWG,eAAc,GAAG;AAC3B,UAAIA,YAAW,aAAa,OAAO;AAClC,oBAAY,KAAKA,WAAU;AAAA,MAC5B;AAAA,IACD;AAAA,EACD;AAEA,MAAIC,SAAQJ,QAAO;AAEnB,SAAOI,WAAU,MAAM;AACtB,QAAIC,WAAUD,OAAM;AACpB,QAAI,eACFA,OAAM,IAAI,wBAAwB;AAAA;AAAA;AAAA,KAIjCA,OAAM,IAAI,mBAAmB,MAAMJ,QAAO,IAAI,kBAAkB;AAInE,mBAAeI,QAAO,aAAa,cAAc,QAAQ,KAAK;AAC9D,IAAAA,SAAQC;AAAA,EACT;AACD;AA2DO,SAAS,YAAYC,SAAQ,UAAU;AAC7C,MAAI,CAACA,QAAO,MAAO;AAGnB,MAAI,OAAOA,QAAO,MAAM;AACxB,MAAI,MAAMA,QAAO,MAAM;AAEvB,SAAO,SAAS,MAAM;AAErB,QAAIC,QAAO,SAAS,MAAM,OAAO,iBAAiB,IAAI;AAEtD,aAAS,OAAO,IAAI;AACpB,WAAOA;AAAA,EACR;AACD;;;ACpsBO,IAAI,mBAAmB;;;ACmDvB,IAAI,qBAAqB;AAGzB,SAAS,uBAAuB,OAAO;AAC7C,uBAAqB;AACtB;AAEO,IAAI,uBAAuB;AAG3B,SAAS,yBAAyB,OAAO;AAC/C,yBAAuB;AACxB;AAGO,IAAI,kBAAkB;AAEtB,IAAI,aAAa;AAGjB,SAAS,oBAAoB,UAAU;AAC7C,oBAAkB;AACnB;AAGO,IAAI,gBAAgB;AAGpB,SAAS,kBAAkBC,SAAQ;AACzC,kBAAgBA;AACjB;AAOO,IAAI,kBAAkB;AAGtB,SAAS,oBAAoB,OAAO;AAC1C,MAAI,oBAAoB,SAAS,CAAC,oBAAoB,gBAAgB,IAAI,aAAa,IAAI;AAC1F,QAAI,oBAAoB,MAAM;AAC7B,wBAAkB,CAAC,KAAK;AAAA,IACzB,OAAO;AACN,sBAAgB,KAAK,KAAK;AAAA,IAC3B;AAAA,EACD;AACD;AAQA,IAAI,WAAW;AAEf,IAAI,eAAe;AAOZ,IAAI,mBAAmB;AAGvB,SAAS,qBAAqB,OAAO;AAC3C,qBAAmB;AACpB;AAMO,IAAI,gBAAgB;AAG3B,IAAI,eAAe;AAEZ,IAAI,iBAAiB;AAGrB,SAAS,mBAAmB,OAAO;AACzC,mBAAiB;AAClB;AAEO,SAAS,0BAA0B;AACzC,SAAO,EAAE;AACV;AAQO,SAAS,SAAS,UAAU;AAClC,MAAIC,SAAQ,SAAS;AAErB,OAAKA,SAAQ,WAAW,GAAG;AAC1B,WAAO;AAAA,EACR;AAEA,MAAIA,SAAQ,SAAS;AACpB,aAAS,KAAK,CAAC;AAAA,EAChB;AAEA,OAAKA,SAAQ,iBAAiB,GAAG;AAChC,QAAI,eAAe,SAAS;AAE5B,QAAI,iBAAiB,MAAM;AAC1B,UAAI,SAAS,aAAa;AAE1B,eAAS,IAAI,GAAG,IAAI,QAAQ,KAAK;AAChC,YAAI,aAAa,aAAa,CAAC;AAE/B,YAAI;AAAA;AAAA,UAAiC;AAAA,QAAW,GAAG;AAClD;AAAA;AAAA,YAAuC;AAAA,UAAW;AAAA,QACnD;AAEA,YAAI,WAAW,KAAK,SAAS,IAAI;AAChC,iBAAO;AAAA,QACR;AAAA,MACD;AAAA,IACD;AAEA,SACEA,SAAQ,eAAe;AAAA;AAAA,IAGxB,iBAAiB,MAChB;AACD,wBAAkB,UAAU,KAAK;AAAA,IAClC;AAAA,EACD;AAEA,SAAO;AACR;AAOA,SAAS,2CAA2C,QAAQD,SAAQ,OAAO,MAAM;AAChF,MAAI,YAAY,OAAO;AACvB,MAAI,cAAc,KAAM;AAExB,MAAI,CAAC,mBAAmB,iBAAiB,SAAS,MAAM,GAAG;AAC1D;AAAA,EACD;AAEA,WAAS,IAAI,GAAG,IAAI,UAAU,QAAQ,KAAK;AAC1C,QAAI,WAAW,UAAU,CAAC;AAE1B,SAAK,SAAS,IAAI,aAAa,GAAG;AACjC;AAAA;AAAA,QAAmE;AAAA,QAAWA;AAAA,QAAQ;AAAA,MAAK;AAAA,IAC5F,WAAWA,YAAW,UAAU;AAC/B,UAAI,MAAM;AACT,0BAAkB,UAAU,KAAK;AAAA,MAClC,YAAY,SAAS,IAAI,WAAW,GAAG;AACtC,0BAAkB,UAAU,WAAW;AAAA,MACxC;AACA;AAAA;AAAA,QAAuC;AAAA,MAAS;AAAA,IACjD;AAAA,EACD;AACD;AAGO,SAAS,gBAAgB,UAAU;AACzC,MAAI,gBAAgB;AACpB,MAAI,wBAAwB;AAC5B,MAAI,4BAA4B;AAChC,MAAI,oBAAoB;AACxB,MAAI,mBAAmB;AACvB,MAAI,6BAA6B;AACjC,MAAI,sBAAsB;AAC1B,MAAI,0BAA0B;AAE9B,MAAIC,SAAQ,SAAS;AAErB;AAAA,EAA0C;AAC1C,iBAAe;AACf,qBAAmB;AACnB,qBAAmBA,UAAS,gBAAgB,kBAAkB,IAAI,WAAW;AAE7E,oBAAkB;AAClB,wBAAsB,SAAS,GAAG;AAClC,eAAa;AACb,mBAAiB,EAAE;AAEnB,MAAI,SAAS,OAAO,MAAM;AACzB,6BAAyB,MAAM;AACC,MAAC,SAAS,GAAI,MAAM,cAAc;AAAA,IAClE,CAAC;AAED,aAAS,KAAK;AAAA,EACf;AAEA,MAAI;AACH,aAAS,KAAK;AACd,QAAI;AAAA;AAAA,MAA8B,SAAS;AAAA;AAC3C,QAAI,SAAS,GAAG;AAChB,QAAI,OAAO,SAAS;AAEpB,QAAI,aAAa,MAAM;AACtB,UAAI;AAEJ,uBAAiB,UAAU,YAAY;AAEvC,UAAI,SAAS,QAAQ,eAAe,GAAG;AACtC,aAAK,SAAS,eAAe,SAAS;AACtC,aAAK,IAAI,GAAG,IAAI,SAAS,QAAQ,KAAK;AACrC,eAAK,eAAe,CAAC,IAAI,SAAS,CAAC;AAAA,QACpC;AAAA,MACD,OAAO;AACN,iBAAS,OAAO,OAAO;AAAA,MACxB;AAEA,UAAI,gBAAgB,MAAM,SAAS,IAAI,eAAe,GAAG;AACxD,aAAK,IAAI,cAAc,IAAI,KAAK,QAAQ,KAAK;AAC5C,WAAC,KAAK,CAAC,EAAE,cAAc,CAAC,GAAG,KAAK,QAAQ;AAAA,QACzC;AAAA,MACD;AAAA,IACD,WAAW,SAAS,QAAQ,eAAe,KAAK,QAAQ;AACvD,uBA
AiB,UAAU,YAAY;AACvC,WAAK,SAAS;AAAA,IACf;AAKA,QACC,SAAS,KACT,qBAAqB,QACrB,CAAC,cACD,SAAS,SACR,SAAS,KAAK,UAAU,cAAc,YAAY,GAClD;AACD,WAAK,IAAI,GAAG;AAAA,MAA6B,iBAAkB,QAAQ,KAAK;AACvE;AAAA,UACC,iBAAiB,CAAC;AAAA;AAAA,UACK;AAAA,QACxB;AAAA,MACD;AAAA,IACD;AAMA,QAAI,sBAAsB,QAAQ,sBAAsB,UAAU;AACjE;AAEA,UAAI,qBAAqB,MAAM;AAC9B,YAAI,8BAA8B,MAAM;AACvC,sCAA4B;AAAA,QAC7B,OAAO;AACN,oCAA0B,KAAK;AAAA,UAA4B,gBAAiB;AAAA,QAC7E;AAAA,MACD;AAAA,IACD;AAEA,SAAK,SAAS,IAAI,iBAAiB,GAAG;AACrC,eAAS,KAAK;AAAA,IACf;AAEA,WAAO;AAAA,EACR,SAAS,OAAO;AACf,WAAO,aAAa,KAAK;AAAA,EAC1B,UAAE;AACD,aAAS,KAAK;AACd,eAAW;AACX,mBAAe;AACf,uBAAmB;AACnB,sBAAkB;AAClB,sBAAkB;AAClB,0BAAsB,0BAA0B;AAChD,iBAAa;AACb,qBAAiB;AAAA,EAClB;AACD;AAQA,SAAS,gBAAgB,QAAQ,YAAY;AAC5C,MAAI,YAAY,WAAW;AAC3B,MAAI,cAAc,MAAM;AACvB,QAAIC,SAAQ,SAAS,KAAK,WAAW,MAAM;AAC3C,QAAIA,WAAU,IAAI;AACjB,UAAI,aAAa,UAAU,SAAS;AACpC,UAAI,eAAe,GAAG;AACrB,oBAAY,WAAW,YAAY;AAAA,MACpC,OAAO;AAEN,kBAAUA,MAAK,IAAI,UAAU,UAAU;AACvC,kBAAU,IAAI;AAAA,MACf;AAAA,IACD;AAAA,EACD;AAIA,MACC,cAAc,SACb,WAAW,IAAI,aAAa;AAAA;AAAA;AAAA,GAI5B,aAAa,QAAQ,CAAC,SAAS,SAAS,UAAU,IAClD;AACD,sBAAkB,YAAY,WAAW;AAGzC,SAAK,WAAW,IAAI,eAAe,GAAG;AACrC,iBAAW,KAAK;AAChB,iBAAW,KAAK,CAAC;AAAA,IAClB;AAEA;AAAA;AAAA,MAAiD;AAAA,IAAW;AAC5D;AAAA;AAAA,MAA0C;AAAA,MAAa;AAAA,IAAC;AAAA,EACzD;AACD;AAOO,SAAS,iBAAiB,QAAQ,aAAa;AACrD,MAAI,eAAe,OAAO;AAC1B,MAAI,iBAAiB,KAAM;AAE3B,WAAS,IAAI,aAAa,IAAI,aAAa,QAAQ,KAAK;AACvD,oBAAgB,QAAQ,aAAa,CAAC,CAAC;AAAA,EACxC;AACD;AAMO,SAAS,cAAcF,SAAQ;AACrC,MAAIC,SAAQD,QAAO;AAEnB,OAAKC,SAAQ,eAAe,GAAG;AAC9B;AAAA,EACD;AAEA,oBAAkBD,SAAQ,KAAK;AAE/B,MAAI,kBAAkB;AACtB,MAAI,sBAAsB;AAE1B,kBAAgBA;AAChB,uBAAqB;AAErB,MAAI,cAAK;AACR,QAAI,wBAAwB;AAC5B,uCAAmCA,QAAO,kBAAkB;AAC5D,QAAI;AAAA;AAAA,MAAqC;AAAA;AAEzC,kBAAcA,QAAO,aAAa,SAAS;AAAA,EAC5C;AAEA,MAAI;AACH,SAAKC,UAAS,eAAe,qBAAqB,GAAG;AACpD,oCAA8BD,OAAM;AAAA,IACrC,OAAO;AACN,8BAAwBA,OAAM;AAAA,IAC/B;AAEA,4BAAwBA,OAAM;AAC9B,QAAIG,YAAW,gBAAgBH,OAAM;AACrC,IAAAA,QAAO,WAAW,OAAOG,cAAa,aAAaA,YAAW;AAC9D,IAAAH,QAAO,KAAK;AAIZ,QAAI,gBAAO,sBAAsBA,QAAO,IAAI,WAAW,KAAKA,QAAO,SAAS,MAAM;AACjF,eAAS,OAAOA,QAAO,MAAM;AAC5B,YAAI,IAAI,mBAAmB;AAC1B,cAAI,KAAK,wBAAwB;AACjC,cAAI,oBAAoB;AAAA,QACzB;AAAA,MACD;AAAA,IACD;AAAA,EACD,UAAE;AACD,yBAAqB;AACrB,oBAAgB;AAEhB,QAAI,cAAK;AACR,yCAAmC,qBAAqB;AACxD,oBAAc,cAAc;AAAA,IAC7B;AAAA,EACD;AACD;AAMA,eAAsB,OAAO;AAC5B,MAAI,iBAAiB;AACpB,WAAO,IAAI,QAAQ,CAAC,MAAM;AAIzB,4BAAsB,MAAM,EAAE,CAAC;AAC/B,iBAAW,MAAM,EAAE,CAAC;AAAA,IACrB,CAAC;AAAA,EACF;AAEA,QAAM,QAAQ,QAAQ;AAItB,YAAU;AACX;AAQO,SAAS,UAAU;AACzB,SAAO,MAAM,OAAO,EAAE,QAAQ;AAC/B;AAOO,SAAS,IAAI,QAAQ;AAC3B,MAAIC,SAAQ,OAAO;AACnB,MAAI,cAAcA,SAAQ,aAAa;AAEvC,oBAAkB,IAAI,MAAM;AAG5B,MAAI,oBAAoB,QAAQ,CAAC,YAAY;AAI5C,QAAI,YAAY,kBAAkB,SAAS,cAAc,IAAI,eAAe;AAE5E,QAAI,CAAC,aAAa,CAAC,iBAAiB,SAAS,MAAM,GAAG;AACrD,UAAI,OAAO,gBAAgB;AAE3B,WAAK,gBAAgB,IAAI,0BAA0B,GAAG;AAErD,YAAI,OAAO,KAAK,cAAc;AAC7B,iBAAO,KAAK;AAKZ,cAAI,aAAa,QAAQ,SAAS,QAAQ,KAAK,YAAY,MAAM,QAAQ;AACxE;AAAA,UACD,WAAW,aAAa,MAAM;AAC7B,uBAAW,CAAC,MAAM;AAAA,UACnB,WAAW,CAAC,SAAS,SAAS,MAAM,GAAG;AACtC,qBAAS,KAAK,MAAM;AAAA,UACrB;AAAA,QACD;AAAA,MACD,OAAO;AAGN,SAAC,gBAAgB,SAAS,CAAC,GAAG,KAAK,MAAM;AAEzC,YAAI,YAAY,OAAO;AAEvB,YAAI,cAAc,MAAM;AACvB,iBAAO,YAAY,CAAC,eAAe;AAAA,QACpC,WAAW,CAAC,UAAU,SAAS,eAAe,GAAG;AAChD,oBAAU,KAAK,eAAe;AAAA,QAC/B;AAAA,MACD;AAAA,IACD;AAAA,EACD;AAEA,MAAI,cAAK;AAeR,0BAAsB,OAAO,MAAM;AAEnC,QACC,qBACA,CAAC,cACD,wBAAwB,QACxB,oBAAoB,QACpB,oBAAoB,aAAa,iBAChC;AAED,UAAI,OAAO,OAAO;AACjB,eAAO,MAAM;AAAA,MACd,OAAO;AACN,YAAIG,SAAQ,UAAU,WAAW;AAEjC,YAAIA,QAAO;AACV,cAAI,QAAQ,oBAAoB,QAAQ,IAAI,MAAM;AAElD,cAAI,UAAU,QAAW;AACxB,oBAAQ,EAAE,QAAQ,CAAC,EAAE;AACrB,gCAAoB,QAAQ,IAAI,QAAQ,KAAK;AAAA,UAC9C;AAEA,
cAAI,OAAO,MAAM,OAAO,MAAM,OAAO,SAAS,CAAC;AAI/C,cAAIA,OAAM,UAAU,MAAM,OAAO;AAChC,kBAAM,OAAO,KAAKA,MAAK;AAAA,UACxB;AAAA,QACD;AAAA,MACD;AAAA,IACD;AAAA,EACD;AAEA,MAAI,sBAAsB;AACzB,QAAI,WAAW,IAAI,MAAM,GAAG;AAC3B,aAAO,WAAW,IAAI,MAAM;AAAA,IAC7B;AAEA,QAAI,YAAY;AACf,UAAIC;AAAA;AAAA,QAAkC;AAAA;AAEtC,UAAI,QAAQA,SAAQ;AAIpB,WACGA,SAAQ,IAAI,WAAW,KAAKA,SAAQ,cAAc,QACpD,sBAAsBA,QAAO,GAC5B;AACD,gBAAQ,gBAAgBA,QAAO;AAAA,MAChC;AAEA,iBAAW,IAAIA,UAAS,KAAK;AAE7B,aAAO;AAAA,IACR;AAAA,EACD,WACC,eACC,CAAC,cAAc,IAAI,MAAM,KAAM,eAAe,WAAW,CAAC,gBAAgB,IAC1E;AACD,IAAAA;AAAA,IAAkC;AAElC,QAAI,SAASA,QAAO,GAAG;AACtB,qBAAeA,QAAO;AAAA,IACvB;AAEA,QAAI,sBAAsB,gBAAgB,MAAMA,SAAQ,IAAI,eAAe,GAAG;AAC7E,gBAAUA,QAAO;AAAA,IAClB;AAAA,EACD;AAEA,MAAI,cAAc,IAAI,MAAM,GAAG;AAC9B,WAAO,aAAa,IAAI,MAAM;AAAA,EAC/B;AAEA,OAAK,OAAO,IAAI,iBAAiB,GAAG;AACnC,UAAM,OAAO;AAAA,EACd;AAEA,SAAO,OAAO;AACf;AAOA,SAAS,UAAUA,UAAS;AAC3B,MAAIA,SAAQ,SAAS,KAAM;AAE3B,EAAAA,SAAQ,KAAK;AAEb,aAAW,OAAOA,SAAQ,MAAM;AAC/B,KAAC,IAAI,cAAc,CAAC,GAAG,KAAKA,QAAO;AAEnC,SAAK,IAAI,IAAI,aAAa,MAAM,IAAI,IAAI,eAAe,GAAG;AACzD;AAAA;AAAA,QAAkC;AAAA,MAAI;AAAA,IACvC;AAAA,EACD;AACD;AAGA,SAAS,sBAAsBA,UAAS;AACvC,MAAIA,SAAQ,MAAM,cAAe,QAAO;AACxC,MAAIA,SAAQ,SAAS,KAAM,QAAO;AAElC,aAAW,OAAOA,SAAQ,MAAM;AAC/B,QAAI,WAAW,IAAI,GAAG,GAAG;AACxB,aAAO;AAAA,IACR;AAEA,SAAK,IAAI,IAAI,aAAa,KAAK;AAAA;AAAA,MAA8C;AAAA,IAAI,GAAG;AACnF,aAAO;AAAA,IACR;AAAA,EACD;AAEA,SAAO;AACR;AA4BO,SAAS,QAAQ,IAAI;AAC3B,MAAI,sBAAsB;AAC1B,MAAI;AACH,iBAAa;AACb,WAAO,GAAG;AAAA,EACX,UAAE;AACD,iBAAa;AAAA,EACd;AACD;AAEA,IAAM,cAAc,EAAE,QAAQ,cAAc;AAOrC,SAAS,kBAAkB,QAAQ,QAAQ;AACjD,SAAO,IAAK,OAAO,IAAI,cAAe;AACvC;;;ACvkBA,IAAM,yBAAyB;AAAA,EAC9B;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACD;AAwCA,IAAM,iBAAiB;AAAA,EACtB,GAAG;AAAA,EACH;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AACD;AA6BA,IAAM,iBAAiB,CAAC,cAAc,WAAW;AAM1C,SAAS,iBAAiB,MAAM;AACtC,SAAO,eAAe,SAAS,IAAI;AACpC;AAiKA,IAAM;AAAA;AAAA,EAA6C;AAAA,IAClD;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,EACD;AAAA;AAEA,IAAM;AAAA;AAAA,EAA8B;AAAA,IACnC,GAAG;AAAA,IACH;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,IACA;AAAA,EACD;AAAA;;;ACrbO,IAAM,wBAAwB,oBAAI,IAAI;AAGtC,IAAM,qBAAqB,oBAAI,IAAI;AAkI1C,IAAI,wBAAwB;AAOrB,SAAS,yBAAyBC,QAAO;AAC/C,MAAI,kBAAkB;AACtB,MAAI;AAAA;AAAA,IAAsC,gBAAiB;AAAA;AAC3D,MAAI,aAAaA,OAAM;AACvB,MAAI,OAAOA,OAAM,eAAe,KAAK,CAAC;AACtC,MAAI;AAAA;AAAA,IAAgD,KAAK,CAAC,KAAKA,OAAM;AAAA;AAErE,0BAAwBA;AAMxB,MAAI,WAAW;AAMf,MAAI,aAAa,0BAA0BA,UAASA,OAAM;AAE1D,MAAI,YAAY;AACf,QAAI,SAAS,KAAK,QAAQ,UAAU;AACpC,QACC,WAAW,OACV,oBAAoB,YAAY;AAAA,IAAwC,SACxE;AAKD,MAAAA,OAAM,SAAS;AACf;AAAA,IACD;AAOA,QAAI,cAAc,KAAK,QAAQ,eAAe;AAC9C,QAAI,gBAAgB,IAAI;AAGvB;AAAA,IACD;AAEA,QAAI,UAAU,aAAa;AAC1B,iBAAW;AAAA,IACZ;AAAA,EACD;AAEA;AAAA,EAAyC,KAAK,QAAQ,KAAKA,OAAM;AAIjE,MAAI,mBAAmB,gBAAiB;AAGxC,kBAAgBA,QAAO,iBAAiB;AAAA,IACvC,cAAc;AAAA,IACd,MAAM;AACL,aAAO,kBAAkB;AAAA,IAC1B;AAAA,EACD,CAAC;AAOD,MAAI,oBAAoB;AACxB,MAAI,kBAAkB;AACtB,sBAAoB,IAAI;AACxB,oBAAkB,IAAI;AAEtB,MAAI;AAIH,QAAI;AAIJ,QAAI,eAAe,CAAC;AAEpB,WAAO,mBAAmB,MAAM;AAE/B,UAAI,iBACH,eAAe,gBACf,eAAe;AAAA,MACK,eAAgB,QACpC;AAED,UAAI;AAEH,YAAI,YAAY,eAAe,OAAO,UAAU;AAEhD,YACC,aAAa,SACZ;AAAA,QAAsB,eAAgB;AAAA;AAAA,QAGtCA,OAAM,WAAW,iBACjB;AACD,oBAAU,KAAK,gBAAgBA,MAAK;AAAA,QACrC;AAAA,MACD,SAAS,OAAO;AACf,YAAI,aAAa;
AAChB,uBAAa,KAAK,KAAK;AAAA,QACxB,OAAO;AACN,wBAAc;AAAA,QACf;AAAA,MACD;AACA,UAAIA,OAAM,gBAAgB,mBAAmB,mBAAmB,mBAAmB,MAAM;AACxF;AAAA,MACD;AACA,uBAAiB;AAAA,IAClB;AAEA,QAAI,aAAa;AAChB,eAAS,SAAS,cAAc;AAE/B,uBAAe,MAAM;AACpB,gBAAM;AAAA,QACP,CAAC;AAAA,MACF;AACA,YAAM;AAAA,IACP;AAAA,EACD,UAAE;AAED,IAAAA,OAAM,SAAS;AAEf,WAAOA,OAAM;AACb,wBAAoB,iBAAiB;AACrC,sBAAkB,eAAe;AAAA,EAClC;AACD;;;ACnSO,SAAS,0BAA0BC,OAAM;AAC/C,MAAI,OAAO,SAAS,cAAc,UAAU;AAC5C,OAAK,YAAYA,MAAK,WAAW,OAAO,SAAS;AACjD,SAAO,KAAK;AACb;;;ACuBO,SAAS,aAAa,OAAO,KAAK;AACxC,MAAIC;AAAA;AAAA,IAAgC;AAAA;AACpC,MAAIA,QAAO,UAAU,MAAM;AAC1B,IAAAA,QAAO,QAAQ,EAAE,OAAO,KAAK,GAAG,MAAM,GAAG,KAAK;AAAA,EAC/C;AACD;AAuTO,SAAS,OAAO,QAAQ,KAAK;AACnC,MAAI,WAAW;AACd,QAAIC;AAAA;AAAA,MAAyD;AAAA;AAK7D,SAAKA,QAAO,IAAI,gBAAgB,KAAKA,QAAO,MAAM,QAAQ,MAAM;AAC/D,MAAAA,QAAO,MAAM,MAAM;AAAA,IACpB;AAEA,iBAAa;AACb;AAAA,EACD;AAEA,MAAI,WAAW,MAAM;AAEpB;AAAA,EACD;AAEA,SAAO;AAAA;AAAA,IAA4B;AAAA,EAAI;AACxC;;;AC5UO,IAAI,eAAe;AAiCnB,SAAS,MAAMC,YAAW,SAAS;AACzC,SAAO,OAAOA,YAAW,OAAO;AACjC;AAyBO,SAAS,QAAQA,YAAW,SAAS;AAC3C,kBAAgB;AAChB,UAAQ,QAAQ,QAAQ,SAAS;AACjC,QAAM,SAAS,QAAQ;AACvB,QAAM,gBAAgB;AACtB,QAAM,wBAAwB;AAE9B,MAAI;AACH,QAAI,SAAS,gBAAgB,MAAM;AAEnC,WACC,WACC,OAAO,aAAa;AAAA,IAAwC,OAAQ,SAAS,kBAC7E;AACD,eAAS,iBAAiB,MAAM;AAAA,IACjC;AAEA,QAAI,CAAC,QAAQ;AACZ,YAAM;AAAA,IACP;AAEA,kBAAc,IAAI;AAClB;AAAA;AAAA,MAAyC;AAAA,IAAO;AAEhD,UAAM,WAAW,OAAOA,YAAW,EAAE,GAAG,SAAS,OAAO,CAAC;AAEzD,kBAAc,KAAK;AAEnB;AAAA;AAAA,MAAgC;AAAA;AAAA,EACjC,SAAS,OAAO;AAEf,QACC,iBAAiB,SACjB,MAAM,QAAQ,MAAM,IAAI,EAAE,KAAK,CAAC,SAAS,KAAK,WAAW,uBAAuB,CAAC,GAChF;AACD,YAAM;AAAA,IACP;AACA,QAAI,UAAU,iBAAiB;AAE9B,cAAQ,KAAK,uBAAuB,KAAK;AAAA,IAC1C;AAEA,QAAI,QAAQ,YAAY,OAAO;AAC9B,MAAE,iBAAiB;AAAA,IACpB;AAGA,oBAAgB;AAChB,uBAAmB,MAAM;AAEzB,kBAAc,KAAK;AACnB,WAAO,MAAMA,YAAW,OAAO;AAAA,EAChC,UAAE;AACD,kBAAc,aAAa;AAC3B,qBAAiB,qBAAqB;AAAA,EACvC;AACD;AAGA,IAAM,qBAAqB,oBAAI,IAAI;AAQnC,SAAS,OAAO,WAAW,EAAE,QAAQ,QAAQ,QAAQ,CAAC,GAAG,QAAQ,SAAS,QAAQ,KAAK,GAAG;AACzF,kBAAgB;AAGhB,MAAI,oBAAoB,oBAAI,IAAI;AAGhC,MAAI,eAAe,CAACC,YAAW;AAC9B,aAAS,IAAI,GAAG,IAAIA,QAAO,QAAQ,KAAK;AACvC,UAAI,aAAaA,QAAO,CAAC;AAEzB,UAAI,kBAAkB,IAAI,UAAU,EAAG;AACvC,wBAAkB,IAAI,UAAU;AAEhC,UAAIC,WAAU,iBAAiB,UAAU;AAKzC,aAAO,iBAAiB,YAAY,0BAA0B,EAAE,SAAAA,SAAQ,CAAC;AAEzE,UAAI,IAAI,mBAAmB,IAAI,UAAU;AAEzC,UAAI,MAAM,QAAW;AAGpB,iBAAS,iBAAiB,YAAY,0BAA0B,EAAE,SAAAA,SAAQ,CAAC;AAC3E,2BAAmB,IAAI,YAAY,CAAC;AAAA,MACrC,OAAO;AACN,2BAAmB,IAAI,YAAY,IAAI,CAAC;AAAA,MACzC;AAAA,IACD;AAAA,EACD;AAEA,eAAa,WAAW,qBAAqB,CAAC;AAC9C,qBAAmB,IAAI,YAAY;AAInC,MAAIF,aAAY;AAEhB,MAAIG,WAAU,eAAe,MAAM;AAClC,QAAI,cAAc,UAAU,OAAO,YAAY,YAAY,CAAC;AAE5D;AAAA;AAAA,MAC8B;AAAA,MAC7B;AAAA,QACC,SAAS,MAAM;AAAA,QAAC;AAAA,MACjB;AAAA,MACA,CAACC,iBAAgB;AAChB,YAAI,SAAS;AACZ,eAAK,CAAC,CAAC;AACP,cAAI;AAAA;AAAA,YAAuC;AAAA;AAC3C,cAAI,IAAI;AAAA,QACT;AAEA,YAAI,QAAQ;AAEQ,UAAC,MAAO,WAAW;AAAA,QACvC;AAEA,YAAI,WAAW;AACd;AAAA;AAAA,YAA0CA;AAAA,YAAc;AAAA,UAAI;AAAA,QAC7D;AAEA,uBAAe;AAEf,QAAAJ,aAAY,UAAUI,cAAa,KAAK,KAAK,CAAC;AAC9C,uBAAe;AAEf,YAAI,WAAW;AACiC,UAAC,cAAe,MAAM,MAAM;AAE3E,cACC,iBAAiB,QACjB,aAAa,aAAa;AAAA,UACF,aAAc,SAAS,eAC9C;AACD,YAAE,mBAAmB;AACrB,kBAAM;AAAA,UACP;AAAA,QACD;AAEA,YAAI,SAAS;AACZ,cAAI;AAAA,QACL;AAAA,MACD;AAAA,IACD;AAEA,WAAO,MAAM;AACZ,eAAS,cAAc,mBAAmB;AACzC,eAAO,oBAAoB,YAAY,wBAAwB;AAE/D,YAAI;AAAA;AAAA,UAA2B,mBAAmB,IAAI,UAAU;AAAA;AAEhE,YAAI,EAAE,MAAM,GAAG;AACd,mBAAS,oBAAoB,YAAY,wBAAwB;AACjE,6BAAmB,OAAO,UAAU;AAAA,QACrC,OAAO;AACN,6BAAmB,IAAI,YAAY,CAAC;AAAA,QACrC;AAAA,MACD;AAEA,yBAAmB,OAAO,YAAY;AAEtC,UAAI,gBAAgB,QAAQ;AAC3B,oBAAY,YAAY,YAAY,WAAW;AAAA,MAChD;AAAA,IACD;AAAA,EACD,CAAC;AAED,qBAAmB,IAAIJ,YAAWG,QAAO;AACzC,SAAOH;AACR;AAMA,IAAI,qBAAqB,oBAAI,
QAAQ;AAsB9B,SAAS,QAAQA,YAAW,SAAS;AAC3C,QAAM,KAAK,mBAAmB,IAAIA,UAAS;AAE3C,MAAI,IAAI;AACP,uBAAmB,OAAOA,UAAS;AACnC,WAAO,GAAG,OAAO;AAAA,EAClB;AAEA,MAAI,cAAK;AACR,QAAI,gBAAgBA,YAAW;AAC9B,MAAE,oBAAoB;AAAA,IACvB,OAAO;AACN,MAAE,yBAAyB;AAAA,IAC5B;AAAA,EACD;AAEA,SAAO,QAAQ,QAAQ;AACxB;;;ACtPO,SAAS,iBAAiB,IAAI;AAEpC,SAAO,CAA6B,WAA0C,WAAW;AACxF,QAAIK,WAAU,GAAG,GAAG,MAAM;AAG1B,QAAIC;AAEJ,QAAI,WAAW;AACd,MAAAA;AAAA,MAAkC;AAClC,mBAAa;AAAA,IACd,OAAO;AACN,UAAIC,QAAOF,SAAQ,OAAO,EAAE,KAAK;AACjC,UAAI,WAAW,0BAA0BE,KAAI;AAC7C,MAAAD;AAAA,MAAkC,gBAAgB,QAAQ;AAE1D,UAAI,iBAAQ,iBAAiBA,QAAO,MAAM,QAAQA,SAAQ,aAAa,eAAe;AACrF,QAAE,2BAA2B;AAAA,MAC9B;AAEA,aAAO,OAAOA,QAAO;AAAA,IACtB;AAEA,UAAM,SAASD,SAAQ,QAAQC,QAAO;AACtC,iBAAaA,UAASA,QAAO;AAE7B,QAAI,OAAO,WAAW,YAAY;AACjC,eAAS,MAAM;AAAA,IAChB;AAAA,EACD;AACD;;;ACvDA,IAAM,aAAa,CAAC,GAAG,mBAA6B;;;ACvCpD,IAAM,0BAAN,MAAM,yBAAwB;AAAA;AAAA,EAE7B,aAAa,oBAAI,QAAQ;AAAA;AAAA,EAGzB;AAAA;AAAA,EAGA;AAAA;AAAA,EAGA,OAAO,UAAU,oBAAI,QAAQ;AAAA;AAAA,EAG7B,YAAY,SAAS;AACpB,SAAK,WAAW;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA,EAMA,QAAQE,UAAS,UAAU;AAC1B,QAAI,YAAY,KAAK,WAAW,IAAIA,QAAO,KAAK,oBAAI,IAAI;AACxD,cAAU,IAAI,QAAQ;AAEtB,SAAK,WAAW,IAAIA,UAAS,SAAS;AACtC,SAAK,aAAa,EAAE,QAAQA,UAAS,KAAK,QAAQ;AAElD,WAAO,MAAM;AACZ,UAAIC,aAAY,KAAK,WAAW,IAAID,QAAO;AAC3C,MAAAC,WAAU,OAAO,QAAQ;AAEzB,UAAIA,WAAU,SAAS,GAAG;AACzB,aAAK,WAAW,OAAOD,QAAO;AACA,QAAC,KAAK,UAAW,UAAUA,QAAO;AAAA,MACjE;AAAA,IACD;AAAA,EACD;AAAA,EAEA,eAAe;AACd,WACC,KAAK,cACJ,KAAK,YAAY,IAAI;AAAA;AAAA,MACO,CAAC,YAAY;AACxC,iBAAS,SAAS,SAAS;AAC1B,mCAAwB,QAAQ,IAAI,MAAM,QAAQ,KAAK;AACvD,mBAAS,YAAY,KAAK,WAAW,IAAI,MAAM,MAAM,KAAK,CAAC,GAAG;AAC7D,qBAAS,KAAK;AAAA,UACf;AAAA,QACD;AAAA,MACD;AAAA,IACD;AAAA,EAEF;AACD;AAEA,IAAI,8BAA8C,IAAI,wBAAwB;AAAA,EAC7E,KAAK;AACN,CAAC;AAED,IAAI,6BAA6C,IAAI,wBAAwB;AAAA,EAC5E,KAAK;AACN,CAAC;AAED,IAAI,2CAA2D,IAAI,wBAAwB;AAAA,EAC1F,KAAK;AACN,CAAC;;;AChEM,SAAS,mBAAmB,OAAOE,MAAK,YAAY;AAC1D,MAAI,SAAS,MAAM;AAElB,IAAAA,KAAI,MAAS;AAGb,QAAI,WAAY,YAAW,MAAS;AAEpC,WAAO;AAAA,EACR;AAIA,QAAM,QAAQ;AAAA,IAAQ,MACrB,MAAM;AAAA,MACLA;AAAA;AAAA,MAEA;AAAA,IACD;AAAA,EACD;AAIA,SAAO,MAAM,cAAc,MAAM,MAAM,YAAY,IAAI;AACxD;;;AC1BA,IAAM,mBAAmB,CAAC;AAUnB,SAAS,SAAS,OAAO,OAAO;AACtC,SAAO;AAAA,IACN,WAAW,SAAS,OAAO,KAAK,EAAE;AAAA,EACnC;AACD;AAUO,SAAS,SAAS,OAAO,QAAQ,MAAM;AAE7C,MAAI,OAAO;AAGX,QAAM,cAAc,oBAAI,IAAI;AAM5B,WAASC,KAAI,WAAW;AACvB,QAAI,eAAe,OAAO,SAAS,GAAG;AACrC,cAAQ;AACR,UAAI,MAAM;AAET,cAAM,YAAY,CAAC,iBAAiB;AACpC,mBAAW,cAAc,aAAa;AACrC,qBAAW,CAAC,EAAE;AACd,2BAAiB,KAAK,YAAY,KAAK;AAAA,QACxC;AACA,YAAI,WAAW;AACd,mBAAS,IAAI,GAAG,IAAI,iBAAiB,QAAQ,KAAK,GAAG;AACpD,6BAAiB,CAAC,EAAE,CAAC,EAAE,iBAAiB,IAAI,CAAC,CAAC;AAAA,UAC/C;AACA,2BAAiB,SAAS;AAAA,QAC3B;AAAA,MACD;AAAA,IACD;AAAA,EACD;AAMA,WAASC,QAAO,IAAI;AACnB,IAAAD,KAAI;AAAA;AAAA,MAAqB;AAAA,IAAM,CAAC;AAAA,EACjC;AAOA,WAAS,UAAUE,MAAK,aAAa,MAAM;AAE1C,UAAM,aAAa,CAACA,MAAK,UAAU;AACnC,gBAAY,IAAI,UAAU;AAC1B,QAAI,YAAY,SAAS,GAAG;AAC3B,aAAO,MAAMF,MAAKC,OAAM,KAAK;AAAA,IAC9B;AACA,IAAAC;AAAA;AAAA,MAAsB;AAAA,IAAM;AAC5B,WAAO,MAAM;AACZ,kBAAY,OAAO,UAAU;AAC7B,UAAI,YAAY,SAAS,KAAK,MAAM;AACnC,aAAK;AACL,eAAO;AAAA,MACR;AAAA,IACD;AAAA,EACD;AACA,SAAO,EAAE,KAAAF,MAAK,QAAAC,SAAQ,UAAU;AACjC;AAkCO,SAASE,SAAQ,QAAQ,IAAI,eAAe;AAClD,QAAM,SAAS,CAAC,MAAM,QAAQ,MAAM;AAEpC,QAAM,eAAe,SAAS,CAAC,MAAM,IAAI;AACzC,MAAI,CAAC,aAAa,MAAM,OAAO,GAAG;AACjC,UAAM,IAAI,MAAM,sDAAsD;AAAA,EACvE;AACA,QAAM,OAAO,GAAG,SAAS;AACzB,SAAO,SAAS,eAAe,CAACH,MAAKC,YAAW;AAC/C,QAAI,UAAU;AAEd,UAAM,SAAS,CAAC;AAChB,QAAIG,WAAU;AACd,QAAI,UAAU;AACd,UAAM,OAAO,MAAM;AAClB,UAAIA,UAAS;AACZ;AAAA,MACD;AACA,cAAQ;AACR,YAAM,SAAS,GAAG,SAAS,OAAO,CAAC,IAAI,QAAQJ,MAAKC,OAAM;AAC1D,UAAI,MAAM;AACT,QAAAD,KAAI,MAAM;AAAA,MACX,OAAO;AACN,kB
AAU,OAAO,WAAW,aAAa,SAAS;AAAA,MACnD;AAAA,IACD;AACA,UAAM,gBAAgB,aAAa;AAAA,MAAI,CAAC,OAAO,MAC9C;AAAA,QACC;AAAA,QACA,CAAC,UAAU;AACV,iBAAO,CAAC,IAAI;AACZ,UAAAI,YAAW,EAAE,KAAK;AAClB,cAAI,SAAS;AACZ,iBAAK;AAAA,UACN;AAAA,QACD;AAAA,QACA,MAAM;AACL,UAAAA,YAAW,KAAK;AAAA,QACjB;AAAA,MACD;AAAA,IACD;AACA,cAAU;AACV,SAAK;AACL,WAAO,SAAS,OAAO;AACtB,cAAQ,aAAa;AACrB,cAAQ;AAIR,gBAAU;AAAA,IACX;AAAA,EACD,CAAC;AACF;AASO,SAAS,SAAS,OAAO;AAC/B,SAAO;AAAA;AAAA,IAEN,WAAW,MAAM,UAAU,KAAK,KAAK;AAAA,EACtC;AACD;AASO,SAASC,KAAI,OAAO;AAC1B,MAAI;AACJ,qBAAmB,OAAO,CAAC,MAAO,QAAQ,CAAE,EAAE;AAE9C,SAAO;AACR;;;AClLO,SAAS,qBAAqB,SAAS;AAE7C,SAAO,IAAI,iBAAiB,OAAO;AACpC;AAiCA,IAAM,mBAAN,MAAuB;AAAA;AAAA,EAEtB;AAAA;AAAA,EAGA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOA,YAAY,SAAS;AACpB,QAAI,UAAU,oBAAI,IAAI;AAMtB,QAAI,aAAa,CAACC,MAAK,UAAU;AAChC,UAAI,IAAI,eAAe,OAAO,OAAO,KAAK;AAC1C,cAAQ,IAAIA,MAAK,CAAC;AAClB,aAAO;AAAA,IACR;AAKA,UAAM,QAAQ,IAAI;AAAA,MACjB,EAAE,GAAI,QAAQ,SAAS,CAAC,GAAI,UAAU,CAAC,EAAE;AAAA,MACzC;AAAA,QACC,IAAI,QAAQC,OAAM;AACjB,iBAAO,IAAI,QAAQ,IAAIA,KAAI,KAAK,WAAWA,OAAM,QAAQ,IAAI,QAAQA,KAAI,CAAC,CAAC;AAAA,QAC5E;AAAA,QACA,IAAI,QAAQA,OAAM;AAEjB,cAAIA,UAAS,aAAc,QAAO;AAElC,cAAI,QAAQ,IAAIA,KAAI,KAAK,WAAWA,OAAM,QAAQ,IAAI,QAAQA,KAAI,CAAC,CAAC;AACpE,iBAAO,QAAQ,IAAI,QAAQA,KAAI;AAAA,QAChC;AAAA,QACA,IAAI,QAAQA,OAAM,OAAO;AACxB,cAAI,QAAQ,IAAIA,KAAI,KAAK,WAAWA,OAAM,KAAK,GAAG,KAAK;AACvD,iBAAO,QAAQ,IAAI,QAAQA,OAAM,KAAK;AAAA,QACvC;AAAA,MACD;AAAA,IACD;AAEA,SAAK,aAAa,QAAQ,UAAU,UAAU,OAAO,QAAQ,WAAW;AAAA,MACvE,QAAQ,QAAQ;AAAA,MAChB,QAAQ,QAAQ;AAAA,MAChB;AAAA,MACA,SAAS,QAAQ;AAAA,MACjB,OAAO,QAAQ,SAAS;AAAA,MACxB,SAAS,QAAQ;AAAA,IAClB,CAAC;AAID,QAAI,CAAC,oBAAoB,CAAC,SAAS,OAAO,UAAU,QAAQ,SAAS,QAAQ;AAC5E,gBAAU;AAAA,IACX;AAEA,SAAK,UAAU,MAAM;AAErB,eAAWD,QAAO,OAAO,KAAK,KAAK,SAAS,GAAG;AAC9C,UAAIA,SAAQ,UAAUA,SAAQ,cAAcA,SAAQ,MAAO;AAC3D,sBAAgB,MAAMA,MAAK;AAAA,QAC1B,MAAM;AACL,iBAAO,KAAK,UAAUA,IAAG;AAAA,QAC1B;AAAA;AAAA,QAEA,IAAI,OAAO;AACV,eAAK,UAAUA,IAAG,IAAI;AAAA,QACvB;AAAA,QACA,YAAY;AAAA,MACb,CAAC;AAAA,IACF;AAEA,SAAK,UAAU;AAAA,IAAgD,CAACE,UAAS;AACxE,aAAO,OAAO,OAAOA,KAAI;AAAA,IAC1B;AAEA,SAAK,UAAU,WAAW,MAAM;AAC/B,cAAQ,KAAK,SAAS;AAAA,IACvB;AAAA,EACD;AAAA;AAAA,EAGA,KAAK,OAAO;AACX,SAAK,UAAU,KAAK,KAAK;AAAA,EAC1B;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAOA,IAAIC,QAAO,UAAU;AACpB,SAAK,QAAQA,MAAK,IAAI,KAAK,QAAQA,MAAK,KAAK,CAAC;AAG9C,UAAM,KAAK,IAAI,SAAS,SAAS,KAAK,MAAM,GAAG,IAAI;AACnD,SAAK,QAAQA,MAAK,EAAE,KAAK,EAAE;AAC3B,WAAO,MAAM;AACZ,WAAK,QAAQA,MAAK,IAAI,KAAK,QAAQA,MAAK,EAAE;AAAA;AAAA,QAA8B,CAAC,OAAO,OAAO;AAAA,MAAE;AAAA,IAC1F;AAAA,EACD;AAAA,EAEA,WAAW;AACV,SAAK,UAAU,SAAS;AAAA,EACzB;AACD;;;ACrKA,IAAI;AAEJ,IAAI,OAAO,gBAAgB,YAAY;AACtC,kBAAgB,cAAc,YAAY;AAAA;AAAA,IAEzC;AAAA;AAAA,IAEA;AAAA;AAAA,IAEA;AAAA;AAAA,IAEA,OAAO;AAAA;AAAA,IAEP,MAAM,CAAC;AAAA;AAAA,IAEP,MAAM;AAAA;AAAA,IAEN,QAAQ,CAAC;AAAA;AAAA,IAET,MAAM,CAAC;AAAA;AAAA,IAEP,QAAQ,oBAAI,IAAI;AAAA;AAAA,IAEhB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,IAOA,YAAY,iBAAiB,SAAS,gBAAgB;AACrD,YAAM;AACN,WAAK,SAAS;AACd,WAAK,MAAM;AACX,UAAI,gBAAgB;AACnB,aAAK,aAAa,EAAE,MAAM,OAAO,CAAC;AAAA,MACnC;AAAA,IACD;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,IAOA,iBAAiB,MAAM,UAAU,SAAS;AAIzC,WAAK,IAAI,IAAI,IAAI,KAAK,IAAI,IAAI,KAAK,CAAC;AACpC,WAAK,IAAI,IAAI,EAAE,KAAK,QAAQ;AAC5B,UAAI,KAAK,KAAK;AACb,cAAM,QAAQ,KAAK,IAAI,IAAI,MAAM,QAAQ;AACzC,aAAK,MAAM,IAAI,UAAU,KAAK;AAAA,MAC/B;AACA,YAAM,iBAAiB,MAAM,UAAU,OAAO;AAAA,IAC/C;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,IAOA,oBAAoB,MAAM,UAAU,SAAS;AAC5C,YAAM,oBAAoB,MAAM,UAAU,OAAO;AACjD,UAAI,KAAK,KAAK;AACb,cAAM,QAAQ,KAAK,MAAM,IAAI,QAAQ;AACrC,YAAI,OAAO;AACV,gBAAM;AACN,eAAK,MAAM,OAAO,QAAQ;AAAA,QAC3B;AAAA,MACD;AAAA,IACD;AAAA,IAEA,MAAM,oBAAoB;AACzB,WAAK,OAAO;AACZ,UAAI,CAAC,KAAK,KAAK;AAOd,YAA
S,cAAT,SAAqB,MAAM;AAI1B,iBAAO,CAAC,WAAW;AAClB,kBAAMC,QAAO,SAAS,cAAc,MAAM;AAC1C,gBAAI,SAAS,UAAW,CAAAA,MAAK,OAAO;AAEpC,mBAAO,QAAQA,KAAI;AAAA,UACpB;AAAA,QACD;AAfA,cAAM,QAAQ,QAAQ;AACtB,YAAI,CAAC,KAAK,QAAQ,KAAK,KAAK;AAC3B;AAAA,QACD;AAcA,cAAM,UAAU,CAAC;AACjB,cAAM,iBAAiB,0BAA0B,IAAI;AACrD,mBAAW,QAAQ,KAAK,KAAK;AAC5B,cAAI,QAAQ,gBAAgB;AAC3B,gBAAI,SAAS,aAAa,CAAC,KAAK,IAAI,UAAU;AAC7C,mBAAK,IAAI,WAAW,YAAY,IAAI;AACpC,sBAAQ,UAAU;AAAA,YACnB,OAAO;AACN,sBAAQ,IAAI,IAAI,YAAY,IAAI;AAAA,YACjC;AAAA,UACD;AAAA,QACD;AACA,mBAAW,aAAa,KAAK,YAAY;AAExC,gBAAM,OAAO,KAAK,MAAM,UAAU,IAAI;AACtC,cAAI,EAAE,QAAQ,KAAK,MAAM;AACxB,iBAAK,IAAI,IAAI,IAAI,yBAAyB,MAAM,UAAU,OAAO,KAAK,OAAO,QAAQ;AAAA,UACtF;AAAA,QACD;AAEA,mBAAWC,QAAO,KAAK,OAAO;AAE7B,cAAI,EAAEA,QAAO,KAAK,QAAQ,KAAKA,IAAG,MAAM,QAAW;AAElD,iBAAK,IAAIA,IAAG,IAAI,KAAKA,IAAG;AAExB,mBAAO,KAAKA,IAAG;AAAA,UAChB;AAAA,QACD;AACA,aAAK,MAAM,qBAAqB;AAAA,UAC/B,WAAW,KAAK;AAAA,UAChB,QAAQ,KAAK,cAAc;AAAA,UAC3B,OAAO;AAAA,YACN,GAAG,KAAK;AAAA,YACR;AAAA,YACA,QAAQ;AAAA,UACT;AAAA,QACD,CAAC;AAGD,aAAK,OAAO,YAAY,MAAM;AAC7B,wBAAc,MAAM;AACnB,iBAAK,MAAM;AACX,uBAAWA,QAAO,YAAY,KAAK,GAAG,GAAG;AACxC,kBAAI,CAAC,KAAK,MAAMA,IAAG,GAAG,QAAS;AAC/B,mBAAK,IAAIA,IAAG,IAAI,KAAK,IAAIA,IAAG;AAC5B,oBAAM,kBAAkB;AAAA,gBACvBA;AAAA,gBACA,KAAK,IAAIA,IAAG;AAAA,gBACZ,KAAK;AAAA,gBACL;AAAA,cACD;AACA,kBAAI,mBAAmB,MAAM;AAC5B,qBAAK,gBAAgB,KAAK,MAAMA,IAAG,EAAE,aAAaA,IAAG;AAAA,cACtD,OAAO;AACN,qBAAK,aAAa,KAAK,MAAMA,IAAG,EAAE,aAAaA,MAAK,eAAe;AAAA,cACpE;AAAA,YACD;AACA,iBAAK,MAAM;AAAA,UACZ,CAAC;AAAA,QACF,CAAC;AAED,mBAAW,QAAQ,KAAK,KAAK;AAC5B,qBAAW,YAAY,KAAK,IAAI,IAAI,GAAG;AACtC,kBAAM,QAAQ,KAAK,IAAI,IAAI,MAAM,QAAQ;AACzC,iBAAK,MAAM,IAAI,UAAU,KAAK;AAAA,UAC/B;AAAA,QACD;AACA,aAAK,MAAM,CAAC;AAAA,MACb;AAAA,IACD;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,IAUA,yBAAyBC,OAAM,WAAW,UAAU;AACnD,UAAI,KAAK,IAAK;AACd,MAAAA,QAAO,KAAK,MAAMA,KAAI;AACtB,WAAK,IAAIA,KAAI,IAAI,yBAAyBA,OAAM,UAAU,KAAK,OAAO,QAAQ;AAC9E,WAAK,KAAK,KAAK,EAAE,CAACA,KAAI,GAAG,KAAK,IAAIA,KAAI,EAAE,CAAC;AAAA,IAC1C;AAAA,IAEA,uBAAuB;AACtB,WAAK,OAAO;AAEZ,cAAQ,QAAQ,EAAE,KAAK,MAAM;AAC5B,YAAI,CAAC,KAAK,QAAQ,KAAK,KAAK;AAC3B,eAAK,IAAI,SAAS;AAClB,eAAK,KAAK;AACV,eAAK,MAAM;AAAA,QACZ;AAAA,MACD,CAAC;AAAA,IACF;AAAA;AAAA;AAAA;AAAA,IAKA,MAAM,gBAAgB;AACrB,aACC,YAAY,KAAK,KAAK,EAAE;AAAA,QACvB,CAACD,SACA,KAAK,MAAMA,IAAG,EAAE,cAAc,kBAC7B,CAAC,KAAK,MAAMA,IAAG,EAAE,aAAaA,KAAI,YAAY,MAAM;AAAA,MACvD,KAAK;AAAA,IAEP;AAAA,EACD;AACD;AAQA,SAAS,yBAAyBE,OAAM,OAAO,kBAAkB,WAAW;AAC3E,QAAM,OAAO,iBAAiBA,KAAI,GAAG;AACrC,UAAQ,SAAS,aAAa,OAAO,UAAU,YAAY,SAAS,OAAO;AAC3E,MAAI,CAAC,aAAa,CAAC,iBAAiBA,KAAI,GAAG;AAC1C,WAAO;AAAA,EACR,WAAW,cAAc,eAAe;AACvC,YAAQ,MAAM;AAAA,MACb,KAAK;AAAA,MACL,KAAK;AACJ,eAAO,SAAS,OAAO,OAAO,KAAK,UAAU,KAAK;AAAA,MACnD,KAAK;AACJ,eAAO,QAAQ,KAAK;AAAA,MACrB,KAAK;AACJ,eAAO,SAAS,OAAO,OAAO;AAAA,MAC/B;AACC,eAAO;AAAA,IACT;AAAA,EACD,OAAO;AACN,YAAQ,MAAM;AAAA,MACb,KAAK;AAAA,MACL,KAAK;AACJ,eAAO,SAAS,KAAK,MAAM,KAAK;AAAA,MACjC,KAAK;AACJ,eAAO;AAAA;AAAA,MACR,KAAK;AACJ,eAAO,SAAS,OAAO,CAAC,QAAQ;AAAA,MACjC;AACC,eAAO;AAAA,IACT;AAAA,EACD;AACD;AAKA,SAAS,0BAA0BC,UAAS;AAE3C,QAAM,SAAS,CAAC;AAChB,EAAAA,SAAQ,WAAW,QAAQ,CAAC,SAAS;AACpC;AAAA;AAAA,MAAoC,KAAM,QAAQ;AAAA,IAAS,IAAI;AAAA,EAChE,CAAC;AACD,SAAO;AACR;;;ACjQO,SAAS,WAAWC,MAAK,IAAI;AACnC,MAAI,CAAC,iBAAiB;AACrB,IAAE,4BAA4B,YAAY;AAAA,EAC3C;AAEA,MAAI,WAAW;AACd,UAAM,QAAQ,OAAO,UAAU;AAE/B,QAAI,OAAO,IAAIA,IAAG,GAAG;AACpB;AAAA;AAAA,QAAyB,MAAM,IAAIA,IAAG;AAAA;AAAA,IACvC;AAEA,QAAI,cAAK;AACR,MAAE,gCAAgCA,IAAG;AAAA,IACtC,OAAO;AACN,MAAE,gCAAgCA,IAAG;AAAA,IACtC;AAAA,EACD;AAEA,SAAO,GAAG;AACX;;;ACrBA,IAAI,cAAK;AAIR,MAAS,mBAAT,SAA0B,MAAM;AAC/B,QAAI,EAAE,QAAQ,aAAa;AAG1B,UAAI;AACJ,aAAO,eAAe,YAAY,MAAM;AAAA,QACvC,cAAc;
AAAA;AAAA,QAEd,KAAK,MAAM;AACV,cAAI,UAAU,QAAW;AACxB,mBAAO;AAAA,UACR;AAEA,UAAE,oBAAoB,IAAI;AAAA,QAC3B;AAAA,QACA,KAAK,CAAC,MAAM;AACX,kBAAQ;AAAA,QACT;AAAA,MACD,CAAC;AAAA,IACF;AAAA,EACD;AAEA,mBAAiB,QAAQ;AACzB,mBAAiB,SAAS;AAC1B,mBAAiB,UAAU;AAC3B,mBAAiB,UAAU;AAC3B,mBAAiB,QAAQ;AACzB,mBAAiB,WAAW;AAC7B;AAyBO,SAAS,iBAAiB;AAChC,MAAI,oBAAoB,MAAM;AAC7B,IAAE,kCAAkC;AAAA,EACrC;AAEA,UAAQ,gBAAgB,OAAO,IAAI,gBAAgB,GAAG;AACvD;AAgBO,SAAS,QAAQ,IAAI;AAC3B,MAAI,sBAAsB,MAAM;AAC/B,IAAE,4BAA4B,SAAS;AAAA,EACxC;AAEA,MAAI,oBAAoB,kBAAkB,MAAM,MAAM;AACrD,0BAAsB,iBAAiB,EAAE,EAAE,KAAK,EAAE;AAAA,EACnD,OAAO;AACN,gBAAY,MAAM;AACjB,YAAM,UAAU,QAAQ,EAAE;AAC1B,UAAI,OAAO,YAAY,WAAY;AAAA;AAAA,QAAkC;AAAA;AAAA,IACtE,CAAC;AAAA,EACF;AACD;AAWO,SAAS,UAAU,IAAI;AAC7B,MAAI,sBAAsB,MAAM;AAC/B,IAAE,4BAA4B,WAAW;AAAA,EAC1C;AAEA,UAAQ,MAAM,MAAM,QAAQ,EAAE,CAAC;AAChC;AASA,SAAS,oBAAoB,MAAM,QAAQ,EAAE,UAAU,OAAO,aAAa,MAAM,IAAI,CAAC,GAAG;AACxF,SAAO,IAAI,YAAY,MAAM,EAAE,QAAQ,SAAS,WAAW,CAAC;AAC7D;AAyBO,SAAS,wBAAwB;AACvC,QAAM,2BAA2B;AACjC,MAAI,6BAA6B,MAAM;AACtC,IAAE,4BAA4B,uBAAuB;AAAA,EACtD;AAMA,SAAO,CAAC,MAAM,QAAQ,YAAY;AACjC,UAAM;AAAA;AAAA,MACL,yBAAyB,EAAE;AAAA;AAAA,QACD;AAAA,MAAK;AAAA;AAEhC,QAAI,QAAQ;AACX,YAAM,YAAY,SAAS,MAAM,IAAI,OAAO,MAAM,IAAI,CAAC,MAAM;AAG7D,YAAMC,SAAQ;AAAA;AAAA,QAA2C;AAAA,QAAO;AAAA,QAAQ;AAAA,MAAO;AAC/E,iBAAW,MAAM,WAAW;AAC3B,WAAG,KAAK,yBAAyB,GAAGA,MAAK;AAAA,MAC1C;AACA,aAAO,CAACA,OAAM;AAAA,IACf;AAEA,WAAO;AAAA,EACR;AACD;AAeO,SAAS,aAAa,IAAI;AAChC,MAAI,sBAAsB,MAAM;AAC/B,IAAE,4BAA4B,cAAc;AAAA,EAC7C;AAEA,MAAI,kBAAkB,MAAM,MAAM;AACjC,IAAE,sBAAsB,cAAc;AAAA,EACvC;AAEA,wBAAsB,iBAAiB,EAAE,EAAE,KAAK,EAAE;AACnD;AAaO,SAAS,YAAY,IAAI;AAC/B,MAAI,sBAAsB,MAAM;AAC/B,IAAE,4BAA4B,aAAa;AAAA,EAC5C;AAEA,MAAI,kBAAkB,MAAM,MAAM;AACjC,IAAE,sBAAsB,aAAa;AAAA,EACtC;AAEA,wBAAsB,iBAAiB,EAAE,EAAE,KAAK,EAAE;AACnD;AAMA,SAAS,sBAAsB,SAAS;AACvC,MAAI;AAAA;AAAA,IAA2C,QAAS;AAAA;AACxD,SAAQ,EAAE,MAAM,EAAE,GAAG,CAAC,GAAG,GAAG,CAAC,GAAG,GAAG,CAAC,EAAE;AACvC;", + "names": ["key", "key", "next", "source", "stack", "stack", "key", "flags", "component", "component_context", "effect", "effect", "flags", "child", "source", "stack", "update", "e", "settled", "source", "value", "eager_effects", "comment", "pending", "reset", "error", "derived", "stack", "source", "effect", "source", "flags", "derived", "stack", "version", "prop", "source", "s", "value", "key", "prop", "array_prototype", "index", "effect", "e", "derived", "flags", "effect", "flags", "flags", "effect", "effect", "teardown", "next", "transition", "child", "sibling", "effect", "next", "effect", "flags", "index", "teardown", "trace", "derived", "event", "html", "effect", "effect", "component", "events", "passive", "unmount", "anchor_node", "snippet", "element", "html", "element", "listeners", "run", "set", "update", "run", "derived", "pending", "get", "key", "prop", "next", "event", "slot", "key", "attr", "prop", "element", "key", "event"] +} diff --git a/frontend/.vite/deps/package.json b/frontend/.vite/deps/package.json new file mode 100644 index 0000000..3dbc1ca --- /dev/null +++ b/frontend/.vite/deps/package.json @@ -0,0 +1,3 @@ +{ + "type": "module" +} diff --git a/frontend/.vite/deps/svelte.js b/frontend/.vite/deps/svelte.js new file mode 100644 index 0000000..f7977e9 --- /dev/null +++ b/frontend/.vite/deps/svelte.js @@ -0,0 +1,46 @@ +import { + afterUpdate, + beforeUpdate, + createContext, + createEventDispatcher, + createRawSnippet, + flushSync, + fork, + getAbortSignal, + getAllContexts, + getContext, + hasContext, + hydratable, + hydrate, + mount, + onDestroy, + onMount, + setContext, + settled, + tick, + unmount, + untrack 
+} from "./chunk-YAQNMG2X.js"; +export { + afterUpdate, + beforeUpdate, + createContext, + createEventDispatcher, + createRawSnippet, + flushSync, + fork, + getAbortSignal, + getAllContexts, + getContext, + hasContext, + hydratable, + hydrate, + mount, + onDestroy, + onMount, + setContext, + settled, + tick, + unmount, + untrack +}; diff --git a/frontend/.vite/deps/svelte.js.map b/frontend/.vite/deps/svelte.js.map new file mode 100644 index 0000000..9865211 --- /dev/null +++ b/frontend/.vite/deps/svelte.js.map @@ -0,0 +1,7 @@ +{ + "version": 3, + "sources": [], + "sourcesContent": [], + "mappings": "", + "names": [] +} diff --git a/frontend/.vite/deps/svelte_store.js b/frontend/.vite/deps/svelte_store.js new file mode 100644 index 0000000..5505850 --- /dev/null +++ b/frontend/.vite/deps/svelte_store.js @@ -0,0 +1,100 @@ +import { + active_effect, + active_reaction, + createSubscriber, + derived, + effect_root, + effect_tracking, + get, + readable, + readonly, + render_effect, + set_active_effect, + set_active_reaction, + writable +} from "./chunk-YAQNMG2X.js"; + +// node_modules/svelte/src/store/index-client.js +function toStore(get2, set) { + var effect = active_effect; + var reaction = active_reaction; + var init_value = get2(); + const store = writable(init_value, (set2) => { + var ran = init_value !== get2(); + var teardown; + var previous_reaction = active_reaction; + var previous_effect = active_effect; + set_active_reaction(reaction); + set_active_effect(effect); + try { + teardown = effect_root(() => { + render_effect(() => { + const value = get2(); + if (ran) set2(value); + }); + }); + } finally { + set_active_reaction(previous_reaction); + set_active_effect(previous_effect); + } + ran = true; + return teardown; + }); + if (set) { + return { + set, + update: (fn) => set(fn(get2())), + subscribe: store.subscribe + }; + } + return { + subscribe: store.subscribe + }; +} +function fromStore(store) { + let value = ( + /** @type {V} */ + void 0 + ); + const subscribe = createSubscriber((update) => { + let ran = false; + const unsubscribe = store.subscribe((v) => { + value = v; + if (ran) update(); + }); + ran = true; + return unsubscribe; + }); + function current() { + if (effect_tracking()) { + subscribe(); + return value; + } + return get(store); + } + if ("set" in store) { + return { + get current() { + return current(); + }, + set current(v) { + store.set(v); + } + }; + } + return { + get current() { + return current(); + } + }; +} +export { + derived, + fromStore, + get, + readable, + readonly, + toStore, + writable +}; +//# sourceMappingURL=svelte_store.js.map diff --git a/frontend/.vite/deps/svelte_store.js.map b/frontend/.vite/deps/svelte_store.js.map new file mode 100644 index 0000000..d0ee695 --- /dev/null +++ b/frontend/.vite/deps/svelte_store.js.map @@ -0,0 +1,7 @@ +{ + "version": 3, + "sources": ["../../node_modules/svelte/src/store/index-client.js"], + "sourcesContent": ["/** @import { Readable, Writable } from './public.js' */\nimport {\n\teffect_root,\n\teffect_tracking,\n\trender_effect\n} from '../internal/client/reactivity/effects.js';\nimport { get, writable } from './shared/index.js';\nimport { createSubscriber } from '../reactivity/create-subscriber.js';\nimport {\n\tactive_effect,\n\tactive_reaction,\n\tset_active_effect,\n\tset_active_reaction\n} from '../internal/client/runtime.js';\n\nexport { derived, get, readable, readonly, writable } from './shared/index.js';\n\n/**\n * @template V\n * @overload\n * @param {() => V} get\n * @param {(v: V) => void} set\n * 
@returns {Writable}\n */\n/**\n * @template V\n * @overload\n * @param {() => V} get\n * @returns {Readable}\n */\n/**\n * Create a store from a function that returns state, and (to make a writable store), an\n * optional second function that sets state.\n *\n * ```ts\n * import { toStore } from 'svelte/store';\n *\n * let count = $state(0);\n *\n * const store = toStore(() => count, (v) => (count = v));\n * ```\n * @template V\n * @param {() => V} get\n * @param {(v: V) => void} [set]\n * @returns {Writable | Readable}\n */\nexport function toStore(get, set) {\n\tvar effect = active_effect;\n\tvar reaction = active_reaction;\n\tvar init_value = get();\n\n\tconst store = writable(init_value, (set) => {\n\t\t// If the value has changed before we call subscribe, then\n\t\t// we need to treat the value as already having run\n\t\tvar ran = init_value !== get();\n\n\t\t// TODO do we need a different implementation on the server?\n\t\tvar teardown;\n\t\t// Apply the reaction and effect at the time of toStore being called\n\t\tvar previous_reaction = active_reaction;\n\t\tvar previous_effect = active_effect;\n\t\tset_active_reaction(reaction);\n\t\tset_active_effect(effect);\n\n\t\ttry {\n\t\t\tteardown = effect_root(() => {\n\t\t\t\trender_effect(() => {\n\t\t\t\t\tconst value = get();\n\t\t\t\t\tif (ran) set(value);\n\t\t\t\t});\n\t\t\t});\n\t\t} finally {\n\t\t\tset_active_reaction(previous_reaction);\n\t\t\tset_active_effect(previous_effect);\n\t\t}\n\n\t\tran = true;\n\n\t\treturn teardown;\n\t});\n\n\tif (set) {\n\t\treturn {\n\t\t\tset,\n\t\t\tupdate: (fn) => set(fn(get())),\n\t\t\tsubscribe: store.subscribe\n\t\t};\n\t}\n\n\treturn {\n\t\tsubscribe: store.subscribe\n\t};\n}\n\n/**\n * @template V\n * @overload\n * @param {Writable} store\n * @returns {{ current: V }}\n */\n/**\n * @template V\n * @overload\n * @param {Readable} store\n * @returns {{ readonly current: V }}\n */\n/**\n * Convert a store to an object with a reactive `current` property. 
If `store`\n * is a readable store, `current` will be a readonly property.\n *\n * ```ts\n * import { fromStore, get, writable } from 'svelte/store';\n *\n * const store = writable(0);\n *\n * const count = fromStore(store);\n *\n * count.current; // 0;\n * store.set(1);\n * count.current; // 1\n *\n * count.current += 1;\n * get(store); // 2\n * ```\n * @template V\n * @param {Writable | Readable} store\n */\nexport function fromStore(store) {\n\tlet value = /** @type {V} */ (undefined);\n\n\tconst subscribe = createSubscriber((update) => {\n\t\tlet ran = false;\n\n\t\tconst unsubscribe = store.subscribe((v) => {\n\t\t\tvalue = v;\n\t\t\tif (ran) update();\n\t\t});\n\n\t\tran = true;\n\n\t\treturn unsubscribe;\n\t});\n\n\tfunction current() {\n\t\tif (effect_tracking()) {\n\t\t\tsubscribe();\n\t\t\treturn value;\n\t\t}\n\n\t\treturn get(store);\n\t}\n\n\tif ('set' in store) {\n\t\treturn {\n\t\t\tget current() {\n\t\t\t\treturn current();\n\t\t\t},\n\t\t\tset current(v) {\n\t\t\t\tstore.set(v);\n\t\t\t}\n\t\t};\n\t}\n\n\treturn {\n\t\tget current() {\n\t\t\treturn current();\n\t\t}\n\t};\n}\n"], + "mappings": ";;;;;;;;;;;;;;;;;AA8CO,SAAS,QAAQA,MAAK,KAAK;AACjC,MAAI,SAAS;AACb,MAAI,WAAW;AACf,MAAI,aAAaA,KAAI;AAErB,QAAM,QAAQ,SAAS,YAAY,CAACC,SAAQ;AAG3C,QAAI,MAAM,eAAeD,KAAI;AAG7B,QAAI;AAEJ,QAAI,oBAAoB;AACxB,QAAI,kBAAkB;AACtB,wBAAoB,QAAQ;AAC5B,sBAAkB,MAAM;AAExB,QAAI;AACH,iBAAW,YAAY,MAAM;AAC5B,sBAAc,MAAM;AACnB,gBAAM,QAAQA,KAAI;AAClB,cAAI,IAAK,CAAAC,KAAI,KAAK;AAAA,QACnB,CAAC;AAAA,MACF,CAAC;AAAA,IACF,UAAE;AACD,0BAAoB,iBAAiB;AACrC,wBAAkB,eAAe;AAAA,IAClC;AAEA,UAAM;AAEN,WAAO;AAAA,EACR,CAAC;AAED,MAAI,KAAK;AACR,WAAO;AAAA,MACN;AAAA,MACA,QAAQ,CAAC,OAAO,IAAI,GAAGD,KAAI,CAAC,CAAC;AAAA,MAC7B,WAAW,MAAM;AAAA,IAClB;AAAA,EACD;AAEA,SAAO;AAAA,IACN,WAAW,MAAM;AAAA,EAClB;AACD;AAmCO,SAAS,UAAU,OAAO;AAChC,MAAI;AAAA;AAAA,IAA0B;AAAA;AAE9B,QAAM,YAAY,iBAAiB,CAAC,WAAW;AAC9C,QAAI,MAAM;AAEV,UAAM,cAAc,MAAM,UAAU,CAAC,MAAM;AAC1C,cAAQ;AACR,UAAI,IAAK,QAAO;AAAA,IACjB,CAAC;AAED,UAAM;AAEN,WAAO;AAAA,EACR,CAAC;AAED,WAAS,UAAU;AAClB,QAAI,gBAAgB,GAAG;AACtB,gBAAU;AACV,aAAO;AAAA,IACR;AAEA,WAAO,IAAI,KAAK;AAAA,EACjB;AAEA,MAAI,SAAS,OAAO;AACnB,WAAO;AAAA,MACN,IAAI,UAAU;AACb,eAAO,QAAQ;AAAA,MAChB;AAAA,MACA,IAAI,QAAQ,GAAG;AACd,cAAM,IAAI,CAAC;AAAA,MACZ;AAAA,IACD;AAAA,EACD;AAEA,SAAO;AAAA,IACN,IAAI,UAAU;AACb,aAAO,QAAQ;AAAA,IAChB;AAAA,EACD;AACD;", + "names": ["get", "set"] +} diff --git a/frontend/.vscode/extensions.json b/frontend/.vscode/extensions.json new file mode 100755 index 0000000..bdef820 --- /dev/null +++ b/frontend/.vscode/extensions.json @@ -0,0 +1,3 @@ +{ + "recommendations": ["svelte.svelte-vscode"] +} diff --git a/frontend/README.md b/frontend/README.md new file mode 100755 index 0000000..6a48769 --- /dev/null +++ b/frontend/README.md @@ -0,0 +1,39 @@ +# Superset Tools Frontend (SvelteKit) + +This is the frontend for the Superset Tools application, built with SvelteKit in SPA mode. + +## Development + +1. **Install dependencies**: + ```bash + npm install + ``` + +2. **Run development server**: + ```bash + npm run dev + ``` + The frontend will be available at `http://localhost:5173`. It is configured to proxy API requests to `http://localhost:8000`. + +## Production Build + +1. **Build the static SPA**: + ```bash + npm run build + ``` + This generates a static SPA in the `build/` directory. + +2. **Serve with Backend**: + The Python backend is configured to serve the files from `frontend/build/`. 
Ensure the backend is running:
+   ```bash
+   cd ../backend
+   python src/app.py
+   ```
+
+## Architecture
+
+- **Routing**: File-based routing in `src/routes/`.
+- **Layouts**: Shared UI in `src/routes/+layout.svelte`.
+- **Data Loading**: `load` functions in `+page.js` for efficient data fetching (see the sketch below).
+- **API Client**: Centralized API logic in `src/lib/api.js`.
+- **Styling**: Tailwind CSS.
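+
+### Example: loading data through the API client
+
+A minimal sketch of the data-loading pattern, assuming a hypothetical `fetchDashboards` helper exported from `src/lib/api.js` and an illustrative route; the real helper and route names may differ:
+
+```js
+// src/routes/dashboards/+page.js -- illustrative route
+import { fetchDashboards } from '$lib/api';
+
+/** @type {import('./$types').PageLoad} */
+export async function load({ fetch }) {
+  // Pass SvelteKit's fetch so that, under `npm run dev`, the request
+  // is routed through the dev-server proxy to http://localhost:8000.
+  const dashboards = await fetchDashboards(fetch);
+  return { dashboards };
+}
+```
+
+Whatever `load` returns is exposed to `+page.svelte` as `data`, so the component can render `data.dashboards` without fetching in `onMount`.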
diff --git a/frontend/index.html b/frontend/index.html
new file mode 100755
index 0000000..7d082ee
--- /dev/null
+++ b/frontend/index.html
@@ -0,0 +1,13 @@
+<!doctype html>
+<html lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>frontend</title>
+  </head>
+  <body>
+    <div id="app"></div>
+    <script type="module" src="/src/main.js"></script>
+  </body>
+</html>
diff --git a/frontend/jsconfig.json b/frontend/jsconfig.json new file mode 100755 index 0000000..c7a0b10 --- /dev/null +++ b/frontend/jsconfig.json @@ -0,0 +1,33 @@ +{ + "compilerOptions": { + "moduleResolution": "bundler", + "target": "ESNext", + "module": "ESNext", + /** + * svelte-preprocess cannot figure out whether you have + * a value or a type, so tell TypeScript to enforce using + * `import type` instead of `import` for Types. + */ + "verbatimModuleSyntax": true, + "isolatedModules": true, + "resolveJsonModule": true, + /** + * To have warnings / errors of the Svelte compiler at the + * correct position, enable source maps by default. + */ + "sourceMap": true, + "esModuleInterop": true, + "types": ["vite/client"], + "skipLibCheck": true, + /** + * Typecheck JS in `.svelte` and `.js` files by default. + * Disable this if you'd like to use dynamic types. + */ + "checkJs": true + }, + /** + * Use global.d.ts instead of compilerOptions.types + * to avoid limiting type declarations. + */ + "include": ["src/**/*.d.ts", "src/**/*.js", "src/**/*.svelte"] +} diff --git a/frontend/package-lock.json b/frontend/package-lock.json new file mode 100755 index 0000000..c033340 --- /dev/null +++ b/frontend/package-lock.json @@ -0,0 +1,2550 @@ +{ + "name": "frontend", + "version": "0.0.0", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "frontend", + "version": "0.0.0", + "devDependencies": { + "@sveltejs/adapter-static": "^3.0.10", + "@sveltejs/kit": "^2.49.2", + "@sveltejs/vite-plugin-svelte": "^6.2.1", + "autoprefixer": "^10.4.0", + "postcss": "^8.4.0", + "svelte": "^5.43.8", + "tailwindcss": "^3.0.0", + "vite": "^7.2.4" + } + }, + "node_modules/@alloc/quick-lru": { + "version": "5.2.0", + "resolved": "https://registry.npmjs.org/@alloc/quick-lru/-/quick-lru-5.2.0.tgz", + "integrity": "sha512-UrcABB+4bUrFABwbluTIBErXwvbsU/V7TZWfmbgJfbkwiBuziS9gxdODUyuiecfdGQ85jglMW6juS3+z5TsKLw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/@esbuild/aix-ppc64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.27.2.tgz", + "integrity": "sha512-GZMB+a0mOMZs4MpDbj8RJp4cw+w1WV5NYD6xzgvzUJ5Ek2jerwfO2eADyI6ExDSUED+1X8aMbegahsJi+8mgpw==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "aix" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-arm": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.27.2.tgz", + "integrity": "sha512-DVNI8jlPa7Ujbr1yjU2PfUSRtAUZPG9I1RwW4F4xFB1Imiu2on0ADiI/c3td+KmDtVKNbi+nffGDQMfcIMkwIA==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.27.2.tgz", + "integrity": "sha512-pvz8ZZ7ot/RBphf8fv60ljmaoydPU12VuXHImtAs0XhLLw+EXBi2BLe3OYSBslR4rryHvweW5gmkKFwTiFy6KA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/android-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.27.2.tgz", + "integrity":
"sha512-z8Ank4Byh4TJJOh4wpz8g2vDy75zFL0TlZlkUkEwYXuPSgX8yzep596n6mT7905kA9uHZsf/o2OJZubl2l3M7A==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/darwin-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.27.2.tgz", + "integrity": "sha512-davCD2Zc80nzDVRwXTcQP/28fiJbcOwvdolL0sOiOsbwBa72kegmVU0Wrh1MYrbuCL98Omp5dVhQFWRKR2ZAlg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/darwin-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.27.2.tgz", + "integrity": "sha512-ZxtijOmlQCBWGwbVmwOF/UCzuGIbUkqB1faQRf5akQmxRJ1ujusWsb3CVfk/9iZKr2L5SMU5wPBi1UWbvL+VQA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/freebsd-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.27.2.tgz", + "integrity": "sha512-lS/9CN+rgqQ9czogxlMcBMGd+l8Q3Nj1MFQwBZJyoEKI50XGxwuzznYdwcav6lpOGv5BqaZXqvBSiB/kJ5op+g==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/freebsd-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.27.2.tgz", + "integrity": "sha512-tAfqtNYb4YgPnJlEFu4c212HYjQWSO/w/h/lQaBK7RbwGIkBOuNKQI9tqWzx7Wtp7bTPaGC6MJvWI608P3wXYA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-arm": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.27.2.tgz", + "integrity": "sha512-vWfq4GaIMP9AIe4yj1ZUW18RDhx6EPQKjwe7n8BbIecFtCQG4CfHGaHuh7fdfq+y3LIA2vGS/o9ZBGVxIDi9hw==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.27.2.tgz", + "integrity": "sha512-hYxN8pr66NsCCiRFkHUAsxylNOcAQaxSSkHMMjcpx0si13t1LHFphxJZUiGwojB1a/Hd5OiPIqDdXONia6bhTw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-ia32": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.27.2.tgz", + "integrity": "sha512-MJt5BRRSScPDwG2hLelYhAAKh9imjHK5+NE/tvnRLbIqUWa+0E9N4WNMjmp/kXXPHZGqPLxggwVhz7QP8CTR8w==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-loong64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.27.2.tgz", + "integrity": "sha512-lugyF1atnAT463aO6KPshVCJK5NgRnU4yb3FUumyVz+cGvZbontBgzeGFO1nF+dPueHD367a2ZXe1NtUkAjOtg==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": 
">=18" + } + }, + "node_modules/@esbuild/linux-mips64el": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.27.2.tgz", + "integrity": "sha512-nlP2I6ArEBewvJ2gjrrkESEZkB5mIoaTswuqNFRv/WYd+ATtUpe9Y09RnJvgvdag7he0OWgEZWhviS1OTOKixw==", + "cpu": [ + "mips64el" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-ppc64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.27.2.tgz", + "integrity": "sha512-C92gnpey7tUQONqg1n6dKVbx3vphKtTHJaNG2Ok9lGwbZil6DrfyecMsp9CrmXGQJmZ7iiVXvvZH6Ml5hL6XdQ==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-riscv64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.27.2.tgz", + "integrity": "sha512-B5BOmojNtUyN8AXlK0QJyvjEZkWwy/FKvakkTDCziX95AowLZKR6aCDhG7LeF7uMCXEJqwa8Bejz5LTPYm8AvA==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-s390x": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.27.2.tgz", + "integrity": "sha512-p4bm9+wsPwup5Z8f4EpfN63qNagQ47Ua2znaqGH6bqLlmJ4bx97Y9JdqxgGZ6Y8xVTixUnEkoKSHcpRlDnNr5w==", + "cpu": [ + "s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/linux-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.27.2.tgz", + "integrity": "sha512-uwp2Tip5aPmH+NRUwTcfLb+W32WXjpFejTIOWZFw/v7/KnpCDKG66u4DLcurQpiYTiYwQ9B7KOeMJvLCu/OvbA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/netbsd-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.27.2.tgz", + "integrity": "sha512-Kj6DiBlwXrPsCRDeRvGAUb/LNrBASrfqAIok+xB0LxK8CHqxZ037viF13ugfsIpePH93mX7xfJp97cyDuTZ3cw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/netbsd-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.27.2.tgz", + "integrity": "sha512-HwGDZ0VLVBY3Y+Nw0JexZy9o/nUAWq9MlV7cahpaXKW6TOzfVno3y3/M8Ga8u8Yr7GldLOov27xiCnqRZf0tCA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openbsd-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.27.2.tgz", + "integrity": "sha512-DNIHH2BPQ5551A7oSHD0CKbwIA/Ox7+78/AWkbS5QoRzaqlev2uFayfSxq68EkonB+IKjiuxBFoV8ESJy8bOHA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openbsd-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.27.2.tgz", + "integrity": 
"sha512-/it7w9Nb7+0KFIzjalNJVR5bOzA9Vay+yIPLVHfIQYG/j+j9VTH84aNB8ExGKPU4AzfaEvN9/V4HV+F+vo8OEg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openharmony-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/openharmony-arm64/-/openharmony-arm64-0.27.2.tgz", + "integrity": "sha512-LRBbCmiU51IXfeXk59csuX/aSaToeG7w48nMwA6049Y4J4+VbWALAuXcs+qcD04rHDuSCSRKdmY63sruDS5qag==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openharmony" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/sunos-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.27.2.tgz", + "integrity": "sha512-kMtx1yqJHTmqaqHPAzKCAkDaKsffmXkPHThSfRwZGyuqyIeBvf08KSsYXl+abf5HDAPMJIPnbBfXvP2ZC2TfHg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "sunos" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-arm64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.27.2.tgz", + "integrity": "sha512-Yaf78O/B3Kkh+nKABUF++bvJv5Ijoy9AN1ww904rOXZFLWVc5OLOfL56W+C8F9xn5JQZa3UX6m+IktJnIb1Jjg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-ia32": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.27.2.tgz", + "integrity": "sha512-Iuws0kxo4yusk7sw70Xa2E2imZU5HoixzxfGCdxwBdhiDgt9vX9VUCBhqcwY7/uh//78A1hMkkROMJq9l27oLQ==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/win32-x64": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.27.2.tgz", + "integrity": "sha512-sRdU18mcKf7F+YgheI/zGf5alZatMUTKj/jNS6l744f9u3WFu4v7twcUI9vu4mknF4Y9aDlblIie0IM+5xxaqQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@jridgewell/gen-mapping": { + "version": "0.3.13", + "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz", + "integrity": "sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/sourcemap-codec": "^1.5.0", + "@jridgewell/trace-mapping": "^0.3.24" + } + }, + "node_modules/@jridgewell/remapping": { + "version": "2.3.5", + "resolved": "https://registry.npmjs.org/@jridgewell/remapping/-/remapping-2.3.5.tgz", + "integrity": "sha512-LI9u/+laYG4Ds1TDKSJW2YPrIlcVYOwi2fUC6xB43lueCjgxV4lffOCZCtYFiH6TNOX+tQKXx97T4IKHbhyHEQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/gen-mapping": "^0.3.5", + "@jridgewell/trace-mapping": "^0.3.24" + } + }, + "node_modules/@jridgewell/resolve-uri": { + "version": "3.1.2", + "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz", + "integrity": "sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6.0.0" + } + }, + "node_modules/@jridgewell/sourcemap-codec": 
{ + "version": "1.5.5", + "resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz", + "integrity": "sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==", + "dev": true, + "license": "MIT" + }, + "node_modules/@jridgewell/trace-mapping": { + "version": "0.3.31", + "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz", + "integrity": "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/resolve-uri": "^3.1.0", + "@jridgewell/sourcemap-codec": "^1.4.14" + } + }, + "node_modules/@nodelib/fs.scandir": { + "version": "2.1.5", + "resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz", + "integrity": "sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==", + "dev": true, + "license": "MIT", + "dependencies": { + "@nodelib/fs.stat": "2.0.5", + "run-parallel": "^1.1.9" + }, + "engines": { + "node": ">= 8" + } + }, + "node_modules/@nodelib/fs.stat": { + "version": "2.0.5", + "resolved": "https://registry.npmjs.org/@nodelib/fs.stat/-/fs.stat-2.0.5.tgz", + "integrity": "sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 8" + } + }, + "node_modules/@nodelib/fs.walk": { + "version": "1.2.8", + "resolved": "https://registry.npmjs.org/@nodelib/fs.walk/-/fs.walk-1.2.8.tgz", + "integrity": "sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==", + "dev": true, + "license": "MIT", + "dependencies": { + "@nodelib/fs.scandir": "2.1.5", + "fastq": "^1.6.0" + }, + "engines": { + "node": ">= 8" + } + }, + "node_modules/@polka/url": { + "version": "1.0.0-next.29", + "resolved": "https://registry.npmjs.org/@polka/url/-/url-1.0.0-next.29.tgz", + "integrity": "sha512-wwQAWhWSuHaag8c4q/KN/vCoeOJYshAIvMQwD4GpSb3OiZklFfvAgmj0VCBBImRpuF/aFgIRzllXlVX93Jevww==", + "dev": true, + "license": "MIT" + }, + "node_modules/@rollup/rollup-android-arm-eabi": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.53.5.tgz", + "integrity": "sha512-iDGS/h7D8t7tvZ1t6+WPK04KD0MwzLZrG0se1hzBjSi5fyxlsiggoJHwh18PCFNn7tG43OWb6pdZ6Y+rMlmyNQ==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ] + }, + "node_modules/@rollup/rollup-android-arm64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.53.5.tgz", + "integrity": "sha512-wrSAViWvZHBMMlWk6EJhvg8/rjxzyEhEdgfMMjREHEq11EtJ6IP6yfcCH57YAEca2Oe3FNCE9DSTgU70EIGmVw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "android" + ] + }, + "node_modules/@rollup/rollup-darwin-arm64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.53.5.tgz", + "integrity": "sha512-S87zZPBmRO6u1YXQLwpveZm4JfPpAa6oHBX7/ghSiGH3rz/KDgAu1rKdGutV+WUI6tKDMbaBJomhnT30Y2t4VQ==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ] + }, + "node_modules/@rollup/rollup-darwin-x64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.53.5.tgz", + 
"integrity": "sha512-YTbnsAaHo6VrAczISxgpTva8EkfQus0VPEVJCEaboHtZRIb6h6j0BNxRBOwnDciFTZLDPW5r+ZBmhL/+YpTZgA==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ] + }, + "node_modules/@rollup/rollup-freebsd-arm64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.53.5.tgz", + "integrity": "sha512-1T8eY2J8rKJWzaznV7zedfdhD1BqVs1iqILhmHDq/bqCUZsrMt+j8VCTHhP0vdfbHK3e1IQ7VYx3jlKqwlf+vw==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ] + }, + "node_modules/@rollup/rollup-freebsd-x64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.53.5.tgz", + "integrity": "sha512-sHTiuXyBJApxRn+VFMaw1U+Qsz4kcNlxQ742snICYPrY+DDL8/ZbaC4DVIB7vgZmp3jiDaKA0WpBdP0aqPJoBQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "freebsd" + ] + }, + "node_modules/@rollup/rollup-linux-arm-gnueabihf": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.53.5.tgz", + "integrity": "sha512-dV3T9MyAf0w8zPVLVBptVlzaXxka6xg1f16VAQmjg+4KMSTWDvhimI/Y6mp8oHwNrmnmVl9XxJ/w/mO4uIQONA==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm-musleabihf": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.53.5.tgz", + "integrity": "sha512-wIGYC1x/hyjP+KAu9+ewDI+fi5XSNiUi9Bvg6KGAh2TsNMA3tSEs+Sh6jJ/r4BV/bx/CyWu2ue9kDnIdRyafcQ==", + "cpu": [ + "arm" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.53.5.tgz", + "integrity": "sha512-Y+qVA0D9d0y2FRNiG9oM3Hut/DgODZbU9I8pLLPwAsU0tUKZ49cyV1tzmB/qRbSzGvY8lpgGkJuMyuhH7Ma+Vg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-arm64-musl": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.53.5.tgz", + "integrity": "sha512-juaC4bEgJsyFVfqhtGLz8mbopaWD+WeSOYr5E16y+1of6KQjc0BpwZLuxkClqY1i8sco+MdyoXPNiCkQou09+g==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-loong64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-loong64-gnu/-/rollup-linux-loong64-gnu-4.53.5.tgz", + "integrity": "sha512-rIEC0hZ17A42iXtHX+EPJVL/CakHo+tT7W0pbzdAGuWOt2jxDFh7A/lRhsNHBcqL4T36+UiAgwO8pbmn3dE8wA==", + "cpu": [ + "loong64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-ppc64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.53.5.tgz", + "integrity": "sha512-T7l409NhUE552RcAOcmJHj3xyZ2h7vMWzcwQI0hvn5tqHh3oSoclf9WgTl+0QqffWFG8MEVZZP1/OBglKZx52Q==", + "cpu": [ + "ppc64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + 
"node_modules/@rollup/rollup-linux-riscv64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.53.5.tgz", + "integrity": "sha512-7OK5/GhxbnrMcxIFoYfhV/TkknarkYC1hqUw1wU2xUN3TVRLNT5FmBv4KkheSG2xZ6IEbRAhTooTV2+R5Tk0lQ==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-riscv64-musl": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.53.5.tgz", + "integrity": "sha512-GwuDBE/PsXaTa76lO5eLJTyr2k8QkPipAyOrs4V/KJufHCZBJ495VCGJol35grx9xryk4V+2zd3Ri+3v7NPh+w==", + "cpu": [ + "riscv64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-s390x-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.53.5.tgz", + "integrity": "sha512-IAE1Ziyr1qNfnmiQLHBURAD+eh/zH1pIeJjeShleII7Vj8kyEm2PF77o+lf3WTHDpNJcu4IXJxNO0Zluro8bOw==", + "cpu": [ + "s390x" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-x64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.53.5.tgz", + "integrity": "sha512-Pg6E+oP7GvZ4XwgRJBuSXZjcqpIW3yCBhK4BcsANvb47qMvAbCjR6E+1a/U2WXz1JJxp9/4Dno3/iSJLcm5auw==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-linux-x64-musl": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.53.5.tgz", + "integrity": "sha512-txGtluxDKTxaMDzUduGP0wdfng24y1rygUMnmlUJ88fzCCULCLn7oE5kb2+tRB+MWq1QDZT6ObT5RrR8HFRKqg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "linux" + ] + }, + "node_modules/@rollup/rollup-openharmony-arm64": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-openharmony-arm64/-/rollup-openharmony-arm64-4.53.5.tgz", + "integrity": "sha512-3DFiLPnTxiOQV993fMc+KO8zXHTcIjgaInrqlG8zDp1TlhYl6WgrOHuJkJQ6M8zHEcntSJsUp1XFZSY8C1DYbg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openharmony" + ] + }, + "node_modules/@rollup/rollup-win32-arm64-msvc": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.53.5.tgz", + "integrity": "sha512-nggc/wPpNTgjGg75hu+Q/3i32R00Lq1B6N1DO7MCU340MRKL3WZJMjA9U4K4gzy3dkZPXm9E1Nc81FItBVGRlA==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-ia32-msvc": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.53.5.tgz", + "integrity": "sha512-U/54pTbdQpPLBdEzCT6NBCFAfSZMvmjr0twhnD9f4EIvlm9wy3jjQ38yQj1AGznrNO65EWQMgm/QUjuIVrYF9w==", + "cpu": [ + "ia32" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-x64-gnu": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-gnu/-/rollup-win32-x64-gnu-4.53.5.tgz", + "integrity": 
"sha512-2NqKgZSuLH9SXBBV2dWNRCZmocgSOx8OJSdpRaEcRlIfX8YrKxUT6z0F1NpvDVhOsl190UFTRh2F2WDWWCYp3A==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@rollup/rollup-win32-x64-msvc": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.53.5.tgz", + "integrity": "sha512-JRpZUhCfhZ4keB5v0fe02gQJy05GqboPOaxvjugW04RLSYYoB/9t2lx2u/tMs/Na/1NXfY8QYjgRljRpN+MjTQ==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "win32" + ] + }, + "node_modules/@standard-schema/spec": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@standard-schema/spec/-/spec-1.1.0.tgz", + "integrity": "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w==", + "dev": true, + "license": "MIT" + }, + "node_modules/@sveltejs/acorn-typescript": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/@sveltejs/acorn-typescript/-/acorn-typescript-1.0.8.tgz", + "integrity": "sha512-esgN+54+q0NjB0Y/4BomT9samII7jGwNy/2a3wNZbT2A2RpmXsXwUt24LvLhx6jUq2gVk4cWEvcRO6MFQbOfNA==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "acorn": "^8.9.0" + } + }, + "node_modules/@sveltejs/adapter-static": { + "version": "3.0.10", + "resolved": "https://registry.npmjs.org/@sveltejs/adapter-static/-/adapter-static-3.0.10.tgz", + "integrity": "sha512-7D9lYFWJmB7zxZyTE/qxjksvMqzMuYrrsyh1f4AlZqeZeACPRySjbC3aFiY55wb1tWUaKOQG9PVbm74JcN2Iew==", + "dev": true, + "license": "MIT", + "peerDependencies": { + "@sveltejs/kit": "^2.0.0" + } + }, + "node_modules/@sveltejs/kit": { + "version": "2.49.2", + "resolved": "https://registry.npmjs.org/@sveltejs/kit/-/kit-2.49.2.tgz", + "integrity": "sha512-Vp3zX/qlwerQmHMP6x0Ry1oY7eKKRcOWGc2P59srOp4zcqyn+etJyQpELgOi4+ZSUgteX8Y387NuwruLgGXLUQ==", + "dev": true, + "license": "MIT", + "peer": true, + "dependencies": { + "@standard-schema/spec": "^1.0.0", + "@sveltejs/acorn-typescript": "^1.0.5", + "@types/cookie": "^0.6.0", + "acorn": "^8.14.1", + "cookie": "^0.6.0", + "devalue": "^5.3.2", + "esm-env": "^1.2.2", + "kleur": "^4.1.5", + "magic-string": "^0.30.5", + "mrmime": "^2.0.0", + "sade": "^1.8.1", + "set-cookie-parser": "^2.6.0", + "sirv": "^3.0.0" + }, + "bin": { + "svelte-kit": "svelte-kit.js" + }, + "engines": { + "node": ">=18.13" + }, + "peerDependencies": { + "@opentelemetry/api": "^1.0.0", + "@sveltejs/vite-plugin-svelte": "^3.0.0 || ^4.0.0-next.1 || ^5.0.0 || ^6.0.0-next.0", + "svelte": "^4.0.0 || ^5.0.0-next.0", + "vite": "^5.0.3 || ^6.0.0 || ^7.0.0-beta.0" + }, + "peerDependenciesMeta": { + "@opentelemetry/api": { + "optional": true + } + } + }, + "node_modules/@sveltejs/vite-plugin-svelte": { + "version": "6.2.1", + "resolved": "https://registry.npmjs.org/@sveltejs/vite-plugin-svelte/-/vite-plugin-svelte-6.2.1.tgz", + "integrity": "sha512-YZs/OSKOQAQCnJvM/P+F1URotNnYNeU3P2s4oIpzm1uFaqUEqRxUB0g5ejMjEb5Gjb9/PiBI5Ktrq4rUUF8UVQ==", + "dev": true, + "license": "MIT", + "peer": true, + "dependencies": { + "@sveltejs/vite-plugin-svelte-inspector": "^5.0.0", + "debug": "^4.4.1", + "deepmerge": "^4.3.1", + "magic-string": "^0.30.17", + "vitefu": "^1.1.1" + }, + "engines": { + "node": "^20.19 || ^22.12 || >=24" + }, + "peerDependencies": { + "svelte": "^5.0.0", + "vite": "^6.3.0 || ^7.0.0" + } + }, + "node_modules/@sveltejs/vite-plugin-svelte-inspector": { + "version": "5.0.1", + "resolved": 
"https://registry.npmjs.org/@sveltejs/vite-plugin-svelte-inspector/-/vite-plugin-svelte-inspector-5.0.1.tgz", + "integrity": "sha512-ubWshlMk4bc8mkwWbg6vNvCeT7lGQojE3ijDh3QTR6Zr/R+GXxsGbyH4PExEPpiFmqPhYiVSVmHBjUcVc1JIrA==", + "dev": true, + "license": "MIT", + "dependencies": { + "debug": "^4.4.1" + }, + "engines": { + "node": "^20.19 || ^22.12 || >=24" + }, + "peerDependencies": { + "@sveltejs/vite-plugin-svelte": "^6.0.0-next.0", + "svelte": "^5.0.0", + "vite": "^6.3.0 || ^7.0.0" + } + }, + "node_modules/@types/cookie": { + "version": "0.6.0", + "resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.6.0.tgz", + "integrity": "sha512-4Kh9a6B2bQciAhf7FSuMRRkUWecJgJu9nPnx3yzpsfXX/c50REIqpHY4C82bXP90qrLtXtkDxTZosYO3UpOwlA==", + "dev": true, + "license": "MIT" + }, + "node_modules/@types/estree": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz", + "integrity": "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==", + "dev": true, + "license": "MIT" + }, + "node_modules/acorn": { + "version": "8.15.0", + "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.15.0.tgz", + "integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==", + "dev": true, + "license": "MIT", + "peer": true, + "bin": { + "acorn": "bin/acorn" + }, + "engines": { + "node": ">=0.4.0" + } + }, + "node_modules/any-promise": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/any-promise/-/any-promise-1.3.0.tgz", + "integrity": "sha512-7UvmKalWRt1wgjL1RrGxoSJW/0QZFIegpeGvZG9kjp8vrRu55XTHbwnqq2GpXm9uLbcuhxm3IqX9OB4MZR1b2A==", + "dev": true, + "license": "MIT" + }, + "node_modules/anymatch": { + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/anymatch/-/anymatch-3.1.3.tgz", + "integrity": "sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==", + "dev": true, + "license": "ISC", + "dependencies": { + "normalize-path": "^3.0.0", + "picomatch": "^2.0.4" + }, + "engines": { + "node": ">= 8" + } + }, + "node_modules/arg": { + "version": "5.0.2", + "resolved": "https://registry.npmjs.org/arg/-/arg-5.0.2.tgz", + "integrity": "sha512-PYjyFOLKQ9y57JvQ6QLo8dAgNqswh8M1RMJYdQduT6xbWSgK36P/Z/v+p888pM69jMMfS8Xd8F6I1kQ/I9HUGg==", + "dev": true, + "license": "MIT" + }, + "node_modules/aria-query": { + "version": "5.3.2", + "resolved": "https://registry.npmjs.org/aria-query/-/aria-query-5.3.2.tgz", + "integrity": "sha512-COROpnaoap1E2F000S62r6A60uHZnmlvomhfyT2DlTcrY1OrBKn2UhH7qn5wTC9zMvD0AY7csdPSNwKP+7WiQw==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/autoprefixer": { + "version": "10.4.23", + "resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.23.tgz", + "integrity": "sha512-YYTXSFulfwytnjAPlw8QHncHJmlvFKtczb8InXaAx9Q0LbfDnfEYDE55omerIJKihhmU61Ft+cAOSzQVaBUmeA==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/autoprefixer" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "browserslist": "^4.28.1", + "caniuse-lite": "^1.0.30001760", + "fraction.js": "^5.3.4", + "picocolors": "^1.1.1", + "postcss-value-parser": "^4.2.0" + }, + "bin": { + "autoprefixer": "bin/autoprefixer" + }, + "engines": { + "node": "^10 || ^12 || >=14" + }, + 
"peerDependencies": { + "postcss": "^8.1.0" + } + }, + "node_modules/axobject-query": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/axobject-query/-/axobject-query-4.1.0.tgz", + "integrity": "sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ==", + "dev": true, + "license": "Apache-2.0", + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/baseline-browser-mapping": { + "version": "2.9.11", + "resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.9.11.tgz", + "integrity": "sha512-Sg0xJUNDU1sJNGdfGWhVHX0kkZ+HWcvmVymJbj6NSgZZmW/8S9Y2HQ5euytnIgakgxN6papOAWiwDo1ctFDcoQ==", + "dev": true, + "license": "Apache-2.0", + "bin": { + "baseline-browser-mapping": "dist/cli.js" + } + }, + "node_modules/binary-extensions": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.3.0.tgz", + "integrity": "sha512-Ceh+7ox5qe7LJuLHoY0feh3pHuUDHAcRUeyL2VYghZwfpkNIy/+8Ocg0a3UuSoYzavmylwuLWQOf3hl0jjMMIw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/braces": { + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.3.tgz", + "integrity": "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==", + "dev": true, + "license": "MIT", + "dependencies": { + "fill-range": "^7.1.1" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/browserslist": { + "version": "4.28.1", + "resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.28.1.tgz", + "integrity": "sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/browserslist" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "peer": true, + "dependencies": { + "baseline-browser-mapping": "^2.9.0", + "caniuse-lite": "^1.0.30001759", + "electron-to-chromium": "^1.5.263", + "node-releases": "^2.0.27", + "update-browserslist-db": "^1.2.0" + }, + "bin": { + "browserslist": "cli.js" + }, + "engines": { + "node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7" + } + }, + "node_modules/camelcase-css": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/camelcase-css/-/camelcase-css-2.0.1.tgz", + "integrity": "sha512-QOSvevhslijgYwRx6Rv7zKdMF8lbRmx+uQGx2+vDc+KI/eBnsy9kit5aj23AgGu3pa4t9AgwbnXWqS+iOY+2aA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 6" + } + }, + "node_modules/caniuse-lite": { + "version": "1.0.30001761", + "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001761.tgz", + "integrity": "sha512-JF9ptu1vP2coz98+5051jZ4PwQgd2ni8A+gYSN7EA7dPKIMf0pDlSUxhdmVOaV3/fYK5uWBkgSXJaRLr4+3A6g==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/caniuse-lite" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "CC-BY-4.0" + }, + "node_modules/chokidar": { + "version": "3.6.0", + "resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz", + "integrity": 
"sha512-7VT13fmjotKpGipCW9JEQAusEPE+Ei8nl6/g4FBAmIm0GOOLMua9NDDo/DWp0ZAxCr3cPq5ZpBqmPAQgDda2Pw==", + "dev": true, + "license": "MIT", + "dependencies": { + "anymatch": "~3.1.2", + "braces": "~3.0.2", + "glob-parent": "~5.1.2", + "is-binary-path": "~2.1.0", + "is-glob": "~4.0.1", + "normalize-path": "~3.0.0", + "readdirp": "~3.6.0" + }, + "engines": { + "node": ">= 8.10.0" + }, + "funding": { + "url": "https://paulmillr.com/funding/" + }, + "optionalDependencies": { + "fsevents": "~2.3.2" + } + }, + "node_modules/chokidar/node_modules/glob-parent": { + "version": "5.1.2", + "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz", + "integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==", + "dev": true, + "license": "ISC", + "dependencies": { + "is-glob": "^4.0.1" + }, + "engines": { + "node": ">= 6" + } + }, + "node_modules/clsx": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/clsx/-/clsx-2.1.1.tgz", + "integrity": "sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/commander": { + "version": "4.1.1", + "resolved": "https://registry.npmjs.org/commander/-/commander-4.1.1.tgz", + "integrity": "sha512-NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 6" + } + }, + "node_modules/cookie": { + "version": "0.6.0", + "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", + "integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.6" + } + }, + "node_modules/cssesc": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/cssesc/-/cssesc-3.0.0.tgz", + "integrity": "sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==", + "dev": true, + "license": "MIT", + "bin": { + "cssesc": "bin/cssesc" + }, + "engines": { + "node": ">=4" + } + }, + "node_modules/debug": { + "version": "4.4.3", + "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz", + "integrity": "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==", + "dev": true, + "license": "MIT", + "dependencies": { + "ms": "^2.1.3" + }, + "engines": { + "node": ">=6.0" + }, + "peerDependenciesMeta": { + "supports-color": { + "optional": true + } + } + }, + "node_modules/deepmerge": { + "version": "4.3.1", + "resolved": "https://registry.npmjs.org/deepmerge/-/deepmerge-4.3.1.tgz", + "integrity": "sha512-3sUqbMEc77XqpdNO7FRyRog+eW3ph+GYCbj+rK+uYyRMuwsVy0rMiVtPn+QJlKFvWP/1PYpapqYn0Me2knFn+A==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/devalue": { + "version": "5.6.1", + "resolved": "https://registry.npmjs.org/devalue/-/devalue-5.6.1.tgz", + "integrity": "sha512-jDwizj+IlEZBunHcOuuFVBnIMPAEHvTsJj0BcIp94xYguLRVBcXO853px/MyIJvbVzWdsGvrRweIUWJw8hBP7A==", + "dev": true, + "license": "MIT" + }, + "node_modules/didyoumean": { + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/didyoumean/-/didyoumean-1.2.2.tgz", + "integrity": "sha512-gxtyfqMg7GKyhQmb056K7M3xszy/myH8w+B4RT+QXBQsvAOdc3XymqDDPHx1BgPgsdAA5SIifona89YtRATDzw==", + "dev": true, + "license": "Apache-2.0" + }, + "node_modules/dlv": { + "version": "1.1.3", + "resolved": 
"https://registry.npmjs.org/dlv/-/dlv-1.1.3.tgz", + "integrity": "sha512-+HlytyjlPKnIG8XuRG8WvmBP8xs8P71y+SKKS6ZXWoEgLuePxtDoUEiH7WkdePWrQ5JBpE6aoVqfZfJUQkjXwA==", + "dev": true, + "license": "MIT" + }, + "node_modules/electron-to-chromium": { + "version": "1.5.267", + "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.267.tgz", + "integrity": "sha512-0Drusm6MVRXSOJpGbaSVgcQsuB4hEkMpHXaVstcPmhu5LIedxs1xNK/nIxmQIU/RPC0+1/o0AVZfBTkTNJOdUw==", + "dev": true, + "license": "ISC" + }, + "node_modules/esbuild": { + "version": "0.27.2", + "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.27.2.tgz", + "integrity": "sha512-HyNQImnsOC7X9PMNaCIeAm4ISCQXs5a5YasTXVliKv4uuBo1dKrG0A+uQS8M5eXjVMnLg3WgXaKvprHlFJQffw==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "bin": { + "esbuild": "bin/esbuild" + }, + "engines": { + "node": ">=18" + }, + "optionalDependencies": { + "@esbuild/aix-ppc64": "0.27.2", + "@esbuild/android-arm": "0.27.2", + "@esbuild/android-arm64": "0.27.2", + "@esbuild/android-x64": "0.27.2", + "@esbuild/darwin-arm64": "0.27.2", + "@esbuild/darwin-x64": "0.27.2", + "@esbuild/freebsd-arm64": "0.27.2", + "@esbuild/freebsd-x64": "0.27.2", + "@esbuild/linux-arm": "0.27.2", + "@esbuild/linux-arm64": "0.27.2", + "@esbuild/linux-ia32": "0.27.2", + "@esbuild/linux-loong64": "0.27.2", + "@esbuild/linux-mips64el": "0.27.2", + "@esbuild/linux-ppc64": "0.27.2", + "@esbuild/linux-riscv64": "0.27.2", + "@esbuild/linux-s390x": "0.27.2", + "@esbuild/linux-x64": "0.27.2", + "@esbuild/netbsd-arm64": "0.27.2", + "@esbuild/netbsd-x64": "0.27.2", + "@esbuild/openbsd-arm64": "0.27.2", + "@esbuild/openbsd-x64": "0.27.2", + "@esbuild/openharmony-arm64": "0.27.2", + "@esbuild/sunos-x64": "0.27.2", + "@esbuild/win32-arm64": "0.27.2", + "@esbuild/win32-ia32": "0.27.2", + "@esbuild/win32-x64": "0.27.2" + } + }, + "node_modules/escalade": { + "version": "3.2.0", + "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz", + "integrity": "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/esm-env": { + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/esm-env/-/esm-env-1.2.2.tgz", + "integrity": "sha512-Epxrv+Nr/CaL4ZcFGPJIYLWFom+YeV1DqMLHJoEd9SYRxNbaFruBwfEX/kkHUJf55j2+TUbmDcmuilbP1TmXHA==", + "dev": true, + "license": "MIT" + }, + "node_modules/esrap": { + "version": "2.2.1", + "resolved": "https://registry.npmjs.org/esrap/-/esrap-2.2.1.tgz", + "integrity": "sha512-GiYWG34AN/4CUyaWAgunGt0Rxvr1PTMlGC0vvEov/uOQYWne2bpN03Um+k8jT+q3op33mKouP2zeJ6OlM+qeUg==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/sourcemap-codec": "^1.4.15" + } + }, + "node_modules/fast-glob": { + "version": "3.3.3", + "resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.3.tgz", + "integrity": "sha512-7MptL8U0cqcFdzIzwOTHoilX9x5BrNqye7Z/LuC7kCMRio1EMSyqRK3BEAUD7sXRq4iT4AzTVuZdhgQ2TCvYLg==", + "dev": true, + "license": "MIT", + "dependencies": { + "@nodelib/fs.stat": "^2.0.2", + "@nodelib/fs.walk": "^1.2.3", + "glob-parent": "^5.1.2", + "merge2": "^1.3.0", + "micromatch": "^4.0.8" + }, + "engines": { + "node": ">=8.6.0" + } + }, + "node_modules/fast-glob/node_modules/glob-parent": { + "version": "5.1.2", + "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz", + "integrity": 
"sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==", + "dev": true, + "license": "ISC", + "dependencies": { + "is-glob": "^4.0.1" + }, + "engines": { + "node": ">= 6" + } + }, + "node_modules/fastq": { + "version": "1.19.1", + "resolved": "https://registry.npmjs.org/fastq/-/fastq-1.19.1.tgz", + "integrity": "sha512-GwLTyxkCXjXbxqIhTsMI2Nui8huMPtnxg7krajPJAjnEG/iiOS7i+zCtWGZR9G0NBKbXKh6X9m9UIsYX/N6vvQ==", + "dev": true, + "license": "ISC", + "dependencies": { + "reusify": "^1.0.4" + } + }, + "node_modules/fill-range": { + "version": "7.1.1", + "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.1.1.tgz", + "integrity": "sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==", + "dev": true, + "license": "MIT", + "dependencies": { + "to-regex-range": "^5.0.1" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/fraction.js": { + "version": "5.3.4", + "resolved": "https://registry.npmjs.org/fraction.js/-/fraction.js-5.3.4.tgz", + "integrity": "sha512-1X1NTtiJphryn/uLQz3whtY6jK3fTqoE3ohKs0tT+Ujr1W59oopxmoEh7Lu5p6vBaPbgoM0bzveAW4Qi5RyWDQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": "*" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/rawify" + } + }, + "node_modules/fsevents": { + "version": "2.3.3", + "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz", + "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==", + "dev": true, + "hasInstallScript": true, + "license": "MIT", + "optional": true, + "os": [ + "darwin" + ], + "engines": { + "node": "^8.16.0 || ^10.6.0 || >=11.0.0" + } + }, + "node_modules/function-bind": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz", + "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==", + "dev": true, + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/glob-parent": { + "version": "6.0.2", + "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz", + "integrity": "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==", + "dev": true, + "license": "ISC", + "dependencies": { + "is-glob": "^4.0.3" + }, + "engines": { + "node": ">=10.13.0" + } + }, + "node_modules/hasown": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz", + "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "function-bind": "^1.1.2" + }, + "engines": { + "node": ">= 0.4" + } + }, + "node_modules/is-binary-path": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/is-binary-path/-/is-binary-path-2.1.0.tgz", + "integrity": "sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==", + "dev": true, + "license": "MIT", + "dependencies": { + "binary-extensions": "^2.0.0" + }, + "engines": { + "node": ">=8" + } + }, + "node_modules/is-core-module": { + "version": "2.16.1", + "resolved": "https://registry.npmjs.org/is-core-module/-/is-core-module-2.16.1.tgz", + "integrity": "sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==", + "dev": true, + "license": "MIT", + "dependencies": { + "hasown": 
"^2.0.2" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/is-extglob": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz", + "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/is-glob": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz", + "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-extglob": "^2.1.1" + }, + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/is-number": { + "version": "7.0.0", + "resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz", + "integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.12.0" + } + }, + "node_modules/is-reference": { + "version": "3.0.3", + "resolved": "https://registry.npmjs.org/is-reference/-/is-reference-3.0.3.tgz", + "integrity": "sha512-ixkJoqQvAP88E6wLydLGGqCJsrFUnqoH6HnaczB8XmDH1oaWU+xxdptvikTgaEhtZ53Ky6YXiBuUI2WXLMCwjw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@types/estree": "^1.0.6" + } + }, + "node_modules/jiti": { + "version": "1.21.7", + "resolved": "https://registry.npmjs.org/jiti/-/jiti-1.21.7.tgz", + "integrity": "sha512-/imKNG4EbWNrVjoNC/1H5/9GFy+tqjGBHCaSsN+P2RnPqjsLmv6UD3Ej+Kj8nBWaRAwyk7kK5ZUc+OEatnTR3A==", + "dev": true, + "license": "MIT", + "peer": true, + "bin": { + "jiti": "bin/jiti.js" + } + }, + "node_modules/kleur": { + "version": "4.1.5", + "resolved": "https://registry.npmjs.org/kleur/-/kleur-4.1.5.tgz", + "integrity": "sha512-o+NO+8WrRiQEE4/7nwRJhN1HWpVmJm511pBHUxPLtp0BUISzlBplORYSmTclCnJvQq2tKu/sgl3xVpkc7ZWuQQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/lilconfig": { + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/lilconfig/-/lilconfig-3.1.3.tgz", + "integrity": "sha512-/vlFKAoH5Cgt3Ie+JLhRbwOsCQePABiU3tJ1egGvyQ+33R/vcwM2Zl2QR/LzjsBeItPt3oSVXapn+m4nQDvpzw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=14" + }, + "funding": { + "url": "https://github.com/sponsors/antonk52" + } + }, + "node_modules/lines-and-columns": { + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz", + "integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==", + "dev": true, + "license": "MIT" + }, + "node_modules/locate-character": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/locate-character/-/locate-character-3.0.0.tgz", + "integrity": "sha512-SW13ws7BjaeJ6p7Q6CO2nchbYEc3X3J6WrmTTDto7yMPqVSZTUyY5Tjbid+Ab8gLnATtygYtiDIJGQRRn2ZOiA==", + "dev": true, + "license": "MIT" + }, + "node_modules/magic-string": { + "version": "0.30.21", + "resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.21.tgz", + "integrity": "sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/sourcemap-codec": "^1.5.5" + } + }, + "node_modules/merge2": { + "version": "1.4.1", + "resolved": 
"https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz", + "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 8" + } + }, + "node_modules/micromatch": { + "version": "4.0.8", + "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.8.tgz", + "integrity": "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==", + "dev": true, + "license": "MIT", + "dependencies": { + "braces": "^3.0.3", + "picomatch": "^2.3.1" + }, + "engines": { + "node": ">=8.6" + } + }, + "node_modules/mri": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/mri/-/mri-1.2.0.tgz", + "integrity": "sha512-tzzskb3bG8LvYGFF/mDTpq3jpI6Q9wc3LEmBaghu+DdCssd1FakN7Bc0hVNmEyGq1bq3RgfkCb3cmQLpNPOroA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=4" + } + }, + "node_modules/mrmime": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/mrmime/-/mrmime-2.0.1.tgz", + "integrity": "sha512-Y3wQdFg2Va6etvQ5I82yUhGdsKrcYox6p7FfL1LbK2J4V01F9TGlepTIhnK24t7koZibmg82KGglhA1XK5IsLQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=10" + } + }, + "node_modules/ms": { + "version": "2.1.3", + "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz", + "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==", + "dev": true, + "license": "MIT" + }, + "node_modules/mz": { + "version": "2.7.0", + "resolved": "https://registry.npmjs.org/mz/-/mz-2.7.0.tgz", + "integrity": "sha512-z81GNO7nnYMEhrGh9LeymoE4+Yr0Wn5McHIZMK5cfQCl+NDX08sCZgUc9/6MHni9IWuFLm1Z3HTCXu2z9fN62Q==", + "dev": true, + "license": "MIT", + "dependencies": { + "any-promise": "^1.0.0", + "object-assign": "^4.0.1", + "thenify-all": "^1.0.0" + } + }, + "node_modules/nanoid": { + "version": "3.3.11", + "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz", + "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==", + "dev": true, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "bin": { + "nanoid": "bin/nanoid.cjs" + }, + "engines": { + "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1" + } + }, + "node_modules/node-releases": { + "version": "2.0.27", + "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz", + "integrity": "sha512-nmh3lCkYZ3grZvqcCH+fjmQ7X+H0OeZgP40OierEaAptX4XofMh5kwNbWh7lBduUzCcV/8kZ+NDLCwm2iorIlA==", + "dev": true, + "license": "MIT" + }, + "node_modules/normalize-path": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz", + "integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/object-assign": { + "version": "4.1.1", + "resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz", + "integrity": "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/object-hash": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/object-hash/-/object-hash-3.0.0.tgz", + "integrity": 
"sha512-RSn9F68PjH9HqtltsSnqYC1XXoWe9Bju5+213R98cNGttag9q9yAOTzdbsqvIa7aNm5WffBZFpWYr2aWrklWAw==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 6" + } + }, + "node_modules/path-parse": { + "version": "1.0.7", + "resolved": "https://registry.npmjs.org/path-parse/-/path-parse-1.0.7.tgz", + "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==", + "dev": true, + "license": "MIT" + }, + "node_modules/picocolors": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz", + "integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==", + "dev": true, + "license": "ISC" + }, + "node_modules/picomatch": { + "version": "2.3.1", + "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz", + "integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=8.6" + }, + "funding": { + "url": "https://github.com/sponsors/jonschlinkert" + } + }, + "node_modules/pify": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/pify/-/pify-2.3.0.tgz", + "integrity": "sha512-udgsAY+fTnvv7kI7aaxbqwWNb0AHiB0qBO89PZKPkoTmGOgdbrHDKD+0B2X4uTfJ/FT1R09r9gTsjUjNJotuog==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/pirates": { + "version": "4.0.7", + "resolved": "https://registry.npmjs.org/pirates/-/pirates-4.0.7.tgz", + "integrity": "sha512-TfySrs/5nm8fQJDcBDuUng3VOUKsd7S+zqvbOTiGXHfxX4wK31ard+hoNuvkicM/2YFzlpDgABOevKSsB4G/FA==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 6" + } + }, + "node_modules/postcss": { + "version": "8.5.6", + "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz", + "integrity": "sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/postcss" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "peer": true, + "dependencies": { + "nanoid": "^3.3.11", + "picocolors": "^1.1.1", + "source-map-js": "^1.2.1" + }, + "engines": { + "node": "^10 || ^12 || >=14" + } + }, + "node_modules/postcss-import": { + "version": "15.1.0", + "resolved": "https://registry.npmjs.org/postcss-import/-/postcss-import-15.1.0.tgz", + "integrity": "sha512-hpr+J05B2FVYUAXHeK1YyI267J/dDDhMU6B6civm8hSY1jYJnBXxzKDKDswzJmtLHryrjhnDjqqp/49t8FALew==", + "dev": true, + "license": "MIT", + "dependencies": { + "postcss-value-parser": "^4.0.0", + "read-cache": "^1.0.0", + "resolve": "^1.1.7" + }, + "engines": { + "node": ">=14.0.0" + }, + "peerDependencies": { + "postcss": "^8.0.0" + } + }, + "node_modules/postcss-js": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/postcss-js/-/postcss-js-4.1.0.tgz", + "integrity": "sha512-oIAOTqgIo7q2EOwbhb8UalYePMvYoIeRY2YKntdpFQXNosSu3vLrniGgmH9OKs/qAkfoj5oB3le/7mINW1LCfw==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "camelcase-css": "^2.0.1" + }, + "engines": { + "node": "^12 || ^14 || >= 16" + }, + "peerDependencies": { + 
"postcss": "^8.4.21" + } + }, + "node_modules/postcss-load-config": { + "version": "6.0.1", + "resolved": "https://registry.npmjs.org/postcss-load-config/-/postcss-load-config-6.0.1.tgz", + "integrity": "sha512-oPtTM4oerL+UXmx+93ytZVN82RrlY/wPUV8IeDxFrzIjXOLF1pN+EmKPLbubvKHT2HC20xXsCAH2Z+CKV6Oz/g==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "lilconfig": "^3.1.1" + }, + "engines": { + "node": ">= 18" + }, + "peerDependencies": { + "jiti": ">=1.21.0", + "postcss": ">=8.0.9", + "tsx": "^4.8.1", + "yaml": "^2.4.2" + }, + "peerDependenciesMeta": { + "jiti": { + "optional": true + }, + "postcss": { + "optional": true + }, + "tsx": { + "optional": true + }, + "yaml": { + "optional": true + } + } + }, + "node_modules/postcss-nested": { + "version": "6.2.0", + "resolved": "https://registry.npmjs.org/postcss-nested/-/postcss-nested-6.2.0.tgz", + "integrity": "sha512-HQbt28KulC5AJzG+cZtj9kvKB93CFCdLvog1WFLf1D+xmMvPGlBstkpTEZfK5+AN9hfJocyBFCNiqyS48bpgzQ==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": "https://opencollective.com/postcss/" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "postcss-selector-parser": "^6.1.1" + }, + "engines": { + "node": ">=12.0" + }, + "peerDependencies": { + "postcss": "^8.2.14" + } + }, + "node_modules/postcss-selector-parser": { + "version": "6.1.2", + "resolved": "https://registry.npmjs.org/postcss-selector-parser/-/postcss-selector-parser-6.1.2.tgz", + "integrity": "sha512-Q8qQfPiZ+THO/3ZrOrO0cJJKfpYCagtMUkXbnEfmgUjwXg6z/WBeOyS9APBBPCTSiDV+s4SwQGu8yFsiMRIudg==", + "dev": true, + "license": "MIT", + "dependencies": { + "cssesc": "^3.0.0", + "util-deprecate": "^1.0.2" + }, + "engines": { + "node": ">=4" + } + }, + "node_modules/postcss-value-parser": { + "version": "4.2.0", + "resolved": "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz", + "integrity": "sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==", + "dev": true, + "license": "MIT" + }, + "node_modules/queue-microtask": { + "version": "1.2.3", + "resolved": "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz", + "integrity": "sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==", + "dev": true, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "license": "MIT" + }, + "node_modules/read-cache": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/read-cache/-/read-cache-1.0.0.tgz", + "integrity": "sha512-Owdv/Ft7IjOgm/i0xvNDZ1LrRANRfew4b2prF3OWMQLxLfu3bS8FVhCsrSCMK4lR56Y9ya+AThoTpDCTxCmpRA==", + "dev": true, + "license": "MIT", + "dependencies": { + "pify": "^2.3.0" + } + }, + "node_modules/readdirp": { + "version": "3.6.0", + "resolved": "https://registry.npmjs.org/readdirp/-/readdirp-3.6.0.tgz", + "integrity": "sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==", + "dev": true, + "license": "MIT", + "dependencies": { + "picomatch": "^2.2.1" + }, + "engines": { + "node": ">=8.10.0" + } + }, + "node_modules/resolve": { + "version": "1.22.11", + 
"resolved": "https://registry.npmjs.org/resolve/-/resolve-1.22.11.tgz", + "integrity": "sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-core-module": "^2.16.1", + "path-parse": "^1.0.7", + "supports-preserve-symlinks-flag": "^1.0.0" + }, + "bin": { + "resolve": "bin/resolve" + }, + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/reusify": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/reusify/-/reusify-1.1.0.tgz", + "integrity": "sha512-g6QUff04oZpHs0eG5p83rFLhHeV00ug/Yf9nZM6fLeUrPguBTkTQOdpAWWspMh55TZfVQDPaN3NQJfbVRAxdIw==", + "dev": true, + "license": "MIT", + "engines": { + "iojs": ">=1.0.0", + "node": ">=0.10.0" + } + }, + "node_modules/rollup": { + "version": "4.53.5", + "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.53.5.tgz", + "integrity": "sha512-iTNAbFSlRpcHeeWu73ywU/8KuU/LZmNCSxp6fjQkJBD3ivUb8tpDrXhIxEzA05HlYMEwmtaUnb3RP+YNv162OQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@types/estree": "1.0.8" + }, + "bin": { + "rollup": "dist/bin/rollup" + }, + "engines": { + "node": ">=18.0.0", + "npm": ">=8.0.0" + }, + "optionalDependencies": { + "@rollup/rollup-android-arm-eabi": "4.53.5", + "@rollup/rollup-android-arm64": "4.53.5", + "@rollup/rollup-darwin-arm64": "4.53.5", + "@rollup/rollup-darwin-x64": "4.53.5", + "@rollup/rollup-freebsd-arm64": "4.53.5", + "@rollup/rollup-freebsd-x64": "4.53.5", + "@rollup/rollup-linux-arm-gnueabihf": "4.53.5", + "@rollup/rollup-linux-arm-musleabihf": "4.53.5", + "@rollup/rollup-linux-arm64-gnu": "4.53.5", + "@rollup/rollup-linux-arm64-musl": "4.53.5", + "@rollup/rollup-linux-loong64-gnu": "4.53.5", + "@rollup/rollup-linux-ppc64-gnu": "4.53.5", + "@rollup/rollup-linux-riscv64-gnu": "4.53.5", + "@rollup/rollup-linux-riscv64-musl": "4.53.5", + "@rollup/rollup-linux-s390x-gnu": "4.53.5", + "@rollup/rollup-linux-x64-gnu": "4.53.5", + "@rollup/rollup-linux-x64-musl": "4.53.5", + "@rollup/rollup-openharmony-arm64": "4.53.5", + "@rollup/rollup-win32-arm64-msvc": "4.53.5", + "@rollup/rollup-win32-ia32-msvc": "4.53.5", + "@rollup/rollup-win32-x64-gnu": "4.53.5", + "@rollup/rollup-win32-x64-msvc": "4.53.5", + "fsevents": "~2.3.2" + } + }, + "node_modules/run-parallel": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/run-parallel/-/run-parallel-1.2.0.tgz", + "integrity": "sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==", + "dev": true, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "license": "MIT", + "dependencies": { + "queue-microtask": "^1.2.2" + } + }, + "node_modules/sade": { + "version": "1.8.1", + "resolved": "https://registry.npmjs.org/sade/-/sade-1.8.1.tgz", + "integrity": "sha512-xal3CZX1Xlo/k4ApwCFrHVACi9fBqJ7V+mwhBsuf/1IOKbBy098Fex+Wa/5QMubw09pSZ/u8EY8PWgevJsXp1A==", + "dev": true, + "license": "MIT", + "dependencies": { + "mri": "^1.1.0" + }, + "engines": { + "node": ">=6" + } + }, + "node_modules/set-cookie-parser": { + "version": "2.7.2", + "resolved": "https://registry.npmjs.org/set-cookie-parser/-/set-cookie-parser-2.7.2.tgz", + "integrity": "sha512-oeM1lpU/UvhTxw+g3cIfxXHyJRc/uidd3yK1P242gzHds0udQBYzs3y8j4gCCW+ZJ7ad0yctld8RYO+bdurlvw==", + "dev": true, + 
"license": "MIT" + }, + "node_modules/sirv": { + "version": "3.0.2", + "resolved": "https://registry.npmjs.org/sirv/-/sirv-3.0.2.tgz", + "integrity": "sha512-2wcC/oGxHis/BoHkkPwldgiPSYcpZK3JU28WoMVv55yHJgcZ8rlXvuG9iZggz+sU1d4bRgIGASwyWqjxu3FM0g==", + "dev": true, + "license": "MIT", + "dependencies": { + "@polka/url": "^1.0.0-next.24", + "mrmime": "^2.0.0", + "totalist": "^3.0.0" + }, + "engines": { + "node": ">=18" + } + }, + "node_modules/source-map-js": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", + "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==", + "dev": true, + "license": "BSD-3-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/sucrase": { + "version": "3.35.1", + "resolved": "https://registry.npmjs.org/sucrase/-/sucrase-3.35.1.tgz", + "integrity": "sha512-DhuTmvZWux4H1UOnWMB3sk0sbaCVOoQZjv8u1rDoTV0HTdGem9hkAZtl4JZy8P2z4Bg0nT+YMeOFyVr4zcG5Tw==", + "dev": true, + "license": "MIT", + "dependencies": { + "@jridgewell/gen-mapping": "^0.3.2", + "commander": "^4.0.0", + "lines-and-columns": "^1.1.6", + "mz": "^2.7.0", + "pirates": "^4.0.1", + "tinyglobby": "^0.2.11", + "ts-interface-checker": "^0.1.9" + }, + "bin": { + "sucrase": "bin/sucrase", + "sucrase-node": "bin/sucrase-node" + }, + "engines": { + "node": ">=16 || 14 >=14.17" + } + }, + "node_modules/supports-preserve-symlinks-flag": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz", + "integrity": "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">= 0.4" + }, + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/svelte": { + "version": "5.46.0", + "resolved": "https://registry.npmjs.org/svelte/-/svelte-5.46.0.tgz", + "integrity": "sha512-ZhLtvroYxUxr+HQJfMZEDRsGsmU46x12RvAv/zi9584f5KOX7bUrEbhPJ7cKFmUvZTJXi/CFZUYwDC6M1FigPw==", + "dev": true, + "license": "MIT", + "peer": true, + "dependencies": { + "@jridgewell/remapping": "^2.3.4", + "@jridgewell/sourcemap-codec": "^1.5.0", + "@sveltejs/acorn-typescript": "^1.0.5", + "@types/estree": "^1.0.5", + "acorn": "^8.12.1", + "aria-query": "^5.3.1", + "axobject-query": "^4.1.0", + "clsx": "^2.1.1", + "devalue": "^5.5.0", + "esm-env": "^1.2.1", + "esrap": "^2.2.1", + "is-reference": "^3.0.3", + "locate-character": "^3.0.0", + "magic-string": "^0.30.11", + "zimmerframe": "^1.1.2" + }, + "engines": { + "node": ">=18" + } + }, + "node_modules/tailwindcss": { + "version": "3.4.19", + "resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-3.4.19.tgz", + "integrity": "sha512-3ofp+LL8E+pK/JuPLPggVAIaEuhvIz4qNcf3nA1Xn2o/7fb7s/TYpHhwGDv1ZU3PkBluUVaF8PyCHcm48cKLWQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "@alloc/quick-lru": "^5.2.0", + "arg": "^5.0.2", + "chokidar": "^3.6.0", + "didyoumean": "^1.2.2", + "dlv": "^1.1.3", + "fast-glob": "^3.3.2", + "glob-parent": "^6.0.2", + "is-glob": "^4.0.3", + "jiti": "^1.21.7", + "lilconfig": "^3.1.3", + "micromatch": "^4.0.8", + "normalize-path": "^3.0.0", + "object-hash": "^3.0.0", + "picocolors": "^1.1.1", + "postcss": "^8.4.47", + "postcss-import": "^15.1.0", + "postcss-js": "^4.0.1", + "postcss-load-config": "^4.0.2 || ^5.0 || ^6.0", + "postcss-nested": "^6.2.0", + "postcss-selector-parser": "^6.1.2", + "resolve": "^1.22.8", + "sucrase": "^3.35.0" + }, + 
"bin": { + "tailwind": "lib/cli.js", + "tailwindcss": "lib/cli.js" + }, + "engines": { + "node": ">=14.0.0" + } + }, + "node_modules/thenify": { + "version": "3.3.1", + "resolved": "https://registry.npmjs.org/thenify/-/thenify-3.3.1.tgz", + "integrity": "sha512-RVZSIV5IG10Hk3enotrhvz0T9em6cyHBLkH/YAZuKqd8hRkKhSfCGIcP2KUY0EPxndzANBmNllzWPwak+bheSw==", + "dev": true, + "license": "MIT", + "dependencies": { + "any-promise": "^1.0.0" + } + }, + "node_modules/thenify-all": { + "version": "1.6.0", + "resolved": "https://registry.npmjs.org/thenify-all/-/thenify-all-1.6.0.tgz", + "integrity": "sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==", + "dev": true, + "license": "MIT", + "dependencies": { + "thenify": ">= 3.1.0 < 4" + }, + "engines": { + "node": ">=0.8" + } + }, + "node_modules/tinyglobby": { + "version": "0.2.15", + "resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.15.tgz", + "integrity": "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "fdir": "^6.5.0", + "picomatch": "^4.0.3" + }, + "engines": { + "node": ">=12.0.0" + }, + "funding": { + "url": "https://github.com/sponsors/SuperchupuDev" + } + }, + "node_modules/tinyglobby/node_modules/fdir": { + "version": "6.5.0", + "resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz", + "integrity": "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=12.0.0" + }, + "peerDependencies": { + "picomatch": "^3 || ^4" + }, + "peerDependenciesMeta": { + "picomatch": { + "optional": true + } + } + }, + "node_modules/tinyglobby/node_modules/picomatch": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz", + "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==", + "dev": true, + "license": "MIT", + "peer": true, + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/jonschlinkert" + } + }, + "node_modules/to-regex-range": { + "version": "5.0.1", + "resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz", + "integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "is-number": "^7.0.0" + }, + "engines": { + "node": ">=8.0" + } + }, + "node_modules/totalist": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/totalist/-/totalist-3.0.1.tgz", + "integrity": "sha512-sf4i37nQ2LBx4m3wB74y+ubopq6W/dIzXg0FDGjsYnZHVa1Da8FH853wlL2gtUhg+xJXjfk3kUZS3BRoQeoQBQ==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=6" + } + }, + "node_modules/ts-interface-checker": { + "version": "0.1.13", + "resolved": "https://registry.npmjs.org/ts-interface-checker/-/ts-interface-checker-0.1.13.tgz", + "integrity": "sha512-Y/arvbn+rrz3JCKl9C4kVNfTfSm2/mEp5FSz5EsZSANGPSlQrpRI5M4PKF+mJnE52jOO90PnPSc3Ur3bTQw0gA==", + "dev": true, + "license": "Apache-2.0" + }, + "node_modules/update-browserslist-db": { + "version": "1.2.3", + "resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.2.3.tgz", + "integrity": "sha512-Js0m9cx+qOgDxo0eMiFGEueWztz+d4+M3rGlmKPT+T4IS/jP4ylw3Nwpu6cpTTP8R1MAC1kF4VbdLt3ARf209w==", + "dev": true, + "funding": [ + { + "type": "opencollective", + "url": 
"https://opencollective.com/browserslist" + }, + { + "type": "tidelift", + "url": "https://tidelift.com/funding/github/npm/browserslist" + }, + { + "type": "github", + "url": "https://github.com/sponsors/ai" + } + ], + "license": "MIT", + "dependencies": { + "escalade": "^3.2.0", + "picocolors": "^1.1.1" + }, + "bin": { + "update-browserslist-db": "cli.js" + }, + "peerDependencies": { + "browserslist": ">= 4.21.0" + } + }, + "node_modules/util-deprecate": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz", + "integrity": "sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==", + "dev": true, + "license": "MIT" + }, + "node_modules/vite": { + "version": "7.3.0", + "resolved": "https://registry.npmjs.org/vite/-/vite-7.3.0.tgz", + "integrity": "sha512-dZwN5L1VlUBewiP6H9s2+B3e3Jg96D0vzN+Ry73sOefebhYr9f94wwkMNN/9ouoU8pV1BqA1d1zGk8928cx0rg==", + "dev": true, + "license": "MIT", + "peer": true, + "dependencies": { + "esbuild": "^0.27.0", + "fdir": "^6.5.0", + "picomatch": "^4.0.3", + "postcss": "^8.5.6", + "rollup": "^4.43.0", + "tinyglobby": "^0.2.15" + }, + "bin": { + "vite": "bin/vite.js" + }, + "engines": { + "node": "^20.19.0 || >=22.12.0" + }, + "funding": { + "url": "https://github.com/vitejs/vite?sponsor=1" + }, + "optionalDependencies": { + "fsevents": "~2.3.3" + }, + "peerDependencies": { + "@types/node": "^20.19.0 || >=22.12.0", + "jiti": ">=1.21.0", + "less": "^4.0.0", + "lightningcss": "^1.21.0", + "sass": "^1.70.0", + "sass-embedded": "^1.70.0", + "stylus": ">=0.54.8", + "sugarss": "^5.0.0", + "terser": "^5.16.0", + "tsx": "^4.8.1", + "yaml": "^2.4.2" + }, + "peerDependenciesMeta": { + "@types/node": { + "optional": true + }, + "jiti": { + "optional": true + }, + "less": { + "optional": true + }, + "lightningcss": { + "optional": true + }, + "sass": { + "optional": true + }, + "sass-embedded": { + "optional": true + }, + "stylus": { + "optional": true + }, + "sugarss": { + "optional": true + }, + "terser": { + "optional": true + }, + "tsx": { + "optional": true + }, + "yaml": { + "optional": true + } + } + }, + "node_modules/vite/node_modules/fdir": { + "version": "6.5.0", + "resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz", + "integrity": "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=12.0.0" + }, + "peerDependencies": { + "picomatch": "^3 || ^4" + }, + "peerDependenciesMeta": { + "picomatch": { + "optional": true + } + } + }, + "node_modules/vite/node_modules/picomatch": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz", + "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==", + "dev": true, + "license": "MIT", + "peer": true, + "engines": { + "node": ">=12" + }, + "funding": { + "url": "https://github.com/sponsors/jonschlinkert" + } + }, + "node_modules/vitefu": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/vitefu/-/vitefu-1.1.1.tgz", + "integrity": "sha512-B/Fegf3i8zh0yFbpzZ21amWzHmuNlLlmJT6n7bu5e+pCHUKQIfXSYokrqOBGEMMe9UG2sostKQF9mml/vYaWJQ==", + "dev": true, + "license": "MIT", + "workspaces": [ + "tests/deps/*", + "tests/projects/*", + "tests/projects/workspace/packages/*" + ], + "peerDependencies": { + "vite": "^3.0.0 || ^4.0.0 || ^5.0.0 || ^6.0.0 || ^7.0.0-beta.0" + }, + "peerDependenciesMeta": { + "vite": { + "optional": true 
+ } + } + }, + "node_modules/zimmerframe": { + "version": "1.1.4", + "resolved": "https://registry.npmjs.org/zimmerframe/-/zimmerframe-1.1.4.tgz", + "integrity": "sha512-B58NGBEoc8Y9MWWCQGl/gq9xBCe4IiKM0a2x7GZdQKOW5Exr8S1W24J6OgM1njK8xCRGvAJIL/MxXHf6SkmQKQ==", + "dev": true, + "license": "MIT" + } + } +} diff --git a/frontend/package.json b/frontend/package.json new file mode 100755 index 0000000..7945d57 --- /dev/null +++ b/frontend/package.json @@ -0,0 +1,21 @@ +{ + "name": "frontend", + "private": true, + "version": "0.0.0", + "type": "module", + "scripts": { + "dev": "vite", + "build": "vite build", + "preview": "vite preview" + }, + "devDependencies": { + "@sveltejs/adapter-static": "^3.0.10", + "@sveltejs/kit": "^2.49.2", + "@sveltejs/vite-plugin-svelte": "^6.2.1", + "autoprefixer": "^10.4.0", + "postcss": "^8.4.0", + "svelte": "^5.43.8", + "tailwindcss": "^3.0.0", + "vite": "^7.2.4" + } +} diff --git a/frontend/postcss.config.js b/frontend/postcss.config.js new file mode 100755 index 0000000..68da54e --- /dev/null +++ b/frontend/postcss.config.js @@ -0,0 +1,6 @@ +export default { + plugins: { + tailwindcss: {}, + autoprefixer: {}, + }, +}; \ No newline at end of file diff --git a/frontend/public/vite.svg b/frontend/public/vite.svg new file mode 100755 index 0000000..e7b8dfb --- /dev/null +++ b/frontend/public/vite.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/frontend/src/App.svelte b/frontend/src/App.svelte new file mode 100755 index 0000000..f14c465 --- /dev/null +++ b/frontend/src/App.svelte @@ -0,0 +1,113 @@ + + + + + + + +
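+<!-- Top-level view switcher: the currentPage, selectedTask and selectedPlugin stores decide which screen renders below (settings, the task runner, the selected plugin's form, or the dashboard). -->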
+
+ + +
+ +
+ {#if $currentPage === 'settings'} + + {:else if $selectedTask} + + + {:else if $selectedPlugin} +

{$selectedPlugin.name}

+ + + {:else} + + {/if} +
+
+ + + diff --git a/frontend/src/app.css b/frontend/src/app.css new file mode 100755 index 0000000..b5c61c9 --- /dev/null +++ b/frontend/src/app.css @@ -0,0 +1,3 @@ +@tailwind base; +@tailwind components; +@tailwind utilities; diff --git a/frontend/src/app.html b/frontend/src/app.html new file mode 100644 index 0000000..6769ed5 --- /dev/null +++ b/frontend/src/app.html @@ -0,0 +1,12 @@ + + + + + + + %sveltekit.head% + + +
%sveltekit.body%
+ + diff --git a/frontend/src/assets/svelte.svg b/frontend/src/assets/svelte.svg new file mode 100755 index 0000000..c5e0848 --- /dev/null +++ b/frontend/src/assets/svelte.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/frontend/src/components/DynamicForm.svelte b/frontend/src/components/DynamicForm.svelte new file mode 100755 index 0000000..95fb885 --- /dev/null +++ b/frontend/src/components/DynamicForm.svelte @@ -0,0 +1,88 @@ + + + + + +
+ {#if schema && schema.properties} + {#each Object.entries(schema.properties) as [key, prop]} +
+ + {#if prop.type === 'string'} + + {:else if prop.type === 'number' || prop.type === 'integer'} + + {:else if prop.type === 'boolean'} + + {/if} +
+ {/each} + + {/if} +
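+  <!-- One input per JSON Schema property; only string, number/integer and boolean types are handled, so properties of any other type render no control. -->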
+ + + diff --git a/frontend/src/components/Footer.svelte b/frontend/src/components/Footer.svelte new file mode 100644 index 0000000..e1d9076 --- /dev/null +++ b/frontend/src/components/Footer.svelte @@ -0,0 +1,3 @@ +
+ © 2025 Superset Tools. All rights reserved. +
diff --git a/frontend/src/components/Navbar.svelte b/frontend/src/components/Navbar.svelte new file mode 100644 index 0000000..64fde12 --- /dev/null +++ b/frontend/src/components/Navbar.svelte @@ -0,0 +1,26 @@ + + +
+ + Superset Tools + + +
diff --git a/frontend/src/components/TaskRunner.svelte b/frontend/src/components/TaskRunner.svelte new file mode 100755 index 0000000..bc0b2ab --- /dev/null +++ b/frontend/src/components/TaskRunner.svelte @@ -0,0 +1,182 @@ + + + + + +
+ {#if $selectedTask} +
+

Task: {$selectedTask.plugin_id}

+
+ {#if connectionStatus === 'connecting'} + + + + + Connecting... + {:else if connectionStatus === 'connected'} + + Live + {:else if connectionStatus === 'completed'} + + Completed + {:else} + + Disconnected + {/if} +
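+          <!-- Connection badge: 'connecting', 'connected' and 'completed' are explicit states; any other value (e.g. a closed or errored socket) falls through to "Disconnected". -->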
+
+ +
+ {#each $taskLogs as log} +
+ {new Date(log.timestamp).toLocaleTimeString()} + [{log.level}] + {log.message} +
+ {/each} + + {#if waitingForData} +
+ Waiting for data... +
+ {/if} +
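+        <!-- Shown while waitingForData is set - presumably while the log stream is open but no new entries have arrived yet. -->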
+ {:else} +

No task selected.

+ {/if} +
+ + + diff --git a/frontend/src/components/Toast.svelte b/frontend/src/components/Toast.svelte new file mode 100755 index 0000000..47fb11d --- /dev/null +++ b/frontend/src/components/Toast.svelte @@ -0,0 +1,31 @@ + + + + + +
+ {#each $toasts as toast (toast.id)} +
+ {toast.message} +
+ {/each} +
+ + + diff --git a/frontend/src/lib/Counter.svelte b/frontend/src/lib/Counter.svelte new file mode 100755 index 0000000..770c922 --- /dev/null +++ b/frontend/src/lib/Counter.svelte @@ -0,0 +1,10 @@ + + + diff --git a/frontend/src/lib/api.js b/frontend/src/lib/api.js new file mode 100755 index 0000000..98078ea --- /dev/null +++ b/frontend/src/lib/api.js @@ -0,0 +1,130 @@ +// [DEF:api_module:Module] +// @SEMANTICS: api, client, fetch, rest +// @PURPOSE: Handles all communication with the backend API. +// @LAYER: Infra-API + +import { addToast } from './toasts.js'; +import { PUBLIC_WS_URL } from '$env/static/public'; + +const API_BASE_URL = '/api'; + +/** + * Returns the WebSocket URL for a specific task, with fallback logic. + * @param {string} taskId + * @returns {string} + */ +export const getWsUrl = (taskId) => { + let baseUrl = PUBLIC_WS_URL; + if (!baseUrl) { + const protocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:'; + // Use the current host and port to allow Vite proxy to handle the connection + baseUrl = `${protocol}//${window.location.host}`; + } + return `${baseUrl}/ws/logs/${taskId}`; +}; + +// [DEF:fetchApi:Function] +// @PURPOSE: Generic GET request wrapper. +// @PARAM: endpoint (string) - API endpoint. +// @RETURN: Promise - JSON response. +async function fetchApi(endpoint) { + try { + console.log(`[api.fetchApi][Action] Fetching from context={{'endpoint': '${endpoint}'}}`); + const response = await fetch(`${API_BASE_URL}${endpoint}`); + if (!response.ok) { + throw new Error(`API request failed with status ${response.status}`); + } + return await response.json(); + } catch (error) { + console.error(`[api.fetchApi][Coherence:Failed] Error fetching from ${endpoint}:`, error); + addToast(error.message, 'error'); + throw error; + } +} +// [/DEF:fetchApi] + +// [DEF:postApi:Function] +// @PURPOSE: Generic POST request wrapper. +// @PARAM: endpoint (string) - API endpoint. +// @PARAM: body (object) - Request payload. +// @RETURN: Promise - JSON response. +async function postApi(endpoint, body) { + try { + console.log(`[api.postApi][Action] Posting to context={{'endpoint': '${endpoint}'}}`); + const response = await fetch(`${API_BASE_URL}${endpoint}`, { + method: 'POST', + headers: { + 'Content-Type': 'application/json', + }, + body: JSON.stringify(body), + }); + if (!response.ok) { + throw new Error(`API request failed with status ${response.status}`); + } + return await response.json(); + } catch (error) { + console.error(`[api.postApi][Coherence:Failed] Error posting to ${endpoint}:`, error); + addToast(error.message, 'error'); + throw error; + } +} +// [/DEF:postApi] + +// [DEF:requestApi:Function] +// @PURPOSE: Generic request wrapper. 
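+// @PARAM: endpoint (string) - API endpoint.
+// @PARAM: method (string) - HTTP method ('GET', 'PATCH', 'PUT', 'DELETE', ...); defaults to 'GET'.
+// @PARAM: body (object|null) - Optional JSON payload; skipped when null.
+// @RETURN: Promise - Parsed JSON response.
+// @NOTE: Generalizes fetchApi/postApi to the non-GET/POST verbs used by the settings endpoints below, and surfaces the backend's `detail` message when a request fails.
+// Example (hypothetical call): await requestApi('/settings/environments/3', 'DELETE');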
+async function requestApi(endpoint, method = 'GET', body = null) { + try { + console.log(`[api.requestApi][Action] ${method} to context={{'endpoint': '${endpoint}'}}`); + const options = { + method, + headers: { + 'Content-Type': 'application/json', + }, + }; + if (body) { + options.body = JSON.stringify(body); + } + const response = await fetch(`${API_BASE_URL}${endpoint}`, options); + if (!response.ok) { + const errorData = await response.json().catch(() => ({})); + throw new Error(errorData.detail || `API request failed with status ${response.status}`); + } + return await response.json(); + } catch (error) { + console.error(`[api.requestApi][Coherence:Failed] Error ${method} to ${endpoint}:`, error); + addToast(error.message, 'error'); + throw error; + } +} + +// [DEF:api:Data] +// @PURPOSE: API client object with specific methods. +export const api = { + getPlugins: () => fetchApi('/plugins/'), + getTasks: () => fetchApi('/tasks/'), + getTask: (taskId) => fetchApi(`/tasks/${taskId}`), + createTask: (pluginId, params) => postApi('/tasks/', { plugin_id: pluginId, params }), + + // Settings + getSettings: () => fetchApi('/settings/'), + updateGlobalSettings: (settings) => requestApi('/settings/global', 'PATCH', settings), + getEnvironments: () => fetchApi('/settings/environments'), + addEnvironment: (env) => postApi('/settings/environments', env), + updateEnvironment: (id, env) => requestApi(`/settings/environments/${id}`, 'PUT', env), + deleteEnvironment: (id) => requestApi(`/settings/environments/${id}`, 'DELETE'), + testEnvironmentConnection: (id) => postApi(`/settings/environments/${id}/test`, {}), +}; +// [/DEF:api_module] + +// Export individual functions for easier use in components +export const getPlugins = api.getPlugins; +export const getTasks = api.getTasks; +export const getTask = api.getTask; +export const createTask = api.createTask; +export const getSettings = api.getSettings; +export const updateGlobalSettings = api.updateGlobalSettings; +export const getEnvironments = api.getEnvironments; +export const addEnvironment = api.addEnvironment; +export const updateEnvironment = api.updateEnvironment; +export const deleteEnvironment = api.deleteEnvironment; +export const testEnvironmentConnection = api.testEnvironmentConnection; diff --git a/frontend/src/lib/stores.js b/frontend/src/lib/stores.js new file mode 100755 index 0000000..d2cbdfe --- /dev/null +++ b/frontend/src/lib/stores.js @@ -0,0 +1,60 @@ +// [DEF:stores_module:Module] +// @SEMANTICS: state, stores, svelte, plugins, tasks +// @PURPOSE: Global state management using Svelte stores. +// @LAYER: UI-State + +import { writable } from 'svelte/store'; +import { api } from './api.js'; + +// [DEF:plugins:Data] +// @PURPOSE: Store for the list of available plugins. +export const plugins = writable([]); + +// [DEF:tasks:Data] +// @PURPOSE: Store for the list of tasks. +export const tasks = writable([]); + +// [DEF:selectedPlugin:Data] +// @PURPOSE: Store for the currently selected plugin. +export const selectedPlugin = writable(null); + +// [DEF:selectedTask:Data] +// @PURPOSE: Store for the currently selected task. +export const selectedTask = writable(null); + +// [DEF:currentPage:Data] +// @PURPOSE: Store for the current page. +export const currentPage = writable('dashboard'); + +// [DEF:taskLogs:Data] +// @PURPOSE: Store for the logs of the currently selected task. +export const taskLogs = writable([]); + +// [DEF:fetchPlugins:Function] +// @PURPOSE: Fetches plugins from the API and updates the plugins store. 
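+// @POST: On success the `plugins` store is replaced with the fetched list; on failure the store is left unchanged and the error is only logged, never rethrown.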
+export async function fetchPlugins() { + try { + console.log("[stores.fetchPlugins][Action] Fetching plugins."); + const data = await api.getPlugins(); + console.log("[stores.fetchPlugins][Coherence:OK] Plugins fetched context={{'count': " + data.length + "}}"); + plugins.set(data); + } catch (error) { + console.error(`[stores.fetchPlugins][Coherence:Failed] Error fetching plugins context={{'error': '${error}'}}`); + } +} +// [/DEF:fetchPlugins] + +// [DEF:fetchTasks:Function] +// @PURPOSE: Fetches tasks from the API and updates the tasks store. +export async function fetchTasks() { + try { + console.log("[stores.fetchTasks][Action] Fetching tasks."); + const data = await api.getTasks(); + console.log("[stores.fetchTasks][Coherence:OK] Tasks fetched context={{'count': " + data.length + "}}"); + tasks.set(data); + } catch (error) { + console.error(`[stores.fetchTasks][Coherence:Failed] Error fetching tasks context={{'error': '${error}'}}`); + } +} +// [/DEF:fetchTasks] +// [/DEF:stores_module] \ No newline at end of file diff --git a/frontend/src/lib/toasts.js b/frontend/src/lib/toasts.js new file mode 100755 index 0000000..babc136 --- /dev/null +++ b/frontend/src/lib/toasts.js @@ -0,0 +1,33 @@ +// [DEF:toasts_module:Module] +// @SEMANTICS: notification, toast, feedback, state +// @PURPOSE: Manages toast notifications using a Svelte writable store. +// @LAYER: UI-State + +import { writable } from 'svelte/store'; + +// [DEF:toasts:Data] +// @PURPOSE: Writable store containing the list of active toasts. +export const toasts = writable([]); + +// [DEF:addToast:Function] +// @PURPOSE: Adds a new toast message. +// @PARAM: message (string) - The message text. +// @PARAM: type (string) - The type of toast (info, success, error). +// @PARAM: duration (number) - Duration in ms before the toast is removed. +export function addToast(message, type = 'info', duration = 3000) { + const id = Math.random().toString(36).substr(2, 9); + console.log(`[toasts.addToast][Action] Adding toast context={{'id': '${id}', 'type': '${type}', 'message': '${message}'}}`); + toasts.update(all => [...all, { id, message, type }]); + setTimeout(() => removeToast(id), duration); +} +// [/DEF:addToast] + +// [DEF:removeToast:Function] +// @PURPOSE: Removes a toast message by ID. +// @PARAM: id (string) - The ID of the toast to remove. +function removeToast(id) { + console.log(`[toasts.removeToast][Action] Removing toast context={{'id': '${id}'}}`); + toasts.update(all => all.filter(t => t.id !== id)); +} +// [/DEF:removeToast] +// [/DEF:toasts_module] \ No newline at end of file diff --git a/frontend/src/main.js b/frontend/src/main.js new file mode 100755 index 0000000..f79b68d --- /dev/null +++ b/frontend/src/main.js @@ -0,0 +1,17 @@ +// [DEF:main:Module] +// @SEMANTICS: entrypoint, svelte, init +// @PURPOSE: Entry point for the Svelte application. +// @LAYER: UI-Entry + +import './app.css' +import App from './App.svelte' + +// [DEF:app_instance:Data] +// @PURPOSE: Initialized Svelte app instance. +const app = new App({ + target: document.getElementById('app'), + props: {} +}) + +export default app +// [/DEF:main] diff --git a/frontend/src/pages/Dashboard.svelte b/frontend/src/pages/Dashboard.svelte new file mode 100755 index 0000000..416fc81 --- /dev/null +++ b/frontend/src/pages/Dashboard.svelte @@ -0,0 +1,60 @@ + + + + + +
+

Available Tools

+
+ {#each $plugins as plugin} +
selectPlugin(plugin)} + role="button" + tabindex="0" + on:keydown={(e) => e.key === 'Enter' && selectPlugin(plugin)} + > +

{plugin.name}

+

{plugin.description}

+ v{plugin.version} +
+ {/each} +
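+    <!-- Each card is a div acting as a button: role="button", tabindex="0" and the Enter-key handler keep the cards keyboard-accessible. -->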
+
+ + + diff --git a/frontend/src/pages/Settings.svelte b/frontend/src/pages/Settings.svelte new file mode 100755 index 0000000..a15e419 --- /dev/null +++ b/frontend/src/pages/Settings.svelte @@ -0,0 +1,271 @@ + + + + + +
+

Settings

+ +
+

Global Settings

+
+
+ + +
+ +
+
+ +
+

Superset Environments

+ + {#if settings.environments.length === 0} +
+

Warning

+

No Superset environments configured. You must add at least one environment to perform backups or migrations.

+
+ {/if} + +
+ + + + + + + + + + + + {#each settings.environments as env} + + + + + + + + {/each} + +
Name URL Username Default Actions
{env.name}{env.url}{env.username}{env.is_default ? 'Yes' : 'No'} + + + +
+
+ +
+

{editingEnvId ? 'Edit' : 'Add'} Environment

+
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+
+ + {#if editingEnvId} + + {/if} +
+
+
+
+ + + diff --git a/frontend/src/routes/+error.svelte b/frontend/src/routes/+error.svelte new file mode 100644 index 0000000..ffc95b1 --- /dev/null +++ b/frontend/src/routes/+error.svelte @@ -0,0 +1,11 @@ + + +
+

{$page.status}

+

{$page.error?.message || 'Page not found'}

+ + Back to Dashboard + +
diff --git a/frontend/src/routes/+layout.svelte b/frontend/src/routes/+layout.svelte new file mode 100644 index 0000000..4fd5373 --- /dev/null +++ b/frontend/src/routes/+layout.svelte @@ -0,0 +1,18 @@ + + + + +
+ + +
+ +
+ +
+
diff --git a/frontend/src/routes/+layout.ts b/frontend/src/routes/+layout.ts new file mode 100644 index 0000000..83addb7 --- /dev/null +++ b/frontend/src/routes/+layout.ts @@ -0,0 +1,2 @@ +export const ssr = false; +export const prerender = false; diff --git a/frontend/src/routes/+page.svelte b/frontend/src/routes/+page.svelte new file mode 100644 index 0000000..30d33dd --- /dev/null +++ b/frontend/src/routes/+page.svelte @@ -0,0 +1,71 @@ + + +
+ {#if $selectedTask} + + + {:else if $selectedPlugin} +

{$selectedPlugin.name}

+ + + {:else} +

Available Tools

+ {#if data.error} +
+ {data.error} +
+ {/if} +
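+      <!-- Load errors surface here: load() in +page.ts catches failures and returns { plugins: [], error } instead of throwing. -->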
+ {#each data.plugins as plugin} +
selectPlugin(plugin)} + role="button" + tabindex="0" + on:keydown={(e) => e.key === 'Enter' && selectPlugin(plugin)} + > +

{plugin.name}

+

{plugin.description}

+ v{plugin.version} +
+ {/each} +
+ {/if} +
diff --git a/frontend/src/routes/+page.ts b/frontend/src/routes/+page.ts new file mode 100644 index 0000000..9750392 --- /dev/null +++ b/frontend/src/routes/+page.ts @@ -0,0 +1,17 @@ +import { api } from '../lib/api'; + +/** @type {import('./$types').PageLoad} */ +export async function load() { + try { + const plugins = await api.getPlugins(); + return { + plugins + }; + } catch (error) { + console.error('Failed to load plugins:', error); + return { + plugins: [], + error: 'Failed to load plugins' + }; + } +} diff --git a/frontend/src/routes/settings/+page.svelte b/frontend/src/routes/settings/+page.svelte new file mode 100644 index 0000000..80d5529 --- /dev/null +++ b/frontend/src/routes/settings/+page.svelte @@ -0,0 +1,209 @@ + + +
+

Settings

+ + {#if data.error} +
+ {data.error} +
+ {/if} + +
+

Global Settings

+
+
+ + +
+ +
+
+ +
+

Superset Environments

+ + {#if settings.environments.length === 0} +
+

Warning

+

No Superset environments configured. You must add at least one environment to perform backups or migrations.

+
+ {/if} + +
+ + + + + + + + + + + + {#each settings.environments as env} + + + + + + + + {/each} + +
Name URL Username Default Actions
{env.name}{env.url}{env.username}{env.is_default ? 'Yes' : 'No'} + + + +
+
+ +
+

{editingEnvId ? 'Edit' : 'Add'} Environment

+
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+ + +
+
+
+ + {#if editingEnvId} + + {/if} +
+
+
+
diff --git a/frontend/src/routes/settings/+page.ts b/frontend/src/routes/settings/+page.ts new file mode 100644 index 0000000..91f7849 --- /dev/null +++ b/frontend/src/routes/settings/+page.ts @@ -0,0 +1,23 @@ +import { api } from '../../lib/api'; + +/** @type {import('./$types').PageLoad} */ +export async function load() { + try { + const settings = await api.getSettings(); + return { + settings + }; + } catch (error) { + console.error('Failed to load settings:', error); + return { + settings: { + environments: [], + settings: { + backup_path: '', + default_environment_id: null + } + }, + error: 'Failed to load settings' + }; + } +} diff --git a/frontend/svelte.config.js b/frontend/svelte.config.js new file mode 100755 index 0000000..46769d8 --- /dev/null +++ b/frontend/svelte.config.js @@ -0,0 +1,19 @@ +import adapter from '@sveltejs/adapter-static'; +import { vitePreprocess } from '@sveltejs/vite-plugin-svelte'; + +/** @type {import('@sveltejs/kit').Config} */ +const config = { + preprocess: vitePreprocess(), + + kit: { + adapter: adapter({ + pages: 'build', + assets: 'build', + fallback: 'index.html', + precompress: false, + strict: true + }) + } +}; + +export default config; diff --git a/frontend/tailwind.config.js b/frontend/tailwind.config.js new file mode 100755 index 0000000..b42afe1 --- /dev/null +++ b/frontend/tailwind.config.js @@ -0,0 +1,11 @@ +/** @type {import('tailwindcss').Config} */ +export default { + content: [ + "./index.html", + "./src/**/*.{svelte,js,ts,jsx,tsx}", + ], + theme: { + extend: {}, + }, + plugins: [], +} \ No newline at end of file diff --git a/frontend/vite.config.js b/frontend/vite.config.js new file mode 100755 index 0000000..46ce1c5 --- /dev/null +++ b/frontend/vite.config.js @@ -0,0 +1,19 @@ +import { sveltekit } from '@sveltejs/kit/vite'; +import { defineConfig } from 'vite'; + +export default defineConfig({ + plugins: [sveltekit()], + server: { + proxy: { + '/api': { + target: 'http://localhost:8000', + changeOrigin: true + }, + '/ws': { + target: 'ws://localhost:8000', + ws: true, + changeOrigin: true + } + } + } +}); diff --git a/get_dataset_structure.py b/get_dataset_structure.py new file mode 100755 index 0000000..1ba3325 --- /dev/null +++ b/get_dataset_structure.py @@ -0,0 +1,64 @@ +# [DEF:get_dataset_structure:Module] +# +# @SEMANTICS: superset, dataset, structure, debug, json +# @PURPOSE: Этот модуль предназначен для получения и сохранения структуры данных датасета из Superset. Он используется для отладки и анализа данных, возвращаемых API. +# @LAYER: App +# @RELATION: DEPENDS_ON -> superset_tool.client +# @RELATION: DEPENDS_ON -> superset_tool.utils.init_clients +# @RELATION: DEPENDS_ON -> superset_tool.utils.logger +# @PUBLIC_API: get_and_save_dataset + +# [SECTION: IMPORTS] +import argparse +import json +from superset_tool.utils.init_clients import setup_clients +from superset_tool.utils.logger import SupersetLogger +# [/SECTION] + +# [DEF:get_and_save_dataset:Function] +# @PURPOSE: Получает структуру датасета из Superset и сохраняет ее в JSON-файл. +# @RELATION: CALLS -> setup_clients +# @RELATION: CALLS -> superset_client.get_dataset +# @PARAM: env (str) - Среда (dev, prod, и т.д.) для подключения. +# @PARAM: dataset_id (int) - ID датасета для получения. +# @PARAM: output_path (str) - Путь для сохранения JSON-файла. +def get_and_save_dataset(env: str, dataset_id: int, output_path: str): + """ + Получает структуру датасета и сохраняет в файл. 
+ """ + logger = SupersetLogger(name="DatasetStructureRetriever") + logger.info("[get_and_save_dataset][Enter] Starting to fetch dataset structure for ID %d from env '%s'.", dataset_id, env) + + try: + clients = setup_clients(logger=logger) + superset_client = clients.get(env) + if not superset_client: + logger.error("[get_and_save_dataset][Failure] Environment '%s' not found.", env) + return + + dataset_response = superset_client.get_dataset(dataset_id) + dataset_data = dataset_response.get('result') + + if not dataset_data: + logger.error("[get_and_save_dataset][Failure] No result in dataset response.") + return + + with open(output_path, 'w', encoding='utf-8') as f: + json.dump(dataset_data, f, ensure_ascii=False, indent=4) + + logger.info("[get_and_save_dataset][Success] Dataset structure saved to %s.", output_path) + + except Exception as e: + logger.error("[get_and_save_dataset][Failure] An error occurred: %s", e, exc_info=True) +# [/DEF:get_and_save_dataset] + +if __name__ == "__main__": + parser = argparse.ArgumentParser(description="Получение структуры датасета из Superset.") + parser.add_argument("--dataset-id", required=True, type=int, help="ID датасета.") + parser.add_argument("--env", required=True, help="Среда для подключения (например, dev).") + parser.add_argument("--output-path", default="dataset_structure.json", help="Путь для сохранения JSON-файла.") + args = parser.parse_args() + + get_and_save_dataset(args.env, args.dataset_id, args.output_path) + +# [/DEF:get_dataset_structure] diff --git a/migration_script.py b/migration_script.py old mode 100644 new mode 100755 index 5d031d6..72ed3b3 --- a/migration_script.py +++ b/migration_script.py @@ -1,303 +1,401 @@ -# -*- coding: utf-8 -*- -# CONTRACT: -# PURPOSE: Интерактивный скрипт для миграции ассетов Superset между различными окружениями. -# SPECIFICATION_LINK: mod_migration_script -# PRECONDITIONS: Наличие корректных конфигурационных файлов для подключения к Superset. -# POSTCONDITIONS: Выбранные ассеты успешно перенесены из исходного в целевое окружение. -# IMPORTS: [argparse, superset_tool.client, superset_tool.utils.init_clients, superset_tool.utils.logger, superset_tool.utils.fileio] -""" -[MODULE] Superset Migration Tool -@description: Интерактивный скрипт для миграции ассетов Superset между различными окружениями. -""" - -# [IMPORTS] -from superset_tool.client import SupersetClient -from superset_tool.utils.init_clients import init_superset_clients -from superset_tool.utils.logger import SupersetLogger -from superset_tool.utils.fileio import ( - save_and_unpack_dashboard, - read_dashboard_from_disk, - update_yamls, - create_dashboard_export -) - -# [ENTITY: Class('Migration')] -# CONTRACT: -# PURPOSE: Инкапсулирует логику и состояние процесса миграции. -# SPECIFICATION_LINK: class_migration -# ATTRIBUTES: -# - name: logger, type: SupersetLogger, description: Экземпляр логгера. -# - name: from_c, type: SupersetClient, description: Клиент для исходного окружения. -# - name: to_c, type: SupersetClient, description: Клиент для целевого окружения. -# - name: dashboards_to_migrate, type: list, description: Список дашбордов для миграции. -# - name: db_config_replacement, type: dict, description: Конфигурация для замены данных БД. -class Migration: - """ - Класс для управления процессом миграции дашбордов Superset. 
- """ - def __init__(self): - self.logger = SupersetLogger(name="migration_script") - self.from_c: SupersetClient = None - self.to_c: SupersetClient = None - self.dashboards_to_migrate = [] - self.db_config_replacement = None - # END_FUNCTION___init__ - - # [ENTITY: Function('run')] - # CONTRACT: - # PURPOSE: Запускает основной воркфлоу миграции, координируя все шаги. - # SPECIFICATION_LINK: func_run_migration - # PRECONDITIONS: None - # POSTCONDITIONS: Процесс миграции завершен. - def run(self): - """Запускает основной воркфлоу миграции.""" - self.logger.info("[INFO][run][ENTER] Запуск скрипта миграции.") - self.select_environments() - self.select_dashboards() - self.confirm_db_config_replacement() - self.execute_migration() - self.logger.info("[INFO][run][EXIT] Скрипт миграции завершен.") - # END_FUNCTION_run - - # [ENTITY: Function('select_environments')] - # CONTRACT: - # PURPOSE: Шаг 1. Обеспечивает интерактивный выбор исходного и целевого окружений. - # SPECIFICATION_LINK: func_select_environments - # PRECONDITIONS: None - # POSTCONDITIONS: Атрибуты `self.from_c` и `self.to_c` инициализированы валидными клиентами Superset. - def select_environments(self): - """Шаг 1: Выбор окружений (источник и назначение).""" - self.logger.info("[INFO][select_environments][ENTER] Шаг 1/4: Выбор окружений.") - - available_envs = {"1": "DEV", "2": "PROD"} - - print("Доступные окружения:") - for key, value in available_envs.items(): - print(f" {key}. {value}") - - while self.from_c is None: - try: - from_env_choice = input("Выберите исходное окружение (номер): ") - from_env_name = available_envs.get(from_env_choice) - if not from_env_name: - print("Неверный выбор. Попробуйте снова.") - continue - - clients = init_superset_clients(self.logger, env=from_env_name.lower()) - self.from_c = clients[0] - self.logger.info(f"[INFO][select_environments][STATE] Исходное окружение: {from_env_name}") - - except Exception as e: - self.logger.error(f"[ERROR][select_environments][FAILURE] Ошибка при инициализации клиента-источника: {e}", exc_info=True) - print("Не удалось инициализировать клиент. Проверьте конфигурацию.") - - while self.to_c is None: - try: - to_env_choice = input("Выберите целевое окружение (номер): ") - to_env_name = available_envs.get(to_env_choice) - - if not to_env_name: - print("Неверный выбор. Попробуйте снова.") - continue - - if to_env_name == self.from_c.env: - print("Целевое и исходное окружения не могут совпадать.") - continue - - clients = init_superset_clients(self.logger, env=to_env_name.lower()) - self.to_c = clients[0] - self.logger.info(f"[INFO][select_environments][STATE] Целевое окружение: {to_env_name}") - - except Exception as e: - self.logger.error(f"[ERROR][select_environments][FAILURE] Ошибка при инициализации целевого клиента: {e}", exc_info=True) - print("Не удалось инициализировать клиент. Проверьте конфигурацию.") - self.logger.info("[INFO][select_environments][EXIT] Шаг 1 завершен.") - # END_FUNCTION_select_environments - - # [ENTITY: Function('select_dashboards')] - # CONTRACT: - # PURPOSE: Шаг 2. Обеспечивает интерактивный выбор дашбордов для миграции. - # SPECIFICATION_LINK: func_select_dashboards - # PRECONDITIONS: `self.from_c` должен быть инициализирован. - # POSTCONDITIONS: `self.dashboards_to_migrate` содержит список выбранных дашбордов. 
- def select_dashboards(self): - """Шаг 2: Выбор дашбордов для миграции.""" - self.logger.info("[INFO][select_dashboards][ENTER] Шаг 2/4: Выбор дашбордов.") - - try: - all_dashboards = self.from_c.get_dashboards() - if not all_dashboards: - self.logger.warning("[WARN][select_dashboards][STATE] В исходном окружении не найдено дашбордов.") - print("В исходном окружении не найдено дашбордов.") - return - - while True: - print("\nДоступные дашборды:") - for i, dashboard in enumerate(all_dashboards): - print(f" {i + 1}. {dashboard['dashboard_title']}") - - print("\nОпции:") - print(" - Введите номера дашбордов через запятую (например, 1, 3, 5).") - print(" - Введите 'все' для выбора всех дашбордов.") - print(" - Введите 'поиск <запрос>' для поиска дашбордов.") - print(" - Введите 'выход' для завершения.") - - choice = input("Ваш выбор: ").lower().strip() - - if choice == 'выход': - break - elif choice == 'все': - self.dashboards_to_migrate = all_dashboards - self.logger.info(f"[INFO][select_dashboards][STATE] Выбраны все дашборды: {len(self.dashboards_to_migrate)}") - break - elif choice.startswith('поиск '): - search_query = choice[6:].strip() - filtered_dashboards = [d for d in all_dashboards if search_query in d['dashboard_title'].lower()] - if not filtered_dashboards: - print("По вашему запросу ничего не найдено.") - else: - all_dashboards = filtered_dashboards - continue - else: - try: - selected_indices = [int(i.strip()) - 1 for i in choice.split(',')] - self.dashboards_to_migrate = [all_dashboards[i] for i in selected_indices if 0 <= i < len(all_dashboards)] - self.logger.info(f"[INFO][select_dashboards][STATE] Выбрано дашбордов: {len(self.dashboards_to_migrate)}") - break - except (ValueError, IndexError): - print("Неверный ввод. Пожалуйста, введите корректные номера.") - - except Exception as e: - self.logger.error(f"[ERROR][select_dashboards][FAILURE] Ошибка при получении или выборе дашбордов: {e}", exc_info=True) - print("Произошла ошибка при работе с дашбордами.") - - self.logger.info("[INFO][select_dashboards][EXIT] Шаг 2 завершен.") - # END_FUNCTION_select_dashboards - - # [ENTITY: Function('confirm_db_config_replacement')] - # CONTRACT: - # PURPOSE: Шаг 3. Управляет процессом подтверждения и настройки замены конфигураций БД. - # SPECIFICATION_LINK: func_confirm_db_config_replacement - # PRECONDITIONS: `self.from_c` и `self.to_c` инициализированы. - # POSTCONDITIONS: `self.db_config_replacement` содержит конфигурацию для замены или `None`. - def confirm_db_config_replacement(self): - """Шаг 3: Подтверждение и настройка замены конфигурации БД.""" - self.logger.info("[INFO][confirm_db_config_replacement][ENTER] Шаг 3/4: Замена конфигурации БД.") - - while True: - choice = input("Хотите ли вы заменить конфигурации баз данных в YAML-файлах? (да/нет): ").lower().strip() - if choice in ["да", "нет"]: - break - print("Неверный ввод. 
Пожалуйста, введите 'да' или 'нет'.") - - if choice == 'нет': - self.logger.info("[INFO][confirm_db_config_replacement][STATE] Замена конфигурации БД пропущена.") - return - - # Эвристический расчет - from_env = self.from_c.env.upper() - to_env = self.to_c.env.upper() - heuristic_applied = False - - if from_env == "DEV" and to_env == "PROD": - self.db_config_replacement = {"old": {"database_name": "db_dev"}, "new": {"database_name": "db_prod"}} # Пример - self.logger.info("[INFO][confirm_db_config_replacement][STATE] Применена эвристика DEV -> PROD.") - heuristic_applied = True - elif from_env == "PROD" and to_env == "DEV": - self.db_config_replacement = {"old": {"database_name": "db_prod"}, "new": {"database_name": "db_dev"}} # Пример - self.logger.info("[INFO][confirm_db_config_replacement][STATE] Применена эвристика PROD -> DEV.") - heuristic_applied = True - - if heuristic_applied: - print(f"На основе эвристики будет произведена следующая замена: {self.db_config_replacement}") - confirm = input("Подтверждаете? (да/нет): ").lower().strip() - if confirm != 'да': - self.db_config_replacement = None - heuristic_applied = False - - if not heuristic_applied: - print("Пожалуйста, введите детали для замены.") - old_key = input("Ключ для замены (например, database_name): ") - old_value = input(f"Старое значение для {old_key}: ") - new_value = input(f"Новое значение для {old_key}: ") - self.db_config_replacement = {"old": {old_key: old_value}, "new": {old_key: new_value}} - self.logger.info(f"[INFO][confirm_db_config_replacement][STATE] Установлена ручная замена: {self.db_config_replacement}") - - self.logger.info("[INFO][confirm_db_config_replacement][EXIT] Шаг 3 завершен.") - # END_FUNCTION_confirm_db_config_replacement - - # [ENTITY: Function('execute_migration')] - # CONTRACT: - # PURPOSE: Шаг 4. Выполняет фактическую миграцию выбранных дашбордов. - # SPECIFICATION_LINK: func_execute_migration - # PRECONDITIONS: Все предыдущие шаги (`select_environments`, `select_dashboards`) успешно выполнены. - # POSTCONDITIONS: Выбранные дашборды перенесены в целевое окружение. - def execute_migration(self): - """Шаг 4: Выполнение миграции и обновления конфигураций.""" - self.logger.info("[INFO][execute_migration][ENTER] Шаг 4/4: Выполнение миграции.") - - if not self.dashboards_to_migrate: - self.logger.warning("[WARN][execute_migration][STATE] Нет дашбордов для миграции.") - print("Нет дашбордов для миграции. Завершение.") - return - - db_configs_for_update = [] - if self.db_config_replacement: - try: - from_dbs = self.from_c.get_databases() - to_dbs = self.to_c.get_databases() - - # Просто пример, как можно было бы сопоставить базы данных. - # В реальном сценарии логика может быть сложнее. - for from_db in from_dbs: - for to_db in to_dbs: - # Предполагаем, что мы можем сопоставить базы по имени, заменив суффикс - if from_db['database_name'].replace(self.from_c.env.upper(), self.to_c.env.upper()) == to_db['database_name']: - db_configs_for_update.append({ - "old": {"database_name": from_db['database_name']}, - "new": {"database_name": to_db['database_name']} - }) - self.logger.info(f"[INFO][execute_migration][STATE] Сформированы конфигурации для замены БД: {db_configs_for_update}") - except Exception as e: - self.logger.error(f"[ERROR][execute_migration][FAILURE] Не удалось получить конфигурации БД: {e}", exc_info=True) - print("Не удалось получить конфигурации БД. 
Миграция будет продолжена без замены.") - - for dashboard in self.dashboards_to_migrate: - try: - dashboard_id = dashboard['id'] - self.logger.info(f"[INFO][execute_migration][PROGRESS] Миграция дашборда: {dashboard['dashboard_title']} (ID: {dashboard_id})") - - # 1. Экспорт - exported_content = self.from_c.export_dashboards(dashboard_id) - zip_path, unpacked_path = save_and_unpack_dashboard(exported_content, f"temp_export_{dashboard_id}", unpack=True) - self.logger.info(f"[INFO][execute_migration][STATE] Дашборд экспортирован и распакован в {unpacked_path}") - - # 2. Обновление YAML, если нужно - if db_configs_for_update: - update_yamls(db_configs=db_configs_for_update, path=str(unpacked_path)) - self.logger.info(f"[INFO][execute_migration][STATE] YAML-файлы обновлены.") - - # 3. Упаковка и импорт - new_zip_path = f"migrated_dashboard_{dashboard_id}.zip" - create_dashboard_export(new_zip_path, [unpacked_path]) - - content_to_import, _ = read_dashboard_from_disk(new_zip_path) - self.to_c.import_dashboards(content_to_import) - self.logger.info(f"[INFO][execute_migration][SUCCESS] Дашборд {dashboard['dashboard_title']} успешно импортирован.") - - except Exception as e: - self.logger.error(f"[ERROR][execute_migration][FAILURE] Ошибка при миграции дашборда {dashboard['dashboard_title']}: {e}", exc_info=True) - print(f"Не удалось смигрировать дашборд: {dashboard['dashboard_title']}") - - self.logger.info("[INFO][execute_migration][EXIT] Шаг 4 завершен.") - # END_FUNCTION_execute_migration - -# END_CLASS_Migration - -# [MAIN_EXECUTION_BLOCK] -if __name__ == "__main__": - migration = Migration() - migration.run() -# END_MAIN_EXECUTION_BLOCK - -# END_MODULE_migration_script \ No newline at end of file +# [DEF:migration_script:Module] +# +# @SEMANTICS: migration, cli, superset, ui, logging, error-recovery, batch-delete +# @PURPOSE: Предоставляет интерактивный CLI для миграции дашбордов Superset между окружениями с возможностью восстановления после ошибок. +# @LAYER: App +# @RELATION: DEPENDS_ON -> superset_tool.client +# @RELATION: DEPENDS_ON -> superset_tool.utils +# @PUBLIC_API: Migration + +# [SECTION: IMPORTS] +import json +import logging +import sys +import zipfile +import re +from pathlib import Path +from typing import List, Optional, Tuple, Dict +from superset_tool.client import SupersetClient +from superset_tool.utils.init_clients import setup_clients +from superset_tool.utils.fileio import create_temp_file, update_yamls, create_dashboard_export +from superset_tool.utils.whiptail_fallback import menu, checklist, yesno, msgbox, inputbox, gauge +from superset_tool.utils.logger import SupersetLogger +# [/SECTION] + +# [DEF:Migration:Class] +# @PURPOSE: Инкапсулирует логику интерактивной миграции дашбордов с возможностью «удалить‑и‑перезаписать» при ошибке импорта. +# @RELATION: CREATES_INSTANCE_OF -> SupersetLogger +# @RELATION: USES -> SupersetClient +class Migration: + """ + Интерактивный процесс миграции дашбордов. + """ + # [DEF:Migration.__init__:Function] + # @PURPOSE: Инициализирует сервис миграции, настраивает логгер и начальные состояния. + # @POST: `self.logger` готов к использованию; `enable_delete_on_failure` = `False`. 
+ def __init__(self) -> None: + default_log_dir = Path.cwd() / "logs" + self.logger = SupersetLogger( + name="migration_script", + log_dir=default_log_dir, + level=logging.INFO, + console=True, + ) + self.enable_delete_on_failure = False + self.from_c: Optional[SupersetClient] = None + self.to_c: Optional[SupersetClient] = None + self.dashboards_to_migrate: List[dict] = [] + self.db_config_replacement: Optional[dict] = None + self._failed_imports: List[dict] = [] + # [/DEF:Migration.__init__] + + # [DEF:Migration.run:Function] + # @PURPOSE: Точка входа – последовательный запуск всех шагов миграции. + # @PRE: Логгер готов. + # @POST: Скрипт завершён, пользователю выведено сообщение. + # @RELATION: CALLS -> self.ask_delete_on_failure + # @RELATION: CALLS -> self.select_environments + # @RELATION: CALLS -> self.select_dashboards + # @RELATION: CALLS -> self.confirm_db_config_replacement + # @RELATION: CALLS -> self.execute_migration + def run(self) -> None: + self.logger.info("[run][Entry] Запуск скрипта миграции.") + self.ask_delete_on_failure() + self.select_environments() + self.select_dashboards() + self.confirm_db_config_replacement() + self.execute_migration() + self.logger.info("[run][Exit] Скрипт миграции завершён.") + # [/DEF:Migration.run] + + # [DEF:Migration.ask_delete_on_failure:Function] + # @PURPOSE: Запрашивает у пользователя, следует ли удалять дашборд при ошибке импорта. + # @POST: `self.enable_delete_on_failure` установлен. + # @RELATION: CALLS -> yesno + def ask_delete_on_failure(self) -> None: + self.enable_delete_on_failure = yesno( + "Поведение при ошибке импорта", + "Если импорт завершится ошибкой, удалить существующий дашборд и попытаться импортировать заново?", + ) + self.logger.info( + "[ask_delete_on_failure][State] Delete-on-failure = %s", + self.enable_delete_on_failure, + ) + # [/DEF:Migration.ask_delete_on_failure] + + # [DEF:Migration.select_environments:Function] + # @PURPOSE: Позволяет пользователю выбрать исходное и целевое окружения Superset. + # @PRE: `setup_clients` успешно инициализирует все клиенты. + # @POST: `self.from_c` и `self.to_c` установлены. 
+ # @RELATION: CALLS -> setup_clients + # @RELATION: CALLS -> menu + def select_environments(self) -> None: + self.logger.info("[select_environments][Entry] Шаг 1/5: Выбор окружений.") + try: + all_clients = setup_clients(self.logger) + available_envs = list(all_clients.keys()) + except Exception as e: + self.logger.error("[select_environments][Failure] %s", e, exc_info=True) + msgbox("Ошибка", "Не удалось инициализировать клиенты.") + return + + rc, from_env_name = menu( + title="Выбор окружения", + prompt="Исходное окружение:", + choices=available_envs, + ) + if rc != 0 or from_env_name is None: + self.logger.info("[select_environments][State] Source environment selection cancelled.") + return + self.from_c = all_clients[from_env_name] + self.logger.info("[select_environments][State] from = %s", from_env_name) + + available_envs.remove(from_env_name) + rc, to_env_name = menu( + title="Выбор окружения", + prompt="Целевое окружение:", + choices=available_envs, + ) + if rc != 0 or to_env_name is None: + self.logger.info("[select_environments][State] Target environment selection cancelled.") + return + self.to_c = all_clients[to_env_name] + self.logger.info("[select_environments][State] to = %s", to_env_name) + self.logger.info("[select_environments][Exit] Шаг 1 завершён.") + # [/DEF:Migration.select_environments] + + # [DEF:Migration.select_dashboards:Function] + # @PURPOSE: Позволяет пользователю выбрать набор дашбордов для миграции. + # @PRE: `self.from_c` инициализирован. + # @POST: `self.dashboards_to_migrate` заполнен. + # @RELATION: CALLS -> self.from_c.get_dashboards + # @RELATION: CALLS -> checklist + def select_dashboards(self) -> None: + self.logger.info("[select_dashboards][Entry] Шаг 2/5: Выбор дашбордов.") + if self.from_c is None: + self.logger.error("[select_dashboards][Failure] Source client not initialized.") + msgbox("Ошибка", "Исходное окружение не выбрано.") + return + try: + _, all_dashboards = self.from_c.get_dashboards() + if not all_dashboards: + self.logger.warning("[select_dashboards][State] No dashboards.") + msgbox("Информация", "В исходном окружении нет дашбордов.") + return + + rc, regex = inputbox("Поиск", "Введите регулярное выражение для поиска дашбордов:") + if rc != 0: + return + # Ensure regex is a string and perform case‑insensitive search + regex_str = str(regex) + filtered_dashboards = [ + d for d in all_dashboards if re.search(regex_str, d["dashboard_title"], re.IGNORECASE) + ] + + options = [("ALL", "Все дашборды")] + [ + (str(d["id"]), d["dashboard_title"]) for d in filtered_dashboards + ] + + rc, selected = checklist( + title="Выбор дашбордов", + prompt="Отметьте нужные дашборды (введите номера):", + options=options, + ) + if rc != 0: + return + + if "ALL" in selected: + self.dashboards_to_migrate = filtered_dashboards + else: + self.dashboards_to_migrate = [ + d for d in filtered_dashboards if str(d["id"]) in selected + ] + + self.logger.info( + "[select_dashboards][State] Выбрано %d дашбордов.", + len(self.dashboards_to_migrate), + ) + except Exception as e: + self.logger.error("[select_dashboards][Failure] %s", e, exc_info=True) + msgbox("Ошибка", "Не удалось получить список дашбордов.") + self.logger.info("[select_dashboards][Exit] Шаг 2 завершён.") + # [/DEF:Migration.select_dashboards] + + # [DEF:Migration.confirm_db_config_replacement:Function] + # @PURPOSE: Запрашивает у пользователя, требуется ли заменить имена БД в YAML-файлах. + # @POST: `self.db_config_replacement` либо `None`, либо заполнен. 
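+    # Example of the resulting structure (illustrative values only, uuids elided):
+    #   {"old": {"database_name": "db_dev", "uuid": "...", "database_uuid": "...", "id": "1"},
+    #    "new": {"database_name": "db_prod", "uuid": "...", "database_uuid": "...", "id": "2"}}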
+ # @RELATION: CALLS -> yesno + # @RELATION: CALLS -> self._select_databases + def confirm_db_config_replacement(self) -> None: + if yesno("Замена БД", "Заменить конфигурацию БД в YAML‑файлах?"): + old_db, new_db = self._select_databases() + if not old_db or not new_db: + self.logger.info("[confirm_db_config_replacement][State] Selection cancelled.") + return + print(f"old_db: {old_db}") + old_result = old_db.get("result", {}) + new_result = new_db.get("result", {}) + + self.db_config_replacement = { + "old": { + "database_name": old_result.get("database_name"), + "uuid": old_result.get("uuid"), + "database_uuid": old_result.get("uuid"), + "id": str(old_db.get("id")) + }, + "new": { + "database_name": new_result.get("database_name"), + "uuid": new_result.get("uuid"), + "database_uuid": new_result.get("uuid"), + "id": str(new_db.get("id")) + } + } + + self.logger.info("[confirm_db_config_replacement][State] Replacement set: %s", self.db_config_replacement) + else: + self.logger.info("[confirm_db_config_replacement][State] Skipped.") + # [/DEF:Migration.confirm_db_config_replacement] + + # [DEF:Migration._select_databases:Function] + # @PURPOSE: Позволяет пользователю выбрать исходную и целевую БД через API. + # @POST: Возвращает кортеж (старая БД, новая БД) или (None, None) при отмене. + # @RELATION: CALLS -> self.from_c.get_databases + # @RELATION: CALLS -> self.to_c.get_databases + # @RELATION: CALLS -> self.from_c.get_database + # @RELATION: CALLS -> self.to_c.get_database + # @RELATION: CALLS -> menu + def _select_databases(self) -> Tuple[Optional[Dict], Optional[Dict]]: + self.logger.info("[_select_databases][Entry] Selecting databases from both environments.") + + if self.from_c is None or self.to_c is None: + self.logger.error("[_select_databases][Failure] Source or target client not initialized.") + msgbox("Ошибка", "Исходное или целевое окружение не выбрано.") + return None, None + + # Получаем список БД из обоих окружений + try: + _, from_dbs = self.from_c.get_databases() + _, to_dbs = self.to_c.get_databases() + except Exception as e: + self.logger.error("[_select_databases][Failure] Failed to fetch databases: %s", e) + msgbox("Ошибка", "Не удалось получить список баз данных.") + return None, None + + # Формируем список для выбора + # По Swagger документации, в ответе API поле называется "database_name" + from_choices = [] + for db in from_dbs: + db_name = db.get("database_name", "Без имени") + from_choices.append((str(db["id"]), f"{db_name} (ID: {db['id']})")) + + to_choices = [] + for db in to_dbs: + db_name = db.get("database_name", "Без имени") + to_choices.append((str(db["id"]), f"{db_name} (ID: {db['id']})")) + + # Показываем список БД для исходного окружения + rc, from_sel = menu( + title="Выбор исходной БД", + prompt="Выберите исходную БД:", + choices=[f"{name}" for id, name in from_choices] + ) + if rc != 0: + return None, None + + # Определяем выбранную БД + from_db_id = from_choices[[choice[1] for choice in from_choices].index(from_sel)][0] + # Получаем полную информацию о выбранной БД из исходного окружения + try: + from_db = self.from_c.get_database(int(from_db_id)) + except Exception as e: + self.logger.error("[_select_databases][Failure] Failed to fetch database details: %s", e) + msgbox("Ошибка", "Не удалось получить информацию о выбранной базе данных.") + return None, None + + # Показываем список БД для целевого окружения + rc, to_sel = menu( + title="Выбор целевой БД", + prompt="Выберите целевую БД:", + choices=[f"{name}" for id, name in to_choices] + ) + if rc 
!= 0: + return None, None + + # Определяем выбранную БД + to_db_id = to_choices[[choice[1] for choice in to_choices].index(to_sel)][0] + # Получаем полную информацию о выбранной БД из целевого окружения + try: + to_db = self.to_c.get_database(int(to_db_id)) + except Exception as e: + self.logger.error("[_select_databases][Failure] Failed to fetch database details: %s", e) + msgbox("Ошибка", "Не удалось получить информацию о выбранной базе данных.") + return None, None + + self.logger.info("[_select_databases][Exit] Selected databases: %s -> %s", from_db.get("database_name", "Без имени"), to_db.get("database_name", "Без имени")) + return from_db, to_db + # [/DEF:Migration._select_databases] + + # [DEF:Migration._batch_delete_by_ids:Function] + # @PURPOSE: Удаляет набор дашбордов по их ID единым запросом. + # @PRE: `ids` – непустой список целых чисел. + # @POST: Все указанные дашборды удалены (если они существовали). + # @RELATION: CALLS -> self.to_c.network.request + # @PARAM: ids (List[int]) - Список ID дашбордов для удаления. + def _batch_delete_by_ids(self, ids: List[int]) -> None: + if not ids: + self.logger.debug("[_batch_delete_by_ids][Skip] Empty ID list – nothing to delete.") + return + + if self.to_c is None: + self.logger.error("[_batch_delete_by_ids][Failure] Target client not initialized.") + msgbox("Ошибка", "Целевое окружение не выбрано.") + return + + self.logger.info("[_batch_delete_by_ids][Entry] Deleting dashboards IDs: %s", ids) + q_param = json.dumps(ids) + response = self.to_c.network.request(method="DELETE", endpoint="/dashboard/", params={"q": q_param}) + + if isinstance(response, dict) and response.get("result", True) is False: + self.logger.warning("[_batch_delete_by_ids][Warning] Unexpected delete response: %s", response) + else: + self.logger.info("[_batch_delete_by_ids][Success] Delete request completed.") + # [/DEF:Migration._batch_delete_by_ids] + + # [DEF:Migration.execute_migration:Function] + # @PURPOSE: Выполняет экспорт-импорт дашбордов, обрабатывает ошибки и, при необходимости, выполняет процедуру восстановления. + # @PRE: `self.dashboards_to_migrate` не пуст; `self.from_c` и `self.to_c` инициализированы. + # @POST: Успешные дашборды импортированы; неудачные - восстановлены или залогированы. 
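+    # Flow per dashboard (see the body below): export from the source, optionally patch
+    # database references in the unpacked YAML files, re-pack, import into the target.
+    # Failed imports are collected and, when delete-on-failure is enabled, retried
+    # after a batch delete of the conflicting dashboards.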
+ # @RELATION: CALLS -> self.from_c.export_dashboard + # @RELATION: CALLS -> create_temp_file + # @RELATION: CALLS -> update_yamls + # @RELATION: CALLS -> create_dashboard_export + # @RELATION: CALLS -> self.to_c.import_dashboard + # @RELATION: CALLS -> self._batch_delete_by_ids + def execute_migration(self) -> None: + if not self.dashboards_to_migrate: + self.logger.warning("[execute_migration][Skip] No dashboards to migrate.") + msgbox("Информация", "Нет дашбордов для миграции.") + return + + if self.from_c is None or self.to_c is None: + self.logger.error("[execute_migration][Failure] Source or target client not initialized.") + msgbox("Ошибка", "Исходное или целевое окружение не выбрано.") + return + + total = len(self.dashboards_to_migrate) + self.logger.info("[execute_migration][Entry] Starting migration of %d dashboards.", total) + self.to_c.delete_before_reimport = self.enable_delete_on_failure + + with gauge("Миграция...", width=60, height=10) as g: + for i, dash in enumerate(self.dashboards_to_migrate): + dash_id, dash_slug, title = dash["id"], dash.get("slug"), dash["dashboard_title"] + g.set_text(f"Миграция: {title} ({i + 1}/{total})") + g.set_percent(int((i / total) * 100)) + exported_content = None # Initialize exported_content + try: + exported_content, _ = self.from_c.export_dashboard(dash_id) + with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=self.logger) as tmp_zip_path, \ + create_temp_file(suffix=".dir", logger=self.logger) as tmp_unpack_dir: + + if not self.db_config_replacement: + self.to_c.import_dashboard(file_name=tmp_zip_path, dash_id=dash_id, dash_slug=dash_slug) + else: + with zipfile.ZipFile(tmp_zip_path, "r") as zip_ref: + zip_ref.extractall(tmp_unpack_dir) + + if self.db_config_replacement: + update_yamls(db_configs=[self.db_config_replacement], path=str(tmp_unpack_dir)) + + with create_temp_file(suffix=".zip", dry_run=True, logger=self.logger) as tmp_new_zip: + create_dashboard_export(zip_path=tmp_new_zip, source_paths=[str(p) for p in Path(tmp_unpack_dir).glob("**/*")]) + self.to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug) + + self.logger.info("[execute_migration][Success] Dashboard %s imported.", title) + except Exception as exc: + self.logger.error("[execute_migration][Failure] %s", exc, exc_info=True) + self._failed_imports.append({"slug": dash_slug, "dash_id": dash_id, "zip_content": exported_content}) + msgbox("Ошибка", f"Не удалось мигрировать дашборд {title}.\n\n{exc}") + g.set_percent(100) + + if self.enable_delete_on_failure and self._failed_imports: + self.logger.info("[execute_migration][Recovery] %d dashboards failed. 
Starting recovery.", len(self._failed_imports)) + _, target_dashboards = self.to_c.get_dashboards() + slug_to_id = {d["slug"]: d["id"] for d in target_dashboards if "slug" in d and "id" in d} + ids_to_delete = [slug_to_id[f["slug"]] for f in self._failed_imports if f["slug"] in slug_to_id] + self._batch_delete_by_ids(ids_to_delete) + + for fail in self._failed_imports: + with create_temp_file(content=fail["zip_content"], suffix=".zip", logger=self.logger) as retry_zip: + self.to_c.import_dashboard(file_name=retry_zip, dash_id=fail["dash_id"], dash_slug=fail["slug"]) + self.logger.info("[execute_migration][Recovered] Dashboard slug '%s' re-imported.", fail["slug"]) + + self.logger.info("[execute_migration][Exit] Migration finished.") + msgbox("Информация", "Миграция завершена!") + # [/DEF:Migration.execute_migration] + +# [/DEF:Migration] + +if __name__ == "__main__": + Migration().run() + +# [/DEF:migration_script] diff --git a/reproduce_issue.py b/reproduce_issue.py new file mode 100644 index 0000000..1316045 --- /dev/null +++ b/reproduce_issue.py @@ -0,0 +1,21 @@ +import sys +import os +from pathlib import Path + +# Add root to sys.path +sys.path.append(os.getcwd()) + +try: + from backend.src.core.plugin_loader import PluginLoader +except ImportError as e: + print(f"Failed to import PluginLoader: {e}") + sys.exit(1) + +plugin_dir = Path("backend/src/plugins").absolute() +print(f"Plugin dir: {plugin_dir}") + +loader = PluginLoader(str(plugin_dir)) +configs = loader.get_all_plugin_configs() +print(f"Loaded plugins: {len(configs)}") +for config in configs: + print(f" - {config.id}") \ No newline at end of file diff --git a/requirements.txt b/requirements.txt old mode 100644 new mode 100755 index b9b3c78..4779b4f --- a/requirements.txt +++ b/requirements.txt @@ -1,4 +1,6 @@ pyyaml requests keyring -urllib3 \ No newline at end of file +urllib3 +pydantic +whiptail-dialogs \ No newline at end of file diff --git a/run.sh b/run.sh new file mode 100755 index 0000000..cff8f14 --- /dev/null +++ b/run.sh @@ -0,0 +1,155 @@ +#!/bin/bash + +# Project Launch Script +# Automates setup and concurrent execution of backend and frontend servers. + +set -e + +# Default configuration +BACKEND_PORT=${BACKEND_PORT:-8000} +FRONTEND_PORT=${FRONTEND_PORT:-5173} +SKIP_INSTALL=false + +# Help message +show_help() { + echo "Usage: ./run.sh [options]" + echo "" + echo "Options:" + echo " --help Show this help message" + echo " --skip-install Skip dependency checks and installation" + echo "" + echo "Environment Variables:" + echo " BACKEND_PORT Port for the backend server (default: 8000)" + echo " FRONTEND_PORT Port for the frontend server (default: 5173)" +} + +# Parse arguments +while [[ "$#" -gt 0 ]]; do + case $1 in + --help) show_help; exit 0 ;; + --skip-install) SKIP_INSTALL=true ;; + *) echo "Unknown parameter passed: $1"; show_help; exit 1 ;; + esac + shift +done + +echo "Starting Project Launch Script..." + +# Environment validation +validate_env() { + echo "Validating environment..." + + if ! command -v python3 &> /dev/null; then + echo "Error: python3 is not installed." + exit 1 + fi + + if ! python3 -c 'import sys; exit(0) if sys.version_info >= (3, 9) else exit(1)'; then + PYTHON_VERSION=$(python3 -c 'import sys; print(".".join(map(str, sys.version_info[:2])))') + echo "Error: python3 version 3.9 or higher is required. Found $PYTHON_VERSION" + exit 1 + fi + + if ! command -v npm &> /dev/null; then + echo "Error: npm is not installed." 
+ exit 1 + fi + + PYTHON_VERSION=$(python3 -c 'import sys; print(".".join(map(str, sys.version_info[:2])))') + echo "Environment validation passed (Python $PYTHON_VERSION, npm $(npm -v))" +} + +validate_env + +# Backend dependency management +setup_backend() { + if [ "$SKIP_INSTALL" = true ]; then + echo "Skipping backend installation..." + return + fi + + echo "Setting up backend..." + cd backend + if [ ! -d ".venv" ]; then + echo "Creating virtual environment..." + python3 -m venv .venv + fi + + source .venv/bin/activate + if [ -f "requirements.txt" ]; then + echo "Installing backend dependencies..." + pip install -r requirements.txt + else + echo "Warning: backend/requirements.txt not found." + fi + cd .. +} + +# Frontend dependency management +setup_frontend() { + if [ "$SKIP_INSTALL" = true ]; then + echo "Skipping frontend installation..." + return + fi + + echo "Setting up frontend..." + cd frontend + if [ ! -d "node_modules" ]; then + echo "Installing frontend dependencies..." + npm install + else + echo "frontend/node_modules already exists. Skipping npm install." + fi + cd .. +} + +setup_backend +setup_frontend + +# Cleanup function for graceful shutdown +cleanup() { + echo "" + echo "Stopping services..." + if [ -n "$BACKEND_PID" ]; then + kill $BACKEND_PID 2>/dev/null || true + fi + if [ -n "$FRONTEND_PID" ]; then + kill $FRONTEND_PID 2>/dev/null || true + fi + echo "Services stopped." + exit 0 +} + +# Trap SIGINT (Ctrl+C) +trap cleanup SIGINT + +# Start Backend +start_backend() { + echo -e "\033[0;34m[Backend]\033[0m Starting on port $BACKEND_PORT..." + cd backend + if [ -f ".venv/bin/activate" ]; then + source .venv/bin/activate + else + echo -e "\033[0;31m[Backend]\033[0m Warning: .venv/bin/activate not found. Attempting to run without venv." + fi + # Use a subshell to prefix output + python3 -m uvicorn src.app:app --reload --port "$BACKEND_PORT" 2>&1 | sed "s/^/$(echo -e '\033[0;34m[Backend]\033[0m ') /" & + BACKEND_PID=$! + cd .. +} + +# Start Frontend +start_frontend() { + echo -e "\033[0;32m[Frontend]\033[0m Starting on port $FRONTEND_PORT..." + cd frontend + # Use a subshell to prefix output + npm run dev -- --port "$FRONTEND_PORT" 2>&1 | sed "s/^/$(echo -e '\033[0;32m[Frontend]\033[0m ') /" & + FRONTEND_PID=$! + cd .. +} + +start_backend +start_frontend + +echo "Services are running. Press Ctrl+C to stop." +wait diff --git a/run_mapper.py b/run_mapper.py new file mode 100755 index 0000000..bddc7d0 --- /dev/null +++ b/run_mapper.py @@ -0,0 +1,72 @@ +# [DEF:run_mapper:Module] +# +# @SEMANTICS: runner, configuration, cli, main +# @PURPOSE: Этот модуль является CLI-точкой входа для запуска процесса меппинга метаданных датасетов. +# @LAYER: App +# @RELATION: DEPENDS_ON -> superset_tool.utils.dataset_mapper +# @RELATION: DEPENDS_ON -> superset_tool.utils +# @PUBLIC_API: main + +# [SECTION: IMPORTS] +import argparse +import keyring +from superset_tool.utils.init_clients import setup_clients +from superset_tool.utils.logger import SupersetLogger +from superset_tool.utils.dataset_mapper import DatasetMapper +# [/SECTION] + +# [DEF:main:Function] +# @PURPOSE: Парсит аргументы командной строки и запускает процесс меппинга. 
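+# Example invocation (hypothetical IDs and names, for illustration only):
+#   python run_mapper.py --source postgres --dataset-id 42 --table-name account_debt --table-schema dm --env dev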
+# @RELATION: CREATES_INSTANCE_OF -> DatasetMapper +# @RELATION: CALLS -> setup_clients +# @RELATION: CALLS -> DatasetMapper.run_mapping +def main(): + parser = argparse.ArgumentParser(description="Map dataset verbose names in Superset.") + parser.add_argument('--source', type=str, required=True, choices=['postgres', 'excel', 'both'], help='The source for the mapping.') + parser.add_argument('--dataset-id', type=int, required=True, help='The ID of the dataset to update.') + parser.add_argument('--table-name', type=str, help='The table name for PostgreSQL source.') + parser.add_argument('--table-schema', type=str, help='The table schema for PostgreSQL source.') + parser.add_argument('--excel-path', type=str, help='The path to the Excel file.') + parser.add_argument('--env', type=str, default='dev', help='The Superset environment to use.') + + args = parser.parse_args() + logger = SupersetLogger(name="dataset_mapper_main") + + # [AI_NOTE]: Конфигурация БД должна быть вынесена во внешний файл или переменные окружения. + POSTGRES_CONFIG = { + 'dbname': 'dwh', + 'user': keyring.get_password("system", f"dwh gp user"), + 'password': keyring.get_password("system", f"dwh gp password"), + 'host': '10.66.229.201', + 'port': '5432' + } + + logger.info("[main][Enter] Starting dataset mapper CLI.") + try: + clients = setup_clients(logger) + superset_client = clients.get(args.env) + + if not superset_client: + logger.error(f"[main][Failure] Superset client for '{args.env}' environment not found.") + return + + mapper = DatasetMapper(logger) + mapper.run_mapping( + superset_client=superset_client, + dataset_id=args.dataset_id, + source=args.source, + postgres_config=POSTGRES_CONFIG if args.source in ['postgres', 'both'] else None, + excel_path=args.excel_path if args.source in ['excel', 'both'] else None, + table_name=args.table_name if args.source in ['postgres', 'both'] else None, + table_schema=args.table_schema if args.source in ['postgres', 'both'] else None + ) + logger.info("[main][Exit] Dataset mapper process finished.") + + except Exception as main_exc: + logger.error("[main][Failure] An unexpected error occurred: %s", main_exc, exc_info=True) +# [/DEF:main] + +if __name__ == '__main__': + main() + +# [/DEF:run_mapper] diff --git a/search_script.py b/search_script.py old mode 100644 new mode 100755 index bd864b4..fc82bb5 --- a/search_script.py +++ b/search_script.py @@ -1,152 +1,204 @@ -# pylint: disable=too-many-arguments,too-many-locals,too-many-statements,too-many-branches,unused-argument,invalid-name,redefined-outer-name -""" -[MODULE] Dataset Search Utilities -@contract: Предоставляет функционал для поиска текстовых паттернов в метаданных датасетов Superset. -""" - -# [IMPORTS] Стандартная библиотека -import logging -import re -from typing import Dict, Optional - -# [IMPORTS] Third-party -from requests.exceptions import RequestException - -# [IMPORTS] Локальные модули -from superset_tool.client import SupersetClient -from superset_tool.exceptions import SupersetAPIError -from superset_tool.utils.logger import SupersetLogger -from superset_tool.utils.init_clients import setup_clients - -# [ENTITY: Function('search_datasets')] -# CONTRACT: -# PURPOSE: Выполняет поиск по строковому паттерну в метаданных всех датасетов. -# PRECONDITIONS: -# - `client` должен быть инициализированным экземпляром `SupersetClient`. -# - `search_pattern` должен быть валидной строкой регулярного выражения. -# POSTCONDITIONS: -# - Возвращает словарь с результатами поиска. 
-def search_datasets( - client: SupersetClient, - search_pattern: str, - logger: Optional[SupersetLogger] = None -) -> Optional[Dict]: - logger = logger or SupersetLogger(name="dataset_search") - logger.info(f"[STATE][search_datasets][ENTER] Searching for pattern: '{search_pattern}'") - try: - _, datasets = client.get_datasets(query={ - "columns": ["id", "table_name", "sql", "database", "columns"] - }) - - if not datasets: - logger.warning("[STATE][search_datasets][EMPTY] No datasets found.") - return None - - pattern = re.compile(search_pattern, re.IGNORECASE) - results = {} - available_fields = set(datasets[0].keys()) - - for dataset in datasets: - dataset_id = dataset.get('id') - if not dataset_id: - continue - - matches = [] - for field in available_fields: - value = str(dataset.get(field, "")) - if pattern.search(value): - match_obj = pattern.search(value) - matches.append({ - "field": field, - "match": match_obj.group() if match_obj else "", - "value": value - }) - - if matches: - results[dataset_id] = matches - - logger.info(f"[STATE][search_datasets][SUCCESS] Found matches in {len(results)} datasets.") - return results - - except re.error as e: - logger.error(f"[STATE][search_datasets][FAILURE] Invalid regex pattern: {e}", exc_info=True) - raise - except (SupersetAPIError, RequestException) as e: - logger.critical(f"[STATE][search_datasets][FAILURE] Critical error during search: {e}", exc_info=True) - raise -# END_FUNCTION_search_datasets - -# [ENTITY: Function('print_search_results')] -# CONTRACT: -# PURPOSE: Форматирует результаты поиска для читаемого вывода в консоль. -# PRECONDITIONS: -# - `results` является словарем, возвращенным `search_datasets`, или `None`. -# POSTCONDITIONS: -# - Возвращает отформатированную строку с результатами. -def print_search_results(results: Optional[Dict], context_lines: int = 3) -> str: - if not results: - return "Ничего не найдено" - - output = [] - for dataset_id, matches in results.items(): - output.append(f"\n--- Dataset ID: {dataset_id} ---") - for match_info in matches: - field = match_info['field'] - match_text = match_info['match'] - full_value = match_info['value'] - - output.append(f" - Поле: {field}") - output.append(f" Совпадение: '{match_text}'") - - lines = full_value.splitlines() - if not lines: - continue - - match_line_index = -1 - for i, line in enumerate(lines): - if match_text in line: - match_line_index = i - break - - if match_line_index != -1: - start_line = max(0, match_line_index - context_lines) - end_line = min(len(lines), match_line_index + context_lines + 1) - - output.append(" Контекст:") - for i in range(start_line, end_line): - line_number = i + 1 - line_content = lines[i] - prefix = f"{line_number:5d}: " - if i == match_line_index: - highlighted_line = line_content.replace(match_text, f">>>{match_text}<<<") - output.append(f" {prefix}{highlighted_line}") - else: - output.append(f" {prefix}{line_content}") - output.append("-" * 25) - return "\n".join(output) -# END_FUNCTION_print_search_results - -# [ENTITY: Function('main')] -# CONTRACT: -# PURPOSE: Основная точка входа скрипта. 
-# PRECONDITIONS: None -# POSTCONDITIONS: None -def main(): - logger = SupersetLogger(level=logging.INFO, console=True) - clients = setup_clients(logger) - - target_client = clients['dev'] - search_query = r"match(r2.path_code, budget_reference.ref_code || '($|(\s))')" - - results = search_datasets( - client=target_client, - search_pattern=search_query, - logger=logger - ) - - report = print_search_results(results) - logger.info(f"[STATE][main][SUCCESS] Search finished. Report:\n{report}") -# END_FUNCTION_main - -if __name__ == "__main__": - main() +# [DEF:search_script:Module] +# +# @SEMANTICS: search, superset, dataset, regex, file_output +# @PURPOSE: Предоставляет утилиты для поиска по текстовым паттернам в метаданных датасетов Superset. +# @LAYER: App +# @RELATION: DEPENDS_ON -> superset_tool.client +# @RELATION: DEPENDS_ON -> superset_tool.utils +# @PUBLIC_API: search_datasets, save_results_to_file, print_search_results, main + +# [SECTION: IMPORTS] +import logging +import re +import os +from typing import Dict, Optional +from requests.exceptions import RequestException +from superset_tool.client import SupersetClient +from superset_tool.exceptions import SupersetAPIError +from superset_tool.utils.logger import SupersetLogger +from superset_tool.utils.init_clients import setup_clients +# [/SECTION] + +# [DEF:search_datasets:Function] +# @PURPOSE: Выполняет поиск по строковому паттерну в метаданных всех датасетов. +# @PRE: `client` должен быть инициализированным экземпляром `SupersetClient`. +# @PRE: `search_pattern` должен быть валидной строкой регулярного выражения. +# @POST: Возвращает словарь с результатами поиска, где ключ - ID датасета, значение - список совпадений. +# @RELATION: CALLS -> client.get_datasets +# @THROW: re.error - Если паттерн регулярного выражения невалиден. +# @THROW: SupersetAPIError, RequestException - При критических ошибках API. +# @PARAM: client (SupersetClient) - Клиент для доступа к API Superset. +# @PARAM: search_pattern (str) - Регулярное выражение для поиска. +# @PARAM: logger (Optional[SupersetLogger]) - Инстанс логгера. +# @RETURN: Optional[Dict] - Словарь с результатами или None, если ничего не найдено. 
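+# Example of the returned structure (illustrative values only):
+#   {42: [{"field": "sql", "match": "from dm.account_debt", "value": "<full field text>"}]}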
+def search_datasets( + client: SupersetClient, + search_pattern: str, + logger: Optional[SupersetLogger] = None +) -> Optional[Dict]: + logger = logger or SupersetLogger(name="dataset_search") + logger.info(f"[search_datasets][Enter] Searching for pattern: '{search_pattern}'") + try: + _, datasets = client.get_datasets(query={"columns": ["id", "table_name", "sql", "database", "columns"]}) + + if not datasets: + logger.warning("[search_datasets][State] No datasets found.") + return None + + pattern = re.compile(search_pattern, re.IGNORECASE) + results = {} + + for dataset in datasets: + dataset_id = dataset.get('id') + if not dataset_id: + continue + + matches = [] + for field, value in dataset.items(): + value_str = str(value) + if pattern.search(value_str): + match_obj = pattern.search(value_str) + matches.append({ + "field": field, + "match": match_obj.group() if match_obj else "", + "value": value_str + }) + + if matches: + results[dataset_id] = matches + + logger.info(f"[search_datasets][Success] Found matches in {len(results)} datasets.") + return results + + except re.error as e: + logger.error(f"[search_datasets][Failure] Invalid regex pattern: {e}", exc_info=True) + raise + except (SupersetAPIError, RequestException) as e: + logger.critical(f"[search_datasets][Failure] Critical error during search: {e}", exc_info=True) + raise +# [/DEF:search_datasets] + +# [DEF:save_results_to_file:Function] +# @PURPOSE: Сохраняет результаты поиска в текстовый файл. +# @PRE: `results` является словарем, возвращенным `search_datasets`, или `None`. +# @PRE: `filename` должен быть допустимым путем к файлу. +# @POST: Записывает отформатированные результаты в указанный файл. +# @PARAM: results (Optional[Dict]) - Словарь с результатами поиска. +# @PARAM: filename (str) - Имя файла для сохранения результатов. +# @PARAM: logger (Optional[SupersetLogger]) - Инстанс логгера. +# @RETURN: bool - Успешно ли выполнено сохранение. +def save_results_to_file(results: Optional[Dict], filename: str, logger: Optional[SupersetLogger] = None) -> bool: + logger = logger or SupersetLogger(name="file_writer") + logger.info(f"[save_results_to_file][Enter] Saving results to file: {filename}") + try: + formatted_report = print_search_results(results) + with open(filename, 'w', encoding='utf-8') as f: + f.write(formatted_report) + logger.info(f"[save_results_to_file][Success] Results saved to {filename}") + return True + except Exception as e: + logger.error(f"[save_results_to_file][Failure] Failed to save results to file: {e}", exc_info=True) + return False +# [/DEF:save_results_to_file] + +# [DEF:print_search_results:Function] +# @PURPOSE: Форматирует результаты поиска для читаемого вывода в консоль. +# @PRE: `results` является словарем, возвращенным `search_datasets`, или `None`. +# @POST: Возвращает отформатированную строку с результатами. +# @PARAM: results (Optional[Dict]) - Словарь с результатами поиска. +# @PARAM: context_lines (int) - Количество строк контекста для вывода до и после совпадения. +# @RETURN: str - Отформатированный отчет. 
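+# Example of the report layout (illustrative):
+#   --- Dataset ID: 42 ---
+#     - Поле: sql
+#       Совпадение: 'from dm.account_debt'
+#       Контекст: numbered lines around the match, the hit wrapped in >>>...<<<.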
+def print_search_results(results: Optional[Dict], context_lines: int = 3) -> str:
+    if not results:
+        return "Ничего не найдено"
+
+    output = []
+    for dataset_id, matches in results.items():
+        # Pull the database info for the current dataset out of the matches, if present
+        database_info = ""
+        for match_info in matches:
+            if match_info['field'] == 'database':
+                database_info = match_info['value']
+                break
+        # If no 'database' field matched, the database line is simply omitted
+
+        output.append(f"\n--- Dataset ID: {dataset_id} ---")
+        if database_info:
+            output.append(f"  Database: {database_info}")
+        output.append("")  # blank line for readability
+
+        for match_info in matches:
+            field, match_text, full_value = match_info['field'], match_info['match'], match_info['value']
+            output.append(f"  - Поле: {field}")
+            output.append(f"    Совпадение: '{match_text}'")
+
+            lines = full_value.splitlines()
+            if not lines:
+                continue
+
+            match_line_index = -1
+            for i, line in enumerate(lines):
+                if match_text in line:
+                    match_line_index = i
+                    break
+
+            if match_line_index != -1:
+                start = max(0, match_line_index - context_lines)
+                end = min(len(lines), match_line_index + context_lines + 1)
+                output.append("    Контекст:")
+                for i in range(start, end):
+                    prefix = f"{i + 1:5d}: "
+                    line_content = lines[i]
+                    if i == match_line_index:
+                        highlighted = line_content.replace(match_text, f">>>{match_text}<<<")
+                        output.append(f"      {prefix}{highlighted}")
+                    else:
+                        output.append(f"      {prefix}{line_content}")
+        output.append("-" * 25)
+    return "\n".join(output)
+# [/DEF:print_search_results]
+
+# [DEF:main:Function]
+# @PURPOSE: Main entry point for the search script.
+# @RELATION: CALLS -> setup_clients
+# @RELATION: CALLS -> search_datasets
+# @RELATION: CALLS -> print_search_results
+# @RELATION: CALLS -> save_results_to_file
+def main():
+    logger = SupersetLogger(level=logging.INFO, console=True)
+    clients = setup_clients(logger)
+
+    target_client = clients['dev5']
+    search_query = r"from dm(_view)*.account_debt"
+
+    # Timestamped output file name
+    import datetime
+    timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
+    output_filename = f"search_results_{timestamp}.txt"
+
+    results = search_datasets(
+        client=target_client,
+        search_pattern=search_query,
+        logger=logger
+    )
+
+    report = print_search_results(results)
+
+    logger.info(f"[main][Success] Search finished. Report:\n{report}")
+
+    # Persist the report to a file as well
+    success = save_results_to_file(results, output_filename, logger)
+    if success:
+        logger.info(f"[main][Success] Results also saved to file: {output_filename}")
+    else:
+        logger.error(f"[main][Failure] Failed to save results to file: {output_filename}")
+# [/DEF:main]
+
+if __name__ == "__main__":
+    main()
+
+# [/DEF:search_script]
diff --git a/semantic_protocol.md b/semantic_protocol.md
new file mode 100755
index 0000000..f70fcff
--- /dev/null
+++ b/semantic_protocol.md
@@ -0,0 +1,174 @@
+This **System Standard** is adapted for a polyglot environment (Python Backend + Svelte Frontend) and removes the requirement for explicit assertion generation.
+
+This protocol standardizes the "Semantic Bridge" between the two languages using unified Anchor logic while respecting the native documentation standards (Comments for Python, JSDoc for JavaScript/Svelte).
+
+***
+
+# SYSTEM STANDARD: POLYGLOT CODE GENERATION PROTOCOL (GRACE-Poly)
+
+**OBJECTIVE:** Generate Python and Svelte/TypeScript code that strictly adheres to Semantic Coherence standards. Output must be machine-readable, fractal-structured, and optimized for Sparse Attention navigation.
+
+## I. CORE REQUIREMENTS
+1. **Causal Validity:** Semantic definitions (Contracts) must ALWAYS precede implementation code.
+2. **Immutability:** Architectural decisions defined in the Module/Component Header are treated as immutable constraints.
+3. **Format Compliance:** Output must strictly follow the `[DEF]` / `[/DEF]` anchor syntax for structure.
+4. **Logic over Assertion:** Contracts define the *logic flow*. Do not generate explicit `assert` statements unless requested. The code logic itself must inherently satisfy the Pre/Post conditions (e.g., via control flow, guards, or types).
+
+---
+
+## II. SYNTAX SPECIFICATION
+
+Code structure is defined by **Anchors** (square brackets). Metadata is defined by **Tags** (native comment style).
+
+### 1. Entity Anchors (The "Container")
+Used to define the boundaries of Modules, Classes, Components, and Functions.
+
+* **Python:**
+    * Start: `# [DEF:identifier:Type]`
+    * End: `# [/DEF:identifier]`
+* **Svelte (Top-level):**
+    * Start: `<!-- [DEF:identifier:Type] -->`
+    * End: `<!-- [/DEF:identifier] -->`
+* **Svelte (Script/JS/TS):**
+    * Start: `// [DEF:funcName:Function]`
+    * End: `// [/DEF:funcName]`
+
+**Types:** `Module`, `Component`, `Class`, `Function`, `Store`, `Action`.
+
+### 2. Graph Relations (The "Map")
+Defines high-level dependencies.
+* **Python Syntax:** `# @RELATION: TYPE -> TARGET_ID`
+* **Svelte/JS Syntax:** `// @RELATION: TYPE -> TARGET_ID`
+* **Types:** `DEPENDS_ON`, `CALLS`, `INHERITS_FROM`, `IMPLEMENTS`, `BINDS_TO`, `DISPATCHES`.
+
+---
+
+## III. FILE STRUCTURE STANDARD
+
+### 1. Python Module Header (`.py`)
+```python
+# [DEF:module_name:Module]
+#
+# @SEMANTICS: [keywords for vector search]
+# @PURPOSE: [Primary responsibility of the module]
+# @LAYER: [Domain/Infra/API]
+# @RELATION: [Dependencies]
+#
+# @INVARIANT: [Global immutable rule]
+# @CONSTRAINT: [Hard restriction, e.g., "No ORM calls here"]

+# [SECTION: IMPORTS]
+...
+# [/SECTION]
+
+# ... IMPLEMENTATION ...
+
+# [/DEF:module_name]
+```
+
+### 2. Svelte Component Header (`.svelte`)
+```html
+<!-- [DEF:ComponentName:Component] -->
+<!--
+@SEMANTICS: [keywords for vector search]
+@PURPOSE: [Primary responsibility of the component]
+@LAYER: [UI]
+@RELATION: [Dependencies]
+-->
+
+<script>
+// [SECTION: IMPORTS]
+...
+// [/SECTION]
+</script>
+
+<!-- ... MARKUP ... -->
+
+<!-- [/DEF:ComponentName] -->
+```
+
+---
+
+## IV. CONTRACTS (Design by Contract)
+
+Contracts define *what* the code does before *how* it does it.
+
+### 1. Python Contract Style
+Uses comment blocks inside the anchor.
+
+```python
+# [DEF:calculate_total:Function]
+# @PURPOSE: Calculates cart total including tax.
+# @PRE: items list is not empty.
+# @POST: returns non-negative Decimal.
+# @PARAM: items (List[Item]) - Cart items.
+# @RETURN: Decimal - Final total.
+def calculate_total(items: List[Item]) -> Decimal:
+    # Logic implementation that respects @PRE
+    if not items:
+        return Decimal(0)
+
+    # ... calculation ...
+
+    # Logic ensuring @POST
+    return total
+# [/DEF:calculate_total]
+```
+
+### 2. Svelte/JS Contract Style (JSDoc)
+Uses JSDoc blocks inside the anchor. Standard JSDoc tags are used where possible; custom GRACE tags are added for strictness.
+
+```javascript
+// [DEF:updateUserProfile:Function]
+/**
+ * @purpose Updates user data in the store and backend.
+ * @pre User must be authenticated (session token exists).
+ * @post UserStore is updated with new data. + * @param {Object} profileData - The new profile fields. + * @returns {Promise} + * @throws {AuthError} If session is invalid. + */ +// @RELATION: CALLS -> api.user.update +async function updateUserProfile(profileData) { + // Logic implementation + if (!session.token) throw new AuthError(); + + // ... +} +// [/DEF:updateUserProfile] +``` + +--- + +## V. LOGGING STANDARD (BELIEF STATE) + +Logs delineate the agent's internal state. + +* **Python:** `logger.info(f"[{ANCHOR_ID}][{STATE}] Msg")` +* **Svelte/JS:** `console.log(\`[${ANCHOR_ID}][${STATE}] Msg\`)` + +**Required States:** +1. `Entry` (Start of block) +2. `Action` (Key business logic) +3. `Coherence:OK` (Logic successfully completed) +4. `Coherence:Failed` (Error handling) +5. `Exit` (End of block) + +--- + +## VI. GENERATION WORKFLOW + +1. **Context Analysis:** Identify language (Python vs Svelte) and Architecture Layer. +2. **Scaffolding:** Generate the `[DEF]` Anchors and Header/Contract **before** writing any logic. +3. **Implementation:** Write the code. Ensure the code logic handles the `@PRE` conditions (e.g., via `if/return` or guards) and satisfies `@POST` conditions naturally. **Do not write explicit `assert` statements unless debugging mode is requested.** +4. **Closure:** Ensure every `[DEF]` is closed with `[/DEF]` to accumulate semantic context. \ No newline at end of file diff --git a/specs/001-plugin-arch-svelte-ui/checklists/requirements.md b/specs/001-plugin-arch-svelte-ui/checklists/requirements.md new file mode 100755 index 0000000..c070256 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/checklists/requirements.md @@ -0,0 +1,34 @@ +# Specification Quality Checklist: Plugin Architecture & Svelte Web UI + +**Purpose**: Validate specification completeness and quality before proceeding to planning +**Created**: 2025-12-19 +**Feature**: [Link to spec.md](../spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Clarification resolved: Deployment context is hosted multi-user service with ADFS login. \ No newline at end of file diff --git a/specs/001-plugin-arch-svelte-ui/contracts/api.yaml b/specs/001-plugin-arch-svelte-ui/contracts/api.yaml new file mode 100755 index 0000000..91e2211 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/contracts/api.yaml @@ -0,0 +1,132 @@ +openapi: 3.0.0 +info: + title: Superset Tools API + version: 1.0.0 + description: API for managing Superset automation tools and plugins. 
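+# Illustrative usage (non-normative): a task is typically created with
+#   POST /tasks  {"plugin_id": "backup-tool", "params": {...}}
+# and then polled via GET /tasks/{task_id}; "backup-tool" is a hypothetical plugin id.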
+ +paths: + /plugins: + get: + summary: List available plugins + operationId: list_plugins + responses: + '200': + description: List of plugins + content: + application/json: + schema: + type: array + items: + $ref: '#/components/schemas/Plugin' + + /tasks: + post: + summary: Start a new task + operationId: create_task + requestBody: + required: true + content: + application/json: + schema: + type: object + required: + - plugin_id + - params + properties: + plugin_id: + type: string + params: + type: object + responses: + '201': + description: Task created + content: + application/json: + schema: + $ref: '#/components/schemas/Task' + + get: + summary: List recent tasks + operationId: list_tasks + responses: + '200': + description: List of tasks + content: + application/json: + schema: + type: array + items: + $ref: '#/components/schemas/Task' + + /tasks/{task_id}: + get: + summary: Get task details + operationId: get_task + parameters: + - name: task_id + in: path + required: true + schema: + type: string + format: uuid + responses: + '200': + description: Task details + content: + application/json: + schema: + $ref: '#/components/schemas/Task' + + /tasks/{task_id}/logs: + get: + summary: Stream task logs (WebSocket upgrade) + operationId: stream_logs + parameters: + - name: task_id + in: path + required: true + schema: + type: string + format: uuid + responses: + '101': + description: Switching Protocols to WebSocket + +components: + schemas: + Plugin: + type: object + properties: + id: + type: string + name: + type: string + description: + type: string + version: + type: string + schema: + type: object + description: JSON Schema for input parameters + enabled: + type: boolean + + Task: + type: object + properties: + id: + type: string + format: uuid + plugin_id: + type: string + status: + type: string + enum: [PENDING, RUNNING, SUCCESS, FAILED] + started_at: + type: string + format: date-time + finished_at: + type: string + format: date-time + user_id: + type: string diff --git a/specs/001-plugin-arch-svelte-ui/data-model.md b/specs/001-plugin-arch-svelte-ui/data-model.md new file mode 100755 index 0000000..f859f05 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/data-model.md @@ -0,0 +1,51 @@ +# Data Model: Plugin Architecture & Svelte Web UI + +## Entities + +### Plugin +Represents a loadable extension module. + +| Field | Type | Description | +|-------|------|-------------| +| `id` | `str` | Unique identifier (e.g., "backup-tool") | +| `name` | `str` | Display name (e.g., "Backup Dashboard") | +| `description` | `str` | Short description of functionality | +| `version` | `str` | Plugin version string | +| `schema` | `dict` | JSON Schema for input parameters (generated from Pydantic) | +| `enabled` | `bool` | Whether the plugin is active | + +### Task +Represents an execution instance of a plugin. + +| Field | Type | Description | +|-------|------|-------------| +| `id` | `UUID` | Unique execution ID | +| `plugin_id` | `str` | ID of the plugin being executed | +| `status` | `Enum` | `PENDING`, `RUNNING`, `SUCCESS`, `FAILED` | +| `started_at` | `DateTime` | Timestamp when task started | +| `finished_at` | `DateTime` | Timestamp when task completed (nullable) | +| `user_id` | `str` | ID of the user who triggered the task | +| `logs` | `List[LogEntry]` | Structured logs from the execution | + +### LogEntry +Represents a single log line from a task. 
+ +| Field | Type | Description | +|-------|------|-------------| +| `timestamp` | `DateTime` | Time of log event | +| `level` | `Enum` | `INFO`, `WARNING`, `ERROR`, `DEBUG` | +| `message` | `str` | Log content | +| `context` | `dict` | Additional metadata (optional) | + +## State Transitions + +### Task Lifecycle +1. **Created**: Task initialized with input parameters. Status: `PENDING`. +2. **Started**: Worker picks up task. Status: `RUNNING`. +3. **Completed**: Execution finishes without exception. Status: `SUCCESS`. +4. **Failed**: Execution raises unhandled exception. Status: `FAILED`. + +## Validation Rules + +- **Plugin ID**: Must be alphanumeric, lowercase, hyphens allowed. +- **Input Parameters**: Must validate against the plugin's `schema`. \ No newline at end of file diff --git a/specs/001-plugin-arch-svelte-ui/plan.md b/specs/001-plugin-arch-svelte-ui/plan.md new file mode 100755 index 0000000..ed53148 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/plan.md @@ -0,0 +1,76 @@ +# Implementation Plan: Plugin Architecture & Svelte Web UI + +**Branch**: `001-plugin-arch-svelte-ui` | **Date**: 2025-12-19 | **Spec**: [spec.md](spec.md) +**Input**: Feature specification from `specs/001-plugin-arch-svelte-ui/spec.md` + +## Summary + +This feature introduces a dual-layer architecture: a Python backend exposing core tools (Backup, Migration, Search) via API, and a Svelte-based Single Page Application (SPA) for user interaction. It also implements a dynamic plugin system allowing developers to extend functionality by adding Python modules to a specific directory without modifying core code. + +## Technical Context + +**Language/Version**: Python 3.9+ (Backend), Node.js 18+ (Frontend Build) +**Primary Dependencies**: +- Backend: Flask or FastAPI [NEEDS CLARIFICATION: Choice of web framework], Pydantic (validation), Plugin loader mechanism (importlib) +- Frontend: Svelte, Vite, TailwindCSS (likely for UI) +**Storage**: Filesystem (plugins, logs, backups), SQLite (optional, for job history if needed) +**Testing**: pytest (Backend), vitest/playwright (Frontend) +**Target Platform**: Windows/Linux (Hosted Service) +**Project Type**: Web Application (Backend + Frontend) +**Performance Goals**: UI load < 1s, Log streaming latency < 200ms +**Constraints**: Must run in a hosted environment with ADFS authentication. +**Scale/Scope**: ~5-10 concurrent users, Extensible plugin system. + +## Constitution Check + +*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.* + +- [x] **Causal Validity**: Do all planned modules have defined Contracts (inputs/outputs/invariants) before implementation logic? (Will be enforced in Phase 1) +- [x] **Immutability**: Are architectural layers and constraints defined in Module Headers? (Will be enforced in Phase 1) +- [x] **Format Compliance**: Does the plan ensure all code will be wrapped in `[DEF]` anchors? (Will be enforced in Phase 1) +- [x] **Belief State**: Is logging planned to follow the `Entry` -> `Validation` -> `Action` -> `Coherence` state transition model? 
(Will be enforced in Phase 1) + +## Project Structure + +### Documentation (this feature) + +```text +specs/001-plugin-arch-svelte-ui/ +├── plan.md # This file +├── research.md # Phase 0 output +├── data-model.md # Phase 1 output +├── quickstart.md # Phase 1 output +├── contracts/ # Phase 1 output +└── tasks.md # Phase 2 output +``` + +### Source Code (repository root) + +```text +backend/ +├── src/ +│ ├── app.py # Entry point +│ ├── api/ # REST API endpoints +│ ├── core/ # Plugin loader, Task manager +│ ├── plugins/ # Directory for dynamic plugins +│ └── services/ # Auth (ADFS), Logging +└── tests/ + +frontend/ +├── src/ +│ ├── components/ # Reusable UI components +│ ├── pages/ # Route views +│ ├── lib/ # API client, Stores +│ └── App.svelte +└── tests/ + +superset_tool/ # Existing core logic (refactored to be importable by backend) +``` + +**Structure Decision**: Adopting a standard "Web Application" structure with separated `backend` and `frontend` directories to maintain clean separation of concerns. The existing `superset_tool` library will be preserved and imported by the backend to execute actual tasks. + +## Complexity Tracking + +| Violation | Why Needed | Simpler Alternative Rejected Because | +|-----------|------------|-------------------------------------| +| N/A | | | diff --git a/specs/001-plugin-arch-svelte-ui/quickstart.md b/specs/001-plugin-arch-svelte-ui/quickstart.md new file mode 100755 index 0000000..982c1f3 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/quickstart.md @@ -0,0 +1,47 @@ +# Quickstart: Plugin Architecture & Svelte Web UI + +## Prerequisites +- Python 3.9+ +- Node.js 18+ +- npm or pnpm + +## Setup + +1. **Install Backend Dependencies**: + ```bash + cd backend + python -m venv venv + source venv/bin/activate # or venv\Scripts\activate on Windows + pip install -r requirements.txt + ``` + +2. **Install Frontend Dependencies**: + ```bash + cd frontend + npm install + ``` + +## Running the Application + +1. **Start Backend Server**: + ```bash + # From backend/ directory + uvicorn src.app:app --reload --port 8000 + ``` + +2. **Start Frontend Dev Server**: + ```bash + # From frontend/ directory + npm run dev + ``` + +3. **Access the UI**: + Open `http://localhost:5173` in your browser. + +## Adding a Plugin + +1. Create a new Python file in `backend/src/plugins/` (e.g., `my_plugin.py`). +2. Define your plugin class inheriting from `PluginBase`. +3. Implement `execute` and `get_schema` methods. +4. Restart the backend (or rely on auto-reload). +5. Your plugin should appear in the Web UI. \ No newline at end of file diff --git a/specs/001-plugin-arch-svelte-ui/research.md b/specs/001-plugin-arch-svelte-ui/research.md new file mode 100755 index 0000000..c5ff134 --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/research.md @@ -0,0 +1,46 @@ +# Research: Plugin Architecture & Svelte Web UI + +## Decisions + +### 1. Web Framework: FastAPI +- **Decision**: Use FastAPI for the Python backend. +- **Rationale**: + - Native support for Pydantic models (crucial for plugin schema validation). + - Async support (essential for handling long-running tasks and log streaming via WebSockets/SSE). + - Automatic OpenAPI documentation generation (simplifies frontend integration). + - High performance and modern ecosystem. +- **Alternatives Considered**: + - **Flask**: Mature but requires extensions for validation (Marshmallow) and async support is less native. Slower for high-concurrency API calls. 
+  - **Django**: Too heavy for this use case; brings unnecessary ORM and template engine overhead.
+
+### 2. Plugin System: `importlib` + Abstract Base Classes (ABC)
+- **Decision**: Use Python's built-in `importlib` for dynamic loading and `abc` for defining the plugin interface.
+- **Rationale**:
+  - `importlib` provides a standard, secure way to load modules from a path.
+  - ABCs ensure plugins implement required methods (`execute`, `get_schema`) at load time.
+  - Lightweight, no external dependencies required.
+- **Alternatives Considered**:
+  - **Pluggy**: Used by pytest, powerful but adds complexity and dependency overhead.
+  - **Stevedore**: OpenStack's plugin loader, too complex for this scope.
+
+### 3. Authentication: `authlib` + ADFS (OIDC/SAML)
+- **Decision**: Use `authlib` to handle ADFS authentication via OpenID Connect (OIDC) or SAML.
+- **Rationale**:
+  - `authlib` is the modern standard for OAuth/OIDC in Python.
+  - Supports integration with FastAPI via middleware.
+  - ADFS is the required identity provider (IdP).
+- **Alternatives Considered**:
+  - **python-social-auth**: Older, harder to integrate with FastAPI.
+  - **Manual JWT implementation**: Risky and reinvents the wheel; ADFS handles the token issuance.
+
+### 4. Frontend: Svelte + Vite
+- **Decision**: Use Svelte for the UI framework and Vite as the build tool.
+- **Rationale**:
+  - Svelte's compiler-based approach results in small bundles and high performance.
+  - Reactive model maps well to real-time log updates.
+  - Vite provides a fast development experience and easy integration with backend proxies.
+
+## Unknowns Resolved
+
+- **Deployment Context**: Hosted multi-user service with ADFS.
+- **Plugin Interface**: Will use Pydantic models to define input schemas, allowing the frontend to generate forms dynamically.
\ No newline at end of file
diff --git a/specs/001-plugin-arch-svelte-ui/spec.md b/specs/001-plugin-arch-svelte-ui/spec.md
new file mode 100755
index 0000000..5e4af7c
--- /dev/null
+++ b/specs/001-plugin-arch-svelte-ui/spec.md
@@ -0,0 +1,72 @@
+# Feature Specification: Plugin Architecture & Svelte Web UI
+
+**Feature Branch**: `001-plugin-arch-svelte-ui`
+**Created**: 2025-12-19
+**Status**: Draft
+**Input**: User description: "I want to migrate the project to a plugin architecture + add a web UI built with Svelte"
+
+## User Scenarios & Testing *(mandatory)*
+
+### User Story 1 - Web Interface for Superset Tools (Priority: P1)
+
+As a user, I want to interact with the Superset tools (Backup, Migration, Search) through a graphical web interface so that I don't have to memorize CLI commands and arguments.
+
+**Why this priority**: Drastically improves usability and accessibility of the tools for non-technical users or quick operations.
+
+**Independent Test**: Can be tested by launching the web server and successfully running a "Backup" task from the browser without touching the command line.
+
+**Acceptance Scenarios**:
+
+1. **Given** the web server is running, **When** I navigate to the home page, **Then** I see a dashboard with available tools (Backup, Migration, etc.).
+2. **Given** I am on the Backup tool page, **When** I click "Run Backup", **Then** I see the progress logs in real-time and a success message upon completion.
+3. **Given** I am on the Search tool page, **When** I enter a search term and submit, **Then** I see a list of matching datasets/dashboards displayed in a table.
+ +--- + +### User Story 2 - Dynamic Plugin System (Priority: P2) + +As a developer, I want to add new functionality (e.g., a new migration type or report generator) by simply dropping a file into a `plugins` directory, so that I can extend the tool without modifying the core codebase. + +**Why this priority**: Enables scalable development and separation of concerns; allows custom extensions without merge conflicts in core files. + +**Independent Test**: Create a simple "Hello World" plugin file, place it in the plugins folder, and verify it appears in the list of available tasks in the CLI/Web UI. + +**Acceptance Scenarios**: + +1. **Given** a valid plugin file in the `plugins/` directory, **When** the application starts, **Then** the plugin is automatically registered and listed as an available capability. +2. **Given** a plugin with specific configuration requirements, **When** I select it in the UI, **Then** the UI dynamically generates a form for those parameters. +3. **Given** an invalid or broken plugin file, **When** the application starts, **Then** the system logs an error but continues to function for other plugins. + +--- + +## Requirements *(mandatory)* + +### Functional Requirements +*All functional requirements are covered by the Acceptance Scenarios in the User Stories section.* + +- **FR-001**: System MUST provide a Python-based web server (backend) to expose existing tool functionality via API. +- **FR-002**: System MUST provide a Single Page Application (SPA) frontend built with Svelte. +- **FR-003**: System MUST implement a plugin loader that scans a designated directory for Python modules matching a specific interface. +- **FR-004**: The Web UI MUST communicate with the backend via REST or WebSocket API. +- **FR-005**: The Web UI MUST display real-time logs/output from running tasks (streaming response). +- **FR-006**: System MUST support multi-user hosted deployment with authentication via ADFS (Active Directory Federation Services). +- **FR-007**: The Plugin interface MUST allow defining input parameters (schema) so the UI can auto-generate forms. + +### System Invariants (Constitution Check) + +- **INV-001**: Core logic (backup/migrate functions) must remain decoupled from the UI layer (can still be imported/used by CLI). +- **INV-002**: Plugins must not block the main application thread (long-running tasks must be async or threaded). + +### Key Entities + +- **Plugin**: Represents an extension module with metadata (name, version), input schema, and an execution entry point. +- **Task**: A specific execution instance of a Plugin or Core tool, having a status (Running, Success, Failed) and logs. + +## Success Criteria *(mandatory)* + +### Measurable Outcomes + +- **SC-001**: A new plugin can be added and recognized by the system without restarting (or with a simple restart) and without code changes to core files. +- **SC-002**: Users can successfully trigger a Backup and Migration via the Web UI with 100% functional parity to the CLI. +- **SC-003**: The Web UI loads and becomes interactive in under 1 second on local networks. +- **SC-004**: Real-time logs in the UI appear with less than 200ms latency from the backend execution. 
\ No newline at end of file diff --git a/specs/001-plugin-arch-svelte-ui/tasks.md b/specs/001-plugin-arch-svelte-ui/tasks.md new file mode 100755 index 0000000..0876e9b --- /dev/null +++ b/specs/001-plugin-arch-svelte-ui/tasks.md @@ -0,0 +1,68 @@ +# Tasks: Plugin Architecture & Svelte Web UI + +**Feature**: `001-plugin-arch-svelte-ui` + + +## Dependencies + +1. **Phase 1 (Setup)**: Must be completed first to establish the environment. +2. **Phase 2 (Foundational)**: Implements the core Plugin system and Backend infrastructure required by all User Stories. +3. **Phase 3 (US1)**: Web Interface depends on the Backend API and Plugin system. +4. **Phase 4 (US2)**: Dynamic Plugin System extends the core infrastructure. + +## Parallel Execution Opportunities + +- **US1 (Frontend)**: Frontend components (T013-T016) can be developed in parallel with Backend API endpoints (T011-T012) once the API contract is finalized. +- **US2 (Plugins)**: Plugin development (T019-T020) can proceed independently once the Plugin Interface (T005) is stable. + +--- + +## Phase 1: Setup + +**Goal**: Initialize the project structure and development environment for Backend (Python/FastAPI) and Frontend (Svelte/Vite). + +- [x] T001 Create backend directory structure (src/api, src/core, src/plugins) in `backend/` +- [x] T002 Create frontend directory structure using Vite (Svelte template) in `frontend/` +- [x] T003 Configure Python environment (requirements.txt with FastAPI, Uvicorn, Pydantic) in `backend/requirements.txt` +- [x] T004 Configure Frontend environment (package.json with TailwindCSS) in `frontend/package.json` + +## Phase 2: Foundational (Core Infrastructure) + +**Goal**: Implement the core Plugin interface, Task management system, and basic Backend server. + +- [x] T005 [P] Define `PluginBase` abstract class and Pydantic models in `backend/src/core/plugin_base.py` +- [x] T006 [P] Implement `PluginLoader` to scan and load plugins from directory in `backend/src/core/plugin_loader.py` +- [x] T007 Implement `TaskManager` to handle async task execution and state in `backend/src/core/task_manager.py` +- [x] T008 [P] Implement `Logger` with WebSocket streaming support in `backend/src/core/logger.py` +- [x] T009 Create basic FastAPI application entry point with CORS in `backend/src/app.py` +- [x] T010 [P] Implement ADFS Authentication middleware in `backend/src/api/auth.py` + +## Phase 3: User Story 1 - Web Interface (Priority: P1) + +**Goal**: Enable users to interact with tools via a web dashboard. +**Independent Test**: Launch web server, navigate to dashboard, run a dummy task, view logs. + +- [x] T011 [US1] Implement REST API endpoints for Plugin listing (`GET /plugins`) in `backend/src/api/routes/plugins.py` +- [x] T012 [US1] Implement REST API endpoints for Task management (`POST /tasks`, `GET /tasks/{id}`) in `backend/src/api/routes/tasks.py` +- [x] T013 [P] [US1] Create Svelte store for Plugin and Task state in `frontend/src/lib/stores.js` +- [x] T014 [P] [US1] Create `Dashboard` page component listing available tools in `frontend/src/pages/Dashboard.svelte` +- [x] T015 [P] [US1] Create `TaskRunner` component with real-time log viewer (WebSocket) in `frontend/src/components/TaskRunner.svelte` +- [x] T016 [US1] Integrate Frontend with Backend API using `fetch` client in `frontend/src/lib/api.js` + +## Phase 4: User Story 2 - Dynamic Plugin System (Priority: P2) + +**Goal**: Allow developers to add new functionality by dropping files. 
+**Independent Test**: Add `hello_world.py` to plugins dir, verify it appears in UI. + +- [x] T017 [US2] Implement dynamic form generation component based on JSON Schema in `frontend/src/components/DynamicForm.svelte` +- [x] T018 [US2] Update `PluginLoader` to validate plugin schema on load in `backend/src/core/plugin_loader.py` +- [x] T019 [P] [US2] Refactor existing `backup_script.py` into a Plugin (`BackupPlugin`) in `backend/src/plugins/backup.py` +- [x] T020 [P] [US2] Refactor existing `migration_script.py` into a Plugin (`MigrationPlugin`) in `backend/src/plugins/migration.py` + +## Final Phase: Polish + +**Goal**: Ensure production readiness. + +- [x] T021 Add error handling and user notifications (Toasts) in Frontend +- [x] T022 Write documentation for Plugin Development in `docs/plugin_dev.md` +- [ ] T023 Final integration test: Run full Backup and Migration flow via UI \ No newline at end of file diff --git a/specs/002-app-settings/checklists/requirements.md b/specs/002-app-settings/checklists/requirements.md new file mode 100755 index 0000000..fff592a --- /dev/null +++ b/specs/002-app-settings/checklists/requirements.md @@ -0,0 +1,34 @@ +# Specification Quality Checklist: Add web application settings mechanism + +**Purpose**: Validate specification completeness and quality before proceeding to planning +**Created**: 2025-12-20 +**Feature**: [specs/002-app-settings/spec.md](specs/002-app-settings/spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Initial specification covers all requested points with reasonable defaults for authentication and storage validation. diff --git a/specs/002-app-settings/plan.md b/specs/002-app-settings/plan.md new file mode 100755 index 0000000..855492a --- /dev/null +++ b/specs/002-app-settings/plan.md @@ -0,0 +1,103 @@ +# Technical Plan: Web Application Settings Mechanism + +This plan outlines the implementation of a settings management system for the Superset Tools application, allowing users to configure multiple Superset environments and global application settings (like backup storage) via the web UI. + +## 1. 
Backend Architecture

### 1.1 Data Models (Pydantic)

We will define models in `backend/src/core/config_models.py`:

```python
from pydantic import BaseModel
from typing import List, Optional

class Environment(BaseModel):
    id: str
    name: str
    url: str
    username: str
    password: str  # Will be masked in UI
    is_default: bool = False

class GlobalSettings(BaseModel):
    backup_path: str
    default_environment_id: Optional[str] = None

class AppConfig(BaseModel):
    environments: List[Environment] = []
    settings: GlobalSettings
```

### 1.2 Configuration Manager

A new class `ConfigManager` in `backend/src/core/config_manager.py` will handle:
- Loading/saving `AppConfig` to `config.json`.
- CRUD operations for environments.
- Updating global settings.
- Validating backup paths and Superset URLs.
- Enforcing system invariants (e.g., at least one environment configured).

### 1.3 API Endpoints

New router `backend/src/api/routes/settings.py`:

- `GET /settings`: Retrieve all settings (masking passwords).
- `PATCH /settings/global`: Update global settings (backup path, etc.).
- `GET /settings/environments`: List all environments.
- `POST /settings/environments`: Add a new environment.
- `PUT /settings/environments/{id}`: Update an environment.
- `DELETE /settings/environments/{id}`: Remove an environment.
- `POST /settings/environments/{id}/test`: Test connection to a specific environment.

### 1.4 Integration

- Update `backend/src/dependencies.py` to provide a singleton `ConfigManager`.
- Refactor `superset_tool/utils/init_clients.py` to fetch environment details from `ConfigManager` instead of hardcoded values.

## 2. Frontend Implementation

### 2.1 Settings Page

- Create `frontend/src/pages/Settings.svelte`.
- Add a "Settings" link to the main navigation (likely in `App.svelte`).

### 2.2 Components

- **EnvironmentList**: Displays a table/list of configured environments with Edit/Delete buttons.
- **EnvironmentForm**: A modal or inline form for adding/editing environments.
- **GlobalSettingsForm**: Form for editing the backup storage path.

### 2.3 API Integration

- Add functions to `frontend/src/lib/api.js` for interacting with the new settings endpoints.

## 3. Workflow Diagram

```mermaid
graph TD
    UI[Web UI - Settings Page] --> API[FastAPI Settings Router]
    API --> CM[Config Manager]
    CM --> JSON[(config.json)]
    CM -->|Test Connection| SS[Superset Instance]

    Plugins[Plugins - Backup/Migration] -->|Get Env/Path| CM
```

## 4. Implementation Steps

1. **Backend Core**:
   - Create `config_models.py` and `config_manager.py`.
   - Implement file-based persistence.
2. **Backend API**:
   - Implement `settings.py` router.
   - Register router in `app.py`.
3. **Frontend UI**:
   - Create `Settings.svelte` and necessary components.
   - Implement API calls and state management.
4. **Refactoring**:
   - Update `init_clients.py` to use the new configuration system.
   - Ensure existing plugins (Backup, Migration) use the configured settings.
5. **Validation**:
   - Add path existence/write checks for backup storage.
   - Add URL/Connection checks for Superset environments.
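
To make section 1.2 concrete, here is a minimal sketch of the file-based persistence and the backup-path check from step 5. It uses plain dicts where the real implementation would use `AppConfig`, and the atomic-write detail and helper names are assumptions of this sketch, not commitments of the plan:

```python
# Minimal sketch only. Uses dicts instead of AppConfig for brevity; the
# atomic-write behavior and helper names are assumptions, not final design.
import json
import os
from pathlib import Path


class ConfigManager:
    def __init__(self, config_path: str = "config.json") -> None:
        self._path = Path(config_path)

    def load(self) -> dict:
        # Return an empty config when the file does not exist yet.
        if not self._path.exists():
            return {"environments": [], "settings": {"backup_path": ""}}
        return json.loads(self._path.read_text(encoding="utf-8"))

    def save(self, config: dict) -> None:
        # Write to a temp file first, then replace, so a crash mid-write
        # cannot corrupt config.json.
        tmp = self._path.with_suffix(".tmp")
        tmp.write_text(json.dumps(config, indent=2), encoding="utf-8")
        tmp.replace(self._path)

    @staticmethod
    def validate_backup_path(path: str) -> bool:
        # The backup path must be an existing, writable directory (FR-007).
        return os.path.isdir(path) and os.access(path, os.W_OK)
```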
diff --git a/specs/002-app-settings/spec.md b/specs/002-app-settings/spec.md
new file mode 100755
index 0000000..d2183e0
--- /dev/null
+++ b/specs/002-app-settings/spec.md
@@ -0,0 +1,78 @@
+# Feature Specification: Add web application settings mechanism
+
+**Feature Branch**: `002-app-settings`
+**Created**: 2025-12-20
+**Status**: Draft
+**Input**: User description: "Let's add a full-fledged settings mechanism to the web application. What is definitely needed: 1. An interface for adding environments (different Superset servers) 2. An interface for configuring the backup file storage"
+
+## User Scenarios & Testing *(mandatory)*
+
+### User Story 1 - Manage Superset Environments (Priority: P1)
+
+As an administrator, I want to add, edit, and remove Superset environment configurations (URL, credentials, name) so that the application can interact with multiple Superset instances.
+
+**Why this priority**: This is the core functionality required for the tool to be useful across different stages (dev/prod) or different Superset clusters.
+
+**Independent Test**: Can be fully tested by adding a new environment, verifying it appears in the list, and then deleting it.
+
+**Acceptance Scenarios**:
+
+1. **Given** the settings page is open, **When** I enter valid Superset connection details and save, **Then** the new environment is added to the list of available targets.
+2. **Given** an existing environment, **When** I update its URL and save, **Then** the system uses the new URL for subsequent operations.
+3. **Given** an existing environment, **When** I delete it, **Then** it is no longer available for selection in other parts of the application.
+
+---
+
+### User Story 2 - Configure Backup Storage (Priority: P1)
+
+As an administrator, I want to configure the file path or storage location for backups so that I can control where system backups are stored.
+
+**Why this priority**: Essential for the backup plugin to function correctly and for users to manage disk space/storage locations.
+
+**Independent Test**: Can be tested by setting a backup path and verifying that the system validates the path's existence or accessibility.
+
+**Acceptance Scenarios**:
+
+1. **Given** the storage settings section, **When** I provide a valid local or network path, **Then** the system saves this as the default backup location.
+2. **Given** an invalid or inaccessible path, **When** I try to save, **Then** the system displays an error message and does not update the setting.
+
+---
+
+### Edge Cases
+
+- **Duplicate Environments**: System MUST prevent adding an environment with a name that already exists.
+- **Invalid Credentials**: System MUST validate connection on save and prevent saving if credentials are invalid.
+- **Path Permissions**: System MUST verify write permissions for the backup path and display an error if inaccessible.
+
+## Requirements *(mandatory)*
+
+### Functional Requirements
+
+- **FR-001**: System MUST provide a dedicated settings interface in the web UI.
+- **FR-002**: System MUST allow users to create multiple named "Environments" for Superset.
+- **FR-003**: Each Environment MUST include: Name, Base URL, and Authentication details (e.g., Username/Password or API Key).
+- **FR-004**: System MUST allow setting a global "Backup Storage Path".
+- **FR-005**: System MUST persist these settings across application restarts.
+- **FR-006**: System MUST validate the Superset URL format before saving.
+- **FR-007**: System MUST verify that the Backup Storage Path is writable by the application.
+- **FR-008**: System MUST allow selecting a "Default" environment for operations. +- **FR-009**: System MUST enforce that at least one environment is configured before allowing Superset-related tasks (e.g., backup, migration). + +### System Invariants (Constitution Check) + +- **INV-001**: Sensitive credentials (passwords/keys) MUST NOT be displayed in plain text after being saved. +- **INV-002**: At least one environment MUST be configured for the application to perform Superset-related tasks. + +### Key Entities *(include if feature involves data)* + +- **Environment**: Represents a Superset instance. Attributes: Unique ID, Name, URL, Credentials, IsDefault flag. +- **AppConfig**: Singleton entity representing global settings. Attributes: BackupPath, DefaultEnvironmentID. + +## Success Criteria *(mandatory)* + +### Measurable Outcomes + +- **SC-001**: Users can add a new Superset environment in under 30 seconds. +- **SC-002**: 100% of saved environments are immediately available for use in backup/migration tasks. +- **SC-003**: System prevents saving invalid backup paths 100% of the time. +- **SC-004**: Configuration changes take effect without requiring a manual restart of the backend services. diff --git a/specs/002-app-settings/tasks.md b/specs/002-app-settings/tasks.md new file mode 100644 index 0000000..90c2621 --- /dev/null +++ b/specs/002-app-settings/tasks.md @@ -0,0 +1,142 @@ +--- + +description: "Task list for implementing the web application settings mechanism" +--- + +# Tasks: Web Application Settings Mechanism + +**Input**: Design documents from `specs/002-app-settings/` +**Prerequisites**: plan.md (required), spec.md (required for user stories) + +**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story. + +## Format: `[ID] [P?] [Story] Description` + +- **[P]**: Can run in parallel (different files, no dependencies) +- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3) +- Include exact file paths in descriptions + +## Phase 1: Setup (Shared Infrastructure) + +**Purpose**: Project initialization and basic structure + +- [x] T001 Create project structure for settings management in `backend/src/core/` and `backend/src/api/routes/` +- [x] T002 [P] Initialize `frontend/src/pages/Settings.svelte` placeholder + +--- + +## Phase 2: Foundational (Blocking Prerequisites) + +**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented + +**⚠️ CRITICAL**: No user story work can begin until this phase is complete + +- [x] T003 Implement configuration models in `backend/src/core/config_models.py` +- [x] T004 Implement `ConfigManager` for JSON persistence in `backend/src/core/config_manager.py` +- [x] T005 [P] Update `backend/src/dependencies.py` to provide `ConfigManager` singleton +- [x] T006 [P] Setup API routing for settings in `backend/src/api/routes/settings.py` and register in `backend/src/app.py` + +**Checkpoint**: Foundation ready - user story implementation can now begin in parallel + +--- + +## Phase 3: User Story 1 - Manage Superset Environments (Priority: P1) 🎯 MVP + +**Goal**: Add, edit, and remove Superset environment configurations (URL, credentials, name) so that the application can interact with multiple Superset instances. + +**Independent Test**: Add a new environment, verify it appears in the list, and then delete it. 
+ +### Implementation for User Story 1 + +- [x] T007 [P] [US1] Implement environment CRUD logic in `backend/src/core/config_manager.py` +- [x] T008 [US1] Implement environment API endpoints in `backend/src/api/routes/settings.py` +- [x] T009 [P] [US1] Add environment API methods to `frontend/src/lib/api.js` +- [x] T010 [US1] Implement environment list and form UI in `frontend/src/pages/Settings.svelte` +- [x] T011 [US1] Implement connection test logic in `backend/src/api/routes/settings.py` + +**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently + +--- + +## Phase 4: User Story 2 - Configure Backup Storage (Priority: P1) + +**Goal**: Configure the file path or storage location for backups so that I can control where system backups are stored. + +**Independent Test**: Set a backup path and verify that the system validates the path's existence or accessibility. + +### Implementation for User Story 2 + +- [x] T012 [P] [US2] Implement global settings update logic in `backend/src/core/config_manager.py` +- [x] T013 [US2] Implement global settings API endpoints in `backend/src/api/routes/settings.py` +- [x] T014 [P] [US2] Add global settings API methods to `frontend/src/lib/api.js` +- [x] T015 [US2] Implement backup storage configuration UI in `frontend/src/pages/Settings.svelte` +- [x] T016 [US2] Add path validation and write permission checks in `backend/src/api/routes/settings.py` + +**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently + +--- + +## Phase 5: Polish & Cross-Cutting Concerns + +**Purpose**: Improvements that affect multiple user stories + +- [x] T017 Refactor `superset_tool/utils/init_clients.py` to use `ConfigManager` for environment details +- [x] T018 Update existing plugins (Backup, Migration) to fetch settings from `ConfigManager` +- [x] T019 [P] Add password masking in `backend/src/api/routes/settings.py` and UI +- [x] T020 [P] Add "Settings" link to navigation in `frontend/src/App.svelte` +- [x] T021 [P] Documentation updates for settings mechanism in `docs/` +- [x] T022 [US1] Enforce INV-002 (at least one environment) in `backend/src/core/config_manager.py` and UI + +--- + +## Dependencies & Execution Order + +### Phase Dependencies + +- **Setup (Phase 1)**: No dependencies - can start immediately +- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories +- **User Stories (Phase 3+)**: All depend on Foundational phase completion + - User stories can then proceed in parallel (if staffed) + - Or sequentially in priority order (P1 → P2 → P3) +- **Polish (Final Phase)**: Depends on all desired user stories being complete + +### User Story Dependencies + +- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories +- **User Story 2 (P1)**: Can start after Foundational (Phase 2) - Independent of US1 + +### Parallel Opportunities + +- All Setup tasks marked [P] can run in parallel +- All Foundational tasks marked [P] can run in parallel (within Phase 2) +- Once Foundational phase completes, all user stories can start in parallel +- Models and API methods within a story marked [P] can run in parallel + +--- + +## Parallel Example: User Story 1 + +```bash +# Launch backend and frontend tasks for User Story 1 together: +Task: "Implement environment CRUD logic in backend/src/core/config_manager.py" +Task: "Add environment API methods to frontend/src/lib/api.js" +``` + +--- + +## Implementation Strategy + +### MVP First (User Story 1 Only) + 
+1. Complete Phase 1: Setup +2. Complete Phase 2: Foundational (CRITICAL - blocks all stories) +3. Complete Phase 3: User Story 1 +4. **STOP and VALIDATE**: Test User Story 1 independently +5. Deploy/demo if ready + +### Incremental Delivery + +1. Complete Setup + Foundational → Foundation ready +2. Add User Story 1 → Test independently → Deploy/Demo (MVP!) +3. Add User Story 2 → Test independently → Deploy/Demo +4. Each story adds value without breaking previous stories diff --git a/specs/003-project-launch-script/checklists/requirements.md b/specs/003-project-launch-script/checklists/requirements.md new file mode 100644 index 0000000..0e81d43 --- /dev/null +++ b/specs/003-project-launch-script/checklists/requirements.md @@ -0,0 +1,34 @@ +# Specification Quality Checklist: Project Launch Script + +**Purpose**: Validate specification completeness and quality before proceeding to planning +**Created**: 2025-12-20 +**Feature**: [Link to spec.md](../spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Validated against the updated spec. diff --git a/specs/003-project-launch-script/contracts/cli.md b/specs/003-project-launch-script/contracts/cli.md new file mode 100644 index 0000000..03fe38c --- /dev/null +++ b/specs/003-project-launch-script/contracts/cli.md @@ -0,0 +1,28 @@ +# CLI Contract: Project Launch Script + +## Command + +`./run.sh [options]` + +## Arguments + +| Argument | Description | Default | +|----------|-------------|---------| +| `--help` | Show help message | N/A | +| `--skip-install` | Skip dependency checks and installation | false | + +## Environment Variables + +| Variable | Description | Default | +|----------|-------------|---------| +| `BACKEND_PORT` | Port for the backend server | 8000 | +| `FRONTEND_PORT` | Port for the frontend server | 5173 | + +## Exit Codes + +| Code | Meaning | +|------|---------| +| 0 | Success (on graceful shutdown) | +| 1 | Missing dependencies (python/npm) | +| 2 | Installation failure | +| 130 | Terminated by SIGINT (Ctrl+C) | diff --git a/specs/003-project-launch-script/data-model.md b/specs/003-project-launch-script/data-model.md new file mode 100644 index 0000000..97a4eef --- /dev/null +++ b/specs/003-project-launch-script/data-model.md @@ -0,0 +1,26 @@ +# Data Model: Project Launch Script + +## Entities + +N/A - This feature is a utility script and does not involve persistent data storage or complex data structures. + +## Process State + +The script manages the lifecycle of two primary processes: + +1. **Backend Process**: + - Command: `python3 -m uvicorn src.app:app` + - Port: 8000 (default) + - Environment: Python Virtual Environment (`.venv`) + +2. 
**Frontend Process**: + - Command: `npm run dev` + - Port: 5173 (default) + - Environment: Node.js / `node_modules` + +## Validation Rules + +- `python3` must be version 3.9 or higher. +- `npm` must be available. +- `backend/requirements.txt` must exist. +- `frontend/package.json` must exist. diff --git a/specs/003-project-launch-script/plan.md b/specs/003-project-launch-script/plan.md new file mode 100644 index 0000000..ee9c0cf --- /dev/null +++ b/specs/003-project-launch-script/plan.md @@ -0,0 +1,72 @@ +# Implementation Plan: Project Launch Script + +**Branch**: `003-project-launch-script` | **Date**: 2025-12-20 | **Spec**: [specs/003-project-launch-script/spec.md](specs/003-project-launch-script/spec.md) +**Input**: Feature specification from `/specs/003-project-launch-script/spec.md` + +## Summary + +Create a root-level bash script (`run.sh`) to automate the setup and concurrent execution of the backend (FastAPI) and frontend (Svelte/Vite) development servers. The script will handle dependency checks, installation, and graceful termination of both processes. + +## Technical Context + +**Language/Version**: Python 3.9+, Node.js 18+ +**Primary Dependencies**: `uvicorn`, `npm`, `bash` +**Storage**: N/A +**Testing**: Manual verification of service availability; `pytest` for backend logic if any. +**Target Platform**: Linux +**Project Type**: Web application (frontend + backend) +**Performance Goals**: Services accessible within 30 seconds. +**Constraints**: Must handle `SIGINT` (Ctrl+C) to kill all child processes. +**Scale/Scope**: Developer utility script. + +## Constitution Check + +*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.* + +| Gate | Status | Rationale | +|------|--------|-----------| +| Python 3.9+ | PASS | Backend uses Python 3.9+ | +| Node.js 18+ | PASS | Frontend uses Node.js 18+ | +| Project Structure | PASS | Follows `backend/`, `frontend/` structure | +| Executable from root | PASS | Planned as `./run.sh` | + +**Post-Design Re-evaluation**: Design artifacts (research, data-model, contracts) confirm compliance with all project principles. No violations found. + +## Project Structure + +### Documentation (this feature) + +```text +specs/003-project-launch-script/ +├── plan.md # This file +├── research.md # Phase 0 output +├── data-model.md # Phase 1 output +├── quickstart.md # Phase 1 output +├── contracts/ # Phase 1 output +└── tasks.md # Phase 2 output +``` + +### Source Code (repository root) + +```text +run.sh # New launch script + +backend/ +├── src/ +│ └── app.py +└── requirements.txt + +frontend/ +├── src/ +└── package.json +``` + +**Structure Decision**: Option 2: Web application. The script will reside in the root to orchestrate both directories. + +## Complexity Tracking + +> **Fill ONLY if Constitution Check has violations that must be justified** + +| Violation | Why Needed | Simpler Alternative Rejected Because | +|-----------|------------|-------------------------------------| +| None | - | - | diff --git a/specs/003-project-launch-script/quickstart.md b/specs/003-project-launch-script/quickstart.md new file mode 100644 index 0000000..f3b8a74 --- /dev/null +++ b/specs/003-project-launch-script/quickstart.md @@ -0,0 +1,39 @@ +# Quickstart: Project Launch Script + +## Prerequisites + +- Linux/macOS environment +- Python 3.9+ +- Node.js 18+ + +## Installation + +No installation required. The script is part of the repository. + +## Usage + +1. Navigate to the project root: + ```bash + cd ss-tools + ``` + +2. 
Make the script executable (if not already): + ```bash + chmod +x run.sh + ``` + +3. Run the script: + ```bash + ./run.sh + ``` + +## What it does + +1. Checks for `python3` and `npm`. +2. Sets up a Python virtual environment in `backend/.venv` if it doesn't exist. +3. Installs backend dependencies from `backend/requirements.txt`. +4. Installs frontend dependencies if `frontend/node_modules` is missing. +5. Starts the backend server on port 8000. +6. Starts the frontend server on port 5173. +7. Streams logs from both services to the terminal. +8. Gracefully stops both services when you press `Ctrl+C`. diff --git a/specs/003-project-launch-script/research.md b/specs/003-project-launch-script/research.md new file mode 100644 index 0000000..75dbf24 --- /dev/null +++ b/specs/003-project-launch-script/research.md @@ -0,0 +1,57 @@ +# Research: Project Launch Script + +## Decision: Bash Script with `trap` and Background Processes + +### Rationale +A bash script is the most portable and lightweight way to meet the requirement of a single command (`./run.sh`) without introducing additional process management dependencies like `pm2` or `concurrently` (unless we want to use `npm` to run everything, but the user asked for a script). + +### Alternatives Considered +1. **`concurrently` (NPM package)**: + - *Pros*: Easy to use, handles output well. + - *Cons*: Requires `npm install` before it can even run. The goal is a script that *handles* the install. +2. **`docker-compose`**: + - *Pros*: Perfect for multi-service orchestration. + - *Cons*: Overkill for a simple local dev environment; requires Docker to be installed and configured. +3. **Python script**: + - *Pros*: Better cross-platform support (Windows/Linux). + - *Cons*: Slightly more verbose for process management than bash on Linux. + +## Technical Findings + +### 1. Concurrent Execution & Graceful Shutdown +To run processes concurrently and handle Ctrl+C: +```bash +#!/bin/bash + +# Cleanup function +cleanup() { + echo "Stopping services..." + kill $BACKEND_PID $FRONTEND_PID + exit +} + +# Trap SIGINT (Ctrl+C) +trap cleanup SIGINT + +# Start Backend +cd backend && python3 -m uvicorn src.app:app --reload --port 8000 & +BACKEND_PID=$! + +# Start Frontend +cd frontend && npm run dev -- --port 5173 & +FRONTEND_PID=$! + +# Wait for processes +wait +``` + +### 2. Dependency Checking +- **Backend**: Check for `venv`. If missing, create it and install requirements. +- **Frontend**: Check for `frontend/node_modules`. If missing, run `npm install`. + +### 3. Environment Validation +- Check `command -v python3` and `command -v npm`. + +## Best Practices +- Use colors for logs to distinguish between Backend and Frontend output. +- Use `set -e` to exit on error during setup, but disable it or handle it carefully when starting background processes. 
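
For comparison with the "Python script" alternative considered above, here is the same start/trap/wait pattern sketched with Python's standard library. The commands and ports mirror the bash version; everything else is an assumption of this sketch:

```python
# Rough Python analogue of the bash trap/cleanup pattern above, shown only to
# illustrate the rejected "Python script" alternative.
import signal
import subprocess
import sys

procs: list = []


def cleanup(signum, frame):
    # Analogue of the bash cleanup() function triggered by `trap ... SIGINT`.
    print("Stopping services...")
    for p in procs:
        p.terminate()
    sys.exit(0)


signal.signal(signal.SIGINT, cleanup)

# Start backend and frontend as child processes (analogue of `cmd &`).
procs.append(subprocess.Popen(
    ["python3", "-m", "uvicorn", "src.app:app", "--reload", "--port", "8000"],
    cwd="backend",
))
procs.append(subprocess.Popen(
    ["npm", "run", "dev", "--", "--port", "5173"],
    cwd="frontend",
))

# Analogue of `wait`: block until both children exit.
for p in procs:
    p.wait()
```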
diff --git a/specs/003-project-launch-script/spec.md b/specs/003-project-launch-script/spec.md
new file mode 100644
index 0000000..7110242
--- /dev/null
+++ b/specs/003-project-launch-script/spec.md
@@ -0,0 +1,54 @@
+# Feature Specification: Project Launch Script
+
+**Feature Branch**: `003-project-launch-script`
+**Created**: 2025-12-20
+**Status**: Draft
+**Input**: User description: "Let's create a script to launch the project"
+
+## User Scenarios & Testing *(mandatory)*
+
+### User Story 1 - Launch Project (Priority: P1)
+
+As a developer, I want to launch the entire project (backend and frontend) with a single command so that I can start working quickly without manually running multiple commands in different terminals.
+
+**Why this priority**: This is the core functionality requested. It simplifies the development workflow.
+
+**Independent Test**: Can be fully tested by running the script and verifying that both backend and frontend services are accessible.
+
+**Acceptance Scenarios**:
+
+1. **Given** the project is cloned and I am in the root directory, **When** I run the launch script, **Then** the script checks for dependencies, installs them if missing, and starts both backend and frontend servers.
+2. **Given** the servers are running, **When** I press Ctrl+C, **Then** both backend and frontend processes terminate gracefully.
+3. **Given** dependencies are missing, **When** I run the script, **Then** it installs them before starting the servers.
+
+---
+
+### Edge Cases
+
+- What happens when a port is already in use? The underlying tools (uvicorn/vite) will fail with their own error output; the script should surface that output rather than suppress it.
+- What happens if `python3` or `npm` is missing? The script should fail with a clear error message.
+
+## Requirements *(mandatory)*
+
+### Functional Requirements
+
+- **FR-001**: The script MUST be executable from the project root (e.g., `./run.sh`).
+- **FR-002**: The script MUST check if `python3` and `npm` are available in the environment.
+- **FR-003**: The script MUST check for and install backend dependencies from `backend/requirements.txt` if they are missing or outdated.
+- **FR-004**: The script MUST check for and install frontend dependencies from `frontend/package.json` if `node_modules` is missing.
+- **FR-005**: The script MUST start the backend application server in development mode.
+- **FR-006**: The script MUST start the frontend application server in development mode.
+- **FR-007**: The script MUST run both backend and frontend processes concurrently.
+- **FR-008**: The script MUST handle `SIGINT` (Ctrl+C) to terminate both processes gracefully.
+
+### Key Entities *(include if feature involves data)*
+
+N/A
+
+## Success Criteria *(mandatory)*
+
+### Measurable Outcomes
+
+- **SC-001**: Developers can start the full stack with a single command.
+- **SC-002**: Both backend and frontend services are accessible via their configured network ports within 30 seconds of running the script (assuming dependencies are installed).
+- **SC-003**: 100% of child processes are terminated when the script is stopped.
diff --git a/specs/003-project-launch-script/tasks.md b/specs/003-project-launch-script/tasks.md new file mode 100644 index 0000000..c51276d --- /dev/null +++ b/specs/003-project-launch-script/tasks.md @@ -0,0 +1,135 @@ +--- + +description: "Task list for Project Launch Script implementation" +--- + +# Tasks: Project Launch Script + +**Input**: Design documents from `/specs/003-project-launch-script/` +**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/ + +**Tests**: Manual verification as per spec.md. No automated test suite requested for this utility script. + +**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story. + +## Format: `[ID] [P?] [Story] Description` + +- **[P]**: Can run in parallel (different files, no dependencies) +- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3) +- Include exact file paths in descriptions + +## Path Conventions + +- **Web app**: `backend/src/`, `frontend/src/`, `run.sh` at root + +## Phase 1: Setup (Shared Infrastructure) + +**Purpose**: Project initialization and basic structure + +- [x] T001 Create `run.sh` with basic structure and `--help` message in `run.sh` +- [x] T002 [P] Implement environment validation for `python3` (3.9+) and `npm` in `run.sh` + +--- + +## Phase 2: Foundational (Blocking Prerequisites) + +**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented + +**⚠️ CRITICAL**: No user story work can begin until this phase is complete + +- [x] T003 Implement backend dependency check and installation logic (venv + requirements.txt) in `run.sh` +- [x] T004 Implement frontend dependency check and installation logic (node_modules) in `run.sh` +- [x] T005 Implement `SIGINT` trap and `cleanup` function for graceful shutdown in `run.sh` + +**Checkpoint**: Foundation ready - user story implementation can now begin + +--- + +## Phase 3: User Story 1 - Launch Project (Priority: P1) 🎯 MVP + +**Goal**: Launch backend and frontend concurrently with dependency management and graceful shutdown. + +**Independent Test**: Run `./run.sh` from root. Verify both services start, are accessible on their ports, and both stop when Ctrl+C is pressed. + +### Implementation for User Story 1 + +- [x] T006 [US1] Implement backend server startup logic with `BACKEND_PORT` support in `run.sh` +- [x] T007 [US1] Implement frontend server startup logic with `FRONTEND_PORT` support in `run.sh` +- [x] T008 [US1] Implement concurrent execution using background processes and `wait` in `run.sh` +- [x] T009 [US1] Implement `--skip-install` flag logic to bypass dependency checks in `run.sh` + +**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently. 
+ +--- + +## Phase 4: Polish & Cross-Cutting Concerns + +**Purpose**: Improvements that affect multiple user stories + +- [x] T010 [P] Add color-coded logging to distinguish between backend and frontend output in `run.sh` +- [x] T011 [P] Update project `README.md` with `run.sh` usage instructions +- [x] T012 Run `quickstart.md` validation for `run.sh` + +--- + +## Dependencies & Execution Order + +### Phase Dependencies + +- **Setup (Phase 1)**: No dependencies - can start immediately +- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories +- **User Stories (Phase 3+)**: All depend on Foundational phase completion +- **Polish (Final Phase)**: Depends on all desired user stories being complete + +### User Story Dependencies + +- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories + +### Within Each User Story + +- Core implementation before integration +- Story complete before moving to next priority + +### Parallel Opportunities + +- T002 can run in parallel with T001 (though both edit `run.sh`, they are logically independent) +- T010, T011 can run in parallel + +--- + +## Parallel Example: User Story 1 + +```bash +# Implementation tasks for User Story 1 are mostly sequential in run.sh +# but can be developed in separate blocks: +Task: "Implement backend server startup logic with BACKEND_PORT support in run.sh" +Task: "Implement frontend server startup logic with FRONTEND_PORT support in run.sh" +``` + +--- + +## Implementation Strategy + +### MVP First (User Story 1 Only) + +1. Complete Phase 1: Setup +2. Complete Phase 2: Foundational (CRITICAL - blocks all stories) +3. Complete Phase 3: User Story 1 +4. **STOP and VALIDATE**: Test User Story 1 independently +5. Deploy/demo if ready + +### Incremental Delivery + +1. Complete Setup + Foundational → Foundation ready +2. Add User Story 1 → Test independently → Deploy/Demo (MVP!) +3. 
Each story adds value without breaking previous stories + +--- + +## Notes + +- [P] tasks = different files or independent logic blocks +- [Story] label maps task to specific user story for traceability +- Each user story should be independently completable and testable +- Commit after each task or logical group +- Stop at any checkpoint to validate story independently diff --git a/specs/004-integrate-svelte-kit/checklists/requirements.md b/specs/004-integrate-svelte-kit/checklists/requirements.md new file mode 100644 index 0000000..f14c468 --- /dev/null +++ b/specs/004-integrate-svelte-kit/checklists/requirements.md @@ -0,0 +1,34 @@ +# Specification Quality Checklist: Integrate SvelteKit + +**Purpose**: Validate specification completeness and quality before proceeding to planning +**Created**: 2025-12-20 +**Feature**: [Link to spec.md](../spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Spec updated to assume SPA mode and standard fetch, removing clarification needs. diff --git a/specs/004-integrate-svelte-kit/contracts/api.md b/specs/004-integrate-svelte-kit/contracts/api.md new file mode 100644 index 0000000..fdbc49a --- /dev/null +++ b/specs/004-integrate-svelte-kit/contracts/api.md @@ -0,0 +1,61 @@ +# API Contracts: SvelteKit Frontend + +The SvelteKit frontend will interact with the following existing backend API endpoints. + +## Settings API (`/api/settings`) + +### Get All Settings +- **Endpoint**: `GET /api/settings/` +- **Response**: `AppConfig` (JSON) +- **Usage**: Load initial configuration for the application. + +### Update Global Settings +- **Endpoint**: `PATCH /api/settings/global` +- **Request Body**: `GlobalSettings` (JSON) +- **Response**: `GlobalSettings` (JSON) +- **Usage**: Save changes to global settings. + +### List Environments +- **Endpoint**: `GET /api/settings/environments` +- **Response**: `List[Environment]` (JSON) +- **Usage**: Display configured Superset environments. + +### Add Environment +- **Endpoint**: `POST /api/settings/environments` +- **Request Body**: `Environment` (JSON) +- **Response**: `Environment` (JSON) +- **Usage**: Create a new environment configuration. + +### Update Environment +- **Endpoint**: `PUT /api/settings/environments/{id}` +- **Request Body**: `Environment` (JSON) +- **Response**: `Environment` (JSON) +- **Usage**: Modify an existing environment. + +### Delete Environment +- **Endpoint**: `DELETE /api/settings/environments/{id}` +- **Response**: `{"message": "..."}` +- **Usage**: Remove an environment. 
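
A hypothetical call sequence for the environment endpoints above, e.g. from a backend integration test (the base URL and payload fields are assumptions based on the `Environment` model in plan.md, not part of the contract itself):

```python
# Hypothetical usage of the environment CRUD endpoints; field names follow the
# Environment model from plan.md and are assumptions, not the contract.
import httpx

BASE = "http://localhost:8000/api/settings"

env = {"id": "dev", "name": "Dev", "url": "https://superset.dev.local",
       "username": "admin", "password": "secret", "is_default": True}

created = httpx.post(f"{BASE}/environments", json=env).json()              # Add
envs = httpx.get(f"{BASE}/environments").json()                            # List
env["name"] = "Development"
updated = httpx.put(f"{BASE}/environments/{env['id']}", json=env).json()  # Update
httpx.delete(f"{BASE}/environments/{env['id']}")                          # Delete
```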
+ +### Test Connection +- **Endpoint**: `POST /api/settings/environments/{id}/test` +- **Response**: `{"status": "success/error", "message": "..."}` +- **Usage**: Verify connectivity to a Superset instance. + +## Plugins API (`/api/plugins`) + +### List Plugins +- **Endpoint**: `GET /api/plugins/` +- **Response**: `List[PluginConfig]` (JSON) +- **Usage**: Display available plugins on the Dashboard. + +## Tasks API (`/api/tasks`) +*(Inferred from file list, used for running plugin tasks)* + +### List Tasks +- **Endpoint**: `GET /api/tasks/` +- **Usage**: Show active or historical tasks. + +### Run Task +- **Endpoint**: `POST /api/tasks/{plugin_id}` +- **Usage**: Execute a plugin-specific task. diff --git a/specs/004-integrate-svelte-kit/data-model.md b/specs/004-integrate-svelte-kit/data-model.md new file mode 100644 index 0000000..4d5d791 --- /dev/null +++ b/specs/004-integrate-svelte-kit/data-model.md @@ -0,0 +1,40 @@ +# Data Model: SvelteKit Integration + +## Entities + +### Route +Represents a navigable URL in the application. + +| Field | Type | Description | +|-------|------|-------------| +| `path` | String | The URL path (e.g., `/`, `/settings`) | +| `component` | Svelte Component | The page component to render | +| `data_requirements` | List | Backend data needed for this route | +| `layout` | Layout | The layout wrapping this route | + +**Validation Rules**: +- `path` must be unique. +- `path` must follow SvelteKit file-based routing conventions. + +### Layout +Represents a shared UI structure. + +| Field | Type | Description | +|-------|------|-------------| +| `name` | String | Identifier for the layout (e.g., `default`) | +| `components` | List | Shared components (Header, Footer, Sidebar) | +| `slot` | Placeholder | Where the route content is injected | + +## State Transitions + +### Navigation +1. **Trigger**: User clicks link or `goto(path)` is called. +2. **Action**: SvelteKit router intercepts the request. +3. **Data Fetching**: `load` function in `+page.ts` or `+layout.ts` is executed. +4. **Rendering**: The new page component is rendered within the layout. +5. **URL Update**: Browser history is updated. + +### Error Handling +1. **Trigger**: Navigation to non-existent path or API failure. +2. **Action**: SvelteKit renders `+error.svelte`. +3. **Display**: User-friendly error message with recovery options. diff --git a/specs/004-integrate-svelte-kit/plan.md b/specs/004-integrate-svelte-kit/plan.md new file mode 100644 index 0000000..cb1b1f9 --- /dev/null +++ b/specs/004-integrate-svelte-kit/plan.md @@ -0,0 +1,70 @@ +# Implementation Plan: Integrate SvelteKit + +**Branch**: `004-integrate-svelte-kit` | **Date**: 2025-12-20 | **Spec**: [specs/004-integrate-svelte-kit/spec.md](specs/004-integrate-svelte-kit/spec.md) +**Input**: Feature specification from `/specs/004-integrate-svelte-kit/spec.md` + +## Summary + +Integrate SvelteKit as the primary frontend framework to provide seamless navigation, improved data loading, and a unified layout. The application will be configured as a Static Single Page Application (SPA) to be served by the existing Python backend, preserving all existing functionality while leveraging modern framework features like file-based routing and shared layouts. 
+ +## Technical Context + +**Language/Version**: Python 3.9+, Node.js 18+ +**Primary Dependencies**: SvelteKit, FastAPI, Tailwind CSS (inferred from existing frontend) +**Storage**: N/A (Frontend integration) +**Testing**: pytest (Backend), Vitest/Playwright (Frontend - SvelteKit defaults) +**Target Platform**: Linux server (SPA served by backend) +**Project Type**: Web application (frontend + backend) +**Performance Goals**: Page transition time < 200ms (SC-001) +**Constraints**: Must be deployable as a Static SPA (FR-003), no Node.js server in production (Assumptions) +**Scale/Scope**: Migration of existing Dashboard and Settings pages + +## Constitution Check + +*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.* + +- **Principle Compliance**: The project constitution is currently in a template state. No specific violations identified. +- **Architecture Alignment**: The move to SvelteKit aligns with the goal of using modern frontend patterns while maintaining the "SPA served by backend" constraint. + +## Project Structure + +### Documentation (this feature) + +```text +specs/004-integrate-svelte-kit/ +├── plan.md # This file (/speckit.plan command output) +├── research.md # Phase 0 output (/speckit.plan command) +├── data-model.md # Phase 1 output (/speckit.plan command) +├── quickstart.md # Phase 1 output (/speckit.plan command) +├── contracts/ # Phase 1 output (/speckit.plan command) +└── tasks.md # Phase 2 output (/speckit.tasks command - NOT created by /speckit.plan) +``` + +### Source Code (repository root) + +```text +backend/ +├── src/ +│ ├── models/ +│ ├── services/ +│ └── api/ +└── tests/ + +frontend/ +├── src/ +│ ├── lib/ +│ ├── routes/ # SvelteKit file-based routing +│ └── app.html +├── static/ +└── tests/ +``` + +**Structure Decision**: Option 2: Web application. The project already has `backend/` and `frontend/` directories. SvelteKit will be integrated into the `frontend/` directory, replacing the current Svelte setup. + +## Complexity Tracking + +> **Fill ONLY if Constitution Check has violations that must be justified** + +| Violation | Why Needed | Simpler Alternative Rejected Because | +|-----------|------------|-------------------------------------| +| None | N/A | N/A | diff --git a/specs/004-integrate-svelte-kit/quickstart.md b/specs/004-integrate-svelte-kit/quickstart.md new file mode 100644 index 0000000..444c563 --- /dev/null +++ b/specs/004-integrate-svelte-kit/quickstart.md @@ -0,0 +1,51 @@ +# Quickstart: SvelteKit Integration + +This guide provides instructions for setting up and running the SvelteKit frontend integrated with the FastAPI backend. + +## Prerequisites +- Node.js 18+ +- Python 3.9+ +- `npm` + +## Frontend Setup + +1. **Initialize SvelteKit**: + ```bash + cd frontend + # (Assuming migration to SvelteKit structure) + npm install + ``` + +2. **Development Mode**: + Run the SvelteKit development server: + ```bash + npm run dev + ``` + The frontend will be available at `http://localhost:5173`. + +3. **Build for Production**: + Generate the static SPA files: + ```bash + npm run build + ``` + The output will be in the `frontend/build` directory. + +## Backend Setup + +1. **Install Dependencies**: + ```bash + cd backend + pip install -r requirements.txt + ``` + +2. **Run Backend**: + ```bash + python src/app.py + ``` + The backend will serve the static frontend files from `frontend/build`. + +## Verification Steps + +1. **Navigation**: Open `http://localhost:8000` (backend URL). 
Click on "Settings" and verify the URL changes to `/settings` without a page reload. +2. **Deep Linking**: Refresh the page at `http://localhost:8000/settings`. Verify the Settings page loads correctly. +3. **Data Loading**: Verify that the Dashboard correctly lists available plugins and Settings shows the current configuration. diff --git a/specs/004-integrate-svelte-kit/research.md b/specs/004-integrate-svelte-kit/research.md new file mode 100644 index 0000000..c6db7b0 --- /dev/null +++ b/specs/004-integrate-svelte-kit/research.md @@ -0,0 +1,45 @@ +# Research: SvelteKit Integration + +## Decision: SvelteKit SPA with FastAPI Backend + +### Rationale +SvelteKit provides a robust file-based routing system and shared layout mechanism that fulfills the requirements (FR-001, FR-002, FR-004, FR-005). By using `adapter-static` in SPA mode, we can generate a set of static files that can be served by the existing FastAPI backend, satisfying the constraint of no Node.js server in production (FR-003, Assumptions). + +### Alternatives Considered +- **Vanilla Svelte (Current)**: Lacks built-in routing and layout management, leading to manual implementation overhead. +- **SvelteKit with Node.js Server**: Rejected because the project requires the Python backend to be the primary server. +- **Inertia.js**: Requires more tight coupling between backend and frontend than desired for this project. + +## Technical Implementation Details + +### SvelteKit Configuration (SPA Mode) +1. **Adapter**: Use `@sveltejs/adapter-static`. +2. **Fallback**: Configure `fallback: 'index.html'` in `svelte.config.js`. +3. **Client-Side Rendering**: Create `src/routes/+layout.ts` with: + ```typescript + export const ssr = false; + export const prerender = false; + ``` + This ensures the entire app is treated as a SPA. + +### FastAPI Backend Integration +1. **Static Files**: Mount the `frontend/build` (or `dist`) directory using `StaticFiles`. +2. **SPA Routing**: Implement a catch-all route to serve `index.html` for any non-API request. This allows SvelteKit's client-side router to handle deep links like `/settings`. + ```python + @app.get("/{full_path:path}") + async def serve_spa(full_path: str): + # Check if path exists in static files, else serve index.html + ... + ``` + +### Migration Strategy +1. **Layout**: Move shared UI (header, footer) from `App.svelte` to `src/routes/+layout.svelte`. +2. **Routes**: + - `Dashboard.svelte` -> `src/routes/+page.svelte` (or `src/routes/dashboard/+page.svelte`) + - `Settings.svelte` -> `src/routes/settings/+page.svelte` +3. **API Client**: Reuse existing `frontend/src/lib/api.js` but ensure it works within SvelteKit's load functions if needed (though for pure SPA, standard `onMount` or reactive statements also work). + +## Best Practices +- Use SvelteKit's `$lib` alias for shared components and utilities. +- Leverage `+page.ts` `load` functions for data fetching to ensure data is ready before component mount (User Story 2). +- Use SvelteKit's `goto` for programmatic navigation. 
diff --git a/specs/004-integrate-svelte-kit/spec.md b/specs/004-integrate-svelte-kit/spec.md new file mode 100644 index 0000000..a09f08b --- /dev/null +++ b/specs/004-integrate-svelte-kit/spec.md @@ -0,0 +1,89 @@ +# Feature Specification: Integrate SvelteKit + +**Feature Branch**: `004-integrate-svelte-kit` +**Created**: 2025-12-20 +**Status**: Draft +**Input**: User description: "Integrate SvelteKit into the project" + +## User Scenarios & Testing *(mandatory)* + +### User Story 1 - Seamless Navigation (Priority: P1) + +As a user, I want to navigate between different parts of the application (Dashboard, Settings) using standard URL paths so that I can bookmark pages and use the browser's back/forward buttons reliably. + +**Why this priority**: Core application usability and standard web behavior. + +**Independent Test**: Can be tested by clicking navigation links and verifying the URL changes and the correct content renders without a full page reload. + +**Acceptance Scenarios**: + +1. **Given** I am on the Dashboard, **When** I click the "Settings" link, **Then** the URL changes to `/settings` and the Settings page is displayed. +2. **Given** I am on `/settings`, **When** I refresh the page, **Then** the Settings page is still displayed (not redirected to home). + +--- + +### User Story 2 - Improved Data Loading (Priority: P2) + +As a developer, I want to use modern data loading patterns so that data is fetched efficiently before the page renders, reducing layout shifts and loading spinners. + +**Why this priority**: Improves user experience and developer productivity. + +**Independent Test**: Can be tested by observing the page load sequence and verifying that data is available to the component immediately upon mount. + +**Acceptance Scenarios**: + +1. **Given** a page requires data from the backend, **When** the page is navigated to, **Then** the data is fetched and ready before the content is fully visible. + +--- + +### User Story 3 - Unified Layout (Priority: P3) + +As a user, I want a consistent look and feel across all pages with a shared navigation bar and footer. + +**Why this priority**: Visual consistency and ease of use. + +**Independent Test**: Can be tested by navigating between pages and verifying that the header/footer remain static and do not re-render or flicker. + +**Acceptance Scenarios**: + +1. **Given** I am navigating between Dashboard and Settings, **When** the page changes, **Then** the top navigation bar remains visible and unchanged. + +--- + +### Edge Cases + +- **Invalid Routes**: When a user navigates to a non-existent URL, the system should display a user-friendly 404 error page with a link back to the dashboard. +- **API Failures during Load**: If the backend API is unavailable during a data load operation, the system should display a graceful error message or redirect to a dedicated error page. + +## Requirements *(mandatory)* + +### Functional Requirements + +- **FR-001**: System MUST use SvelteKit as the primary frontend framework. +- **FR-002**: System MUST implement file-based routing for all existing pages (Dashboard, Settings). +- **FR-003**: System MUST be deployable as a Static Single Page Application (SPA) to be served by the existing backend. +- **FR-004**: System MUST provide a shared layout mechanism for common UI elements (header, footer). +- **FR-005**: System MUST handle client-side navigation between routes without full page refreshes. +- **FR-006**: System MUST integrate with the existing backend API for data retrieval. 
+- **FR-007**: System MUST support data submission via existing API endpoints using standard asynchronous requests.
+- **FR-008**: System MUST support WebSocket proxying for real-time task logs (required by `TaskRunner.svelte`).
+- **FR-009**: System MUST support data submission for Settings updates and Plugin actions (e.g., triggering backups).
+
+### Key Entities *(include if feature involves data)*
+
+- **Route**: Represents a URL path and its associated page content and data requirements.
+- **Layout**: Represents a shared UI structure that wraps multiple routes.
+
+## Success Criteria *(mandatory)*
+
+### Measurable Outcomes
+
+- **SC-001**: Page transition time between Dashboard and Settings is under 200ms.
+- **SC-002**: 100% of existing frontend functionality is preserved after migration.
+- **SC-003**: Application is accessible via direct URLs (e.g., `/settings`) without manual configuration of the web server for SPA routing.
+- **SC-004**: Developer setup time for the frontend is reduced: a working dev environment requires only standard framework tooling (`npm install`, `npm run dev`), with no project-specific build scripts.
+
+## Assumptions
+
+- The application will be deployed as a static site served by the Python backend (no Node.js server in production).
+- Existing API endpoints are sufficient for the frontend needs.
diff --git a/specs/004-integrate-svelte-kit/tasks.md b/specs/004-integrate-svelte-kit/tasks.md
new file mode 100644
index 0000000..9d24972
--- /dev/null
+++ b/specs/004-integrate-svelte-kit/tasks.md
@@ -0,0 +1,171 @@
+# Tasks: Integrate SvelteKit
+
+**Input**: Design documents from `/specs/004-integrate-svelte-kit/`
+**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/
+
+**Tests**: Tests are NOT explicitly requested in the feature specification, so no test-specific tasks are included. Verification will be done via the "Independent Test" criteria for each story.
+
+**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
+
+## Format: `[ID] [P?] 
[Story] Description` + +- **[P]**: Can run in parallel (different files, no dependencies) +- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3) +- Include exact file paths in descriptions + +## Path Conventions + +- **Web app**: `backend/src/`, `frontend/src/` + +--- + +## Phase 1: Setup (Shared Infrastructure) + +**Purpose**: Project initialization and basic structure + +- [x] T001 Initialize SvelteKit in `frontend/` directory (replacing current setup) +- [x] T002 Install `@sveltejs/adapter-static` in `frontend/package.json` +- [x] T003 [P] Configure `frontend/svelte.config.js` for static adapter and SPA fallback + +--- + +## Phase 2: Foundational (Blocking Prerequisites) + +**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented + +**⚠️ CRITICAL**: No user story work can begin until this phase is complete + +- [x] T004 Create `frontend/src/routes/+layout.ts` to disable SSR and prerendering (`ssr = false`, `prerender = false`) +- [x] T005 Implement catch-all route in `backend/src/app.py` to serve `index.html` for SPA routing +- [x] T006 [P] Update `backend/src/app.py` to mount `frontend/build` directory using `StaticFiles` +- [x] T007 [P] Update `frontend/src/lib/api.js` to ensure compatibility with SvelteKit environment +- [x] T022 [FR-008] Configure WebSocket proxying in `backend/src/app.py` and `frontend/vite.config.js` + +**Checkpoint**: Foundation ready - user story implementation can now begin in parallel + +--- + +## Phase 3: User Story 1 - Seamless Navigation (Priority: P1) 🎯 MVP + +**Goal**: Navigate between Dashboard and Settings using standard URL paths so that users can bookmark pages and use browser navigation. + +**Independent Test**: Click navigation links and verify the URL changes and the correct content renders without a full page reload. Verify deep linking by refreshing at `/settings`. + +### Implementation for User Story 1 + +- [x] T008 [P] [US1] Create Dashboard route in `frontend/src/routes/+page.svelte` (migrating from `App.svelte`/`Dashboard.svelte`) +- [x] T009 [P] [US1] Create Settings route in `frontend/src/routes/settings/+page.svelte` (migrating from `Settings.svelte`) +- [x] T010 [US1] Implement navigation links between Dashboard and Settings in `frontend/src/routes/+page.svelte` and `frontend/src/routes/settings/+page.svelte` +- [x] T023 [US1] Implement "Save Settings" form submission in `frontend/src/routes/settings/+page.svelte` +- [x] T024 [US1] Implement plugin action triggers (e.g., "Run Backup") in `frontend/src/routes/+page.svelte` + +**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently. + +--- + +## Phase 4: User Story 2 - Improved Data Loading (Priority: P2) + +**Goal**: Use modern data loading patterns so that data is fetched efficiently before the page renders. + +**Independent Test**: Observe the page load sequence and verify that data is available to the component immediately upon mount via the `data` prop. 
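+
+The `load` functions below assume the backend already exposes `/api/plugins/` and `/api/settings/`. For local verification only, a minimal FastAPI stub of those endpoints might look like the following sketch (the response shapes are assumptions, not the real contract):
+
+```python
+from fastapi import APIRouter
+
+router = APIRouter(prefix="/api")
+
+@router.get("/plugins/")
+async def list_plugins() -> dict:
+    # Placeholder payload; the real shape is defined by the existing backend.
+    return {"result": [{"id": "backup", "name": "Backup"}]}
+
+@router.get("/settings/")
+async def get_settings() -> dict:
+    # Placeholder config/environments payload for the Settings page.
+    return {"config": {}, "environments": []}
+```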
+
+### Implementation for User Story 2
+
+- [x] T011 [P] [US2] Implement `load` function for Dashboard in `frontend/src/routes/+page.ts` to fetch plugins from `/api/plugins/`
+- [x] T012 [P] [US2] Implement `load` function for Settings in `frontend/src/routes/settings/+page.ts` to fetch config and environments from `/api/settings/`
+- [x] T013 [US2] Update `frontend/src/routes/+page.svelte` to use data from `load` function via `export let data;`
+- [x] T014 [US2] Update `frontend/src/routes/settings/+page.svelte` to use data from `load` function via `export let data;`
+
+**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently.
+
+---
+
+## Phase 5: User Story 3 - Unified Layout (Priority: P3)
+
+**Goal**: Consistent look and feel across all pages with a shared navigation bar and footer.
+
+**Independent Test**: Navigate between Dashboard and Settings and verify that the header/footer remain static and do not re-render or flicker.
+
+### Implementation for User Story 3
+
+- [x] T015 [US3] Create shared layout in `frontend/src/routes/+layout.svelte` with a `<slot />` for page content
+- [x] T016 [P] [US3] Move navigation bar component to `frontend/src/components/Navbar.svelte` and include in `+layout.svelte`
+- [x] T017 [P] [US3] Create footer component in `frontend/src/components/Footer.svelte` and include in `+layout.svelte`
+
+**Checkpoint**: All user stories should now be independently functional.
+
+---
+
+## Phase 6: Polish & Cross-Cutting Concerns
+
+**Purpose**: Improvements that affect multiple user stories
+
+- [x] T018 [P] Implement custom 404 error page in `frontend/src/routes/+error.svelte`
+- [x] T019 Add graceful error handling for API failures in `load` functions (T011, T012)
+- [x] T020 [P] Update `frontend/README.md` with new SvelteKit-based development and build instructions
+- [x] T021 Run `specs/004-integrate-svelte-kit/quickstart.md` validation
+- [x] T025 [FR-008] Update `TaskRunner.svelte` to use SvelteKit-compatible WebSocket connection logic
+- [x] T026 [SC-001] Perform performance benchmarking to verify < 200ms transition time
+
+---
+
+## Dependencies & Execution Order
+
+### Phase Dependencies
+
+- **Setup (Phase 1)**: No dependencies - can start immediately
+- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
+- **User Stories (Phase 3+)**: All depend on Foundational phase completion
+  - User stories can then proceed in parallel (if staffed)
+  - Or sequentially in priority order (P1 → P2 → P3)
+- **Polish (Final Phase)**: Depends on all desired user stories being complete
+
+### User Story Dependencies
+
+- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
+- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - Depends on US1 routes existing
+- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - Can be implemented independently of US1/US2 content
+
+### Within Each User Story
+
+- Models/Data fetching before UI implementation
+- Core implementation before integration
+- Story complete before moving to next priority
+
+### Parallel Opportunities
+
+- T003 (Svelte config) can run in parallel with other setup
+- T006 (Backend mount) and T007 (API client) can run in parallel
+- T008 (Dashboard route) and T009 (Settings route) can run in parallel
+- T011 and T012 (Load functions) can run in parallel
+- T016 and T017 (Navbar/Footer components) can run in parallel
+
+---
+
+## Parallel Example: User Story 2
+
+```bash
+# Launch all load function implementations for 
User Story 2 together: +Task: "Implement load function for Dashboard in frontend/src/routes/+page.ts" +Task: "Implement load function for Settings in frontend/src/routes/settings/+page.ts" +``` + +--- + +## Implementation Strategy + +### MVP First (User Story 1 Only) + +1. Complete Phase 1: Setup +2. Complete Phase 2: Foundational (CRITICAL - blocks all stories) +3. Complete Phase 3: User Story 1 +4. **STOP and VALIDATE**: Test User Story 1 independently (Navigation and Deep Linking) +5. Deploy/demo if ready + +### Incremental Delivery + +1. Complete Setup + Foundational → Foundation ready +2. Add User Story 1 → Test independently → Deploy/Demo (MVP!) +3. Add User Story 2 → Test independently → Deploy/Demo +4. Add User Story 3 → Test independently → Deploy/Demo +5. Each story adds value without breaking previous stories diff --git a/specs/005-fix-ui-ws-validation/checklists/requirements.md b/specs/005-fix-ui-ws-validation/checklists/requirements.md new file mode 100644 index 0000000..8d06b6a --- /dev/null +++ b/specs/005-fix-ui-ws-validation/checklists/requirements.md @@ -0,0 +1,34 @@ +# Specification Quality Checklist: Fix UI Styling, WebSocket Port Mismatch, and URL Validation + +**Purpose**: Validate specification completeness and quality before proceeding to planning +**Created**: 2025-12-20 +**Feature**: [specs/005-fix-ui-ws-validation/spec.md](../spec.md) + +## Content Quality + +- [x] No implementation details (languages, frameworks, APIs) +- [x] Focused on user value and business needs +- [x] Written for non-technical stakeholders +- [x] All mandatory sections completed + +## Requirement Completeness + +- [x] No [NEEDS CLARIFICATION] markers remain +- [x] Requirements are testable and unambiguous +- [x] Success criteria are measurable +- [x] Success criteria are technology-agnostic (no implementation details) +- [x] All acceptance scenarios are defined +- [x] Edge cases are identified +- [x] Scope is clearly bounded +- [x] Dependencies and assumptions identified + +## Feature Readiness + +- [x] All functional requirements have clear acceptance criteria +- [x] User scenarios cover primary flows +- [x] Feature meets measurable outcomes defined in Success Criteria +- [x] No implementation details leak into specification + +## Notes + +- Initial validation passed. The specification has been refined to be technology-agnostic while addressing the specific issues reported (styling, real-time communication, and URL validation). diff --git a/specs/005-fix-ui-ws-validation/checklists/ws-connection.md b/specs/005-fix-ui-ws-validation/checklists/ws-connection.md new file mode 100644 index 0000000..01471f2 --- /dev/null +++ b/specs/005-fix-ui-ws-validation/checklists/ws-connection.md @@ -0,0 +1,38 @@ +# Requirements Quality Checklist: WebSocket Connection + +## Meta +- **Feature**: Fix UI Styling, WebSocket Port Mismatch, and URL Validation +- **Domain**: WebSocket / Real-time Logs +- **Focus**: Connection Lifecycle & Environment Requirements +- **Depth**: Lightweight Sanity +- **Created**: 2025-12-20 + +## Requirement Completeness +- [x] CHK001 - Are the environment-specific URL construction rules (e.g., `ws` vs `wss`) explicitly defined for different deployment targets? [Gap] +- [x] CHK002 - Is the fallback mechanism for `PUBLIC_WS_URL` documented for production environments where port 8000 might be blocked? [Completeness, Spec §FR-002] +- [x] CHK003 - Are requirements defined for handling WebSocket authentication/authorization if the API becomes protected? 
[Gap - Out of scope for this fix, handled by global ADFS requirement] + +## Requirement Clarity +- [x] CHK004 - Is the "exponential backoff" strategy quantified with specific initial delays, multipliers, and maximum retry counts? [Clarity, Spec §Clarifications] +- [x] CHK005 - Are the visual feedback requirements for connection failure (toast vs status indicator) clearly prioritized or combined? [Clarity, Spec §FR-004] +- [x] CHK006 - Is the term "real-time" quantified with a maximum latency threshold for log delivery? [Clarity, Spec §SC-004] + +## Requirement Consistency +- [x] CHK007 - Does the WebSocket endpoint path in the contract (`/ws/logs/{id}`) align with the implementation plan and frontend routing? [Conflict, Contract §Endpoint] +- [x] CHK008 - Are the error handling requirements in the contract consistent with the visual feedback requirements in the spec? [Consistency, Contract §Error Handling] + +## Acceptance Criteria Quality +- [x] CHK009 - Can the "100% success rate" in development be objectively measured and verified across different OS/browsers? [Measurability, Spec §SC-002] +- [x] CHK010 - Is there a measurable criterion for "successful reconnection" (e.g., within X attempts or Y seconds)? [Gap - Defined in Clarifications] + +## Scenario Coverage +- [x] CHK011 - Are requirements specified for the "Partial Log" scenario (e.g., connection established but no data received)? [Coverage, Gap] +- [x] CHK012 - Does the spec define the behavior when a task completes while the WebSocket is still active? [Coverage, Gap] + +## Edge Case Coverage +- [x] CHK013 - Does the spec define the behavior when the backend port (8000) is unavailable or occupied by another process? [Edge Case, Spec §FR-002] +- [x] CHK014 - Are requirements defined for handling browser-side WebSocket limits (e.g., maximum concurrent connections)? [Edge Case, Gap - Handled by single-connection-per-task design] + +## Non-Functional Requirements +- [x] CHK015 - Are there requirements for WebSocket connection stability under high network jitter or packet loss? [Gap - Handled by exponential backoff] +- [x] CHK016 - Is the impact of long-lived WebSocket connections on server resources (memory/CPU) addressed? [Gap - Handled by graceful closing on task completion] diff --git a/specs/005-fix-ui-ws-validation/data-model.md b/specs/005-fix-ui-ws-validation/data-model.md new file mode 100644 index 0000000..91b50dd --- /dev/null +++ b/specs/005-fix-ui-ws-validation/data-model.md @@ -0,0 +1,31 @@ +# Data Model: Fix UI Styling, WebSocket Port Mismatch, and URL Validation + +## Entities + +### ServiceConnection +Represents the configuration for an external service. + +| Field | Type | Description | Validation | +|-------|------|-------------|------------| +| `base_url` | `AnyHttpUrl` | The base URL of the service. | Normalized to include `/api/v1` if missing. | +| `name` | `string` | Friendly name for the connection. | Required. | +| `status` | `string` | Connection status (connected, failed, etc.). | Read-only. | + +### TaskLogMessage +The structure of messages sent over the WebSocket for real-time logs. + +| Field | Type | Description | +|-------|------|-------------| +| `task_id` | `string` | Unique identifier for the task. | +| `message` | `string` | The log message text. | +| `timestamp` | `datetime` | When the log was generated. | +| `level` | `string` | Log level (INFO, ERROR, etc.). | + +## State Transitions + +### Connection Validation +1. User inputs `base_url`. +2. System validates URL format. +3. 
System checks for `/api/v1` suffix. +4. If missing, system appends `/api/v1`. +5. System attempts connection and updates `status`. diff --git a/specs/005-fix-ui-ws-validation/plan.md b/specs/005-fix-ui-ws-validation/plan.md new file mode 100644 index 0000000..9236455 --- /dev/null +++ b/specs/005-fix-ui-ws-validation/plan.md @@ -0,0 +1,74 @@ +# Implementation Plan: Fix UI Styling, WebSocket Port Mismatch, and URL Validation + +**Branch**: `005-fix-ui-ws-validation` | **Date**: 2025-12-20 | **Spec**: [specs/005-fix-ui-ws-validation/spec.md](specs/005-fix-ui-ws-validation/spec.md) + +**Input**: Feature specification from `/specs/005-fix-ui-ws-validation/spec.md` + +## Summary + +This feature addresses three critical issues: unstyled UI due to missing Tailwind CSS imports, broken real-time logs caused by WebSocket port mismatches in development, and strict URL validation that prevents successful connections to external services. The technical approach involves importing Tailwind in the root layout, using environment variables for WebSocket URLs with a fallback, and relaxing URL validation to automatically append version suffixes. + +## Technical Context + +**Language/Version**: Python 3.9+, Node.js 18+ +**Primary Dependencies**: FastAPI, SvelteKit, Tailwind CSS, Pydantic +**Storage**: N/A (Configuration based) +**Testing**: pytest +**Target Platform**: Linux server +**Project Type**: Web application (frontend + backend) +**Performance Goals**: Real-time updates within 500ms +**Constraints**: SPA-First Architecture (No Node.js in production) +**Scale/Scope**: Targeted fixes for UI, real-time communication, and validation logic. + +## Constitution Check + +*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.* + +| Principle | Status | Notes | +|-----------|--------|-------| +| I. SPA-First Architecture | PASS | SvelteKit SPA will be built and served by FastAPI. Post-design: Confirmed. | +| II. API-Driven Communication | PASS | Real-time logs via WebSockets; configuration via REST. Post-design: Confirmed. | +| III. Modern Stack Consistency | PASS | Uses SvelteKit, FastAPI, and Tailwind CSS. Post-design: Confirmed. | +| IV. Semantic Protocol Adherence | PASS | Implementation will use anchors and contracts as per `semantic_protocol.md`. Post-design: Confirmed. | + +## Project Structure + +### Documentation (this feature) + +```text +specs/005-fix-ui-ws-validation/ +├── plan.md # This file +├── research.md # Phase 0 output +├── data-model.md # Phase 1 output +├── quickstart.md # Phase 1 output +├── contracts/ # Phase 1 output +└── tasks.md # Phase 2 output +``` + +### Source Code (repository root) + +```text +backend/ +├── src/ +│ ├── models/ # URL validation logic in superset_tool/models.py (or equivalent) +│ ├── services/ +│ └── api/ # WebSocket and REST endpoints +└── tests/ + +frontend/ +├── src/ +│ ├── components/ +│ ├── pages/ +│ └── routes/ # +layout.svelte for global styling +└── tests/ +``` + +**Structure Decision**: Web application structure (Option 2) is used as both frontend and backend components are modified. 
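+
+For reference, the reconnection behaviour pinned down in the spec's clarifications (initial delay 1s, multiplier 2x, capped at 30s, at most 10 retries) expands to the schedule below; a small illustrative sketch:
+
+```python
+def backoff_delays(initial: float = 1.0, multiplier: float = 2.0,
+                   cap: float = 30.0, max_retries: int = 10):
+    # Yield the reconnect delays defined in the spec clarifications.
+    delay = initial
+    for _ in range(max_retries):
+        yield min(delay, cap)
+        delay *= multiplier
+
+# list(backoff_delays()) -> [1, 2, 4, 8, 16, 30, 30, 30, 30, 30] (seconds)
+```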
+
+## Complexity Tracking
+
+> **Fill ONLY if Constitution Check has violations that must be justified**
+
+| Violation | Why Needed | Simpler Alternative Rejected Because |
+|-----------|------------|-------------------------------------|
+| None | N/A | N/A |
diff --git a/specs/005-fix-ui-ws-validation/quickstart.md b/specs/005-fix-ui-ws-validation/quickstart.md
new file mode 100644
index 0000000..eb953f7
--- /dev/null
+++ b/specs/005-fix-ui-ws-validation/quickstart.md
@@ -0,0 +1,53 @@
+# Quickstart: Fix UI Styling, WebSocket Port Mismatch, and URL Validation
+
+## Development Setup
+
+1. **Frontend Styling**:
+   - Ensure Tailwind CSS is initialized: `cd frontend && npm install`
+   - Verify `frontend/src/app.css` contains:
+     ```css
+     @tailwind base;
+     @tailwind components;
+     @tailwind utilities;
+     ```
+   - Import in `frontend/src/routes/+layout.svelte`:
+     ```svelte
+     <script>
+       import '../app.css';
+     </script>
+     ```
+
+2. **WebSocket Configuration**:
+   - Create/Update `.env` in `frontend/`:
+     ```env
+     PUBLIC_WS_URL=ws://localhost:8000
+     ```
+   - Use in Svelte components:
+     ```javascript
+     import { PUBLIC_WS_URL } from '$env/static/public';
+     const wsUrl = PUBLIC_WS_URL || `ws://${window.location.hostname}:8000`;
+     ```
+
+3. **Backend URL Validation**:
+   - Update `superset_tool/models.py` (or relevant model file):
+     ```python
+     from pydantic import BaseModel, validator
+
+     class ServiceConnection(BaseModel):
+         base_url: str
+
+         @validator('base_url')
+         def normalize_url(cls, v):
+             if not v.endswith('/api/v1'):
+                 return f"{v.rstrip('/')}/api/v1"
+             return v
+     ```
+
+## Verification Steps
+
+1. Run backend: `cd backend && uvicorn src.app:app --reload`
+2. Run frontend: `cd frontend && npm run dev`
+3. Open browser and verify:
+   - UI is styled (Tailwind classes working).
+   - Logs appear in real-time (WebSocket connected).
+   - External service connection accepts base URLs.
diff --git a/specs/005-fix-ui-ws-validation/research.md b/specs/005-fix-ui-ws-validation/research.md
new file mode 100644
index 0000000..d929180
--- /dev/null
+++ b/specs/005-fix-ui-ws-validation/research.md
@@ -0,0 +1,41 @@
+# Research: Fix UI Styling, WebSocket Port Mismatch, and URL Validation
+
+## WebSocket Port Mismatch Resolution
+
+### Decision
+Use SvelteKit's `$env/static/public` for `PUBLIC_WS_URL` with client-side fallback logic.
+
+### Rationale
+SvelteKit allows exposing environment variables to the frontend. By using a public environment variable, we can explicitly set the WebSocket URL in different environments (dev vs. prod).
+
+### Alternatives Considered
+- **Hardcoding**: Rejected as it breaks across different environments.
+- **Relative URLs**: WebSockets (`ws://` or `wss://`) cannot be purely relative in all browser contexts without logic to determine the host and port.
+
+---
+
+## URL Validation Relaxation
+
+### Decision
+Modify the Pydantic model to use a `validator` (or `field_validator` in Pydantic v2) that checks for the `/api/v1` suffix and appends it if missing, while still ensuring the base URL is valid.
+
+### Rationale
+This provides a seamless user experience: users can supply just the base URL, and the system handles the API versioning internally.
+
+### Alternatives Considered
+- **Strict Validation with Error Message**: Rejected as it causes user frustration (as noted in the spec).
+- **Manual Suffixing in Service Clients**: Rejected as it is better to have a normalized URL in the data model.
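+
+A minimal sketch of the same normalization in Pydantic v2 terms (`field_validator`); the `ServiceConnection` name follows the data model, everything else is illustrative:
+
+```python
+from pydantic import BaseModel, field_validator
+
+class ServiceConnection(BaseModel):
+    base_url: str
+
+    @field_validator("base_url")
+    @classmethod
+    def normalize_url(cls, v: str) -> str:
+        # Append the version suffix only when missing, so URLs that
+        # already end in /api/v1 are not duplicated (US3, scenario 2).
+        v = v.rstrip("/")
+        if not v.endswith("/api/v1"):
+            v = f"{v}/api/v1"
+        return v
+
+assert ServiceConnection(base_url="http://localhost:8080").base_url == "http://localhost:8080/api/v1"
+assert ServiceConnection(base_url="http://localhost:8080/api/v1/").base_url == "http://localhost:8080/api/v1"
+```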
+ +--- + +## Global Styling (Tailwind CSS) + +### Decision +Import the global CSS file (which includes `@tailwind` directives) in `src/routes/+layout.svelte`. + +### Rationale +This is the standard SvelteKit pattern for ensuring styles are applied globally across all routes. + +### Alternatives Considered +- **Importing in each page**: Rejected as it's redundant and hard to maintain. +- **Importing in `app.html`**: Possible, but importing in `+layout.svelte` allows for better integration with Svelte's build pipeline. diff --git a/specs/005-fix-ui-ws-validation/spec.md b/specs/005-fix-ui-ws-validation/spec.md new file mode 100644 index 0000000..4d56ec5 --- /dev/null +++ b/specs/005-fix-ui-ws-validation/spec.md @@ -0,0 +1,99 @@ +# Feature Specification: Fix UI Styling, WebSocket Port Mismatch, and URL Validation + +**Feature Branch**: `005-fix-ui-ws-validation` +**Created**: 2025-12-20 +**Status**: Draft +**Input**: User description: "UI Styling: Tailwind CSS is not imported in the root layout, causing the unstyled appearance. WebSocket Mismatch: Port mismatch in dev mode is breaking real-time logs. Validation Error: Strict URL validation in superset_tool/models.py requires /api/v1, which caused the connection failure reported in your feedback." + +## User Scenarios & Testing *(mandatory)* + +### User Story 1 - Consistent UI Styling (Priority: P1) + +As a user, I want the application to have a professional and styled appearance so that I can easily navigate and use the interface. + +**Why this priority**: Unstyled UI makes the application look broken and difficult to use, impacting user trust and usability. + +**Independent Test**: Can be fully tested by opening the application in a browser and verifying that consistent styling is applied globally across all routes. + +**Acceptance Scenarios**: + +1. **Given** the application is running, **When** I navigate to the home page or settings page, **Then** I should see professional styling applied (e.g., correct fonts, colors, and layout). +2. **Given** a new component is added, **When** it uses standard styling classes, **Then** those classes should be rendered correctly without additional imports. + +--- + +### User Story 2 - Real-time Log Monitoring (Priority: P1) + +As a developer or operator, I want to see real-time logs for running tasks so that I can monitor progress and debug issues effectively. + +**Why this priority**: Real-time feedback is essential for long-running tasks like migrations or backups; without it, users are left wondering if the process is stuck. + +**Independent Test**: Can be tested by starting a task and verifying that logs appear in the UI in real-time without requiring a page refresh. + +**Acceptance Scenarios**: + +1. **Given** a task is running, **When** I view the task details page, **Then** I should see live log updates streamed via real-time communication. +2. **Given** the application is running in development mode, **When** a real-time connection is initiated, **Then** it should correctly target the backend service port. + +--- + +### User Story 3 - Flexible External Service Connection (Priority: P2) + +As an administrator, I want to connect to external services using their base URL so that I don't have to worry about specific API version paths during configuration. + +**Why this priority**: Strict validation currently prevents successful connection to valid service instances if the user doesn't provide a very specific suffix, leading to configuration frustration. 
+
+**Independent Test**: Can be tested by configuring a service connection with a standard base URL and verifying it connects successfully.
+
+**Acceptance Scenarios**:
+
+1. **Given** a valid service base URL, **When** I save the connection settings, **Then** the system should validate and accept the URL even if it doesn't explicitly end in a specific API version suffix.
+2. **Given** a service URL that already includes an API version suffix, **When** I save the settings, **Then** the system should not duplicate the suffix or fail validation.
+
+---
+
+### Edge Cases
+
+- **Connection Disconnection**: How does the system handle a real-time connection drop during a long-running task? (Assumption: It should attempt to reconnect or show a "Connection Lost" message).
+- **Invalid URL Formats**: How does the system handle URLs that are completely malformed? (Assumption: Standard URL validation should still apply).
+- **Styling Build Failures**: What happens if the styling assets fail to generate? (Assumption: The app should still be functional but may look unstyled; build logs should indicate the failure).
+
+## Requirements *(mandatory)*
+
+### Functional Requirements
+
+- **FR-001**: System MUST import global styling (Tailwind CSS) in `src/routes/+layout.svelte` so that a consistent appearance is applied across all routes.
+- **FR-002**: System MUST use an environment variable (e.g., `PUBLIC_WS_URL`) with a fallback to the backend port (8000) to determine the WebSocket connection URL.
+  - **FR-002.1**: Protocol MUST be environment-aware: use `wss://` if the page is served over HTTPS, otherwise `ws://`.
+  - **FR-002.2**: In production environments, `PUBLIC_WS_URL` MUST be explicitly configured to avoid reliance on the port 8000 fallback.
+- **FR-003**: System MUST relax URL validation for external services to allow base URLs and automatically append `/api/v1` if the version suffix is missing.
+- **FR-004**: System MUST provide visual feedback (toast notification and status indicator in log view) when a real-time connection fails to establish.
+  - **FR-004.1**: System MUST handle "Partial Log" scenarios by displaying a "Waiting for data..." indicator if the connection is open but no messages are received within 5 seconds.
+  - **FR-004.2**: System MUST handle "Task Completed" state by closing the WebSocket gracefully and displaying a final status summary.
+  - **FR-004.3**: If the backend port (8000) is unavailable, the frontend MUST display a clear error message indicating the service is unreachable.
+- **FR-005**: System MUST ensure that service clients correctly handle API versioning internally by using the normalized URL.
+
+### Key Entities *(include if feature involves data)*
+
+- **Service Connection**: Represents the configuration for connecting to an external service.
+  - Attributes: Base URL, Credentials (if applicable), Connection Status.
+- **Task Log Stream**: Represents the real-time data flow of logs from the backend to the frontend.
+  - Attributes: Task ID, Log Message, Timestamp.
+
+## Success Criteria *(mandatory)*
+
+### Measurable Outcomes
+
+- **SC-001**: 100% of pages render with consistent, professional styling as verified by visual inspection.
+- **SC-002**: Real-time communication success rate is 100% in the development environment when both frontend and backend are running.
+- **SC-003**: Users can successfully configure and save external service connections using only the base domain/IP in 100% of valid cases.
+- **SC-004**: Real-time updates appear in the UI within 500ms of being generated on the backend. + +## Clarifications + +### Session 2025-12-20 +- Q: WebSocket Reconnection Strategy → A: Automatic reconnection with exponential backoff (Initial delay: 1s, Multiplier: 2x, Max delay: 30s, Max retries: 10). +- Q: URL Validation Strictness → A: Automatically append `/api/v1` if missing (Option A). +- Q: Global Styling Implementation → A: Import in `src/routes/+layout.svelte` (Option A). +- Q: WebSocket Port Configuration → A: Use environment variable with fallback (Option A). +- Q: Visual Feedback for Connection Failure → A: Toast notification + Status indicator (Option A). diff --git a/specs/005-fix-ui-ws-validation/tasks.md b/specs/005-fix-ui-ws-validation/tasks.md new file mode 100644 index 0000000..1dda19e --- /dev/null +++ b/specs/005-fix-ui-ws-validation/tasks.md @@ -0,0 +1,151 @@ +# Tasks: Fix UI Styling, WebSocket Port Mismatch, and URL Validation + +**Input**: Design documents from `/specs/005-fix-ui-ws-validation/` +**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, contracts/ + +**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story. + +## Format: `[ID] [P?] [Story] Description` + +- **[P]**: Can run in parallel (different files, no dependencies) +- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3) +- Include exact file paths in descriptions + +## Phase 1: Setup (Shared Infrastructure) + +**Purpose**: Project initialization and basic structure + +- [x] T001 Verify project structure and install dependencies in `backend/` and `frontend/` + +--- + +## Phase 2: Foundational (Blocking Prerequisites) + +**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented + +**⚠️ CRITICAL**: No user story work can begin until this phase is complete + +- [x] T002 [P] Configure `PUBLIC_WS_URL` in `frontend/.env` + +**Checkpoint**: Foundation ready - user story implementation can now begin in parallel + +--- + +## Phase 3: User Story 1 - Consistent UI Styling (Priority: P1) 🎯 MVP + +**Goal**: Apply Tailwind CSS globally via the root layout to ensure consistent appearance. + +**Independent Test**: Open the application in a browser and verify that Tailwind styling is applied to all elements (e.g., Navbar, Footer, Buttons). + +### Implementation for User Story 1 + +- [x] T003 [P] [US1] Verify Tailwind directives in `frontend/src/app.css` +- [x] T004 [US1] Import `../app.css` in `frontend/src/routes/+layout.svelte` + +**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently + +--- + +## Phase 4: User Story 2 - Real-time Log Monitoring (Priority: P1) + +**Goal**: Resolve WebSocket port mismatch using environment variables and fallback logic for real-time logs. + +**Independent Test**: Start a task (e.g., a mock migration) and verify that logs appear in the `TaskRunner` component in real-time. + +### Implementation for User Story 2 + +- [x] T005 [P] [US2] Implement WebSocket URL fallback logic in `frontend/src/lib/api.js` +- [x] T006 [US2] Update `frontend/src/components/TaskRunner.svelte` to use the dynamic WebSocket URL + +**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently + +--- + +## Phase 5: User Story 3 - Flexible External Service Connection (Priority: P2) + +**Goal**: Automatically append `/api/v1` to service base URLs if missing to simplify configuration. 
+
+**Independent Test**: Create a new service connection with `http://localhost:8080` and verify it is saved as `http://localhost:8080/api/v1`.
+
+### Implementation for User Story 3
+
+- [x] T007 [P] [US3] Relax `base_url` validation and add normalization in `superset_tool/models.py`
+- [x] T008 [US3] Add unit tests for `SupersetConfig` URL normalization in `backend/tests/test_models.py`
+
+**Checkpoint**: All user stories should now be independently functional
+
+---
+
+## Phase 6: Polish & Cross-Cutting Concerns
+
+**Purpose**: Improvements that affect multiple user stories
+
+- [x] T009 [P] Update `docs/settings.md` with new URL validation behavior
+- [x] T010 Run full verification suite per `quickstart.md`
+
+---
+
+## Phase 7: Addressing Requirements Gaps (from ws-connection.md)
+
+**Purpose**: Close gaps identified during requirements quality review to ensure robust WebSocket communication.
+
+- [x] T011 [US2] Verify that the WebSocket endpoint path (`/ws/logs/{id}`) is consistent between the contract and the implementation (CHK007)
+- [x] T012 [US2] Implement environment-aware protocol selection (`ws` vs `wss`) based on `PUBLIC_WS_URL`
+- [x] T013 [US2] Implement robust exponential backoff with specific initial delays and max retry counts
+- [x] T014 [US2] Add UI handling for "Partial Log" and "Task Completed" WebSocket states in `TaskRunner.svelte`
+- [x] T015 [US2] Implement backend port availability check and user-friendly error reporting in the frontend
+
+---
+
+## Dependencies & Execution Order
+
+### Phase Dependencies
+
+- **Setup (Phase 1)**: No dependencies - can start immediately
+- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
+- **User Stories (Phase 3+)**: All depend on Foundational phase completion
+  - User stories can then proceed in parallel (if staffed)
+  - Or sequentially in priority order (P1 → P2 → P3)
+- **Polish (Final Phase)**: Depends on all desired user stories being complete
+
+### User Story Dependencies
+
+- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
+- **User Story 2 (P2)**: Can start after Foundational (Phase 2) - May integrate with US1 but should be independently testable
+- **User Story 3 (P3)**: Can start after Foundational (Phase 2) - May integrate with US1/US2 but should be independently testable
+
+### Parallel Opportunities
+
+- All Setup tasks marked [P] can run in parallel
+- All Foundational tasks marked [P] can run in parallel (within Phase 2)
+- Once Foundational phase completes, all user stories can start in parallel
+- Models within a story marked [P] can run in parallel
+
+---
+
+## Parallel Example: User Story 1
+
+```bash
+# Launch the User Story 1 tasks together:
+Task: "Verify Tailwind directives in frontend/src/app.css"
+```
+
+---
+
+## Implementation Strategy
+
+### MVP First (User Story 1 Only)
+
+1. Complete Phase 1: Setup
+2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
+3. Complete Phase 3: User Story 1
+4. **STOP and VALIDATE**: Test User Story 1 independently
+5. Deploy/demo if ready
+
+### Incremental Delivery
+
+1. Complete Setup + Foundational → Foundation ready
+2. Add User Story 1 → Test independently → Deploy/Demo (MVP!)
+3. Add User Story 2 → Test independently → Deploy/Demo
+4. Add User Story 3 → Test independently → Deploy/Demo
+5. 
Each story adds value without breaking previous stories diff --git a/superset_tool/__init__.py b/superset_tool/__init__.py old mode 100644 new mode 100755 index e69de29..7bb8e1d --- a/superset_tool/__init__.py +++ b/superset_tool/__init__.py @@ -0,0 +1,14 @@ +# [DEF:superset_tool:Module] +# @SEMANTICS: package, root +# @PURPOSE: Root package for superset_tool. +# @LAYER: Domain +# @PUBLIC_API: SupersetClient, SupersetConfig + +# [SECTION: IMPORTS] +from .client import SupersetClient +from .models import SupersetConfig +# [/SECTION] + +__all__ = ["SupersetClient", "SupersetConfig"] + +# [/DEF:superset_tool] diff --git a/superset_tool/client.py b/superset_tool/client.py old mode 100644 new mode 100755 index f390454..f0e5d80 --- a/superset_tool/client.py +++ b/superset_tool/client.py @@ -1,313 +1,468 @@ -# pylint: disable=too-many-arguments,too-many-locals,too-many-statements,too-many-branches,unused-argument -""" -[MODULE] Superset API Client -@contract: Реализует полное взаимодействие с Superset API -""" - -# [IMPORTS] Стандартная библиотека -import json -from typing import Optional, Dict, Tuple, List, Any, Union -import datetime -from pathlib import Path -import zipfile -from requests import Response - -# [IMPORTS] Локальные модули -from superset_tool.models import SupersetConfig -from superset_tool.exceptions import ( - ExportError, - InvalidZipFormatError -) -from superset_tool.utils.fileio import get_filename_from_headers -from superset_tool.utils.logger import SupersetLogger -from superset_tool.utils.network import APIClient - -# [CONSTANTS] -DEFAULT_TIMEOUT = 30 - -# [TYPE-ALIASES] -JsonType = Union[Dict[str, Any], List[Dict[str, Any]]] -ResponseType = Tuple[bytes, str] - -class SupersetClient: - """[MAIN-CONTRACT] Клиент для работы с Superset API""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация клиента Superset. - # PRECONDITIONS: `config` должен быть валидным `SupersetConfig`. - # POSTCONDITIONS: Клиент успешно инициализирован. - def __init__(self, config: SupersetConfig, logger: Optional[SupersetLogger] = None): - self.logger = logger or SupersetLogger(name="SupersetClient") - self.logger.info("[INFO][SupersetClient.__init__][ENTER] Initializing SupersetClient.") - self._validate_config(config) - self.config = config - self.network = APIClient( - config=config.dict(), - verify_ssl=config.verify_ssl, - timeout=config.timeout, - logger=self.logger - ) - self.logger.info("[INFO][SupersetClient.__init__][SUCCESS] SupersetClient initialized successfully.") - # END_FUNCTION___init__ - - # [ENTITY: Function('_validate_config')] - # CONTRACT: - # PURPOSE: Валидация конфигурации клиента. - # PRECONDITIONS: `config` должен быть экземпляром `SupersetConfig`. - # POSTCONDITIONS: Конфигурация валидна. - def _validate_config(self, config: SupersetConfig) -> None: - self.logger.debug("[DEBUG][SupersetClient._validate_config][ENTER] Validating config.") - if not isinstance(config, SupersetConfig): - self.logger.error("[ERROR][SupersetClient._validate_config][FAILURE] Invalid config type.") - raise TypeError("Конфигурация должна быть экземпляром SupersetConfig") - self.logger.debug("[DEBUG][SupersetClient._validate_config][SUCCESS] Config validated.") - # END_FUNCTION__validate_config - - @property - def headers(self) -> dict: - """[INTERFACE] Базовые заголовки для API-вызовов.""" - return self.network.headers - # END_FUNCTION_headers - - # [ENTITY: Function('get_dashboards')] - # CONTRACT: - # PURPOSE: Получение списка дашбордов с пагинацией. 
- # PRECONDITIONS: None - # POSTCONDITIONS: Возвращает кортеж с общим количеством и списком дашбордов. - def get_dashboards(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]: - self.logger.info("[INFO][SupersetClient.get_dashboards][ENTER] Getting dashboards.") - validated_query = self._validate_query_params(query) - total_count = self._fetch_total_object_count(endpoint="/dashboard/") - paginated_data = self._fetch_all_pages( - endpoint="/dashboard/", - pagination_options={ - "base_query": validated_query, - "total_count": total_count, - "results_field": "result", - } - ) - self.logger.info("[INFO][SupersetClient.get_dashboards][SUCCESS] Got dashboards.") - return total_count, paginated_data - # END_FUNCTION_get_dashboards - - # [ENTITY: Function('get_dashboard')] - # CONTRACT: - # PURPOSE: Получение метаданных дашборда по ID или SLUG. - # PRECONDITIONS: `dashboard_id_or_slug` должен существовать. - # POSTCONDITIONS: Возвращает метаданные дашборда. - def get_dashboard(self, dashboard_id_or_slug: str) -> dict: - self.logger.info(f"[INFO][SupersetClient.get_dashboard][ENTER] Getting dashboard: {dashboard_id_or_slug}") - response_data = self.network.request( - method="GET", - endpoint=f"/dashboard/{dashboard_id_or_slug}", - ) - self.logger.info(f"[INFO][SupersetClient.get_dashboard][SUCCESS] Got dashboard: {dashboard_id_or_slug}") - return response_data.get("result", {}) - # END_FUNCTION_get_dashboard - - # [ENTITY: Function('get_datasets')] - # CONTRACT: - # PURPOSE: Получение списка датасетов с пагинацией. - # PRECONDITIONS: None - # POSTCONDITIONS: Возвращает кортеж с общим количеством и списком датасетов. - def get_datasets(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]: - self.logger.info("[INFO][SupersetClient.get_datasets][ENTER] Getting datasets.") - total_count = self._fetch_total_object_count(endpoint="/dataset/") - base_query = { - "columns": ["id", "table_name", "sql", "database", "schema"], - "page": 0, - "page_size": 100 - } - validated_query = {**base_query, **(query or {})} - datasets = self._fetch_all_pages( - endpoint="/dataset/", - pagination_options={ - "base_query": validated_query, - "total_count": total_count, - "results_field": "result", - } - ) - self.logger.info("[INFO][SupersetClient.get_datasets][SUCCESS] Got datasets.") - return total_count, datasets - # END_FUNCTION_get_datasets - - # [ENTITY: Function('get_dataset')] - # CONTRACT: - # PURPOSE: Получение метаданных датасета по ID. - # PRECONDITIONS: `dataset_id` должен существовать. - # POSTCONDITIONS: Возвращает метаданные датасета. - def get_dataset(self, dataset_id: str) -> dict: - self.logger.info(f"[INFO][SupersetClient.get_dataset][ENTER] Getting dataset: {dataset_id}") - response_data = self.network.request( - method="GET", - endpoint=f"/dataset/{dataset_id}", - ) - self.logger.info(f"[INFO][SupersetClient.get_dataset][SUCCESS] Got dataset: {dataset_id}") - return response_data.get("result", {}) - # END_FUNCTION_get_dataset - - # [ENTITY: Function('export_dashboard')] - # CONTRACT: - # PURPOSE: Экспорт дашборда в ZIP-архив. - # PRECONDITIONS: `dashboard_id` должен существовать. - # POSTCONDITIONS: Возвращает содержимое ZIP-архива и имя файла. 
- def export_dashboard(self, dashboard_id: int) -> Tuple[bytes, str]: - self.logger.info(f"[INFO][SupersetClient.export_dashboard][ENTER] Exporting dashboard: {dashboard_id}") - response = self.network.request( - method="GET", - endpoint="/dashboard/export/", - params={"q": json.dumps([dashboard_id])}, - stream=True, - raw_response=True - ) - self._validate_export_response(response, dashboard_id) - filename = self._resolve_export_filename(response, dashboard_id) - content = response.content - self.logger.info(f"[INFO][SupersetClient.export_dashboard][SUCCESS] Exported dashboard: {dashboard_id}") - return content, filename - # END_FUNCTION_export_dashboard - - # [ENTITY: Function('_validate_export_response')] - # CONTRACT: - # PURPOSE: Валидация ответа экспорта. - # PRECONDITIONS: `response` должен быть валидным HTTP-ответом. - # POSTCONDITIONS: Ответ валиден. - def _validate_export_response(self, response: Response, dashboard_id: int) -> None: - self.logger.debug(f"[DEBUG][SupersetClient._validate_export_response][ENTER] Validating export response for dashboard: {dashboard_id}") - content_type = response.headers.get('Content-Type', '') - if 'application/zip' not in content_type: - self.logger.error(f"[ERROR][SupersetClient._validate_export_response][FAILURE] Invalid content type: {content_type}") - raise ExportError(f"Получен не ZIP-архив (Content-Type: {content_type})") - if not response.content: - self.logger.error("[ERROR][SupersetClient._validate_export_response][FAILURE] Empty response content.") - raise ExportError("Получены пустые данные при экспорте") - self.logger.debug(f"[DEBUG][SupersetClient._validate_export_response][SUCCESS] Export response validated for dashboard: {dashboard_id}") - # END_FUNCTION__validate_export_response - - # [ENTITY: Function('_resolve_export_filename')] - # CONTRACT: - # PURPOSE: Определение имени экспортируемого файла. - # PRECONDITIONS: `response` должен быть валидным HTTP-ответом. - # POSTCONDITIONS: Возвращает имя файла. - def _resolve_export_filename(self, response: Response, dashboard_id: int) -> str: - self.logger.debug(f"[DEBUG][SupersetClient._resolve_export_filename][ENTER] Resolving export filename for dashboard: {dashboard_id}") - filename = get_filename_from_headers(response.headers) - if not filename: - timestamp = datetime.datetime.now().strftime('%Y%m%dT%H%M%S') - filename = f"dashboard_export_{dashboard_id}_{timestamp}.zip" - self.logger.warning(f"[WARNING][SupersetClient._resolve_export_filename][STATE_CHANGE] Could not resolve filename from headers, generated: {filename}") - self.logger.debug(f"[DEBUG][SupersetClient._resolve_export_filename][SUCCESS] Resolved export filename: {filename}") - return filename - # END_FUNCTION__resolve_export_filename - - # [ENTITY: Function('export_to_file')] - # CONTRACT: - # PURPOSE: Экспорт дашборда напрямую в файл. - # PRECONDITIONS: `output_dir` должен существовать. - # POSTCONDITIONS: Дашборд сохранен в файл. 
- def export_to_file(self, dashboard_id: int, output_dir: Union[str, Path]) -> Path: - self.logger.info(f"[INFO][SupersetClient.export_to_file][ENTER] Exporting dashboard {dashboard_id} to file in {output_dir}") - output_dir = Path(output_dir) - if not output_dir.exists(): - self.logger.error(f"[ERROR][SupersetClient.export_to_file][FAILURE] Output directory does not exist: {output_dir}") - raise FileNotFoundError(f"Директория {output_dir} не найдена") - content, filename = self.export_dashboard(dashboard_id) - target_path = output_dir / filename - with open(target_path, 'wb') as f: - f.write(content) - self.logger.info(f"[INFO][SupersetClient.export_to_file][SUCCESS] Exported dashboard {dashboard_id} to {target_path}") - return target_path - # END_FUNCTION_export_to_file - - # [ENTITY: Function('import_dashboard')] - # CONTRACT: - # PURPOSE: Импорт дашборда из ZIP-архива. - # PRECONDITIONS: `file_name` должен быть валидным ZIP-файлом. - # POSTCONDITIONS: Возвращает ответ API. - def import_dashboard(self, file_name: Union[str, Path]) -> Dict: - self.logger.info(f"[INFO][SupersetClient.import_dashboard][ENTER] Importing dashboard from: {file_name}") - self._validate_import_file(file_name) - import_response = self.network.upload_file( - endpoint="/dashboard/import/", - file_info={ - "file_obj": Path(file_name), - "file_name": Path(file_name).name, - "form_field": "formData", - }, - extra_data={'overwrite': 'true'}, - timeout=self.config.timeout * 2 - ) - self.logger.info(f"[INFO][SupersetClient.import_dashboard][SUCCESS] Imported dashboard from: {file_name}") - return import_response - # END_FUNCTION_import_dashboard - - # [ENTITY: Function('_validate_query_params')] - # CONTRACT: - # PURPOSE: Нормализация и валидация параметров запроса. - # PRECONDITIONS: None - # POSTCONDITIONS: Возвращает валидный словарь параметров. - def _validate_query_params(self, query: Optional[Dict]) -> Dict: - self.logger.debug("[DEBUG][SupersetClient._validate_query_params][ENTER] Validating query params.") - base_query = { - "columns": ["slug", "id", "changed_on_utc", "dashboard_title", "published"], - "page": 0, - "page_size": 1000 - } - validated_query = {**base_query, **(query or {})} - self.logger.debug(f"[DEBUG][SupersetClient._validate_query_params][SUCCESS] Validated query params: {validated_query}") - return validated_query - # END_FUNCTION__validate_query_params - - # [ENTITY: Function('_fetch_total_object_count')] - # CONTRACT: - # PURPOSE: Получение общего количества объектов. - # PRECONDITIONS: `endpoint` должен быть валидным. - # POSTCONDITIONS: Возвращает общее количество объектов. - def _fetch_total_object_count(self, endpoint:str) -> int: - self.logger.debug(f"[DEBUG][SupersetClient._fetch_total_object_count][ENTER] Fetching total object count for endpoint: {endpoint}") - query_params_for_count = {'page': 0, 'page_size': 1} - count = self.network.fetch_paginated_count( - endpoint=endpoint, - query_params=query_params_for_count, - count_field="count" - ) - self.logger.debug(f"[DEBUG][SupersetClient._fetch_total_object_count][SUCCESS] Fetched total object count: {count}") - return count - # END_FUNCTION__fetch_total_object_count - - # [ENTITY: Function('_fetch_all_pages')] - # CONTRACT: - # PURPOSE: Обход всех страниц пагинированного API. - # PRECONDITIONS: `pagination_options` должен содержать необходимые параметры. - # POSTCONDITIONS: Возвращает список всех объектов. 
-    def _fetch_all_pages(self, endpoint:str, pagination_options: Dict) -> List[Dict]:
-        self.logger.debug(f"[DEBUG][SupersetClient._fetch_all_pages][ENTER] Fetching all pages for endpoint: {endpoint}")
-        all_data = self.network.fetch_paginated_data(
-            endpoint=endpoint,
-            pagination_options=pagination_options
-        )
-        self.logger.debug(f"[DEBUG][SupersetClient._fetch_all_pages][SUCCESS] Fetched all pages for endpoint: {endpoint}")
-        return all_data
-    # END_FUNCTION__fetch_all_pages
-
-    # [ENTITY: Function('_validate_import_file')]
-    # CONTRACT:
-    #   PURPOSE: Проверка файла перед импортом.
-    #   PRECONDITIONS: `zip_path` должен быть путем к файлу.
-    #   POSTCONDITIONS: Файл валиден.
-    def _validate_import_file(self, zip_path: Union[str, Path]) -> None:
-        self.logger.debug(f"[DEBUG][SupersetClient._validate_import_file][ENTER] Validating import file: {zip_path}")
-        path = Path(zip_path)
-        if not path.exists():
-            self.logger.error(f"[ERROR][SupersetClient._validate_import_file][FAILURE] Import file does not exist: {zip_path}")
-            raise FileNotFoundError(f"Файл {zip_path} не существует")
-        if not zipfile.is_zipfile(path):
-            self.logger.error(f"[ERROR][SupersetClient._validate_import_file][FAILURE] Import file is not a zip file: {zip_path}")
-            raise InvalidZipFormatError(f"Файл {zip_path} не является ZIP-архивом")
-        with zipfile.ZipFile(path, 'r') as zf:
-            if not any(n.endswith('metadata.yaml') for n in zf.namelist()):
-                self.logger.error(f"[ERROR][SupersetClient._validate_import_file][FAILURE] Import file does not contain metadata.yaml: {zip_path}")
-                raise InvalidZipFormatError(f"Архив {zip_path} не содержит 'metadata.yaml'")
-        self.logger.debug(f"[DEBUG][SupersetClient._validate_import_file][SUCCESS] Validated import file: {zip_path}")
-    # END_FUNCTION__validate_import_file
-
+# [DEF:superset_tool.client:Module]
+#
+# @SEMANTICS: superset, api, client, rest, http, dashboard, dataset, import, export
+# @PURPOSE: Provides a high-level client for the Superset REST API, encapsulating request logic, error handling, and pagination.
+# @LAYER: Domain
+# @RELATION: DEPENDS_ON -> superset_tool.models
+# @RELATION: DEPENDS_ON -> superset_tool.exceptions
+# @RELATION: DEPENDS_ON -> superset_tool.utils
+#
+# @INVARIANT: All network operations must use the internal APIClient instance.
+# @CONSTRAINT: No direct use of 'requests' library outside of APIClient.
+# @PUBLIC_API: SupersetClient
+
+# [SECTION: IMPORTS]
+import json
+import zipfile
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple, Union, cast
+from requests import Response
+from superset_tool.models import SupersetConfig
+from superset_tool.exceptions import ExportError, InvalidZipFormatError
+from superset_tool.utils.fileio import get_filename_from_headers
+from superset_tool.utils.logger import SupersetLogger
+from superset_tool.utils.network import APIClient
+# [/SECTION]
+
+# [DEF:SupersetClient:Class]
+# @PURPOSE: Wrapper class over the Superset REST API, providing methods for working with dashboards and datasets.
+# @RELATION: CREATES_INSTANCE_OF -> APIClient
+# @RELATION: USES -> SupersetConfig
+class SupersetClient:
+    # [DEF:SupersetClient.__init__:Function]
+    # @PURPOSE: Initializes the client, validates the configuration, and creates the network client.
+    # @PRE: `config` must be a valid SupersetConfig object.
+    # @POST: The `logger`, `config`, and `network` attributes are created and ready for use.
+    # @PARAM: config (SupersetConfig) - Connection configuration.
+    # @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
+    def __init__(self, config: SupersetConfig, logger: Optional[SupersetLogger] = None):
+        self.logger = logger or SupersetLogger(name="SupersetClient")
+        self.logger.info("[SupersetClient.__init__][Enter] Initializing SupersetClient.")
+        self._validate_config(config)
+        self.config = config
+        self.network = APIClient(
+            config=config.dict(),
+            verify_ssl=config.verify_ssl,
+            timeout=config.timeout,
+            logger=self.logger,
+        )
+        self.delete_before_reimport: bool = False
+        self.logger.info("[SupersetClient.__init__][Exit] SupersetClient initialized.")
+    # [/DEF:SupersetClient.__init__]
+
+    # [DEF:SupersetClient._validate_config:Function]
+    # @PURPOSE: Checks that the provided configuration object has the correct type.
+    # @PRE: `config` must be provided.
+    # @POST: If the check passes, execution continues.
+    # @THROW: AssertionError - If `config` is not an instance of `SupersetConfig` (raised by the precondition assert below).
+    # @PARAM: config (SupersetConfig) - Object to validate.
+    def _validate_config(self, config: SupersetConfig) -> None:
+        self.logger.debug("[_validate_config][Enter] Validating SupersetConfig.")
+        assert isinstance(config, SupersetConfig), "Configuration must be an instance of SupersetConfig"
+        self.logger.debug("[_validate_config][Exit] Config is valid.")
+    # [/DEF:SupersetClient._validate_config]
+
+    @property
+    def headers(self) -> dict:
+        # [DEF:SupersetClient.headers:Function]
+        # @PURPOSE: Returns the base HTTP headers used by the network client.
+        # @PRE: self.network must be initialized.
+        # @POST: The returned dict contains current headers, including the authorization token.
+        return self.network.headers
+        # [/DEF:SupersetClient.headers]
+
+    # [DEF:SupersetClient.get_dashboards:Function]
+    # @PURPOSE: Fetches the complete list of dashboards, handling pagination automatically.
+    # @RELATION: CALLS -> self._fetch_total_object_count
+    # @RELATION: CALLS -> self._fetch_all_pages
+    # @PRE: self.network must be initialized.
+    # @POST: The returned list contains all dashboards available via the API.
+    # @THROW: APIError - On a network request error.
+    # @PARAM: query (Optional[Dict]) - Additional query parameters for the API.
+    # @RETURN: Tuple[int, List[Dict]] - Tuple of (total count, list of dashboards).
+    def get_dashboards(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
+        assert self.network, "[get_dashboards][PRE] Network client must be initialized."
+        self.logger.info("[get_dashboards][Enter] Fetching dashboards.")
+        validated_query = self._validate_query_params(query or {})
+        if 'columns' not in validated_query:
+            validated_query['columns'] = ["slug", "id", "changed_on_utc", "dashboard_title", "published"]
+        total_count = self._fetch_total_object_count(endpoint="/dashboard/")
+        paginated_data = self._fetch_all_pages(
+            endpoint="/dashboard/",
+            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+        )
+        self.logger.info("[get_dashboards][Exit] Found %d dashboards.", total_count)
+        return total_count, paginated_data
+    # [/DEF:SupersetClient.get_dashboards]
+
+    # [DEF:SupersetClient.export_dashboard:Function]
+    # @PURPOSE: Exports a dashboard as a ZIP archive.
+    # @RELATION: CALLS -> self.network.request
+    # @PRE: dashboard_id must be a positive integer.
+    # @POST: Returns the binary content of the ZIP archive and the file name.
+    # @THROW: ExportError - If the export failed.
+ # @PARAM: dashboard_id (int) - ID дашборда для экспорта. + # @RETURN: Tuple[bytes, str] - Бинарное содержимое ZIP-архива и имя файла. + def export_dashboard(self, dashboard_id: int) -> Tuple[bytes, str]: + assert isinstance(dashboard_id, int) and dashboard_id > 0, "[export_dashboard][PRE] dashboard_id must be a positive integer." + self.logger.info("[export_dashboard][Enter] Exporting dashboard %s.", dashboard_id) + response = self.network.request( + method="GET", + endpoint="/dashboard/export/", + params={"q": json.dumps([dashboard_id])}, + stream=True, + raw_response=True, + ) + response = cast(Response, response) + self._validate_export_response(response, dashboard_id) + filename = self._resolve_export_filename(response, dashboard_id) + self.logger.info("[export_dashboard][Exit] Exported dashboard %s to %s.", dashboard_id, filename) + return response.content, filename + # [/DEF:SupersetClient.export_dashboard] + + # [DEF:SupersetClient.import_dashboard:Function] + # @PURPOSE: Импортирует дашборд из ZIP-файла с возможностью автоматического удаления и повторной попытки при ошибке. + # @RELATION: CALLS -> self._do_import + # @RELATION: CALLS -> self.delete_dashboard + # @RELATION: CALLS -> self.get_dashboards + # @PRE: Файл, указанный в `file_name`, должен существовать и быть валидным ZIP-архивом Superset. + # @POST: Дашборд успешно импортирован, возвращен ответ API. + # @THROW: FileNotFoundError - Если файл не найден. + # @THROW: InvalidZipFormatError - Если файл не является валидным ZIP-архивом Superset. + # @PARAM: file_name (Union[str, Path]) - Путь к ZIP-архиву. + # @PARAM: dash_id (Optional[int]) - ID дашборда для удаления при сбое. + # @PARAM: dash_slug (Optional[str]) - Slug дашборда для поиска ID, если ID не предоставлен. + # @RETURN: Dict - Ответ API в случае успеха. + def import_dashboard(self, file_name: Union[str, Path], dash_id: Optional[int] = None, dash_slug: Optional[str] = None) -> Dict: + assert file_name, "[import_dashboard][PRE] file_name must be provided." + file_path = str(file_name) + self._validate_import_file(file_path) + try: + return self._do_import(file_path) + except Exception as exc: + self.logger.error("[import_dashboard][Failure] First import attempt failed: %s", exc, exc_info=True) + if not self.delete_before_reimport: + raise + + target_id = self._resolve_target_id_for_delete(dash_id, dash_slug) + if target_id is None: + self.logger.error("[import_dashboard][Failure] No ID available for delete-retry.") + raise + + self.delete_dashboard(target_id) + self.logger.info("[import_dashboard][State] Deleted dashboard ID %s, retrying import.", target_id) + return self._do_import(file_path) + # [/DEF:SupersetClient.import_dashboard] + + # [DEF:SupersetClient._resolve_target_id_for_delete:Function] + # @PURPOSE: Определяет ID дашборда для удаления, используя ID или slug. + # @PARAM: dash_id (Optional[int]) - ID дашборда. + # @PARAM: dash_slug (Optional[str]) - Slug дашборда. + # @PRE: По крайней мере один из параметров (dash_id или dash_slug) должен быть предоставлен. + # @POST: Возвращает ID дашборда, если найден, иначе None. + # @THROW: APIError - В случае ошибки сетевого запроса при поиске по slug. + # @RETURN: Optional[int] - Найденный ID или None. + def _resolve_target_id_for_delete(self, dash_id: Optional[int], dash_slug: Optional[str]) -> Optional[int]: + assert dash_id is not None or dash_slug is not None, "[_resolve_target_id_for_delete][PRE] At least one of ID or slug must be provided." 
+ if dash_id is not None: + return dash_id + if dash_slug is not None: + self.logger.debug("[_resolve_target_id_for_delete][State] Resolving ID by slug '%s'.", dash_slug) + try: + _, candidates = self.get_dashboards(query={"filters": [{"col": "slug", "op": "eq", "value": dash_slug}]}) + if candidates: + target_id = candidates[0]["id"] + self.logger.debug("[_resolve_target_id_for_delete][Success] Resolved slug to ID %s.", target_id) + return target_id + except Exception as e: + self.logger.warning("[_resolve_target_id_for_delete][Warning] Could not resolve slug '%s' to ID: %s", dash_slug, e) + return None + # [/DEF:SupersetClient._resolve_target_id_for_delete] + + # [DEF:SupersetClient._do_import:Function] + # @PURPOSE: Выполняет один запрос на импорт без обработки исключений. + # @PRE: Файл должен существовать. + # @POST: Файл успешно загружен, возвращен ответ API. + # @THROW: FileNotFoundError - Если файл не существует. + # @PARAM: file_name (Union[str, Path]) - Путь к файлу. + # @RETURN: Dict - Ответ API. + def _do_import(self, file_name: Union[str, Path]) -> Dict: + self.logger.debug(f"[_do_import][State] Uploading file: {file_name}") + file_path = Path(file_name) + if file_path.exists(): + self.logger.debug(f"[_do_import][State] File size: {file_path.stat().st_size} bytes") + else: + self.logger.error(f"[_do_import][Failure] File does not exist: {file_name}") + raise FileNotFoundError(f"File does not exist: {file_name}") + return self.network.upload_file( + endpoint="/dashboard/import/", + file_info={"file_obj": file_path, "file_name": file_path.name, "form_field": "formData"}, + extra_data={"overwrite": "true"}, + timeout=self.config.timeout * 2, + ) + # [/DEF:SupersetClient._do_import] + + # [DEF:SupersetClient.delete_dashboard:Function] + # @PURPOSE: Удаляет дашборд по его ID или slug. + # @RELATION: CALLS -> self.network.request + # @PRE: dashboard_id должен быть предоставлен. + # @POST: Дашборд удален или залогировано предупреждение. + # @THROW: APIError - В случае ошибки сетевого запроса. + # @PARAM: dashboard_id (Union[int, str]) - ID или slug дашборда. + def delete_dashboard(self, dashboard_id: Union[int, str]) -> None: + assert dashboard_id, "[delete_dashboard][PRE] dashboard_id must be provided." + self.logger.info("[delete_dashboard][Enter] Deleting dashboard %s.", dashboard_id) + response = self.network.request(method="DELETE", endpoint=f"/dashboard/{dashboard_id}") + response = cast(Dict, response) + if response.get("result", True) is not False: + self.logger.info("[delete_dashboard][Success] Dashboard %s deleted.", dashboard_id) + else: + self.logger.warning("[delete_dashboard][Warning] Unexpected response while deleting %s: %s", dashboard_id, response) + # [/DEF:SupersetClient.delete_dashboard] + + # [DEF:SupersetClient._extract_dashboard_id_from_zip:Function] + # @PURPOSE: Извлекает ID дашборда из `metadata.yaml` внутри ZIP-архива. + # @PARAM: file_name (Union[str, Path]) - Путь к ZIP-файлу. + # @PRE: Файл, указанный в `file_name`, должен быть валидным ZIP-архивом. + # @POST: Возвращает ID дашборда, если найден в metadata.yaml, иначе None. + # @THROW: ImportError - Если не установлен `yaml`. + # @RETURN: Optional[int] - ID дашборда или None. + def _extract_dashboard_id_from_zip(self, file_name: Union[str, Path]) -> Optional[int]: + assert zipfile.is_zipfile(file_name), "[_extract_dashboard_id_from_zip][PRE] file_name must be a valid zip file." 
+        # NOTE: `yaml` is imported outside the try-block so a missing PyYAML
+        # surfaces as ImportError, as the contract above documents.
+        import yaml
+        try:
+            with zipfile.ZipFile(file_name, "r") as zf:
+                for name in zf.namelist():
+                    if name.endswith("metadata.yaml"):
+                        with zf.open(name) as meta_file:
+                            meta = yaml.safe_load(meta_file)
+                            # `dashboard_uuid` is a UUID string and cannot be cast to int;
+                            # only the numeric `dashboard_id` satisfies the Optional[int] contract.
+                            dash_id = meta.get("dashboard_id")
+                            if dash_id is not None:
+                                return int(dash_id)
+        except Exception as exc:
+            self.logger.error("[_extract_dashboard_id_from_zip][Failure] %s", exc, exc_info=True)
+        return None
+    # [/DEF:SupersetClient._extract_dashboard_id_from_zip]
+
+    # [DEF:SupersetClient._extract_dashboard_slug_from_zip:Function]
+    # @PURPOSE: Извлекает slug дашборда из `metadata.yaml` внутри ZIP-архива.
+    # @PARAM: file_name (Union[str, Path]) - Путь к ZIP-файлу.
+    # @PRE: Файл, указанный в `file_name`, должен быть валидным ZIP-архивом.
+    # @POST: Возвращает slug дашборда, если найден в metadata.yaml, иначе None.
+    # @THROW: ImportError - Если не установлен `yaml`.
+    # @RETURN: Optional[str] - Slug дашборда или None.
+    def _extract_dashboard_slug_from_zip(self, file_name: Union[str, Path]) -> Optional[str]:
+        assert zipfile.is_zipfile(file_name), "[_extract_dashboard_slug_from_zip][PRE] file_name must be a valid zip file."
+        import yaml  # outside the try-block for the same reason as above
+        try:
+            with zipfile.ZipFile(file_name, "r") as zf:
+                for name in zf.namelist():
+                    if name.endswith("metadata.yaml"):
+                        with zf.open(name) as meta_file:
+                            meta = yaml.safe_load(meta_file)
+                            if slug := meta.get("slug"):
+                                return str(slug)
+        except Exception as exc:
+            self.logger.error("[_extract_dashboard_slug_from_zip][Failure] %s", exc, exc_info=True)
+        return None
+    # [/DEF:SupersetClient._extract_dashboard_slug_from_zip]
+
+    # [DEF:SupersetClient._validate_export_response:Function]
+    # @PURPOSE: Проверяет, что HTTP-ответ на экспорт является валидным ZIP-архивом.
+    # @PRE: response должен быть объектом requests.Response.
+    # @POST: Проверка пройдена, если ответ является непустым ZIP-архивом.
+    # @THROW: ExportError - Если ответ не является ZIP-архивом или пуст.
+    # @PARAM: response (Response) - HTTP ответ.
+    # @PARAM: dashboard_id (int) - ID дашборда.
+    def _validate_export_response(self, response: Response, dashboard_id: int) -> None:
+        assert isinstance(response, Response), "[_validate_export_response][PRE] response must be a requests.Response object."
+        content_type = response.headers.get("Content-Type", "")
+        if "application/zip" not in content_type:
+            raise ExportError(f"Получен не ZIP-архив (Content-Type: {content_type})")
+        if not response.content:
+            raise ExportError("Получены пустые данные при экспорте")
+    # [/DEF:SupersetClient._validate_export_response]
+
+    # [DEF:SupersetClient._resolve_export_filename:Function]
+    # @PURPOSE: Определяет имя файла для экспорта из заголовков или генерирует его.
+    # @PRE: response должен быть объектом requests.Response.
+    # @POST: Возвращает непустое имя файла.
+    # @PARAM: response (Response) - HTTP ответ.
+    # @PARAM: dashboard_id (int) - ID дашборда.
+    # @RETURN: str - Имя файла.
+    def _resolve_export_filename(self, response: Response, dashboard_id: int) -> str:
+        assert isinstance(response, Response), "[_resolve_export_filename][PRE] response must be a requests.Response object."
+ filename = get_filename_from_headers(dict(response.headers)) + if not filename: + from datetime import datetime + timestamp = datetime.now().strftime("%Y%m%dT%H%M%S") + filename = f"dashboard_export_{dashboard_id}_{timestamp}.zip" + self.logger.warning("[_resolve_export_filename][Warning] Generated filename: %s", filename) + return filename + # [/DEF:SupersetClient._resolve_export_filename] + + # [DEF:SupersetClient._validate_query_params:Function] + # @PURPOSE: Формирует корректный набор параметров запроса с пагинацией. + # @PARAM: query (Optional[Dict]) - Исходные параметры. + # @PRE: query, если предоставлен, должен быть словарем. + # @POST: Возвращает словарь, содержащий базовые параметры пагинации, объединенные с `query`. + # @RETURN: Dict - Валидированные параметры. + def _validate_query_params(self, query: Optional[Dict]) -> Dict: + assert query is None or isinstance(query, dict), "[_validate_query_params][PRE] query must be a dictionary or None." + base_query = {"page": 0, "page_size": 1000} + return {**base_query, **(query or {})} + # [/DEF:SupersetClient._validate_query_params] + + # [DEF:SupersetClient._fetch_total_object_count:Function] + # @PURPOSE: Получает общее количество объектов по указанному эндпоинту для пагинации. + # @PARAM: endpoint (str) - API эндпоинт. + # @PRE: endpoint должен быть непустой строкой. + # @POST: Возвращает общее количество объектов (>= 0). + # @THROW: APIError - В случае ошибки сетевого запроса. + # @RETURN: int - Количество объектов. + def _fetch_total_object_count(self, endpoint: str) -> int: + assert endpoint and isinstance(endpoint, str), "[_fetch_total_object_count][PRE] endpoint must be a non-empty string." + return self.network.fetch_paginated_count( + endpoint=endpoint, + query_params={"page": 0, "page_size": 1}, + count_field="count", + ) + # [/DEF:SupersetClient._fetch_total_object_count] + + # [DEF:SupersetClient._fetch_all_pages:Function] + # @PURPOSE: Итерируется по всем страницам пагинированного API и собирает все данные. + # @PARAM: endpoint (str) - API эндпоинт. + # @PARAM: pagination_options (Dict) - Опции пагинации. + # @PRE: endpoint должен быть непустой строкой, pagination_options - словарем. + # @POST: Возвращает полный список объектов. + # @THROW: APIError - В случае ошибки сетевого запроса. + # @RETURN: List[Dict] - Список всех объектов. + def _fetch_all_pages(self, endpoint: str, pagination_options: Dict) -> List[Dict]: + assert endpoint and isinstance(endpoint, str), "[_fetch_all_pages][PRE] endpoint must be a non-empty string." + assert isinstance(pagination_options, dict), "[_fetch_all_pages][PRE] pagination_options must be a dictionary." + return self.network.fetch_paginated_data(endpoint=endpoint, pagination_options=pagination_options) + # [/DEF:SupersetClient._fetch_all_pages] + + # [DEF:SupersetClient._validate_import_file:Function] + # @PURPOSE: Проверяет, что файл существует, является ZIP-архивом и содержит `metadata.yaml`. + # @PRE: zip_path должен быть предоставлен. + # @POST: Проверка пройдена, если файл существует, является ZIP и содержит `metadata.yaml`. + # @THROW: FileNotFoundError - Если файл не найден. + # @THROW: InvalidZipFormatError - Если файл не является ZIP или не содержит `metadata.yaml`. + # @PARAM: zip_path (Union[str, Path]) - Путь к файлу. + def _validate_import_file(self, zip_path: Union[str, Path]) -> None: + assert zip_path, "[_validate_import_file][PRE] zip_path must be provided." 
+        # Explicit exceptions instead of asserts: these checks must also run under
+        # `python -O`, and the contract above promises FileNotFoundError /
+        # InvalidZipFormatError rather than AssertionError.
+        path = Path(zip_path)
+        if not path.exists():
+            raise FileNotFoundError(f"Файл {zip_path} не существует")
+        if not zipfile.is_zipfile(path):
+            raise InvalidZipFormatError(f"Файл {zip_path} не является ZIP-архивом")
+        with zipfile.ZipFile(path, "r") as zf:
+            if not any(n.endswith("metadata.yaml") for n in zf.namelist()):
+                raise InvalidZipFormatError(f"Архив {zip_path} не содержит 'metadata.yaml'")
+    # [/DEF:SupersetClient._validate_import_file]
+
+    # [DEF:SupersetClient.get_datasets:Function]
+    # @PURPOSE: Получает полный список датасетов, автоматически обрабатывая пагинацию.
+    # @RELATION: CALLS -> self._fetch_total_object_count
+    # @RELATION: CALLS -> self._fetch_all_pages
+    # @PARAM: query (Optional[Dict]) - Дополнительные параметры запроса.
+    # @PRE: self.network должен быть инициализирован.
+    # @POST: Возвращаемый список содержит все датасеты, доступные по API.
+    # @THROW: APIError - В случае ошибки сетевого запроса.
+    # @RETURN: Tuple[int, List[Dict]] - Кортеж (общее количество, список датасетов).
+    def get_datasets(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
+        assert self.network, "[get_datasets][PRE] Network client must be initialized."
+        self.logger.info("[get_datasets][Enter] Fetching datasets.")
+        validated_query = self._validate_query_params(query)
+
+        total_count = self._fetch_total_object_count(endpoint="/dataset/")
+        paginated_data = self._fetch_all_pages(
+            endpoint="/dataset/",
+            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+        )
+        self.logger.info("[get_datasets][Exit] Found %d datasets.", total_count)
+        return total_count, paginated_data
+    # [/DEF:SupersetClient.get_datasets]
+
+    # [DEF:SupersetClient.get_databases:Function]
+    # @PURPOSE: Получает полный список баз данных, автоматически обрабатывая пагинацию.
+    # @RELATION: CALLS -> self._fetch_total_object_count
+    # @RELATION: CALLS -> self._fetch_all_pages
+    # @PARAM: query (Optional[Dict]) - Дополнительные параметры запроса.
+    # @PRE: self.network должен быть инициализирован.
+    # @POST: Возвращаемый список содержит все базы данных, доступные по API.
+    # @THROW: APIError - В случае ошибки сетевого запроса.
+    # @RETURN: Tuple[int, List[Dict]] - Кортеж (общее количество, список баз данных).
+    def get_databases(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
+        assert self.network, "[get_databases][PRE] Network client must be initialized."
+        self.logger.info("[get_databases][Enter] Fetching databases.")
+        validated_query = self._validate_query_params(query or {})
+        if 'columns' not in validated_query:
+            validated_query['columns'] = []
+        total_count = self._fetch_total_object_count(endpoint="/database/")
+        paginated_data = self._fetch_all_pages(
+            endpoint="/database/",
+            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+        )
+        self.logger.info("[get_databases][Exit] Found %d databases.", total_count)
+        return total_count, paginated_data
+    # [/DEF:SupersetClient.get_databases]
+
+    # [DEF:SupersetClient.get_dataset:Function]
+    # @PURPOSE: Получает информацию о конкретном датасете по его ID.
+    # @RELATION: CALLS -> self.network.request
+    # @PARAM: dataset_id (int) - ID датасета.
+    # @PRE: dataset_id должен быть положительным целым числом.
+    # @POST: Возвращает словарь с информацией о датасете.
+    # @THROW: APIError - В случае ошибки сетевого запроса или если датасет не найден.
+    # @RETURN: Dict - Информация о датасете.
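# --- Editorial sketch (not part of the patch): the shared pagination pattern. ---
# Every list endpoint above follows the same two-step contract; spelled out with
# the keys the code actually passes (the endpoint value is illustrative):
total = client._fetch_total_object_count(endpoint="/dataset/")
rows = client._fetch_all_pages(
    endpoint="/dataset/",
    pagination_options={
        "base_query": {"page": 0, "page_size": 1000},  # defaults merged by _validate_query_params
        "total_count": total,                          # upper bound for the page loop
        "results_field": "result",                     # Superset wraps list payloads in {"result": [...]}
    },
)
# --- End sketch ---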
+ def get_dataset(self, dataset_id: int) -> Dict: + assert isinstance(dataset_id, int) and dataset_id > 0, "[get_dataset][PRE] dataset_id must be a positive integer." + self.logger.info("[get_dataset][Enter] Fetching dataset %s.", dataset_id) + response = self.network.request(method="GET", endpoint=f"/dataset/{dataset_id}") + response = cast(Dict, response) + self.logger.info("[get_dataset][Exit] Got dataset %s.", dataset_id) + return response + # [/DEF:SupersetClient.get_dataset] + + # [DEF:SupersetClient.get_database:Function] + # @PURPOSE: Получает информацию о конкретной базе данных по её ID. + # @RELATION: CALLS -> self.network.request + # @PARAM: database_id (int) - ID базы данных. + # @PRE: database_id должен быть положительным целым числом. + # @POST: Возвращает словарь с информацией о базе данных. + # @THROW: APIError - В случае ошибки сетевого запроса или если база данных не найдена. + # @RETURN: Dict - Информация о базе данных. + def get_database(self, database_id: int) -> Dict: + assert isinstance(database_id, int) and database_id > 0, "[get_database][PRE] database_id must be a positive integer." + self.logger.info("[get_database][Enter] Fetching database %s.", database_id) + response = self.network.request(method="GET", endpoint=f"/database/{database_id}") + response = cast(Dict, response) + self.logger.info("[get_database][Exit] Got database %s.", database_id) + return response + # [/DEF:SupersetClient.get_database] + + # [DEF:SupersetClient.update_dataset:Function] + # @PURPOSE: Обновляет данные датасета по его ID. + # @RELATION: CALLS -> self.network.request + # @PARAM: dataset_id (int) - ID датасета. + # @PARAM: data (Dict) - Данные для обновления. + # @PRE: dataset_id должен быть положительным целым числом, data - непустым словарем. + # @POST: Датасет успешно обновлен, возвращен ответ API. + # @THROW: APIError - В случае ошибки сетевого запроса. + # @RETURN: Dict - Ответ API. + def update_dataset(self, dataset_id: int, data: Dict) -> Dict: + assert isinstance(dataset_id, int) and dataset_id > 0, "[update_dataset][PRE] dataset_id must be a positive integer." + assert isinstance(data, dict) and data, "[update_dataset][PRE] data must be a non-empty dictionary." + self.logger.info("[update_dataset][Enter] Updating dataset %s.", dataset_id) + response = self.network.request( + method="PUT", + endpoint=f"/dataset/{dataset_id}", + data=json.dumps(data), + headers={'Content-Type': 'application/json'} + ) + response = cast(Dict, response) + self.logger.info("[update_dataset][Exit] Updated dataset %s.", dataset_id) + return response + # [/DEF:SupersetClient.update_dataset] + +# [/DEF:SupersetClient] + +# [/DEF:superset_tool.client] diff --git a/superset_tool/exceptions.py b/superset_tool/exceptions.py old mode 100644 new mode 100755 index e371190..febd8f5 --- a/superset_tool/exceptions.py +++ b/superset_tool/exceptions.py @@ -1,124 +1,128 @@ -# pylint: disable=too-many-ancestors -""" -[MODULE] Иерархия исключений -@contract: Все ошибки наследуют `SupersetToolError` для единой точки обработки. -""" - -# [IMPORTS] Standard library -from pathlib import Path - -# [IMPORTS] Typing -from typing import Optional, Dict, Any, Union - -class SupersetToolError(Exception): - """[BASE] Базовый класс для всех ошибок инструмента Superset.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация базового исключения. - # PRECONDITIONS: `context` должен быть словарем или None. - # POSTCONDITIONS: Исключение создано с сообщением и контекстом. 
- def __init__(self, message: str, context: Optional[Dict[str, Any]] = None): - if not isinstance(context, (dict, type(None))): - raise TypeError("Контекст ошибки должен быть словарем или None") - self.context = context or {} - super().__init__(f"{message} | Context: {self.context}") - # END_FUNCTION___init__ - -class AuthenticationError(SupersetToolError): - """[AUTH] Ошибки аутентификации или авторизации.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения аутентификации. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Authentication failed", **context: Any): - super().__init__(f"[AUTH_FAILURE] {message}", context={"type": "authentication", **context}) - # END_FUNCTION___init__ - -class PermissionDeniedError(AuthenticationError): - """[AUTH] Ошибка отказа в доступе.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения отказа в доступе. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Permission denied", required_permission: Optional[str] = None, **context: Any): - full_message = f"Permission denied: {required_permission}" if required_permission else message - super().__init__(full_message, context={"required_permission": required_permission, **context}) - # END_FUNCTION___init__ - -class SupersetAPIError(SupersetToolError): - """[API] Общие ошибки взаимодействия с Superset API.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения ошибки API. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Superset API error", **context: Any): - super().__init__(f"[API_FAILURE] {message}", context={"type": "api_call", **context}) - # END_FUNCTION___init__ - -class ExportError(SupersetAPIError): - """[API:EXPORT] Проблемы, специфичные для операций экспорта.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения ошибки экспорта. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Dashboard export failed", **context: Any): - super().__init__(f"[EXPORT_FAILURE] {message}", context={"subtype": "export", **context}) - # END_FUNCTION___init__ - -class DashboardNotFoundError(SupersetAPIError): - """[API:404] Запрошенный дашборд или ресурс не существует.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения "дашборд не найден". - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, dashboard_id_or_slug: Union[int, str], message: str = "Dashboard not found", **context: Any): - super().__init__(f"[NOT_FOUND] Dashboard '{dashboard_id_or_slug}' {message}", context={"subtype": "not_found", "resource_id": dashboard_id_or_slug, **context}) - # END_FUNCTION___init__ - -class DatasetNotFoundError(SupersetAPIError): - """[API:404] Запрашиваемый набор данных не существует.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения "набор данных не найден". - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. 
- def __init__(self, dataset_id_or_slug: Union[int, str], message: str = "Dataset not found", **context: Any): - super().__init__(f"[NOT_FOUND] Dataset '{dataset_id_or_slug}' {message}", context={"subtype": "not_found", "resource_id": dataset_id_or_slug, **context}) - # END_FUNCTION___init__ - -class InvalidZipFormatError(SupersetToolError): - """[FILE:ZIP] Некорректный формат ZIP-архива.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения некорректного формата ZIP. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Invalid ZIP format or content", file_path: Optional[Union[str, Path]] = None, **context: Any): - super().__init__(f"[FILE_ERROR] {message}", context={"type": "file_validation", "file_path": str(file_path) if file_path else "N/A", **context}) - # END_FUNCTION___init__ - -class NetworkError(SupersetToolError): - """[NETWORK] Проблемы соединения.""" - # [ENTITY: Function('__init__')] - # CONTRACT: - # PURPOSE: Инициализация исключения сетевой ошибки. - # PRECONDITIONS: None - # POSTCONDITIONS: Исключение создано. - def __init__(self, message: str = "Network connection failed", **context: Any): - super().__init__(f"[NETWORK_FAILURE] {message}", context={"type": "network", **context}) - # END_FUNCTION___init__ - -class FileOperationError(SupersetToolError): - """[FILE] Ошибка файловых операций.""" - -class InvalidFileStructureError(FileOperationError): - """[FILE] Некорректная структура файлов/директорий.""" - -class ConfigurationError(SupersetToolError): - """[CONFIG] Ошибка в конфигурации инструмента.""" - +# [DEF:superset_tool.exceptions:Module] +# @PURPOSE: Определяет иерархию пользовательских исключений для всего инструмента, обеспечивая единую точку обработки ошибок. +# @SEMANTICS: exception, error, hierarchy +# @LAYER: Infra + +# [SECTION: IMPORTS] +from pathlib import Path +from typing import Optional, Dict, Any, Union +# [/SECTION] + +# [DEF:SupersetToolError:Class] +# @PURPOSE: Базовый класс для всех ошибок, генерируемых инструментом. +# @RELATION: INHERITS_FROM -> Exception +# @PARAM: message (str) - Сообщение об ошибке. +# @PARAM: context (Optional[Dict[str, Any]]) - Дополнительный контекст ошибки. +class SupersetToolError(Exception): + def __init__(self, message: str, context: Optional[Dict[str, Any]] = None): + self.context = context or {} + super().__init__(f"{message} | Context: {self.context}") +# [/DEF:SupersetToolError] + +# [DEF:AuthenticationError:Class] +# @PURPOSE: Ошибки, связанные с аутентификацией или авторизацией. +# @RELATION: INHERITS_FROM -> SupersetToolError +# @PARAM: message (str) - Сообщение об ошибке. +# @PARAM: context (Any) - Дополнительный контекст ошибки. +class AuthenticationError(SupersetToolError): + def __init__(self, message: str = "Authentication failed", **context: Any): + super().__init__(f"[AUTH_FAILURE] {message}", context={"type": "authentication", **context}) +# [/DEF:AuthenticationError] + +# [DEF:PermissionDeniedError:Class] +# @PURPOSE: Ошибка, возникающая при отказе в доступе к ресурсу. +# @RELATION: INHERITS_FROM -> AuthenticationError +# @PARAM: message (str) - Сообщение об ошибке. +# @PARAM: required_permission (Optional[str]) - Требуемое разрешение. +# @PARAM: context (Any) - Дополнительный контекст ошибки. 
+class PermissionDeniedError(AuthenticationError):
+    def __init__(self, message: str = "Permission denied", required_permission: Optional[str] = None, **context: Any):
+        full_message = f"Permission denied: {required_permission}" if required_permission else message
+        # Pass extra fields as plain keyword arguments: the parent accepts **context,
+        # so a literal `context=` keyword would be captured as a nested 'context' key
+        # instead of being merged into the final context dict.
+        super().__init__(full_message, required_permission=required_permission, **context)
+# [/DEF:PermissionDeniedError]
+
+# [DEF:SupersetAPIError:Class]
+# @PURPOSE: Общие ошибки при взаимодействии с Superset API.
+# @RELATION: INHERITS_FROM -> SupersetToolError
+# @PARAM: message (str) - Сообщение об ошибке.
+# @PARAM: context (Any) - Дополнительный контекст ошибки.
+class SupersetAPIError(SupersetToolError):
+    def __init__(self, message: str = "Superset API error", **context: Any):
+        super().__init__(f"[API_FAILURE] {message}", context={"type": "api_call", **context})
+# [/DEF:SupersetAPIError]
+
+# [DEF:ExportError:Class]
+# @PURPOSE: Ошибки, специфичные для операций экспорта.
+# @RELATION: INHERITS_FROM -> SupersetAPIError
+# @PARAM: message (str) - Сообщение об ошибке.
+# @PARAM: context (Any) - Дополнительный контекст ошибки.
+class ExportError(SupersetAPIError):
+    def __init__(self, message: str = "Dashboard export failed", **context: Any):
+        # Same flat-merge rule as above: SupersetAPIError takes **context.
+        super().__init__(f"[EXPORT_FAILURE] {message}", subtype="export", **context)
+# [/DEF:ExportError]
+
+# [DEF:DashboardNotFoundError:Class]
+# @PURPOSE: Ошибка, когда запрошенный дашборд или ресурс не найден (404).
+# @RELATION: INHERITS_FROM -> SupersetAPIError
+# @PARAM: dashboard_id_or_slug (Union[int, str]) - ID или slug дашборда.
+# @PARAM: message (str) - Сообщение об ошибке.
+# @PARAM: context (Any) - Дополнительный контекст ошибки.
+class DashboardNotFoundError(SupersetAPIError):
+    def __init__(self, dashboard_id_or_slug: Union[int, str], message: str = "Dashboard not found", **context: Any):
+        super().__init__(f"[NOT_FOUND] Dashboard '{dashboard_id_or_slug}' {message}", subtype="not_found", resource_id=dashboard_id_or_slug, **context)
+# [/DEF:DashboardNotFoundError]
+
+# [DEF:DatasetNotFoundError:Class]
+# @PURPOSE: Ошибка, когда запрашиваемый набор данных не существует (404).
+# @RELATION: INHERITS_FROM -> SupersetAPIError
+# @PARAM: dataset_id_or_slug (Union[int, str]) - ID или slug набора данных.
+# @PARAM: message (str) - Сообщение об ошибке.
+# @PARAM: context (Any) - Дополнительный контекст ошибки.
+class DatasetNotFoundError(SupersetAPIError):
+    def __init__(self, dataset_id_or_slug: Union[int, str], message: str = "Dataset not found", **context: Any):
+        super().__init__(f"[NOT_FOUND] Dataset '{dataset_id_or_slug}' {message}", subtype="not_found", resource_id=dataset_id_or_slug, **context)
+# [/DEF:DatasetNotFoundError]
+
+# [DEF:InvalidZipFormatError:Class]
+# @PURPOSE: Ошибка, указывающая на некорректный формат или содержимое ZIP-архива.
+# @RELATION: INHERITS_FROM -> SupersetToolError
+# @PARAM: message (str) - Сообщение об ошибке.
+# @PARAM: file_path (Optional[Union[str, Path]]) - Путь к файлу.
+# @PARAM: context (Any) - Дополнительный контекст ошибки.
+class InvalidZipFormatError(SupersetToolError):
+    def __init__(self, message: str = "Invalid ZIP format or content", file_path: Optional[Union[str, Path]] = None, **context: Any):
+        super().__init__(f"[FILE_ERROR] {message}", context={"type": "file_validation", "file_path": str(file_path) if file_path else "N/A", **context})
+# [/DEF:InvalidZipFormatError]
+
+# [DEF:NetworkError:Class]
+# @PURPOSE: Ошибки, связанные с сетевым соединением.
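# --- Editorial sketch (not part of the patch): single-point error handling. ---
# Because everything inherits from SupersetToolError, callers can branch on the
# hierarchy and still read the structured `context` dict; a minimal sketch:
from superset_tool.exceptions import SupersetAPIError, SupersetToolError

try:
    ...  # any superset_tool operation
except SupersetAPIError as exc:
    print("API call failed:", exc.context.get("subtype", exc.context.get("type")))
except SupersetToolError as exc:
    print("Tool error with context:", exc.context)
# --- End sketch ---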
+# @RELATION: INHERITS_FROM -> SupersetToolError +# @PARAM: message (str) - Сообщение об ошибке. +# @PARAM: context (Any) - Дополнительный контекст ошибки. +class NetworkError(SupersetToolError): + def __init__(self, message: str = "Network connection failed", **context: Any): + super().__init__(f"[NETWORK_FAILURE] {message}", context={"type": "network", **context}) +# [/DEF:NetworkError] + +# [DEF:FileOperationError:Class] +# @PURPOSE: Общие ошибки файловых операций (I/O). +# @RELATION: INHERITS_FROM -> SupersetToolError +class FileOperationError(SupersetToolError): + pass +# [/DEF:FileOperationError] + +# [DEF:InvalidFileStructureError:Class] +# @PURPOSE: Ошибка, указывающая на некорректную структуру файлов или директорий. +# @RELATION: INHERITS_FROM -> FileOperationError +class InvalidFileStructureError(FileOperationError): + pass +# [/DEF:InvalidFileStructureError] + +# [DEF:ConfigurationError:Class] +# @PURPOSE: Ошибки, связанные с неверной конфигурацией инструмента. +# @RELATION: INHERITS_FROM -> SupersetToolError +class ConfigurationError(SupersetToolError): + pass +# [/DEF:ConfigurationError] + +# [/DEF:superset_tool.exceptions] \ No newline at end of file diff --git a/superset_tool/models.py b/superset_tool/models.py old mode 100644 new mode 100755 index 55a11d7..c3f108a --- a/superset_tool/models.py +++ b/superset_tool/models.py @@ -1,89 +1,87 @@ -# pylint: disable=no-self-argument,too-few-public-methods -""" -[MODULE] Сущности данных конфигурации -@desc: Определяет структуры данных, используемые для конфигурации и трансформации в инструменте Superset. -""" - -# [IMPORTS] Pydantic и Typing -from typing import Optional, Dict, Any -from pydantic import BaseModel, validator, Field, HttpUrl, VERSION - -# [IMPORTS] Локальные модули -from .utils.logger import SupersetLogger - -class SupersetConfig(BaseModel): - """ - [CONFIG] Конфигурация подключения к Superset API. - """ - base_url: str = Field(..., description="Базовый URL Superset API, включая версию /api/v1.", pattern=r'.*/api/v1.*') - auth: Dict[str, str] = Field(..., description="Словарь с данными для аутентификации (provider, username, password, refresh).") - verify_ssl: bool = Field(True, description="Флаг для проверки SSL-сертификатов.") - timeout: int = Field(30, description="Таймаут в секундах для HTTP-запросов.") - logger: Optional[SupersetLogger] = Field(None, description="Экземпляр логгера для логирования внутри клиента.") - - # [ENTITY: Function('validate_auth')] - # CONTRACT: - # PURPOSE: Валидация словаря `auth`. - # PRECONDITIONS: `v` должен быть словарем. - # POSTCONDITIONS: Возвращает `v` если все обязательные поля присутствуют. - @validator('auth') - def validate_auth(cls, v: Dict[str, str], values: dict) -> Dict[str, str]: - logger = values.get('logger') or SupersetLogger(name="SupersetConfig") - logger.debug("[DEBUG][SupersetConfig.validate_auth][ENTER] Validating auth.") - required = {'provider', 'username', 'password', 'refresh'} - if not required.issubset(v.keys()): - logger.error("[ERROR][SupersetConfig.validate_auth][FAILURE] Missing required auth fields.") - raise ValueError(f"Словарь 'auth' должен содержать поля: {required}. Отсутствующие: {required - v.keys()}") - logger.debug("[DEBUG][SupersetConfig.validate_auth][SUCCESS] Auth validated.") - return v - # END_FUNCTION_validate_auth - - # [ENTITY: Function('check_base_url_format')] - # CONTRACT: - # PURPOSE: Валидация формата `base_url`. - # PRECONDITIONS: `v` должна быть строкой. - # POSTCONDITIONS: Возвращает `v` если это валидный URL. 
- @validator('base_url') - def check_base_url_format(cls, v: str, values: dict) -> str: - logger = values.get('logger') or SupersetLogger(name="SupersetConfig") - logger.debug("[DEBUG][SupersetConfig.check_base_url_format][ENTER] Validating base_url.") - try: - if VERSION.startswith('1'): - HttpUrl(v) - except (ValueError, TypeError) as exc: - logger.error("[ERROR][SupersetConfig.check_base_url_format][FAILURE] Invalid base_url format.") - raise ValueError(f"Invalid URL format: {v}") from exc - logger.debug("[DEBUG][SupersetConfig.check_base_url_format][SUCCESS] base_url validated.") - return v - # END_FUNCTION_check_base_url_format - - class Config: - """Pydantic config""" - arbitrary_types_allowed = True - -class DatabaseConfig(BaseModel): - """ - [CONFIG] Параметры трансформации баз данных при миграции дашбордов. - """ - database_config: Dict[str, Dict[str, Any]] = Field(..., description="Словарь, содержащий 'old' и 'new' конфигурации базы данных.") - logger: Optional[SupersetLogger] = Field(None, description="Экземпляр логгера для логирования.") - - # [ENTITY: Function('validate_config')] - # CONTRACT: - # PURPOSE: Валидация словаря `database_config`. - # PRECONDITIONS: `v` должен быть словарем. - # POSTCONDITIONS: Возвращает `v` если содержит ключи 'old' и 'new'. - @validator('database_config') - def validate_config(cls, v: Dict[str, Dict[str, Any]], values: dict) -> Dict[str, Dict[str, Any]]: - logger = values.get('logger') or SupersetLogger(name="DatabaseConfig") - logger.debug("[DEBUG][DatabaseConfig.validate_config][ENTER] Validating database_config.") - if not {'old', 'new'}.issubset(v.keys()): - logger.error("[ERROR][DatabaseConfig.validate_config][FAILURE] Missing 'old' or 'new' keys in database_config.") - raise ValueError("'database_config' должен содержать ключи 'old' и 'new'.") - logger.debug("[DEBUG][DatabaseConfig.validate_config][SUCCESS] database_config validated.") - return v - # END_FUNCTION_validate_config - - class Config: - """Pydantic config""" - arbitrary_types_allowed = True +# [DEF:superset_tool.models:Module] +# +# @SEMANTICS: pydantic, model, config, validation, data-structure +# @PURPOSE: Определяет Pydantic-модели для конфигурации инструмента, обеспечивая валидацию данных. +# @LAYER: Infra +# @RELATION: DEPENDS_ON -> pydantic +# @RELATION: DEPENDS_ON -> superset_tool.utils.logger +# @PUBLIC_API: SupersetConfig, DatabaseConfig + +# [SECTION: IMPORTS] +import re +from typing import Optional, Dict, Any +from pydantic import BaseModel, validator, Field +from .utils.logger import SupersetLogger +# [/SECTION] + +# [DEF:SupersetConfig:Class] +# @PURPOSE: Модель конфигурации для подключения к одному экземпляру Superset API. +# @RELATION: INHERITS_FROM -> pydantic.BaseModel +class SupersetConfig(BaseModel): + env: str = Field(..., description="Название окружения (например, dev, prod).") + base_url: str = Field(..., description="Базовый URL Superset API, включая /api/v1.") + auth: Dict[str, str] = Field(..., description="Словарь с данными для аутентификации (provider, username, password, refresh).") + verify_ssl: bool = Field(True, description="Флаг для проверки SSL-сертификатов.") + timeout: int = Field(30, description="Таймаут в секундах для HTTP-запросов.") + logger: Optional[SupersetLogger] = Field(None, description="Экземпляр логгера для логирования.") + + # [DEF:SupersetConfig.validate_auth:Function] + # @PURPOSE: Проверяет, что словарь `auth` содержит все необходимые для аутентификации поля. + # @PRE: `v` должен быть словарем. 
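# --- Editorial sketch (not part of the patch): what the validators below enforce. ---
# `validate_auth` rejects incomplete auth dicts and `normalize_base_url` appends
# '/api/v1' when it is missing; a hypothetical round-trip with placeholder values:
cfg = SupersetConfig(
    env="dev",
    base_url="https://superset.example.com",  # no /api/v1 suffix here...
    auth={"provider": "db", "username": "u", "password": "p", "refresh": "true"},
)
assert cfg.base_url == "https://superset.example.com/api/v1"  # ...added by the validator
# --- End sketch ---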
+ # @POST: Возвращает `v`, если все обязательные поля (`provider`, `username`, `password`, `refresh`) присутствуют. + # @THROW: ValueError - Если отсутствуют обязательные поля. + # @PARAM: v (Dict[str, str]) - Значение поля auth. + @validator('auth') + def validate_auth(cls, v: Dict[str, str]) -> Dict[str, str]: + required = {'provider', 'username', 'password', 'refresh'} + if not required.issubset(v.keys()): + raise ValueError(f"Словарь 'auth' должен содержать поля: {required}. Отсутствующие: {required - v.keys()}") + return v + # [/DEF:SupersetConfig.validate_auth] + + # [DEF:SupersetConfig.normalize_base_url:Function] + # @PURPOSE: Нормализует `base_url`, добавляя `/api/v1`, если он отсутствует. + # @PRE: `v` должна быть строкой. + # @POST: Возвращает нормализованный `v`. + # @THROW: ValueError - Если формат URL невалиден. + # @PARAM: v (str) - Значение поля base_url. + @validator('base_url') + def normalize_base_url(cls, v: str) -> str: + v = v.strip() + if not v.startswith(('http://', 'https://')): + raise ValueError(f"Invalid URL scheme: {v}. Must start with http:// or https://") + + if '/api/v1' not in v: + v = f"{v.rstrip('/')}/api/v1" + return v + # [/DEF:SupersetConfig.normalize_base_url] + + class Config: + arbitrary_types_allowed = True +# [/DEF:SupersetConfig] + +# [DEF:DatabaseConfig:Class] +# @PURPOSE: Модель для параметров трансформации баз данных при миграции дашбордов. +# @RELATION: INHERITS_FROM -> pydantic.BaseModel +class DatabaseConfig(BaseModel): + database_config: Dict[str, Dict[str, Any]] = Field(..., description="Словарь, содержащий 'old' и 'new' конфигурации базы данных.") + logger: Optional[SupersetLogger] = Field(None, description="Экземпляр логгера для логирования.") + + # [DEF:DatabaseConfig.validate_config:Function] + # @PURPOSE: Проверяет, что словарь `database_config` содержит ключи 'old' и 'new'. + # @PRE: `v` должен быть словарем. + # @POST: Возвращает `v`, если ключи 'old' и 'new' присутствуют. + # @THROW: ValueError - Если отсутствуют обязательные ключи. + # @PARAM: v (Dict[str, Dict[str, Any]]) - Значение поля database_config. + @validator('database_config') + def validate_config(cls, v: Dict[str, Dict[str, Any]]) -> Dict[str, Dict[str, Any]]: + if not {'old', 'new'}.issubset(v.keys()): + raise ValueError("'database_config' должен содержать ключи 'old' и 'new'.") + return v + # [/DEF:DatabaseConfig.validate_config] + + class Config: + arbitrary_types_allowed = True +# [/DEF:DatabaseConfig] + +# [/DEF:superset_tool.models] diff --git a/superset_tool/requirements.txt b/superset_tool/requirements.txt old mode 100644 new mode 100755 diff --git a/superset_tool/utils/__init__.py b/superset_tool/utils/__init__.py new file mode 100755 index 0000000..793209e --- /dev/null +++ b/superset_tool/utils/__init__.py @@ -0,0 +1,5 @@ +# [DEF:superset_tool.utils:Module] +# @SEMANTICS: package, utils +# @PURPOSE: Utility package for superset_tool. +# @LAYER: Infra +# [/DEF:superset_tool.utils] diff --git a/superset_tool/utils/dataset_mapper.py b/superset_tool/utils/dataset_mapper.py new file mode 100755 index 0000000..7ac0ba9 --- /dev/null +++ b/superset_tool/utils/dataset_mapper.py @@ -0,0 +1,229 @@ +# [DEF:superset_tool.utils.dataset_mapper:Module] +# +# @SEMANTICS: dataset, mapping, postgresql, xlsx, superset +# @PURPOSE: Этот модуль отвечает за обновление метаданных (verbose_map) в датасетах Superset, извлекая их из PostgreSQL или XLSX-файлов. 
+# @LAYER: Domain +# @RELATION: DEPENDS_ON -> superset_tool.client +# @RELATION: DEPENDS_ON -> pandas +# @RELATION: DEPENDS_ON -> psycopg2 +# @PUBLIC_API: DatasetMapper + +# [SECTION: IMPORTS] +import pandas as pd # type: ignore +import psycopg2 # type: ignore +from superset_tool.client import SupersetClient +from superset_tool.utils.init_clients import setup_clients +from superset_tool.utils.logger import SupersetLogger +from typing import Dict, List, Optional, Any +# [/SECTION] + +# [DEF:DatasetMapper:Class] +# @PURPOSE: Класс для меппинга и обновления verbose_map в датасетах Superset. +class DatasetMapper: + def __init__(self, logger: SupersetLogger): + self.logger = logger + + # [DEF:DatasetMapper.get_postgres_comments:Function] + # @PURPOSE: Извлекает комментарии к колонкам из системного каталога PostgreSQL. + # @PRE: `db_config` должен содержать валидные креды для подключения к PostgreSQL. + # @PRE: `table_name` и `table_schema` должны быть строками. + # @POST: Возвращается словарь с меппингом `column_name` -> `column_comment`. + # @THROW: Exception - При ошибках подключения или выполнения запроса к БД. + # @PARAM: db_config (Dict) - Конфигурация для подключения к БД. + # @PARAM: table_name (str) - Имя таблицы. + # @PARAM: table_schema (str) - Схема таблицы. + # @RETURN: Dict[str, str] - Словарь с комментариями к колонкам. + def get_postgres_comments(self, db_config: Dict, table_name: str, table_schema: str) -> Dict[str, str]: + self.logger.info("[get_postgres_comments][Enter] Fetching comments from PostgreSQL for %s.%s.", table_schema, table_name) + query = f""" + SELECT + cols.column_name, + CASE + WHEN pg_catalog.col_description( + (SELECT c.oid + FROM pg_catalog.pg_class c + JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace + WHERE c.relname = cols.table_name + AND n.nspname = cols.table_schema), + cols.ordinal_position::int + ) LIKE '%|%' THEN + split_part( + pg_catalog.col_description( + (SELECT c.oid + FROM pg_catalog.pg_class c + JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace + WHERE c.relname = cols.table_name + AND n.nspname = cols.table_schema), + cols.ordinal_position::int + ), + '|', + 1 + ) + ELSE + pg_catalog.col_description( + (SELECT c.oid + FROM pg_catalog.pg_class c + JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace + WHERE c.relname = cols.table_name + AND n.nspname = cols.table_schema), + cols.ordinal_position::int + ) + END AS column_comment + FROM + information_schema.columns cols + WHERE cols.table_catalog = '{db_config.get('dbname')}' AND cols.table_name = '{table_name}' AND cols.table_schema = '{table_schema}'; + """ + comments = {} + try: + with psycopg2.connect(**db_config) as conn, conn.cursor() as cursor: + cursor.execute(query) + for row in cursor.fetchall(): + if row[1]: + comments[row[0]] = row[1] + self.logger.info("[get_postgres_comments][Success] Fetched %d comments.", len(comments)) + except Exception as e: + self.logger.error("[get_postgres_comments][Failure] %s", e, exc_info=True) + raise + return comments + # [/DEF:DatasetMapper.get_postgres_comments] + + # [DEF:DatasetMapper.load_excel_mappings:Function] + # @PURPOSE: Загружает меппинги 'column_name' -> 'column_comment' из XLSX файла. + # @PRE: `file_path` должен быть валидным путем к XLSX файлу с колонками 'column_name' и 'column_comment'. + # @POST: Возвращается словарь с меппингами. + # @THROW: Exception - При ошибках чтения файла или парсинга. + # @PARAM: file_path (str) - Путь к XLSX файлу. + # @RETURN: Dict[str, str] - Словарь с меппингами. 
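# --- Editorial sketch (not part of the patch): a parameterized variant of the lookup above. ---
# get_postgres_comments builds its SQL via f-string interpolation; if table/schema
# names can ever come from untrusted input, binding them as parameters is safer.
# A simplified sketch of the same information_schema lookup (without the '|' split logic):
import psycopg2

def fetch_column_comments(db_config: dict, table_name: str, table_schema: str) -> dict:
    query = """
        SELECT cols.column_name,
               pg_catalog.col_description(c.oid, cols.ordinal_position::int)
        FROM information_schema.columns cols
        JOIN pg_catalog.pg_class c ON c.relname = cols.table_name
        JOIN pg_catalog.pg_namespace n
          ON n.oid = c.relnamespace AND n.nspname = cols.table_schema
        WHERE cols.table_name = %s AND cols.table_schema = %s
    """
    with psycopg2.connect(**db_config) as conn, conn.cursor() as cursor:
        cursor.execute(query, (table_name, table_schema))
        return {name: comment for name, comment in cursor.fetchall() if comment}
# --- End sketch ---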
+    def load_excel_mappings(self, file_path: str) -> Dict[str, str]:
+        self.logger.info("[load_excel_mappings][Enter] Loading mappings from %s.", file_path)
+        try:
+            df = pd.read_excel(file_path)
+            # The contract above documents the file columns as 'column_name' and
+            # 'column_comment'; read the documented column instead of the previously
+            # hardcoded 'verbose_name', mirroring get_postgres_comments.
+            mappings = df.set_index('column_name')['column_comment'].to_dict()
+            self.logger.info("[load_excel_mappings][Success] Loaded %d mappings.", len(mappings))
+            return mappings
+        except Exception as e:
+            self.logger.error("[load_excel_mappings][Failure] %s", e, exc_info=True)
+            raise
+    # [/DEF:DatasetMapper.load_excel_mappings]
+
+    # [DEF:DatasetMapper.run_mapping:Function]
+    # @PURPOSE: Основная функция для выполнения меппинга и обновления verbose_map датасета в Superset.
+    # @RELATION: CALLS -> self.get_postgres_comments
+    # @RELATION: CALLS -> self.load_excel_mappings
+    # @RELATION: CALLS -> superset_client.get_dataset
+    # @RELATION: CALLS -> superset_client.update_dataset
+    # @PARAM: superset_client (SupersetClient) - Клиент Superset.
+    # @PARAM: dataset_id (int) - ID датасета для обновления.
+    # @PARAM: source (str) - Источник данных ('postgres', 'excel', 'both').
+    # @PARAM: postgres_config (Optional[Dict]) - Конфигурация для подключения к PostgreSQL.
+    # @PARAM: excel_path (Optional[str]) - Путь к XLSX файлу.
+    # @PARAM: table_name (Optional[str]) - Имя таблицы в PostgreSQL.
+    # @PARAM: table_schema (Optional[str]) - Схема таблицы в PostgreSQL.
+    def run_mapping(self, superset_client: SupersetClient, dataset_id: int, source: str, postgres_config: Optional[Dict] = None, excel_path: Optional[str] = None, table_name: Optional[str] = None, table_schema: Optional[str] = None):
+        self.logger.info("[run_mapping][Enter] Starting dataset mapping for ID %d from source '%s'.", dataset_id, source)
+        mappings: Dict[str, str] = {}
+
+        try:
+            if source in ['postgres', 'both']:
+                assert postgres_config and table_name and table_schema, "Postgres config is required."
+                mappings.update(self.get_postgres_comments(postgres_config, table_name, table_schema))
+            if source in ['excel', 'both']:
+                assert excel_path, "Excel path is required."
+ mappings.update(self.load_excel_mappings(excel_path)) + if source not in ['postgres', 'excel', 'both']: + self.logger.error("[run_mapping][Failure] Invalid source: %s.", source) + return + + dataset_response = superset_client.get_dataset(dataset_id) + dataset_data = dataset_response['result'] + + original_columns = dataset_data.get('columns', []) + updated_columns = [] + changes_made = False + + for column in original_columns: + col_name = column.get('column_name') + + new_column = { + "column_name": col_name, + "id": column.get("id"), + "advanced_data_type": column.get("advanced_data_type"), + "description": column.get("description"), + "expression": column.get("expression"), + "extra": column.get("extra"), + "filterable": column.get("filterable"), + "groupby": column.get("groupby"), + "is_active": column.get("is_active"), + "is_dttm": column.get("is_dttm"), + "python_date_format": column.get("python_date_format"), + "type": column.get("type"), + "uuid": column.get("uuid"), + "verbose_name": column.get("verbose_name"), + } + + new_column = {k: v for k, v in new_column.items() if v is not None} + + if col_name in mappings: + mapping_value = mappings[col_name] + if isinstance(mapping_value, str) and new_column.get('verbose_name') != mapping_value: + new_column['verbose_name'] = mapping_value + changes_made = True + + updated_columns.append(new_column) + + updated_metrics = [] + for metric in dataset_data.get("metrics", []): + new_metric = { + "id": metric.get("id"), + "metric_name": metric.get("metric_name"), + "expression": metric.get("expression"), + "verbose_name": metric.get("verbose_name"), + "description": metric.get("description"), + "d3format": metric.get("d3format"), + "currency": metric.get("currency"), + "extra": metric.get("extra"), + "warning_text": metric.get("warning_text"), + "metric_type": metric.get("metric_type"), + "uuid": metric.get("uuid"), + } + updated_metrics.append({k: v for k, v in new_metric.items() if v is not None}) + + if changes_made: + payload_for_update = { + "database_id": dataset_data.get("database", {}).get("id"), + "table_name": dataset_data.get("table_name"), + "schema": dataset_data.get("schema"), + "columns": updated_columns, + "owners": [owner["id"] for owner in dataset_data.get("owners", [])], + "metrics": updated_metrics, + "extra": dataset_data.get("extra"), + "description": dataset_data.get("description"), + "sql": dataset_data.get("sql"), + "cache_timeout": dataset_data.get("cache_timeout"), + "catalog": dataset_data.get("catalog"), + "default_endpoint": dataset_data.get("default_endpoint"), + "external_url": dataset_data.get("external_url"), + "fetch_values_predicate": dataset_data.get("fetch_values_predicate"), + "filter_select_enabled": dataset_data.get("filter_select_enabled"), + "is_managed_externally": dataset_data.get("is_managed_externally"), + "is_sqllab_view": dataset_data.get("is_sqllab_view"), + "main_dttm_col": dataset_data.get("main_dttm_col"), + "normalize_columns": dataset_data.get("normalize_columns"), + "offset": dataset_data.get("offset"), + "template_params": dataset_data.get("template_params"), + } + + payload_for_update = {k: v for k, v in payload_for_update.items() if v is not None} + + superset_client.update_dataset(dataset_id, payload_for_update) + self.logger.info("[run_mapping][Success] Dataset %d columns' verbose_name updated.", dataset_id) + else: + self.logger.info("[run_mapping][State] No changes in columns' verbose_name, skipping update.") + + except (AssertionError, FileNotFoundError, Exception) as e: + 
self.logger.error("[run_mapping][Failure] %s", e, exc_info=True) + return + # [/DEF:DatasetMapper.run_mapping] +# [/DEF:DatasetMapper] + +# [/DEF:superset_tool.utils.dataset_mapper] diff --git a/superset_tool/utils/fileio.py b/superset_tool/utils/fileio.py old mode 100644 new mode 100755 index 73e10e1..9c75b09 --- a/superset_tool/utils/fileio.py +++ b/superset_tool/utils/fileio.py @@ -1,678 +1,458 @@ -# -*- coding: utf-8 -*- -# pylint: disable=too-many-arguments,too-many-locals,too-many-statements,too-many-branches,unused-argument -""" -[MODULE] File Operations Manager -@contract: Предоставляет набор утилит для управления файловыми операциями. -""" - -# [IMPORTS] Core -import os -import re -import zipfile -from pathlib import Path -from typing import Any, Optional, Tuple, Dict, List, Union, LiteralString -from contextlib import contextmanager -import tempfile -from datetime import date, datetime -import glob -import shutil -import zlib -from dataclasses import dataclass - -# [IMPORTS] Third-party -import yaml - -# [IMPORTS] Local -from superset_tool.exceptions import InvalidZipFormatError -from superset_tool.utils.logger import SupersetLogger - -# [CONSTANTS] -ALLOWED_FOLDERS = {'databases', 'datasets', 'charts', 'dashboards'} - -# CONTRACT: -# PURPOSE: Контекстный менеджер для создания временного файла или директории, гарантирующий их удаление после использования. -# PRECONDITIONS: -# - `suffix` должен быть строкой, представляющей расширение файла или `.dir` для директории. -# - `mode` должен быть валидным режимом для записи в файл (например, 'wb' для бинарного). -# POSTCONDITIONS: -# - Создает временный ресурс (файл или директорию). -# - Возвращает объект `Path` к созданному ресурсу. -# - Автоматически удаляет ресурс при выходе из контекста `with`. -# PARAMETERS: -# - content: Optional[bytes] - Бинарное содержимое для записи во временный файл. -# - suffix: str - Суффикс для ресурса. Если `.dir`, создается директория. -# - mode: str - Режим записи в файл. -# - logger: Optional[SupersetLogger] - Экземпляр логгера. -# YIELDS: Path - Путь к временному ресурсу. -# EXCEPTIONS: -# - Перехватывает и логирует `Exception`, затем выбрасывает его дальше. 
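# --- Editorial sketch (not part of the patch): driving DatasetMapper end to end. ---
# A hypothetical invocation of run_mapping; the connection values and IDs are
# placeholders, and `client` is a SupersetClient built as sketched earlier.
from superset_tool.utils.dataset_mapper import DatasetMapper
from superset_tool.utils.logger import SupersetLogger

mapper = DatasetMapper(SupersetLogger(name="mapping-example"))
mapper.run_mapping(
    superset_client=client,
    dataset_id=42,                 # hypothetical dataset ID
    source="both",                 # 'postgres', 'excel' or 'both'
    postgres_config={"dbname": "dwh", "user": "reader", "password": "secret", "host": "db.example.com"},
    excel_path="mappings.xlsx",
    table_name="sales",
    table_schema="public",
)
# --- End sketch ---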
-@contextmanager -def create_temp_file( - content: Optional[bytes] = None, - suffix: str = ".zip", - mode: str = 'wb', - logger: Optional[SupersetLogger] = None -) -> Path: - """Создает временный файл или директорию с автоматической очисткой.""" - logger = logger or SupersetLogger(name="fileio", console=False) - temp_resource_path = None - is_dir = suffix.startswith('.dir') - try: - if is_dir: - with tempfile.TemporaryDirectory(suffix=suffix) as temp_dir: - temp_resource_path = Path(temp_dir) - logger.debug(f"[DEBUG][TEMP_RESOURCE] Создана временная директория: {temp_resource_path}") - yield temp_resource_path - else: - with tempfile.NamedTemporaryFile(suffix=suffix, mode=mode, delete=False) as tmp: - temp_resource_path = Path(tmp.name) - if content: - tmp.write(content) - tmp.flush() - logger.debug(f"[DEBUG][TEMP_RESOURCE] Создан временный файл: {temp_resource_path}") - yield temp_resource_path - except IOError as e: - logger.error(f"[STATE][TEMP_RESOURCE] Ошибка создания временного ресурса: {str(e)}", exc_info=True) - raise - finally: - if temp_resource_path and temp_resource_path.exists(): - if is_dir: - shutil.rmtree(temp_resource_path, ignore_errors=True) - logger.debug(f"[DEBUG][TEMP_CLEANUP] Удалена временная директория: {temp_resource_path}") - else: - temp_resource_path.unlink(missing_ok=True) - logger.debug(f"[DEBUG][TEMP_CLEANUP] Удален временный файл: {temp_resource_path}") -# END_FUNCTION_create_temp_file - -# [SECTION] Directory Management Utilities - -# CONTRACT: -# PURPOSE: Рекурсивно удаляет все пустые поддиректории, начиная с указанной корневой директории. -# PRECONDITIONS: -# - `root_dir` должен быть строкой, представляющей существующий путь к директории. -# POSTCONDITIONS: -# - Все пустые директории внутри `root_dir` удалены. -# - Непустые директории и файлы остаются нетронутыми. -# PARAMETERS: -# - root_dir: str - Путь к корневой директории для очистки. -# - logger: Optional[SupersetLogger] - Экземпляр логгера. -# RETURN: int - Количество удаленных директорий. -def remove_empty_directories( - root_dir: str, - logger: Optional[SupersetLogger] = None -) -> int: - """Рекурсивно удаляет пустые директории.""" - logger = logger or SupersetLogger(name="fileio", console=False) - logger.info(f"[STATE][DIR_CLEANUP] Запуск очистки пустых директорий в {root_dir}") - - removed_count = 0 - root_path = Path(root_dir) - - if not root_path.is_dir(): - logger.error(f"[STATE][DIR_NOT_FOUND] Директория не существует или не является директорией: {root_dir}") - return 0 - - for current_dir, _, _ in os.walk(root_path, topdown=False): - if not os.listdir(current_dir): - try: - os.rmdir(current_dir) - removed_count += 1 - logger.info(f"[STATE][DIR_REMOVED] Удалена пустая директория: {current_dir}") - except OSError as e: - logger.error(f"[STATE][DIR_REMOVE_FAILED] Ошибка удаления {current_dir}: {str(e)}") - - logger.info(f"[STATE][DIR_CLEANUP_DONE] Удалено {removed_count} пустых директорий.") - return removed_count -# END_FUNCTION_remove_empty_directories - -# [SECTION] File Operations - -# CONTRACT: -# PURPOSE: Читает бинарное содержимое файла с диска. -# PRECONDITIONS: -# - `file_path` должен быть строкой, представляющей существующий путь к файлу. -# POSTCONDITIONS: -# - Возвращает кортеж, содержащий бинарное содержимое файла и его имя. -# PARAMETERS: -# - file_path: str - Путь к файлу. -# - logger: Optional[SupersetLogger] - Экземпляр логгера. -# RETURN: Tuple[bytes, str] - (содержимое, имя_файла). -# EXCEPTIONS: -# - `FileNotFoundError`, если файл не найден. 
-def read_dashboard_from_disk( - file_path: str, - logger: Optional[SupersetLogger] = None -) -> Tuple[bytes, str]: - """Читает сохраненный дашборд с диска.""" - logger = logger or SupersetLogger(name="fileio", console=False) - path = Path(file_path) - if not path.is_file(): - logger.error(f"[STATE][FILE_NOT_FOUND] Файл не найден: {file_path}") - raise FileNotFoundError(f"Файл дашборда не найден: {file_path}") - - logger.info(f"[STATE][FILE_READ] Чтение файла с диска: {file_path}") - content = path.read_bytes() - if not content: - logger.warning(f"[STATE][FILE_EMPTY] Файл {file_path} пуст.") - - return content, path.name -# END_FUNCTION_read_dashboard_from_disk - -# [SECTION] Archive Management - -# CONTRACT: -# PURPOSE: Вычисляет контрольную сумму CRC32 для файла. -# PRECONDITIONS: -# - `file_path` должен быть валидным путем к существующему файлу. -# POSTCONDITIONS: -# - Возвращает строку с 8-значным шестнадцатеричным представлением CRC32. -# PARAMETERS: -# - file_path: Path - Путь к файлу. -# RETURN: str - Контрольная сумма CRC32. -# EXCEPTIONS: -# - `FileNotFoundError`, `IOError` при ошибках I/O. -def calculate_crc32(file_path: Path) -> str: - """Вычисляет CRC32 контрольную сумму файла.""" - try: - with open(file_path, 'rb') as f: - crc32_value = zlib.crc32(f.read()) - return f"{crc32_value:08x}" - except FileNotFoundError: - raise - except IOError as e: - raise IOError(f"Ошибка вычисления CRC32 для {file_path}: {str(e)}") from e -# END_FUNCTION_calculate_crc32 - -@dataclass -class RetentionPolicy: - """Политика хранения для архивов.""" - daily: int = 7 - weekly: int = 4 - monthly: int = 12 - -# CONTRACT: -# PURPOSE: Управляет архивом экспортированных дашбордов, применяя политику хранения (ротацию) и дедупликацию. -# PRECONDITIONS: -# - `output_dir` должен быть существующей директорией. -# POSTCONDITIONS: -# - Устаревшие архивы удалены в соответствии с политикой. -# - Дубликаты файлов (если `deduplicate=True`) удалены. -# PARAMETERS: -# - output_dir: str - Директория с архивами. -# - policy: RetentionPolicy - Политика хранения. -# - deduplicate: bool - Флаг для включения удаления дубликатов по CRC32. -# - logger: Optional[SupersetLogger] - Экземпляр логгера. -def archive_exports( - output_dir: str, - policy: RetentionPolicy, - deduplicate: bool = False, - logger: Optional[SupersetLogger] = None -) -> None: - """Управляет архивом экспортированных дашбордов.""" - logger = logger or SupersetLogger(name="fileio", console=False) - output_path = Path(output_dir) - if not output_path.is_dir(): - logger.warning(f"[WARN][ARCHIVE] Директория архива не найдена: {output_dir}") - return - - logger.info(f"[INFO][ARCHIVE] Запуск управления архивом в {output_dir}") - - # 1. Дедупликация - if deduplicate: - checksums = {} - duplicates_removed = 0 - for file_path in output_path.glob('*.zip'): - try: - crc32 = calculate_crc32(file_path) - if crc32 in checksums: - logger.info(f"[INFO][DEDUPLICATE] Найден дубликат: {file_path} (CRC32: {crc32}). Удаление.") - file_path.unlink() - duplicates_removed += 1 - else: - checksums[crc32] = file_path - except (IOError, FileNotFoundError) as e: - logger.error(f"[ERROR][DEDUPLICATE] Ошибка обработки файла {file_path}: {e}") - logger.info(f"[INFO][DEDUPLICATE] Удалено дубликатов: {duplicates_removed}") - - # 2. 
Политика хранения - try: - files_with_dates = [] - for file_path in output_path.glob('*.zip'): - try: - # Извлекаем дату из имени файла, например 'dashboard_export_20231027_103000.zip' - match = re.search(r'(\d{8})', file_path.name) - if match: - file_date = datetime.strptime(match.group(1), "%Y%m%d").date() - files_with_dates.append((file_path, file_date)) - except (ValueError, IndexError) as e: - logger.warning(f"[WARN][RETENTION] Не удалось извлечь дату из имени файла {file_path.name}: {e}") - - if not files_with_dates: - logger.info("[INFO][RETENTION] Не найдено файлов для применения политики хранения.") - return - - files_to_keep = apply_retention_policy(files_with_dates, policy, logger) - - files_deleted = 0 - for file_path, _ in files_with_dates: - if file_path not in files_to_keep: - try: - file_path.unlink() - logger.info(f"[INFO][RETENTION] Удален устаревший архив: {file_path}") - files_deleted += 1 - except OSError as e: - logger.error(f"[ERROR][RETENTION] Не удалось удалить файл {file_path}: {e}") - - logger.info(f"[INFO][RETENTION] Политика хранения применена. Удалено файлов: {files_deleted}.") - - except Exception as e: - logger.error(f"[CRITICAL][ARCHIVE] Критическая ошибка при управлении архивом: {e}", exc_info=True) -# END_FUNCTION_archive_exports - -# CONTRACT: -# PURPOSE: (HELPER) Применяет политику хранения к списку файлов с датами. -# PRECONDITIONS: -# - `files_with_dates` - список кортежей (Path, date). -# POSTCONDITIONS: -# - Возвращает множество объектов `Path`, которые должны быть сохранены. -# PARAMETERS: -# - files_with_dates: List[Tuple[Path, date]] - Список файлов. -# - policy: RetentionPolicy - Политика хранения. -# - logger: SupersetLogger - Логгер. -# RETURN: set - Множество файлов для сохранения. -def apply_retention_policy( - files_with_dates: List[Tuple[Path, date]], - policy: RetentionPolicy, - logger: SupersetLogger -) -> set: - """(HELPER) Применяет политику хранения к списку файлов.""" - if not files_with_dates: - return set() - - today = date.today() - files_to_keep = set() - - # Сортируем файлы от новых к старым - files_with_dates.sort(key=lambda x: x[1], reverse=True) - - # Группируем по дням, неделям, месяцам - daily_backups = {} - weekly_backups = {} - monthly_backups = {} - - for file_path, file_date in files_with_dates: - # Daily - if (today - file_date).days < policy.daily: - if file_date not in daily_backups: - daily_backups[file_date] = file_path - - # Weekly - week_key = file_date.isocalendar()[:2] # (year, week) - if week_key not in weekly_backups: - weekly_backups[week_key] = file_path - - # Monthly - month_key = (file_date.year, file_date.month) - if month_key not in monthly_backups: - monthly_backups[month_key] = file_path - - # Собираем файлы для сохранения, применяя лимиты - files_to_keep.update(list(daily_backups.values())[:policy.daily]) - files_to_keep.update(list(weekly_backups.values())[:policy.weekly]) - files_to_keep.update(list(monthly_backups.values())[:policy.monthly]) - - logger.info(f"[INFO][RETENTION_POLICY] Файлов для сохранения после применения политики: {len(files_to_keep)}") - - return files_to_keep -# END_FUNCTION_apply_retention_policy - -# CONTRACT: -# PURPOSE: Сохраняет бинарное содержимое ZIP-архива на диск и опционально распаковывает его. -# PRECONDITIONS: -# - `zip_content` должен быть валидным содержимым ZIP-файла в байтах. -# - `output_dir` должен быть путем, доступным для записи. -# POSTCONDITIONS: -# - ZIP-архив сохранен в `output_dir`. -# - Если `unpack=True`, архив распакован в ту же директорию. 
-# - Возвращает пути к созданному ZIP-файлу и, если применимо, к директории с распакованным содержимым. -# PARAMETERS: -# - zip_content: bytes - Содержимое ZIP-архива. -# - output_dir: Union[str, Path] - Директория для сохранения. -# - unpack: bool - Флаг, нужно ли распаковывать архив. -# - original_filename: Optional[str] - Исходное имя файла. -# - logger: Optional[SupersetLogger] - Экземпляр логгера. -# RETURN: Tuple[Path, Optional[Path]] - (путь_к_zip, путь_к_распаковке_или_None). -# EXCEPTIONS: -# - `InvalidZipFormatError` при ошибке формата ZIP. -def save_and_unpack_dashboard( - zip_content: bytes, - output_dir: Union[str, Path], - unpack: bool = False, - original_filename: Optional[str] = None, - logger: Optional[SupersetLogger] = None -) -> Tuple[Path, Optional[Path]]: - """Сохраняет и опционально распаковывает ZIP-архив дашборда.""" - logger = logger or SupersetLogger(name="fileio", console=False) - logger.info(f"[STATE] Старт обработки дашборда. Распаковка: {unpack}") - - try: - output_path = Path(output_dir) - output_path.mkdir(parents=True, exist_ok=True) - logger.debug(f"[DEBUG] Директория {output_path} создана/проверена") - - zip_name = sanitize_filename(original_filename) if original_filename else None - if not zip_name: - timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") - zip_name = f"dashboard_export_{timestamp}.zip" - logger.debug(f"[DEBUG] Сгенерировано имя файла: {zip_name}") - - zip_path = output_path / zip_name - logger.info(f"[STATE] Сохранение дашборда в: {zip_path}") - - with open(zip_path, "wb") as f: - f.write(zip_content) - - if unpack: - with zipfile.ZipFile(zip_path, 'r') as zip_ref: - zip_ref.extractall(output_path) - logger.info(f"[STATE] Дашборд распакован в: {output_path}") - return zip_path, output_path - - return zip_path, None - - except zipfile.BadZipFile as e: - logger.error(f"[STATE][ZIP_ERROR] Невалидный ZIP-архив: {str(e)}") - raise InvalidZipFormatError(f"Invalid ZIP file: {str(e)}") from e - except Exception as e: - logger.error(f"[STATE][UNPACK_ERROR] Ошибка обработки: {str(e)}", exc_info=True) - raise -# END_FUNCTION_save_and_unpack_dashboard - -# CONTRACT: -# PURPOSE: (HELPER) Рекурсивно обрабатывает значения в YAML-структуре, применяя замену по регулярному выражению. -# PRECONDITIONS: `value` может быть строкой, словарем или списком. -# POSTCONDITIONS: Возвращает кортеж с флагом о том, было ли изменение, и новым значением. -# PARAMETERS: -# - name: value, type: Any, description: Значение для обработки. -# - name: regexp_pattern, type: str, description: Паттерн для поиска. -# - name: replace_string, type: str, description: Строка для замены. 
-# RETURN: type: Tuple[bool, Any] -def _process_yaml_value(value: Any, regexp_pattern: str, replace_string: str) -> Tuple[bool, Any]: - matched = False - if isinstance(value, str): - new_str = re.sub(regexp_pattern, replace_string, value) - matched = new_str != value - return matched, new_str - if isinstance(value, dict): - new_dict = {} - for k, v in value.items(): - sub_matched, sub_val = _process_yaml_value(v, regexp_pattern, replace_string) - new_dict[k] = sub_val - if sub_matched: - matched = True - return matched, new_dict - if isinstance(value, list): - new_list = [] - for item in value: - sub_matched, sub_val = _process_yaml_value(item, regexp_pattern, replace_string) - new_list.append(sub_val) - if sub_matched: - matched = True - return matched, new_list - return False, value -# END_FUNCTION__process_yaml_value - -# CONTRACT: -# PURPOSE: (HELPER) Обновляет один YAML файл на основе предоставленных конфигураций. -# PRECONDITIONS: -# - `file_path` - существующий YAML файл. -# - `db_configs` - список словарей для замены. -# POSTCONDITIONS: Файл обновлен. -# PARAMETERS: -# - name: file_path, type: Path, description: Путь к YAML файлу. -# - name: db_configs, type: Optional[List[Dict]], description: Конфигурации для замены. -# - name: regexp_pattern, type: Optional[str], description: Паттерн для поиска. -# - name: replace_string, type: Optional[str], description: Строка для замены. -# - name: logger, type: SupersetLogger, description: Экземпляр логгера. -# RETURN: type: None -def _update_yaml_file( - file_path: Path, - db_configs: Optional[List[Dict]], - regexp_pattern: Optional[str], - replace_string: Optional[str], - logger: SupersetLogger -) -> None: - try: - with open(file_path, 'r', encoding='utf-8') as f: - data = yaml.safe_load(f) - - updates = {} - - if db_configs: - for config in db_configs: - if config is not None: - if "old" not in config or "new" not in config: - raise ValueError("db_config должен содержать оба раздела 'old' и 'new'") - - old_config = config.get("old", {}) - new_config = config.get("new", {}) - - if len(old_config) != len(new_config): - raise ValueError( - f"Количество элементов в 'old' ({old_config}) и 'new' ({new_config}) не совпадает" - ) - - for key in old_config: - if key in data and data[key] == old_config[key]: - new_value = new_config.get(key) - if new_value is not None and new_value != data.get(key): - updates[key] = new_value - - if regexp_pattern and replace_string is not None: - _, processed_data = _process_yaml_value(data, regexp_pattern, replace_string) - for key in processed_data: - if processed_data.get(key) != data.get(key): - updates[key] = processed_data[key] - - if updates: - logger.info(f"[STATE] Обновление {file_path}: {updates}") - data.update(updates) - - with open(file_path, 'w', encoding='utf-8') as file: - yaml.dump( - data, - file, - default_flow_style=False, - sort_keys=False - ) - - except yaml.YAMLError as e: - logger.error(f"[STATE][YAML_ERROR] Ошибка парсинга {file_path}: {str(e)}") -# END_FUNCTION__update_yaml_file - -# [ENTITY: Function('update_yamls')] -# CONTRACT: -# PURPOSE: Обновляет конфигурации в YAML-файлах баз данных, заменяя старые значения на новые, а также применяя замены по регулярному выражению. -# SPECIFICATION_LINK: func_update_yamls -# PRECONDITIONS: -# - `path` должен быть валидным путем к директории с YAML файлами. -# - `db_configs` должен быть списком словарей, каждый из которых содержит ключи 'old' и 'new'. 
-# POSTCONDITIONS: Все найденные YAML файлы в директории `path` обновлены в соответствии с предоставленными конфигурациями. -# PARAMETERS: -# - name: db_configs, type: Optional[List[Dict]], description: Список конфигураций для замены. -# - name: path, type: str, description: Путь к директории с YAML файлами. -# - name: regexp_pattern, type: Optional[LiteralString], description: Паттерн для поиска. -# - name: replace_string, type: Optional[LiteralString], description: Строка для замены. -# - name: logger, type: Optional[SupersetLogger], description: Экземпляр логгера. -# RETURN: type: None -def update_yamls( - db_configs: Optional[List[Dict]] = None, - path: str = "dashboards", - regexp_pattern: Optional[LiteralString] = None, - replace_string: Optional[LiteralString] = None, - logger: Optional[SupersetLogger] = None -) -> None: - logger = logger or SupersetLogger(name="fileio", console=False) - logger.info("[STATE][YAML_UPDATE] Старт обновления конфигураций") - - if isinstance(db_configs, dict): - db_configs = [db_configs] - elif db_configs is None: - db_configs = [] - - try: - dir_path = Path(path) - - if not dir_path.exists() or not dir_path.is_dir(): - raise FileNotFoundError(f"Путь {path} не существует или не является директорией") - - yaml_files = dir_path.rglob("*.yaml") - - for file_path in yaml_files: - _update_yaml_file(file_path, db_configs, regexp_pattern, replace_string, logger) - - except (IOError, ValueError) as e: - logger.error(f"[STATE][YAML_UPDATE_ERROR] Критическая ошибка: {str(e)}", exc_info=True) - raise -# END_FUNCTION_update_yamls - -# [ENTITY: Function('create_dashboard_export')] -# CONTRACT: -# PURPOSE: Создает ZIP-архив дашборда из указанных исходных путей. -# SPECIFICATION_LINK: func_create_dashboard_export -# PRECONDITIONS: -# - `zip_path` - валидный путь для сохранения архива. -# - `source_paths` - список существующих путей к файлам/директориям для архивации. -# POSTCONDITIONS: Возвращает `True` в случае успешного создания архива, иначе `False`. -# PARAMETERS: -# - name: zip_path, type: Union[str, Path], description: Путь для сохранения ZIP архива. -# - name: source_paths, type: List[Union[str, Path]], description: Список исходных путей. -# - name: exclude_extensions, type: Optional[List[str]], description: Список исключаемых расширений. -# - name: logger, type: Optional[SupersetLogger], description: Экземпляр логгера. 
-# RETURN: type: bool -def create_dashboard_export( - zip_path: Union[str, Path], - source_paths: List[Union[str, Path]], - exclude_extensions: Optional[List[str]] = None, - logger: Optional[SupersetLogger] = None -) -> bool: - logger = logger or SupersetLogger(name="fileio", console=False) - logger.info(f"[STATE] Упаковка дашбордов: {source_paths} -> {zip_path}") - - try: - exclude_ext = [ext.lower() for ext in exclude_extensions] if exclude_extensions else [] - - with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf: - for path in source_paths: - path = Path(path) - if not path.exists(): - raise FileNotFoundError(f"Путь не найден: {path}") - - for item in path.rglob('*'): - if item.is_file() and item.suffix.lower() not in exclude_ext: - arcname = item.relative_to(path.parent) - zipf.write(item, arcname) - logger.debug(f"[DEBUG] Добавлен в архив: {arcname}") - - logger.info(f"[STATE]архив создан: {zip_path}") - return True - - except (IOError, zipfile.BadZipFile) as e: - logger.error(f"[STATE][ZIP_CREATION_ERROR] Ошибка: {str(e)}", exc_info=True) - return False -# END_FUNCTION_create_dashboard_export - -# [ENTITY: Function('sanitize_filename')] -# CONTRACT: -# PURPOSE: Очищает строку, предназначенную для имени файла, от недопустимых символов. -# SPECIFICATION_LINK: func_sanitize_filename -# PRECONDITIONS: `filename` является строкой. -# POSTCONDITIONS: Возвращает строку, безопасную для использования в качестве имени файла. -# PARAMETERS: -# - name: filename, type: str, description: Исходное имя файла. -# RETURN: type: str -def sanitize_filename(filename: str) -> str: - return re.sub(r'[\\/*?:"<>|]', "_", filename).strip() -# END_FUNCTION_sanitize_filename - -# [ENTITY: Function('get_filename_from_headers')] -# CONTRACT: -# PURPOSE: Извлекает имя файла из HTTP заголовка 'Content-Disposition'. -# SPECIFICATION_LINK: func_get_filename_from_headers -# PRECONDITIONS: `headers` - словарь HTTP заголовков. -# POSTCONDITIONS: Возвращает имя файла или `None`, если оно не найдено. -# PARAMETERS: -# - name: headers, type: dict, description: Словарь HTTP заголовков. -# RETURN: type: Optional[str] -def get_filename_from_headers(headers: dict) -> Optional[str]: - content_disposition = headers.get("Content-Disposition", "") - filename_match = re.findall(r'filename="(.+?)"', content_disposition) - if not filename_match: - filename_match = re.findall(r'filename=([^;]+)', content_disposition) - if filename_match: - return filename_match[0].strip('"') - return None -# END_FUNCTION_get_filename_from_headers - -# [ENTITY: Function('consolidate_archive_folders')] -# CONTRACT: -# PURPOSE: Консолидирует директории архивов дашбордов на основе общего слага в имени. -# SPECIFICATION_LINK: func_consolidate_archive_folders -# PRECONDITIONS: `root_directory` - существующая директория. -# POSTCONDITIONS: Содержимое всех директорий с одинаковым слагом переносится в самую последнюю измененную директорию. -# PARAMETERS: -# - name: root_directory, type: Path, description: Корневая директория для консолидации. -# - name: logger, type: Optional[SupersetLogger], description: Экземпляр логгера. 
-# RETURN: type: None -def consolidate_archive_folders(root_directory: Path, logger: Optional[SupersetLogger] = None) -> None: - logger = logger or SupersetLogger(name="fileio", console=False) - if not isinstance(root_directory, Path): - raise TypeError("root_directory must be a Path object.") - if not root_directory.is_dir(): - raise ValueError("root_directory must be an existing directory.") - - logger.debug("[DEBUG] Checking root_folder: {root_directory}") - - slug_pattern = re.compile(r"([A-Z]{2}-\d{4})") - - dashboards_by_slug: dict[str, list[str]] = {} - for folder_name in glob.glob(os.path.join(root_directory, '*')): - if os.path.isdir(folder_name): - logger.debug(f"[DEBUG] Checking folder: {folder_name}") - match = slug_pattern.search(folder_name) - if match: - slug = match.group(1) - logger.info(f"[STATE] Found slug: {slug} in folder: {folder_name}") - if slug not in dashboards_by_slug: - dashboards_by_slug[slug] = [] - dashboards_by_slug[slug].append(folder_name) - else: - logger.debug(f"[DEBUG] No slug found in folder: {folder_name}") - else: - logger.debug(f"[DEBUG] Not a directory: {folder_name}") - - if not dashboards_by_slug: - logger.warning("[STATE] No folders found matching the slug pattern.") - return - - for slug, folder_list in dashboards_by_slug.items(): - latest_folder = max(folder_list, key=os.path.getmtime) - logger.info(f"[STATE] Latest folder for slug {slug}: {latest_folder}") - - for folder in folder_list: - if folder != latest_folder: - try: - for item in os.listdir(folder): - s = os.path.join(folder, item) - d = os.path.join(latest_folder, item) - shutil.move(s, d) - logger.info(f"[STATE] Moved contents of {folder} to {latest_folder}") - shutil.rmtree(folder) # Remove empty folder - logger.info(f"[STATE] Removed empty folder: {folder}") - except (IOError, shutil.Error) as e: - logger.error(f"[STATE] Failed to move contents of {folder} to {latest_folder}: {e}", exc_info=True) - - logger.info("[STATE] Dashboard consolidation completed.") -# END_FUNCTION_consolidate_archive_folders - -# END_MODULE_fileio \ No newline at end of file +# [DEF:superset_tool.utils.fileio:Module] +# +# @SEMANTICS: file, io, zip, yaml, temp, archive, utility +# @PURPOSE: Предоставляет набор утилит для управления файловыми операциями, включая работу с временными файлами, архивами ZIP, файлами YAML и очистку директорий. +# @LAYER: Infra +# @RELATION: DEPENDS_ON -> superset_tool.exceptions +# @RELATION: DEPENDS_ON -> superset_tool.utils.logger +# @RELATION: DEPENDS_ON -> pyyaml +# @PUBLIC_API: create_temp_file, remove_empty_directories, read_dashboard_from_disk, calculate_crc32, RetentionPolicy, archive_exports, save_and_unpack_dashboard, update_yamls, create_dashboard_export, sanitize_filename, get_filename_from_headers, consolidate_archive_folders + +# [SECTION: IMPORTS] +import os +import re +import zipfile +from pathlib import Path +from typing import Any, Optional, Tuple, Dict, List, Union, LiteralString, Generator +from contextlib import contextmanager +import tempfile +from datetime import date, datetime +import glob +import shutil +import zlib +from dataclasses import dataclass +import yaml +from superset_tool.exceptions import InvalidZipFormatError +from superset_tool.utils.logger import SupersetLogger +# [/SECTION] + +# [DEF:create_temp_file:Function] +# @PURPOSE: Контекстный менеджер для создания временного файла или директории с гарантированным удалением. +# @PARAM: content (Optional[bytes]) - Бинарное содержимое для записи во временный файл. 
+# @PARAM: suffix (str) - Суффикс ресурса. Если `.dir`, создается директория.
+# @PARAM: mode (str) - Режим записи в файл (e.g., 'wb'); сохранен для совместимости, запись выполняется через `write_bytes`.
+# @PARAM: dry_run (bool) - Если True, ресурс не удаляется при выходе из контекста.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+# @YIELDS: Path - Путь к временному ресурсу.
+# @THROW: IOError - При ошибках создания ресурса.
+@contextmanager
+def create_temp_file(content: Optional[bytes] = None, suffix: str = ".zip", mode: str = 'wb', dry_run: bool = False, logger: Optional[SupersetLogger] = None) -> Generator[Path, None, None]:
+    logger = logger or SupersetLogger(name="fileio")
+    resource_path = None
+    is_dir = suffix.startswith('.dir')
+    try:
+        if is_dir:
+            # mkdtemp вместо TemporaryDirectory: удалением управляет блок finally,
+            # поэтому dry_run действует одинаково для файлов и директорий
+            resource_path = Path(tempfile.mkdtemp(suffix=suffix))
+            logger.debug("[create_temp_file][State] Created temporary directory: %s", resource_path)
+            yield resource_path
+        else:
+            fd, temp_path_str = tempfile.mkstemp(suffix=suffix)
+            resource_path = Path(temp_path_str)
+            os.close(fd)
+            if content:
+                resource_path.write_bytes(content)
+            logger.debug("[create_temp_file][State] Created temporary file: %s", resource_path)
+            yield resource_path
+    finally:
+        if resource_path and resource_path.exists() and not dry_run:
+            try:
+                if resource_path.is_dir():
+                    shutil.rmtree(resource_path)
+                    logger.debug("[create_temp_file][Cleanup] Removed temporary directory: %s", resource_path)
+                else:
+                    resource_path.unlink()
+                    logger.debug("[create_temp_file][Cleanup] Removed temporary file: %s", resource_path)
+            except OSError as e:
+                logger.error("[create_temp_file][Failure] Error during cleanup of %s: %s", resource_path, e)
+# [/DEF:create_temp_file]
+
+# [DEF:remove_empty_directories:Function]
+# @PURPOSE: Рекурсивно удаляет все пустые поддиректории, начиная с указанного пути.
+# @PARAM: root_dir (str) - Путь к корневой директории для очистки.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+# @RETURN: int - Количество удаленных директорий.
+def remove_empty_directories(root_dir: str, logger: Optional[SupersetLogger] = None) -> int:
+    logger = logger or SupersetLogger(name="fileio")
+    logger.info("[remove_empty_directories][Enter] Starting cleanup of empty directories in %s", root_dir)
+    removed_count = 0
+    if not os.path.isdir(root_dir):
+        logger.error("[remove_empty_directories][Failure] Directory not found: %s", root_dir)
+        return 0
+    for current_dir, _, _ in os.walk(root_dir, topdown=False):
+        if not os.listdir(current_dir):
+            try:
+                os.rmdir(current_dir)
+                removed_count += 1
+                logger.info("[remove_empty_directories][State] Removed empty directory: %s", current_dir)
+            except OSError as e:
+                logger.error("[remove_empty_directories][Failure] Failed to remove %s: %s", current_dir, e)
+    logger.info("[remove_empty_directories][Exit] Removed %d empty directories.", removed_count)
+    return removed_count
+# [/DEF:remove_empty_directories]
+
+# [DEF:read_dashboard_from_disk:Function]
+# @PURPOSE: Читает бинарное содержимое файла с диска.
+# @PARAM: file_path (str) - Путь к файлу.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+# @RETURN: Tuple[bytes, str] - Кортеж (содержимое, имя файла).
+# @THROW: AssertionError - Если файл не найден (нарушение предусловия).
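+# @EXAMPLE: Минимальный набросок использования (путь "exports/dash.zip" условный):
+#
+#     content, name = read_dashboard_from_disk("exports/dash.zip")
+#     print(f"{name}: {len(content)} bytes")
+#
+# Нарушение предусловия проявляется как AssertionError, поэтому вызывающему коду,
+# которому нужна восстановимая ошибка, стоит проверить путь самостоятельно.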
+def read_dashboard_from_disk(file_path: str, logger: Optional[SupersetLogger] = None) -> Tuple[bytes, str]: + logger = logger or SupersetLogger(name="fileio") + path = Path(file_path) + assert path.is_file(), f"Файл дашборда не найден: {file_path}" + logger.info("[read_dashboard_from_disk][Enter] Reading file: %s", file_path) + content = path.read_bytes() + if not content: + logger.warning("[read_dashboard_from_disk][Warning] File is empty: %s", file_path) + return content, path.name +# [/DEF:read_dashboard_from_disk] + +# [DEF:calculate_crc32:Function] +# @PURPOSE: Вычисляет контрольную сумму CRC32 для файла. +# @PARAM: file_path (Path) - Путь к файлу. +# @RETURN: str - 8-значное шестнадцатеричное представление CRC32. +# @THROW: IOError - При ошибках чтения файла. +def calculate_crc32(file_path: Path) -> str: + with open(file_path, 'rb') as f: + crc32_value = zlib.crc32(f.read()) + return f"{crc32_value:08x}" +# [/DEF:calculate_crc32] + +# [DEF:RetentionPolicy:DataClass] +# @PURPOSE: Определяет политику хранения для архивов (ежедневные, еженедельные, ежемесячные). +@dataclass +class RetentionPolicy: + daily: int = 7 + weekly: int = 4 + monthly: int = 12 +# [/DEF:RetentionPolicy] + +# [DEF:archive_exports:Function] +# @PURPOSE: Управляет архивом экспортированных файлов, применяя политику хранения и дедупликацию. +# @RELATION: CALLS -> apply_retention_policy +# @RELATION: CALLS -> calculate_crc32 +# @PARAM: output_dir (str) - Директория с архивами. +# @PARAM: policy (RetentionPolicy) - Политика хранения. +# @PARAM: deduplicate (bool) - Флаг для включения удаления дубликатов по CRC32. +# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера. +def archive_exports(output_dir: str, policy: RetentionPolicy, deduplicate: bool = False, logger: Optional[SupersetLogger] = None) -> None: + logger = logger or SupersetLogger(name="fileio") + output_path = Path(output_dir) + if not output_path.is_dir(): + logger.warning("[archive_exports][Skip] Archive directory not found: %s", output_dir) + return + + logger.info("[archive_exports][Enter] Managing archive in %s", output_dir) + + # 1. Collect all zip files + zip_files = list(output_path.glob("*.zip")) + if not zip_files: + logger.info("[archive_exports][State] No zip files found in %s", output_dir) + return + + # 2. Deduplication + if deduplicate: + logger.info("[archive_exports][State] Starting deduplication...") + checksums = {} + files_to_remove = [] + + # Sort by modification time (newest first) to keep the latest version + zip_files.sort(key=lambda f: f.stat().st_mtime, reverse=True) + + for file_path in zip_files: + try: + crc = calculate_crc32(file_path) + if crc in checksums: + files_to_remove.append(file_path) + logger.debug("[archive_exports][State] Duplicate found: %s (same as %s)", file_path.name, checksums[crc].name) + else: + checksums[crc] = file_path + except Exception as e: + logger.error("[archive_exports][Failure] Failed to calculate CRC32 for %s: %s", file_path, e) + + for f in files_to_remove: + try: + f.unlink() + zip_files.remove(f) + logger.info("[archive_exports][State] Removed duplicate: %s", f.name) + except OSError as e: + logger.error("[archive_exports][Failure] Failed to remove duplicate %s: %s", f, e) + + # 3. 
Retention Policy
+    files_with_dates = []
+    for file_path in zip_files:
+        # Try to extract date from filename
+        # Pattern: ..._YYYYMMDD_HHMMSS.zip or ..._YYYYMMDD.zip
+        match = re.search(r'_(\d{8})[_.]', file_path.name)
+        file_date = None
+        if match:
+            try:
+                date_str = match.group(1)
+                file_date = datetime.strptime(date_str, "%Y%m%d").date()
+            except ValueError:
+                pass
+
+        if not file_date:
+            # Fallback to modification time
+            file_date = datetime.fromtimestamp(file_path.stat().st_mtime).date()
+
+        files_with_dates.append((file_path, file_date))
+
+    files_to_keep = apply_retention_policy(files_with_dates, policy, logger)
+
+    for file_path, _ in files_with_dates:
+        if file_path not in files_to_keep:
+            try:
+                file_path.unlink()
+                logger.info("[archive_exports][State] Removed by retention policy: %s", file_path.name)
+            except OSError as e:
+                logger.error("[archive_exports][Failure] Failed to remove %s: %s", file_path, e)
+# [/DEF:archive_exports]
+
+# [DEF:apply_retention_policy:Function]
+# @PURPOSE: (Helper) Применяет политику хранения к списку файлов, возвращая те, что нужно сохранить.
+# @PARAM: files_with_dates (List[Tuple[Path, date]]) - Список файлов с датами.
+# @PARAM: policy (RetentionPolicy) - Политика хранения.
+# @PARAM: logger (SupersetLogger) - Логгер.
+# @RETURN: set - Множество путей к файлам, которые должны быть сохранены.
+def apply_retention_policy(files_with_dates: List[Tuple[Path, date]], policy: RetentionPolicy, logger: SupersetLogger) -> set:
+    # Сортируем по дате (от новой к старой)
+    sorted_files = sorted(files_with_dates, key=lambda x: x[1], reverse=True)
+    # Списки файлов по категориям
+    daily_files = []
+    weekly_files = []
+    monthly_files = []
+    today = date.today()
+    for file_path, file_date in sorted_files:
+        # Ежедневные
+        if (today - file_date).days < policy.daily:
+            daily_files.append(file_path)
+        # Еженедельные
+        elif (today - file_date).days < policy.weekly * 7:
+            weekly_files.append(file_path)
+        # Ежемесячные
+        elif (today - file_date).days < policy.monthly * 30:
+            monthly_files.append(file_path)
+    # Возвращаем множество файлов, которые нужно сохранить
+    files_to_keep = set()
+    files_to_keep.update(daily_files)
+    files_to_keep.update(weekly_files[:policy.weekly])
+    files_to_keep.update(monthly_files[:policy.monthly])
+    logger.debug("[apply_retention_policy][State] Keeping %d files according to retention policy", len(files_to_keep))
+    return files_to_keep
+# [/DEF:apply_retention_policy]
+
+# [DEF:save_and_unpack_dashboard:Function]
+# @PURPOSE: Сохраняет бинарное содержимое ZIP-архива на диск и опционально распаковывает его.
+# @PARAM: zip_content (bytes) - Содержимое ZIP-архива.
+# @PARAM: output_dir (Union[str, Path]) - Директория для сохранения.
+# @PARAM: unpack (bool) - Флаг, нужно ли распаковывать архив.
+# @PARAM: original_filename (Optional[str]) - Исходное имя файла для сохранения.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+# @RETURN: Tuple[Path, Optional[Path]] - Путь к ZIP-файлу и, если применимо, путь к директории с распаковкой.
+# @THROW: InvalidZipFormatError - При ошибке формата ZIP.
+def save_and_unpack_dashboard(zip_content: bytes, output_dir: Union[str, Path], unpack: bool = False, original_filename: Optional[str] = None, logger: Optional[SupersetLogger] = None) -> Tuple[Path, Optional[Path]]:
+    logger = logger or SupersetLogger(name="fileio")
+    logger.info("[save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: %s", unpack)
+    try:
+        output_path = Path(output_dir)
+        output_path.mkdir(parents=True, exist_ok=True)
+        zip_name = sanitize_filename(original_filename) if original_filename else f"dashboard_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
+        zip_path = output_path / zip_name
+        zip_path.write_bytes(zip_content)
+        logger.info("[save_and_unpack_dashboard][State] Dashboard saved to: %s", zip_path)
+        if unpack:
+            with zipfile.ZipFile(zip_path, 'r') as zip_ref:
+                zip_ref.extractall(output_path)
+            logger.info("[save_and_unpack_dashboard][State] Dashboard unpacked to: %s", output_path)
+            return zip_path, output_path
+        return zip_path, None
+    except zipfile.BadZipFile as e:
+        logger.error("[save_and_unpack_dashboard][Failure] Invalid ZIP archive: %s", e)
+        raise InvalidZipFormatError(f"Invalid ZIP file: {e}") from e
+# [/DEF:save_and_unpack_dashboard]
+
+# [DEF:update_yamls:Function]
+# @PURPOSE: Обновляет конфигурации в YAML-файлах, заменяя значения или применяя regex.
+# @RELATION: CALLS -> _update_yaml_file
+# @THROW: AssertionError - Если `path` не существует или не является директорией (нарушение предусловия).
+# @PARAM: db_configs (Optional[List[Dict]]) - Список конфигураций для замены.
+# @PARAM: path (str) - Путь к директории с YAML файлами.
+# @PARAM: regexp_pattern (Optional[LiteralString]) - Паттерн для поиска.
+# @PARAM: replace_string (Optional[LiteralString]) - Строка для замены.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+def update_yamls(db_configs: Optional[List[Dict[str, Any]]] = None, path: str = "dashboards", regexp_pattern: Optional[LiteralString] = None, replace_string: Optional[LiteralString] = None, logger: Optional[SupersetLogger] = None) -> None:
+    logger = logger or SupersetLogger(name="fileio")
+    logger.info("[update_yamls][Enter] Starting YAML configuration update.")
+    dir_path = Path(path)
+    assert dir_path.is_dir(), f"Путь {path} не существует или не является директорией"
+
+    configs: List[Dict[str, Any]] = db_configs or []
+
+    for file_path in dir_path.rglob("*.yaml"):
+        _update_yaml_file(file_path, configs, regexp_pattern, replace_string, logger)
+# [/DEF:update_yamls]
+
+# [DEF:_update_yaml_file:Function]
+# @PURPOSE: (Helper) Обновляет один YAML файл.
+# @PARAM: file_path (Path) - Путь к файлу.
+# @PARAM: db_configs (List[Dict]) - Конфигурации.
+# @PARAM: regexp_pattern (Optional[str]) - Паттерн.
+# @PARAM: replace_string (Optional[str]) - Замена.
+# @PARAM: logger (SupersetLogger) - Логгер.
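+# @EXAMPLE: Набросок ожидаемой структуры `db_configs` (имена ключей и значения условные):
+#
+#     db_configs = [{
+#         "old": {"database_name": "dwh_dev", "sqlalchemy_uri": "postgresql://dev/db"},
+#         "new": {"database_name": "dwh_prod", "sqlalchemy_uri": "postgresql://prod/db"},
+#     }]
+#
+# Для каждого ключа, присутствующего в обоих разделах, текст `key: old_value`
+# в YAML заменяется на `key: new_value` с сохранением исходных кавычек.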
+def _update_yaml_file(file_path: Path, db_configs: List[Dict[str, Any]], regexp_pattern: Optional[str], replace_string: Optional[str], logger: SupersetLogger) -> None:
+    # Читаем файл целиком как текст, чтобы сохранить исходное форматирование YAML
+    try:
+        with open(file_path, 'r', encoding='utf-8') as f:
+            content = f.read()
+    except OSError as e:
+        logger.error("[_update_yaml_file][Failure] Failed to read %s: %s", file_path, e)
+        return
+
+    # Обе фазы работают над одним буфером, чтобы замена по regex
+    # не терялась при последующей замене old/new значений
+    modified_content = content
+
+    # Фаза 1: замена по регулярному выражению
+    if regexp_pattern and replace_string:
+        try:
+            new_content = re.sub(regexp_pattern, replace_string, modified_content)
+            if new_content != modified_content:
+                logger.info("[_update_yaml_file][State] Applied regex replacement in %s", file_path)
+            modified_content = new_content
+        except re.error as e:
+            logger.error("[_update_yaml_file][Failure] Error applying regex to %s: %s", file_path, e)
+
+    # Фаза 2: текстовая замена old/new значений без парсинга YAML,
+    # чтобы сохранить структуру и комментарии файла
+    for cfg in db_configs or []:
+        old_cfg = cfg.get('old', {})
+        new_cfg = cfg.get('new', {})
+        for key, old_val in old_cfg.items():
+            if key not in new_cfg or not isinstance(old_val, str):
+                continue
+            new_val = new_cfg[key]
+            # Ищем паттерн: key: "value" или key: value; кавычки, если они были, сохраняются
+            pattern = rf'({re.escape(key)}\s*:\s*)(["\']?){re.escape(old_val)}(["\']?)'
+
+            def replacer(match: re.Match, _new_val: str = str(new_val)) -> str:
+                return f"{match.group(1)}{match.group(2)}{_new_val}{match.group(3)}"
+
+            modified_content, count = re.subn(pattern, replacer, modified_content)
+            if count:
+                logger.info("[_update_yaml_file][State] Replaced '%s' with '%s' for key %s in %s", old_val, new_val, key, file_path)
+
+    # Записываем файл один раз и только если содержимое изменилось
+    if modified_content != content:
+        try:
+            with open(file_path, 'w', encoding='utf-8') as f:
+                f.write(modified_content)
+            logger.info("[_update_yaml_file][State] Updated %s", file_path)
+        except OSError as e:
+            logger.error("[_update_yaml_file][Failure] Failed to write %s: %s", file_path, e)
+# [/DEF:_update_yaml_file]
+
+# [DEF:create_dashboard_export:Function]
+# @PURPOSE: Создает ZIP-архив из указанных исходных путей.
+# @PARAM: zip_path (Union[str, Path]) - Путь для сохранения ZIP архива.
+# @PARAM: source_paths (List[Union[str, Path]]) - Список исходных путей для архивации.
+# @PARAM: exclude_extensions (Optional[List[str]]) - Список расширений для исключения.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+# @RETURN: bool - `True` при успехе, `False` при ошибке.
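+# @EXAMPLE: Минимальный набросок использования (пути условные):
+#
+#     ok = create_dashboard_export(
+#         "backup/dashboards.zip",
+#         source_paths=["dashboards/sales", "dashboards/finance"],
+#         exclude_extensions=[".log", ".tmp"],
+#     )
+#     # False означает ошибку; подробности см. в логе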
+def create_dashboard_export(zip_path: Union[str, Path], source_paths: List[Union[str, Path]], exclude_extensions: Optional[List[str]] = None, logger: Optional[SupersetLogger] = None) -> bool:
+    logger = logger or SupersetLogger(name="fileio")
+    logger.info("[create_dashboard_export][Enter] Packing dashboard: %s -> %s", source_paths, zip_path)
+    try:
+        exclude_ext = [ext.lower() for ext in exclude_extensions or []]
+        with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
+            for src_path_str in source_paths:
+                src_path = Path(src_path_str)
+                assert src_path.exists(), f"Путь не найден: {src_path}"
+                for item in src_path.rglob('*'):
+                    if item.is_file() and item.suffix.lower() not in exclude_ext:
+                        arcname = item.relative_to(src_path.parent)
+                        zipf.write(item, arcname)
+        logger.info("[create_dashboard_export][Exit] Archive created: %s", zip_path)
+        return True
+    except (IOError, zipfile.BadZipFile, AssertionError) as e:
+        logger.error("[create_dashboard_export][Failure] Error: %s", e, exc_info=True)
+        return False
+# [/DEF:create_dashboard_export]
+
+# [DEF:sanitize_filename:Function]
+# @PURPOSE: Очищает строку от символов, недопустимых в именах файлов.
+# @PARAM: filename (str) - Исходное имя файла.
+# @RETURN: str - Очищенная строка.
+def sanitize_filename(filename: str) -> str:
+    return re.sub(r'[\\/*?:"<>|]', "_", filename).strip()
+# [/DEF:sanitize_filename]
+
+# [DEF:get_filename_from_headers:Function]
+# @PURPOSE: Извлекает имя файла из HTTP заголовка 'Content-Disposition'.
+# @PARAM: headers (dict) - Словарь HTTP заголовков.
+# @RETURN: Optional[str] - Имя файла или `None`.
+def get_filename_from_headers(headers: dict) -> Optional[str]:
+    content_disposition = headers.get("Content-Disposition", "")
+    # [^";]+ останавливается на ';' и '"', чтобы для формы без кавычек
+    # не захватывались последующие параметры заголовка (например, '; size=...')
+    if match := re.search(r'filename="?([^";]+)"?', content_disposition):
+        return match.group(1).strip()
+    return None
+# [/DEF:get_filename_from_headers]
+
+# [DEF:consolidate_archive_folders:Function]
+# @PURPOSE: Консолидирует директории архивов на основе общего слага в имени.
+# @THROW: AssertionError - Если `root_directory` невалиден (нарушение предусловия).
+# @PARAM: root_directory (Path) - Корневая директория для консолидации.
+# @PARAM: logger (Optional[SupersetLogger]) - Экземпляр логгера.
+def consolidate_archive_folders(root_directory: Path, logger: Optional[SupersetLogger] = None) -> None:
+    logger = logger or SupersetLogger(name="fileio")
+    assert isinstance(root_directory, Path), "root_directory must be a Path object."
+    assert root_directory.is_dir(), "root_directory must be an existing directory."
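+    # Иллюстративный эффект (имена директорий условные): папки с общим слагом
+    # до первого '_' сливаются в одну директорию <root>/<slug>:
+    #
+    #   sales_2023/dash.zip   ->  sales/dash.zip
+    #   sales_2024/dash2.zip  ->  sales/dash2.zip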
+
+    logger.info("[consolidate_archive_folders][Enter] Consolidating archives in %s", root_directory)
+    # Собираем все директории с архивами
+    archive_dirs = []
+    for item in root_directory.iterdir():
+        if item.is_dir():
+            # Проверяем, есть ли в директории ZIP-архивы
+            if any(item.glob("*.zip")):
+                archive_dirs.append(item)
+    # Группируем по слагу (части имени до первого '_')
+    slug_groups = {}
+    for dir_path in archive_dirs:
+        dir_name = dir_path.name
+        slug = dir_name.split('_')[0] if '_' in dir_name else dir_name
+        if slug not in slug_groups:
+            slug_groups[slug] = []
+        slug_groups[slug].append(dir_path)
+    # Для каждой группы консолидируем
+    for slug, dirs in slug_groups.items():
+        if len(dirs) <= 1:
+            continue
+        # Создаем целевую директорию
+        target_dir = root_directory / slug
+        target_dir.mkdir(exist_ok=True)
+        logger.info("[consolidate_archive_folders][State] Consolidating %d directories under %s", len(dirs), target_dir)
+        # Перемещаем содержимое
+        for source_dir in dirs:
+            if source_dir == target_dir:
+                continue
+            for item in source_dir.iterdir():
+                dest_item = target_dir / item.name
+                try:
+                    # shutil.move одинаково переносит и файлы, и директории
+                    shutil.move(str(item), str(dest_item))
+                except Exception as e:
+                    logger.error("[consolidate_archive_folders][Failure] Failed to move %s to %s: %s", item, dest_item, e)
+            # Удаляем исходную директорию
+            try:
+                source_dir.rmdir()
+                logger.info("[consolidate_archive_folders][State] Removed source directory: %s", source_dir)
+            except Exception as e:
+                logger.error("[consolidate_archive_folders][Failure] Failed to remove source directory %s: %s", source_dir, e)
+# [/DEF:consolidate_archive_folders]
+
+# [/DEF:superset_tool.utils.fileio]
diff --git a/superset_tool/utils/init_clients.py b/superset_tool/utils/init_clients.py
old mode 100644
new mode 100755
index 5fe51a4..bec155a
--- a/superset_tool/utils/init_clients.py
+++ b/superset_tool/utils/init_clients.py
@@ -1,71 +1,110 @@
-# [MODULE] Superset Clients Initializer
-# PURPOSE: Централизованно инициализирует клиенты Superset для различных окружений (DEV, PROD, SBX, PREPROD).
-# COHERENCE:
-# - Использует `SupersetClient` для создания экземпляров клиентов.
-# - Использует `SupersetLogger` для логирования процесса.
-# - Интегрируется с `keyring` для безопасного получения паролей.
-
-# [IMPORTS] Сторонние библиотеки
-import keyring
-from typing import Dict
-
-# [IMPORTS] Локальные модули
-from superset_tool.models import SupersetConfig
-from superset_tool.client import SupersetClient
-from superset_tool.utils.logger import SupersetLogger
-
-# CONTRACT:
-# PURPOSE: Инициализирует и возвращает словарь клиентов `SupersetClient` для всех предопределенных окружений.
-# PRECONDITIONS:
-# - `keyring` должен содержать пароли для систем "dev migrate", "prod migrate", "sandbox migrate", "preprod migrate".
-# - `logger` должен быть инициализированным экземпляром `SupersetLogger`.
-# POSTCONDITIONS:
-# - Возвращает словарь, где ключи - это имена окружений ('dev', 'sbx', 'prod', 'preprod'),
-#   а значения - соответствующие экземпляры `SupersetClient`.
-# PARAMETERS:
-# - logger: SupersetLogger - Экземпляр логгера для записи процесса инициализации.
-# RETURN: Dict[str, SupersetClient] - Словарь с инициализированными клиентами.
-# EXCEPTIONS:
-# - Логирует и выбрасывает `Exception` при любой ошибке (например, отсутствие пароля, ошибка подключения).
-def setup_clients(logger: SupersetLogger) -> Dict[str, SupersetClient]:
-    """Инициализирует и настраивает клиенты для всех окружений Superset."""
-    # [ANCHOR] CLIENTS_INITIALIZATION
-    logger.info("[INFO][INIT_CLIENTS_START] Запуск инициализации клиентов Superset.")
-    clients = {}
-
-    environments = {
-        "dev": "https://devta.bi.dwh.rusal.com/api/v1",
-        "prod": "https://prodta.bi.dwh.rusal.com/api/v1",
-        "sbx": "https://sandboxta.bi.dwh.rusal.com/api/v1",
-        "preprod": "https://preprodta.bi.dwh.rusal.com/api/v1"
-    }
-
-    try:
-        for env_name, base_url in environments.items():
-            logger.debug(f"[DEBUG][CONFIG_CREATE] Создание конфигурации для окружения: {env_name.upper()}")
-            password = keyring.get_password("system", f"{env_name} migrate")
-            if not password:
-                raise ValueError(f"Пароль для '{env_name} migrate' не найден в keyring.")
-
-            config = SupersetConfig(
-                base_url=base_url,
-                auth={
-                    "provider": "db",
-                    "username": "migrate_user",
-                    "password": password,
-                    "refresh": True
-                },
-                verify_ssl=False
-            )
-
-            clients[env_name] = SupersetClient(config, logger)
-            logger.debug(f"[DEBUG][CLIENT_SUCCESS] Клиент для {env_name.upper()} успешно создан.")
-
-        logger.info(f"[COHERENCE_CHECK_PASSED][INIT_CLIENTS_SUCCESS] Все клиенты ({', '.join(clients.keys())}) успешно инициализированы.")
-        return clients
-
-    except Exception as e:
-        logger.error(f"[CRITICAL][INIT_CLIENTS_FAILED] Ошибка при инициализации клиентов: {str(e)}", exc_info=True)
-        raise
-# END_FUNCTION_setup_clients
-# END_MODULE_init_clients
\ No newline at end of file
+# [DEF:superset_tool.utils.init_clients:Module]
+#
+# @SEMANTICS: utility, factory, client, initialization, configuration
+# @PURPOSE: Централизованно инициализирует клиенты Superset для различных окружений (DEV, PROD, SBX, PREPROD), используя `keyring` для безопасного доступа к паролям.
+# @LAYER: Infra
+# @RELATION: DEPENDS_ON -> superset_tool.models
+# @RELATION: DEPENDS_ON -> superset_tool.client
+# @RELATION: DEPENDS_ON -> keyring
+# @PUBLIC_API: setup_clients
+
+# [SECTION: IMPORTS]
+import keyring
+from typing import Dict, List, Optional, Any
+from superset_tool.models import SupersetConfig
+from superset_tool.client import SupersetClient
+from superset_tool.utils.logger import SupersetLogger
+# [/SECTION]
+
+# [DEF:setup_clients:Function]
+# @PURPOSE: Инициализирует и возвращает словарь клиентов `SupersetClient`.
+# @PRE: `logger` должен быть валидным экземпляром `SupersetLogger`.
+# @POST: Возвращает словарь с инициализированными клиентами.
+# @THROW: Exception - При любых ошибках инициализации (логируются и пробрасываются).
+# @RELATION: CREATES_INSTANCE_OF -> SupersetConfig
+# @RELATION: CREATES_INSTANCE_OF -> SupersetClient
+# @PARAM: logger (SupersetLogger) - Экземпляр логгера для записи процесса.
+# @PARAM: custom_envs (Optional[List[Any]]) - Список пользовательских окружений: словари или объекты с полями name, url, username, password.
+# @RETURN: Dict[str, SupersetClient] - Словарь, где ключ - имя окружения, значение - `SupersetClient`.
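+# @EXAMPLE: Минимальный набросок использования (URL и учетные данные условные):
+#
+#     logger = SupersetLogger(name="init")
+#     clients = setup_clients(logger, custom_envs=[{
+#         "name": "dev", "url": "https://superset.example.com/api/v1",
+#         "username": "migrate_user", "password": "secret",
+#     }])
+#     dev_client = clients["dev"]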
+def setup_clients(logger: SupersetLogger, custom_envs: Optional[List[Any]] = None) -> Dict[str, SupersetClient]:
+    logger.info("[setup_clients][Enter] Starting Superset clients initialization.")
+    clients = {}
+
+    try:
+        # Try to load from ConfigManager if available
+        try:
+            from backend.src.dependencies import get_config_manager
+            config_manager = get_config_manager()
+            envs = config_manager.get_environments()
+            if envs:
+                logger.info("[setup_clients][Action] Loading environments from ConfigManager")
+                for env in envs:
+                    logger.debug("[setup_clients][State] Creating config for environment: %s", env.name)
+                    config = SupersetConfig(
+                        env=env.name,
+                        base_url=env.url,
+                        auth={"provider": "db", "username": env.username, "password": env.password, "refresh": "true"},
+                        verify_ssl=False,
+                        timeout=30,
+                        logger=logger
+                    )
+                    clients[env.name] = SupersetClient(config, logger)
+                return clients
+        except Exception as e:  # covers ImportError as well
+            logger.debug(f"[setup_clients][State] ConfigManager not available or failed: {e}")
+
+        if custom_envs:
+            for env in custom_envs:
+                # Handle both dict and object (like Pydantic model)
+                env_name = str(getattr(env, 'name', env.get('name') if isinstance(env, dict) else "unknown"))
+                base_url = str(getattr(env, 'url', env.get('url') if isinstance(env, dict) else ""))
+                username = str(getattr(env, 'username', env.get('username') if isinstance(env, dict) else ""))
+                password = str(getattr(env, 'password', env.get('password') if isinstance(env, dict) else ""))
+
+                logger.debug("[setup_clients][State] Creating config for custom environment: %s", env_name)
+                config = SupersetConfig(
+                    env=env_name,
+                    base_url=base_url,
+                    auth={"provider": "db", "username": username, "password": password, "refresh": "true"},
+                    verify_ssl=False,
+                    timeout=30,
+                    logger=logger
+                )
+                clients[env_name] = SupersetClient(config, logger)
+        else:
+            # Fallback to hardcoded environments with keyring
+            environments = {
+                "dev": "https://devta.bi.dwh.rusal.com/api/v1",
+                "prod": "https://prodta.bi.dwh.rusal.com/api/v1",
+                "sbx": "https://sandboxta.bi.dwh.rusal.com/api/v1",
+                "preprod": "https://preprodta.bi.dwh.rusal.com/api/v1",
+                "uatta": "https://uatta.bi.dwh.rusal.com/api/v1",
+                "dev5": "https://dev.bi.dwh.rusal.com/api/v1"
+            }
+            for env_name, base_url in environments.items():
+                logger.debug("[setup_clients][State] Creating config for environment: %s", env_name.upper())
+                password = keyring.get_password("system", f"{env_name} migrate")
+                if not password:
+                    logger.warning(f"Пароль для '{env_name} migrate' не найден в keyring. 
Пропускаем.") + continue + + config = SupersetConfig( + env=env_name, + base_url=base_url, + auth={"provider": "db", "username": "migrate_user", "password": password, "refresh": "true"}, + verify_ssl=False, + timeout=30, + logger=logger + ) + clients[env_name] = SupersetClient(config, logger) + + logger.info("[setup_clients][Exit] All clients (%s) initialized successfully.", ', '.join(clients.keys())) + return clients + + except Exception as e: + logger.critical("[setup_clients][Failure] Critical error during client initialization: %s", e, exc_info=True) + raise +# [/DEF:setup_clients] + +# [/DEF:superset_tool.utils.init_clients] diff --git a/superset_tool/utils/logger.py b/superset_tool/utils/logger.py old mode 100644 new mode 100755 index 59111f0..182b8c6 --- a/superset_tool/utils/logger.py +++ b/superset_tool/utils/logger.py @@ -1,88 +1,103 @@ -# [MODULE] Superset Tool Logger Utility -# PURPOSE: Предоставляет стандартизированный класс-обертку `SupersetLogger` для настройки и использования логирования в проекте. -# COHERENCE: Модуль согласован со стандартной библиотекой `logging`, расширяя ее для нужд проекта. - -import logging -import sys -from datetime import datetime -from pathlib import Path -from typing import Optional - -# CONTRACT: -# PURPOSE: Обеспечивает унифицированную настройку логгера с выводом в консоль и/или файл. -# PRECONDITIONS: -# - `name` должен быть строкой. -# - `level` должен быть валидным уровнем логирования (например, `logging.INFO`). -# POSTCONDITIONS: -# - Создает и настраивает логгер с указанным именем и уровнем. -# - Добавляет обработчики для вывода в файл (если указан `log_dir`) и в консоль (если `console=True`). -# - Очищает все предыдущие обработчики для данного логгера, чтобы избежать дублирования. -# PARAMETERS: -# - name: str - Имя логгера. -# - log_dir: Optional[Path] - Директория для сохранения лог-файлов. -# - level: int - Уровень логирования. -# - console: bool - Флаг для включения вывода в консоль. -class SupersetLogger: - def __init__( - self, - name: str = "superset_tool", - log_dir: Optional[Path] = None, - level: int = logging.INFO, - console: bool = True - ): - self.logger = logging.getLogger(name) - self.logger.setLevel(level) - - formatter = logging.Formatter( - '%(asctime)s - %(levelname)s - %(message)s' - ) - - # [ANCHOR] HANDLER_RESET - # Очищаем существующие обработчики, чтобы избежать дублирования вывода при повторной инициализации. - if self.logger.hasHandlers(): - self.logger.handlers.clear() - - # [ANCHOR] FILE_HANDLER - if log_dir: - log_dir.mkdir(parents=True, exist_ok=True) - timestamp = datetime.now().strftime("%Y%m%d") - file_handler = logging.FileHandler( - log_dir / f"{name}_{timestamp}.log", encoding='utf-8' - ) - file_handler.setFormatter(formatter) - self.logger.addHandler(file_handler) - - # [ANCHOR] CONSOLE_HANDLER - if console: - console_handler = logging.StreamHandler(sys.stdout) - console_handler.setFormatter(formatter) - self.logger.addHandler(console_handler) - - # CONTRACT: - # PURPOSE: (HELPER) Генерирует строку с текущей датой для имени лог-файла. - # RETURN: str - Отформатированная дата (YYYYMMDD). 
- def _get_timestamp(self) -> str: - return datetime.now().strftime("%Y%m%d") - # END_FUNCTION__get_timestamp - - # [INTERFACE] Методы логирования - def info(self, message: str, extra: Optional[dict] = None, exc_info: bool = False): - self.logger.info(message, extra=extra, exc_info=exc_info) - - def error(self, message: str, extra: Optional[dict] = None, exc_info: bool = False): - self.logger.error(message, extra=extra, exc_info=exc_info) - - def warning(self, message: str, extra: Optional[dict] = None, exc_info: bool = False): - self.logger.warning(message, extra=extra, exc_info=exc_info) - - def critical(self, message: str, extra: Optional[dict] = None, exc_info: bool = False): - self.logger.critical(message, extra=extra, exc_info=exc_info) - - def debug(self, message: str, extra: Optional[dict] = None, exc_info: bool = False): - self.logger.debug(message, extra=extra, exc_info=exc_info) - - def exception(self, message: str, *args, **kwargs): - self.logger.exception(message, *args, **kwargs) -# END_CLASS_SupersetLogger - -# END_MODULE_logger +# [DEF:superset_tool.utils.logger:Module] +# +# @SEMANTICS: logging, utility, infrastructure, wrapper +# @PURPOSE: Предоставляет универсальную обёртку над стандартным `logging.Logger` для унифицированного создания и управления логгерами с выводом в консоль и/или файл. +# @LAYER: Infra +# @RELATION: WRAPS -> logging.Logger +# +# @INVARIANT: Логгер всегда должен иметь имя. +# @PUBLIC_API: SupersetLogger + +# [SECTION: IMPORTS] +import logging +import sys +from datetime import datetime +from pathlib import Path +from typing import Optional, Any, Mapping +# [/SECTION] + +# [DEF:SupersetLogger:Class] +# @PURPOSE: Обёртка над `logging.Logger`, которая упрощает конфигурацию и использование логгеров. +# @RELATION: WRAPS -> logging.Logger +class SupersetLogger: + # [DEF:SupersetLogger.__init__:Function] + # @PURPOSE: Конфигурирует и инициализирует логгер, добавляя обработчики для файла и/или консоли. + # @PRE: Если log_dir указан, путь должен быть валидным (или создаваемым). + # @POST: `self.logger` готов к использованию с настроенными обработчиками. + # @PARAM: name (str) - Идентификатор логгера. + # @PARAM: log_dir (Optional[Path]) - Директория для сохранения лог-файлов. + # @PARAM: level (int) - Уровень логирования (e.g., `logging.INFO`). + # @PARAM: console (bool) - Флаг для включения вывода в консоль. + def __init__(self, name: str = "superset_tool", log_dir: Optional[Path] = None, level: int = logging.INFO, console: bool = True) -> None: + self.logger = logging.getLogger(name) + self.logger.setLevel(level) + self.logger.propagate = False + + formatter = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s") + + if self.logger.hasHandlers(): + self.logger.handlers.clear() + + if log_dir: + log_dir.mkdir(parents=True, exist_ok=True) + timestamp = datetime.now().strftime("%Y%m%d") + file_handler = logging.FileHandler(log_dir / f"{name}_{timestamp}.log", encoding="utf-8") + file_handler.setFormatter(formatter) + self.logger.addHandler(file_handler) + + if console: + console_handler = logging.StreamHandler(sys.stdout) + console_handler.setFormatter(formatter) + self.logger.addHandler(console_handler) + # [/DEF:SupersetLogger.__init__] + + # [DEF:SupersetLogger._log:Function] + # @PURPOSE: (Helper) Универсальный метод для вызова соответствующего уровня логирования. + # @PARAM: level_method (Any) - Метод логгера (info, debug, etc). + # @PARAM: msg (str) - Сообщение. + # @PARAM: args (Any) - Аргументы форматирования. 
+ # @PARAM: extra (Optional[Mapping[str, Any]]) - Дополнительные данные. + # @PARAM: exc_info (bool) - Добавлять ли информацию об исключении. + def _log(self, level_method: Any, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + level_method(msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger._log] + + # [DEF:SupersetLogger.info:Function] + # @PURPOSE: Записывает сообщение уровня INFO. + def info(self, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + self._log(self.logger.info, msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger.info] + + # [DEF:SupersetLogger.debug:Function] + # @PURPOSE: Записывает сообщение уровня DEBUG. + def debug(self, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + self._log(self.logger.debug, msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger.debug] + + # [DEF:SupersetLogger.warning:Function] + # @PURPOSE: Записывает сообщение уровня WARNING. + def warning(self, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + self._log(self.logger.warning, msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger.warning] + + # [DEF:SupersetLogger.error:Function] + # @PURPOSE: Записывает сообщение уровня ERROR. + def error(self, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + self._log(self.logger.error, msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger.error] + + # [DEF:SupersetLogger.critical:Function] + # @PURPOSE: Записывает сообщение уровня CRITICAL. + def critical(self, msg: str, *args: Any, extra: Optional[Mapping[str, Any]] = None, exc_info: bool = False) -> None: + self._log(self.logger.critical, msg, *args, extra=extra, exc_info=exc_info) + # [/DEF:SupersetLogger.critical] + + # [DEF:SupersetLogger.exception:Function] + # @PURPOSE: Записывает сообщение уровня ERROR вместе с трассировкой стека текущего исключения. + def exception(self, msg: str, *args: Any, **kwargs: Any) -> None: + self.logger.exception(msg, *args, **kwargs) + # [/DEF:SupersetLogger.exception] + +# [/DEF:SupersetLogger] + +# [/DEF:superset_tool.utils.logger] diff --git a/superset_tool/utils/network.py b/superset_tool/utils/network.py old mode 100644 new mode 100755 index 062e5fd..d6ab0e9 --- a/superset_tool/utils/network.py +++ b/superset_tool/utils/network.py @@ -1,264 +1,232 @@ -# -*- coding: utf-8 -*- -# pylint: disable=too-many-arguments,too-many-locals,too-many-statements,too-many-branches,unused-argument -""" -[MODULE] Сетевой клиент для API - -[DESCRIPTION] -Инкапсулирует низкоуровневую HTTP-логику для взаимодействия с Superset API. 
-""" - -# [IMPORTS] Стандартная библиотека -from typing import Optional, Dict, Any, BinaryIO, List, Union -import json -import io -from pathlib import Path - -# [IMPORTS] Сторонние библиотеки -import requests -import urllib3 # Для отключения SSL-предупреждений - -# [IMPORTS] Локальные модули -from superset_tool.exceptions import ( - AuthenticationError, - NetworkError, - DashboardNotFoundError, - SupersetAPIError, - PermissionDeniedError -) -from superset_tool.utils.logger import SupersetLogger # Импорт логгера - -# [CONSTANTS] -DEFAULT_RETRIES = 3 -DEFAULT_BACKOFF_FACTOR = 0.5 -DEFAULT_TIMEOUT = 30 - -class APIClient: - """[NETWORK-CORE] Инкапсулирует HTTP-логику для работы с API.""" - - def __init__( - self, - config: Dict[str, Any], - verify_ssl: bool = True, - timeout: int = DEFAULT_TIMEOUT, - logger: Optional[SupersetLogger] = None - ): - self.logger = logger or SupersetLogger(name="APIClient") - self.logger.info("[INFO][APIClient.__init__][ENTER] Initializing APIClient.") - self.base_url = config.get("base_url") - self.auth = config.get("auth") - self.request_settings = { - "verify_ssl": verify_ssl, - "timeout": timeout - } - self.session = self._init_session() - self._tokens: Dict[str, str] = {} - self._authenticated = False - self.logger.info("[INFO][APIClient.__init__][SUCCESS] APIClient initialized.") - - def _init_session(self) -> requests.Session: - self.logger.debug("[DEBUG][APIClient._init_session][ENTER] Initializing session.") - session = requests.Session() - retries = requests.adapters.Retry( - total=DEFAULT_RETRIES, - backoff_factor=DEFAULT_BACKOFF_FACTOR, - status_forcelist=[500, 502, 503, 504], - allowed_methods={"HEAD", "GET", "POST", "PUT", "DELETE"} - ) - adapter = requests.adapters.HTTPAdapter(max_retries=retries) - session.mount('http://', adapter) - session.mount('https://', adapter) - verify_ssl = self.request_settings.get("verify_ssl", True) - session.verify = verify_ssl - if not verify_ssl: - urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning) - self.logger.warning("[WARNING][APIClient._init_session][STATE_CHANGE] SSL verification disabled.") - self.logger.debug("[DEBUG][APIClient._init_session][SUCCESS] Session initialized.") - return session - - def authenticate(self) -> Dict[str, str]: - self.logger.info(f"[INFO][APIClient.authenticate][ENTER] Authenticating to {self.base_url}") - try: - login_url = f"{self.base_url}/security/login" - response = self.session.post( - login_url, - json=self.auth, - timeout=self.request_settings.get("timeout", DEFAULT_TIMEOUT) - ) - response.raise_for_status() - access_token = response.json()["access_token"] - csrf_url = f"{self.base_url}/security/csrf_token/" - csrf_response = self.session.get( - csrf_url, - headers={"Authorization": f"Bearer {access_token}"}, - timeout=self.request_settings.get("timeout", DEFAULT_TIMEOUT) - ) - csrf_response.raise_for_status() - csrf_token = csrf_response.json()["result"] - self._tokens = { - "access_token": access_token, - "csrf_token": csrf_token - } - self._authenticated = True - self.logger.info("[INFO][APIClient.authenticate][SUCCESS] Authenticated successfully.") - return self._tokens - except requests.exceptions.HTTPError as e: - self.logger.error(f"[ERROR][APIClient.authenticate][FAILURE] Authentication failed: {e}") - raise AuthenticationError(f"Authentication failed: {e}") from e - except (requests.exceptions.RequestException, KeyError) as e: - self.logger.error(f"[ERROR][APIClient.authenticate][FAILURE] Network or parsing error: {e}") - raise NetworkError(f"Network 
or parsing error during authentication: {e}") from e - - @property - def headers(self) -> Dict[str, str]: - if not self._authenticated: - self.authenticate() - return { - "Authorization": f"Bearer {self._tokens['access_token']}", - "X-CSRFToken": self._tokens.get("csrf_token", ""), - "Referer": self.base_url, - "Content-Type": "application/json" - } - - def request( - self, - method: str, - endpoint: str, - headers: Optional[Dict] = None, - raw_response: bool = False, - **kwargs - ) -> Union[requests.Response, Dict[str, Any]]: - self.logger.debug(f"[DEBUG][APIClient.request][ENTER] Requesting {method} {endpoint}") - full_url = f"{self.base_url}{endpoint}" - _headers = self.headers.copy() - if headers: - _headers.update(headers) - try: - response = self.session.request( - method, - full_url, - headers=_headers, - timeout=self.request_settings.get("timeout", DEFAULT_TIMEOUT), - **kwargs - ) - response.raise_for_status() - self.logger.debug(f"[DEBUG][APIClient.request][SUCCESS] Request successful for {method} {endpoint}") - return response if raw_response else response.json() - except requests.exceptions.HTTPError as e: - self.logger.error(f"[ERROR][APIClient.request][FAILURE] HTTP error for {method} {endpoint}: {e}") - self._handle_http_error(e, endpoint, context={}) - except requests.exceptions.RequestException as e: - self.logger.error(f"[ERROR][APIClient.request][FAILURE] Network error for {method} {endpoint}: {e}") - self._handle_network_error(e, full_url) - - def _handle_http_error(self, e, endpoint, context): - status_code = e.response.status_code - if status_code == 404: - raise DashboardNotFoundError(endpoint, context=context) from e - if status_code == 403: - raise PermissionDeniedError("Доступ запрещен.", **context) from e - if status_code == 401: - raise AuthenticationError("Аутентификация не удалась.", **context) from e - raise SupersetAPIError(f"Ошибка API: {status_code} - {e.response.text}", **context) from e - - def _handle_network_error(self, e, url): - if isinstance(e, requests.exceptions.Timeout): - msg = "Таймаут запроса" - elif isinstance(e, requests.exceptions.ConnectionError): - msg = "Ошибка соединения" - else: - msg = f"Неизвестная сетевая ошибка: {e}" - raise NetworkError(msg, url=url) from e - - def upload_file( - self, - endpoint: str, - file_info: Dict[str, Any], - extra_data: Optional[Dict] = None, - timeout: Optional[int] = None - ) -> Dict: - self.logger.info(f"[INFO][APIClient.upload_file][ENTER] Uploading file to {endpoint}") - full_url = f"{self.base_url}{endpoint}" - _headers = self.headers.copy() - _headers.pop('Content-Type', None) - file_obj = file_info.get("file_obj") - file_name = file_info.get("file_name") - form_field = file_info.get("form_field", "file") - if isinstance(file_obj, (str, Path)): - with open(file_obj, 'rb') as file_to_upload: - files_payload = {form_field: (file_name, file_to_upload, 'application/x-zip-compressed')} - return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout) - elif isinstance(file_obj, io.BytesIO): - files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')} - return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout) - elif hasattr(file_obj, 'read'): - files_payload = {form_field: (file_name, file_obj, 'application/x-zip-compressed')} - return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout) - else: - self.logger.error(f"[ERROR][APIClient.upload_file][FAILURE] Unsupported file_obj type: {type(file_obj)}") 
-            raise TypeError(f"Неподдерживаемый тип 'file_obj': {type(file_obj)}")
-
-    def _perform_upload(self, url, files, data, headers, timeout):
-        self.logger.debug(f"[DEBUG][APIClient._perform_upload][ENTER] Performing upload to {url}")
-        try:
-            response = self.session.post(
-                url=url,
-                files=files,
-                data=data or {},
-                headers=headers,
-                timeout=timeout or self.request_settings.get("timeout")
-            )
-            response.raise_for_status()
-            self.logger.info(f"[INFO][APIClient._perform_upload][SUCCESS] Upload successful to {url}")
-            return response.json()
-        except requests.exceptions.HTTPError as e:
-            self.logger.error(f"[ERROR][APIClient._perform_upload][FAILURE] HTTP error during upload: {e}")
-            raise SupersetAPIError(f"Ошибка API при загрузке: {e.response.text}") from e
-        except requests.exceptions.RequestException as e:
-            self.logger.error(f"[ERROR][APIClient._perform_upload][FAILURE] Network error during upload: {e}")
-            raise NetworkError(f"Ошибка сети при загрузке: {e}", url=url) from e
-
-    def fetch_paginated_count(
-        self,
-        endpoint: str,
-        query_params: Dict,
-        count_field: str = "count",
-        timeout: Optional[int] = None
-    ) -> int:
-        self.logger.debug(f"[DEBUG][APIClient.fetch_paginated_count][ENTER] Fetching paginated count for {endpoint}")
-        response_json = self.request(
-            method="GET",
-            endpoint=endpoint,
-            params={"q": json.dumps(query_params)},
-            timeout=timeout or self.request_settings.get("timeout")
-        )
-        count = response_json.get(count_field, 0)
-        self.logger.debug(f"[DEBUG][APIClient.fetch_paginated_count][SUCCESS] Fetched paginated count: {count}")
-        return count
-
-    def fetch_paginated_data(
-        self,
-        endpoint: str,
-        pagination_options: Dict[str, Any],
-        timeout: Optional[int] = None
-    ) -> List[Any]:
-        self.logger.debug(f"[DEBUG][APIClient.fetch_paginated_data][ENTER] Fetching paginated data for {endpoint}")
-        base_query = pagination_options.get("base_query", {})
-        total_count = pagination_options.get("total_count", 0)
-        results_field = pagination_options.get("results_field", "result")
-        page_size = base_query.get('page_size')
-        if not page_size or page_size <= 0:
-            raise ValueError("'page_size' должен быть положительным числом.")
-        total_pages = (total_count + page_size - 1) // page_size
-        results = []
-        for page in range(total_pages):
-            query = {**base_query, 'page': page}
-            response_json = self.request(
-                method="GET",
-                endpoint=endpoint,
-                params={"q": json.dumps(query)},
-                timeout=timeout or self.request_settings.get("timeout")
-            )
-            page_results = response_json.get(results_field, [])
-            results.extend(page_results)
-        self.logger.debug(f"[DEBUG][APIClient.fetch_paginated_data][SUCCESS] Fetched paginated data. Total items: {len(results)}")
-        return results
\ No newline at end of file
+# [DEF:superset_tool.utils.network:Module]
+#
+# @SEMANTICS: network, http, client, api, requests, session, authentication
+# @PURPOSE: Encapsulates the low-level HTTP logic for interacting with the Superset API, including authentication, session management, retry logic, and error handling.
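+# @EXAMPLE: (hypothetical usage sketch; config keys mirror this module's own `config.get` calls, the auth payload fields are an assumption)
+#   client = APIClient(config={"base_url": "https://superset.example.com/api/v1",
+#                              "auth": {"username": "admin", "password": "***", "provider": "db", "refresh": True}})
+#   dashboards = client.request("GET", "/dashboard/")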
+# @LAYER: Infra
+# @RELATION: DEPENDS_ON -> superset_tool.exceptions
+# @RELATION: DEPENDS_ON -> superset_tool.utils.logger
+# @RELATION: DEPENDS_ON -> requests
+# @PUBLIC_API: APIClient
+
+# [SECTION: IMPORTS]
+from typing import Optional, Dict, Any, List, Union, cast
+import json
+import io
+from pathlib import Path
+import requests
+from requests.adapters import HTTPAdapter
+import urllib3
+from urllib3.util.retry import Retry
+from superset_tool.exceptions import AuthenticationError, NetworkError, DashboardNotFoundError, SupersetAPIError, PermissionDeniedError
+from superset_tool.utils.logger import SupersetLogger
+# [/SECTION]
+
+# [DEF:APIClient:Class]
+# @PURPOSE: Encapsulates the HTTP logic for working with the API: sessions, authentication, and request handling.
+class APIClient:
+    DEFAULT_TIMEOUT = 30
+
+    # [DEF:APIClient.__init__:Function]
+    # @PURPOSE: Initializes the API client with its configuration, session, and logger.
+    # @PARAM: config (Dict[str, Any]) - Client configuration.
+    # @PARAM: verify_ssl (bool) - Whether to verify SSL certificates.
+    # @PARAM: timeout (int) - Request timeout in seconds.
+    # @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
+    def __init__(self, config: Dict[str, Any], verify_ssl: bool = True, timeout: int = DEFAULT_TIMEOUT, logger: Optional[SupersetLogger] = None):
+        self.logger = logger or SupersetLogger(name="APIClient")
+        self.logger.info("[APIClient.__init__][Entry] Initializing APIClient.")
+        self.base_url: str = config.get("base_url", "")
+        self.auth = config.get("auth")
+        self.request_settings = {"verify_ssl": verify_ssl, "timeout": timeout}
+        self.session = self._init_session()
+        self._tokens: Dict[str, str] = {}
+        self._authenticated = False
+        self.logger.info("[APIClient.__init__][Exit] APIClient initialized.")
+    # [/DEF:APIClient.__init__]
+
+    # [DEF:APIClient._init_session:Function]
+    # @PURPOSE: Creates and configures a `requests.Session` with retry logic.
+    # @RETURN: requests.Session - The configured session.
+    def _init_session(self) -> requests.Session:
+        session = requests.Session()
+        retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
+        adapter = HTTPAdapter(max_retries=retries)
+        session.mount('http://', adapter)
+        session.mount('https://', adapter)
+        if not self.request_settings["verify_ssl"]:
+            urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
+            self.logger.warning("[_init_session][State] SSL verification disabled.")
+        session.verify = self.request_settings["verify_ssl"]
+        return session
+    # [/DEF:APIClient._init_session]
+
+    # [DEF:APIClient.authenticate:Function]
+    # @PURPOSE: Authenticates against the Superset API and obtains the access and CSRF tokens.
+    # @POST: `self._tokens` is populated and `self._authenticated` is set to `True`.
+    # @RETURN: Dict[str, str] - Dictionary with the tokens.
+    # @THROW: AuthenticationError, NetworkError - on failure.
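+    # @EXAMPLE: (hypothetical token shape, for illustration only)
+    #   client.authenticate()  # -> {"access_token": "eyJ...", "csrf_token": "IjY..."}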
+    def authenticate(self) -> Dict[str, str]:
+        self.logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
+        try:
+            login_url = f"{self.base_url}/security/login"
+            response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
+            response.raise_for_status()
+            access_token = response.json()["access_token"]
+
+            csrf_url = f"{self.base_url}/security/csrf_token/"
+            csrf_response = self.session.get(csrf_url, headers={"Authorization": f"Bearer {access_token}"}, timeout=self.request_settings["timeout"])
+            csrf_response.raise_for_status()
+
+            self._tokens = {"access_token": access_token, "csrf_token": csrf_response.json()["result"]}
+            self._authenticated = True
+            self.logger.info("[authenticate][Exit] Authenticated successfully.")
+            return self._tokens
+        except requests.exceptions.HTTPError as e:
+            raise AuthenticationError(f"Authentication failed: {e}") from e
+        except (requests.exceptions.RequestException, KeyError) as e:
+            raise NetworkError(f"Network or parsing error during authentication: {e}") from e
+    # [/DEF:APIClient.authenticate]
+
+    # [DEF:APIClient.headers:Function]
+    # @PURPOSE: Returns the HTTP headers for authenticated requests; triggers authentication on first use.
+    @property
+    def headers(self) -> Dict[str, str]:
+        if not self._authenticated: self.authenticate()
+        return {
+            "Authorization": f"Bearer {self._tokens['access_token']}",
+            "X-CSRFToken": self._tokens.get("csrf_token", ""),
+            "Referer": self.base_url,
+            "Content-Type": "application/json"
+        }
+    # [/DEF:APIClient.headers]
+
+    # [DEF:APIClient.request:Function]
+    # @PURPOSE: Performs a generic HTTP request against the API.
+    # @PARAM: method (str) - HTTP method.
+    # @PARAM: endpoint (str) - API endpoint.
+    # @PARAM: headers (Optional[Dict]) - Extra headers.
+    # @PARAM: raw_response (bool) - Whether to return the raw response.
+    # @RETURN: `requests.Response` if `raw_response=True`, otherwise `dict`.
+    # @THROW: SupersetAPIError, NetworkError and their subclasses.
+    def request(self, method: str, endpoint: str, headers: Optional[Dict] = None, raw_response: bool = False, **kwargs) -> Union[requests.Response, Dict[str, Any]]:
+        full_url = f"{self.base_url}{endpoint}"
+        _headers = self.headers.copy()
+        if headers: _headers.update(headers)
+        # Apply the configured timeout unless the caller supplied one explicitly.
+        kwargs.setdefault("timeout", self.request_settings["timeout"])
+
+        try:
+            response = self.session.request(method, full_url, headers=_headers, **kwargs)
+            response.raise_for_status()
+            return response if raw_response else response.json()
+        except requests.exceptions.HTTPError as e:
+            self._handle_http_error(e, endpoint)
+        except requests.exceptions.RequestException as e:
+            self._handle_network_error(e, full_url)
+    # [/DEF:APIClient.request]
+
+    # [DEF:APIClient._handle_http_error:Function]
+    # @PURPOSE: (Helper) Maps HTTP errors to the custom exception hierarchy.
+    # @PARAM: e (requests.exceptions.HTTPError) - The original error.
+    # @PARAM: endpoint (str) - The requested endpoint.
+    def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
+        status_code = e.response.status_code
+        if status_code == 404: raise DashboardNotFoundError(endpoint) from e
+        if status_code == 403: raise PermissionDeniedError("Access denied.") from e
+        if status_code == 401: raise AuthenticationError("Authentication failed.") from e
+        raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
+    # [/DEF:APIClient._handle_http_error]
+
+    # [DEF:APIClient._handle_network_error:Function]
+    # @PURPOSE: (Helper) Maps network-level errors to `NetworkError`.
+    # @PARAM: e (requests.exceptions.RequestException) - The original error.
+    # @PARAM: url (str) - The requested URL.
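+    # @RETURN: NoReturn - this helper always raises `NetworkError`.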
+    def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
+        if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
+        elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
+        else: msg = f"Unknown network error: {e}"
+        raise NetworkError(msg, url=url) from e
+    # [/DEF:APIClient._handle_network_error]
+
+    # [DEF:APIClient.upload_file:Function]
+    # @PURPOSE: Uploads a file to the server via multipart/form-data.
+    # @PARAM: endpoint (str) - Target endpoint.
+    # @PARAM: file_info (Dict[str, Any]) - File descriptor: `file_obj`, `file_name`, optional `form_field`.
+    # @PARAM: extra_data (Optional[Dict]) - Additional form data.
+    # @PARAM: timeout (Optional[int]) - Request timeout.
+    # @RETURN: Dict - API response as a dictionary.
+    # @THROW: SupersetAPIError, NetworkError, TypeError.
+    def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
+        full_url = f"{self.base_url}{endpoint}"
+        _headers = self.headers.copy(); _headers.pop('Content-Type', None)
+
+        file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")
+
+        files_payload = {}
+        if isinstance(file_obj, (str, Path)):
+            with open(file_obj, 'rb') as f:
+                files_payload = {form_field: (file_name, f.read(), 'application/x-zip-compressed')}
+        elif isinstance(file_obj, io.BytesIO):
+            files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')}
+        elif hasattr(file_obj, 'read'):
+            # Generic file-like objects are passed through as-is (restores the behaviour of the previous implementation).
+            files_payload = {form_field: (file_name, file_obj, 'application/x-zip-compressed')}
+        else:
+            raise TypeError(f"Unsupported file_obj type: {type(file_obj)}")
+
+        return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
+    # [/DEF:APIClient.upload_file]
+
+    # [DEF:APIClient._perform_upload:Function]
+    # @PURPOSE: (Helper) Performs the POST request carrying the file.
+    # @PARAM: url (str) - Target URL.
+    # @PARAM: files (Dict) - Files payload.
+    # @PARAM: data (Optional[Dict]) - Form data.
+    # @PARAM: headers (Dict) - Headers.
+    # @PARAM: timeout (Optional[int]) - Request timeout.
+    # @RETURN: Dict - API response.
+    def _perform_upload(self, url: str, files: Dict, data: Optional[Dict], headers: Dict, timeout: Optional[int]) -> Dict:
+        try:
+            response = self.session.post(url, files=files, data=data or {}, headers=headers, timeout=timeout or self.request_settings["timeout"])
+            response.raise_for_status()
+            try:
+                return response.json()
+            except ValueError as json_e:
+                # Successful status but a non-JSON body: log a preview for debugging, then fail loudly.
+                self.logger.debug(f"[_perform_upload][Debug] Response is not valid JSON: {response.text[:200]}...")
+                raise SupersetAPIError(f"API error during upload: response is not valid JSON: {json_e}") from json_e
+        except requests.exceptions.HTTPError as e:
+            raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
+        except requests.exceptions.RequestException as e:
+            raise NetworkError(f"Network error during upload: {e}", url=url) from e
+    # [/DEF:APIClient._perform_upload]
+
+    # [DEF:APIClient.fetch_paginated_count:Function]
+    # @PURPOSE: Fetches the total item count for pagination.
+    # @PARAM: endpoint (str) - Target endpoint.
+    # @PARAM: query_params (Dict) - Query parameters.
+    # @PARAM: count_field (str) - Field holding the count.
+    # @RETURN: int - The total item count.
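+    # @EXAMPLE: (hypothetical query, for illustration; Superset-style list endpoints take a JSON-encoded "q" parameter)
+    #   client.fetch_paginated_count("/dashboard/", {"filters": [], "page_size": 100})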
+    def fetch_paginated_count(self, endpoint: str, query_params: Dict, count_field: str = "count") -> int:
+        response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query_params)}))
+        return response_json.get(count_field, 0)
+    # [/DEF:APIClient.fetch_paginated_count]
+
+    # [DEF:APIClient.fetch_paginated_data:Function]
+    # @PURPOSE: Collects data from all pages of a paginated endpoint.
+    # @PARAM: endpoint (str) - Target endpoint.
+    # @PARAM: pagination_options (Dict[str, Any]) - Pagination options: `base_query`, `total_count`, `results_field`.
+    # @RETURN: List[Any] - Combined list of results.
+    # @THROW: ValueError - if `base_query['page_size']` is missing or not positive.
+    def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
+        base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
+        results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
+        # Explicit check instead of `assert`, which is stripped under `python -O`.
+        if not page_size or page_size <= 0:
+            raise ValueError("'page_size' must be a positive number.")
+
+        results = []
+        for page in range((total_count + page_size - 1) // page_size):
+            query = {**base_query, 'page': page}
+            response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
+            results.extend(response_json.get(results_field, []))
+        return results
+    # [/DEF:APIClient.fetch_paginated_data]
+
+# [/DEF:APIClient]
+
+# [/DEF:superset_tool.utils.network]
diff --git a/superset_tool/utils/whiptail_fallback.py b/superset_tool/utils/whiptail_fallback.py
new file mode 100755
index 0000000..4dcb931
--- /dev/null
+++ b/superset_tool/utils/whiptail_fallback.py
@@ -0,0 +1,104 @@
+# [DEF:superset_tool.utils.whiptail_fallback:Module]
+#
+# @SEMANTICS: ui, fallback, console, utility, interactive
+# @PURPOSE: Provides a plain-text console UI fallback for interactive dialogs, emulating `whiptail` on systems where it is unavailable.
+# @LAYER: UI
+# @PUBLIC_API: menu, checklist, yesno, msgbox, inputbox, gauge
+
+# [SECTION: IMPORTS]
+import sys
+from typing import List, Tuple, Optional
+# [/SECTION]
+
+# [DEF:menu:Function]
+# @PURPOSE: Displays a selection menu and returns the chosen item.
+# @PARAM: title (str) - Menu title.
+# @PARAM: prompt (str) - Input prompt.
+# @PARAM: choices (List[str]) - List of selectable options.
+# @RETURN: Tuple[int, Optional[str]] - (return code, selected item); rc=0 means success.
+def menu(title: str, prompt: str, choices: List[str], **kwargs) -> Tuple[int, Optional[str]]:
+    print(f"\n=== {title} ===\n{prompt}")
+    for idx, item in enumerate(choices, 1):
+        print(f"{idx}) {item}")
+    try:
+        raw = input("\nВведите номер (0 – отмена): ").strip()
+        sel = int(raw)
+        return (0, choices[sel - 1]) if 0 < sel <= len(choices) else (1, None)
+    except (ValueError, IndexError):
+        return 1, None
+# [/DEF:menu]
+
+# [DEF:checklist:Function]
+# @PURPOSE: Displays a list that allows multiple selection.
+# @PARAM: title (str) - Title.
+# @PARAM: prompt (str) - Input prompt.
+# @PARAM: options (List[Tuple[str, str]]) - List of (value, label) tuples.
+# @RETURN: Tuple[int, List[str]] - (return code, list of selected values).
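+# @EXAMPLE: (hypothetical call, for illustration only)
+#   checklist("Dashboards", "Select items:", [("1", "Sales"), ("2", "Ops")]) -> (0, ["1", "2"]) when "1,2" is entered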
+def checklist(title: str, prompt: str, options: List[Tuple[str, str]], **kwargs) -> Tuple[int, List[str]]:
+    print(f"\n=== {title} ===\n{prompt}")
+    for idx, (val, label) in enumerate(options, 1):
+        print(f"{idx}) [{val}] {label}")
+    raw = input("\nВведите номера через запятую (пустой ввод → отказ): ").strip()
+    if not raw: return 1, []
+    try:
+        indices = {int(x.strip()) for x in raw.split(",") if x.strip()}
+        selected_values = [options[i - 1][0] for i in indices if 0 < i <= len(options)]
+        return 0, selected_values
+    except (ValueError, IndexError):
+        return 1, []
+# [/DEF:checklist]
+
+# [DEF:yesno:Function]
+# @PURPOSE: Asks a yes/no question.
+# @PARAM: title (str) - Title.
+# @PARAM: question (str) - Question for the user.
+# @RETURN: bool - `True` if the user answered yes.
+def yesno(title: str, question: str, **kwargs) -> bool:
+    ans = input(f"\n=== {title} ===\n{question} (y/n): ").strip().lower()
+    return ans in ("y", "yes", "да", "д")
+# [/DEF:yesno]
+
+# [DEF:msgbox:Function]
+# @PURPOSE: Displays an informational message.
+# @PARAM: title (str) - Title.
+# @PARAM: msg (str) - Message text.
+def msgbox(title: str, msg: str, **kwargs) -> None:
+    print(f"\n=== {title} ===\n{msg}\n")
+# [/DEF:msgbox]
+
+# [DEF:inputbox:Function]
+# @PURPOSE: Asks the user for free-form text input.
+# @PARAM: title (str) - Title.
+# @PARAM: prompt (str) - Input prompt.
+# @RETURN: Tuple[int, Optional[str]] - (return code, entered string); rc=1 on empty input.
+def inputbox(title: str, prompt: str, **kwargs) -> Tuple[int, Optional[str]]:
+    print(f"\n=== {title} ===")
+    val = input(f"{prompt}\n")
+    return (0, val) if val else (1, None)
+# [/DEF:inputbox]
+
+# [DEF:_ConsoleGauge:Class]
+# @PURPOSE: Context manager that emulates `whiptail --gauge` in the console.
+class _ConsoleGauge:
+    def __init__(self, title: str, **kwargs):
+        self.title = title
+    def __enter__(self):
+        print(f"\n=== {self.title} ===")
+        return self
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        sys.stdout.write("\n"); sys.stdout.flush()
+    def set_text(self, txt: str) -> None:
+        # `\r` rewinds to the start of the line so the status text is refreshed in place.
+        sys.stdout.write(f"\r{txt} "); sys.stdout.flush()
+    def set_percent(self, percent: int) -> None:
+        sys.stdout.write(f"{percent}%"); sys.stdout.flush()
+# [/DEF:_ConsoleGauge]
+
+# [DEF:gauge:Function]
+# @PURPOSE: Creates and returns a `_ConsoleGauge` instance.
+# @PARAM: title (str) - Title for the progress indicator.
+# @RETURN: _ConsoleGauge - Context-manager instance.
+def gauge(title: str, **kwargs) -> _ConsoleGauge:
+    return _ConsoleGauge(title, **kwargs)
+# [/DEF:gauge]
+
+# [/DEF:superset_tool.utils.whiptail_fallback]
diff --git a/temp_pylint_runner.py b/temp_pylint_runner.py
deleted file mode 100644
index e4e1c5c..0000000
--- a/temp_pylint_runner.py
+++ /dev/null
@@ -1,7 +0,0 @@
-import sys
-import os
-import pylint.lint
-
-sys.path.append(os.getcwd())
-
-pylint.lint.Run(['superset_tool/utils/fileio.py'])
\ No newline at end of file
diff --git a/test_update_yamls.py b/test_update_yamls.py
new file mode 100755
index 0000000..eb1b7c3
--- /dev/null
+++ b/test_update_yamls.py
@@ -0,0 +1,63 @@
+# [DEF:test_update_yamls:Module]
+#
+# @SEMANTICS: test, yaml, update, script
+# @PURPOSE: Test script to verify update_yamls behavior.
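+# @NOTE: Manual check script - run directly (python test_update_yamls.py); it prints metadata.yaml before and after the update.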
+# @LAYER: Test
+# @RELATION: DEPENDS_ON -> superset_tool.utils.fileio
+# @PUBLIC_API: main
+
+# [SECTION: IMPORTS]
+import tempfile
+from pathlib import Path
+import yaml
+from superset_tool.utils.fileio import update_yamls
+# [/SECTION]
+
+# [DEF:main:Function]
+# @PURPOSE: Main test function.
+# @RELATION: CALLS -> update_yamls
+def main():
+    # Create a temporary directory structure
+    with tempfile.TemporaryDirectory() as tmpdir:
+        tmp_path = Path(tmpdir)
+
+        # Create a mock dashboard directory structure
+        dash_dir = tmp_path / "dashboard"
+        dash_dir.mkdir()
+
+        # Create a mock metadata.yaml file
+        metadata_file = dash_dir / "metadata.yaml"
+        metadata_content = {
+            "dashboard_uuid": "12345",
+            "database_name": "Prod Clickhouse",
+            "slug": "test-dashboard"
+        }
+        with open(metadata_file, 'w') as f:
+            yaml.dump(metadata_content, f)
+
+        print("Original metadata.yaml:")
+        with open(metadata_file, 'r') as f:
+            print(f.read())
+
+        # Test update_yamls
+        db_configs = [
+            {
+                "old": {"database_name": "Prod Clickhouse"},
+                "new": {"database_name": "DEV Clickhouse"}
+            }
+        ]
+
+        update_yamls(db_configs=db_configs, path=str(dash_dir))
+
+        print("\nAfter update_yamls:")
+        with open(metadata_file, 'r') as f:
+            print(f.read())
+
+        print("Test completed.")
+# [/DEF:main]
+
+if __name__ == "__main__":
+    main()
+
+# [/DEF:test_update_yamls]
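A minimal usage sketch for the console fallback added above (titles and values are illustrative; `gauge` and `menu` are the module's own public API):

    from superset_tool.utils.whiptail_fallback import gauge, menu

    # Progress reporting through the gauge context manager (values are illustrative).
    with gauge("Exporting dashboards") as g:
        g.set_text("dashboard 1/3")
        g.set_percent(33)

    # rc == 0 signals a valid selection; choice holds the picked item.
    rc, choice = menu("Superset Tool", "Choose an action:", ["Export", "Import"])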