Select environments and click "Fetch Databases" to start mapping.
+ {/if}
+ {/if}
+
+
+
+
+
+
diff --git a/logs/лог.md b/logs/лог.md
new file mode 100755
index 0000000..f6a8db8
--- /dev/null
+++ b/logs/лог.md
@@ -0,0 +1,298 @@
+PS H:\dev\ss-tools> & C:/ProgramData/anaconda3/python.exe h:/dev/ss-tools/migration_script.py
+2025-12-16 11:50:28,192 - INFO - [run][Entry] Starting migration script.
+
+=== Behavior on import failure ===
+If the import fails, delete the existing dashboard and retry the import? (y/n): n
+2025-12-16 11:50:33,363 - INFO - [ask_delete_on_failure][State] Delete-on-failure = False
+2025-12-16 11:50:33,368 - INFO - [select_environments][Entry] Step 1/5: Environment selection.
+2025-12-16 11:50:33,374 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
+2025-12-16 11:50:33,730 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,734 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,739 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,742 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,746 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,750 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,754 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,758 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,761 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,764 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,769 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,772 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,776 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,779 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,782 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,786 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,790 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,794 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,799 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,805 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,808 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,811 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,815 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,820 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,823 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,827 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
+2025-12-16 11:50:33,831 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
+2025-12-16 11:50:33,834 - WARNING - [_init_session][State] SSL verification disabled.
+2025-12-16 11:50:33,838 - INFO - [APIClient.__init__][Exit] APIClient initialized.
+2025-12-16 11:50:33,840 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
+2025-12-16 11:50:33,847 - INFO - [setup_clients][Exit] All clients (dev, prod, sbx, preprod, uatta, dev5) initialized successfully.
+
+=== Environment selection ===
+Source environment:
+1) dev
+2) prod
+3) sbx
+4) preprod
+5) uatta
+6) dev5
+
+Enter a number (0 – cancel): 4
+2025-12-16 11:50:42,379 - INFO - [select_environments][State] from = preprod
+
+=== Environment selection ===
+Target environment:
+1) dev
+2) prod
+3) sbx
+4) uatta
+5) dev5
+
+Enter a number (0 – cancel): 5
+2025-12-16 11:50:45,176 - INFO - [select_environments][State] to = dev5
+2025-12-16 11:50:45,182 - INFO - [select_environments][Exit] Step 1 complete.
+2025-12-16 11:50:45,186 - INFO - [select_dashboards][Entry] Step 2/5: Dashboard selection.
+2025-12-16 11:50:45,190 - INFO - [get_dashboards][Enter] Fetching dashboards.
+2025-12-16 11:50:45,197 - INFO - [authenticate][Enter] Authenticating to https://preprodta.bi.dwh.rusal.com/api/v1
+2025-12-16 11:50:45,880 - INFO - [authenticate][Exit] Authenticated successfully.
+2025-12-16 11:50:46,025 - INFO - [get_dashboards][Exit] Found 95 dashboards.
+
+=== Search ===
+Enter a regular expression to search dashboards:
+fi
+
+=== Dashboard selection ===
+Select the desired dashboards (enter their numbers):
+1) [ALL] All dashboards
+2) [185] FI-0060 Финансы. Налоги. Данные по налогам. Старый
+3) [184] FI-0083 Статистика по ДЗ/ПДЗ
+4) [187] FI-0081 ПДЗ Казначейство
+5) [122] FI-0080 Финансы. Оборотный Капитал ДЗ/КЗ
+6) [208] FI-0020 Просроченная дебиторская и кредиторская задолженность в динамике
+7) [126] FI-0022 Кредиторская задолженность для казначейства
+8) [196] FI-0023 Дебиторская задолженность для казначейства
+9) [113] FI-0060 Финансы. Налоги. Данные по налогам.
+10) [173] FI-0040 Оборотно-сальдовая ведомость (ОСВ) по контрагентам
+11) [174] FI-0021 Дебиторская и кредиторская задолженность по документам
+12) [172] FI-0030 Дебиторская задолженность по штрафам
+13) [170] FI-0050 Налог на прибыль (ОНА и ОНО)
+14) [159] FI-0070 Досье контрагента
+
+Enter numbers separated by commas (empty input → cancel): 2
+2025-12-16 11:50:52,235 - INFO - [select_dashboards][State] Selected 1 dashboard(s).
+2025-12-16 11:50:52,242 - INFO - [select_dashboards][Exit] Step 2 complete.
+
+=== Database replacement ===
+Replace the database configuration in the YAML files? (y/n): y
+2025-12-16 11:50:53,808 - INFO - [_select_databases][Entry] Selecting databases from both environments.
+2025-12-16 11:50:53,816 - INFO - [get_databases][Enter] Fetching databases.
+2025-12-16 11:50:53,918 - INFO - [get_databases][Exit] Found 12 databases.
+2025-12-16 11:50:53,923 - INFO - [get_databases][Enter] Fetching databases.
+2025-12-16 11:50:53,926 - INFO - [authenticate][Enter] Authenticating to https://dev.bi.dwh.rusal.com/api/v1
+2025-12-16 11:50:54,450 - INFO - [authenticate][Exit] Authenticated successfully.
+2025-12-16 11:50:54,551 - INFO - [get_databases][Exit] Found 4 databases.
+
+=== Source database selection ===
+Select the source database:
+1) DEV datalab (ID: 9)
+2) Prod Greenplum (ID: 7)
+3) DEV Clickhouse New (OLD) (ID: 16)
+4) Preprod Clickhouse New (ID: 15)
+5) DEV Greenplum (ID: 1)
+6) Prod Clickhouse Node 1 (ID: 11)
+7) Preprod Postgre Superset Internal (ID: 5)
+8) Prod Postgre Superset Internal (ID: 28)
+9) Prod Clickhouse (ID: 10)
+10) Dev Clickhouse (correct) (ID: 14)
+11) DEV ClickHouse New (ID: 23)
+12) Sandbox Postgre Superset Internal (ID: 12)
+
+Enter a number (0 – cancel): 9
+2025-12-16 11:51:11,008 - INFO - [get_database][Enter] Fetching database 10.
+2025-12-16 11:51:11,038 - INFO - [get_database][Exit] Got database 10.
+
+=== Target database selection ===
+Select the target database:
+1) DEV Greenplum (ID: 2)
+2) DEV Clickhouse (ID: 3)
+3) DEV ClickHouse New (ID: 4)
+4) Dev Postgre Superset Internal (ID: 1)
+
+Enter a number (0 – cancel): 2
+2025-12-16 11:51:15,559 - INFO - [get_database][Enter] Fetching database 3.
+2025-12-16 11:51:15,586 - INFO - [get_database][Exit] Got database 3.
+2025-12-16 11:51:15,589 - INFO - [_select_databases][Exit] Selected databases: Unnamed -> Unnamed
+old_db: {'id': 10, 'result': {'allow_ctas': False, 'allow_cvas': False, 'allow_dml': True, 'allow_file_upload': False, 'allow_run_async': False, 'backen
+d': 'clickhousedb', 'cache_timeout': None, 'configuration_method': 'sqlalchemy_form', 'database_name': 'Prod Clickhouse', 'driver': 'connect', 'engine_i
+nformation': {'disable_ssh_tunneling': False, 'supports_file_upload': False}, 'expose_in_sqllab': True, 'force_ctas_schema': None, 'id': 10, 'impersonat
+e_user': False, 'is_managed_externally': False, 'uuid': '97aced68-326a-4094-b381-27980560efa9'}}
+2025-12-16 11:51:15,591 - INFO - [confirm_db_config_replacement][State] Replacement set: {'old': {'database_name': None, 'uuid': None, 'id': '10'}, 'new
+': {'database_name': None, 'uuid': None, 'id': '3'}}
+2025-12-16 11:51:15,594 - INFO - [execute_migration][Entry] Starting migration of 1 dashboards.
+
+=== Migration... ===
+Migrating: FI-0060 Финансы. Налоги. Данные по налогам. Старый (1/1) 0%2025-12-16 11:51:15,598 - INFO - [export_dashboard][Enter] Exporting dashboard 185.
+2025-12-16 11:51:16,142 - INFO - [export_dashboard][Exit] Exported dashboard 185 to dashboard_export_20251216T085115.zip.
+2025-12-16 11:51:16,205 - INFO - [update_yamls][Enter] Starting YAML configuration update.
+2025-12-16 11:51:16,208 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\metadata.yaml
+2025-12-16 11:51:16,209 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-01_2787.yaml
+2025-12-16 11:51:16,210 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_2_4030.yaml
+2025-12-16 11:51:16,212 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_4029.yaml
+2025-12-16 11:51:16,213 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_TOTAL2_4036.yaml
+2025-12-16 11:51:16,215 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_TOTAL2_4037.yaml
+2025-12-16 11:51:16,216 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_TOTAL_4028.yaml
+2025-12-16 11:51:16,217 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT2_4024.yaml
+2025-12-16 11:51:16,218 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT_4033.yaml
+2025-12-16 11:51:16,220 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-02_ZFUND-BD2_4021.yaml
+2025-12-16 11:51:16,221 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-02_ZFUND_4027.yaml
+2025-12-16 11:51:16,222 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02-02_ZFUND_4034.yaml
+2025-12-16 11:51:16,224 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02_ZTAX_4022.yaml
+2025-12-16 11:51:16,226 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-02_ZTAX_4035.yaml
+2025-12-16 11:51:16,227 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-04-2_4031.yaml
+2025-12-16 11:51:16,228 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-05-01_4026.yaml
+2025-12-16 11:51:16,230 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-05-01_4032.yaml
+2025-12-16 11:51:16,231 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-06_1_4023.yaml
+2025-12-16 11:51:16,233 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060-06_2_4020.yaml
+2025-12-16 11:51:16,234 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\charts\FI-0060_4025.yaml
+2025-12-16 11:51:16,236 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\dashboards\FI-0060_185.yaml
+2025-12-16 11:51:16,238 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\databases\Prod_Clickhouse_10.yaml
+2025-12-16 11:51:16,240 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0000_-_685.yaml
+2025-12-16 11:51:16,241 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01-2_zfund_reciever_-_861.yaml
+2025-12-16 11:51:16,242 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01_zfund_reciever_click_689.yaml
+2025-12-16 11:51:16,244 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-02_680.yaml
+2025-12-16 11:51:16,245 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-03_ztax_862.yaml
+2025-12-16 11:51:16,246 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-04_zpbe_681.yaml
+2025-12-16 11:51:16,247 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-05_ZTAXZFUND_679.yaml
+2025-12-16 11:51:16,249 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-06_860.yaml
+2025-12-16 11:51:16,250 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-08_682.yaml
+2025-12-16 11:51:16,251 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-10_zpbe_688.yaml
+2025-12-16 11:51:16,253 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-11_ZTAX_NAME_863.yaml
+2025-12-16 11:51:16,254 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_683.yaml
+2025-12-16 11:51:16,255 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_684.yaml
+2025-12-16 11:51:16,256 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_686.yaml
+2025-12-16 11:51:16,258 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export
+_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_690.yaml
+2025-12-16 11:51:16,259 - INFO - [create_dashboard_export][Enter] Packing dashboard: ['C:\\Users\\LO54FB~1\\Temp\\tmpuidfegpd.dir'] -> C:\Users\LO54FB~1
+\Temp\tmps7cuv2ti.zip
+2025-12-16 11:51:16,347 - INFO - [create_dashboard_export][Exit] Archive created: C:\Users\LO54FB~1\Temp\tmps7cuv2ti.zip
+2025-12-16 11:51:16,372 - ERROR - [import_dashboard][Failure] First import attempt failed: [API_FAILURE] API error during upload: {"errors": [{"message"
+: "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "messag
+e": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448
+\u0438\u0431\u043a\u0430."}]}}]} | Context: {'type': 'api_call'}
+Traceback (most recent call last):
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
+ response.raise_for_status()
+ File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
+ raise HTTPError(http_error_msg, response=self)
+requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/
+
+The above exception was the direct cause of the following exception:
+
+Traceback (most recent call last):
+ File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
+ return self._do_import(file_path)
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
+ return self.network.upload_file(
+ ^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
+ return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
+ raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
+superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "
+error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437
+\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'ty
+pe': 'api_call'}
+2025-12-16 11:51:16,511 - ERROR - [execute_migration][Failure] [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 c
+olumn 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u04
+40\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]
+}}]} | Context: {'type': 'api_call'}
+Traceback (most recent call last):
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
+ response.raise_for_status()
+ File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
+ raise HTTPError(http_error_msg, response=self)
+requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/
+
+The above exception was the direct cause of the following exception:
+
+Traceback (most recent call last):
+ File "h:\dev\ss-tools\migration_script.py", line 366, in execute_migration
+ self.to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
+ File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
+ return self._do_import(file_path)
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
+ return self.network.upload_file(
+ ^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
+ return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
+ raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
+superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "
+error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437
+\u043e\u0448\u043b\u0430 \u043d\u0435\u0438\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'ty
+pe': 'api_call'}
+
+=== Error ===
+Failed to migrate dashboard FI-0060 Финансы. Налоги. Данные по налогам. Старый.
+
+[API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "leve
+l": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - \u041f\u0440\u043e\u0438\u0437\u043e\u0448\u043b\u0430 \u043d\u0435\u0438
+\u0437\u0432\u0435\u0441\u0442\u043d\u0430\u044f \u043e\u0448\u0438\u0431\u043a\u0430."}]}}]} | Context: {'type': 'api_call'}
+
+100%
+2025-12-16 11:51:16,598 - INFO - [execute_migration][Exit] Migration finished.
+
+=== Info ===
+Migration complete!
+
+2025-12-16 11:51:16,605 - INFO - [run][Exit] Migration script finished.
\ No newline at end of file
diff --git a/specs/001-migration-ui-redesign/checklists/requirements.md b/specs/001-migration-ui-redesign/checklists/requirements.md
new file mode 100644
index 0000000..c7e9bbd
--- /dev/null
+++ b/specs/001-migration-ui-redesign/checklists/requirements.md
@@ -0,0 +1,34 @@
+# Specification Quality Checklist: Migration Process and UI Redesign
+
+**Purpose**: Validate specification completeness and quality before proceeding to planning
+**Created**: 2025-12-20
+**Feature**: [specs/001-migration-ui-redesign/spec.md](specs/001-migration-ui-redesign/spec.md)
+
+## Content Quality
+
+- [x] No implementation details (languages, frameworks, APIs)
+- [x] Focused on user value and business needs
+- [x] Written for non-technical stakeholders
+- [x] All mandatory sections completed
+
+## Requirement Completeness
+
+- [x] No [NEEDS CLARIFICATION] markers remain
+- [x] Requirements are testable and unambiguous
+- [x] Success criteria are measurable
+- [x] Success criteria are technology-agnostic (no implementation details)
+- [x] All acceptance scenarios are defined
+- [x] Edge cases are identified
+- [x] Scope is clearly bounded
+- [x] Dependencies and assumptions identified
+
+## Feature Readiness
+
+- [x] All functional requirements have clear acceptance criteria
+- [x] User scenarios cover primary flows
+- [x] Feature meets measurable outcomes defined in Success Criteria
+- [x] No implementation details leak into specification
+
+## Notes
+
+- Items marked incomplete require spec updates before `/speckit.clarify` or `/speckit.plan`
diff --git a/specs/001-migration-ui-redesign/contracts/api.md b/specs/001-migration-ui-redesign/contracts/api.md
new file mode 100644
index 0000000..8be634c
--- /dev/null
+++ b/specs/001-migration-ui-redesign/contracts/api.md
@@ -0,0 +1,115 @@
+# API Contracts: Migration Process and UI Redesign
+
+## Environment Management
+
+### GET /api/environments
+List all configured environments.
+
+**Response (200 OK)**:
+```json
+[
+ {
+ "id": "uuid",
+ "name": "Development",
+ "url": "https://superset-dev.example.com"
+ }
+]
+```
+
+### GET /api/environments/{id}/databases
+Fetch the list of databases from a specific environment.
+
+**Response (200 OK)**:
+```json
+[
+ {
+ "uuid": "db-uuid",
+ "database_name": "Dev Clickhouse",
+ "engine": "clickhouse"
+ }
+]
+```
+
+## Database Mapping
+
+### GET /api/mappings
+List all saved database mappings.
+
+**Query Parameters**:
+- `source_env_id`: Filter by source environment.
+- `target_env_id`: Filter by target environment.
+
+**Response (200 OK)**:
+```json
+[
+ {
+ "id": "uuid",
+ "source_env_id": "uuid",
+ "target_env_id": "uuid",
+ "source_db_uuid": "uuid",
+ "target_db_uuid": "uuid",
+ "source_db_name": "Dev Clickhouse",
+ "target_db_name": "Prod Clickhouse"
+ }
+]
+```
+
+### POST /api/mappings
+Create or update a database mapping.
+
+**Request Body**:
+```json
+{
+ "source_env_id": "uuid",
+ "target_env_id": "uuid",
+ "source_db_uuid": "uuid",
+ "target_db_uuid": "uuid"
+}
+```
+
+### POST /api/mappings/suggest
+Get suggested mappings based on fuzzy matching.
+
+**Request Body**:
+```json
+{
+ "source_env_id": "uuid",
+ "target_env_id": "uuid"
+}
+```
+
+**Response (200 OK)**:
+```json
+[
+ {
+ "source_db_uuid": "uuid",
+ "target_db_uuid": "uuid",
+ "confidence": 0.95
+ }
+]
+```
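+
+For illustration, a client call against this endpoint might look as follows; the base URL, environment IDs, and the `requests` dependency are assumptions, not part of the contract:
+
+```python
+# Sketch only: fetch mapping suggestions and print them.
+import requests
+
+resp = requests.post(
+    "http://localhost:8000/api/mappings/suggest",  # assumed base URL
+    json={"source_env_id": "env-dev", "target_env_id": "env-prod"},
+    timeout=30,
+)
+resp.raise_for_status()
+for pair in resp.json():
+    print(pair["source_db_uuid"], "->", pair["target_db_uuid"],
+          f'({pair["confidence"]:.2f})')
+```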
+
+## Migration Execution
+
+### POST /api/migrations
+Start a migration job.
+
+**Request Body**:
+```json
+{
+ "source_env_id": "uuid",
+ "target_env_id": "uuid",
+ "assets": [
+ {"type": "dashboard", "id": 123}
+ ],
+ "replace_db": true
+}
+```
+
+**Response (202 Accepted)**:
+```json
+{
+ "job_id": "uuid",
+ "status": "RUNNING"
+}
+```
diff --git a/specs/001-migration-ui-redesign/data-model.md b/specs/001-migration-ui-redesign/data-model.md
new file mode 100644
index 0000000..955e209
--- /dev/null
+++ b/specs/001-migration-ui-redesign/data-model.md
@@ -0,0 +1,48 @@
+# Data Model: Migration Process and UI Redesign
+
+## Entities
+
+### Environment
+Represents a Superset instance.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `id` | UUID | Primary Key |
+| `name` | String | Display name (e.g., "Development", "Production") |
+| `url` | String | Base URL of the Superset instance |
+| `credentials_id` | String | Reference to encrypted credentials in the config manager |
+
+### DatabaseMapping
+Represents a mapping between a database in the source environment and a database in the target environment.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `id` | UUID | Primary Key |
+| `source_env_id` | UUID | Foreign Key to Environment (Source) |
+| `target_env_id` | UUID | Foreign Key to Environment (Target) |
+| `source_db_uuid` | String | UUID of the database in the source environment |
+| `target_db_uuid` | String | UUID of the database in the target environment |
+| `source_db_name` | String | Name of the database in the source environment (for UI) |
+| `target_db_name` | String | Name of the database in the target environment (for UI) |
+| `engine` | String | Database engine type (e.g., "clickhouse", "postgres") |
+
+### MigrationJob
+Represents a single migration execution.
+
+| Field | Type | Description |
+|-------|------|-------------|
+| `id` | UUID | Primary Key |
+| `source_env_id` | UUID | Foreign Key to Environment |
+| `target_env_id` | UUID | Foreign Key to Environment |
+| `status` | Enum | `PENDING`, `RUNNING`, `COMPLETED`, `FAILED`, `AWAITING_MAPPING` |
+| `replace_db` | Boolean | Whether to apply database mappings |
+| `created_at` | DateTime | Timestamp of creation |
+
+## Relationships
+- `DatabaseMapping` belongs to a pair of `Environments`.
+- `MigrationJob` references two `Environments`.
+
+## Validation Rules
+- `source_env_id` and `target_env_id` must be different.
+- `source_db_uuid` and `target_db_uuid` should reference databases with compatible engines; a mismatch triggers a warning rather than an error.
+- Mappings must be unique for a given `(source_env_id, target_env_id, source_db_uuid)` triplet.
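+
+A minimal sketch of the `DatabaseMapping` table with that uniqueness rule enforced at the schema level, using SQLAlchemy's declarative mapping (the table and constraint names are assumptions):
+
+```python
+# Sketch only: DatabaseMapping with the (source env, target env, source db)
+# uniqueness rule from the validation section above.
+import uuid
+
+from sqlalchemy import Column, String, UniqueConstraint
+from sqlalchemy.orm import declarative_base
+
+Base = declarative_base()
+
+class DatabaseMapping(Base):
+    __tablename__ = "database_mappings"  # assumed table name
+    __table_args__ = (
+        UniqueConstraint("source_env_id", "target_env_id", "source_db_uuid",
+                         name="uq_mapping_triplet"),
+    )
+
+    id = Column(String, primary_key=True, default=lambda: str(uuid.uuid4()))
+    source_env_id = Column(String, nullable=False)
+    target_env_id = Column(String, nullable=False)
+    source_db_uuid = Column(String, nullable=False)
+    target_db_uuid = Column(String, nullable=False)
+    source_db_name = Column(String)   # display only
+    target_db_name = Column(String)   # display only
+    engine = Column(String)           # e.g. "clickhouse", "postgres"
+```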
diff --git a/specs/001-migration-ui-redesign/plan.md b/specs/001-migration-ui-redesign/plan.md
new file mode 100644
index 0000000..1ec94e4
--- /dev/null
+++ b/specs/001-migration-ui-redesign/plan.md
@@ -0,0 +1,79 @@
+# Implementation Plan: Migration Process and UI Redesign
+
+**Branch**: `001-migration-ui-redesign` | **Date**: 2025-12-20 | **Spec**: [specs/001-migration-ui-redesign/spec.md](specs/001-migration-ui-redesign/spec.md)
+
+## Summary
+
+Redesign the migration process to support environment-based selection and automated database mapping. The technical approach involves using a SQLite database to persist mappings between source and target databases, implementing a fuzzy matching algorithm for empirical suggestions, and intercepting asset definitions during migration to apply these mappings.
+
+## Technical Context
+
+**Language/Version**: Python 3.9+, Node.js 18+
+**Primary Dependencies**: FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLite
+**Storage**: SQLite (for database mappings and environment metadata)
+**Testing**: pytest (Backend), Vitest/Playwright (Frontend)
+**Target Platform**: Linux server
+**Project Type**: Web application (FastAPI + SvelteKit SPA)
+**Performance Goals**: SC-001: Users can complete a full database mapping for 5+ databases in under 60 seconds.
+**Constraints**: SPA-First Architecture (Constitution Principle I), API-Driven Communication (Constitution Principle II).
+**Scale/Scope**: Support for multiple environments and hundreds of database mappings.
+
+## Constitution Check
+
+*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*
+
+| Principle | Status | Notes |
+|-----------|--------|-------|
+| I. SPA-First Architecture | PASS | SvelteKit will be built as a static SPA and served by FastAPI. |
+| II. API-Driven Communication | PASS | All mapping and migration actions will go through FastAPI endpoints. |
+| III. Modern Stack Consistency | PASS | Using FastAPI, SvelteKit, and Tailwind CSS. |
+| IV. Semantic Protocol Adherence | PASS | Code will include GRACE-Poly anchors and contracts. |
+
+## Project Structure
+
+### Documentation (this feature)
+
+```text
+specs/001-migration-ui-redesign/
+├── plan.md # This file
+├── research.md # Phase 0 output
+├── data-model.md # Phase 1 output
+├── quickstart.md # Phase 1 output
+├── contracts/ # Phase 1 output
+└── tasks.md # Phase 2 output
+```
+
+### Source Code (repository root)
+
+```text
+backend/
+├── src/
+│ ├── api/
+│ │ └── routes/
+│ │ ├── environments.py # New: Env selection
+│ │ └── mappings.py # New: DB mapping management
+│ ├── core/
+│ │ └── migration_engine.py # Update: DB replacement logic
+│ └── models/
+│ └── mapping.py # New: SQLite models
+└── tests/
+
+frontend/
+├── src/
+│ ├── components/
+│ │ ├── MappingTable.svelte # New: DB mapping UI
+│ │ └── EnvSelector.svelte # New: Source/Target selection
+│ └── routes/
+│ └── migration/ # New: Migration dashboard
+└── tests/
+```
+
+**Structure Decision**: Web application structure (Option 2) is selected to maintain separation between the FastAPI backend and SvelteKit frontend while adhering to the SPA-first principle.
+
+## Complexity Tracking
+
+> **Fill ONLY if Constitution Check has violations that must be justified**
+
+| Violation | Why Needed | Simpler Alternative Rejected Because |
+|-----------|------------|-------------------------------------|
+| None | N/A | N/A |
diff --git a/specs/001-migration-ui-redesign/quickstart.md b/specs/001-migration-ui-redesign/quickstart.md
new file mode 100644
index 0000000..527b053
--- /dev/null
+++ b/specs/001-migration-ui-redesign/quickstart.md
@@ -0,0 +1,39 @@
+# Quickstart: Migration Process and UI Redesign
+
+## Setup
+
+1. **Install Dependencies**:
+ ```bash
+ pip install rapidfuzz sqlalchemy
+ cd frontend && npm install
+ ```
+
+2. **Configure Environments**:
+ Ensure you have at least two Superset environments configured in the application settings.
+
+3. **Initialize Database**:
+ The system will automatically create the `mappings.db` SQLite file on the first run.
+
+## Usage
+
+### 1. Define Mappings
+1. Navigate to the **Database Mapping** tab.
+2. Select your **Source** and **Target** environments.
+3. Click **Fetch Databases**.
+4. Review the **Suggested Mappings** (highlighted in green).
+5. Manually adjust any mappings using the dropdowns.
+6. Click **Save Mappings**.
+
+### 2. Run Migration
+1. Go to the **Migration** dashboard.
+2. Select the **Source** and **Target** environments.
+3. Select the assets (Dashboards/Datasets) you want to migrate.
+4. Enable the **Replace Database** toggle.
+5. Click **Start Migration**.
+6. If a database is missing a mapping, a modal will appear prompting you to select a target database.
+
+## Troubleshooting
+
+- **Connection Error**: Ensure the backend can reach both Superset instances. Check credentials in settings.
+- **Mapping Not Applied**: Verify that the "Replace Database" toggle was enabled and that the mapping exists for the specific environment pair.
+- **Fuzzy Match Failure**: If names are too different, manual mapping is required. The system learns from manual overrides.
diff --git a/specs/001-migration-ui-redesign/research.md b/specs/001-migration-ui-redesign/research.md
new file mode 100644
index 0000000..d6049ed
--- /dev/null
+++ b/specs/001-migration-ui-redesign/research.md
@@ -0,0 +1,33 @@
+# Research: Migration Process and UI Redesign
+
+## Decision: Fuzzy Matching Algorithm
+- **Choice**: `RapidFuzz` library with `fuzz.token_sort_ratio`.
+- **Rationale**: `RapidFuzz` is significantly faster than `FuzzyWuzzy` and provides robust string-similarity metrics. `token_sort_ratio` suits database names because it ignores word order; common environment prefixes such as "Dev" or "Prod" are stripped before comparison (per FR-005) so they do not dominate the score.
+- **Alternatives considered**:
+ - `Levenshtein`: Too sensitive to string length and prefixes.
+ - `Jaro-Winkler`: Good for short strings but less effective for multi-word names with different orders.
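+
+A minimal sketch of this suggestion pass, assuming `rapidfuzz` is installed; the prefix set and the acceptance threshold below are illustrative assumptions, not part of the spec:
+
+```python
+# Sketch only: suggest source -> target pairs with token_sort_ratio.
+from rapidfuzz import fuzz
+
+ENV_PREFIXES = {"dev", "prod", "preprod", "sbx", "uat"}  # assumed prefix set
+
+def normalize(name: str) -> str:
+    """Lower-case the name and drop a leading environment prefix."""
+    tokens = name.lower().replace("-", " ").split()
+    if tokens and tokens[0] in ENV_PREFIXES:
+        tokens = tokens[1:]
+    return " ".join(tokens)
+
+def suggest_mappings(source_names, target_names, threshold=80):
+    """Yield (source, target, confidence) triples above the threshold."""
+    for src in source_names:
+        # token_sort_ratio ignores word order, per the rationale above
+        best = max(target_names,
+                   key=lambda t: fuzz.token_sort_ratio(normalize(src), normalize(t)))
+        score = fuzz.token_sort_ratio(normalize(src), normalize(best))
+        if score >= threshold:
+            yield src, best, score / 100
+
+print(list(suggest_mappings(["Dev Clickhouse"],
+                            ["Prod Clickhouse", "Prod Greenplum"])))
+# [('Dev Clickhouse', 'Prod Clickhouse', 1.0)]
+```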
+
+## Decision: Asset Interception Strategy
+- **Choice**: ZIP-based transformation during migration.
+- **Rationale**: Superset's native export/import format is a ZIP archive containing YAML definitions. Intercepting this archive allows for precise modification of database references (UUIDs) before they reach the target environment.
+- **Implementation**:
+ 1. Export dashboard/dataset from source (ZIP).
+ 2. Extract ZIP to a temporary directory.
+ 3. Iterate through `datasets/*.yaml` files.
+ 4. Replace `database_uuid` values based on the mapping table.
+ 5. Re-package the ZIP.
+ 6. Import to target.
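+
+A hedged sketch of steps 2–5, assuming dataset YAML files carry a top-level `database_uuid` key (as in Superset's export format) and that PyYAML is available; function and path names are illustrative:
+
+```python
+# Sketch only: remap database_uuid values in an exported bundle, then re-zip.
+import shutil
+import tempfile
+import zipfile
+from pathlib import Path
+
+import yaml  # PyYAML
+
+def apply_mappings(export_zip: str, uuid_map: dict[str, str]) -> str:
+    """Return the path of a new archive with database references remapped."""
+    workdir = Path(tempfile.mkdtemp())
+    with zipfile.ZipFile(export_zip) as zf:
+        zf.extractall(workdir)  # step 2: unpack to a temp directory
+
+    # Steps 3-4: rewrite database_uuid in every dataset definition.
+    for yaml_path in workdir.rglob("*.yaml"):
+        if "datasets" not in yaml_path.parts:
+            continue
+        doc = yaml.safe_load(yaml_path.read_text(encoding="utf-8")) or {}
+        if doc.get("database_uuid") in uuid_map:
+            doc["database_uuid"] = uuid_map[doc["database_uuid"]]
+            yaml_path.write_text(yaml.safe_dump(doc), encoding="utf-8")
+
+    # Step 5: re-pack; make_archive appends ".zip" to the base name.
+    return shutil.make_archive(str(workdir) + "_remapped", "zip", workdir)
+```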
+
+## Decision: Database Mapping Persistence
+- **Choice**: SQLite with SQLAlchemy/SQLModel.
+- **Rationale**: SQLite is lightweight, requires no separate server, and is perfect for storing local configuration and mappings. It aligns with the project's existing stack.
+- **Schema**:
+ - `Environment`: `id`, `name`, `url`, `credentials_id`.
+ - `DatabaseMapping`: `id`, `source_env_id`, `target_env_id`, `source_db_uuid`, `target_db_uuid`, `source_db_name`, `target_db_name`.
+
+## Decision: Superset API Integration
+- **Choice**: Extend existing `SupersetClient`.
+- **Rationale**: `SupersetClient` already handles authentication, network requests, and basic CRUD for dashboards/datasets. Adding environment-specific fetching and database listing is a natural extension.
+- **New Endpoints to use**:
+ - `GET /api/v1/database/`: List all databases.
+ - `GET /api/v1/database/{id}`: Get detailed database config.
diff --git a/specs/001-migration-ui-redesign/spec.md b/specs/001-migration-ui-redesign/spec.md
new file mode 100644
index 0000000..81a1fd6
--- /dev/null
+++ b/specs/001-migration-ui-redesign/spec.md
@@ -0,0 +1,109 @@
+# Feature Specification: Migration Process and UI Redesign
+
+**Feature Branch**: `001-migration-ui-redesign`
+**Created**: 2025-12-20
+**Status**: Draft
+**Input**: User description: "I want to rework the migration process and interface. 1. There should be a dropdown list of environments (from and to), plus a simple 'replace DB' checkbox. 2. Database replacement should use predefined pairs; a separate tab is needed that reads the databases from the source and the target and lets the user map them, initially filling in pairs of the form 'Dev Clickhouse' -> 'Prod Clickhouse' empirically. Mappings must be saved and remain editable."
+
+## Clarifications
+
+### Session 2025-12-20
+- Q: Scope of Database Mapping → A: Map the full configuration object obtained from the Superset API.
+- Q: Persistence of mappings → A: Use a SQLite database for storing mappings.
+- Q: Handling of missing mappings during migration → A: Show a modal dialog during the migration process to prompt for missing mappings.
+- Q: Empirical matching algorithm details → A: Use name-based fuzzy matching (ignoring common prefixes like Dev/Prod).
+- Q: Scope of "Replace Database" toggle → A: Apply replacement to all assets (Dashboards, Datasets, Charts) included in the migration.
+- Q: Backend exposure of Superset databases → A: Dedicated environment database endpoints (e.g., `/api/environments/{id}/databases`).
+- Q: Superset API authentication → A: Use stored environment credentials from the backend.
+- Q: Error handling for unreachable environments → A: Return structured error responses (502/503) with descriptive messages.
+- Q: Database list filtering → A: Return all available databases with metadata (engine type, etc.).
+- Q: Handling large database lists → A: Return full list (no pagination) for simplicity.
+
+## User Scenarios & Testing *(mandatory)*
+
+### User Story 1 - Environment-Based Migration Setup (Priority: P1)
+
+As a migration operator, I want to easily select the source and target environments from a list so that I can quickly define the scope of my migration without manual URL entry.
+
+**Why this priority**: This is the core interaction for starting any migration. Using predefined environments reduces errors and improves speed.
+
+**Independent Test**: Can be tested by opening the migration page and verifying that the "Source" and "Target" dropdowns are populated with configured environments and can be selected.
+
+**Acceptance Scenarios**:
+
+1. **Given** multiple environments are configured in settings, **When** I open the migration page, **Then** I should see two dropdowns for "Source" and "Target" containing these environments.
+2. **Given** a source and target are selected, **When** I toggle the "Replace Database" checkbox, **Then** the system should prepare to apply database mappings during the next migration step.
+
+---
+
+### User Story 2 - Database Mapping Management (Priority: P1)
+
+As an administrator, I want to define how databases in my development environment map to databases in production so that my dashboards and datasets work correctly after migration.
+
+**Why this priority**: Migrations often fail or require manual fixups because database references point to the wrong environment. Automated mapping is critical for reliable migrations.
+
+**Independent Test**: Can be tested by navigating to the "Database Mapping" tab, fetching databases, and verifying that mappings can be created, saved, and edited.
+
+**Acceptance Scenarios**:
+
+1. **Given** a source and target environment are selected, **When** I open the "Database Mapping" tab, **Then** the system should fetch and display lists of databases from both environments.
+2. **Given** the database lists are loaded, **When** the system identifies similar names (e.g., "Dev Clickhouse" and "Prod Clickhouse"), **Then** it should automatically suggest these as a mapping pair.
+3. **Given** suggested or manual mappings, **When** I click "Save Mappings", **Then** these pairs should be persisted and associated with the selected environment pair.
+
+---
+
+### User Story 3 - Migration with Automated DB Replacement (Priority: P2)
+
+As a user, I want the migration process to automatically update database references based on my saved mappings so that I don't have to manually edit exported files or post-migration settings.
+
+**Why this priority**: This delivers the actual value of the mapping feature by automating a tedious and error-prone task.
+
+**Independent Test**: Can be tested by running a migration with "Replace Database" enabled and verifying that the resulting assets in the target environment point to the mapped databases.
+
+**Acceptance Scenarios**:
+
+1. **Given** saved mappings exist for the selected environments, **When** I start a migration with "Replace Database" enabled, **Then** the system should replace all source database IDs/names with their corresponding target values during the transfer.
+2. **Given** "Replace Database" is enabled but a source database has no mapping, **When** the migration runs, **Then** the system should pause and show a modal dialog prompting the user to provide a mapping on-the-fly for the missing database.
+
+---
+
+### Edge Cases
+
+- **Environment Connectivity**: If the source or target environment is unreachable, the backend MUST return a structured error (502/503), and the frontend MUST display a clear connection error with a retry option.
+- **Duplicate Mappings**: How does the system handle multiple source databases mapping to the same target database? (Assumption: This is allowed, as multiple dev DBs might consolidate into one prod DB).
+- **Missing Target Database**: What if a mapped target database no longer exists in the target environment? (Assumption: Validation should occur before migration starts, highlighting broken mappings).
+
+## Requirements *(mandatory)*
+
+### Functional Requirements
+
+- **FR-001**: System MUST provide dropdown menus for selecting "Source Environment" and "Target Environment" on the migration screen.
+- **FR-002**: System MUST provide a "Replace Database" checkbox that, when enabled, triggers the database mapping logic for all assets (Dashboards, Datasets, Charts) during migration.
+- **FR-003**: System MUST include a dedicated "Database Mapping" tab or view accessible from the migration interface.
+- **FR-004**: System MUST fetch available databases from both source and target environments via their respective APIs when the mapping tab is opened.
+- **FR-005**: System MUST implement a name-based fuzzy matching algorithm to suggest initial mappings, ignoring common environment prefixes (e.g., "Dev", "Prod").
+- **FR-006**: System MUST allow users to manually override suggested mappings and create new ones via a drag-and-drop or dropdown-based interface.
+- **FR-007**: System MUST persist database mappings in a local SQLite database, keyed by the source and target environment identifiers.
+- **FR-008**: System MUST provide an "Edit" capability for existing mappings, allowing users to update or delete them.
+- **FR-009**: During migration, if "Replace Database" is active, the system MUST intercept asset definitions (JSON/YAML) and replace database references according to the active mapping table.
+
+### Key Entities *(include if feature involves data)*
+
+- **Environment**: A configured Superset instance (Name, URL, Credentials).
+- **Database Mapping**: A record linking a source database configuration (including metadata like engine type) to a target database configuration for a specific `source_env` -> `target_env` pair.
+- **Migration Configuration**: The set of parameters for a migration job, including selected environments and the "Replace Database" toggle state.
+
+## Success Criteria *(mandatory)*
+
+### Measurable Outcomes
+
+- **SC-001**: Users can complete a full database mapping for 5+ databases in under 60 seconds using the empirical suggestions.
+- **SC-002**: 100% of assets migrated with "Replace Database" enabled correctly reference the target databases as defined in the mapping table.
+- **SC-003**: Mapping persistence allows users to run subsequent migrations between the same environments without re-configuring database pairs in 100% of cases.
+- **SC-004**: The system successfully identifies and suggests at least 90% of matching pairs when naming follows a "Prefix + Name" pattern (e.g., "Dev-Sales" -> "Prod-Sales").
+
+## Assumptions
+
+- **AS-001**: Environments are already configured in the application's global settings.
+- **AS-002**: The backend has access to stored credentials for both source and target environments to perform API requests.
+- **AS-003**: Database names or IDs are stable enough within an environment to be used as reliable mapping keys.
diff --git a/specs/001-migration-ui-redesign/tasks.md b/specs/001-migration-ui-redesign/tasks.md
new file mode 100644
index 0000000..04e14b1
--- /dev/null
+++ b/specs/001-migration-ui-redesign/tasks.md
@@ -0,0 +1,186 @@
+---
+description: "Task list for Migration Process and UI Redesign implementation"
+---
+
+# Tasks: Migration Process and UI Redesign
+
+**Input**: Design documents from `specs/001-migration-ui-redesign/`
+**Prerequisites**: plan.md (required), spec.md (required for user stories), research.md, data-model.md, quickstart.md
+
+**Tests**: Tests are NOT explicitly requested in the feature specification, so they are omitted from this task list.
+
+**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
+
+## Format: `[ID] [P?] [Story] Description`
+
+- **[P]**: Can run in parallel (different files, no dependencies)
+- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
+- Include exact file paths in descriptions
+
+## Path Conventions
+
+- **Web app**: `backend/src/`, `frontend/src/`
+
+---
+
+## Phase 1: Setup (Shared Infrastructure)
+
+**Purpose**: Project initialization and basic structure
+
+- [ ] T001 Create project structure per implementation plan in `backend/src/` and `frontend/src/`
+- [ ] T002 [P] Install backend dependencies (rapidfuzz, sqlalchemy) in `backend/requirements.txt`
+- [ ] T003 [P] Install frontend dependencies (if any new) in `frontend/package.json`
+
+---
+
+## Phase 2: Foundational (Blocking Prerequisites)
+
+**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
+
+**⚠️ CRITICAL**: No user story work can begin until this phase is complete
+
+- [ ] T004 Setup SQLite database schema and SQLAlchemy models in `backend/src/models/mapping.py`
+- [ ] T005 [P] Implement fuzzy matching utility using RapidFuzz in `backend/src/core/utils/matching.py`
+- [ ] T006 [P] Extend SupersetClient to support database listing and metadata fetching in `backend/src/core/superset_client.py`
+- [ ] T007 Configure database mapping persistence layer in `backend/src/core/database.py`
+
+**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
+
+---
+
+## Phase 3: User Story 1 - Environment-Based Migration Setup (Priority: P1) 🎯 MVP
+
+**Goal**: Enable selection of source and target environments and toggle database replacement.
+
+**Independent Test**: Open the migration page and verify that the "Source" and "Target" dropdowns are populated with configured environments and can be selected.
+
+### Implementation for User Story 1
+
+- [ ] T008 [P] [US1] Implement environment selection API endpoints in `backend/src/api/routes/environments.py`
+- [ ] T009 [P] [US1] Create `EnvSelector.svelte` component for source/target selection in `frontend/src/components/EnvSelector.svelte`
+- [ ] T010 [US1] Integrate `EnvSelector` and "Replace Database" toggle into migration dashboard in `frontend/src/routes/migration/+page.svelte`
+- [ ] T011 [US1] Add validation to ensure source and target environments are different in `frontend/src/routes/migration/+page.svelte`
+
+**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently.
+
+---
+
+## Phase 4: User Story 2 - Database Mapping Management (Priority: P1)
+
+**Goal**: Fetch databases from environments, suggest mappings using fuzzy matching, and allow manual overrides/persistence.
+
+**Independent Test**: Navigate to the "Database Mapping" tab, fetch databases, and verify that mappings can be created, saved, and edited.
+
+### Implementation for User Story 2
+
+- [ ] T012 [P] [US2] Implement database mapping CRUD API endpoints in `backend/src/api/routes/mappings.py`
+- [ ] T013 [US2] Implement mapping service with fuzzy matching logic in `backend/src/services/mapping_service.py`
+- [ ] T014 [P] [US2] Create `MappingTable.svelte` component for displaying and editing pairs in `frontend/src/components/MappingTable.svelte`
+- [ ] T015 [US2] Create database mapping management view in `frontend/src/routes/migration/mappings/+page.svelte`
+- [ ] T016 [US2] Implement "Fetch Databases" action and suggestion highlighting in `frontend/src/routes/migration/mappings/+page.svelte`
+
+**Checkpoint**: At this point, User Stories 1 AND 2 should both work independently.
+
+---
+
+## Phase 5: User Story 3 - Migration with Automated DB Replacement (Priority: P2)
+
+**Goal**: Intercept assets during migration, apply database mappings, and prompt for missing ones.
+
+**Independent Test**: Run a migration with "Replace Database" enabled and verify that the resulting assets in the target environment point to the mapped databases.
+
+### Implementation for User Story 3
+
+- [ ] T017 [US3] Implement ZIP-based asset interception and YAML transformation logic in `backend/src/core/migration_engine.py`
+- [ ] T018 [US3] Integrate database mapping application into the migration job execution flow in `backend/src/core/task_manager.py`
+- [ ] T019 [P] [US3] Create `MissingMappingModal.svelte` for on-the-fly mapping prompts in `frontend/src/components/MissingMappingModal.svelte`
+- [ ] T020 [US3] Implement backend pause and frontend modal trigger for missing mappings in `backend/src/api/routes/tasks.py` and `frontend/src/components/TaskRunner.svelte`
+
+**Checkpoint**: All user stories should now be independently functional.
+
+---
+
+## Phase 6: Polish & Cross-Cutting Concerns
+
+**Purpose**: Improvements that affect multiple user stories
+
+- [ ] T021 [P] Update documentation in `docs/` to include database mapping instructions
+- [ ] T022 Code cleanup and refactoring of migration logic
+- [ ] T023 [P] Performance optimization for fuzzy matching and ZIP processing
+- [ ] T024 Run `quickstart.md` validation to ensure end-to-end flow works as documented
+
+---
+
+## Dependencies & Execution Order
+
+### Phase Dependencies
+
+- **Setup (Phase 1)**: No dependencies - can start immediately
+- **Foundational (Phase 2)**: Depends on Setup completion - BLOCKS all user stories
+- **User Stories (Phase 3+)**: All depend on Foundational phase completion
+ - User stories can then proceed in parallel (if staffed)
+ - Or sequentially in priority order (P1 → P2 → P3)
+- **Polish (Final Phase)**: Depends on all desired user stories being complete
+
+### User Story Dependencies
+
+- **User Story 1 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
+- **User Story 2 (P1)**: Can start after Foundational (Phase 2) - No dependencies on other stories
+- **User Story 3 (P2)**: Can start after Foundational (Phase 2) - Depends on US1/US2 for mapping data and configuration
+
+### Within Each User Story
+
+- Models before services
+- Services before endpoints
+- Core implementation before integration
+- Story complete before moving to next priority
+
+### Parallel Opportunities
+
+- All Setup tasks marked [P] can run in parallel
+- All Foundational tasks marked [P] can run in parallel (within Phase 2)
+- Once Foundational phase completes, US1 and US2 can start in parallel
+- Models and UI components within a story marked [P] can run in parallel
+
+---
+
+## Parallel Example: User Story 2
+
+```bash
+# Launch backend and frontend components for User Story 2 together:
+Task: "Implement database mapping CRUD API endpoints in backend/src/api/routes/mappings.py"
+Task: "Create MappingTable.svelte component for displaying and editing pairs in frontend/src/components/MappingTable.svelte"
+```
+
+---
+
+## Implementation Strategy
+
+### MVP First (User Story 1 & 2)
+
+1. Complete Phase 1: Setup
+2. Complete Phase 2: Foundational (CRITICAL - blocks all stories)
+3. Complete Phase 3: User Story 1
+4. Complete Phase 4: User Story 2
+5. **STOP and VALIDATE**: Test environment selection and mapping management independently
+6. Deploy/demo if ready
+
+### Incremental Delivery
+
+1. Complete Setup + Foundational → Foundation ready
+2. Add User Story 1 → Test independently → Deploy/Demo
+3. Add User Story 2 → Test independently → Deploy/Demo (MVP!)
+4. Add User Story 3 → Test independently → Deploy/Demo
+5. Each story adds value without breaking previous stories
+
+---
+
+## Notes
+
+- [P] tasks = different files, no dependencies
+- [Story] label maps task to specific user story for traceability
+- Each user story should be independently completable and testable
+- Commit after each task or logical group
+- Stop at any checkpoint to validate story independently
+- Avoid: vague tasks, same file conflicts, cross-story dependencies that break independence
diff --git a/specs/005-fix-ui-ws-validation/contracts/ws-logs.md b/specs/005-fix-ui-ws-validation/contracts/ws-logs.md
new file mode 100644
index 0000000..ee99844
--- /dev/null
+++ b/specs/005-fix-ui-ws-validation/contracts/ws-logs.md
@@ -0,0 +1,24 @@
+# WebSocket Contract: Task Logs
+
+## Endpoint
+`WS /ws/logs/{task_id}`
+
+## Description
+Streams real-time logs for a specific task.
+
+## Connection Parameters
+- `task_id`: UUID of the task to monitor.
+
+## Message Format (Server -> Client)
+```json
+{
+ "task_id": "uuid",
+ "message": "Log message text",
+ "timestamp": "2025-12-20T20:20:00Z",
+ "level": "INFO"
+}
+```
+
+## Error Handling
+- If `task_id` is invalid, the connection is closed with code `4004` (Not Found).
+- If the connection fails, the client should attempt reconnection with exponential backoff.
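+
+For illustration, a client following this reconnect policy might look like the sketch below; the Python `websockets` dependency, the base URL, and the 60-second backoff cap are assumptions, not part of the contract:
+
+```python
+# Sketch only: consume task logs with exponential-backoff reconnection.
+import asyncio
+import json
+
+import websockets  # assumed client library
+
+async def follow_logs(task_id: str, base_url: str = "ws://localhost:8000"):
+    delay = 1
+    while True:
+        try:
+            async with websockets.connect(f"{base_url}/ws/logs/{task_id}") as ws:
+                delay = 1  # reset backoff once connected
+                async for raw in ws:
+                    entry = json.loads(raw)
+                    print(entry["timestamp"], entry["level"], entry["message"])
+        except websockets.ConnectionClosedOK:
+            return  # normal close: the task finished
+        except websockets.ConnectionClosedError as exc:
+            if exc.rcvd is not None and exc.rcvd.code == 4004:
+                raise ValueError(f"unknown task_id: {task_id}") from exc
+            await asyncio.sleep(delay)
+            delay = min(delay * 2, 60)  # exponential backoff, capped
+        except OSError:
+            await asyncio.sleep(delay)
+            delay = min(delay * 2, 60)
+
+asyncio.run(follow_logs("123e4567-e89b-12d3-a456-426614174000"))
+```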