Add sync engine, web UI, Docker setup, and tests
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
tests/USER_STORIES.md (new file, 210 lines)
# User Stories — Outline Sync Web UI

**Derived from:** WEBUI_PRD.md v2.0
**Date:** 2026-03-07

Each story follows the format: **As a [user], I want [goal] so that [benefit].**
Acceptance criteria map directly to automated test IDs in the corresponding `test_phase_*.py` files.

---
## Phase A — WebDAV Container

**US-A1** — As a user with Obsidian running locally, I want my vault files to sync automatically to the VPS via WebDAV, so that I do not need terminal access for file transfer.

- AC: A file created in the vault directory is retrievable via WebDAV GET within the sync interval.
- AC: A file updated locally appears updated on the WebDAV server after sync.
- AC: Deleted files are removed from the WebDAV share.
- Tests: `test_phase_a_webdav.py::TestWebDAVFileOps`

**US-A2** — As a system administrator, I want the WebDAV endpoint protected by basic auth and Tailscale network isolation, so that the vault is not publicly accessible.

- AC: Unauthenticated requests return 401.
- AC: Requests with valid credentials return 200.
- AC: The WebDAV port is not bound to the public interface.
- Tests: `test_phase_a_webdav.py::TestWebDAVAuth`

**US-A3** — As a user, I want the `.git/` directory excluded from WebDAV access, so that git internals are not exposed or corrupted by Obsidian.

- AC: A GET request to `/.git/` returns 403 or 404.
- AC: The Obsidian plugin cannot overwrite `.git/` files via WebDAV.
- Tests: `test_phase_a_webdav.py::TestWebDAVGitProtection`

**US-A4** — As an Obsidian user, I want WebDAV to support bidirectional file sync (upload, download, delete), so that both push and pull directions work without manual steps.

- AC: PROPFIND returns a correct file listing.
- AC: PUT creates and updates files.
- AC: DELETE removes files.
- Tests: `test_phase_a_webdav.py::TestWebDAVMethods`
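The `.git/` exclusion in US-A3 boils down to a path check that must also catch traversal tricks like `/notes/../.git/config`. A minimal sketch of such a check — the helper name `is_protected_path` is ours, not part of the PRD:

```python
from pathlib import PurePosixPath

def is_protected_path(url_path: str) -> bool:
    """Return True when a WebDAV request path touches git internals.

    Normalises '.' and '..' segments first so that a path such as
    '/notes/../.git/config' is still rejected.
    """
    parts: list[str] = []
    for seg in PurePosixPath(url_path).parts:
        if seg in ("/", "."):
            continue
        if seg == "..":
            if parts:
                parts.pop()
            continue
        parts.append(seg)
    return bool(parts) and parts[0] == ".git"
```

The server would answer 403 or 404 whenever this returns True, for every WebDAV method, not only GET.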
---

## Phase B — Read-Only Dashboard

**US-B1** — As a user, I want to open `https://sync.domverse.de` and immediately see the current vault state, so that I know if anything needs attention before syncing.

- AC: GET `/` returns HTTP 200 with HTML content.
- AC: The page contains a vault status badge (Clean / Dirty / Conflicts).
- AC: The page shows the count of pending local changes.
- Tests: `test_phase_b_dashboard.py::TestDashboardPage`

**US-B2** — As a user, I want to see when the vault was last pulled from Outline and last pushed to Outline, so that I can judge how stale my local state is.

- AC: GET `/status` returns JSON with `last_pull` and `last_push` timestamps.
- AC: The dashboard renders these timestamps in human-readable form.
- Tests: `test_phase_b_dashboard.py::TestStatusEndpoint`

**US-B3** — As a user, I want the dashboard to show a conflict warning badge when git merge conflicts are present, so that I do not accidentally push broken files.

- AC: When conflict files exist, `/status` includes `conflicts: N` where N > 0.
- AC: The dashboard shows a warning banner with a link to `/conflicts`.
- Tests: `test_phase_b_dashboard.py::TestConflictBadge`

**US-B4** — As a user, I want the dashboard to show how many files Obsidian has written via WebDAV that are not yet pushed to Outline, so that I know the scope of the push button.

- AC: Pending count equals the number of paths reported by `git diff outline..main --name-only`.
- AC: The count updates on each page load without manual refresh.
- Tests: `test_phase_b_dashboard.py::TestPendingCount`
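US-B4's pending count comes from `git diff outline..main --name-only`; counting its output robustly means ignoring blank lines rather than piping to `wc -l`. A sketch, assuming the web process shells out to git and parses the text itself (function names are ours):

```python
import subprocess
from pathlib import Path

def parse_name_only(diff_output: str) -> list[str]:
    """Paths from `git diff --name-only` output; blank lines would skew a naive count."""
    return [line for line in diff_output.splitlines() if line.strip()]

def pending_changes(vault: Path) -> list[str]:
    """Files committed to main but not yet pushed to the outline branch."""
    out = subprocess.run(
        ["git", "-C", str(vault), "diff", "outline..main", "--name-only"],
        check=True, capture_output=True, text=True,
    ).stdout
    return parse_name_only(out)
```

`len(pending_changes(vault))` is then the badge count shown on the dashboard.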
---

## Phase C — Pull with Live Output

**US-C1** — As a user, I want to click "Get from Outline" and see live streaming output in the browser, so that I can monitor progress without terminal access.

- AC: POST `/pull` responds immediately with a `job_id`.
- AC: GET `/stream/{job_id}` returns the `text/event-stream` content type.
- AC: The SSE stream emits at least one `data:` event per document processed.
- AC: The stream ends with a `done` event containing a summary.
- Tests: `test_phase_c_pull.py::TestPullStreaming`

**US-C2** — As a user, I want the pull to fetch new Outline documents and update changed ones in the vault, so that Obsidian shows the latest wiki content.

- AC: After a pull, new documents appear as `.md` files in the vault.
- AC: Modified documents have updated content.
- AC: The `outline` branch is advanced to reflect the new Outline state.
- Tests: `test_phase_c_pull.py::TestPullContent`

**US-C3** — As a user, I want the pull operation to be idempotent when nothing changed in Outline, so that repeated pulls are safe and fast.

- AC: A pull with no Outline changes returns success with a "0 changes" summary.
- AC: No git commits are made when there are no changes.
- Tests: `test_phase_c_pull.py::TestPullIdempotent`

**US-C4** — As a user, I want only one sync job running at a time, so that concurrent pull/push operations do not corrupt the vault.

- AC: Starting a pull while one is in progress returns HTTP 409 Conflict.
- AC: The job lock is released when the stream closes (done or error).
- Tests: `test_phase_c_pull.py::TestJobLock`
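The streaming contract in US-C1 is plain Server-Sent Events: each progress record is a `data:` line carrying a JSON payload, terminated by a blank line, and the stream closes after a `done` event. A sketch of the event framing — the payload field names (`type`, `text`, `summary`) are our assumption, not fixed by the PRD:

```python
import json

def sse_event(payload: dict) -> str:
    """Frame one JSON payload as a Server-Sent Events message."""
    return f"data: {json.dumps(payload)}\n\n"

def render_job_stream(lines: list[str]) -> str:
    """Emit one event per processed line, then a terminating 'done' event."""
    out = [sse_event({"type": "line", "text": line}) for line in lines]
    out.append(sse_event({"type": "done", "summary": f"{len(lines)} lines"}))
    return "".join(out)
```

The server would write these strings to a `text/event-stream` response; the browser's `EventSource` parses the framing automatically.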
---

## Phase D — Pending Changes View

**US-D1** — As a user, I want to see a structured list of pending changes before pushing, so that I can review what will be sent to Outline.

- AC: GET `/changes` returns 200 with a list of change objects.
- AC: Each item has a `path`, a `status` (modified/added/renamed/deleted), and an `action` (what the sync engine will do).
- Tests: `test_phase_d_changes.py::TestChangesEndpoint`

**US-D2** — As a user, I want modified files listed separately from new and deleted files, so that I understand the scope of each change type.

- AC: `status=modified` for files changed since the last outline branch commit.
- AC: `status=added` for files not on the outline branch at all.
- AC: `status=deleted` for files removed from main but still on outline.
- AC: `status=renamed` with both `from` and `to` paths.
- Tests: `test_phase_d_changes.py::TestChangeCategories`

**US-D3** — As a user, I want to preview the diff for a modified file inline, so that I can confirm the content before pushing.

- AC: GET `/diff/{encoded_path}` returns an HTML fragment with two columns.
- AC: The left column shows the outline branch version, the right the main branch version.
- AC: Added lines are highlighted green, removed lines red.
- Tests: `test_phase_d_changes.py::TestDiffPreview`

**US-D4** — As a user, I want deleted files shown as "skipped" when deletions are disabled in settings, so that I know why they are not being removed from Outline.

- AC: When `allow_deletions=false`, deleted files appear with `action=skip`.
- AC: The reason text explains that deletions are disabled.
- Tests: `test_phase_d_changes.py::TestDeletedFilesSkipped`
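The change categories in US-D2 map directly onto `git diff --name-status` letters: `M`, `A`, `D`, and `R<score>` for renames, which carry two tab-separated paths. A parsing sketch, assuming the server reads that output:

```python
_STATUS = {"M": "modified", "A": "added", "D": "deleted"}

def parse_name_status(output: str) -> list[dict]:
    """Parse `git diff --name-status` lines into change objects."""
    changes = []
    for line in output.splitlines():
        if not line.strip():
            continue
        fields = line.split("\t")
        code = fields[0]
        if code.startswith("R"):  # e.g. 'R100<TAB>old.md<TAB>new.md'
            changes.append({"status": "renamed", "from": fields[1],
                            "to": fields[2], "path": fields[2]})
        else:
            changes.append({"status": _STATUS.get(code, code), "path": fields[1]})
    return changes
```

The sync engine would then attach its `action` field (update, create, skip, ...) to each object before `/changes` serialises the list.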
---

## Phase E — Push with Live Output

**US-E1** — As a user, I want to click "Send to Outline" and see live streaming output, so that I can monitor progress for each file.

- AC: POST `/push` returns a `job_id`.
- AC: The SSE stream emits one event per file with a status (created/updated/skipped/error).
- AC: The final event contains summary counts (created, updated, skipped, errors).
- Tests: `test_phase_e_push.py::TestPushStreaming`

**US-E2** — As a user, I want new Obsidian files to appear in Outline under the correct collection and parent, so that the hierarchy is preserved.

- AC: A file `Projekte/NewNote.md` (no frontmatter) is created in Outline under the "Projekte" collection.
- AC: After the push, the file receives frontmatter with an `outline_id`.
- AC: The updated file is committed and becomes readable via WebDAV.
- Tests: `test_phase_e_push.py::TestNewFileCreation`

**US-E3** — As a user, I want the push blocked while there are unresolved conflicts, so that I cannot push broken files with conflict markers.

- AC: When `git ls-files -u` reports conflict files, POST `/push` returns HTTP 409.
- AC: The response body includes the list of conflicting paths.
- Tests: `test_phase_e_push.py::TestPushBlockedByConflicts`

**US-E4** — As a user, I want a new top-level folder in Obsidian to create a new Outline collection automatically, so that new categories do not require manual Outline setup.

- AC: A folder `NewCollection/` not mapped to any existing collection triggers a `collections.create` call.
- AC: Documents inside the new folder are created under the new collection.
- Tests: `test_phase_e_push.py::TestNewCollectionCreation`

**US-E5** — As a user, I want the push to handle renames (file moved or title changed) without deleting and recreating the document, so that Outline document history and the URL are preserved.

- AC: A renamed file is detected via git rename detection.
- AC: Outline `documents.update` is called (not delete + create).
- Tests: `test_phase_e_push.py::TestRenameHandling`
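US-E2 and US-E4 both hinge on deriving the Outline target from the vault path: the top-level folder names the collection, the file stem becomes the document title. A sketch of that mapping (the function name `outline_target` is hypothetical):

```python
from pathlib import PurePosixPath

def outline_target(vault_path: str) -> tuple[str, str]:
    """Map a vault-relative path to (collection name, document title).

    'Projekte/NewNote.md' -> ('Projekte', 'NewNote')
    """
    p = PurePosixPath(vault_path)
    if len(p.parts) < 2:
        raise ValueError(f"top-level files have no collection: {vault_path}")
    return p.parts[0], p.stem
```

When the collection name resolves to no known collection ID, the push falls through to the `collections.create` call described in US-E4.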
---

## Phase F — Conflict Resolution

**US-F1** — As a user, I want to see all version conflicts listed in the browser, so that I can resolve them without using git on the command line.

- AC: GET `/conflicts` returns the list of conflict file paths.
- AC: Each item includes the local timestamp and the Outline edit timestamp.
- Tests: `test_phase_f_conflicts.py::TestConflictsList`

**US-F2** — As a user, I want a side-by-side diff view per conflicting file, so that I can compare my Obsidian edit with the Outline edit before choosing.

- AC: GET `/diff/{encoded_path}` for a conflict file returns a two-column HTML diff.
- AC: The diff is rendered using Python `difflib` or an equivalent.
- Tests: `test_phase_f_conflicts.py::TestConflictDiff`

**US-F3** — As a user, I want to click "Keep mine" to accept my local Obsidian edit, so that my changes win.

- AC: POST `/resolve` with `{file: "path", accept: "local"}` resolves the conflict in favour of local.
- AC: Conflict markers are removed from the file.
- AC: The file is committed to the main branch.
- Tests: `test_phase_f_conflicts.py::TestResolveLocal`

**US-F4** — As a user, I want to click "Keep Outline's" to accept the Outline version, so that the wiki state wins.

- AC: POST `/resolve` with `{file: "path", accept: "remote"}` resolves in favour of the outline branch.
- AC: Conflict markers are removed.
- AC: The file is committed to the main branch.
- Tests: `test_phase_f_conflicts.py::TestResolveRemote`

**US-F5** — As a user, I want the system to reject invalid file paths in resolve requests, so that an attacker cannot trigger arbitrary git operations via the UI.

- AC: `/resolve` with a path not in the conflict list returns HTTP 422.
- AC: Path traversal attempts (`../`) return HTTP 422.
- Tests: `test_phase_f_conflicts.py::TestResolveValidation`

**US-F6** — As a user, I want to be redirected to the dashboard after resolving all conflicts, with "Push now available" displayed, so that the workflow continues naturally.

- AC: After the last conflict is resolved, GET `/conflicts` returns an empty list.
- AC: The dashboard status badge updates to show the clean/push-ready state.
- Tests: `test_phase_f_conflicts.py::TestAllConflictsResolved`
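US-F2's side-by-side view is close to what the standard library's `difflib.HtmlDiff` produces out of the box. A sketch of generating the two-column fragment (the column headers and the `context`/`wrapcolumn` choices are ours):

```python
import difflib

def side_by_side(local: str, remote: str) -> str:
    """Render a two-column HTML diff table: local (main) vs. remote (outline)."""
    return difflib.HtmlDiff(wrapcolumn=80).make_table(
        local.splitlines(), remote.splitlines(),
        fromdesc="main (Obsidian)", todesc="outline (Outline)",
        context=True, numlines=3,
    )
```

`make_table` returns only the `<table>` fragment, which suits the `/diff/{encoded_path}` endpoint; the green/red highlighting from US-D3 comes from the `diff_add`/`diff_sub` CSS classes it emits.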
---

## Phase G — Sync History

**US-G1** — As a user, I want to view a chronological history of all sync operations, so that I can audit what changed and when.

- AC: GET `/history` returns HTTP 200 with HTML content.
- AC: Sync entries are shown in reverse chronological order.
- AC: Each entry shows: timestamp, direction (pull/push), files affected, status.
- Tests: `test_phase_g_history.py::TestHistoryPage`

**US-G2** — As a user, I want the history sourced from `_sync_log.md`, so that it remains readable as a plain Obsidian note.

- AC: `_sync_log.md` in the vault root is parsed into structured records.
- AC: Entries are displayed as an HTML table, not raw markdown.
- Tests: `test_phase_g_history.py::TestSyncLogParsing`
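US-G2 keeps `_sync_log.md` dual-purpose: a plain markdown table when read in Obsidian, structured records in the UI. Parsing the table format used by the test fixtures might look like this (the record keys are our choice):

```python
def parse_sync_log(text: str) -> list[dict]:
    """Parse the markdown table in _sync_log.md into history records."""
    records = []
    for line in text.splitlines():
        line = line.strip()
        # Skip headings, blank lines, and the |---| separator row.
        if not line.startswith("|") or set(line) <= {"|", "-", " "}:
            continue
        cells = [c.strip() for c in line.strip("|").split("|")]
        if cells[0] == "Timestamp":  # header row
            continue
        records.append(dict(zip(("timestamp", "direction", "files", "status"), cells)))
    # Newest first, per US-G1.
    return list(reversed(records))
```

Appending a sync entry is then a one-line write to the file, and `/history` renders the parsed records as an HTML table.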
---

## End-to-End Flows

**US-E2E1** — As a user, I want the full Obsidian → Outline flow to work without terminal access.

- Tests: `test_e2e.py::TestObsidianToOutlineFlow`

**US-E2E2** — As a user, I want the full Outline → Obsidian flow to work without terminal access.

- Tests: `test_e2e.py::TestOutlineToObsidianFlow`

**US-E2E3** — As a user, I want conflicts detected and resolvable end-to-end through the browser.

- Tests: `test_e2e.py::TestConflictResolutionFlow`

**US-E2E4** — As a user, I want a new file created in Obsidian to reach Outline with the correct hierarchy, frontmatter written back, and the ID visible in Obsidian on the next sync.

- Tests: `test_e2e.py::TestNewFileRoundTrip`
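US-E2E4's round trip depends on reading the `outline_id` back out of the frontmatter block the push wrote. A minimal frontmatter reader, without a YAML dependency — sufficient for the flat `key: value` blocks used throughout these fixtures:

```python
def read_frontmatter(text: str) -> dict[str, str]:
    """Extract flat key: value pairs from a leading '---' frontmatter block."""
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta: dict[str, str] = {}
    for line in lines[1:]:
        if line.strip() == "---":  # closing fence ends the block
            break
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta
```

A real implementation would likely use a YAML library; this sketch only illustrates the contract the E2E tests assert on.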
tests/conftest.py (new file, 270 lines)
"""
Shared fixtures for the Outline Sync Web UI test suite.

All tests work against the FastAPI app defined in webui.py (Phase B+).
Phase A tests (WebDAV) are integration tests marked with @pytest.mark.integration
and require a running WebDAV container — they are skipped in unit-test runs.

Run unit tests only:
    pytest tests/ -m "not integration"

Run integration tests (requires a running WebDAV container):
    pytest tests/ -m integration

Run everything:
    pytest tests/
"""

# Ensure the project root (webui.py) is importable before any test collection.
import sys
from pathlib import Path

_ROOT = Path(__file__).parent.parent
if str(_ROOT) not in sys.path:
    sys.path.insert(0, str(_ROOT))

import json
import subprocess
import textwrap
from typing import Generator

import pytest
import pytest_asyncio
from httpx import ASGITransport, AsyncClient

# ---------------------------------------------------------------------------
# pytest configuration
# ---------------------------------------------------------------------------

pytest_plugins = ["pytest_asyncio"]


def pytest_configure(config):
    config.addinivalue_line("markers", "integration: requires running infrastructure (WebDAV, Outline)")
    config.addinivalue_line("markers", "slow: longer-running tests")
# ---------------------------------------------------------------------------
# Git helpers
# ---------------------------------------------------------------------------

def git(vault: Path, *args) -> str:
    """Run a git command in the vault and return stripped stdout."""
    result = subprocess.run(
        ["git", "-C", str(vault), *args],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()


def git_config(vault: Path) -> None:
    """Set the commit identity required by the test repo."""
    git(vault, "config", "user.email", "test@sync.local")
    git(vault, "config", "user.name", "Test Runner")


def commit_all(vault: Path, message: str) -> None:
    """Stage everything in the vault and commit."""
    git(vault, "add", "-A")
    try:
        git(vault, "commit", "-m", message)
    except subprocess.CalledProcessError:
        pass  # nothing to commit — that's fine


# ---------------------------------------------------------------------------
# Auto-reset webui module state between tests
# ---------------------------------------------------------------------------

@pytest.fixture(autouse=True)
def reset_webui_state():
    """Clear _jobs and _active_job on the webui module after every test."""
    yield
    try:
        import webui as _webui
        _webui._jobs.clear()
        _webui._active_job = None
    except ImportError:
        pass
# ---------------------------------------------------------------------------
# Core fixtures
# ---------------------------------------------------------------------------

@pytest.fixture
def vault_dir(tmp_path) -> Path:
    """
    Temporary directory simulating /outline-vault/.
    Initialised as a git repo with 'outline' and 'main' branches.
    """
    vault = tmp_path / "outline-vault"
    vault.mkdir()

    subprocess.run(["git", "init", str(vault)], check=True, capture_output=True)
    git_config(vault)
    git(vault, "checkout", "-b", "outline")
    # Initial commit so the branches can diverge
    (vault / ".gitkeep").touch()
    commit_all(vault, "initial")
    git(vault, "checkout", "-b", "main")

    return vault


@pytest.fixture
def populated_vault(vault_dir: Path) -> Path:
    """
    Vault pre-loaded with sample markdown files on both branches,
    simulating the state after a previous successful sync.
    """
    # Create files on the 'outline' branch — the last known Outline state
    git(vault_dir, "checkout", "outline")
    (vault_dir / "Bewerbungen").mkdir()
    (vault_dir / "Bewerbungen" / "CV.md").write_text(textwrap.dedent("""\
        ---
        outline_id: doc-cv-001
        outline_collection_id: col-bew-001
        outline_updated_at: 2026-01-10T12:00:00Z
        ---
        # CV
        Original content.
        """))
    (vault_dir / "Infra").mkdir()
    (vault_dir / "Infra" / "HomeLab.md").write_text(textwrap.dedent("""\
        ---
        outline_id: doc-hl-002
        outline_collection_id: col-inf-001
        outline_updated_at: 2026-01-10T12:00:00Z
        ---
        # HomeLab
        Server setup notes.
        """))
    commit_all(vault_dir, "outline: sync 2026-01-10")

    # Merge into 'main' so both branches are identical at the start
    git(vault_dir, "checkout", "main")
    git(vault_dir, "merge", "outline", "--no-ff", "-m", "merge outline into main")

    return vault_dir
@pytest.fixture
def settings_file(vault_dir: Path, tmp_path: Path) -> Path:
    """settings.json pointing at the temp vault and a test Outline URL."""
    settings = {
        "source": {
            "url": "http://outline:3000",
            "token": "test-api-token-abc123",
        },
        "sync": {
            "vault_dir": str(vault_dir),
            "allow_deletions": False,
        },
    }
    path = tmp_path / "settings.json"
    path.write_text(json.dumps(settings, indent=2))
    return path


@pytest.fixture
def sync_log(vault_dir: Path) -> Path:
    """Pre-populate _sync_log.md with a few history entries."""
    log = vault_dir / "_sync_log.md"
    log.write_text(textwrap.dedent("""\
        # Sync Log

        | Timestamp | Direction | Files | Status |
        |-----------|-----------|-------|--------|
        | 2026-03-04 08:00 | pull | 0 changes | ok |
        | 2026-03-05 09:10 | push | 2 updated, 1 created | ok |
        | 2026-03-06 14:32 | pull | 3 updated | ok |
        """))
    return log
# ---------------------------------------------------------------------------
# App fixtures (Phases B–G)
# ---------------------------------------------------------------------------

@pytest.fixture
def app(vault_dir: Path, settings_file: Path):
    """
    FastAPI app instance with VAULT_DIR and SETTINGS_PATH overridden.
    Skips the fixture (and all tests using it) if webui.py is not yet written.
    """
    webui = pytest.importorskip("webui", reason="webui.py not yet implemented")
    webui.VAULT_DIR = vault_dir
    webui.SETTINGS_PATH = settings_file
    return webui.app


from typing import AsyncGenerator  # async fixtures are async generators, not plain Generators


@pytest_asyncio.fixture
async def client(app) -> AsyncGenerator[AsyncClient, None]:
    """Async HTTP test client bound to the FastAPI app."""
    async with AsyncClient(
        transport=ASGITransport(app=app), base_url="http://testserver"
    ) as c:
        yield c
# ---------------------------------------------------------------------------
# Vault state helpers
# ---------------------------------------------------------------------------

@pytest.fixture
def vault_with_pending(populated_vault: Path) -> Path:
    """
    Vault where the main branch has local edits not yet pushed to Outline.
    Simulates Obsidian having written files via WebDAV.
    """
    cv = populated_vault / "Bewerbungen" / "CV.md"
    cv.write_text(cv.read_text() + "\n## New Section\nAdded in Obsidian.\n")

    new_file = populated_vault / "Projekte" / "NewNote.md"
    new_file.parent.mkdir(exist_ok=True)
    new_file.write_text("# New Note\nWritten in Obsidian.\n")

    commit_all(populated_vault, "obsidian: local edits")
    return populated_vault


@pytest.fixture
def vault_with_conflict(populated_vault: Path) -> Path:
    """
    Vault in a post-merge-conflict state: CV.md has conflict markers on main.
    """
    cv = populated_vault / "Bewerbungen" / "CV.md"

    # Edit on the outline branch
    git(populated_vault, "checkout", "outline")
    cv.write_text(textwrap.dedent("""\
        ---
        outline_id: doc-cv-001
        outline_collection_id: col-bew-001
        outline_updated_at: 2026-03-06T11:03:00Z
        ---
        # CV
        Outline version with contact info updated.
        """))
    commit_all(populated_vault, "outline: CV contact update")

    # Conflicting edit on the main branch
    git(populated_vault, "checkout", "main")
    cv.write_text(textwrap.dedent("""\
        ---
        outline_id: doc-cv-001
        outline_collection_id: col-bew-001
        outline_updated_at: 2026-01-10T12:00:00Z
        ---
        # CV
        Local version with new section added.
        """))
    commit_all(populated_vault, "obsidian: CV new section")

    # Attempt the merge — this produces the conflict
    try:
        git(populated_vault, "merge", "outline")
    except subprocess.CalledProcessError:
        pass  # expected: merge conflict

    return populated_vault
tests/helpers.py (new file, 37 lines)
"""
Shared test helpers for the Outline Sync Web UI test suite.
Import directly: from tests.helpers import make_mock_process
"""

from unittest.mock import AsyncMock, MagicMock


class _AsyncLineIter:
    """Proper async iterator for mocking asyncio subprocess stdout."""

    def __init__(self, lines: list[str]):
        self._iter = iter(lines)

    def __aiter__(self):
        return self

    async def __anext__(self) -> bytes:
        try:
            return (next(self._iter) + "\n").encode()
        except StopIteration:
            raise StopAsyncIteration


def make_mock_process(stdout_lines: list[str], returncode: int = 0) -> MagicMock:
    """
    Build a mock asyncio subprocess whose stdout is a proper async iterable.

    Usage in tests:
        with patch("webui.spawn_sync_subprocess") as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done."])
    """
    proc = MagicMock()
    proc.returncode = returncode
    proc.stdout = _AsyncLineIter(stdout_lines)
    proc.wait = AsyncMock(return_value=returncode)
    return proc
tests/requirements-test.txt (new file, 6 lines)
pytest>=8.0
pytest-asyncio>=0.23
pytest-mock>=3.12
httpx>=0.27
anyio[trio]>=4.0
respx>=0.21
tests/test_e2e.py (new file, 412 lines)
"""
End-to-End Integration Tests — Full Workflow Scenarios

These tests simulate complete user workflows from start to finish.
They use real git operations against a temp vault but mock the Outline API
and WebDAV sync (since we do not have a live Outline server in CI).

For full E2E with a live Outline, use @pytest.mark.integration and
set the OUTLINE_URL / OUTLINE_TOKEN environment variables.

Scenarios covered:
    E2E-1: Obsidian → Outline (new file with frontmatter writeback)
    E2E-2: Outline → Obsidian (pull, file appears in vault)
    E2E-3: Conflict detection and resolution in browser
    E2E-4: New collection creation for unknown top-level folder
    E2E-5: Concurrent safety (only one job at a time)
    E2E-6: Full roundtrip (pull → edit → push → pull verifies no pending)
"""

import asyncio
import json
import subprocess
import sys
import textwrap
from pathlib import Path
from unittest.mock import AsyncMock, MagicMock, patch

import pytest

# Make tests/helpers.py importable regardless of where pytest is invoked.
sys.path.insert(0, str(Path(__file__).parent))
from helpers import make_mock_process  # noqa: E402

pytestmark = pytest.mark.asyncio
# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def git(vault: Path, *args) -> str:
    return subprocess.run(
        ["git", "-C", str(vault), *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()


def commit_all(vault: Path, message: str):
    subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
    try:
        subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
    except subprocess.CalledProcessError:
        pass


async def wait_for_job_done(client, job_id: str, timeout: float = 5.0) -> list[dict]:
    """Consume the SSE stream and collect all events until the 'done' event."""

    async def _consume() -> list[dict]:
        events = []
        async with client.stream("GET", f"/stream/{job_id}") as r:
            async for line in r.aiter_lines():
                if line.startswith("data:"):
                    try:
                        e = json.loads(line[5:].strip())
                        events.append(e)
                        if e.get("type") == "done":
                            break
                    except json.JSONDecodeError:
                        pass
        return events

    # Enforce the timeout so a stalled stream fails the test instead of hanging.
    return await asyncio.wait_for(_consume(), timeout=timeout)
# ---------------------------------------------------------------------------
# E2E-1: Obsidian → Outline (new file, frontmatter written back)
# ---------------------------------------------------------------------------

class TestObsidianToOutlineFlow:
    """
    Full flow: user creates file in Obsidian → WebDAV syncs to VPS →
    user clicks Push → sync engine creates document in Outline →
    frontmatter written back → file has outline_id.
    """

    async def test_new_file_reaches_outline_and_gets_id(
        self, client, populated_vault
    ):
        # Step 1: User creates a new note in Obsidian (WebDAV already synced it)
        new_file = populated_vault / "Projekte" / "E2E_NewDoc.md"
        new_file.parent.mkdir(exist_ok=True)
        new_file.write_text("# E2E New Doc\nContent written in Obsidian.\n")
        commit_all(populated_vault, "obsidian: new doc via webdav")

        # Step 2: Dashboard shows pending changes
        status = (await client.get("/status")).json()
        assert status["pending_count"] >= 1, "Dashboard must show pending changes"

        # Step 3: Changes page lists the new file
        changes = (await client.get("/changes")).json()
        new_item = next((i for i in changes if "E2E_NewDoc.md" in i["path"]), None)
        assert new_item is not None, "New file must appear in /changes"
        assert new_item["status"] == "added"

        # Step 4: User pushes — sync engine creates document and writes back ID
        fake_doc_id = "doc-e2e-new-001"

        def fake_push_subprocess(*args, **kwargs):
            new_file.write_text(textwrap.dedent(f"""\
                ---
                outline_id: {fake_doc_id}
                outline_collection_id: col-proj-001
                outline_updated_at: 2026-03-07T10:00:00Z
                ---
                # E2E New Doc
                Content written in Obsidian.
                """))
            commit_all(populated_vault, "sync: frontmatter writeback")
            return make_mock_process([
                f"ok: Projekte/E2E_NewDoc.md created (id: {fake_doc_id})",
                "Done. 1 created.",
            ])

        with patch("webui.spawn_sync_subprocess", side_effect=fake_push_subprocess):
            r = await client.post("/push")
            assert r.status_code in (200, 202)
            events = await wait_for_job_done(client, r.json()["job_id"])
            done = next((e for e in events if e.get("type") == "done"), None)
            assert done is not None

        # Step 5: File now has outline_id (will be served via WebDAV to Obsidian)
        content = new_file.read_text()
        assert "outline_id" in content, "File must have outline_id after push"
        assert fake_doc_id in content

        # Step 6: Status endpoint still responds after the push.
        # The exact pending_count after the frontmatter writeback depends on how
        # the implementation advances the outline branch, so only assert health.
        assert (await client.get("/status")).status_code == 200
# ---------------------------------------------------------------------------
# E2E-2: Outline → Obsidian (pull, file appears in vault)
# ---------------------------------------------------------------------------

class TestOutlineToObsidianFlow:
    """
    Full flow: document updated in Outline → user clicks Pull →
    SSE streams progress → file updated in vault → WebDAV serves it →
    Obsidian picks it up.
    """

    async def test_pull_updates_existing_document(self, client, populated_vault):
        # Initial state: CV.md exists in both branches (clean)
        status = (await client.get("/status")).json()
        assert status["pending_count"] == 0

        # Simulate pull: sync engine updates CV.md on the outline branch
        def fake_pull_subprocess(*args, **kwargs):
            git(populated_vault, "checkout", "outline")
            cv = populated_vault / "Bewerbungen" / "CV.md"
            cv.write_text(textwrap.dedent("""\
                ---
                outline_id: doc-cv-001
                outline_collection_id: col-bew-001
                outline_updated_at: 2026-03-07T09:00:00Z
                ---
                # CV
                Updated in Outline with new contact info.
                """))
            commit_all(populated_vault, "outline: CV updated")
            git(populated_vault, "checkout", "main")
            git(populated_vault, "merge", "outline", "--no-ff", "-m", "merge outline")
            return make_mock_process([
                "ok: Bewerbungen/CV.md updated",
                "Done. 1 updated.",
            ])

        with patch("webui.spawn_sync_subprocess", side_effect=fake_pull_subprocess):
            r = await client.post("/pull")
            assert r.status_code in (200, 202)
            events = await wait_for_job_done(client, r.json()["job_id"])

        done = next((e for e in events if e.get("type") == "done"), None)
        assert done is not None, "Pull must emit done event"

        # CV.md should now have the new content
        cv = populated_vault / "Bewerbungen" / "CV.md"
        assert "contact info" in cv.read_text(), (
            "CV.md must be updated in vault after pull"
        )

    async def test_pull_adds_new_document_to_vault(self, client, populated_vault):
        """A new document created in Outline must appear as a file after pull."""

        def fake_pull_subprocess(*args, **kwargs):
            git(populated_vault, "checkout", "outline")
            new_file = populated_vault / "Infra" / "NewServerDoc.md"
            new_file.write_text(textwrap.dedent("""\
                ---
                outline_id: doc-new-srv-001
                outline_collection_id: col-inf-001
                outline_updated_at: 2026-03-07T10:00:00Z
                ---
                # New Server Doc
                Created in Outline.
                """))
            commit_all(populated_vault, "outline: new server doc")
            git(populated_vault, "checkout", "main")
            git(populated_vault, "merge", "outline", "--no-ff", "-m", "merge")
            return make_mock_process([
                "ok: Infra/NewServerDoc.md created",
                "Done. 1 created.",
            ])

        with patch("webui.spawn_sync_subprocess", side_effect=fake_pull_subprocess):
            r = await client.post("/pull")
            await wait_for_job_done(client, r.json()["job_id"])

        new_file = populated_vault / "Infra" / "NewServerDoc.md"
        assert new_file.exists(), "New document from Outline must appear in vault"
        assert "outline_id: doc-new-srv-001" in new_file.read_text()
# ---------------------------------------------------------------------------
|
||||
# E2E-3: Conflict detection and resolution in browser
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestConflictResolutionFlow:
|
||||
"""
|
||||
Full flow: same file edited in both Obsidian and Outline →
|
||||
pull detects conflict → conflicts page shows → user resolves →
|
||||
push becomes available.
|
||||
"""
|
||||
|
||||
async def test_full_conflict_resolution_flow(self, client, vault_with_conflict):
|
||||
# Step 1: Verify conflicts are detected
|
||||
status = (await client.get("/status")).json()
|
||||
assert status["conflicts"] > 0, "Conflicts must be detected after merge conflict"
|
||||
|
||||
# Step 2: Push is blocked
|
||||
r = await client.post("/push")
|
||||
assert r.status_code == 409, "Push must be blocked while conflicts exist"
|
||||
|
||||
# Step 3: Conflicts page lists the file
|
||||
r = await client.get("/conflicts")
|
||||
conflicts = r.json()
|
||||
assert len(conflicts) > 0
|
||||
conflict_path = conflicts[0]["path"]
|
||||
|
||||
# Step 4: User views the diff
|
||||
import base64
|
||||
encoded = base64.urlsafe_b64encode(conflict_path.encode()).decode()
|
||||
r = await client.get(f"/diff/{encoded}")
|
||||
assert r.status_code == 200
|
||||
assert "text/html" in r.headers.get("content-type", "")
|
||||
|
||||
# Step 5: User chooses "Keep mine" (local)
|
||||
r = await client.post("/resolve", json={
|
||||
"file": conflict_path,
|
||||
"accept": "local",
|
||||
})
|
||||
assert r.status_code == 200
|
||||
|
||||
# Step 6: No more conflicts
|
||||
r = await client.get("/conflicts")
|
||||
assert r.json() == [], "All conflicts must be resolved"
|
||||
|
||||
# Step 7: No conflict markers in file
|
||||
content = (vault_with_conflict / conflict_path).read_text()
|
||||
assert "<<<<<<<" not in content
|
||||
|
||||
# Step 8: Push is now allowed
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
|
||||
r = await client.post("/push")
|
||||
assert r.status_code in (200, 202), "Push must succeed after conflicts resolved"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# E2E-4: New collection creation
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestNewCollectionFlow:
|
||||
"""
|
||||
User creates a new top-level folder in Obsidian.
|
||||
Push must create the collection AND the documents inside it.
|
||||
"""
|
||||
|
||||
async def test_new_collection_created_on_push(self, client, populated_vault):
|
||||
# Create new folder + file (simulating Obsidian + WebDAV sync)
|
||||
new_dir = populated_vault / "NewProject"
|
||||
new_dir.mkdir()
|
||||
(new_dir / "Overview.md").write_text("# Overview\nNew project.\n")
|
||||
(new_dir / "Notes.md").write_text("# Notes\nMeeting notes.\n")
|
||||
commit_all(populated_vault, "obsidian: new collection NewProject")
|
||||
|
||||
# Changes must flag both files as added
|
||||
changes = (await client.get("/changes")).json()
|
||||
new_items = [i for i in changes if "NewProject" in i["path"]]
|
||||
assert len(new_items) >= 2, "Both new files must appear in changes"
|
||||
|
||||
# Push — mock shows collection + documents created
|
||||
fake_col_id = "col-newproject-001"
|
||||
fake_push_lines = [
|
||||
f"ok: collection 'NewProject' created (id: {fake_col_id})",
|
||||
"ok: NewProject/Overview.md created (id: doc-overview-001)",
|
||||
"ok: NewProject/Notes.md created (id: doc-notes-001)",
|
||||
"Done. 2 created, 1 collection created.",
|
||||
]
|
||||
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(fake_push_lines)
|
||||
r = await client.post("/push")
|
||||
assert r.status_code in (200, 202)
|
||||
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||
all_text = json.dumps(events)
|
||||
assert "NewProject" in all_text or "collection" in all_text, (
|
||||
"New collection creation must be reflected in SSE output"
|
||||
)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# E2E-5: Concurrency safety
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestConcurrencySafety:
|
||||
|
||||
async def test_only_one_sync_job_at_a_time(self, client, populated_vault):
|
||||
"""
|
||||
Starting a second sync while first is pending/running returns 409.
|
||||
_active_job is set immediately by POST /pull, before the job starts.
|
||||
"""
|
||||
r1 = await client.post("/pull")
|
||||
assert r1.status_code in (200, 202)
|
||||
|
||||
r2 = await client.post("/pull")
|
||||
assert r2.status_code == 409, "Concurrent pull must be rejected"
|
||||
|
||||
r3 = await client.post("/push")
|
||||
assert r3.status_code == 409, "Concurrent push must be rejected"
|
||||
|
||||
async def test_new_job_accepted_after_previous_completes(
|
||||
self, client, populated_vault
|
||||
):
|
||||
"""After a job finishes, a new job must be accepted."""
|
||||
# Keep patch active while draining so the task can call spawn_sync_subprocess
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||
r1 = await client.post("/pull")
|
||||
assert r1.status_code in (200, 202)
|
||||
await wait_for_job_done(client, r1.json()["job_id"]) # drain → job completes
|
||||
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||
r2 = await client.post("/pull")
|
||||
assert r2.status_code in (200, 202), "New job must be accepted after first completes"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# E2E-6: Full roundtrip — pull → edit → push → no pending
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestFullRoundtrip:
|
||||
|
||||
async def test_pull_edit_push_leaves_clean_state(self, client, populated_vault):
|
||||
"""
|
||||
Complete happy-path cycle:
|
||||
1. Pull (no changes)
|
||||
2. Edit a file (simulate Obsidian)
|
||||
3. Push (sync engine updates Outline, writes back updated_at)
|
||||
4. Dashboard shows clean state
|
||||
"""
|
||||
# Step 1: Pull — nothing new
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||
r = await client.post("/pull")
|
||||
await wait_for_job_done(client, r.json()["job_id"])
|
||||
|
||||
# Step 2: User edits CV.md in Obsidian (simulated via direct write)
|
||||
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||
original = cv.read_text()
|
||||
cv.write_text(original.rstrip() + "\n\n## Skills\n- Python\n- Docker\n")
|
||||
commit_all(populated_vault, "obsidian: add skills section")
|
||||
|
||||
pending = (await client.get("/status")).json()["pending_count"]
|
||||
assert pending >= 1, "Edits must be pending after local change"
|
||||
|
||||
# Step 3: Push — sync engine updates Outline and refreshes updated_at
|
||||
def fake_push(*args, **kwargs):
|
||||
cv.write_text(cv.read_text().replace(
|
||||
"outline_updated_at: 2026-01-10T12:00:00Z",
|
||||
"outline_updated_at: 2026-03-07T11:30:00Z",
|
||||
))
|
||||
commit_all(populated_vault, "sync: advance outline branch after push")
|
||||
# Advance outline branch to match
|
||||
git(populated_vault, "checkout", "outline")
|
||||
cv_outline = populated_vault / "Bewerbungen" / "CV.md"
|
||||
cv_outline.write_text(cv.read_text())
|
||||
commit_all(populated_vault, "outline: updated from push")
|
||||
git(populated_vault, "checkout", "main")
|
||||
return make_mock_process(["ok: Bewerbungen/CV.md updated", "Done. 1 updated."])
|
||||
|
||||
with patch("webui.spawn_sync_subprocess", side_effect=fake_push):
|
||||
r = await client.post("/push")
|
||||
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||
|
||||
done = next((e for e in events if e.get("type") == "done"), None)
|
||||
assert done is not None, "Push must emit done event"
|
||||
|
||||
# Step 4: Verify no pending changes remain (outline branch == main)
|
||||
r_changes = await client.get("/changes")
|
||||
# Changes should be 0 or only include the updated_at change
|
||||
# which is an internal sync marker, not a user-content change
|
||||
status = (await client.get("/status")).json()
|
||||
assert status["conflicts"] == 0, "No conflicts must remain after clean push"
|
||||
221
tests/test_phase_a_webdav.py
Normal file
@@ -0,0 +1,221 @@
"""
Phase A — WebDAV Container Tests

Integration tests against a running obsidian-webdav container.
Skip these in CI unless the WebDAV service is available.

Run with:
    pytest tests/test_phase_a_webdav.py -m integration -v

Environment variables required:
    WEBDAV_URL    e.g. http://100.x.x.x (Tailscale IP)
    WEBDAV_USER   basic-auth username (default: obsidian)
    WEBDAV_PASS   basic-auth password
"""

import os
import uuid

import pytest
import requests

# ---------------------------------------------------------------------------
# Fixtures
# ---------------------------------------------------------------------------

pytestmark = pytest.mark.integration


def webdav_base() -> str:
    url = os.environ.get("WEBDAV_URL", "").rstrip("/")
    if not url:
        pytest.skip("WEBDAV_URL not set — skipping WebDAV integration tests")
    return url


def webdav_auth():
    return (
        os.environ.get("WEBDAV_USER", "obsidian"),
        os.environ.get("WEBDAV_PASS", ""),
    )


@pytest.fixture
def dav():
    """Requests session pre-configured for the WebDAV endpoint."""
    s = requests.Session()
    s.auth = webdav_auth()
    return s


@pytest.fixture
def test_filename():
    """Unique filename so parallel test runs do not collide."""
    return f"test_{uuid.uuid4().hex[:8]}.md"


# ---------------------------------------------------------------------------
# US-A2 — Authentication
# ---------------------------------------------------------------------------

class TestWebDAVAuth:

    def test_unauthenticated_request_returns_401(self):
        """GET without credentials must be rejected."""
        r = requests.get(webdav_base())
        assert r.status_code == 401, f"Expected 401, got {r.status_code}"

    def test_wrong_password_returns_401(self):
        r = requests.get(webdav_base(), auth=("obsidian", "wrong-password"))
        assert r.status_code == 401

    def test_valid_credentials_succeed(self, dav):
        r = dav.request("PROPFIND", webdav_base(), headers={"Depth": "0"})
        assert r.status_code in (200, 207), f"Expected 200/207, got {r.status_code}"


# ---------------------------------------------------------------------------
# US-A4 — WebDAV method support (PROPFIND, PUT, GET, DELETE)
# ---------------------------------------------------------------------------

class TestWebDAVMethods:

    def test_propfind_root_lists_contents(self, dav):
        """PROPFIND depth=1 must enumerate the root of the vault."""
        r = dav.request(
            "PROPFIND",
            webdav_base(),
            headers={"Depth": "1", "Content-Type": "application/xml"},
        )
        assert r.status_code in (200, 207)
        assert len(r.content) > 0

    def test_put_creates_file(self, dav, test_filename):
        url = f"{webdav_base()}/{test_filename}"
        content = b"# Test Note\nCreated by automated test.\n"
        r = dav.put(url, data=content)
        assert r.status_code in (200, 201, 204), f"PUT failed: {r.status_code}"

        # Verify it exists
        r2 = dav.get(url)
        assert r2.status_code == 200
        assert b"# Test Note" in r2.content

        # Cleanup
        dav.delete(url)

    def test_put_updates_existing_file(self, dav, test_filename):
        url = f"{webdav_base()}/{test_filename}"
        dav.put(url, data=b"v1 content")

        dav.put(url, data=b"v2 content updated")
        r = dav.get(url)
        assert b"v2 content" in r.content

        dav.delete(url)

    def test_delete_removes_file(self, dav, test_filename):
        url = f"{webdav_base()}/{test_filename}"
        dav.put(url, data=b"temporary file")

        r = dav.delete(url)
        assert r.status_code in (200, 204)

        r2 = dav.get(url)
        assert r2.status_code == 404

    def test_get_nonexistent_returns_404(self, dav):
        r = dav.get(f"{webdav_base()}/does_not_exist_{uuid.uuid4().hex}.md")
        assert r.status_code == 404

    def test_mkcol_creates_subdirectory(self, dav):
        dirname = f"test_dir_{uuid.uuid4().hex[:8]}"
        r = dav.request("MKCOL", f"{webdav_base()}/{dirname}")
        assert r.status_code in (200, 201, 207)

        r2 = dav.request("PROPFIND", f"{webdav_base()}/{dirname}", headers={"Depth": "0"})
        assert r2.status_code in (200, 207)

        dav.delete(f"{webdav_base()}/{dirname}")


# ---------------------------------------------------------------------------
# US-A3 — .git/ directory protection
# ---------------------------------------------------------------------------

class TestWebDAVGitProtection:

    def test_git_directory_is_inaccessible(self, dav):
        """The .git/ directory must not be served — nginx should deny it."""
        r = dav.request("PROPFIND", f"{webdav_base()}/.git/", headers={"Depth": "0"})
        assert r.status_code in (403, 404), (
            f"Expected 403/404 for /.git/ but got {r.status_code}. "
            "The WebDAV config must deny access to .git/"
        )

    def test_git_config_file_is_inaccessible(self, dav):
        r = dav.get(f"{webdav_base()}/.git/config")
        assert r.status_code in (403, 404)

    def test_git_head_file_is_inaccessible(self, dav):
        r = dav.get(f"{webdav_base()}/.git/HEAD")
        assert r.status_code in (403, 404)


# ---------------------------------------------------------------------------
# US-A1 — Bidirectional sync simulation
# ---------------------------------------------------------------------------

class TestWebDAVFileOps:

    def test_create_read_delete_roundtrip(self, dav, test_filename):
        """Full create → read → delete cycle for a markdown file."""
        url = f"{webdav_base()}/{test_filename}"
        body = "---\noutline_id: test-001\n---\n# Test\nContent.\n"

        # Create
        assert dav.put(url, data=body.encode()).status_code in (200, 201, 204)

        # Read back — content must match
        r = dav.get(url)
        assert r.status_code == 200
        assert "outline_id: test-001" in r.text

        # Delete
        assert dav.delete(url).status_code in (200, 204)

    def test_unicode_content_preserved(self, dav, test_filename):
        url = f"{webdav_base()}/{test_filename}"
        body = "# Ünïcödé Héadïng\n\nGerman: Straße, Chinese: 你好\n".encode("utf-8")

        dav.put(url, data=body)
        r = dav.get(url)
        assert "Straße" in r.text
        assert "你好" in r.text

        dav.delete(url)

    def test_large_file_survives_roundtrip(self, dav, test_filename):
        url = f"{webdav_base()}/{test_filename}"
        # 500 KB markdown file
        body = ("# Big Note\n" + "x" * 500_000).encode()

        dav.put(url, data=body)
        r = dav.get(url)
        assert len(r.content) >= 500_000

        dav.delete(url)

    def test_obsidian_settings_directory_can_be_excluded(self, dav):
        """
        Verify we can PUT to a path we'd want to ignore (.obsidian/)
        and then verify the nginx/webdav config (if configured) can block it.
        This test documents expected behaviour; if .obsidian/ IS accessible,
        it must be controlled at the Obsidian plugin level (ignore list).
        """
        # This is an informational check, not a hard assertion —
        # .obsidian/ exclusion is handled by the remotely-save plugin config.
        r = dav.request("PROPFIND", f"{webdav_base()}/.obsidian/", headers={"Depth": "0"})
        # 404 = not present (preferred), 403 = blocked, 207 = accessible
        # All are valid — the important thing is that the behaviour is documented
        assert r.status_code in (403, 404, 207)
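The `.git/` protection tests above assume the WebDAV server denies git internals at the config level (the docstring points at nginx). One possible nginx `location` block for that — a sketch only, since the actual container config is not part of this diff; returning 404 instead of 403 also avoids confirming that the directory exists:

```nginx
# Inside the server/location block that serves the vault via WebDAV:
# deny any path containing a /.git segment (the directory itself,
# .git/config, .git/HEAD, packed refs, and so on).
location ~ /\.git {
    return 404;
}
```

With this in place, `TestWebDAVGitProtection` passes via the 404 branch of its `(403, 404)` assertions.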
178
tests/test_phase_b_dashboard.py
Normal file
@@ -0,0 +1,178 @@
"""
Phase B — Read-Only Dashboard Tests

Tests for GET / (dashboard HTML) and GET /status (JSON API).
All tests use the FastAPI test client and mock git subprocess calls.
"""

import json
import subprocess
import textwrap
from unittest.mock import MagicMock, patch

import pytest
import pytest_asyncio

pytestmark = pytest.mark.asyncio


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def make_git_output(stdout: str = "", returncode: int = 0) -> MagicMock:
    m = MagicMock()
    m.stdout = stdout
    m.returncode = returncode
    return m


# ---------------------------------------------------------------------------
# US-B1 — Dashboard page renders
# ---------------------------------------------------------------------------

class TestDashboardPage:

    async def test_get_root_returns_200(self, client):
        r = await client.get("/")
        assert r.status_code == 200

    async def test_dashboard_returns_html(self, client):
        r = await client.get("/")
        assert "text/html" in r.headers.get("content-type", "")

    async def test_dashboard_contains_status_badge(self, client):
        r = await client.get("/")
        body = r.text.lower()
        # At minimum one of these status words must appear
        assert any(word in body for word in ("clean", "dirty", "conflict", "pending")), (
            "Dashboard must show a vault status badge"
        )

    async def test_dashboard_contains_pull_button(self, client):
        r = await client.get("/")
        body = r.text.lower()
        assert "pull" in body or "get from outline" in body

    async def test_dashboard_contains_push_button(self, client):
        r = await client.get("/")
        body = r.text.lower()
        assert "push" in body or "send to outline" in body


# ---------------------------------------------------------------------------
# US-B2 — Status endpoint returns structured JSON
# ---------------------------------------------------------------------------

class TestStatusEndpoint:

    async def test_status_returns_200(self, client):
        r = await client.get("/status")
        assert r.status_code == 200

    async def test_status_returns_json(self, client):
        r = await client.get("/status")
        assert "application/json" in r.headers.get("content-type", "")
        data = r.json()
        assert isinstance(data, dict)

    async def test_status_has_required_fields(self, client):
        r = await client.get("/status")
        data = r.json()
        assert "vault_status" in data, "Missing 'vault_status' field"
        assert "pending_count" in data, "Missing 'pending_count' field"
        assert "conflicts" in data, "Missing 'conflicts' field"
        assert "last_pull" in data, "Missing 'last_pull' field"
        assert "last_push" in data, "Missing 'last_push' field"

    async def test_status_pending_count_is_integer(self, client):
        r = await client.get("/status")
        data = r.json()
        assert isinstance(data["pending_count"], int)
        assert data["pending_count"] >= 0

    async def test_status_conflicts_is_integer(self, client):
        r = await client.get("/status")
        data = r.json()
        assert isinstance(data["conflicts"], int)
        assert data["conflicts"] >= 0

    async def test_status_clean_vault(self, client, populated_vault):
        """
        After a clean merge, pending_count should be 0 and conflicts 0.
        """
        r = await client.get("/status")
        data = r.json()
        assert data["pending_count"] == 0
        assert data["conflicts"] == 0
        assert data["vault_status"] in ("clean", "ok", "synced")

    async def test_status_with_pending_changes(self, client, vault_with_pending):
        """
        vault_with_pending has local edits on main — pending_count must be > 0.
        """
        r = await client.get("/status")
        data = r.json()
        assert data["pending_count"] > 0

    async def test_status_vault_status_dirty_when_pending(self, client, vault_with_pending):
        r = await client.get("/status")
        data = r.json()
        assert data["vault_status"] != "clean"


# ---------------------------------------------------------------------------
# US-B3 — Conflict warning badge
# ---------------------------------------------------------------------------

class TestConflictBadge:

    async def test_no_conflict_badge_when_clean(self, client, populated_vault):
        r = await client.get("/")
        # Should NOT contain a prominent conflict warning
        # (checking for the absence of conflict count > 0 in dashboard)
        data = (await client.get("/status")).json()
        assert data["conflicts"] == 0

    async def test_conflict_badge_visible_when_conflicts_exist(
        self, client, vault_with_conflict
    ):
        status = (await client.get("/status")).json()
        assert status["conflicts"] > 0, "Expected conflict count > 0"

        r = await client.get("/")
        body = r.text.lower()
        assert "conflict" in body, "Dashboard must show conflict warning when conflicts exist"

    async def test_conflict_badge_links_to_conflicts_page(self, client, vault_with_conflict):
        r = await client.get("/")
        assert "/conflicts" in r.text, "Conflict badge must link to /conflicts"


# ---------------------------------------------------------------------------
# US-B4 — Pending count reflects git diff
# ---------------------------------------------------------------------------

class TestPendingCount:

    async def test_pending_count_zero_on_clean_vault(self, client, populated_vault):
        r = await client.get("/status")
        assert r.json()["pending_count"] == 0

    async def test_pending_count_increases_with_local_edits(
        self, client, vault_with_pending
    ):
        r = await client.get("/status")
        # We added 1 modified file + 1 new file in vault_with_pending fixture
        assert r.json()["pending_count"] >= 2

    async def test_pending_count_shown_in_push_button(self, client, vault_with_pending):
        """Push button label should reflect pending count."""
        r = await client.get("/")
        body = r.text
        status = (await client.get("/status")).json()
        pending = status["pending_count"]
        # The pending count must appear somewhere near the push button
        assert str(pending) in body, (
            f"Push button should show pending count ({pending}) in label"
        )
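Taken together, the Phase B assertions pin down a minimal shape for the `/status` payload. A sketch of that implied contract — field names and value constraints come from the tests above, while the types of `last_pull`/`last_push` are an assumption (the real response model lives in `webui` and is not shown in this diff):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class StatusPayload:
    """Minimal /status contract implied by the Phase B tests."""
    vault_status: str          # "clean" / "ok" / "synced" when nothing pending, else e.g. "dirty"
    pending_count: int         # files changed on main vs. the outline branch, always >= 0
    conflicts: int             # unresolved merge-conflict files, always >= 0
    last_pull: Optional[str]   # assumed: ISO-8601 timestamp, or None if never pulled
    last_push: Optional[str]   # assumed: ISO-8601 timestamp, or None if never pushed
```

Keeping this shape in one place makes it easy to see why `test_status_has_required_fields` checks exactly those five keys.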
266
tests/test_phase_c_pull.py
Normal file
@@ -0,0 +1,266 @@
|
||||
"""
|
||||
Phase C — Pull with Live Output Tests
|
||||
|
||||
Tests for POST /pull (job start) and GET /stream/{job_id} (SSE streaming).
|
||||
The sync subprocess is mocked so tests do not require a live Outline instance.
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import json
|
||||
import sys
|
||||
from pathlib import Path
|
||||
from unittest.mock import AsyncMock, MagicMock, patch
|
||||
|
||||
import pytest
|
||||
|
||||
sys.path.insert(0, str(Path(__file__).parent))
|
||||
from helpers import make_mock_process # noqa: E402
|
||||
|
||||
pytestmark = pytest.mark.asyncio
|
||||
|
||||
|
||||
async def consume_sse(client, job_id: str, max_events: int = 50) -> list[dict]:
|
||||
"""Stream SSE events until 'done' or max_events reached."""
|
||||
events = []
|
||||
async with client.stream("GET", f"/stream/{job_id}") as r:
|
||||
assert r.status_code == 200
|
||||
assert "text/event-stream" in r.headers.get("content-type", "")
|
||||
async for line in r.aiter_lines():
|
||||
if line.startswith("data:"):
|
||||
try:
|
||||
events.append(json.loads(line[5:].strip()))
|
||||
except json.JSONDecodeError:
|
||||
events.append({"raw": line[5:].strip()})
|
||||
if events and events[-1].get("type") == "done":
|
||||
break
|
||||
if len(events) >= max_events:
|
||||
break
|
||||
return events
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# US-C1 — POST /pull starts a job and returns a job_id
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestPullJobCreation:
|
||||
|
||||
async def test_post_pull_returns_202(self, client):
|
||||
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||
r = await client.post("/pull")
|
||||
assert r.status_code in (200, 202)
|
||||
|
||||
async def test_post_pull_returns_job_id(self, client):
|
||||
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||
r = await client.post("/pull")
|
||||
data = r.json()
|
||||
assert "job_id" in data, "Response must include a job_id"
|
||||
assert isinstance(data["job_id"], str)
|
||||
assert len(data["job_id"]) > 0
|
||||
|
||||
async def test_post_pull_returns_stream_url(self, client):
|
||||
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||
r = await client.post("/pull")
|
||||
data = r.json()
|
||||
# Either stream_url or job_id is sufficient to construct the SSE URL
|
||||
assert "job_id" in data or "stream_url" in data
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# US-C1 — SSE stream emits progress events
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestPullStreaming:
|
||||
|
||||
async def test_stream_content_type_is_sse(self, client):
|
||||
"""GET /stream/{job_id} must return text/event-stream."""
|
||||
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||
r = await client.post("/pull")
|
||||
job_id = r.json()["job_id"]
|
||||
|
||||
async with client.stream("GET", f"/stream/{job_id}") as stream:
|
||||
assert "text/event-stream" in stream.headers.get("content-type", "")
|
||||
|
||||
async def test_stream_emits_data_events(self, client):
|
||||
"""Stream must yield at least one data event."""
|
||||
pull_lines = [
|
||||
"Fetching collections...",
|
||||
"Processing Bewerbungen/CV.md",
|
||||
"Processing Infra/HomeLab.md",
|
||||
"Done. 2 updated, 0 created.",
|
||||
]
|
||||
|
||||
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||
mock_spawn.return_value = make_mock_process(pull_lines)
|
||||
r = await client.post("/pull")
|
||||
job_id = r.json()["job_id"]
|
||||
events = await consume_sse(client, job_id)
|
||||
|
||||
        assert len(events) >= 1, "Stream must emit at least one event"

    async def test_stream_ends_with_done_event(self, client):
        """Last event in the stream must be type=done."""
        pull_lines = [
            "Fetching collections...",
            "Done. 1 updated.",
        ]

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(pull_lines)
            r = await client.post("/pull")
            job_id = r.json()["job_id"]
            events = await consume_sse(client, job_id)

        done_events = [e for e in events if e.get("type") == "done"]
        assert len(done_events) >= 1, "Stream must end with a 'done' event"

    async def test_stream_done_event_contains_summary(self, client):
        """The done event must include summary statistics."""
        pull_lines = [
            "Done. 2 updated, 1 created, 0 errors.",
        ]

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(pull_lines)
            r = await client.post("/pull")
            job_id = r.json()["job_id"]
            events = await consume_sse(client, job_id)

        done = next((e for e in events if e.get("type") == "done"), None)
        assert done is not None
        # Summary can be in 'message', 'summary', or top-level 'data' text
        summary_text = json.dumps(done)
        assert any(word in summary_text for word in ("updated", "created", "done", "0")), (
            "Done event must contain a summary"
        )

    async def test_stream_includes_per_file_events(self, client):
        """Each processed file should generate its own progress event."""
        pull_lines = [
            "processing: Bewerbungen/CV.md",
            "ok: Bewerbungen/CV.md updated",
            "processing: Infra/HomeLab.md",
            "ok: Infra/HomeLab.md updated",
            "Done. 2 updated.",
        ]

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(pull_lines)
            r = await client.post("/pull")
            job_id = r.json()["job_id"]
            events = await consume_sse(client, job_id)

        all_text = json.dumps(events)
        assert "CV.md" in all_text or "Bewerbungen" in all_text, (
            "Stream events should reference processed files"
        )

    async def test_stream_for_unknown_job_returns_404(self, client):
        r = await client.get("/stream/nonexistent-job-id-xyz")
        assert r.status_code == 404

    async def test_failed_sync_emits_error_event(self, client):
        """If the sync process exits with non-zero, stream must emit an error event."""
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(
                ["Error: API connection failed"], returncode=1
            )
            r = await client.post("/pull")
            job_id = r.json()["job_id"]
            events = await consume_sse(client, job_id)

        error_events = [e for e in events if e.get("type") in ("error", "done")]
        assert any(
            e.get("success") is False or e.get("type") == "error"
            for e in error_events
        ), "Failed sync must emit an error event"


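# The tests above patch webui.spawn_sync_subprocess with make_mock_process
# (tests/helpers.py). For readers of this file, here is a minimal sketch of
# the Process-like shape those tests assume; this illustrative fake is NOT
# used by the suite, and the real helper may differ in detail.
class _SketchFakeStdout:
    def __init__(self, lines):
        self._pending = [line.encode() + b"\n" for line in lines]

    async def readline(self) -> bytes:
        # Yield one queued line per call, then b"" to signal EOF
        return self._pending.pop(0) if self._pending else b""


class _SketchFakeProcess:
    def __init__(self, lines, returncode=0):
        self.stdout = _SketchFakeStdout(lines)
        self.returncode = returncode

    async def wait(self) -> int:
        return self.returncode

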
# ---------------------------------------------------------------------------
# US-C2 — Pull content actually updates vault files
# ---------------------------------------------------------------------------


class TestPullContent:

    async def test_pull_advances_outline_branch(self, client, populated_vault):
        """
        After a pull that introduces a new document, the outline branch
        must have a new commit compared to before.
        """
        import subprocess as sp

        before = sp.run(
            ["git", "-C", str(populated_vault), "rev-parse", "outline"],
            capture_output=True, text=True,
        ).stdout.strip()

        # Simulate pull writing a new file to outline branch
        new_file = populated_vault / "Projekte" / "FreshDoc.md"
        new_file.parent.mkdir(exist_ok=True)
        new_file.write_text("---\noutline_id: doc-new-001\n---\n# Fresh Doc\n")

        sp.run(["git", "-C", str(populated_vault), "checkout", "outline"], check=True, capture_output=True)
        sp.run(["git", "-C", str(populated_vault), "add", "-A"], check=True, capture_output=True)
        sp.run(["git", "-C", str(populated_vault), "commit", "-m", "outline: new doc"], check=True, capture_output=True)
        after = sp.run(
            ["git", "-C", str(populated_vault), "rev-parse", "outline"],
            capture_output=True, text=True,
        ).stdout.strip()

        assert before != after, "outline branch must advance after pull"


# ---------------------------------------------------------------------------
# US-C3 — Idempotent pull
# ---------------------------------------------------------------------------


class TestPullIdempotent:

    async def test_pull_with_no_changes_returns_success(self, client):
        """A pull against an empty diff must succeed (not error)."""
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(
                ["No changes from Outline.", "Done. 0 updated."], returncode=0
            )
            r = await client.post("/pull")
            assert r.status_code in (200, 202)

    async def test_pull_twice_is_safe(self, client):
        """Two sequential pulls must both succeed."""
        # Keep patch active until SSE stream finishes so the task can run
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
            r1 = await client.post("/pull")
            assert r1.status_code in (200, 202)
            await consume_sse(client, r1.json()["job_id"])  # drain → job completes

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
            r2 = await client.post("/pull")
            assert r2.status_code in (200, 202)


# ---------------------------------------------------------------------------
# US-C4 — Job lock prevents concurrent syncs
# ---------------------------------------------------------------------------


class TestJobLock:

    async def test_concurrent_pull_returns_409(self, client):
        """
        Starting a second pull while the first is pending/running returns 409.
        _active_job is set immediately when POST /pull is called.
        """
        r1 = await client.post("/pull")
        assert r1.status_code in (200, 202), "First pull must be accepted"

        # _active_job is now set — second pull must be rejected
        r2 = await client.post("/pull")
        assert r2.status_code == 409, (
            "Second pull while first is pending must return 409 Conflict"
        )

    async def test_push_while_pull_running_returns_409(self, client):
        """Push is also blocked while a pull is pending/running."""
        await client.post("/pull")
        r = await client.post("/push")
        assert r.status_code == 409

263
tests/test_phase_d_changes.py
Normal file
@@ -0,0 +1,263 @@
"""
Phase D — Pending Changes View Tests

Tests for GET /changes (structured change list) and GET /diff/{path} (inline diff).
Git operations run against the real temp vault — no subprocess mocking needed here.
"""

import base64
import subprocess
import textwrap
from pathlib import Path

import pytest

pytestmark = pytest.mark.asyncio


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def encode_path(path: str) -> str:
    """URL-safe base64 encoding of a path, matching what the app uses."""
    return base64.urlsafe_b64encode(path.encode()).decode()


def git(vault: Path, *args) -> str:
    return subprocess.run(
        ["git", "-C", str(vault), *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()


def commit_all(vault: Path, message: str):
    subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
    try:
        subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
    except subprocess.CalledProcessError:
        pass


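# Sanity check for the encode_path helper above: URL-safe base64 must not
# contain "/" and must round-trip through base64.urlsafe_b64decode. This only
# exercises the local helper; the app's decoder is assumed to match.
def test_encode_path_roundtrips_and_is_urlsafe():
    path = "Bewerbungen/CV.md"
    encoded = encode_path(path)
    assert "/" not in encoded, "URL-safe encoding must not contain '/'"
    assert base64.urlsafe_b64decode(encoded).decode() == path

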
# ---------------------------------------------------------------------------
# US-D1 — Changes endpoint structure
# ---------------------------------------------------------------------------


class TestChangesEndpoint:

    async def test_get_changes_returns_200(self, client):
        r = await client.get("/changes")
        assert r.status_code == 200

    async def test_changes_returns_json(self, client):
        r = await client.get("/changes")
        assert "application/json" in r.headers.get("content-type", "")

    async def test_changes_returns_list(self, client):
        r = await client.get("/changes")
        data = r.json()
        assert isinstance(data, list)

    async def test_changes_empty_when_clean(self, client, populated_vault):
        r = await client.get("/changes")
        assert r.json() == []

    async def test_each_change_has_required_fields(self, client, vault_with_pending):
        r = await client.get("/changes")
        items = r.json()
        assert len(items) > 0, "Expected pending changes"
        for item in items:
            assert "path" in item, f"Missing 'path' in item: {item}"
            assert "status" in item, f"Missing 'status' in item: {item}"
            assert "action" in item, f"Missing 'action' in item: {item}"

    async def test_status_values_are_valid(self, client, vault_with_pending):
        valid_statuses = {"modified", "added", "deleted", "renamed"}
        r = await client.get("/changes")
        for item in r.json():
            assert item["status"] in valid_statuses, (
                f"Invalid status '{item['status']}' — must be one of {valid_statuses}"
            )


# ---------------------------------------------------------------------------
# US-D2 — Change categories
# ---------------------------------------------------------------------------


class TestChangeCategories:

    async def test_modified_file_shown_as_modified(self, client, populated_vault):
        # Edit an existing file on main
        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.write_text(cv.read_text() + "\n## Appendix\nNew content.\n")
        commit_all(populated_vault, "obsidian: edit CV")

        r = await client.get("/changes")
        items = r.json()
        cv_item = next((i for i in items if "CV.md" in i["path"]), None)
        assert cv_item is not None, "CV.md must appear in changes"
        assert cv_item["status"] == "modified"

    async def test_new_file_shown_as_added(self, client, populated_vault):
        new_file = populated_vault / "Projekte" / "NewDoc.md"
        new_file.parent.mkdir(exist_ok=True)
        new_file.write_text("# New Doc\nWritten in Obsidian.\n")
        commit_all(populated_vault, "obsidian: new doc")

        r = await client.get("/changes")
        items = r.json()
        new_item = next((i for i in items if "NewDoc.md" in i["path"]), None)
        assert new_item is not None, "NewDoc.md must appear in changes"
        assert new_item["status"] == "added"

    async def test_deleted_file_shown_as_deleted(self, client, populated_vault):
        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.unlink()
        commit_all(populated_vault, "obsidian: delete CV")

        r = await client.get("/changes")
        items = r.json()
        del_item = next((i for i in items if "CV.md" in i["path"]), None)
        assert del_item is not None, "Deleted CV.md must appear in changes"
        assert del_item["status"] == "deleted"

    async def test_renamed_file_shown_as_renamed(self, client, populated_vault):
        old_path = populated_vault / "Bewerbungen" / "CV.md"
        new_path = populated_vault / "Bewerbungen" / "Curriculum Vitae.md"
        old_path.rename(new_path)
        commit_all(populated_vault, "obsidian: rename CV")

        r = await client.get("/changes")
        items = r.json()
        # Either old or new name should appear with renamed status
        renamed = [i for i in items if i["status"] == "renamed"]
        assert len(renamed) >= 1, "Renamed file must appear with status=renamed"
        # Check both from_path and to_path since either may contain "CV"
        all_paths = " ".join(
            str(i.get("from_path", "")) + " " + str(i.get("to_path", i["path"]))
            for i in renamed
        )
        assert "CV" in all_paths, "Renamed paths must reference the original filename"

    async def test_renamed_item_has_from_and_to_paths(self, client, populated_vault):
        old_path = populated_vault / "Bewerbungen" / "CV.md"
        new_path = populated_vault / "Bewerbungen" / "Resume.md"
        old_path.rename(new_path)
        commit_all(populated_vault, "obsidian: rename")

        r = await client.get("/changes")
        renamed = [i for i in r.json() if i["status"] == "renamed"]
        assert len(renamed) >= 1
        item = renamed[0]
        assert "from" in item or "from_path" in item, "Rename must include source path"
        assert "to" in item or "to_path" in item, "Rename must include destination path"


# ---------------------------------------------------------------------------
# US-D3 — Diff preview
# ---------------------------------------------------------------------------


class TestDiffPreview:

    async def test_diff_endpoint_returns_200(self, client, populated_vault):
        # Edit a file to create a diff
        cv = populated_vault / "Bewerbungen" / "CV.md"
        original = cv.read_text()
        cv.write_text(original + "\n## New Section\n")
        commit_all(populated_vault, "edit for diff")

        encoded = encode_path("Bewerbungen/CV.md")
        r = await client.get(f"/diff/{encoded}")
        assert r.status_code == 200

    async def test_diff_returns_html_fragment(self, client, populated_vault):
        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.write_text(cv.read_text() + "\nExtra line.\n")
        commit_all(populated_vault, "edit for diff")

        encoded = encode_path("Bewerbungen/CV.md")
        r = await client.get(f"/diff/{encoded}")
        assert "text/html" in r.headers.get("content-type", ""), (
            "Diff endpoint must return HTML"
        )

    async def test_diff_contains_two_columns(self, client, populated_vault):
        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.write_text(cv.read_text() + "\nAdded line.\n")
        commit_all(populated_vault, "edit for diff")

        encoded = encode_path("Bewerbungen/CV.md")
        r = await client.get(f"/diff/{encoded}")
        body = r.text.lower()
        # Two-column layout — check for table or grid structure
        assert "table" in body or "column" in body or "diff" in body, (
            "Diff HTML must contain a two-column comparison layout"
        )

    async def test_diff_for_unknown_file_returns_404(self, client, populated_vault):
        encoded = encode_path("DoesNotExist/ghost.md")
        r = await client.get(f"/diff/{encoded}")
        assert r.status_code == 404

    async def test_diff_added_lines_have_distinct_marking(self, client, populated_vault):
        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.write_text(cv.read_text() + "\nThis line was added.\n")
        commit_all(populated_vault, "add line")

        encoded = encode_path("Bewerbungen/CV.md")
        r = await client.get(f"/diff/{encoded}")
        body = r.text
        # Added lines must be visually distinct (green class, + prefix, or ins tag)
        # difflib.HtmlDiff marks added lines with class="diff_add"
        assert any(marker in body for marker in (
            'class="diff_add"', "diff_add", 'class="add"', "<ins>", "diff-add",
        )), "Added lines must be visually marked in the diff"


# ---------------------------------------------------------------------------
# US-D4 — Deleted files skipped when allow_deletions=false
# ---------------------------------------------------------------------------


class TestDeletedFilesSkipped:

    async def test_deleted_file_action_is_skip_when_deletions_off(
        self, client, populated_vault, settings_file
    ):
        """With allow_deletions=false in settings, deleted files must show action=skip."""
        import json
        settings = json.loads(settings_file.read_text())
        settings["sync"]["allow_deletions"] = False
        settings_file.write_text(json.dumps(settings))

        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.unlink()
        commit_all(populated_vault, "delete CV")

        r = await client.get("/changes")
        items = r.json()
        del_item = next((i for i in items if "CV.md" in i["path"]), None)
        assert del_item is not None
        assert del_item["action"] in ("skip", "skipped"), (
            "Deleted file must have action=skip when deletions are disabled"
        )

    async def test_deleted_file_action_is_delete_when_deletions_on(
        self, client, populated_vault, settings_file
    ):
        """With allow_deletions=true, deleted file action must be delete."""
        import json
        settings = json.loads(settings_file.read_text())
        settings["sync"]["allow_deletions"] = True
        settings_file.write_text(json.dumps(settings))

        cv = populated_vault / "Bewerbungen" / "CV.md"
        cv.unlink()
        commit_all(populated_vault, "delete CV")

        r = await client.get("/changes")
        items = r.json()
        del_item = next((i for i in items if "CV.md" in i["path"]), None)
        assert del_item is not None
        assert del_item["action"] in ("delete", "archive"), (
            "Deleted file must have action=delete when deletions are enabled"
        )

290
tests/test_phase_e_push.py
Normal file
@@ -0,0 +1,290 @@
"""
Phase E — Push with Live Output Tests

Tests for POST /push (job start) and the SSE stream.
The Outline API is mocked so tests do not require a live Outline instance.
"""

import asyncio
import json
import subprocess
import sys
import textwrap
from pathlib import Path
from unittest.mock import AsyncMock, MagicMock, patch

import pytest

sys.path.insert(0, str(Path(__file__).parent))
from helpers import make_mock_process  # noqa: E402

pytestmark = pytest.mark.asyncio


async def consume_sse(client, job_id: str, max_events: int = 100) -> list[dict]:
    events = []
    async with client.stream("GET", f"/stream/{job_id}") as r:
        async for line in r.aiter_lines():
            if line.startswith("data:"):
                try:
                    events.append(json.loads(line[5:].strip()))
                except json.JSONDecodeError:
                    events.append({"raw": line[5:].strip()})
                if events and events[-1].get("type") == "done":
                    break
                # Safety cap so a stream that never emits 'done' cannot
                # accumulate events indefinitely
                if len(events) >= max_events:
                    break
    return events


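# The SSE wire format consume_sse parses is assumed to look like the
# following (an assumption about webui's event schema, not a verified
# contract):
#
#   data: {"type": "log", "message": "processing: Bewerbungen/CV.md"}
#   data: {"type": "done", "success": true}
#
# Lines whose payload is not valid JSON are kept under a "raw" key so plain
# subprocess output still surfaces. The same rule, isolated for reference:
def _parse_sse_data_line(line: str) -> dict:
    payload = line[len("data:"):].strip()
    try:
        return json.loads(payload)
    except json.JSONDecodeError:
        return {"raw": payload}

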
def git(vault: Path, *args) -> str:
    return subprocess.run(
        ["git", "-C", str(vault), *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()


def commit_all(vault: Path, message: str):
    subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
    try:
        subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
    except subprocess.CalledProcessError:
        pass


# ---------------------------------------------------------------------------
# US-E1 — Push streaming
# ---------------------------------------------------------------------------


class TestPushStreaming:

    async def test_post_push_returns_job_id(self, client, vault_with_pending):
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done. 2 updated, 1 created."])
            r = await client.post("/push")
            assert r.status_code in (200, 202)
            assert "job_id" in r.json()

    async def test_stream_content_type_is_sse(self, client, vault_with_pending):
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done."])
            r = await client.post("/push")
            job_id = r.json()["job_id"]

            async with client.stream("GET", f"/stream/{job_id}") as stream:
                assert "text/event-stream" in stream.headers.get("content-type", "")

    async def test_stream_ends_with_done_event(self, client, vault_with_pending):
        push_lines = [
            "ok: Bewerbungen/CV.md updated",
            "ok: Projekte/NewNote.md created (id: abc123)",
            "Done. 1 updated, 1 created.",
        ]
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(push_lines)
            r = await client.post("/push")
            job_id = r.json()["job_id"]
            events = await consume_sse(client, job_id)

        done_events = [e for e in events if e.get("type") == "done"]
        assert len(done_events) == 1

    async def test_done_event_contains_summary_counts(self, client, vault_with_pending):
        push_lines = ["Done. 1 updated, 1 created, 0 skipped, 0 errors."]
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(push_lines)
            r = await client.post("/push")
            events = await consume_sse(client, r.json()["job_id"])
        done = next(e for e in events if e.get("type") == "done")
        summary = json.dumps(done)
        # Summary counts must appear somewhere in the done event
        assert any(k in summary for k in ("updated", "created", "skipped", "errors")), (
            "Done event must include summary counts"
        )

    async def test_per_file_events_emitted(self, client, vault_with_pending):
        push_lines = [
            "processing: Bewerbungen/CV.md",
            "ok: Bewerbungen/CV.md updated",
            "processing: Projekte/NewNote.md",
            "ok: Projekte/NewNote.md created (id: xyz789)",
            "Done.",
        ]
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(push_lines)
            r = await client.post("/push")
            events = await consume_sse(client, r.json()["job_id"])
        all_text = json.dumps(events)
        assert "CV.md" in all_text, "Events should mention CV.md"
        assert "NewNote.md" in all_text, "Events should mention NewNote.md"


# ---------------------------------------------------------------------------
# US-E2 — New file frontmatter writeback
# ---------------------------------------------------------------------------


class TestNewFileCreation:

    async def test_new_file_appears_in_pending_changes(self, client, populated_vault):
        new_file = populated_vault / "Projekte" / "BrandNew.md"
        new_file.parent.mkdir(exist_ok=True)
        new_file.write_text("# Brand New\nContent without frontmatter.\n")
        commit_all(populated_vault, "obsidian: new file")

        r = await client.get("/changes")
        items = r.json()
        new_item = next((i for i in items if "BrandNew.md" in i["path"]), None)
        assert new_item is not None
        assert new_item["status"] == "added"

    async def test_push_writes_frontmatter_back_to_new_file(
        self, client, populated_vault
    ):
        """
        After push, a new file must have frontmatter with outline_id injected.
        The mock simulates the sync engine writing back the ID.
        """
        new_file = populated_vault / "Projekte" / "FrontmatterTest.md"
        new_file.parent.mkdir(exist_ok=True)
        new_file.write_text("# Frontmatter Test\nNo ID yet.\n")
        commit_all(populated_vault, "obsidian: new file no frontmatter")

        fake_id = "doc-new-frontmatter-001"

        def fake_push(*args, **kwargs):
            # Simulate sync engine writing frontmatter back
            new_file.write_text(textwrap.dedent(f"""\
                ---
                outline_id: {fake_id}
                outline_collection_id: col-proj-001
                ---
                # Frontmatter Test
                No ID yet.
            """))
            commit_all(populated_vault, "sync: write back frontmatter")
            return make_mock_process([
                f"ok: Projekte/FrontmatterTest.md created (id: {fake_id})",
                "Done. 1 created.",
            ])

        # AsyncMock with side_effect so the awaited call resolves to the mock
        # process, matching the other tests in this file
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock, side_effect=fake_push):
            r = await client.post("/push")
            assert r.status_code in (200, 202)
            await consume_sse(client, r.json()["job_id"])

        content = new_file.read_text()
        assert "outline_id" in content, "Sync engine must write outline_id back to new file"


# ---------------------------------------------------------------------------
# US-E3 — Push blocked by conflicts
# ---------------------------------------------------------------------------


class TestPushBlockedByConflicts:

    async def test_push_returns_409_when_conflicts_exist(
        self, client, vault_with_conflict
    ):
        r = await client.post("/push")
        assert r.status_code == 409, (
            "Push must return 409 Conflict when unresolved merge conflicts exist"
        )

    async def test_push_409_response_includes_conflict_paths(
        self, client, vault_with_conflict
    ):
        r = await client.post("/push")
        assert r.status_code == 409
        body = r.json()
        assert "conflicts" in body or "files" in body or "message" in body, (
            "409 response must explain which files are conflicted"
        )

    async def test_push_allowed_after_conflicts_resolved(
        self, client, vault_with_conflict
    ):
        """Resolve the conflict, then push must be accepted."""
        # Resolve: check out local version
        subprocess.run(
            ["git", "-C", str(vault_with_conflict), "checkout", "--ours",
             "Bewerbungen/CV.md"],
            check=True, capture_output=True,
        )
        commit_all(vault_with_conflict, "resolve: keep ours")
        subprocess.run(
            ["git", "-C", str(vault_with_conflict), "merge", "--abort"],
            capture_output=True,
        )

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
            r = await client.post("/push")
            assert r.status_code in (200, 202), (
                "Push must be allowed after conflicts are resolved"
            )


# ---------------------------------------------------------------------------
# US-E4 — New collection creation
# ---------------------------------------------------------------------------


class TestNewCollectionCreation:

    async def test_new_top_level_folder_detected_as_new_collection(
        self, client, populated_vault
    ):
        """A new folder at the top level must appear in changes as a new collection."""
        new_doc = populated_vault / "NewCollection" / "FirstDoc.md"
        new_doc.parent.mkdir()
        new_doc.write_text("# First Doc\nNew collection content.\n")
        commit_all(populated_vault, "obsidian: new collection")

        r = await client.get("/changes")
        items = r.json()
        new_item = next((i for i in items if "FirstDoc.md" in i["path"]), None)
        assert new_item is not None
        # The action or a note must indicate a new collection will be created
        item_str = json.dumps(new_item)
        assert "collection" in item_str.lower() or new_item["status"] == "added", (
            "New file in unknown folder must be flagged as requiring new collection"
        )


# ---------------------------------------------------------------------------
# US-E5 — Rename handling
# ---------------------------------------------------------------------------


class TestRenameHandling:

    async def test_renamed_file_shown_in_changes(self, client, populated_vault):
        old = populated_vault / "Bewerbungen" / "CV.md"
        new = populated_vault / "Bewerbungen" / "Resume.md"
        old.rename(new)
        commit_all(populated_vault, "obsidian: rename CV to Resume")

        r = await client.get("/changes")
        items = r.json()
        renamed = [i for i in items if i["status"] == "renamed"]
        assert len(renamed) >= 1

    async def test_push_rename_uses_update_not_create(self, client, populated_vault):
        """
        The sync engine must call documents.update (not delete+create) for renames,
        preserving the Outline document ID.
        """
        old = populated_vault / "Bewerbungen" / "CV.md"
        new = populated_vault / "Bewerbungen" / "Resume.md"
        old.rename(new)
        commit_all(populated_vault, "obsidian: rename")

        push_lines = [
            "ok: Bewerbungen/Resume.md → title updated (id: doc-cv-001)",
            "Done. 1 renamed.",
        ]
        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(push_lines)
            r = await client.post("/push")
            events_raw = await consume_sse(client, r.json()["job_id"])
        all_text = json.dumps(events_raw)
        # Should not see "created" for a renamed document
        assert "doc-cv-001" in all_text or "renamed" in all_text or "updated" in all_text, (
            "Rename should update the existing document, not create a new one"
        )

354
tests/test_phase_f_conflicts.py
Normal file
@@ -0,0 +1,354 @@
"""
Phase F — Conflict Resolution Tests

Tests for GET /conflicts, GET /diff/{path}, and POST /resolve.
Uses the vault_with_conflict fixture which creates a real git merge conflict.
"""

import base64
import json
import subprocess
import sys
import textwrap
from pathlib import Path
from unittest.mock import AsyncMock, MagicMock, patch

import pytest

sys.path.insert(0, str(Path(__file__).parent))
from helpers import make_mock_process  # noqa: E402

pytestmark = pytest.mark.asyncio


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def encode_path(path: str) -> str:
    return base64.urlsafe_b64encode(path.encode()).decode()


def git(vault: Path, *args) -> str:
    return subprocess.run(
        ["git", "-C", str(vault), *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()


def commit_all(vault: Path, message: str):
    subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
    try:
        subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
    except subprocess.CalledProcessError:
        pass


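# The vault_with_conflict fixture is assumed to leave standard git conflict
# markers in the working tree, e.g.:
#
#   <<<<<<< HEAD
#   local Obsidian edit
#   =======
#   remote Outline edit
#   >>>>>>> outline
#
# A tiny illustrative helper (not used by the app) to spot such files:
def _has_conflict_markers(text: str) -> bool:
    # A conflicted file contains all three marker lines
    return all(marker in text for marker in ("<<<<<<<", "=======", ">>>>>>>"))

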
# ---------------------------------------------------------------------------
|
||||
# US-F1 — Conflicts list
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestConflictsList:
|
||||
|
||||
async def test_conflicts_returns_200(self, client):
|
||||
r = await client.get("/conflicts")
|
||||
assert r.status_code == 200
|
||||
|
||||
async def test_conflicts_returns_json(self, client):
|
||||
r = await client.get("/conflicts")
|
||||
assert "application/json" in r.headers.get("content-type", "")
|
||||
|
||||
async def test_conflicts_empty_when_clean(self, client, populated_vault):
|
||||
r = await client.get("/conflicts")
|
||||
data = r.json()
|
||||
assert isinstance(data, list)
|
||||
assert len(data) == 0
|
||||
|
||||
async def test_conflicts_lists_conflicted_files(self, client, vault_with_conflict):
|
||||
r = await client.get("/conflicts")
|
||||
data = r.json()
|
||||
assert len(data) >= 1, "Expected at least one conflict"
|
||||
paths = [item["path"] if isinstance(item, dict) else item for item in data]
|
||||
assert any("CV.md" in p for p in paths), "CV.md must appear in conflict list"
|
||||
|
||||
async def test_each_conflict_has_required_fields(self, client, vault_with_conflict):
|
||||
r = await client.get("/conflicts")
|
||||
for item in r.json():
|
||||
assert "path" in item, f"Missing 'path' in conflict item: {item}"
|
||||
# At minimum path is required; timestamps are recommended
|
||||
assert isinstance(item["path"], str)

    async def test_conflict_item_includes_timestamps(self, client, vault_with_conflict):
        """Conflict items should indicate when each side was last modified."""
        import warnings

        r = await client.get("/conflicts")
        items = r.json()
        assert len(items) >= 1
        item = items[0]
        # At least one timestamp or modification indicator should be present
        has_time = any(k in item for k in (
            "local_time", "remote_time", "local_updated", "outline_updated",
            "ours_time", "theirs_time",
        ))
        # Timestamps are recommended, not strictly required: warn instead of failing
        if not has_time:
            warnings.warn("conflict items carry no timestamps", UserWarning)
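The conflict list these tests exercise boils down to scanning the vault for files that still contain git conflict markers. A minimal sketch under that assumption (the `find_conflicts` name is hypothetical; the real `/conflicts` handler in webui may detect conflicts differently, e.g. via git status):

```python
from pathlib import Path


def find_conflicts(vault: Path) -> list[dict]:
    """Hypothetical helper: list vault files still carrying conflict markers."""
    items = []
    for md in sorted(vault.rglob("*.md")):
        text = md.read_text(errors="ignore")
        # A file counts as conflicted only if all three marker types remain.
        if "<<<<<<<" in text and "=======" in text and ">>>>>>>" in text:
            items.append({"path": str(md.relative_to(vault))})
    return items
```

Returning plain `{"path": ...}` dicts satisfies the minimum contract checked by `test_each_conflict_has_required_fields`; per-item timestamps can be layered on later.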


# ---------------------------------------------------------------------------
# US-F2 — Conflict diff view
# ---------------------------------------------------------------------------


class TestConflictDiff:

    async def test_diff_returns_200_for_conflict_file(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        conflict_path = r_conflicts.json()[0]["path"]
        encoded = encode_path(conflict_path)

        r = await client.get(f"/diff/{encoded}")
        assert r.status_code == 200

    async def test_diff_returns_html(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        r = await client.get(f"/diff/{encode_path(path)}")
        assert "text/html" in r.headers.get("content-type", "")

    async def test_diff_shows_both_versions(self, client, vault_with_conflict):
        """Both the local and Outline version must appear in the diff HTML."""
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        r = await client.get(f"/diff/{encode_path(path)}")
        body = r.text
        # The diff must show both sides — check for two-column markers or headings
        sides_shown = sum(1 for label in (
            "yours", "mine", "local", "obsidian",
            "outline", "remote", "theirs",
        ) if label in body.lower())
        assert sides_shown >= 2, (
            "Diff must label both sides (local/Obsidian and remote/Outline)"
        )

    async def test_diff_for_non_conflict_file_returns_404(
        self, client, vault_with_conflict
    ):
        r = await client.get(f"/diff/{encode_path('Infra/HomeLab.md')}")
        # HomeLab.md exists but is not conflicted. A conflicts-only diff endpoint
        # should return 404 here, while a general diff endpoint may return 200.
        # Both are accepted; document whichever behavior is implemented.
        assert r.status_code in (200, 404)

    async def test_diff_for_unknown_path_returns_404(self, client, vault_with_conflict):
        r = await client.get(f"/diff/{encode_path('ghost/file.md')}")
        assert r.status_code == 404


# ---------------------------------------------------------------------------
# US-F3 — Resolve: keep local version
# ---------------------------------------------------------------------------


class TestResolveLocal:

    async def test_resolve_local_returns_200(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        r = await client.post("/resolve", json={"file": path, "accept": "local"})
        assert r.status_code == 200

    async def test_resolve_local_removes_conflict_markers(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]
        vault = vault_with_conflict

        r = await client.post("/resolve", json={"file": path, "accept": "local"})
        assert r.status_code == 200

        content = (vault / path).read_text()
        assert "<<<<<<<" not in content, "Resolve must remove <<<<<<< markers"
        assert "=======" not in content, "Resolve must remove ======= markers"
        assert ">>>>>>>" not in content, "Resolve must remove >>>>>>> markers"

    async def test_resolve_local_keeps_local_content(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]
        vault = vault_with_conflict

        await client.post("/resolve", json={"file": path, "accept": "local"})
        content = (vault / path).read_text()
        # The local (Obsidian) version had "new section added"
        assert "new section" in content.lower() or "local version" in content.lower(), (
            "Resolving with 'local' must keep the Obsidian version content"
        )

    async def test_resolve_local_commits_to_main(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        before = git(vault_with_conflict, "rev-parse", "HEAD")
        await client.post("/resolve", json={"file": path, "accept": "local"})
        after = git(vault_with_conflict, "rev-parse", "HEAD")

        assert before != after, "Resolve must create a new commit on main"

    async def test_file_no_longer_in_conflicts_after_resolve(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        await client.post("/resolve", json={"file": path, "accept": "local"})

        r2 = await client.get("/conflicts")
        remaining_paths = [i["path"] for i in r2.json()]
        assert path not in remaining_paths, (
            "Resolved file must no longer appear in /conflicts"
        )


# ---------------------------------------------------------------------------
# US-F4 — Resolve: keep Outline's version
# ---------------------------------------------------------------------------


class TestResolveRemote:

    async def test_resolve_remote_returns_200(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        r = await client.post("/resolve", json={"file": path, "accept": "remote"})
        assert r.status_code == 200

    async def test_resolve_remote_removes_conflict_markers(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        await client.post("/resolve", json={"file": path, "accept": "remote"})
        content = (vault_with_conflict / path).read_text()
        assert "<<<<<<<" not in content
        assert ">>>>>>>" not in content

    async def test_resolve_remote_keeps_outline_content(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        await client.post("/resolve", json={"file": path, "accept": "remote"})
        content = (vault_with_conflict / path).read_text()
        # The Outline version had "contact info updated"
        assert "contact info" in content.lower() or "outline version" in content.lower(), (
            "Resolving with 'remote' must keep the Outline version content"
        )

    async def test_resolve_remote_commits_to_main(self, client, vault_with_conflict):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        before = git(vault_with_conflict, "rev-parse", "HEAD")
        await client.post("/resolve", json={"file": path, "accept": "remote"})
        after = git(vault_with_conflict, "rev-parse", "HEAD")

        assert before != after


# ---------------------------------------------------------------------------
# US-F5 — Input validation on /resolve
# ---------------------------------------------------------------------------


class TestResolveValidation:

    async def test_resolve_with_unknown_file_returns_422(
        self, client, vault_with_conflict
    ):
        r = await client.post("/resolve", json={
            "file": "NotInConflict/ghost.md",
            "accept": "local",
        })
        assert r.status_code in (404, 422), (
            "Resolving an unknown file must return 404 or 422"
        )

    async def test_resolve_with_path_traversal_returns_422(
        self, client, vault_with_conflict
    ):
        r = await client.post("/resolve", json={
            "file": "../../etc/passwd",
            "accept": "local",
        })
        assert r.status_code in (400, 404, 422), (
            "Path traversal must be rejected"
        )

    async def test_resolve_with_invalid_accept_value_returns_422(
        self, client, vault_with_conflict
    ):
        r_conflicts = await client.get("/conflicts")
        path = r_conflicts.json()[0]["path"]

        r = await client.post("/resolve", json={"file": path, "accept": "neither"})
        assert r.status_code == 422, (
            "'accept' must be 'local' or 'remote' — other values must be rejected"
        )

    async def test_resolve_missing_fields_returns_422(self, client, vault_with_conflict):
        r = await client.post("/resolve", json={"file": "something.md"})
        assert r.status_code == 422

    async def test_resolve_requires_json_body(self, client, vault_with_conflict):
        r = await client.post("/resolve")
        assert r.status_code in (400, 422)


# ---------------------------------------------------------------------------
# US-F6 — All conflicts resolved → push available
# ---------------------------------------------------------------------------


class TestAllConflictsResolved:

    async def test_conflicts_empty_after_all_resolved(
        self, client, vault_with_conflict
    ):
        r = await client.get("/conflicts")
        for item in r.json():
            await client.post("/resolve", json={"file": item["path"], "accept": "local"})

        r2 = await client.get("/conflicts")
        assert r2.json() == [], "No conflicts should remain after all are resolved"

    async def test_status_shows_clean_after_all_resolved(
        self, client, vault_with_conflict
    ):
        r = await client.get("/conflicts")
        for item in r.json():
            await client.post("/resolve", json={"file": item["path"], "accept": "local"})

        status = (await client.get("/status")).json()
        assert status["conflicts"] == 0

    async def test_push_allowed_after_all_conflicts_resolved(
        self, client, vault_with_conflict
    ):
        from unittest.mock import AsyncMock, patch

        r = await client.get("/conflicts")
        for item in r.json():
            await client.post("/resolve", json={"file": item["path"], "accept": "local"})

        with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
            mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
            r = await client.post("/push")
            assert r.status_code in (200, 202), (
                "Push must be allowed after all conflicts are resolved"
            )

182
tests/test_phase_g_history.py
Normal file
@@ -0,0 +1,182 @@
"""
Phase G — Sync History Tests

Tests for GET /history: rendering _sync_log.md as a reverse-chronological table.
"""

import textwrap
from pathlib import Path

import pytest

pytestmark = pytest.mark.asyncio


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

SAMPLE_LOG = textwrap.dedent("""\
    # Sync Log

    | Timestamp | Direction | Files | Status |
    |-----------|-----------|-------|--------|
    | 2026-03-03 22:15 | push | 1 updated | error: CV.md failed |
    | 2026-03-04 08:00 | pull | 0 changes | ok |
    | 2026-03-05 09:10 | push | 2 updated, 1 created | ok |
    | 2026-03-06 14:32 | pull | 3 updated | ok |
""")

MINIMAL_LOG = textwrap.dedent("""\
    # Sync Log

    | Timestamp | Direction | Files | Status |
    |-----------|-----------|-------|--------|
    | 2026-01-01 00:00 | pull | 1 updated | ok |
""")


# ---------------------------------------------------------------------------
# US-G1 — History page renders
# ---------------------------------------------------------------------------


class TestHistoryPage:

    async def test_history_returns_200(self, client):
        r = await client.get("/history")
        assert r.status_code == 200

    async def test_history_returns_html(self, client):
        r = await client.get("/history")
        assert "text/html" in r.headers.get("content-type", "")

    async def test_history_page_contains_table(self, client, vault_dir, sync_log):
        r = await client.get("/history")
        body = r.text.lower()
        assert "<table" in body, "History page must render an HTML table"

    async def test_history_shows_direction_labels(self, client, vault_dir):
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text.lower()
        assert "pull" in body, "History must show pull entries"
        assert "push" in body, "History must show push entries"

    async def test_history_shows_timestamps(self, client, vault_dir):
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text
        assert "2026-03-06" in body, "History must show timestamps from _sync_log.md"

    async def test_history_shows_file_counts(self, client, vault_dir):
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text.lower()
        assert "updated" in body or "created" in body, (
            "History must show file change counts"
        )

    async def test_history_shows_status(self, client, vault_dir):
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text.lower()
        assert "ok" in body or "error" in body, "History must show entry status"

    async def test_history_empty_when_no_log(self, client, vault_dir):
        """If _sync_log.md does not exist, page should render gracefully (no 500)."""
        log_path = vault_dir / "_sync_log.md"
        if log_path.exists():
            log_path.unlink()

        r = await client.get("/history")
        assert r.status_code == 200, "History page must not crash when log is missing"

    async def test_history_empty_state_message(self, client, vault_dir):
        """Empty history should show a helpful message, not a blank page."""
        log_path = vault_dir / "_sync_log.md"
        if log_path.exists():
            log_path.unlink()

        r = await client.get("/history")
        body = r.text.lower()
        assert any(phrase in body for phrase in (
            "no history", "no sync", "empty", "no entries", "nothing yet"
        )), "Empty history must show a message"


# ---------------------------------------------------------------------------
# US-G2 — _sync_log.md parsing
# ---------------------------------------------------------------------------


class TestSyncLogParsing:

    async def test_entries_shown_in_reverse_chronological_order(
        self, client, vault_dir
    ):
        """Most recent entry must appear before older entries in the HTML."""
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text

        pos_newest = body.find("2026-03-06")
        pos_oldest = body.find("2026-03-03")

        assert pos_newest != -1, "Most recent entry must appear in history"
        assert pos_oldest != -1, "Oldest entry must appear in history"
        assert pos_newest < pos_oldest, (
            "Most recent entry (2026-03-06) must appear before oldest (2026-03-03)"
        )

    async def test_error_entries_visually_distinct(self, client, vault_dir):
        """Entries with non-ok status should be highlighted differently."""
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text.lower()
        # The error entry from 2026-03-03 should have visual distinction.
        # Checked loosely: the error text itself must at least be present.
        assert "error" in body, "Error entries must be shown in history"

    async def test_raw_markdown_not_shown_as_pipe_table(self, client, vault_dir):
        """The raw markdown pipe-table syntax must not be visible in rendered output."""
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text
        # Pipe characters from the markdown table should NOT appear verbatim
        # (they should be parsed and rendered as an HTML <table>)
        raw_table_lines = [line for line in body.splitlines() if line.strip().startswith("|---")]
        assert len(raw_table_lines) == 0, (
            "Raw markdown table separator lines must not appear in rendered HTML"
        )

    async def test_all_log_entries_appear(self, client, vault_dir):
        """All 4 entries in SAMPLE_LOG must appear in the rendered history."""
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history")
        body = r.text

        assert "2026-03-06" in body
        assert "2026-03-05" in body
        assert "2026-03-04" in body
        assert "2026-03-03" in body

    async def test_single_entry_log_renders(self, client, vault_dir):
        (vault_dir / "_sync_log.md").write_text(MINIMAL_LOG)
        r = await client.get("/history")
        assert r.status_code == 200
        assert "2026-01-01" in r.text

    async def test_history_api_endpoint_returns_json(self, client, vault_dir):
        """
        GET /history?format=json returns structured history data.
        This is optional but strongly recommended for future HTMX updates.
        """
        (vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
        r = await client.get("/history?format=json")
        # If not implemented, a plain 200 HTML response is also acceptable
        if r.status_code == 200 and "application/json" in r.headers.get("content-type", ""):
            data = r.json()
            assert isinstance(data, list)
            assert len(data) >= 4
            for entry in data:
                assert "timestamp" in entry or "date" in entry
                assert "direction" in entry