Add sync engine, web UI, Docker setup, and tests
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
This commit is contained in:
8
.gitignore
vendored
8
.gitignore
vendored
@@ -6,6 +6,14 @@ outline_export/
|
|||||||
|
|
||||||
# Backups
|
# Backups
|
||||||
outline_backup_*.tar.gz
|
outline_backup_*.tar.gz
|
||||||
|
exports_backup_*.tar.gz
|
||||||
|
|
||||||
|
# Export data
|
||||||
|
exports/
|
||||||
|
outline_tree_*.txt
|
||||||
|
|
||||||
|
# Python venv
|
||||||
|
.venv/
|
||||||
|
|
||||||
# Python
|
# Python
|
||||||
__pycache__/
|
__pycache__/
|
||||||
|
|||||||
536
AGENTS.md
Normal file
536
AGENTS.md
Normal file
@@ -0,0 +1,536 @@
|
|||||||
|
# AGENTS.md
|
||||||
|
|
||||||
|
This file provides guidance to AI agents (Claude, GPT-4, etc.) when working with this Outline export/import tool repository.
|
||||||
|
|
||||||
|
> **For all technical details, architecture info, common pitfalls, and code patterns, see this file.**
|
||||||
|
|
||||||
|
## Quick Reference
|
||||||
|
|
||||||
|
**Primary Language**: Python 3.11 + Bash
|
||||||
|
**Key Dependencies**: `requests`, `tqdm`
|
||||||
|
**Runtime**: Docker containers on `domnet` network
|
||||||
|
**API Base**: `http://outline:3000` (internal, bypasses SSO)
|
||||||
|
|
||||||
|
**Key Features**:
|
||||||
|
- Export all collections with full document hierarchy
|
||||||
|
- Import back to Outline preserving structure
|
||||||
|
- Automatic backups with 90%+ compression
|
||||||
|
- Dry-run mode for safe testing
|
||||||
|
- Retry logic for API reliability
|
||||||
|
|
||||||
|
## Usage
|
||||||
|
|
||||||
|
### Export (Backup)
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Run the export with tree visualization
|
||||||
|
./export_with_trees.sh
|
||||||
|
|
||||||
|
# Preview without exporting (dry run)
|
||||||
|
./export_with_trees.sh --dry-run
|
||||||
|
|
||||||
|
# Run with verbose output
|
||||||
|
./export_with_trees.sh -v
|
||||||
|
```
|
||||||
|
|
||||||
|
**Export CLI Options**:
|
||||||
|
```
|
||||||
|
--dry-run, -n Preview what would be exported without writing files
|
||||||
|
--output, -o DIR Output directory (overrides settings.json)
|
||||||
|
--verbose, -v Increase verbosity (-vv for debug)
|
||||||
|
--skip-verify Skip post-export verification
|
||||||
|
--skip-health-check Skip pre-export health check
|
||||||
|
--settings FILE Path to settings file (default: settings.json)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Import (Restore)
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Import all collections from outline_export/
|
||||||
|
./import_to_outline.sh
|
||||||
|
|
||||||
|
# Preview what would be imported (no changes made)
|
||||||
|
./import_to_outline.sh --dry-run
|
||||||
|
|
||||||
|
# Import into a single timestamped collection
|
||||||
|
./import_to_outline.sh --single
|
||||||
|
|
||||||
|
# Import from a different directory
|
||||||
|
./import_to_outline.sh -d exports/
|
||||||
|
|
||||||
|
# Overwrite existing collections
|
||||||
|
./import_to_outline.sh --force
|
||||||
|
```
|
||||||
|
|
||||||
|
**Import CLI Options**:
|
||||||
|
```
|
||||||
|
-s, --single Import all into single timestamped collection
|
||||||
|
-n, --dry-run Preview operations without making changes
|
||||||
|
-d, --source DIR Source directory (default: outline_export)
|
||||||
|
-v, --verbose Increase verbosity (-vv for debug)
|
||||||
|
-f, --force Overwrite existing collections (instead of skip)
|
||||||
|
--settings FILE Path to settings file (default: settings.json)
|
||||||
|
-h, --help Show help message
|
||||||
|
```
|
||||||
|
|
||||||
|
### Running Python Scripts Directly
|
||||||
|
|
||||||
|
If you need to run the Python scripts directly (e.g., for debugging):
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Export
|
||||||
|
docker run --rm --network domnet \
|
||||||
|
-v "$(pwd):/work" \
|
||||||
|
-w /work \
|
||||||
|
python:3.11-slim \
|
||||||
|
bash -c "pip install -q requests tqdm && python3 outline_export_fixed.py --dry-run"
|
||||||
|
|
||||||
|
# Import
|
||||||
|
docker run --rm --network domnet \
|
||||||
|
-v "$(pwd):/work" \
|
||||||
|
-w /work \
|
||||||
|
python:3.11-slim \
|
||||||
|
bash -c "pip install -q requests tqdm && python3 outline_import.py --dry-run"
|
||||||
|
```
|
||||||
|
|
||||||
|
**Note**: The shell wrappers (`export_with_trees.sh`, `import_to_outline.sh`) provide better UX with tree visualization and colored output.
|
||||||
|
|
||||||
|
## Agent Operating Guidelines
|
||||||
|
|
||||||
|
### 1. Configuration
|
||||||
|
|
||||||
|
Settings are in `settings.json`:
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"source": {
|
||||||
|
"url": "http://outline:3000",
|
||||||
|
"token": "your-api-token-here"
|
||||||
|
},
|
||||||
|
"export": {
|
||||||
|
"output_directory": "outline_export"
|
||||||
|
},
|
||||||
|
"advanced": {
|
||||||
|
"max_hierarchy_depth": 100
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Important**: `settings.json` contains secrets (API token) and should **never be committed to git**.
|
||||||
|
|
||||||
|
### 2. Architecture Understanding
|
||||||
|
|
||||||
|
This tool operates in a **Docker-isolated environment** to bypass Authentik SSO:
|
||||||
|
- All Python scripts run inside ephemeral Docker containers
|
||||||
|
- Network: `domnet` bridge allows direct access to Outline's internal API
|
||||||
|
- No persistent container state - dependencies installed on each run
|
||||||
|
|
||||||
|
**Critical Context**:
|
||||||
|
- The `http://outline:3000` URL only works inside the Docker network
|
||||||
|
- External access would require SSO authentication through Authentik
|
||||||
|
- This design is intentional for automated backup/restore operations
|
||||||
|
|
||||||
|
#### Export Flow
|
||||||
|
|
||||||
|
1. **Health Check**: Verify API connectivity
|
||||||
|
2. **Fetch Collections**: Via `/api/collections.list`
|
||||||
|
3. **Build Tree**: Get navigation tree via `/api/collections.documents` (source of truth for hierarchy)
|
||||||
|
4. **Fetch Content**: Full document content via `/api/documents.info` (with caching)
|
||||||
|
5. **Export Recursively**: Maintain parent-child structure
|
||||||
|
6. **Save Metadata**: `_collection_metadata.json` per collection
|
||||||
|
7. **Create Backup**: Archive previous export to `outline_backup_*.tar.gz`
|
||||||
|
8. **Verify**: Generate manifest with checksums
|
||||||
|
|
||||||
|
#### Import Flow
|
||||||
|
|
||||||
|
1. **Health Check**: Verify API connectivity
|
||||||
|
2. **Load Metadata**: Read `_collection_metadata.json` from each collection directory
|
||||||
|
3. **Build Tree**: Reconstruct document hierarchy from metadata
|
||||||
|
4. **Create Collections**: Via `/api/collections.create`
|
||||||
|
5. **Create Documents**: Via `/api/documents.create` with proper `parentDocumentId`
|
||||||
|
6. **Map IDs**: Track old IDs → new IDs to maintain hierarchy
|
||||||
|
7. **Display Progress**: Tree-style output with status indicators
|
||||||
|
|
||||||
|
#### Core Components Pipelines
|
||||||
|
|
||||||
|
**Export Pipeline:**
|
||||||
|
```
|
||||||
|
export_with_trees.sh → Docker container → outline_export_fixed.py
|
||||||
|
↓
|
||||||
|
Fetches collections → Builds document tree → Exports markdown + metadata
|
||||||
|
↓
|
||||||
|
Creates backup → Verifies integrity → Displays summary
|
||||||
|
```
|
||||||
|
|
||||||
|
**Import Pipeline:**
|
||||||
|
```
|
||||||
|
import_to_outline.sh → Docker container → outline_import.py
|
||||||
|
↓
|
||||||
|
Reads metadata → Validates structure → Creates collections
|
||||||
|
↓
|
||||||
|
Uploads documents → Maintains hierarchy → Reports status
|
||||||
|
```
|
||||||
|
|
||||||
|
### 3. Import Modes
|
||||||
|
|
||||||
|
Each subdirectory becomes a separate collection:
|
||||||
|
|
||||||
|
```
|
||||||
|
outline_export/
|
||||||
|
├── Bewerbungen/ → Creates "Bewerbungen" collection
|
||||||
|
├── Projekte/ → Creates "Projekte" collection
|
||||||
|
└── Privat/ → Creates "Privat" collection
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Single Collection (`--single`)
|
||||||
|
|
||||||
|
All content goes into one timestamped collection:
|
||||||
|
|
||||||
|
```
|
||||||
|
outline_export/
|
||||||
|
├── Bewerbungen/ → Becomes parent doc "Bewerbungen"
|
||||||
|
├── Projekte/ → Becomes parent doc "Projekte"
|
||||||
|
└── Privat/ → Becomes parent doc "Privat"
|
||||||
|
|
||||||
|
All imported into: "import_20260119_143052" collection
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. Behavior & Duplicate Handling
|
||||||
|
|
||||||
|
#### Duplicate Handling
|
||||||
|
|
||||||
|
| Scenario | Default Behavior | With `--force` |
|
||||||
|
|----------|------------------|----------------|
|
||||||
|
| Collection exists | Skip entire collection | Delete and recreate |
|
||||||
|
| Document exists | Skip document | Update document |
|
||||||
|
|
||||||
|
#### Error Handling
|
||||||
|
|
||||||
|
**Import Errors**:
|
||||||
|
- **API connection failure**: Abort with error message
|
||||||
|
- **Collection creation fails**: Abort that collection, continue others
|
||||||
|
- **Document creation fails**: Log error, continue with siblings
|
||||||
|
- **Missing markdown file**: Log warning, skip document
|
||||||
|
- **Parent not found**: Create as root-level document
|
||||||
|
|
||||||
|
**Export Errors**:
|
||||||
|
- **API connection failure**: Abort before starting
|
||||||
|
- **Collection fetch fails**: Skip that collection, continue
|
||||||
|
- **Document fetch fails**: Retry 3x with backoff, then skip
|
||||||
|
- **Disk write fails**: Abort with error message
|
||||||
|
|
||||||
|
#### Rate Limiting
|
||||||
|
|
||||||
|
If Outline API returns 429 errors:
|
||||||
|
- Automatic retry with exponential backoff
|
||||||
|
- Up to 3 retry attempts per request
|
||||||
|
- Configurable delay between retries
|
||||||
|
|
||||||
|
#### Important Features & Behaviors
|
||||||
|
|
||||||
|
**Backup System**:
|
||||||
|
- Each export automatically backs up previous exports to `outline_backup_YYYYMMDD_HHMMSS.tar.gz`
|
||||||
|
- Old uncompressed export directory is deleted after backup
|
||||||
|
- Backups achieve **90%+ compression** on markdown content
|
||||||
|
- Safe to re-run exports - previous data is always preserved
|
||||||
|
|
||||||
|
**Reliability Features**:
|
||||||
|
- **Health check**: Verifies API connectivity before operations
|
||||||
|
- **Retry logic**: Failed API requests retry up to 3 times with exponential backoff
|
||||||
|
- **Caching**: Document content cached during single run to reduce API calls
|
||||||
|
- **Logging**: Structured logging with configurable verbosity levels (-v, -vv)
|
||||||
|
|
||||||
|
**Hierarchy Integrity**:
|
||||||
|
- The navigation tree (`/api/collections.documents`) is the **source of truth** for document hierarchy
|
||||||
|
- Import maintains parent-child relationships via `parentDocumentId` mapping
|
||||||
|
- Document counting is recursive to include all nested children
|
||||||
|
- Maximum depth limit (default: 100) prevents infinite recursion
|
||||||
|
|
||||||
|
### 5. File Structure Knowledge
|
||||||
|
|
||||||
|
```
|
||||||
|
outline-tools/
|
||||||
|
├── export_with_trees.sh # Main export entrypoint
|
||||||
|
|
||||||
|
#### Dry Run Testing
|
||||||
|
```bash
|
||||||
|
# Test export without writing files
|
||||||
|
./export_with_trees.sh --dry-run
|
||||||
|
|
||||||
|
# Test import without creating collections
|
||||||
|
./import_to_outline.sh --dry-run
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Verification Checklist
|
||||||
|
- [ ] Health check passes before export/import
|
||||||
|
- [ ] Document count matches (compare tree output)
|
||||||
|
- [ ] Hierarchy preserved (check parent-child relationships)
|
||||||
|
- [ ] Metadata files valid JSON
|
||||||
|
- [ ] No API errors in logs
|
||||||
|
- [ ] Backup created successfully (export only)
|
||||||
|
|
||||||
|
### 8. Troubleshooting & Debug Mode
|
||||||
|
|
||||||
|
#### Common Issues
|
||||||
|
|
||||||
|
**"Connection refused" or "Name resolution failed"**
|
||||||
|
- **Cause**: Not running inside `domnet` Docker network
|
||||||
|
- **Solution**: Always use shell wrappers (`export_with_trees.sh`, `import_to_outline.sh`)
|
||||||
|
|
||||||
|
**"Authorization failed" or 401/403 errors**
|
||||||
|
- **Cause**: Invalid or expired API token
|
||||||
|
- **Solution**: Update token in `settings.json`
|
||||||
|
|
||||||
|
**Documents appear at wrong hierarchy level after import**
|
||||||
|
- **Cause**: Metadata corruption or `parentDocumentId` mapping issue
|
||||||
|
- **Solution**: Re-export, verify `_collection_metadata.json` integrity, check `id_mapping` dictionary
|
||||||
|
|
||||||
|
**Import creates duplicate collections**
|
||||||
|
- **Cause**: Collection names differ (case, spaces, special chars)
|
||||||
|
- **Solution**: Use `--force` to replace, or manually delete old collections
|
||||||
|
|
||||||
|
**API returns 429 errors**
|
||||||
|
- **Cause**: Rate limiting from too many API requests
|
||||||
|
- **Solution**: Built-in retry logic handles this - increase `RETRY_DELAY` if persistent
|
||||||
|
|
||||||
|
#### Debug Mode
|
||||||
|
|
||||||
|
Run with `-vv` for detailed debug output:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
./export_with_trees.sh -vv
|
||||||
|
./import_to_outline.sh -vv
|
||||||
|
```
|
||||||
|
|
||||||
|
This shows:
|
||||||
|
- Full API requests and responses
|
||||||
|
- Document ID mappings
|
||||||
|
- File operations
|
||||||
|
- Retry attempts
|
||||||
|
|
||||||
|
#### Quick Diagnostics
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Test API connectivity
|
||||||
|
curl -H "Authorization: Bearer $TOKEN" http://outline:3000/api/collections.list
|
||||||
|
|
||||||
|
# Check Docker network
|
||||||
|
docker network inspect domnet
|
||||||
|
|
||||||
|
# Run with verbose logging
|
||||||
|
./export_with_trees.sh -vv
|
||||||
|
```
|
||||||
|
|
||||||
|
### 9. Extending the Tool
|
||||||
|
|
||||||
|
#### Adding New CLI Options
|
||||||
|
|
||||||
|
**Bash wrapper** (`export_with_trees.sh`):
|
||||||
|
```bash
|
||||||
|
# Add option parsing
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--my-option)
|
||||||
|
MY_OPTION="$2"
|
||||||
|
shift 2
|
||||||
|
;;
|
||||||
|
```
|
||||||
|
|
||||||
|
**Python script** (`outline_export_fixed.py`):
|
||||||
|
```python
|
||||||
|
# Add argument parser
|
||||||
|
parser.add_argument('--my-option', help='Description')
|
||||||
|
|
||||||
|
# Pass to Docker
|
||||||
|
docker_cmd="... python3 outline_export_fixed.py $@"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Adding New Export Formats
|
||||||
|
|
||||||
|
1. Create format converter function in `outline_export_fixed.py`
|
||||||
|
2. Add format option to CLI
|
||||||
|
3. Modify `write_document_to_file()` to call converter
|
||||||
|
4. Update metadata to track format
|
||||||
|
|
||||||
|
#### Custom Filtering
|
||||||
|
|
||||||
|
Add filter configuration to `settings.json`:
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"export": {
|
||||||
|
"filters": {
|
||||||
|
"exclude_tags": ["draft", "private"],
|
||||||
|
"include_collections": ["Public", "Docs"]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
Then implement in `OutlineExporter.should_export_document()`.
|
||||||
|
|
||||||
|
### 10. Error Recovery
|
||||||
|
|
||||||
|
#### Partial Export Recovery
|
||||||
|
If export crashes mid-run:
|
||||||
|
1. Previous export is already backed up (if existed)
|
||||||
|
2. Partial export in `outline_export/` may be incomplete
|
||||||
|
3. Safe to re-run - will overwrite partial data
|
||||||
|
4. Check `manifest.json` to see what completed
|
||||||
|
|
||||||
|
#### Failed Import Recovery
|
||||||
|
If import fails partway:
|
||||||
|
1. Successfully created collections remain in Outline
|
||||||
|
2. Use `--force` to delete and retry, OR
|
||||||
|
3. Manually delete collections from Outline UI
|
||||||
|
4. Check logs for document ID where failure occurred
|
||||||
|
|
||||||
|
### 11. Performance Optimization
|
||||||
|
|
||||||
|
#### Reducing API Calls
|
||||||
|
- **Caching**: Document content cached during single run
|
||||||
|
- **Batching**: Not currently implemented (future enhancement)
|
||||||
|
- **Parallelization**: Not safe due to Outline API rate limits
|
||||||
|
|
||||||
|
#### Faster Exports
|
||||||
|
- Skip verification: `--skip-verify`
|
||||||
|
- Skip health check: `--skip-health-check` (risky)
|
||||||
|
- Reduce hierarchy depth: Adjust `max_hierarchy_depth` in settings
|
||||||
|
|
||||||
|
#### Faster Imports
|
||||||
|
- Single collection mode: `--single` (fewer collection creates)
|
||||||
|
- Disable verbose logging (default)
|
||||||
|
|
||||||
|
### 12. Security Considerations
|
||||||
|
|
||||||
|
#### Secrets Management
|
||||||
|
- `settings.json` contains API token
|
||||||
|
- **Never log** the token value
|
||||||
|
- **Never commit** `settings.json` to git
|
||||||
|
- Backups may contain sensitive content
|
||||||
|
|
||||||
|
#### Safe Practices
|
||||||
|
```bash
|
||||||
|
# Check git status before committing
|
||||||
|
git status
|
||||||
|
|
||||||
|
# Verify settings.json is ignored
|
||||||
|
grep settings.json .gitignore
|
||||||
|
|
||||||
|
# Sanitize logs before sharing
|
||||||
|
sed 's/Bearer [A-Za-z0-9_-]*/Bearer [REDACTED]/g' logs.txt
|
||||||
|
```
|
||||||
|
|
||||||
|
### 13. Common Agent Mistakes to Avoid
|
||||||
|
|
||||||
|
1. **Don't suggest running Python directly** - Always use Docker wrappers
|
||||||
|
2. **Don't hardcode the API URL** - It's environment-specific (use settings.json)
|
||||||
|
3. **Don't assume external API access** - Only works inside `domnet`
|
||||||
|
4. **Don't ignore dry-run mode** - Always test changes with `--dry-run` first
|
||||||
|
5. **Don't modify hierarchy logic lightly** - Parent-child relationships are fragile
|
||||||
|
6. **Don't skip error handling** - API can fail intermittently
|
||||||
|
7. **Don't forget to update both export and import** - Changes often affect both sides
|
||||||
|
|
||||||
|
### 14. Useful Code Patterns
|
||||||
|
|
||||||
|
#### Making Authenticated API Calls
|
||||||
|
```python
|
||||||
|
headers = {
|
||||||
|
"Authorization": f"Bearer {self.api_token}",
|
||||||
|
"Content-Type": "application/json"
|
||||||
|
}
|
||||||
|
response = requests.post(
|
||||||
|
f"{self.api_url}/api/endpoint",
|
||||||
|
json=payload,
|
||||||
|
headers=headers,
|
||||||
|
timeout=30
|
||||||
|
)
|
||||||
|
response.raise_for_status()
|
||||||
|
data = response.json()
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Recursive Tree Traversal
|
||||||
|
```python
|
||||||
|
def process_tree(node, parent_id=None):
|
||||||
|
doc_id = node["id"]
|
||||||
|
process_document(doc_id, parent_id)
|
||||||
|
|
||||||
|
for child in node.get("children", []):
|
||||||
|
process_tree(child, doc_id)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Progress Display with tqdm
|
||||||
|
```python
|
||||||
|
from tqdm import tqdm
|
||||||
|
|
||||||
|
with tqdm(total=total_docs, desc="Exporting") as pbar:
|
||||||
|
for doc in documents:
|
||||||
|
process(doc)
|
||||||
|
pbar.update(1)
|
||||||
|
```
|
||||||
|
|
||||||
|
### 15. When to Ask for Clarification
|
||||||
|
|
||||||
|
Ask the user if:
|
||||||
|
- They want to modify API authentication method
|
||||||
|
- They need to export to a different Outline instance
|
||||||
|
- They want to filter by specific criteria not in settings
|
||||||
|
- They experience persistent API errors (might be Outline-specific issue)
|
||||||
|
- They need to handle very large wikis (>10,000 documents)
|
||||||
|
- They want to schedule automated backups (needs cron/systemd setup)
|
||||||
|
|
||||||
|
### 16. Recommended Improvements (Future)
|
||||||
|
|
||||||
|
Ideas for enhancing the tool:
|
||||||
|
- **Incremental exports**: Only export changed documents
|
||||||
|
- **Parallel imports**: Speed up large imports (carefully!)
|
||||||
|
- **Format converters**: Export to Notion, Confluence, etc.
|
||||||
|
- **Diff tool**: Compare exported versions
|
||||||
|
- **Search index**: Build searchable archive
|
||||||
|
- **Version history**: Track document changes over time
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Quick Decision Tree
|
||||||
|
|
||||||
|
```
|
||||||
|
User wants to modify the tool:
|
||||||
|
├─ Change export filtering? → Edit outline_export_fixed.py
|
||||||
|
├─ Change import behavior? → Edit outline_import.py
|
||||||
|
├─ Add CLI option? → Edit .sh wrapper + .py script
|
||||||
|
├─ Change output format? → Edit write_document_to_file()
|
||||||
|
├─ Fix API error? → Check retry logic and error handling
|
||||||
|
└─ Add new feature? → Review both export and import sides
|
||||||
|
|
||||||
|
User reports an error:
|
||||||
|
├─ Connection refused? → Check Docker network
|
||||||
|
├─ Auth error? → Verify API token in settings.json
|
||||||
|
├─ Hierarchy wrong? → Check id_mapping in import
|
||||||
|
├─ Missing documents? → Compare counts, check filters
|
||||||
|
└─ JSON error? → Validate metadata files
|
||||||
|
|
||||||
|
User wants to understand:
|
||||||
|
├─ How it works? → Refer to CLAUDE.md
|
||||||
|
├─ How to use? → Show CLI examples
|
||||||
|
├─ How to extend? → Point to sections 9-10 above
|
||||||
|
└─ How to troubleshoot? → Use section 8 checklist
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Additional Resources
|
||||||
|
|
||||||
|
- **Outline API Docs**: https://www.getoutline.com/developers
|
||||||
|
- **Python requests**: https://requests.readthedocs.io/
|
||||||
|
- **Docker networks**: https://docs.docker.com/network/
|
||||||
|
- **tqdm progress bars**: https://tqdm.github.io/
|
||||||
|
|
||||||
|
## Agent Self-Check
|
||||||
|
|
||||||
|
Before suggesting changes:
|
||||||
|
- [ ] Have I read the architecture section?
|
||||||
|
- [ ] Do I understand the Docker network requirement?
|
||||||
|
- [ ] Have I considered both export and import sides?
|
||||||
|
- [ ] Will my change maintain hierarchy integrity?
|
||||||
|
- [ ] Have I suggested testing with --dry-run?
|
||||||
|
- [ ] Have I checked for security implications?
|
||||||
|
- [ ] Is my suggestion compatible with Docker execution?
|
||||||
205
CLAUDE.md
205
CLAUDE.md
@@ -1,206 +1,7 @@
|
|||||||
# CLAUDE.md
|
# CLAUDE.md
|
||||||
|
|
||||||
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
|
This file provides guidance to Claude Code (claude.ai/code) when working with this repository.
|
||||||
|
|
||||||
## Project Overview
|
For detailed technical guidance, architecture details, common pitfalls, and code patterns, please see [AGENTS.md](./AGENTS.md).
|
||||||
|
|
||||||
This is a tool for exporting Outline wiki data via API. The script runs inside a Docker container on the `domnet` network to bypass Authentik SSO authentication and access the internal Outline API directly (`http://outline:3000`).
|
This is an Outline export/import tool that uses Docker to bypass Authentik SSO and access the internal API directly at `http://outline:3000`.
|
||||||
|
|
||||||
## Usage
|
|
||||||
|
|
||||||
```bash
|
|
||||||
# Run the export with tree visualization
|
|
||||||
./export_with_trees.sh
|
|
||||||
|
|
||||||
# Preview without exporting (dry run)
|
|
||||||
./export_with_trees.sh --dry-run
|
|
||||||
|
|
||||||
# Run with verbose output
|
|
||||||
./export_with_trees.sh -v
|
|
||||||
```
|
|
||||||
|
|
||||||
### CLI Options
|
|
||||||
```
|
|
||||||
--dry-run, -n Preview what would be exported without writing files
|
|
||||||
--output, -o DIR Output directory (overrides settings.json)
|
|
||||||
--verbose, -v Increase verbosity (-vv for debug)
|
|
||||||
--skip-verify Skip post-export verification
|
|
||||||
--skip-health-check Skip pre-export health check
|
|
||||||
--settings FILE Path to settings file (default: settings.json)
|
|
||||||
```
|
|
||||||
|
|
||||||
### Running the Python Export Directly
|
|
||||||
```bash
|
|
||||||
docker run --rm --network domnet \
|
|
||||||
-v "$(pwd):/work" \
|
|
||||||
-w /work \
|
|
||||||
python:3.11-slim \
|
|
||||||
bash -c "pip install -q requests tqdm && python3 outline_export_fixed.py"
|
|
||||||
|
|
||||||
# With options
|
|
||||||
docker run --rm --network domnet \
|
|
||||||
-v "$(pwd):/work" \
|
|
||||||
-w /work \
|
|
||||||
python:3.11-slim \
|
|
||||||
bash -c "pip install -q requests tqdm && python3 outline_export_fixed.py --dry-run"
|
|
||||||
```
|
|
||||||
|
|
||||||
## Architecture
|
|
||||||
|
|
||||||
### Docker Network Integration
|
|
||||||
- Script runs in Docker container attached to `domnet` bridge network
|
|
||||||
- Direct API access to `http://outline:3000` (internal) bypasses SSO
|
|
||||||
- Uses `python:3.11-slim` image with `requests` and `tqdm` dependencies
|
|
||||||
|
|
||||||
### Export Flow
|
|
||||||
1. Fetch collections via `/api/collections.list`
|
|
||||||
2. Get navigation tree via `/api/collections.documents` (source of truth for hierarchy)
|
|
||||||
3. Fetch full document content via `/api/documents.info` (with caching)
|
|
||||||
4. Export recursively maintaining parent-child structure
|
|
||||||
5. Save metadata (`_collection_metadata.json`) per collection
|
|
||||||
6. Generate manifest with checksums for verification
|
|
||||||
|
|
||||||
### Key Files
|
|
||||||
- `export_with_trees.sh` - Main export script with tree visualization
|
|
||||||
- `outline_export_fixed.py` - Core export logic with `OutlineExporter` class
|
|
||||||
- `settings.json` - API URL and token configuration (contains secrets)
|
|
||||||
- `outline_export/` - Output directory with markdown files and metadata
|
|
||||||
- `outline_backup_*.tar.gz` - Timestamped compressed backups
|
|
||||||
|
|
||||||
### Configuration
|
|
||||||
Settings are in `settings.json`:
|
|
||||||
- `source.url` - Internal Docker URL (`http://outline:3000`)
|
|
||||||
- `source.token` - Outline API token
|
|
||||||
- `export.output_directory` - Output path (default: `outline_export`)
|
|
||||||
- `advanced.max_hierarchy_depth` - Prevent infinite recursion (default: 100)
|
|
||||||
|
|
||||||
## Important Notes
|
|
||||||
|
|
||||||
### Security
|
|
||||||
- `settings.json` contains API token - never commit to git
|
|
||||||
- Backup files may contain sensitive wiki content
|
|
||||||
|
|
||||||
### Backup System
|
|
||||||
- Each export automatically backs up previous exports to `outline_backup_YYYYMMDD_HHMMSS.tar.gz`
|
|
||||||
- Old uncompressed export directory is deleted after backup
|
|
||||||
- Backups achieve 90%+ compression on markdown content
|
|
||||||
|
|
||||||
### Reliability Features
|
|
||||||
- **Health check**: Verifies API connectivity before export
|
|
||||||
- **Retry logic**: Failed API requests retry up to 3 times with exponential backoff
|
|
||||||
- **Logging**: Structured logging with configurable verbosity levels
|
|
||||||
|
|
||||||
### Document Counting
|
|
||||||
The navigation tree (`/api/collections.documents`) is the source of truth for document hierarchy. Document counting is recursive to include all nested children.
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Import Tool
|
|
||||||
|
|
||||||
The import script restores exported markdown files back into Outline, preserving the full document hierarchy.
|
|
||||||
|
|
||||||
### Usage
|
|
||||||
|
|
||||||
```bash
|
|
||||||
# Import all collections from outline_export/
|
|
||||||
./import_to_outline.sh
|
|
||||||
|
|
||||||
# Preview what would be imported (no changes made)
|
|
||||||
./import_to_outline.sh --dry-run
|
|
||||||
|
|
||||||
# Import into a single timestamped collection
|
|
||||||
./import_to_outline.sh --single
|
|
||||||
|
|
||||||
# Import from a different directory
|
|
||||||
./import_to_outline.sh -d exports/
|
|
||||||
|
|
||||||
# Overwrite existing collections
|
|
||||||
./import_to_outline.sh --force
|
|
||||||
```
|
|
||||||
|
|
||||||
### CLI Options
|
|
||||||
```
|
|
||||||
-s, --single Import all into single timestamped collection
|
|
||||||
-n, --dry-run Preview operations without making changes
|
|
||||||
-d, --source DIR Source directory (default: outline_export)
|
|
||||||
-v, --verbose Increase verbosity (-vv for debug)
|
|
||||||
-f, --force Overwrite existing collections (instead of skip)
|
|
||||||
--settings FILE Path to settings file (default: settings.json)
|
|
||||||
-h, --help Show help message
|
|
||||||
```
|
|
||||||
|
|
||||||
### Import Modes
|
|
||||||
|
|
||||||
**Collection-per-Folder (Default)**
|
|
||||||
Each subdirectory becomes a separate collection:
|
|
||||||
```
|
|
||||||
outline_export/
|
|
||||||
├── Bewerbungen/ → Creates "Bewerbungen" collection
|
|
||||||
├── Projekte/ → Creates "Projekte" collection
|
|
||||||
└── Privat/ → Creates "Privat" collection
|
|
||||||
```
|
|
||||||
|
|
||||||
**Single Collection (`--single`)**
|
|
||||||
All content goes into one timestamped collection:
|
|
||||||
```
|
|
||||||
outline_export/
|
|
||||||
├── Bewerbungen/ → Becomes parent doc "Bewerbungen"
|
|
||||||
├── Projekte/ → Becomes parent doc "Projekte"
|
|
||||||
└── Privat/ → Becomes parent doc "Privat"
|
|
||||||
|
|
||||||
All imported into: "import_20260119_143052" collection
|
|
||||||
```
|
|
||||||
|
|
||||||
### Import Flow
|
|
||||||
1. Load `_collection_metadata.json` from each collection directory
|
|
||||||
2. Build document tree maintaining parent-child relationships
|
|
||||||
3. Create collections via `/api/collections.create`
|
|
||||||
4. Create documents via `/api/documents.create` with proper `parentDocumentId`
|
|
||||||
5. Map old IDs to new IDs to maintain hierarchy
|
|
||||||
6. Display tree-style progress with status indicators
|
|
||||||
|
|
||||||
### Key Files
|
|
||||||
- `import_to_outline.sh` - Bash wrapper with Docker execution
|
|
||||||
- `outline_import.py` - Core import logic with `OutlineImporter` class
|
|
||||||
|
|
||||||
### Output Example
|
|
||||||
```
|
|
||||||
════════════════════════════════════════════════════════════
|
|
||||||
OUTLINE IMPORT
|
|
||||||
════════════════════════════════════════════════════════════
|
|
||||||
|
|
||||||
Source: outline_export/
|
|
||||||
Target: http://outline:3000
|
|
||||||
Mode: Collection per folder
|
|
||||||
|
|
||||||
Checking API connectivity... ✓
|
|
||||||
|
|
||||||
Bewerbungen/ (11 documents)
|
|
||||||
Creating collection... ✓ (id: 7f3a...)
|
|
||||||
├── CV.md ✓ created
|
|
||||||
├── Tipico.md ✓ created
|
|
||||||
│ ├── Pitch Tipico.md ✓ created
|
|
||||||
│ └── Fragen 3. Runde.md ✓ created
|
|
||||||
└── Ihre PVS.md ✓ created
|
|
||||||
|
|
||||||
════════════════════════════════════════════════════════════
|
|
||||||
SUMMARY
|
|
||||||
════════════════════════════════════════════════════════════
|
|
||||||
Collections: 1 created, 0 skipped, 0 errors
|
|
||||||
Documents: 11 created, 0 skipped, 0 errors
|
|
||||||
Duration: 2.3 seconds
|
|
||||||
════════════════════════════════════════════════════════════
|
|
||||||
```
|
|
||||||
|
|
||||||
### Duplicate Handling
|
|
||||||
| Scenario | Default | With `--force` |
|
|
||||||
|----------|---------|----------------|
|
|
||||||
| Collection exists | Skip entire collection | Delete and recreate |
|
|
||||||
| Document exists | Skip document | Update document |
|
|
||||||
|
|
||||||
### Error Handling
|
|
||||||
- **API connection failure**: Abort with error message
|
|
||||||
- **Collection creation fails**: Abort that collection, continue others
|
|
||||||
- **Document creation fails**: Log error, continue with siblings
|
|
||||||
- **Missing markdown file**: Log warning, skip document
|
|
||||||
- **Parent not found**: Create as root-level document
|
|
||||||
|
|||||||
34
Dockerfile
Normal file
34
Dockerfile
Normal file
@@ -0,0 +1,34 @@
|
|||||||
|
FROM python:3.11-slim
|
||||||
|
|
||||||
|
RUN apt-get update -qq && \
|
||||||
|
apt-get install -y -q --no-install-recommends git && \
|
||||||
|
rm -rf /var/lib/apt/lists/*
|
||||||
|
|
||||||
|
RUN pip install --no-cache-dir requests fastapi "uvicorn[standard]" pydantic
|
||||||
|
|
||||||
|
ENV GIT_AUTHOR_NAME=outline-sync \
|
||||||
|
GIT_AUTHOR_EMAIL=sync@local \
|
||||||
|
GIT_COMMITTER_NAME=outline-sync \
|
||||||
|
GIT_COMMITTER_EMAIL=sync@local \
|
||||||
|
VAULT_DIR=/vault \
|
||||||
|
SETTINGS_PATH=/work/settings.json
|
||||||
|
|
||||||
|
RUN git config --global user.email "sync@local" && \
|
||||||
|
git config --global user.name "outline-sync"
|
||||||
|
|
||||||
|
WORKDIR /work
|
||||||
|
|
||||||
|
COPY outline_sync.py webui.py settings.json ./
|
||||||
|
|
||||||
|
# Initialise vault with both branches needed by outline_sync.py
|
||||||
|
RUN git init /vault && \
|
||||||
|
git -C /vault checkout -b main && \
|
||||||
|
touch /vault/.gitkeep && \
|
||||||
|
git -C /vault add .gitkeep && \
|
||||||
|
git -C /vault commit -m "init: empty vault" && \
|
||||||
|
git -C /vault checkout -b outline && \
|
||||||
|
git -C /vault checkout main
|
||||||
|
|
||||||
|
EXPOSE 8080
|
||||||
|
|
||||||
|
CMD ["uvicorn", "webui:app", "--host", "0.0.0.0", "--port", "8080"]
|
||||||
15
Dockerfile.sync
Normal file
15
Dockerfile.sync
Normal file
@@ -0,0 +1,15 @@
|
|||||||
|
FROM python:3.11-slim
|
||||||
|
|
||||||
|
RUN apt-get update -qq && \
|
||||||
|
apt-get install -y -q --no-install-recommends git && \
|
||||||
|
rm -rf /var/lib/apt/lists/*
|
||||||
|
|
||||||
|
RUN pip install --no-cache-dir requests
|
||||||
|
|
||||||
|
# Git identity for commits inside the container
|
||||||
|
ENV GIT_AUTHOR_NAME=outline-sync \
|
||||||
|
GIT_AUTHOR_EMAIL=sync@local \
|
||||||
|
GIT_COMMITTER_NAME=outline-sync \
|
||||||
|
GIT_COMMITTER_EMAIL=sync@local
|
||||||
|
|
||||||
|
WORKDIR /work
|
||||||
759
SYNC_PRD.md
Normal file
759
SYNC_PRD.md
Normal file
@@ -0,0 +1,759 @@
|
|||||||
|
# PRD: Outline ↔ Obsidian Sync
|
||||||
|
**Version:** 0.3
|
||||||
|
**Date:** 2026-03-04
|
||||||
|
**Status:** Revised — Git + Frontmatter Architecture
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 1. Problem Statement
|
||||||
|
|
||||||
|
The user maintains a knowledge base in Outline (web) and wants to edit it locally in Obsidian. Changes must flow in both directions. The existing export/import scripts provide one-way, full-replacement operations — insufficient for safe bidirectional sync where both sides accumulate independent changes between syncs.
|
||||||
|
|
||||||
|
This is **not a shared git repository**. Git is used purely as a local sync engine — for its change detection, three-way merge, and conflict resolution primitives. There is no git remote, no `git push` to a server, no collaboration workflow.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 2. Goals
|
||||||
|
|
||||||
|
| # | Goal |
|
||||||
|
|---|------|
|
||||||
|
| G1 | Changes made in Outline are visible in Obsidian after sync |
|
||||||
|
| G2 | Changes made in Obsidian are pushed to Outline without destroying document history |
|
||||||
|
| G3 | Conflicts are detected and surfaced — never silently overwritten |
|
||||||
|
| G4 | Safe to run unattended via Ofelia cron |
|
||||||
|
| G5 | Outline document URLs remain stable across syncs (no delete+recreate) |
|
||||||
|
| G6 | Setup is self-contained in a dedicated folder with clear structure |
|
||||||
|
|
||||||
|
## 3. Non-Goals
|
||||||
|
|
||||||
|
| # | Non-Goal | Reason |
|
||||||
|
|---|----------|--------|
|
||||||
|
| NG1 | Real-time sync | Async workflow; polling is sufficient |
|
||||||
|
| NG2 | Wikilink ↔ Outline link translation | Lossy and fragile |
|
||||||
|
| NG3 | Syncing attachments/images | Separate concern |
|
||||||
|
| NG4 | Obsidian-only content (Daily Notes, templates) | No Outline equivalent |
|
||||||
|
| NG5 | Multi-user conflict resolution | Single-user use case |
|
||||||
|
| NG6 | A shared/remote git repository | Git is local tooling only |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 4. Folder Setup
|
||||||
|
|
||||||
|
### 4.1 Directory Structure
|
||||||
|
|
||||||
|
A dedicated folder is created for the Obsidian vault and sync tooling. It is separate from `outline-tools/` (which holds export/import scripts).
|
||||||
|
|
||||||
|
```
|
||||||
|
/home/crabix/docker_authentik/
|
||||||
|
├── outline-tools/ ← existing: export + import scripts (unchanged)
|
||||||
|
└── outline-vault/ ← NEW: Obsidian vault + git sync engine
|
||||||
|
├── .git/ ← local git repo (never pushed to any remote)
|
||||||
|
├── .gitignore
|
||||||
|
├── .gitattributes
|
||||||
|
├── sync.sh ← main sync entrypoint (bash wrapper)
|
||||||
|
├── outline_sync.py ← core sync logic
|
||||||
|
├── settings.json ← API token + config (gitignored)
|
||||||
|
├── _sync_log.md ← human-readable sync log (readable in Obsidian)
|
||||||
|
├── .obsidian/ ← Obsidian config (gitignored)
|
||||||
|
├── Bewerbungen/ ← Outline collection → top-level folder
|
||||||
|
│ ├── CV.md
|
||||||
|
│ └── Tipico/ ← nested document → subfolder
|
||||||
|
│ ├── Tipico.md
|
||||||
|
│ └── Pitch Tipico.md
|
||||||
|
└── Projekte/
|
||||||
|
└── ...
|
||||||
|
```
|
||||||
|
|
||||||
|
Obsidian is pointed at `/home/crabix/docker_authentik/outline-vault/` as its vault root.
|
||||||
|
|
||||||
|
### 4.2 Git Initialization
|
||||||
|
|
||||||
|
Git is initialized locally with two branches. There is no remote.
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd /home/crabix/docker_authentik/outline-vault
|
||||||
|
git init
|
||||||
|
git checkout -b outline # branch: tracks last known Outline state
|
||||||
|
git checkout -b main # branch: Obsidian edits live here
|
||||||
|
```
|
||||||
|
|
||||||
|
**Branch semantics:**
|
||||||
|
|
||||||
|
| Branch | Purpose |
|
||||||
|
|--------|---------|
|
||||||
|
| `outline` | Snapshot of Outline at last successful sync. Committed by sync script only. Never edited by hand. |
|
||||||
|
| `main` | Working branch. Obsidian edits land here. Obsidian Git plugin auto-commits. |
|
||||||
|
|
||||||
|
The `outline` branch tip is the **common ancestor** for all merges. It replaces any custom state file — no `sync_state.json` needed.
|
||||||
|
|
||||||
|
### 4.3 `.gitignore`
|
||||||
|
|
||||||
|
```gitignore
|
||||||
|
# Obsidian internals
|
||||||
|
.obsidian/
|
||||||
|
|
||||||
|
# Sync config (contains API token)
|
||||||
|
settings.json
|
||||||
|
|
||||||
|
# Conflict sidecars (resolved manually, not tracked)
|
||||||
|
*.conflict.md
|
||||||
|
|
||||||
|
# OS
|
||||||
|
.DS_Store
|
||||||
|
Thumbs.db
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4.4 `.gitattributes`
|
||||||
|
|
||||||
|
```gitattributes
|
||||||
|
# Treat all markdown as text, normalize line endings
|
||||||
|
*.md text eol=lf
|
||||||
|
|
||||||
|
# Tell git to always use union merge on the sync log
|
||||||
|
# (append-only, never conflicts)
|
||||||
|
_sync_log.md merge=union
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4.5 `settings.json` (not committed)
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"source": {
|
||||||
|
"url": "http://outline:3000",
|
||||||
|
"token": "ol_api_YOUR_TOKEN_HERE"
|
||||||
|
},
|
||||||
|
"sync": {
|
||||||
|
"vault_dir": "/home/crabix/docker_authentik/outline-vault",
|
||||||
|
"enable_deletions": false,
|
||||||
|
"auto_push": false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 5. Frontmatter as the ID Layer
|
||||||
|
|
||||||
|
Every synced file carries its Outline identity in YAML frontmatter. This is the bridge between git paths and Outline document IDs.
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
---
|
||||||
|
outline_id: abc-123
|
||||||
|
outline_collection_id: col-456
|
||||||
|
outline_parent_id: def-789 # omitted if root document
|
||||||
|
outline_updated_at: 2026-03-03T10:30:00Z
|
||||||
|
---
|
||||||
|
|
||||||
|
# CV
|
||||||
|
|
||||||
|
Actual document content here...
|
||||||
|
```
|
||||||
|
|
||||||
|
**Rules:**
|
||||||
|
- Frontmatter is **stripped** before sending content to Outline API
|
||||||
|
- Frontmatter is **written back** after each successful API call with updated `outline_updated_at`
|
||||||
|
- Frontmatter travels with the file on rename/move — the Outline ID is never lost
|
||||||
|
- New files created in Obsidian have **no frontmatter** until first push
|
||||||
|
- The sync script identifies new files by the absence of `outline_id` in frontmatter
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 6. Architecture: Outline as a "Dumb Remote"
|
||||||
|
|
||||||
|
Git handles change detection, merging, and conflict marking. The sync script is purely an **API translation layer**.
|
||||||
|
|
||||||
|
```
|
||||||
|
git pull equivalent:
|
||||||
|
Export Outline → temp dir → convert to frontmatter format
|
||||||
|
→ git commit to `outline` branch
|
||||||
|
→ git merge outline into main
|
||||||
|
|
||||||
|
git push equivalent:
|
||||||
|
git diff outline..main → list of changed files
|
||||||
|
→ for each file: call appropriate Outline API endpoint
|
||||||
|
→ write updated frontmatter back
|
||||||
|
→ advance outline branch tip
|
||||||
|
```
|
||||||
|
|
||||||
|
### 6.1 Change Detection Matrix
|
||||||
|
|
||||||
|
`git diff outline..main --name-status --find-renames` produces all cases:
|
||||||
|
|
||||||
|
| Git status | Outline API call |
|
||||||
|
|---|---|
|
||||||
|
| `M` modified | `documents.update(id, title, text)` |
|
||||||
|
| `A` added (with `outline_id` in frontmatter) | Already synced — skip |
|
||||||
|
| `A` added (no `outline_id`) | `documents.create(collectionId, title, text)` → write frontmatter |
|
||||||
|
| `D` deleted | `documents.delete(id)` — only if `enable_deletions: true` |
|
||||||
|
| `R` renamed/moved | `documents.update(id, title)` + `documents.move(id, newParentId)` if folder changed |
|
||||||
|
|
||||||
|
### 6.2 New Outline API Calls Required
|
||||||
|
|
||||||
|
Existing scripts only use `documents.create`. Sync additionally requires:
|
||||||
|
|
||||||
|
| Endpoint | Used for |
|
||||||
|
|---|---|
|
||||||
|
| `documents.update` | Core push operation — preserves history |
|
||||||
|
| `documents.move` | Reparent when file moved between folders |
|
||||||
|
| `documents.delete` | Deletion (flag-gated) |
|
||||||
|
| `documents.list` | Detect remote changes during pull |
|
||||||
|
| `collections.list` | Detect new/deleted collections during pull |
|
||||||
|
| `collections.create` | Create collection for new top-level folder |
|
||||||
|
| `collections.delete` | Delete collection (flag-gated) |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 7. Sync Commands
|
||||||
|
|
||||||
|
### `sync.sh init`
|
||||||
|
First-time setup. Creates vault from current Outline state.
|
||||||
|
```bash
|
||||||
|
./sync.sh init
|
||||||
|
```
|
||||||
|
Steps:
|
||||||
|
1. Run existing `outline_export_fixed.py` into temp dir
|
||||||
|
2. `git init` + create `outline` and `main` branches
|
||||||
|
3. Convert export metadata headers → YAML frontmatter in each `.md` file
|
||||||
|
4. Commit to `outline` branch: `git commit -m "sync: initial import"`
|
||||||
|
5. Checkout `main`, merge `outline`
|
||||||
|
6. Write `.gitignore`, `.gitattributes`
|
||||||
|
|
||||||
|
### `sync.sh pull`
|
||||||
|
Pull latest Outline state into Obsidian.
|
||||||
|
```bash
|
||||||
|
./sync.sh pull [--auto]
|
||||||
|
```
|
||||||
|
Steps:
|
||||||
|
1. Export Outline → temp dir
|
||||||
|
2. Convert to frontmatter format
|
||||||
|
3. Checkout `outline` branch, apply changes, commit: `git commit -m "sync: outline@TIMESTAMP"`
|
||||||
|
4. Checkout `main`
|
||||||
|
5. `git merge outline`
|
||||||
|
6. If conflict: write `*.conflict.md` sidecars, append to `_sync_log.md`, exit non-zero
|
||||||
|
7. If clean: append success to `_sync_log.md`
|
||||||
|
|
||||||
|
### `sync.sh push`
|
||||||
|
Push local Obsidian changes to Outline.
|
||||||
|
```bash
|
||||||
|
./sync.sh push [--dry-run]
|
||||||
|
```
|
||||||
|
Steps:
|
||||||
|
1. Check for unresolved merge conflicts — abort if any
|
||||||
|
2. `git diff outline..main --name-status --find-renames`
|
||||||
|
3. For each changed file: call Outline API (topological order for new docs)
|
||||||
|
4. Write updated frontmatter, `git add` the file
|
||||||
|
5. On full success: `git checkout outline && git merge main`
|
||||||
|
6. On partial failure: log errors, do not advance `outline` branch — next push retries failed files
|
||||||
|
|
||||||
|
### `sync.sh` (bidirectional)
|
||||||
|
Pull, then push if pull was clean.
|
||||||
|
```bash
|
||||||
|
./sync.sh [--dry-run] [--auto]
|
||||||
|
```
|
||||||
|
|
||||||
|
### `sync.sh status`
|
||||||
|
Show pending local changes not yet in Outline.
|
||||||
|
```bash
|
||||||
|
./sync.sh status
|
||||||
|
```
|
||||||
|
Output: `git diff outline..main --name-status` formatted as human-readable table.
|
||||||
|
|
||||||
|
### `sync.sh resolve FILE --accept [local|remote]`
|
||||||
|
Resolve a merge conflict.
|
||||||
|
```bash
|
||||||
|
./sync.sh resolve Bewerbungen/CV.md --accept local
|
||||||
|
./sync.sh resolve Bewerbungen/CV.md --accept remote
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 8. Conflict Handling
|
||||||
|
|
||||||
|
A conflict occurs when `git merge outline` produces conflict markers (same document edited on both sides since last sync).
|
||||||
|
|
||||||
|
```
|
||||||
|
Bewerbungen/CV.md ← contains git conflict markers, NOT pushed
|
||||||
|
Bewerbungen/CV.conflict.md ← human-readable: what changed locally vs remotely
|
||||||
|
```
|
||||||
|
|
||||||
|
`_sync_log.md` always reflects current sync state (readable in Obsidian):
|
||||||
|
```markdown
|
||||||
|
## 2026-03-04 08:00
|
||||||
|
- pull: 3 updated, 1 CONFLICT (Bewerbungen/CV.md) — push blocked
|
||||||
|
- push: blocked pending conflict resolution
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 9. Scheduled Sync via Ofelia
|
||||||
|
|
||||||
|
```ini
|
||||||
|
[job-exec "outline-pull"]
|
||||||
|
schedule = @every 1h
|
||||||
|
container = outline-sync
|
||||||
|
command = /work/sync.sh pull --auto
|
||||||
|
|
||||||
|
# Push is manual by default. Enable only after P6 (conflict detection) is complete.
|
||||||
|
# [job-exec "outline-push"]
|
||||||
|
# schedule = @daily
|
||||||
|
# container = outline-sync
|
||||||
|
# command = /work/sync.sh push --auto
|
||||||
|
```
|
||||||
|
|
||||||
|
`--auto` flag: suppresses interactive prompts, writes to `_sync_log.md`, exits non-zero on conflict so Ofelia can detect failures.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 10. Implementation Phases + User Stories
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 1 — `sync init`
|
||||||
|
|
||||||
|
**Scope:** Create vault from current Outline state. Git initialized. Frontmatter injected.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-1.1** — As a user, I want to run a single command that sets up the vault from scratch, so that I don't need to manually configure git or copy files.
|
||||||
|
- Given: `outline-vault/` does not exist (or is empty)
|
||||||
|
- When: `./sync.sh init` is run
|
||||||
|
- Then: The vault folder is created, git is initialized with `outline` and `main` branches, all Outline documents are exported as `.md` files with frontmatter, and `main` is checked out
|
||||||
|
|
||||||
|
**US-1.2** — As a user, I want every exported document to have frontmatter with its Outline ID, so that future syncs can identify documents without an external map.
|
||||||
|
- Given: A document exists in Outline with id `abc-123`
|
||||||
|
- When: `sync init` completes
|
||||||
|
- Then: The corresponding `.md` file contains `outline_id: abc-123` in its YAML frontmatter
|
||||||
|
|
||||||
|
**US-1.3** — As a user, I want the vault folder structure to mirror Outline's collection/document hierarchy, so that navigation in Obsidian feels natural.
|
||||||
|
- Given: Outline has collection "Bewerbungen" with nested document "Tipico > Pitch"
|
||||||
|
- When: `sync init` completes
|
||||||
|
- Then: Files exist at `Bewerbungen/Tipico/Tipico.md` and `Bewerbungen/Tipico/Pitch.md`
|
||||||
|
|
||||||
|
**US-1.4** — As a user, I want `settings.json` to be gitignored automatically, so that my API token is never accidentally committed.
|
||||||
|
- Given: `sync init` runs
|
||||||
|
- Then: `.gitignore` is created containing `settings.json` and `*.conflict.md`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 2 — `sync pull`
|
||||||
|
|
||||||
|
**Scope:** Export Outline → commit to `outline` branch → merge into `main`.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-2.1** — As a user, I want to run `sync pull` to get the latest Outline changes into Obsidian, so that edits made on the web are reflected locally.
|
||||||
|
- Given: A document was updated in Outline since last sync
|
||||||
|
- When: `./sync.sh pull` is run
|
||||||
|
- Then: The corresponding `.md` file is updated in the vault with the new content and updated `outline_updated_at` frontmatter
|
||||||
|
|
||||||
|
**US-2.2** — As a user, I want new Outline documents to appear as new files after pull, so that I don't miss newly created content.
|
||||||
|
- Given: A new document was created in Outline since last sync
|
||||||
|
- When: `./sync.sh pull` is run
|
||||||
|
- Then: A new `.md` file exists with correct frontmatter, in the correct folder
|
||||||
|
|
||||||
|
**US-2.3** — As a user, I want deleted Outline documents to be removed locally after pull, so that my vault stays in sync with Outline.
|
||||||
|
- Given: A document was deleted in Outline since last sync
|
||||||
|
- When: `./sync.sh pull` is run
|
||||||
|
- Then: The corresponding local `.md` file is removed
|
||||||
|
|
||||||
|
**US-2.4** — As a user, I want to be informed if a pull produces a conflict (same document edited both locally and in Outline), so that I can resolve it manually.
|
||||||
|
- Given: `CV.md` was edited locally AND in Outline since last sync
|
||||||
|
- When: `./sync.sh pull` is run
|
||||||
|
- Then: A `CV.conflict.md` sidecar is created, `_sync_log.md` records the conflict, the command exits non-zero, and `CV.md` is left untouched
|
||||||
|
|
||||||
|
**US-2.5** — As a user running pull in `--auto` mode (cron), I want the sync log to be updated even on failure, so that I can check `_sync_log.md` in Obsidian to understand what happened.
|
||||||
|
- Given: `./sync.sh pull --auto` runs via Ofelia and encounters a conflict
|
||||||
|
- Then: `_sync_log.md` is updated with timestamp and conflict details, exit code is non-zero
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 3 — `sync status`
|
||||||
|
|
||||||
|
**Scope:** Show pending local changes not yet pushed to Outline.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-3.1** — As a user, I want to see which local files have changed since last sync, so that I know what will be pushed before running push.
|
||||||
|
- Given: `CV.md` was edited locally and not yet pushed
|
||||||
|
- When: `./sync.sh status` is run
|
||||||
|
- Then: Output shows `M Bewerbungen/CV.md` (modified)
|
||||||
|
|
||||||
|
**US-3.2** — As a user, I want `status` to also show new files that will be created in Outline on push, so that I can review before committing.
|
||||||
|
- Given: `NewNote.md` was created in Obsidian with no frontmatter
|
||||||
|
- When: `./sync.sh status` is run
|
||||||
|
- Then: Output shows `A Bewerbungen/NewNote.md` (new → will create in Outline)
|
||||||
|
|
||||||
|
**US-3.3** — As a user, I want `status` to exit cleanly when there are no pending changes, so that cron jobs don't produce false alarms.
|
||||||
|
- Given: No local changes since last sync
|
||||||
|
- When: `./sync.sh status` is run
|
||||||
|
- Then: Output is "Nothing to push. Outline is up to date." and exit code is 0
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 4 — `sync push` (modify + rename)
|
||||||
|
|
||||||
|
**Scope:** Push modified and renamed/moved files to Outline. No new doc creation yet.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-4.1** — As a user, I want editing a file in Obsidian and running push to update the document in Outline, so that my local edits appear on the web.
|
||||||
|
- Given: `CV.md` was edited in Obsidian (has `outline_id` in frontmatter)
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: Outline document `abc-123` is updated with new content, document history is preserved (not recreated), `outline_updated_at` in frontmatter is updated
|
||||||
|
|
||||||
|
**US-4.2** — As a user, I want renaming a file in Obsidian to update the document title in Outline, so that both sides stay consistent.
|
||||||
|
- Given: `CV.md` is renamed to `Lebenslauf.md` in Obsidian (frontmatter `outline_id` intact)
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: Outline document title is updated to "Lebenslauf", document URL is preserved
|
||||||
|
|
||||||
|
**US-4.3** — As a user, I want moving a file to a different folder in Obsidian to reparent the document in Outline, so that the hierarchy stays consistent.
|
||||||
|
- Given: `Bewerbungen/CV.md` is moved to `Projekte/CV.md` in Obsidian
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: In Outline, the document is moved to the "Projekte" collection with the new parent
|
||||||
|
|
||||||
|
**US-4.4** — As a user, I want push to be atomic per-document — a failure on one file should not block others.
|
||||||
|
- Given: Push is run with 5 changed files, one API call fails
|
||||||
|
- When: `./sync.sh push` completes
|
||||||
|
- Then: 4 documents are updated in Outline, 1 failure is logged, the `outline` branch is NOT advanced (failed file will retry on next push)
|
||||||
|
|
||||||
|
**US-4.5** — As a user, I want `--dry-run` to show me exactly what API calls would be made without touching Outline.
|
||||||
|
- Given: 3 files modified locally
|
||||||
|
- When: `./sync.sh push --dry-run`
|
||||||
|
- Then: Output lists each file and the API call that would be made, no changes to Outline or git
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 5 — `sync push` (new documents)
|
||||||
|
|
||||||
|
**Scope:** Create new Outline documents from new Obsidian files.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-5.1** — As a user, I want creating a new `.md` file in an existing collection folder to create a new document in Outline on push.
|
||||||
|
- Given: `Bewerbungen/NewApplication.md` created in Obsidian, no frontmatter
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: Document created in Outline under "Bewerbungen" collection, frontmatter written back with new `outline_id`
|
||||||
|
|
||||||
|
**US-5.2** — As a user, I want creating a file in a subfolder to correctly set the parent document in Outline, preserving hierarchy.
|
||||||
|
- Given: `Bewerbungen/Tipico/NewDoc.md` created, `Tipico.md` has `outline_id: def-789`
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: Outline document created with `parentDocumentId: def-789`
|
||||||
|
|
||||||
|
**US-5.3** — As a user, I want creating a new top-level folder to create a new Outline collection on push.
|
||||||
|
- Given: `NewCollection/FirstDoc.md` created in Obsidian
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: New collection "NewCollection" is created in Outline, `FirstDoc` is created inside it
|
||||||
|
|
||||||
|
**US-5.4** — As a user, I want parent documents to always be created before their children, even if I created them in reverse order.
|
||||||
|
- Given: `Bewerbungen/Tipico/NewChild.md` created, but `Bewerbungen/Tipico/Tipico.md` is also new
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: `Tipico.md` is created first, then `NewChild.md` with correct `parentDocumentId`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 6 — Conflict Detection
|
||||||
|
|
||||||
|
**Scope:** Block push when unresolved conflicts exist. Generate conflict sidecars.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-6.1** — As a user, I want push to be blocked if there are unresolved merge conflicts, so that I never push broken content.
|
||||||
|
- Given: `CV.md` contains git conflict markers from a previous pull
|
||||||
|
- When: `./sync.sh push` is run
|
||||||
|
- Then: Push aborts with "Resolve conflicts before pushing", lists conflicted files, exits non-zero
|
||||||
|
|
||||||
|
**US-6.2** — As a user, I want a human-readable conflict sidecar file, so that I can understand what changed on each side.
|
||||||
|
- Given: `CV.md` conflicts during pull
|
||||||
|
- When: Pull completes
|
||||||
|
- Then: `CV.conflict.md` exists containing: last synced content, local changes, Outline changes, and instructions for resolution
|
||||||
|
|
||||||
|
**US-6.3** — As a user, I want non-conflicted files to be pushed successfully even when other files have conflicts.
|
||||||
|
- Given: `CV.md` has conflict, `Projekte/Roadmap.md` is cleanly modified
|
||||||
|
- When: `./sync.sh push` is run (after resolving or acknowledging the conflict situation)
|
||||||
|
- Then: Only `Roadmap.md` is pushed; `CV.md` is skipped with a warning
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 7 — `sync resolve`
|
||||||
|
|
||||||
|
**Scope:** Command to resolve conflicts without needing raw git knowledge.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-7.1** — As a user, I want to resolve a conflict by accepting the local version with a single command.
|
||||||
|
- Given: `CV.md` is in conflict
|
||||||
|
- When: `./sync.sh resolve Bewerbungen/CV.md --accept local`
|
||||||
|
- Then: Conflict markers removed, local content kept, file staged, `CV.conflict.md` deleted
|
||||||
|
|
||||||
|
**US-7.2** — As a user, I want to resolve a conflict by accepting the remote (Outline) version with a single command.
|
||||||
|
- Given: `CV.md` is in conflict
|
||||||
|
- When: `./sync.sh resolve Bewerbungen/CV.md --accept remote`
|
||||||
|
- Then: Conflict markers removed, Outline's content kept, file staged, `CV.conflict.md` deleted
|
||||||
|
|
||||||
|
**US-7.3** — As a user, I want `sync status` to clearly show which files are in conflict, so that I know what needs resolving.
|
||||||
|
- Given: `CV.md` and `Notes.md` are conflicted
|
||||||
|
- When: `./sync.sh status`
|
||||||
|
- Then: Output shows both files marked as `CONFLICT` with instructions to run `sync resolve`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 8 — Ofelia + `_sync_log.md`
|
||||||
|
|
||||||
|
**Scope:** Unattended scheduled pull. Human-readable log in vault.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-8.1** — As a user, I want hourly automatic pulls from Outline so that Obsidian stays current without manual intervention.
|
||||||
|
- Given: Ofelia runs `sync.sh pull --auto` every hour
|
||||||
|
- When: New documents were created in Outline during the hour
|
||||||
|
- Then: New `.md` files appear in vault, `_sync_log.md` records the event
|
||||||
|
|
||||||
|
**US-8.2** — As a user, I want to check `_sync_log.md` in Obsidian to see the history of syncs, so that I can diagnose issues without accessing a terminal.
|
||||||
|
- Given: Multiple sync runs have occurred
|
||||||
|
- When: Opening `_sync_log.md` in Obsidian
|
||||||
|
- Then: A chronological log of pull/push operations, counts of changes, and any errors or conflicts is visible
|
||||||
|
|
||||||
|
**US-8.3** — As a user, I want failed syncs to be visible in Ofelia's output so that I can be alerted.
|
||||||
|
- Given: A pull encounters a conflict
|
||||||
|
- When: Ofelia's job completes
|
||||||
|
- Then: Exit code is non-zero, Ofelia marks the job as failed
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 9 — Deletions (flag-gated)
|
||||||
|
|
||||||
|
**Scope:** Handle deleted files/documents. Off by default.
|
||||||
|
|
||||||
|
#### User Stories
|
||||||
|
|
||||||
|
**US-9.1** — As a user, I want to opt into deletion sync so that documents I delete locally are also removed from Outline.
|
||||||
|
- Given: `enable_deletions: true` in `settings.json`, `OldNote.md` deleted in Obsidian
|
||||||
|
- When: `./sync.sh push`
|
||||||
|
- Then: Outline document is deleted, frontmatter entry removed
|
||||||
|
|
||||||
|
**US-9.2** — As a user, I want deletions to be off by default so that accidental file deletion never cascades to Outline without explicit opt-in.
|
||||||
|
- Given: `enable_deletions: false` (default), `OldNote.md` deleted in Obsidian
|
||||||
|
- When: `./sync.sh push`
|
||||||
|
- Then: Deletion is logged as a warning, Outline document is untouched
|
||||||
|
|
||||||
|
**US-9.3** — As a user, I want `--dry-run` to show pending deletions so that I can review before enabling.
|
||||||
|
- Given: `enable_deletions: true`, 2 files deleted locally
|
||||||
|
- When: `./sync.sh push --dry-run`
|
||||||
|
- Then: Output shows 2 documents marked for deletion, no Outline changes made
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 11. Automated Testing
|
||||||
|
|
||||||
|
All tests run inside Docker (same `python:3.11-slim` + `domnet` network as existing scripts). A dedicated test collection `_sync_test_TIMESTAMP` is created in Outline at test start and deleted at teardown.
|
||||||
|
|
||||||
|
### Test Runner
|
||||||
|
|
||||||
|
```bash
|
||||||
|
./sync_tests.sh # run all tests
|
||||||
|
./sync_tests.sh --phase 1 # run tests for specific phase
|
||||||
|
./sync_tests.sh --keep # don't delete test collection on failure (debug)
|
||||||
|
./sync_tests.sh --verbose # show full API request/response
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.1 Phase 1 Tests — Init
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-1.1 init creates vault directory if not exists
|
||||||
|
TEST-1.2 init creates git repo with outline and main branches
|
||||||
|
TEST-1.3 init exports all Outline collections as top-level folders
|
||||||
|
TEST-1.4 init injects frontmatter into every .md file
|
||||||
|
TEST-1.5 frontmatter contains: outline_id, outline_collection_id, outline_updated_at
|
||||||
|
TEST-1.6 outline_id in frontmatter matches actual Outline document ID (API verified)
|
||||||
|
TEST-1.7 nested documents are in correct subfolder matching hierarchy
|
||||||
|
TEST-1.8 settings.json is listed in .gitignore
|
||||||
|
TEST-1.9 .obsidian/ is listed in .gitignore
|
||||||
|
TEST-1.10 outline branch and main branch are at same commit after init
|
||||||
|
TEST-1.11 re-running init on existing vault aborts with clear error message
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.2 Phase 2 Tests — Pull (Outline → Local)
|
||||||
|
|
||||||
|
**Read direction:**
|
||||||
|
```
|
||||||
|
TEST-2.1 pull: modified Outline document updates local .md content
|
||||||
|
TEST-2.2 pull: modified Outline document updates outline_updated_at in frontmatter
|
||||||
|
TEST-2.3 pull: outline_id in frontmatter unchanged after content update
|
||||||
|
TEST-2.4 pull: new Outline document appears as new .md file in correct folder
|
||||||
|
TEST-2.5 pull: new nested Outline document appears in correct subfolder
|
||||||
|
TEST-2.6 pull: new Outline collection appears as new top-level folder
|
||||||
|
TEST-2.7 pull: deleted Outline document removes local .md file
|
||||||
|
TEST-2.8 pull: deleted Outline collection removes local folder
|
||||||
|
TEST-2.9 pull: unchanged documents are not modified (mtime unchanged)
|
||||||
|
TEST-2.10 pull: git commits to outline branch, not main
|
||||||
|
TEST-2.11 pull: outline branch advances, main branch merges cleanly
|
||||||
|
TEST-2.12 pull: _sync_log.md updated with counts and timestamp
|
||||||
|
```
|
||||||
|
|
||||||
|
**Conflict detection:**
|
||||||
|
```
|
||||||
|
TEST-2.13 pull: same document edited locally + in Outline → conflict detected
|
||||||
|
TEST-2.14 pull: conflict produces *.conflict.md sidecar file
|
||||||
|
TEST-2.15 pull: conflict sidecar contains local diff and remote diff sections
|
||||||
|
TEST-2.16 pull: conflicted file is left untouched (not overwritten)
|
||||||
|
TEST-2.17 pull: non-conflicted files merge successfully despite other conflicts
|
||||||
|
TEST-2.18 pull: --auto mode exits non-zero on conflict
|
||||||
|
TEST-2.19 pull: _sync_log.md records conflict with filename and timestamp
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.3 Phase 3 Tests — Status
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-3.1 status: shows M for locally modified file
|
||||||
|
TEST-3.2 status: shows A for new local file without outline_id
|
||||||
|
TEST-3.3 status: shows D for locally deleted file (even if deletions off)
|
||||||
|
TEST-3.4 status: shows R for renamed/moved file
|
||||||
|
TEST-3.5 status: shows CONFLICT for files with unresolved merge conflicts
|
||||||
|
TEST-3.6 status: shows nothing when vault is clean
|
||||||
|
TEST-3.7 status: exit code 0 in all cases (informational only)
|
||||||
|
TEST-3.8 status: does not modify any files or git state
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.4 Phase 4 Tests — Push Modified + Renamed
|
||||||
|
|
||||||
|
**Update direction:**
|
||||||
|
```
|
||||||
|
TEST-4.1 push: modified local file calls documents.update (not create)
|
||||||
|
TEST-4.2 push: Outline document content matches local file content after push (strip frontmatter)
|
||||||
|
TEST-4.3 push: documents.update preserves document history (version count increases)
|
||||||
|
TEST-4.4 push: outline_updated_at in frontmatter updated to API response timestamp
|
||||||
|
TEST-4.5 push: outline_id unchanged after update
|
||||||
|
TEST-4.6 push: Outline document URL unchanged after update (ID preserved)
|
||||||
|
TEST-4.7 push: renamed file → Outline document title updated
|
||||||
|
TEST-4.8 push: renamed file → Outline document ID unchanged
|
||||||
|
TEST-4.9 push: file moved to different folder → documents.move called
|
||||||
|
TEST-4.10 push: moved file → document appears under new parent in Outline
|
||||||
|
TEST-4.11 push: moved file → document removed from old parent in Outline
|
||||||
|
TEST-4.12 push: file moved to different top-level folder → document moved to different collection
|
||||||
|
TEST-4.13 push: API failure on one file → other files still pushed
|
||||||
|
TEST-4.14 push: failed file → outline branch NOT advanced
|
||||||
|
TEST-4.15 push: failed file → retried on next push
|
||||||
|
TEST-4.16 push: --dry-run → no Outline changes, no git state changes
|
||||||
|
TEST-4.17 push: --dry-run → lists all API calls that would be made
|
||||||
|
TEST-4.18 push: frontmatter stripped from content sent to Outline API
|
||||||
|
TEST-4.19 push: Outline document does NOT contain frontmatter YAML in its body
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.5 Phase 5 Tests — Push New Documents
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-5.1 push: new file in existing collection folder → documents.create called
|
||||||
|
TEST-5.2 push: new document appears in correct Outline collection
|
||||||
|
TEST-5.3 push: outline_id written back to frontmatter after create
|
||||||
|
TEST-5.4 push: outline_collection_id written back to frontmatter after create
|
||||||
|
TEST-5.5 push: new file in subfolder → parentDocumentId set correctly
|
||||||
|
TEST-5.6 push: parent document outline_id used as parentDocumentId
|
||||||
|
TEST-5.7 push: new file in new top-level folder → collection created first
|
||||||
|
TEST-5.8 push: new file in new top-level folder → document created after collection
|
||||||
|
TEST-5.9 push: two new files parent+child → parent created before child (topological order)
|
||||||
|
TEST-5.10 push: three-level new hierarchy → correct creation order preserved
|
||||||
|
TEST-5.11 push: new file with no frontmatter → document still created (title from filename)
|
||||||
|
TEST-5.12 push: created document is published (not draft)
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.6 Phase 6 Tests — Conflict Detection
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-6.1 push blocked when any file contains git conflict markers
|
||||||
|
TEST-6.2 push blocked lists all conflicted files
|
||||||
|
TEST-6.3 push blocked exits non-zero
|
||||||
|
TEST-6.4 non-conflicted files can still be pushed despite other files in conflict
|
||||||
|
TEST-6.5 conflict sidecar .conflict.md contains "LOCAL" and "REMOTE" sections
|
||||||
|
TEST-6.6 conflict sidecar contains instructions for resolution commands
|
||||||
|
TEST-6.7 outline branch not advanced when push is blocked by conflicts
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.7 Phase 7 Tests — Resolve
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-7.1 resolve --accept local → file contains local content
|
||||||
|
TEST-7.2 resolve --accept local → no conflict markers remain
|
||||||
|
TEST-7.3 resolve --accept local → .conflict.md sidecar deleted
|
||||||
|
TEST-7.4 resolve --accept local → file is staged (git add)
|
||||||
|
TEST-7.5 resolve --accept remote → file contains Outline content
|
||||||
|
TEST-7.6 resolve --accept remote → no conflict markers remain
|
||||||
|
TEST-7.7 resolve --accept remote → .conflict.md sidecar deleted
|
||||||
|
TEST-7.8 resolve --accept remote → file is staged (git add)
|
||||||
|
TEST-7.9 resolve on non-conflicted file → error message, no changes
|
||||||
|
TEST-7.10 after resolve, push succeeds for that file
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.8 Phase 8 Tests — Ofelia / Auto mode
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-8.1 --auto flag: no interactive prompts, completes without stdin
|
||||||
|
TEST-8.2 --auto flag: _sync_log.md updated on success
|
||||||
|
TEST-8.3 --auto flag: _sync_log.md updated on conflict
|
||||||
|
TEST-8.4 --auto flag: exit code 0 on clean pull
|
||||||
|
TEST-8.5 --auto flag: exit code non-zero on conflict
|
||||||
|
TEST-8.6 _sync_log.md uses union merge strategy (no conflicts on the log itself)
|
||||||
|
TEST-8.7 _sync_log.md entries are append-only (history preserved)
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.9 Phase 9 Tests — Deletions
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-9.1 enable_deletions false: locally deleted file → no Outline change
|
||||||
|
TEST-9.2 enable_deletions false: warning logged in _sync_log.md
|
||||||
|
TEST-9.3 enable_deletions true: locally deleted file → documents.delete called
|
||||||
|
TEST-9.4 enable_deletions true: Outline document is gone after push
|
||||||
|
TEST-9.5 enable_deletions true: deleted folder → collection deleted after all docs deleted
|
||||||
|
TEST-9.6 enable_deletions true: --dry-run lists pending deletions without executing
|
||||||
|
TEST-9.7 Outline-side deletion during pull: local file removed (regardless of enable_deletions)
|
||||||
|
TEST-9.8 Outline-side deletion during pull: deletion logged in _sync_log.md
|
||||||
|
```
|
||||||
|
|
||||||
|
### 11.10 Full Round-Trip Tests
|
||||||
|
|
||||||
|
These run the complete cycle and verify end-to-end consistency.
|
||||||
|
|
||||||
|
```
|
||||||
|
TEST-RT.1 Create in Outline → pull → verify local file content + frontmatter
|
||||||
|
TEST-RT.2 Create in Obsidian → push → verify Outline document content + ID in frontmatter
|
||||||
|
TEST-RT.3 Edit in Outline → pull → edit locally → push → verify Outline has latest content
|
||||||
|
TEST-RT.4 Edit locally → push → edit in Outline → pull → verify local has Outline content
|
||||||
|
TEST-RT.5 Edit same doc in Outline AND locally → pull → verify conflict detected → resolve local → push → verify Outline updated
|
||||||
|
TEST-RT.6 Edit same doc in Outline AND locally → pull → verify conflict detected → resolve remote → push → verify Outline unchanged, local matches Outline
|
||||||
|
TEST-RT.7 Rename locally → push → pull (clean) → verify no duplicate documents
|
||||||
|
TEST-RT.8 Move locally → push → pull (clean) → verify hierarchy correct on both sides
|
||||||
|
TEST-RT.9 Delete locally (deletions on) → push → verify gone from Outline
|
||||||
|
TEST-RT.10 Delete in Outline → pull → verify gone locally
|
||||||
|
TEST-RT.11 Create parent+child locally → push → verify parent-child relationship in Outline
|
||||||
|
TEST-RT.12 Full CRUD cycle: create → edit → rename → move → delete, verify Outline matches at each step
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 12. What Is NOT Written from Scratch
|
||||||
|
|
||||||
|
| Problem | Solution |
|
||||||
|
|---|---|
|
||||||
|
| Change detection | `git diff` |
|
||||||
|
| Three-way merge | `git merge` |
|
||||||
|
| Conflict markers | git native |
|
||||||
|
| History + rollback | `git log` / `git reset` |
|
||||||
|
| Local auto-commit | Obsidian Git plugin |
|
||||||
|
| API retry + rate limiting | Reuse `OutlineImporter._api_request()` from `outline_import.py` |
|
||||||
|
| Docker network execution | Reuse pattern from `import_to_outline.sh` |
|
||||||
|
| Export logic | Reuse `outline_export_fixed.py` unchanged |
|
||||||
|
|
||||||
|
**New code only:** `outline_sync.py` (~500 lines) + `sync.sh` wrapper (~100 lines).
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 13. Risks
|
||||||
|
|
||||||
|
| Risk | Mitigation |
|
||||||
|
|---|---|
|
||||||
|
| `git merge` fails in unattended cron | Pull conflicts block push only; clean files still merged; logged to `_sync_log.md` |
|
||||||
|
| Frontmatter stripped incorrectly before push | Unit test the strip/restore; verify Outline body does not contain YAML |
|
||||||
|
| `outline_updated_at` clock skew | Use API response timestamp, not local clock; normalize to UTC |
|
||||||
|
| New file in unknown folder (not an Outline collection) | Warn and skip; require top-level folder to match existing collection or be brand new |
|
||||||
|
| Outline deletes document between pull and push | `documents.update` returns 404 → log as conflict, skip |
|
||||||
|
| Obsidian Git plugin commits during sync | Sync script checks for clean working tree before starting; aborts if dirty |
|
||||||
|
| `outline` branch diverges from `main` after long conflict | `sync status` shows divergence; `sync pull` always catches up |
|
||||||
484
WEBUI_PRD.md
Normal file
484
WEBUI_PRD.md
Normal file
@@ -0,0 +1,484 @@
|
|||||||
|
# PRD: Outline Sync Web UI
|
||||||
|
**Version:** 2.0
|
||||||
|
**Date:** 2026-03-06
|
||||||
|
**Status:** Draft — WebDAV Architecture
|
||||||
|
**Depends on:** SYNC_PRD.md (sync engine)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 1. Problem Statement
|
||||||
|
|
||||||
|
The sync engine (`sync.sh`) works but requires terminal access on the VPS. The user edits notes locally in Obsidian on their own machine. Changes need to flow bidirectionally between local Obsidian and the remote Outline wiki without terminal interaction.
|
||||||
|
|
||||||
|
The key constraint: **Obsidian runs locally, the vault git repo lives on the VPS.**
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 2. Architecture Options Evaluated
|
||||||
|
|
||||||
|
| Option | Mechanism | Verdict |
|
||||||
|
|---|---|---|
|
||||||
|
| **WebDAV + Remotely Save** | WebDAV server on VPS serves vault dir; Obsidian plugin syncs automatically | ✅ Recommended |
|
||||||
|
| **Self-hosted LiveSync** | CouchDB on VPS; Obsidian plugin syncs in real-time | ❌ Adds CouchDB; no direct file access for sync engine |
|
||||||
|
| **Local REST API** | Obsidian exposes REST on localhost:27124 | ❌ VPS can't reach local machine |
|
||||||
|
| **Manual zip download/upload** | Browser download/upload of vault archives | ❌ Eliminated by WebDAV option |
|
||||||
|
|
||||||
|
**Decision: WebDAV.** A WebDAV Docker container on the VPS serves the vault directory directly. The Obsidian plugin `remotely-save` syncs to it automatically. No tunnel, no extra database, no build step.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 3. System Architecture
|
||||||
|
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────┐
|
||||||
|
│ Your local machine │
|
||||||
|
│ ┌─────────────────────────────────────────────────┐ │
|
||||||
|
│ │ Obsidian │ │
|
||||||
|
│ │ Plugin: remotely-save │ │
|
||||||
|
│ │ Auto-sync every N minutes (or on open/close) │ │
|
||||||
|
│ └──────────────────────┬──────────────────────────┘ │
|
||||||
|
└─────────────────────────│───────────────────────────────┘
|
||||||
|
│ WebDAV over Tailscale (encrypted tunnel)
|
||||||
|
│ (basic auth + Tailscale network isolation)
|
||||||
|
┌─────────────────────────▼───────────────────────────────┐
|
||||||
|
│ VPS (domverse.de) │
|
||||||
|
│ │
|
||||||
|
│ ┌──────────────────────────────────────────────────┐ │
|
||||||
|
│ │ WebDAV container (obsidian-webdav) │ │
|
||||||
|
│ │ Serves: /outline-vault/ (read-write) │ │
|
||||||
|
│ │ URL: http://100.x.x.x (Tailscale only) │ │
|
||||||
|
│ └────────────────────┬─────────────────────────────┘ │
|
||||||
|
│ │ shared volume │
|
||||||
|
│ ┌────────────────────▼─────────────────────────────┐ │
|
||||||
|
│ │ /outline-vault/ (git repo) │ │
|
||||||
|
│ │ ├── .git/ │ │
|
||||||
|
│ │ │ ├── outline branch (last Outline state) │ │
|
||||||
|
│ │ │ └── main branch (current vault state) │ │
|
||||||
|
│ │ ├── Bewerbungen/ │ │
|
||||||
|
│ │ ├── Projekte/ │ │
|
||||||
|
│ │ └── _sync_log.md │ │
|
||||||
|
│ └────────────────────┬─────────────────────────────┘ │
|
||||||
|
│ │ shared volume │
|
||||||
|
│ ┌────────────────────▼─────────────────────────────┐ │
|
||||||
|
│ │ outline-sync-ui (FastAPI, port 8080) │ │
|
||||||
|
│ │ URL: https://sync.domverse.de/ │ │
|
||||||
|
│ │ Auth: Authentik │ │
|
||||||
|
│ └────────────────────┬─────────────────────────────┘ │
|
||||||
|
│ │ │
|
||||||
|
│ ┌────────────────────▼─────────────────────────────┐ │
|
||||||
|
│ │ outline_sync.py (sync engine) │ │
|
||||||
|
│ │ network: domnet │ │
|
||||||
|
│ │ API: http://outline:3000 │ │
|
||||||
|
│ └──────────────────────────────────────────────────┘ │
|
||||||
|
└─────────────────────────────────────────────────────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
### Data Flow: Obsidian → Outline
|
||||||
|
|
||||||
|
1. User edits notes locally in Obsidian
|
||||||
|
2. `remotely-save` plugin syncs changed files to WebDAV server on VPS (automatic)
|
||||||
|
3. WebDAV writes files directly into `/outline-vault/`
|
||||||
|
4. User clicks "Send to Outline" in web UI (or cron runs push)
|
||||||
|
5. Sync engine: `git diff outline..main` → calls Outline API per changed file
|
||||||
|
6. Sync engine advances `outline` branch
|
||||||
|
|
||||||
|
### Data Flow: Outline → Obsidian
|
||||||
|
|
||||||
|
1. User clicks "Get from Outline" in web UI (or cron runs pull)
|
||||||
|
2. Sync engine exports Outline → commits to `outline` branch → merges into `main`
|
||||||
|
3. Files updated in `/outline-vault/`
|
||||||
|
4. WebDAV immediately serves updated files
|
||||||
|
5. `remotely-save` plugin picks up changes on next sync interval
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 4. User-Facing Mental Model
|
||||||
|
|
||||||
|
Two systems, one bridge:
|
||||||
|
|
||||||
|
| What the user sees | What actually happens |
|
||||||
|
|---|---|
|
||||||
|
| Obsidian auto-syncs | `remotely-save` talks WebDAV to VPS |
|
||||||
|
| "Get from Outline" | sync engine pulls Outline API → git merge → files update in vault |
|
||||||
|
| "Send to Outline" | sync engine diffs vault → calls Outline API |
|
||||||
|
| "Version conflict" | `git merge` produced conflict markers |
|
||||||
|
| "Keep mine / Keep Outline's" | `sync resolve --accept local/remote` |
|
||||||
|
|
||||||
|
Git is invisible. WebDAV is invisible. Obsidian just sees files that stay in sync.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 5. Goals
|
||||||
|
|
||||||
|
| # | Goal |
|
||||||
|
|---|------|
|
||||||
|
| G1 | Obsidian syncs to VPS automatically — no manual file transfer |
|
||||||
|
| G2 | Pulling from Outline and pushing to Outline are single button clicks in a browser |
|
||||||
|
| G3 | Live output visible while sync runs |
|
||||||
|
| G4 | Conflicts resolvable in browser with side-by-side diff |
|
||||||
|
| G5 | New files created in Obsidian appear in Outline after push |
|
||||||
|
| G6 | New documents created in Outline appear in Obsidian after next WebDAV sync |
|
||||||
|
| G7 | Web UI protected by Authentik SSO |
|
||||||
|
| G8 | WebDAV endpoint protected by authentication |
|
||||||
|
| G9 | Zero terminal interaction for normal workflow |
|
||||||
|
|
||||||
|
## 6. Non-Goals
|
||||||
|
|
||||||
|
| # | Non-Goal | Reason |
|
||||||
|
|---|----------|--------|
|
||||||
|
| NG1 | In-browser markdown editor | Obsidian is the editor |
|
||||||
|
| NG2 | Real-time Outline → Obsidian sync | WebDAV poll interval (e.g. 5 min) is sufficient |
|
||||||
|
| NG3 | Syncing `.obsidian/` config, templates, daily notes | Outline-synced content only |
|
||||||
|
| NG4 | Multi-user | Single-user system |
|
||||||
|
| NG5 | Mobile Obsidian support | Remotely Save does support mobile, but not a primary target |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 6. Component Inventory
|
||||||
|
|
||||||
|
### 6.1 WebDAV Server (new container)
|
||||||
|
|
||||||
|
Simple nginx-WebDAV container. Serves `/outline-vault/` as a WebDAV share.
|
||||||
|
**Not exposed via Traefik.** Bound exclusively to the VPS Tailscale interface — invisible from the public internet.
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
obsidian-webdav:
|
||||||
|
image: dgraziotin/nginx-webdav-nononsense:latest
|
||||||
|
container_name: obsidian-webdav
|
||||||
|
networks:
|
||||||
|
- domnet
|
||||||
|
ports:
|
||||||
|
- "100.x.x.x:80:80" # Tailscale IP only — replace with: tailscale ip -4
|
||||||
|
volumes:
|
||||||
|
- /home/crabix/docker_authentik/outline-vault:/data
|
||||||
|
environment:
|
||||||
|
- WEBDAV_USERNAME=obsidian
|
||||||
|
- WEBDAV_PASSWORD=${WEBDAV_PASSWORD} # from .env
|
||||||
|
restart: unless-stopped
|
||||||
|
# No Traefik labels — not publicly routed
|
||||||
|
```
|
||||||
|
|
||||||
|
**Auth note:** WebDAV cannot use Authentik forward auth (Obsidian plugin can't handle SSO redirect). Network isolation via Tailscale is the primary security layer — the endpoint is unreachable from the public internet. Basic auth (htpasswd) via nginx provides a second layer. Tailscale encrypts the tunnel, so plain HTTP is safe for this leg.
|
||||||
|
|
||||||
|
### 6.2 Web UI (new container)
|
||||||
|
|
||||||
|
FastAPI + HTMX. Control plane for sync operations.
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
outline-sync-ui:
|
||||||
|
image: python:3.11-slim
|
||||||
|
container_name: outline-sync-ui
|
||||||
|
networks:
|
||||||
|
- domnet
|
||||||
|
volumes:
|
||||||
|
- /home/crabix/docker_authentik/outline-tools:/app:ro # scripts, read-only
|
||||||
|
- /home/crabix/docker_authentik/outline-vault:/vault # vault, read-write
|
||||||
|
working_dir: /app
|
||||||
|
command: >
|
||||||
|
bash -c "pip install -qqq fastapi uvicorn requests &&
|
||||||
|
uvicorn webui:app --host 0.0.0.0 --port 8080"
|
||||||
|
restart: unless-stopped
|
||||||
|
labels:
|
||||||
|
- "traefik.enable=true"
|
||||||
|
- "traefik.http.routers.outline-sync.rule=Host(`sync.domverse.de`)"
|
||||||
|
- "traefik.http.routers.outline-sync.entrypoints=https"
|
||||||
|
- "traefik.http.routers.outline-sync.tls=true"
|
||||||
|
- "traefik.http.routers.outline-sync.middlewares=secHeaders@file,middlewares-authentik-new@file"
|
||||||
|
- "traefik.http.services.outline-sync.loadbalancer.server.port=8080"
|
||||||
|
```
|
||||||
|
|
||||||
|
### 6.3 Obsidian Setup (one-time, on local machine)
|
||||||
|
|
||||||
|
1. Install plugin `remotely-save` from Obsidian community plugins
|
||||||
|
2. Ensure Tailscale is running on the local machine and connected to the VPS
|
||||||
|
3. Configure:
|
||||||
|
- **Remote service:** WebDAV
|
||||||
|
- **Server URL:** `http://100.x.x.x` (VPS Tailscale IP — plain HTTP, tunnel is encrypted)
|
||||||
|
- **Username / Password:** WebDAV credentials
|
||||||
|
- **Sync on startup:** yes
|
||||||
|
- **Sync interval:** every 5 minutes (or on vault open/close)
|
||||||
|
- **Sync direction:** bidirectional (default)
|
||||||
|
4. Exclude `.git/` from sync (configure in plugin's ignore list)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 7. New File Handling
|
||||||
|
|
||||||
|
This is the key correctness concern: when the user creates a new `.md` file in Obsidian, it must reach Outline correctly.
|
||||||
|
|
||||||
|
### Flow for new files
|
||||||
|
|
||||||
|
1. User creates `Projekte/NewNote.md` in Obsidian (no frontmatter)
|
||||||
|
2. `remotely-save` syncs it to VPS via WebDAV → file appears in `/outline-vault/Projekte/NewNote.md`
|
||||||
|
3. User clicks "Send to Outline" in web UI
|
||||||
|
4. Sync engine: `git diff outline..main` shows `A Projekte/NewNote.md`
|
||||||
|
5. Sync engine determines parent: `Projekte/` folder exists → look up `Projekte.md` for its `outline_id` → use as `parentDocumentId`
|
||||||
|
6. `documents.create` called → new document created in Outline under correct collection/parent
|
||||||
|
7. `outline_id` written back into frontmatter of `NewNote.md`
|
||||||
|
8. File committed → WebDAV serves updated file with frontmatter → Obsidian picks it up on next sync
|
||||||
|
|
||||||
|
### New file in new folder
|
||||||
|
|
||||||
|
1. User creates `NewCollection/FirstDoc.md`
|
||||||
|
2. Sync engine: top-level folder `NewCollection/` not in any known collection → `collections.create("NewCollection")`
|
||||||
|
3. `FirstDoc` created inside new collection
|
||||||
|
4. Frontmatter written back to both files
|
||||||
|
|
||||||
|
### Ordering guarantee
|
||||||
|
|
||||||
|
Sync engine creates documents in topological order (parents before children), regardless of which order files were synced via WebDAV.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 8. Conflict Model
|
||||||
|
|
||||||
|
A conflict occurs when:
|
||||||
|
- User edited `File.md` in Obsidian (synced via WebDAV to `main` branch)
|
||||||
|
- Someone also edited the same document in Outline
|
||||||
|
- `sync pull` → `git merge outline` → conflict markers in `File.md`
|
||||||
|
|
||||||
|
After conflict:
|
||||||
|
- `File.md` contains `<<<<<<<` markers → WebDAV immediately serves this broken file → Obsidian shows it with markers
|
||||||
|
- Conflict is resolved in the **web UI** (not Obsidian)
|
||||||
|
- After resolution, WebDAV serves the clean file → Obsidian's next sync picks it up
|
||||||
|
|
||||||
|
**This is the key reason the web UI exists:** conflict resolution in a browser with a diff view is far better than editing git conflict markers in a text editor.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 9. Web UI Screens
|
||||||
|
|
||||||
|
### 9.1 Dashboard (`/`)
|
||||||
|
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────┐
|
||||||
|
│ Outline Sync sync.domverse.de │
|
||||||
|
├─────────────────────────────────────────────────────────┤
|
||||||
|
│ │
|
||||||
|
│ Vault status ● Clean │
|
||||||
|
│ Last pull 2026-03-06 14:32 (from Outline) │
|
||||||
|
│ Last push 2026-03-05 09:10 (to Outline) │
|
||||||
|
│ Pending local 5 changes (from Obsidian via WebDAV) │
|
||||||
|
│ │
|
||||||
|
│ ┌───────────────────┐ ┌───────────────────┐ │
|
||||||
|
│ │ Get from Outline │ │ Send to Outline │ │
|
||||||
|
│ │ (pull) │ │ (5 pending) │ │
|
||||||
|
│ └───────────────────┘ └───────────────────┘ │
|
||||||
|
│ │
|
||||||
|
│ ⚠ 2 version conflicts need resolution [Resolve →] │
|
||||||
|
│ │
|
||||||
|
└─────────────────────────────────────────────────────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
"Pending local" count = `git diff outline..main --name-only | wc -l`
|
||||||
|
(these are files Obsidian wrote via WebDAV, not yet pushed to Outline)
|
||||||
|
|
||||||
|
### 9.2 Live Sync Output (inline SSE)
|
||||||
|
|
||||||
|
Triggered by either button. Live output streamed line by line via Server-Sent Events. HTMX replaces the button area with output panel.
|
||||||
|
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────┐
|
||||||
|
│ Sending to Outline... │
|
||||||
|
│ ───────────────────────────────────────────────────── │
|
||||||
|
│ ✓ 5 local changes detected │
|
||||||
|
│ ✓ Projekte/NewNote.md → created (id: 4f2a...) │
|
||||||
|
│ ✓ Bewerbungen/CV.md → updated │
|
||||||
|
│ ✓ Infra/HomeLab.md → updated │
|
||||||
|
│ ✓ Infra/OldDoc.md → deleted (deletions=off, skip) │
|
||||||
|
│ ✓ Projekte/Renamed.md → title updated │
|
||||||
|
│ ───────────────────────────────────────────────────── │
|
||||||
|
│ Done. 3 updated, 1 created, 1 skipped. │
|
||||||
|
│ [Back to Dashboard] │
|
||||||
|
└─────────────────────────────────────────────────────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
### 9.3 Pending Changes (`/changes`)
|
||||||
|
|
||||||
|
Before pushing, shows what will happen.
|
||||||
|
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────┐
|
||||||
|
│ Pending changes (5) [Send to Outline] │
|
||||||
|
├─────────────────────────────────────────────────────────┤
|
||||||
|
│ │
|
||||||
|
│ Modified │
|
||||||
|
│ ● Bewerbungen/CV.md [preview diff] │
|
||||||
|
│ ● Infra/HomeLab.md [preview diff] │
|
||||||
|
│ │
|
||||||
|
│ New — will be created in Outline │
|
||||||
|
│ + Projekte/NewNote.md │
|
||||||
|
│ + NewCollection/FirstDoc.md (new collection too) │
|
||||||
|
│ │
|
||||||
|
│ Renamed │
|
||||||
|
│ → Projekte/OldName.md → Projekte/NewName.md │
|
||||||
|
│ │
|
||||||
|
│ Deleted (skipped — deletions are off in settings) │
|
||||||
|
│ ✗ Infra/OldDoc.md │
|
||||||
|
│ │
|
||||||
|
└─────────────────────────────────────────────────────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
### 9.4 Conflict Resolution (`/conflicts`)
|
||||||
|
|
||||||
|
One card per conflict. Expand for side-by-side diff.
|
||||||
|
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────┐
|
||||||
|
│ Version conflicts (2) │
|
||||||
|
│ Same document was edited in Obsidian and in Outline. │
|
||||||
|
├─────────────────────────────────────────────────────────┤
|
||||||
|
│ │
|
||||||
|
│ ┌───────────────────────────────────────────────────┐ │
|
||||||
|
│ │ Bewerbungen/CV.md │ │
|
||||||
|
│ │ Your edit (via Obsidian): 2026-03-06 09:14 │ │
|
||||||
|
│ │ Outline's edit: 2026-03-06 11:03 │ │
|
||||||
|
│ │ │ │
|
||||||
|
│ │ ┌─────────────────┬─────────────────────────┐ │ │
|
||||||
|
│ │ │ Your version │ Outline's version │ │ │
|
||||||
|
│ │ │─────────────────│─────────────────────────│ │ │
|
||||||
|
│ │ │ # CV │ # CV │ │ │
|
||||||
|
│ │ │ │ │ │ │
|
||||||
|
│ │ │[+ New section] │ │ │ │
|
||||||
|
│ │ │ │[+ Contact info updated] │ │ │
|
||||||
|
│ │ └─────────────────┴─────────────────────────┘ │ │
|
||||||
|
│ │ │ │
|
||||||
|
│ │ [Keep mine] [Keep Outline's] │ │
|
||||||
|
│ └───────────────────────────────────────────────────┘ │
|
||||||
|
│ │
|
||||||
|
└─────────────────────────────────────────────────────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
After resolving, card collapses. If all resolved, redirect to dashboard showing "Push now available".
|
||||||
|
|
||||||
|
### 9.5 Sync History (`/history`)
|
||||||
|
|
||||||
|
`_sync_log.md` rendered as a table, most recent first.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 10. API Endpoints (internal)
|
||||||
|
|
||||||
|
| Method | Path | Purpose |
|
||||||
|
|---|---|---|
|
||||||
|
| `GET` | `/` | Dashboard |
|
||||||
|
| `GET` | `/status` | JSON vault state (counts, last sync, conflicts) |
|
||||||
|
| `GET` | `/changes` | Pending changes list |
|
||||||
|
| `GET` | `/conflicts` | Conflict list |
|
||||||
|
| `GET` | `/history` | Sync log view |
|
||||||
|
| `POST` | `/pull` | Start pull; streams SSE |
|
||||||
|
| `POST` | `/push` | Start push; streams SSE |
|
||||||
|
| `GET` | `/stream/{job_id}` | SSE stream for running job |
|
||||||
|
| `POST` | `/resolve` | Body: `{file, accept: local|remote}` |
|
||||||
|
| `GET` | `/diff/{encoded_path}` | HTML fragment: side-by-side diff |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 11. Security
|
||||||
|
|
||||||
|
### WebDAV endpoint
|
||||||
|
- **Network isolation:** bound to Tailscale interface only — not reachable from public internet
|
||||||
|
- **Basic auth** (username + password from `.env`) — second layer of protection
|
||||||
|
- **Encryption:** Tailscale encrypts the tunnel end-to-end; plain HTTP between Obsidian and VPS is safe
|
||||||
|
- No Authentik forward auth — Obsidian plugin can't handle SSO redirects; Tailscale isolation makes this a non-issue
|
||||||
|
- No `vault.domverse.de` subdomain — no DNS exposure, no Traefik involvement
|
||||||
|
|
||||||
|
### Web UI
|
||||||
|
- Authentik forward auth via Traefik
|
||||||
|
- No additional app-level auth needed
|
||||||
|
- Token in `settings.json` never exposed to browser
|
||||||
|
|
||||||
|
### Subprocess safety
|
||||||
|
- Sync commands invoked with fixed argument lists
|
||||||
|
- File paths from `/resolve` validated against known conflict list before shell use
|
||||||
|
- `outline-tools/` mounted read-only in UI container
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 12. Implementation Phases
|
||||||
|
|
||||||
|
### Phase A — WebDAV Container (infrastructure)
|
||||||
|
Deploy `obsidian-webdav` container bound to the VPS Tailscale IP (`tailscale ip -4`). No Traefik config needed.
|
||||||
|
Test WebDAV access from local machine via Tailscale using a WebDAV client (e.g. `curl --user obsidian:$PASS http://100.x.x.x/`).
|
||||||
|
Configure `remotely-save` plugin in Obsidian with the Tailscale URL. Verify files sync bidirectionally.
|
||||||
|
|
||||||
|
**Done when:** editing a file in Obsidian and running sync → file appears on VPS.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase B — Read-Only Dashboard
|
||||||
|
`webui.py`: FastAPI app. `GET /status` reads git status, parses into JSON. Dashboard template shows status badge, pending count, last sync times. No write operations yet.
|
||||||
|
|
||||||
|
**Done when:** `https://sync.domverse.de` shows current vault state.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase C — Pull with Live Output
|
||||||
|
`POST /pull` spawns `outline_sync.py pull` as async subprocess. `GET /stream/{job_id}` streams stdout via SSE. HTMX wires button → POST → SSE panel → auto-refresh dashboard.
|
||||||
|
|
||||||
|
**Done when:** "Get from Outline" button works with live output; new Outline docs appear in Obsidian on next WebDAV sync.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase D — Pending Changes View
|
||||||
|
`GET /changes`: parses `git diff outline..main --name-status` into structured list. Shows modified / new / renamed / deleted with explanatory labels. Inline diff preview for modified files.
|
||||||
|
|
||||||
|
**Done when:** changes page accurately shows what Obsidian wrote via WebDAV.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase E — Push with Live Output
|
||||||
|
`POST /push` spawns `outline_sync.py push`. Same SSE pattern. Button disabled if conflicts exist. New-file creation flow (frontmatter written back → WebDAV serves updated file → Obsidian picks up IDs).
|
||||||
|
|
||||||
|
**Done when:** new Obsidian files appear in Outline; modified files update; frontmatter IDs land back in Obsidian vault.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase F — Conflict Resolution
|
||||||
|
`GET /conflicts`: lists conflict files. `GET /diff/{path}`: renders side-by-side HTML diff using `difflib`. `POST /resolve`: calls `sync resolve`. Card UX with per-conflict actions.
|
||||||
|
|
||||||
|
**Done when:** all conflicts resolvable via browser; resolved files served cleanly by WebDAV to Obsidian.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase G — History View
|
||||||
|
Render `_sync_log.md` as reverse-chronological HTML table.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 13. Risks
|
||||||
|
|
||||||
|
| Risk | Mitigation |
|
||||||
|
|---|---|
|
||||||
|
| WebDAV sync and web UI push run simultaneously | Job lock in web UI: only one sync job at a time |
|
||||||
|
| `remotely-save` overwrites conflict markers in Obsidian | Pull always creates `.conflict.md` sidecar; original file left clean (conflict is in sidecar, not the working file) |
|
||||||
|
| WebDAV serving `.git/` directory | Configure nginx WebDAV to deny access to `.git/` path |
|
||||||
|
| New file created in Obsidian without a matching collection folder | Sync engine warns and skips; status page shows "unknown collection" |
|
||||||
|
| User edits `_sync_log.md` in Obsidian | `.gitattributes` union merge prevents conflicts; worst case: duplicate log entries |
|
||||||
|
| WebDAV password brute force | Endpoint bound to Tailscale interface only — not reachable from public internet |
|
||||||
|
| `remotely-save` syncs `.obsidian/` config files | Configure plugin ignore list: `.obsidian/`, `*.conflict.md` |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 14. Open Questions
|
||||||
|
|
||||||
|
1. **Push trigger:** Manual-only (button) or also auto-trigger after WebDAV sync completes? Manual is safer; auto-push requires detecting that `remotely-save` finished syncing (no event available).
|
||||||
|
|
||||||
|
2. **WebDAV auth:** Basic auth (htpasswd) with Tailscale network isolation. Tailscale is the primary security boundary; basic auth is a fallback. Acceptable for single-user. ✅ Resolved.
|
||||||
|
|
||||||
|
3. **Cron pull:** Ofelia can run `sync pull --auto` hourly so Outline changes appear in Obsidian without clicking "Get from Outline". Recommended to enable.
|
||||||
|
|
||||||
|
4. **`.git/` in WebDAV:** The vault root contains `.git/`. The WebDAV server must not serve it (security + Obsidian confusion). Verify nginx config denies `/.git/` access.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 15. Summary
|
||||||
|
|
||||||
|
| Component | Technology | Purpose |
|
||||||
|
|---|---|---|
|
||||||
|
| `100.x.x.x` (Tailscale) | nginx-WebDAV container | Obsidian ↔ VPS file sync (VPN-only) |
|
||||||
|
| `sync.domverse.de` | FastAPI + HTMX | Control plane: pull/push/resolve |
|
||||||
|
| `/outline-vault/` | git repo (shared volume) | Merge layer + history |
|
||||||
|
| `remotely-save` | Obsidian plugin | Local → WebDAV sync |
|
||||||
|
| `outline_sync.py` | existing sync engine | WebDAV vault ↔ Outline API |
|
||||||
|
| Ofelia | existing cron scheduler | Scheduled pull from Outline |
|
||||||
20
docker-compose.yml
Normal file
20
docker-compose.yml
Normal file
@@ -0,0 +1,20 @@
|
|||||||
|
services:
|
||||||
|
outline-sync-ui:
|
||||||
|
build: .
|
||||||
|
container_name: outline-sync-ui
|
||||||
|
restart: unless-stopped
|
||||||
|
networks:
|
||||||
|
- domnet
|
||||||
|
ports:
|
||||||
|
- "8181:8080"
|
||||||
|
labels:
|
||||||
|
- "traefik.enable=true"
|
||||||
|
- "traefik.http.routers.outline-sync.rule=Host(`sync.domverse.de`)"
|
||||||
|
- "traefik.http.routers.outline-sync.entrypoints=https"
|
||||||
|
- "traefik.http.routers.outline-sync.tls=true"
|
||||||
|
- "traefik.http.routers.outline-sync.middlewares=secHeaders@file,middlewares-authentik-new@file"
|
||||||
|
- "traefik.http.services.outline-sync.loadbalancer.server.port=8080"
|
||||||
|
|
||||||
|
networks:
|
||||||
|
domnet:
|
||||||
|
external: true
|
||||||
@@ -143,6 +143,6 @@ docker run --rm --network domnet \
|
|||||||
-v "$WORK_DIR:/work" \
|
-v "$WORK_DIR:/work" \
|
||||||
-w /work \
|
-w /work \
|
||||||
python:3.11-slim \
|
python:3.11-slim \
|
||||||
bash -c "pip install -qqq requests 2>/dev/null && python3 outline_import.py $CLI_ARGS"
|
bash -c "pip install -qqq requests tqdm 2>/dev/null && python3 outline_import.py $CLI_ARGS"
|
||||||
|
|
||||||
echo ""
|
echo ""
|
||||||
|
|||||||
@@ -30,14 +30,24 @@ from typing import Dict, List, Optional, Tuple
|
|||||||
import requests
|
import requests
|
||||||
from requests.adapters import HTTPAdapter
|
from requests.adapters import HTTPAdapter
|
||||||
from urllib3.util.retry import Retry
|
from urllib3.util.retry import Retry
|
||||||
|
from tqdm import tqdm
|
||||||
|
|
||||||
|
# Configure logging with tqdm-compatible handler
|
||||||
|
class TqdmLoggingHandler(logging.Handler):
|
||||||
|
"""Logging handler that uses tqdm.write() to avoid breaking progress bars."""
|
||||||
|
def emit(self, record):
|
||||||
|
try:
|
||||||
|
msg = self.format(record)
|
||||||
|
tqdm.write(msg, file=sys.stdout)
|
||||||
|
except Exception:
|
||||||
|
self.handleError(record)
|
||||||
|
|
||||||
# Configure logging
|
|
||||||
logging.basicConfig(
|
|
||||||
level=logging.INFO,
|
|
||||||
format='%(asctime)s | %(levelname)-8s | %(message)s',
|
|
||||||
datefmt='%H:%M:%S'
|
|
||||||
)
|
|
||||||
logger = logging.getLogger('outline_import')
|
logger = logging.getLogger('outline_import')
|
||||||
|
logger.setLevel(logging.INFO)
|
||||||
|
handler = TqdmLoggingHandler()
|
||||||
|
handler.setFormatter(logging.Formatter('%(asctime)s | %(levelname)-8s | %(message)s', datefmt='%H:%M:%S'))
|
||||||
|
logger.addHandler(handler)
|
||||||
|
logger.propagate = False
|
||||||
|
|
||||||
|
|
||||||
class TreePrinter:
|
class TreePrinter:
|
||||||
@@ -132,6 +142,17 @@ class OutlineImporter:
|
|||||||
# Track existing collections
|
# Track existing collections
|
||||||
self.existing_collections: Dict[str, str] = {} # name -> id
|
self.existing_collections: Dict[str, str] = {} # name -> id
|
||||||
|
|
||||||
|
# Track imported collection for verification
|
||||||
|
self.imported_collection_id: Optional[str] = None
|
||||||
|
self.imported_collection_name: Optional[str] = None
|
||||||
|
|
||||||
|
# Adaptive rate limiting
|
||||||
|
self.current_rate_delay = rate_limit_delay
|
||||||
|
self.min_rate_delay = 0.2
|
||||||
|
self.max_rate_delay = 5.0
|
||||||
|
self.rate_increase_factor = 1.5 # Increase delay by 50% on 429
|
||||||
|
self.rate_decrease_factor = 0.9 # Decrease delay by 10% on success
|
||||||
|
|
||||||
# Statistics
|
# Statistics
|
||||||
self.stats = {
|
self.stats = {
|
||||||
"collections_created": 0,
|
"collections_created": 0,
|
||||||
@@ -145,23 +166,103 @@ class OutlineImporter:
|
|||||||
# Error tracking
|
# Error tracking
|
||||||
self.errors: List[Dict] = []
|
self.errors: List[Dict] = []
|
||||||
|
|
||||||
|
# Source metadata for verification
|
||||||
|
self.source_docs: List[Dict] = []
|
||||||
|
|
||||||
|
# Progress bar (initialized in import_all)
|
||||||
|
self.pbar: Optional[tqdm] = None
|
||||||
|
|
||||||
|
def _output(self, message: str, end: str = "\n") -> None:
|
||||||
|
"""Output message, using tqdm.write() if progress bar is active."""
|
||||||
|
if self.pbar is not None:
|
||||||
|
# tqdm.write() always adds newline, so we handle end="" specially
|
||||||
|
if end == "":
|
||||||
|
# For inline messages, update the progress bar description instead
|
||||||
|
self.pbar.set_description(message.strip())
|
||||||
|
else:
|
||||||
|
tqdm.write(message, file=sys.stdout)
|
||||||
|
else:
|
||||||
|
print(message, end=end, flush=True)
|
||||||
|
|
||||||
|
def _update_progress(self) -> None:
|
||||||
|
"""Update progress bar by 1."""
|
||||||
|
if self.pbar is not None:
|
||||||
|
self.pbar.update(1)
|
||||||
|
|
||||||
|
def _count_all_documents(self, source_collections: List[Path]) -> int:
|
||||||
|
"""Count total documents across all collections."""
|
||||||
|
total = 0
|
||||||
|
for collection_dir in source_collections:
|
||||||
|
metadata = self.load_collection_metadata(collection_dir)
|
||||||
|
if metadata:
|
||||||
|
total += metadata.get("expected_count", 0)
|
||||||
|
# In single mode, we also create a parent doc for each collection
|
||||||
|
if self.single_mode:
|
||||||
|
total += 1
|
||||||
|
return total
|
||||||
|
|
||||||
|
def _parse_retry_after(self, response: requests.Response) -> Optional[float]:
|
||||||
|
"""
|
||||||
|
Parse Retry-After header from response.
|
||||||
|
|
||||||
|
The header can be in two formats:
|
||||||
|
1. Seconds (integer): "Retry-After: 120"
|
||||||
|
2. HTTP date: "Retry-After: Wed, 21 Oct 2015 07:28:00 GMT"
|
||||||
|
|
||||||
|
Args:
|
||||||
|
response: HTTP response object
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Number of seconds to wait, or None if header not present/parseable
|
||||||
|
"""
|
||||||
|
retry_after = response.headers.get("Retry-After")
|
||||||
|
if not retry_after:
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Try parsing as number (seconds) - handles both int and float
|
||||||
|
try:
|
||||||
|
seconds = float(retry_after)
|
||||||
|
logger.debug(f"Retry-After header: {seconds:.1f} seconds")
|
||||||
|
return seconds
|
||||||
|
except ValueError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Try parsing as HTTP date
|
||||||
|
try:
|
||||||
|
from email.utils import parsedate_to_datetime
|
||||||
|
retry_date = parsedate_to_datetime(retry_after)
|
||||||
|
now = datetime.now(retry_date.tzinfo)
|
||||||
|
delta = (retry_date - now).total_seconds()
|
||||||
|
# Ensure positive wait time, minimum 1 second
|
||||||
|
wait_seconds = max(1.0, delta)
|
||||||
|
logger.debug(f"Retry-After header (HTTP date): wait {wait_seconds:.1f}s until {retry_after}")
|
||||||
|
return wait_seconds
|
||||||
|
except (ValueError, TypeError) as e:
|
||||||
|
logger.warning(f"Could not parse Retry-After header '{retry_after}': {e}")
|
||||||
|
return None
|
||||||
|
|
||||||
def _api_request(
|
def _api_request(
|
||||||
self,
|
self,
|
||||||
endpoint: str,
|
endpoint: str,
|
||||||
data: Optional[Dict] = None,
|
data: Optional[Dict] = None,
|
||||||
method: str = "POST"
|
method: str = "POST",
|
||||||
|
is_write: bool = False,
|
||||||
|
context: str = None
|
||||||
) -> Optional[Dict]:
|
) -> Optional[Dict]:
|
||||||
"""
|
"""
|
||||||
Make API request with error handling and retry logic.
|
Make API request with error handling, retry logic, and adaptive rate limiting.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
endpoint: API endpoint path (e.g., '/api/collections.list')
|
endpoint: API endpoint path (e.g., '/api/collections.list')
|
||||||
data: Request body data
|
data: Request body data
|
||||||
method: HTTP method (POST or GET)
|
method: HTTP method (POST or GET)
|
||||||
|
is_write: Whether this is a write operation (applies rate limiting)
|
||||||
|
context: Optional context string for logging (e.g., document title)
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Response data dict or None on failure
|
Response data dict or None on failure
|
||||||
"""
|
"""
|
||||||
|
log_context = f" [{context}]" if context else ""
|
||||||
url = f"{self.base_url}{endpoint}"
|
url = f"{self.base_url}{endpoint}"
|
||||||
|
|
||||||
for attempt in range(self.retry_attempts):
|
for attempt in range(self.retry_attempts):
|
||||||
@@ -181,19 +282,78 @@ class OutlineImporter:
|
|||||||
)
|
)
|
||||||
|
|
||||||
if response.status_code == 200:
|
if response.status_code == 200:
|
||||||
return response.json()
|
# Success - gradually decrease rate delay for write operations
|
||||||
elif response.status_code in [429, 500, 502, 503, 504]:
|
if is_write:
|
||||||
if attempt < self.retry_attempts - 1:
|
old_delay = self.current_rate_delay
|
||||||
wait_time = self.retry_delay * (2 ** attempt)
|
self.current_rate_delay = max(
|
||||||
logger.warning(
|
self.min_rate_delay,
|
||||||
f"API error {response.status_code} on {endpoint}, "
|
self.current_rate_delay * self.rate_decrease_factor
|
||||||
f"retrying in {wait_time:.1f}s (attempt {attempt + 1}/{self.retry_attempts})"
|
|
||||||
)
|
)
|
||||||
|
if old_delay != self.current_rate_delay:
|
||||||
|
logger.debug(f"Rate delay decreased: {old_delay:.2f}s → {self.current_rate_delay:.2f}s")
|
||||||
|
return response.json()
|
||||||
|
|
||||||
|
elif response.status_code == 429:
|
||||||
|
# Rate limited - check Retry-After header
|
||||||
|
retry_after = self._parse_retry_after(response)
|
||||||
|
|
||||||
|
# Increase adaptive delay based on Retry-After or multiplicative factor
|
||||||
|
old_delay = self.current_rate_delay
|
||||||
|
if retry_after is not None:
|
||||||
|
# Use Retry-After to inform spacing: divide window by ~10 requests
|
||||||
|
# e.g., 20s Retry-After → 2s delay between requests
|
||||||
|
informed_delay = retry_after / 10.0
|
||||||
|
self.current_rate_delay = min(
|
||||||
|
self.max_rate_delay,
|
||||||
|
max(self.current_rate_delay, informed_delay)
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.current_rate_delay = min(
|
||||||
|
self.max_rate_delay,
|
||||||
|
self.current_rate_delay * self.rate_increase_factor
|
||||||
|
)
|
||||||
|
if old_delay != self.current_rate_delay:
|
||||||
|
logger.info(f"Rate delay adjusted: {old_delay:.2f}s → {self.current_rate_delay:.2f}s")
|
||||||
|
|
||||||
|
if attempt < self.retry_attempts - 1:
|
||||||
|
# Use Retry-After if provided, otherwise fall back to exponential backoff
|
||||||
|
if retry_after is not None:
|
||||||
|
wait_time = retry_after
|
||||||
|
logger.warning(
|
||||||
|
f"Rate limited (429){log_context}, server requested {wait_time:.1f}s wait via Retry-After "
|
||||||
|
f"(attempt {attempt + 1}/{self.retry_attempts})"
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
wait_time = self.current_rate_delay * (2 ** attempt)
|
||||||
|
logger.warning(
|
||||||
|
f"Rate limited (429){log_context}, waiting {wait_time:.1f}s "
|
||||||
|
f"(attempt {attempt + 1}/{self.retry_attempts}, delay now {self.current_rate_delay:.1f}s)"
|
||||||
|
)
|
||||||
|
time.sleep(wait_time)
|
||||||
|
continue
|
||||||
|
|
||||||
|
elif response.status_code in [500, 502, 503, 504]:
|
||||||
|
if attempt < self.retry_attempts - 1:
|
||||||
|
# Check for Retry-After header (common with 503)
|
||||||
|
retry_after = self._parse_retry_after(response)
|
||||||
|
if retry_after is not None:
|
||||||
|
wait_time = retry_after
|
||||||
|
logger.warning(
|
||||||
|
f"API error {response.status_code}{log_context}, "
|
||||||
|
f"server requested {wait_time:.1f}s wait via Retry-After "
|
||||||
|
f"(attempt {attempt + 1}/{self.retry_attempts})"
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
wait_time = self.retry_delay * (2 ** attempt)
|
||||||
|
logger.warning(
|
||||||
|
f"API error {response.status_code}{log_context}, "
|
||||||
|
f"retrying in {wait_time:.1f}s (attempt {attempt + 1}/{self.retry_attempts})"
|
||||||
|
)
|
||||||
time.sleep(wait_time)
|
time.sleep(wait_time)
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# Non-retryable error or final attempt
|
# Non-retryable error or final attempt
|
||||||
logger.error(f"API error on {endpoint}: HTTP {response.status_code}")
|
logger.error(f"API error{log_context}: HTTP {response.status_code}")
|
||||||
logger.debug(f"Response: {response.text[:200]}")
|
logger.debug(f"Response: {response.text[:200]}")
|
||||||
return None
|
return None
|
||||||
|
|
||||||
@@ -262,7 +422,7 @@ class OutlineImporter:
|
|||||||
result = self._api_request("/api/collections.create", {
|
result = self._api_request("/api/collections.create", {
|
||||||
"name": name,
|
"name": name,
|
||||||
"permission": permission
|
"permission": permission
|
||||||
})
|
}, is_write=True, context=f"collection:{name}")
|
||||||
|
|
||||||
if result and "data" in result:
|
if result and "data" in result:
|
||||||
collection_id = result["data"]["id"]
|
collection_id = result["data"]["id"]
|
||||||
@@ -323,11 +483,11 @@ class OutlineImporter:
|
|||||||
if parent_document_id:
|
if parent_document_id:
|
||||||
data["parentDocumentId"] = parent_document_id
|
data["parentDocumentId"] = parent_document_id
|
||||||
|
|
||||||
# Rate limiting
|
# Adaptive rate limiting - uses current_rate_delay which increases after 429s
|
||||||
if self.rate_limit_delay > 0:
|
if self.current_rate_delay > 0:
|
||||||
time.sleep(self.rate_limit_delay)
|
time.sleep(self.current_rate_delay)
|
||||||
|
|
||||||
result = self._api_request("/api/documents.create", data)
|
result = self._api_request("/api/documents.create", data, is_write=True, context=title)
|
||||||
|
|
||||||
if result and "data" in result:
|
if result and "data" in result:
|
||||||
return result["data"]["id"]
|
return result["data"]["id"]
|
||||||
@@ -531,24 +691,24 @@ class OutlineImporter:
|
|||||||
# Check if collection exists
|
# Check if collection exists
|
||||||
if collection_name in self.existing_collections:
|
if collection_name in self.existing_collections:
|
||||||
if self.force:
|
if self.force:
|
||||||
print(f" Deleting existing collection \"{collection_name}\"...")
|
self._output(f" Deleting existing collection \"{collection_name}\"...")
|
||||||
if not self.dry_run:
|
if not self.dry_run:
|
||||||
self._delete_collection(self.existing_collections[collection_name])
|
self._delete_collection(self.existing_collections[collection_name])
|
||||||
del self.existing_collections[collection_name]
|
del self.existing_collections[collection_name]
|
||||||
else:
|
else:
|
||||||
print(f" Collection exists, skipping...")
|
self._output(f" Collection exists, skipping...")
|
||||||
self.stats["collections_skipped"] += 1
|
self.stats["collections_skipped"] += 1
|
||||||
return (0, doc_count, 0)
|
return (0, doc_count, 0)
|
||||||
|
|
||||||
# Create collection
|
# Create collection
|
||||||
if self.dry_run:
|
if self.dry_run:
|
||||||
print(f" [DRY RUN] Would create collection \"{collection_name}\"")
|
self._output(f" [DRY RUN] Would create collection \"{collection_name}\"")
|
||||||
collection_id = "dry-run-collection-id"
|
collection_id = "dry-run-collection-id"
|
||||||
else:
|
else:
|
||||||
print(f" Creating collection...", end=" ")
|
self._output(f" Creating collection... ", end="")
|
||||||
collection_id = self._create_collection(collection_name)
|
collection_id = self._create_collection(collection_name)
|
||||||
if not collection_id:
|
if not collection_id:
|
||||||
print("✗ failed")
|
self._output("✗ failed")
|
||||||
self.stats["collections_errors"] += 1
|
self.stats["collections_errors"] += 1
|
||||||
self.errors.append({
|
self.errors.append({
|
||||||
"type": "collection",
|
"type": "collection",
|
||||||
@@ -556,7 +716,7 @@ class OutlineImporter:
|
|||||||
"error": "Failed to create collection"
|
"error": "Failed to create collection"
|
||||||
})
|
})
|
||||||
return (0, 0, 1)
|
return (0, 0, 1)
|
||||||
print(f"✓ (id: {collection_id[:8]}...)")
|
self._output(f"✓ (id: {collection_id[:8]}...)")
|
||||||
|
|
||||||
self.stats["collections_created"] += 1
|
self.stats["collections_created"] += 1
|
||||||
|
|
||||||
@@ -596,7 +756,8 @@ class OutlineImporter:
|
|||||||
content = self.read_document_content(collection_dir, filename)
|
content = self.read_document_content(collection_dir, filename)
|
||||||
if content is None:
|
if content is None:
|
||||||
line = TreePrinter.format_line(title, "error", "file not found", prefix + connector)
|
line = TreePrinter.format_line(title, "error", "file not found", prefix + connector)
|
||||||
print(line)
|
self._output(line)
|
||||||
|
self._update_progress()
|
||||||
errors += 1
|
errors += 1
|
||||||
self.stats["documents_errors"] += 1
|
self.stats["documents_errors"] += 1
|
||||||
self.errors.append({
|
self.errors.append({
|
||||||
@@ -608,13 +769,14 @@ class OutlineImporter:
|
|||||||
# Skip children if parent failed
|
# Skip children if parent failed
|
||||||
if children:
|
if children:
|
||||||
child_prefix = prefix + (TreePrinter.BLANK if is_last else TreePrinter.PIPE)
|
child_prefix = prefix + (TreePrinter.BLANK if is_last else TreePrinter.PIPE)
|
||||||
print(f"{child_prefix}└── (children skipped due to parent failure)")
|
self._output(f"{child_prefix}└── (children skipped due to parent failure)")
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# Create document
|
# Create document
|
||||||
if self.dry_run:
|
if self.dry_run:
|
||||||
line = TreePrinter.format_line(title, "dry_run", prefix=prefix + connector)
|
line = TreePrinter.format_line(title, "dry_run", prefix=prefix + connector)
|
||||||
print(line)
|
self._output(line)
|
||||||
|
self._update_progress()
|
||||||
self.id_map[old_id] = f"dry-run-{old_id}"
|
self.id_map[old_id] = f"dry-run-{old_id}"
|
||||||
created += 1
|
created += 1
|
||||||
self.stats["documents_created"] += 1
|
self.stats["documents_created"] += 1
|
||||||
@@ -629,12 +791,14 @@ class OutlineImporter:
|
|||||||
if new_id:
|
if new_id:
|
||||||
self.id_map[old_id] = new_id
|
self.id_map[old_id] = new_id
|
||||||
line = TreePrinter.format_line(title, "created", prefix=prefix + connector)
|
line = TreePrinter.format_line(title, "created", prefix=prefix + connector)
|
||||||
print(line)
|
self._output(line)
|
||||||
|
self._update_progress()
|
||||||
created += 1
|
created += 1
|
||||||
self.stats["documents_created"] += 1
|
self.stats["documents_created"] += 1
|
||||||
else:
|
else:
|
||||||
line = TreePrinter.format_line(title, "error", "API error", prefix + connector)
|
line = TreePrinter.format_line(title, "error", "API error", prefix + connector)
|
||||||
print(line)
|
self._output(line)
|
||||||
|
self._update_progress()
|
||||||
errors += 1
|
errors += 1
|
||||||
self.stats["documents_errors"] += 1
|
self.stats["documents_errors"] += 1
|
||||||
self.errors.append({
|
self.errors.append({
|
||||||
@@ -646,7 +810,7 @@ class OutlineImporter:
|
|||||||
# Skip children if parent failed
|
# Skip children if parent failed
|
||||||
if children:
|
if children:
|
||||||
child_prefix = prefix + (TreePrinter.BLANK if is_last else TreePrinter.PIPE)
|
child_prefix = prefix + (TreePrinter.BLANK if is_last else TreePrinter.PIPE)
|
||||||
print(f"{child_prefix}└── (children skipped due to parent failure)")
|
self._output(f"{child_prefix}└── (children skipped due to parent failure)")
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# Process children recursively
|
# Process children recursively
|
||||||
@@ -712,58 +876,82 @@ class OutlineImporter:
|
|||||||
logger.error("No collections found in source directory")
|
logger.error("No collections found in source directory")
|
||||||
return
|
return
|
||||||
|
|
||||||
if self.single_mode:
|
# Count total documents and initialize progress bar
|
||||||
# Single collection mode
|
total_docs = self._count_all_documents(source_collections)
|
||||||
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
|
print(f"Total documents to import: {total_docs}")
|
||||||
single_collection_name = f"import_{timestamp}"
|
print()
|
||||||
|
|
||||||
logger.info(f"Creating single collection: {single_collection_name}")
|
self.pbar = tqdm(
|
||||||
collection_id = self._create_collection(single_collection_name)
|
total=total_docs,
|
||||||
if not collection_id and not self.dry_run:
|
desc="Importing",
|
||||||
logger.error("Failed to create import collection")
|
unit="doc",
|
||||||
return
|
dynamic_ncols=True,
|
||||||
|
file=sys.stdout,
|
||||||
|
leave=True,
|
||||||
|
mininterval=1.0, # Update at most once per second
|
||||||
|
bar_format="{desc}: {percentage:3.0f}%|{bar}| {n_fmt}/{total_fmt} [{elapsed}<{remaining}, {rate_fmt}]"
|
||||||
|
)
|
||||||
|
|
||||||
self.stats["collections_created"] += 1
|
try:
|
||||||
|
if self.single_mode:
|
||||||
|
# Single collection mode
|
||||||
|
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
|
||||||
|
single_collection_name = f"import_{timestamp}"
|
||||||
|
|
||||||
for collection_dir in source_collections:
|
logger.info(f"Creating single collection: {single_collection_name}")
|
||||||
metadata = self.load_collection_metadata(collection_dir)
|
collection_id = self._create_collection(single_collection_name)
|
||||||
if not metadata:
|
if not collection_id and not self.dry_run:
|
||||||
continue
|
logger.error("Failed to create import collection")
|
||||||
|
return
|
||||||
|
|
||||||
collection_name = metadata.get("name", collection_dir.name)
|
self.stats["collections_created"] += 1
|
||||||
doc_count = metadata.get("expected_count", 0)
|
|
||||||
|
|
||||||
print(f"\n{collection_name}/ ({doc_count} documents)")
|
for collection_dir in source_collections:
|
||||||
|
metadata = self.load_collection_metadata(collection_dir)
|
||||||
|
if not metadata:
|
||||||
|
continue
|
||||||
|
|
||||||
# Create parent document for this "collection"
|
collection_name = metadata.get("name", collection_dir.name)
|
||||||
parent_doc_id = self._create_document(
|
doc_count = metadata.get("expected_count", 0)
|
||||||
collection_id,
|
|
||||||
collection_name,
|
|
||||||
f"# {collection_name}\n\nImported collection.",
|
|
||||||
parent_document_id=None
|
|
||||||
)
|
|
||||||
|
|
||||||
if parent_doc_id:
|
self._output(f"\n{collection_name}/ ({doc_count} documents)")
|
||||||
self.stats["documents_created"] += 1
|
|
||||||
|
|
||||||
# Import documents under this parent
|
# Create parent document for this "collection"
|
||||||
self.import_collection(
|
parent_doc_id = self._create_document(
|
||||||
collection_dir,
|
collection_id,
|
||||||
target_collection_id=collection_id,
|
collection_name,
|
||||||
parent_document_id=parent_doc_id
|
f"# {collection_name}\n\nImported collection.",
|
||||||
)
|
parent_document_id=None
|
||||||
else:
|
)
|
||||||
# Standard mode: one collection per folder
|
|
||||||
for collection_dir in source_collections:
|
|
||||||
metadata = self.load_collection_metadata(collection_dir)
|
|
||||||
if not metadata:
|
|
||||||
continue
|
|
||||||
|
|
||||||
collection_name = metadata.get("name", collection_dir.name)
|
if parent_doc_id:
|
||||||
doc_count = metadata.get("expected_count", 0)
|
self.stats["documents_created"] += 1
|
||||||
|
self._update_progress()
|
||||||
|
|
||||||
print(f"\n{collection_name}/ ({doc_count} documents)")
|
# Import documents under this parent
|
||||||
self.import_collection(collection_dir)
|
self.import_collection(
|
||||||
|
collection_dir,
|
||||||
|
target_collection_id=collection_id,
|
||||||
|
parent_document_id=parent_doc_id
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
# Standard mode: one collection per folder
|
||||||
|
for collection_dir in source_collections:
|
||||||
|
metadata = self.load_collection_metadata(collection_dir)
|
||||||
|
if not metadata:
|
||||||
|
continue
|
||||||
|
|
||||||
|
collection_name = metadata.get("name", collection_dir.name)
|
||||||
|
doc_count = metadata.get("expected_count", 0)
|
||||||
|
|
||||||
|
self._output(f"\n{collection_name}/ ({doc_count} documents)")
|
||||||
|
self.import_collection(collection_dir)
|
||||||
|
|
||||||
|
finally:
|
||||||
|
# Close progress bar
|
||||||
|
if self.pbar:
|
||||||
|
self.pbar.close()
|
||||||
|
self.pbar = None
|
||||||
|
|
||||||
# Print summary
|
# Print summary
|
||||||
duration = time.time() - start_time
|
duration = time.time() - start_time
|
||||||
|
|||||||
771
outline_sync.py
Normal file
771
outline_sync.py
Normal file
@@ -0,0 +1,771 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""
|
||||||
|
Outline Sync — Phase 1: init
|
||||||
|
|
||||||
|
Creates a local vault mirroring Outline wiki structure.
|
||||||
|
Each document is written as a markdown file with YAML frontmatter
|
||||||
|
containing the Outline document ID and metadata for future syncs.
|
||||||
|
|
||||||
|
Git initialization is handled by sync.sh after this script exits.
|
||||||
|
|
||||||
|
Usage (called by sync.sh, not directly):
|
||||||
|
python3 outline_sync.py init --vault /vault --settings /work/settings.json
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import sys
|
||||||
|
import re
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import time
|
||||||
|
import logging
|
||||||
|
import argparse
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Dict, List, Optional, Tuple
|
||||||
|
|
||||||
|
import requests
|
||||||
|
from requests.adapters import HTTPAdapter
|
||||||
|
from urllib3.util.retry import Retry
|
||||||
|
|
||||||
|
|
||||||
|
# ── Logging ───────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
logging.basicConfig(
|
||||||
|
level=logging.WARNING,
|
||||||
|
format="%(asctime)s | %(levelname)-8s | %(message)s",
|
||||||
|
datefmt="%H:%M:%S",
|
||||||
|
)
|
||||||
|
logger = logging.getLogger("outline_sync")
|
||||||
|
|
||||||
|
|
||||||
|
# ── Frontmatter helpers ───────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
# Ordered fields written to every synced file
|
||||||
|
FRONTMATTER_FIELDS = [
|
||||||
|
"outline_id",
|
||||||
|
"outline_collection_id",
|
||||||
|
"outline_parent_id",
|
||||||
|
"outline_updated_at",
|
||||||
|
]
|
||||||
|
|
||||||
|
GITIGNORE = """\
|
||||||
|
# Obsidian internals
|
||||||
|
.obsidian/
|
||||||
|
|
||||||
|
# Sync config (contains API token)
|
||||||
|
settings.json
|
||||||
|
|
||||||
|
# Conflict sidecars (resolved manually)
|
||||||
|
*.conflict.md
|
||||||
|
|
||||||
|
# OS noise
|
||||||
|
.DS_Store
|
||||||
|
Thumbs.db
|
||||||
|
"""
|
||||||
|
|
||||||
|
GITATTRIBUTES = """\
|
||||||
|
# Normalize line endings for all markdown
|
||||||
|
*.md text eol=lf
|
||||||
|
|
||||||
|
# Sync log is append-only — never produce conflicts on it
|
||||||
|
_sync_log.md merge=union
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
def build_frontmatter(fields: Dict[str, str]) -> str:
|
||||||
|
"""Serialize an ordered dict of fields to a YAML frontmatter block."""
|
||||||
|
lines = ["---"]
|
||||||
|
for key in FRONTMATTER_FIELDS:
|
||||||
|
value = fields.get(key, "")
|
||||||
|
if value: # omit empty values (e.g. outline_parent_id for root docs)
|
||||||
|
lines.append(f"{key}: {value}")
|
||||||
|
lines.append("---")
|
||||||
|
return "\n".join(lines) + "\n"
|
||||||
|
|
||||||
|
|
||||||
|
def parse_frontmatter(content: str) -> Tuple[Dict[str, str], str]:
|
||||||
|
"""
|
||||||
|
Parse a YAML frontmatter block from file content.
|
||||||
|
|
||||||
|
Returns (frontmatter_dict, body_text).
|
||||||
|
If no valid frontmatter block is found, returns ({}, original_content).
|
||||||
|
"""
|
||||||
|
if not content.startswith("---\n"):
|
||||||
|
return {}, content
|
||||||
|
|
||||||
|
end = content.find("\n---\n", 4)
|
||||||
|
if end == -1:
|
||||||
|
return {}, content
|
||||||
|
|
||||||
|
fm_text = content[4:end]
|
||||||
|
body = content[end + 5:] # skip past \n---\n
|
||||||
|
|
||||||
|
fm: Dict[str, str] = {}
|
||||||
|
for line in fm_text.splitlines():
|
||||||
|
if ": " in line:
|
||||||
|
key, _, value = line.partition(": ")
|
||||||
|
fm[key.strip()] = value.strip()
|
||||||
|
|
||||||
|
return fm, body
|
||||||
|
|
||||||
|
|
||||||
|
# ── Filename helpers ──────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
_INVALID = re.compile(r'[<>:"/\\|?*\x00-\x1f]')
|
||||||
|
_SPACES = re.compile(r"\s+")
|
||||||
|
|
||||||
|
|
||||||
|
def sanitize_name(name: str, max_len: int = 200) -> str:
|
||||||
|
"""Convert a document title to a safe filesystem name (no extension)."""
|
||||||
|
name = _INVALID.sub("_", name)
|
||||||
|
name = _SPACES.sub(" ", name).strip()
|
||||||
|
return name[:max_len] if name else "Untitled"
|
||||||
|
|
||||||
|
|
||||||
|
# ── OutlineSync ───────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
class OutlineSync:
|
||||||
|
|
||||||
|
def __init__(self, base_url: str, api_token: str, vault_dir: Path):
|
||||||
|
self.base_url = base_url.rstrip("/")
|
||||||
|
self.api_token = api_token
|
||||||
|
self.vault_dir = Path(vault_dir)
|
||||||
|
|
||||||
|
self.session = requests.Session()
|
||||||
|
adapter = HTTPAdapter(max_retries=Retry(
|
||||||
|
total=3,
|
||||||
|
backoff_factor=1.0,
|
||||||
|
status_forcelist=[429, 500, 502, 503, 504],
|
||||||
|
))
|
||||||
|
self.session.mount("http://", adapter)
|
||||||
|
self.session.mount("https://", adapter)
|
||||||
|
|
||||||
|
self.headers = {
|
||||||
|
"Authorization": f"Bearer {self.api_token}",
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
}
|
||||||
|
|
||||||
|
self._doc_cache: Dict[str, Dict] = {}
|
||||||
|
self.stats = {"collections": 0, "documents": 0, "errors": 0}
|
||||||
|
|
||||||
|
# ── API layer ─────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def _api(
|
||||||
|
self,
|
||||||
|
endpoint: str,
|
||||||
|
data: Optional[Dict] = None,
|
||||||
|
method: str = "POST",
|
||||||
|
) -> Optional[Dict]:
|
||||||
|
url = f"{self.base_url}{endpoint}"
|
||||||
|
try:
|
||||||
|
if method == "POST":
|
||||||
|
r = self.session.post(
|
||||||
|
url, headers=self.headers, json=data or {}, timeout=30
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
r = self.session.get(url, headers=self.headers, timeout=30)
|
||||||
|
|
||||||
|
if r.status_code == 200:
|
||||||
|
return r.json()
|
||||||
|
|
||||||
|
logger.error("API %s on %s", r.status_code, endpoint)
|
||||||
|
logger.debug("Response body: %s", r.text[:400])
|
||||||
|
return None
|
||||||
|
|
||||||
|
except requests.RequestException as exc:
|
||||||
|
logger.error("Request failed on %s: %s", endpoint, exc)
|
||||||
|
return None
|
||||||
|
|
||||||
|
def health_check(self) -> bool:
|
||||||
|
print("Checking API connectivity...", end=" ", flush=True)
|
||||||
|
result = self._api("/api/auth.info")
|
||||||
|
if result and "data" in result:
|
||||||
|
user = result["data"].get("user", {})
|
||||||
|
print(f"✓ ({user.get('name', 'unknown')})")
|
||||||
|
return True
|
||||||
|
print("✗")
|
||||||
|
return False
|
||||||
|
|
||||||
|
def get_collections(self) -> List[Dict]:
|
||||||
|
result = self._api("/api/collections.list")
|
||||||
|
if result and "data" in result:
|
||||||
|
return result["data"]
|
||||||
|
return []
|
||||||
|
|
||||||
|
def get_nav_tree(self, collection_id: str) -> List[Dict]:
|
||||||
|
"""Return the nested navigation tree for a collection."""
|
||||||
|
result = self._api("/api/collections.documents", {"id": collection_id})
|
||||||
|
if result and "data" in result:
|
||||||
|
return result["data"]
|
||||||
|
return []
|
||||||
|
|
||||||
|
def get_document_info(self, doc_id: str) -> Optional[Dict]:
|
||||||
|
"""Fetch full document content, using cache to avoid duplicate calls."""
|
||||||
|
if doc_id in self._doc_cache:
|
||||||
|
return self._doc_cache[doc_id]
|
||||||
|
result = self._api("/api/documents.info", {"id": doc_id})
|
||||||
|
if result and "data" in result:
|
||||||
|
self._doc_cache[doc_id] = result["data"]
|
||||||
|
return result["data"]
|
||||||
|
return None
|
||||||
|
|
||||||
|
# ── File writing ──────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def _write_doc_file(
|
||||||
|
self,
|
||||||
|
path: Path,
|
||||||
|
doc_id: str,
|
||||||
|
collection_id: str,
|
||||||
|
parent_id: Optional[str],
|
||||||
|
) -> bool:
|
||||||
|
"""
|
||||||
|
Fetch document content and write it to path with YAML frontmatter.
|
||||||
|
|
||||||
|
File format:
|
||||||
|
---
|
||||||
|
outline_id: <id>
|
||||||
|
outline_collection_id: <id>
|
||||||
|
outline_parent_id: <id> ← omitted for root documents
|
||||||
|
outline_updated_at: <iso>
|
||||||
|
---
|
||||||
|
|
||||||
|
<document body from Outline API 'text' field>
|
||||||
|
"""
|
||||||
|
full = self.get_document_info(doc_id)
|
||||||
|
if not full:
|
||||||
|
logger.warning("Could not fetch document %s — skipping", doc_id)
|
||||||
|
self.stats["errors"] += 1
|
||||||
|
return False
|
||||||
|
|
||||||
|
fm = {
|
||||||
|
"outline_id": doc_id,
|
||||||
|
"outline_collection_id": collection_id,
|
||||||
|
"outline_parent_id": parent_id or "",
|
||||||
|
"outline_updated_at": full.get("updatedAt", ""),
|
||||||
|
}
|
||||||
|
|
||||||
|
body = full.get("text", "")
|
||||||
|
content = build_frontmatter(fm) + "\n" + body
|
||||||
|
|
||||||
|
path.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
path.write_text(content, encoding="utf-8")
|
||||||
|
self.stats["documents"] += 1
|
||||||
|
return True
|
||||||
|
|
||||||
|
def _unique_path(self, directory: Path, name: str) -> Path:
|
||||||
|
"""Return a non-colliding .md path, appending _N suffix if needed."""
|
||||||
|
candidate = directory / f"{name}.md"
|
||||||
|
counter = 1
|
||||||
|
while candidate.exists():
|
||||||
|
candidate = directory / f"{name}_{counter}.md"
|
||||||
|
counter += 1
|
||||||
|
return candidate
|
||||||
|
|
||||||
|
def _export_node(
|
||||||
|
self,
|
||||||
|
node: Dict,
|
||||||
|
parent_dir: Path,
|
||||||
|
collection_id: str,
|
||||||
|
parent_doc_id: Optional[str],
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Recursively export one nav-tree node and all its children.
|
||||||
|
|
||||||
|
Folder structure rule (from PRD §4.3):
|
||||||
|
- Leaf document (no children) → parent_dir/Title.md
|
||||||
|
- Document with children → parent_dir/Title/Title.md
|
||||||
|
parent_dir/Title/Child.md ...
|
||||||
|
This means the parent document and its children share the same folder.
|
||||||
|
"""
|
||||||
|
doc_id = node["id"]
|
||||||
|
title = node.get("title", "Untitled")
|
||||||
|
children = node.get("children", [])
|
||||||
|
safe = sanitize_name(title)
|
||||||
|
|
||||||
|
if children:
|
||||||
|
# Create a named subdirectory; the document itself lives inside it
|
||||||
|
doc_dir = parent_dir / safe
|
||||||
|
doc_dir.mkdir(parents=True, exist_ok=True)
|
||||||
|
doc_path = self._unique_path(doc_dir, safe)
|
||||||
|
child_dir = doc_dir
|
||||||
|
else:
|
||||||
|
doc_path = self._unique_path(parent_dir, safe)
|
||||||
|
child_dir = parent_dir # unused for leaf, but needed for recursion
|
||||||
|
|
||||||
|
logger.info(" Writing %s", doc_path.relative_to(self.vault_dir))
|
||||||
|
ok = self._write_doc_file(doc_path, doc_id, collection_id, parent_doc_id)
|
||||||
|
|
||||||
|
if ok:
|
||||||
|
for child in children:
|
||||||
|
self._export_node(child, child_dir, collection_id, doc_id)
|
||||||
|
|
||||||
|
def export_collection(self, collection: Dict) -> int:
|
||||||
|
"""Export all documents for one collection. Returns count written."""
|
||||||
|
coll_id = collection["id"]
|
||||||
|
coll_name = collection["name"]
|
||||||
|
safe_name = sanitize_name(coll_name)
|
||||||
|
coll_dir = self.vault_dir / safe_name
|
||||||
|
|
||||||
|
coll_dir.mkdir(parents=True, exist_ok=True)
|
||||||
|
print(f" {coll_name}/", end=" ", flush=True)
|
||||||
|
|
||||||
|
nav_tree = self.get_nav_tree(coll_id)
|
||||||
|
if not nav_tree:
|
||||||
|
print("(empty)")
|
||||||
|
self.stats["collections"] += 1
|
||||||
|
return 0
|
||||||
|
|
||||||
|
before = self.stats["documents"]
|
||||||
|
for node in nav_tree:
|
||||||
|
self._export_node(node, coll_dir, coll_id, None)
|
||||||
|
|
||||||
|
count = self.stats["documents"] - before
|
||||||
|
errors = self.stats["errors"]
|
||||||
|
status = f"{count} documents"
|
||||||
|
if errors:
|
||||||
|
status += f" ⚠ {errors} errors"
|
||||||
|
print(status)
|
||||||
|
|
||||||
|
self.stats["collections"] += 1
|
||||||
|
return count
|
||||||
|
|
||||||
|
# ── Config files ──────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def write_gitignore(self) -> None:
|
||||||
|
(self.vault_dir / ".gitignore").write_text(GITIGNORE, encoding="utf-8")
|
||||||
|
|
||||||
|
def write_gitattributes(self) -> None:
|
||||||
|
(self.vault_dir / ".gitattributes").write_text(GITATTRIBUTES, encoding="utf-8")
|
||||||
|
|
||||||
|
# ── Pull ──────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def _git(self, *args: str) -> subprocess.CompletedProcess:
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(self.vault_dir), *args],
|
||||||
|
capture_output=True, text=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _collect_vault_ids(self) -> Dict[str, Path]:
|
||||||
|
"""Return {outline_id: path} for every tracked .md file in the vault."""
|
||||||
|
result: Dict[str, Path] = {}
|
||||||
|
for md in self.vault_dir.rglob("*.md"):
|
||||||
|
if ".git" in md.parts:
|
||||||
|
continue
|
||||||
|
try:
|
||||||
|
fm, _ = parse_frontmatter(md.read_text(encoding="utf-8"))
|
||||||
|
oid = fm.get("outline_id")
|
||||||
|
if oid:
|
||||||
|
result[oid] = md
|
||||||
|
except OSError:
|
||||||
|
pass
|
||||||
|
return result
|
||||||
|
|
||||||
|
def cmd_pull(self) -> bool:
|
||||||
|
"""
|
||||||
|
Fetch latest document content from Outline and update the vault.
|
||||||
|
|
||||||
|
Runs entirely inside the outline-sync Docker container which has
|
||||||
|
git + requests. /vault is mounted from the host.
|
||||||
|
"""
|
||||||
|
print("Fetching collections from Outline...")
|
||||||
|
|
||||||
|
if not self.health_check():
|
||||||
|
print("✗ Cannot reach Outline API — aborting.")
|
||||||
|
return False
|
||||||
|
|
||||||
|
collections = self.get_collections()
|
||||||
|
if not collections:
|
||||||
|
print("No collections found.")
|
||||||
|
return True
|
||||||
|
|
||||||
|
# Collect all Outline documents
|
||||||
|
all_docs: List[Dict] = []
|
||||||
|
for coll in collections:
|
||||||
|
tree = self.get_nav_tree(coll["id"])
|
||||||
|
self._collect_tree_docs(tree, coll["id"], all_docs)
|
||||||
|
|
||||||
|
# Map current vault files by outline_id
|
||||||
|
vault_ids = self._collect_vault_ids()
|
||||||
|
|
||||||
|
updated = 0
|
||||||
|
created = 0
|
||||||
|
errors = 0
|
||||||
|
|
||||||
|
# Switch to outline branch for writes
|
||||||
|
self._git("stash", "--include-untracked", "-m", "webui: pre-pull stash")
|
||||||
|
self._git("checkout", "outline")
|
||||||
|
|
||||||
|
for doc_meta in all_docs:
|
||||||
|
doc_id = doc_meta["id"]
|
||||||
|
title = doc_meta.get("title", "Untitled")
|
||||||
|
coll_id = doc_meta["collection_id"]
|
||||||
|
parent_id = doc_meta.get("parent_id")
|
||||||
|
outline_ts = doc_meta.get("updatedAt", "")
|
||||||
|
|
||||||
|
full = self.get_document_info(doc_id)
|
||||||
|
if not full:
|
||||||
|
print(f"error: could not fetch {title}")
|
||||||
|
errors += 1
|
||||||
|
continue
|
||||||
|
|
||||||
|
outline_ts = full.get("updatedAt", "")
|
||||||
|
|
||||||
|
if doc_id in vault_ids:
|
||||||
|
path = vault_ids[doc_id]
|
||||||
|
try:
|
||||||
|
existing_fm, _ = parse_frontmatter(path.read_text(encoding="utf-8"))
|
||||||
|
local_ts = existing_fm.get("outline_updated_at", "")
|
||||||
|
except OSError:
|
||||||
|
local_ts = ""
|
||||||
|
|
||||||
|
if local_ts == outline_ts:
|
||||||
|
continue # already up to date
|
||||||
|
|
||||||
|
# Update existing file
|
||||||
|
fm = {
|
||||||
|
"outline_id": doc_id,
|
||||||
|
"outline_collection_id": coll_id,
|
||||||
|
"outline_parent_id": parent_id or "",
|
||||||
|
"outline_updated_at": outline_ts,
|
||||||
|
}
|
||||||
|
content = build_frontmatter(fm) + "\n" + full.get("text", "")
|
||||||
|
path.write_text(content, encoding="utf-8")
|
||||||
|
rel = str(path.relative_to(self.vault_dir))
|
||||||
|
print(f"ok: {rel} updated")
|
||||||
|
updated += 1
|
||||||
|
else:
|
||||||
|
# New document — determine path from collection/title
|
||||||
|
safe_coll = sanitize_name(
|
||||||
|
next((c["name"] for c in collections if c["id"] == coll_id), coll_id)
|
||||||
|
)
|
||||||
|
coll_dir = self.vault_dir / safe_coll
|
||||||
|
coll_dir.mkdir(parents=True, exist_ok=True)
|
||||||
|
path = self._unique_path(coll_dir, sanitize_name(title))
|
||||||
|
fm = {
|
||||||
|
"outline_id": doc_id,
|
||||||
|
"outline_collection_id": coll_id,
|
||||||
|
"outline_parent_id": parent_id or "",
|
||||||
|
"outline_updated_at": outline_ts,
|
||||||
|
}
|
||||||
|
content = build_frontmatter(fm) + "\n" + full.get("text", "")
|
||||||
|
path.write_text(content, encoding="utf-8")
|
||||||
|
rel = str(path.relative_to(self.vault_dir))
|
||||||
|
print(f"ok: {rel} created")
|
||||||
|
created += 1
|
||||||
|
|
||||||
|
# Commit on outline branch if anything changed
|
||||||
|
if updated + created > 0:
|
||||||
|
self._git("add", "-A")
|
||||||
|
ts = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
|
||||||
|
self._git("commit", "-m", f"sync: pull from Outline @ {ts}")
|
||||||
|
|
||||||
|
# Back to main + merge
|
||||||
|
self._git("checkout", "main")
|
||||||
|
if updated + created > 0:
|
||||||
|
self._git("merge", "outline", "--no-ff", "-m", f"merge: outline → main @ {ts}")
|
||||||
|
|
||||||
|
self._git("stash", "pop")
|
||||||
|
|
||||||
|
parts = []
|
||||||
|
if updated: parts.append(f"{updated} updated")
|
||||||
|
if created: parts.append(f"{created} created")
|
||||||
|
if errors: parts.append(f"{errors} errors")
|
||||||
|
summary = ", ".join(parts) if parts else "0 changes"
|
||||||
|
print(f"Done. {summary}.")
|
||||||
|
return errors == 0
|
||||||
|
|
||||||
|
def _collect_tree_docs(
|
||||||
|
self,
|
||||||
|
nodes: List[Dict],
|
||||||
|
collection_id: str,
|
||||||
|
out: List[Dict],
|
||||||
|
parent_id: Optional[str] = None,
|
||||||
|
) -> None:
|
||||||
|
for node in nodes:
|
||||||
|
doc = {
|
||||||
|
"id": node["id"],
|
||||||
|
"title": node.get("title", "Untitled"),
|
||||||
|
"collection_id": collection_id,
|
||||||
|
"parent_id": parent_id,
|
||||||
|
"updatedAt": node.get("updatedAt", ""),
|
||||||
|
}
|
||||||
|
out.append(doc)
|
||||||
|
for child in node.get("children", []):
|
||||||
|
self._collect_tree_docs([child], collection_id, out, node["id"])
|
||||||
|
|
||||||
|
# ── Push ──────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def cmd_push(self) -> bool:
|
||||||
|
"""
|
||||||
|
Push local changes (main vs outline) to Outline.
|
||||||
|
|
||||||
|
For each file changed on main relative to outline:
|
||||||
|
- Has outline_id → call documents.update
|
||||||
|
- No outline_id → call documents.create, write back frontmatter
|
||||||
|
|
||||||
|
Runs entirely inside the outline-sync Docker container.
|
||||||
|
"""
|
||||||
|
print("Checking local changes...")
|
||||||
|
|
||||||
|
if not self.health_check():
|
||||||
|
print("✗ Cannot reach Outline API — aborting.")
|
||||||
|
return False
|
||||||
|
|
||||||
|
# Diff main vs outline
|
||||||
|
r = self._git("diff", "--name-status", "outline", "main", "--", "*.md")
|
||||||
|
if r.returncode != 0:
|
||||||
|
print(f"error: git diff failed: {r.stderr.strip()}")
|
||||||
|
return False
|
||||||
|
|
||||||
|
changed_files: List[Tuple[str, str]] = [] # (status, path)
|
||||||
|
for line in r.stdout.splitlines():
|
||||||
|
parts = line.split("\t", 1)
|
||||||
|
if len(parts) == 2:
|
||||||
|
status, path = parts
|
||||||
|
changed_files.append((status.strip(), path.strip()))
|
||||||
|
|
||||||
|
if not changed_files:
|
||||||
|
print("Done. 0 changes.")
|
||||||
|
return True
|
||||||
|
|
||||||
|
collections = self.get_collections()
|
||||||
|
coll_by_name = {sanitize_name(c["name"]): c["id"] for c in collections}
|
||||||
|
|
||||||
|
updated = 0
|
||||||
|
created = 0
|
||||||
|
errors = 0
|
||||||
|
|
||||||
|
for status, rel_path in changed_files:
|
||||||
|
if rel_path.startswith("_"):
|
||||||
|
continue # skip _sync_log.md etc.
|
||||||
|
|
||||||
|
full_path = self.vault_dir / rel_path
|
||||||
|
if not full_path.exists():
|
||||||
|
continue
|
||||||
|
|
||||||
|
print(f"processing: {rel_path}")
|
||||||
|
|
||||||
|
try:
|
||||||
|
content = full_path.read_text(encoding="utf-8")
|
||||||
|
except OSError as exc:
|
||||||
|
print(f"error: {rel_path}: {exc}")
|
||||||
|
errors += 1
|
||||||
|
continue
|
||||||
|
|
||||||
|
fm, body = parse_frontmatter(content)
|
||||||
|
doc_id = fm.get("outline_id")
|
||||||
|
title = full_path.stem
|
||||||
|
|
||||||
|
if doc_id:
|
||||||
|
# Update existing document
|
||||||
|
result = self._api("/api/documents.update", {
|
||||||
|
"id": doc_id,
|
||||||
|
"text": body,
|
||||||
|
})
|
||||||
|
if result and "data" in result:
|
||||||
|
new_ts = result["data"].get("updatedAt", "")
|
||||||
|
fm["outline_updated_at"] = new_ts
|
||||||
|
full_path.write_text(build_frontmatter(fm) + "\n" + body, encoding="utf-8")
|
||||||
|
print(f"ok: {rel_path} updated")
|
||||||
|
updated += 1
|
||||||
|
else:
|
||||||
|
print(f"error: {rel_path} update failed")
|
||||||
|
errors += 1
|
||||||
|
else:
|
||||||
|
# Create new document
|
||||||
|
path_parts = Path(rel_path).parts
|
||||||
|
coll_name = sanitize_name(path_parts[0]) if len(path_parts) > 1 else ""
|
||||||
|
coll_id = coll_by_name.get(coll_name)
|
||||||
|
|
||||||
|
if not coll_id:
|
||||||
|
# Create the collection
|
||||||
|
r_coll = self._api("/api/collections.create", {
|
||||||
|
"name": path_parts[0] if len(path_parts) > 1 else "Imported",
|
||||||
|
"private": False,
|
||||||
|
})
|
||||||
|
if r_coll and "data" in r_coll:
|
||||||
|
coll_id = r_coll["data"]["id"]
|
||||||
|
coll_by_name[coll_name] = coll_id
|
||||||
|
print(f"ok: collection '{path_parts[0]}' created (id: {coll_id})")
|
||||||
|
else:
|
||||||
|
print(f"error: could not create collection for {rel_path}")
|
||||||
|
errors += 1
|
||||||
|
continue
|
||||||
|
|
||||||
|
result = self._api("/api/documents.create", {
|
||||||
|
"title": title,
|
||||||
|
"text": body,
|
||||||
|
"collectionId": coll_id,
|
||||||
|
"publish": True,
|
||||||
|
})
|
||||||
|
if result and "data" in result:
|
||||||
|
new_id = result["data"]["id"]
|
||||||
|
new_ts = result["data"].get("updatedAt", "")
|
||||||
|
new_coll_id = result["data"].get("collectionId", coll_id)
|
||||||
|
fm = {
|
||||||
|
"outline_id": new_id,
|
||||||
|
"outline_collection_id": new_coll_id,
|
||||||
|
"outline_parent_id": "",
|
||||||
|
"outline_updated_at": new_ts,
|
||||||
|
}
|
||||||
|
full_path.write_text(build_frontmatter(fm) + "\n" + body, encoding="utf-8")
|
||||||
|
print(f"ok: {rel_path} created (id: {new_id})")
|
||||||
|
created += 1
|
||||||
|
else:
|
||||||
|
print(f"error: {rel_path} create failed")
|
||||||
|
errors += 1
|
||||||
|
|
||||||
|
# Commit frontmatter writebacks + advance outline branch
|
||||||
|
r_diff = self._git("diff", "--quiet")
|
||||||
|
if r_diff.returncode != 0:
|
||||||
|
self._git("add", "-A")
|
||||||
|
ts = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
|
||||||
|
self._git("commit", "-m", f"sync: push to Outline @ {ts}")
|
||||||
|
self._git("checkout", "outline")
|
||||||
|
self._git("merge", "main", "--ff-only")
|
||||||
|
self._git("checkout", "main")
|
||||||
|
|
||||||
|
parts = []
|
||||||
|
if updated: parts.append(f"{updated} updated")
|
||||||
|
if created: parts.append(f"{created} created")
|
||||||
|
if errors: parts.append(f"{errors} errors")
|
||||||
|
summary = ", ".join(parts) if parts else "0 changes"
|
||||||
|
print(f"Done. {summary}.")
|
||||||
|
return errors == 0
|
||||||
|
|
||||||
|
# ── Commands ──────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def cmd_init(self) -> bool:
|
||||||
|
"""
|
||||||
|
Initialize the vault from current Outline state.
|
||||||
|
|
||||||
|
Writes all documents as markdown files with YAML frontmatter.
|
||||||
|
Also writes .gitignore and .gitattributes.
|
||||||
|
Git initialization (branches, first commit) is done by sync.sh.
|
||||||
|
"""
|
||||||
|
print("════════════════════════════════════════════════════════════")
|
||||||
|
print(" OUTLINE SYNC — init (file export)")
|
||||||
|
print("════════════════════════════════════════════════════════════")
|
||||||
|
print()
|
||||||
|
print(f"Vault: {self.vault_dir}")
|
||||||
|
print(f"Source: {self.base_url}")
|
||||||
|
print()
|
||||||
|
|
||||||
|
# Guard: refuse if .git already exists
|
||||||
|
if (self.vault_dir / ".git").exists():
|
||||||
|
print(f"✗ Vault is already a git repo: {self.vault_dir}")
|
||||||
|
print(" Remove the directory first or choose a different path.")
|
||||||
|
return False
|
||||||
|
|
||||||
|
self.vault_dir.mkdir(parents=True, exist_ok=True)
|
||||||
|
|
||||||
|
if not self.health_check():
|
||||||
|
print("✗ Cannot reach Outline API — aborting.")
|
||||||
|
return False
|
||||||
|
|
||||||
|
print()
|
||||||
|
|
||||||
|
collections = self.get_collections()
|
||||||
|
if not collections:
|
||||||
|
print("✗ No collections found in Outline.")
|
||||||
|
return False
|
||||||
|
|
||||||
|
print(f"Exporting {len(collections)} collection(s)...")
|
||||||
|
for coll in collections:
|
||||||
|
self.export_collection(coll)
|
||||||
|
|
||||||
|
self.write_gitignore()
|
||||||
|
self.write_gitattributes()
|
||||||
|
|
||||||
|
print()
|
||||||
|
print("════════════════════════════════════════════════════════════")
|
||||||
|
c = self.stats["collections"]
|
||||||
|
d = self.stats["documents"]
|
||||||
|
e = self.stats["errors"]
|
||||||
|
print(f" {c} collection(s), {d} document(s) exported")
|
||||||
|
if e:
|
||||||
|
print(f" {e} error(s) — see warnings above")
|
||||||
|
print()
|
||||||
|
print(" Git setup will be completed by sync.sh.")
|
||||||
|
print("════════════════════════════════════════════════════════════")
|
||||||
|
|
||||||
|
return e == 0
|
||||||
|
|
||||||
|
|
||||||
|
# ── Settings + CLI ────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def load_settings(path: str) -> Dict:
|
||||||
|
try:
|
||||||
|
with open(path) as f:
|
||||||
|
return json.load(f)
|
||||||
|
except FileNotFoundError:
|
||||||
|
logger.error("Settings file not found: %s", path)
|
||||||
|
sys.exit(1)
|
||||||
|
except json.JSONDecodeError as exc:
|
||||||
|
logger.error("Invalid JSON in %s: %s", path, exc)
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
|
||||||
|
def parse_args() -> argparse.Namespace:
|
||||||
|
p = argparse.ArgumentParser(
|
||||||
|
description="Outline ↔ Obsidian sync (Phase 1: init)",
|
||||||
|
formatter_class=argparse.RawDescriptionHelpFormatter,
|
||||||
|
epilog=(
|
||||||
|
"Commands:\n"
|
||||||
|
" init Export Outline to vault and write git config files\n"
|
||||||
|
),
|
||||||
|
)
|
||||||
|
p.add_argument("command", choices=["init", "pull", "push"], help="Sync command")
|
||||||
|
p.add_argument("--vault", required=True, help="Path to vault directory")
|
||||||
|
p.add_argument("--settings", default="settings.json", help="Path to settings file")
|
||||||
|
p.add_argument("--url", help="Outline API URL (overrides settings.source.url)")
|
||||||
|
p.add_argument("--token", help="API token (overrides settings.source.token)")
|
||||||
|
p.add_argument(
|
||||||
|
"-v", "--verbose",
|
||||||
|
action="count",
|
||||||
|
default=0,
|
||||||
|
help="Increase verbosity (-v for INFO, -vv for DEBUG)",
|
||||||
|
)
|
||||||
|
return p.parse_args()
|
||||||
|
|
||||||
|
|
||||||
|
def main() -> None:
|
||||||
|
args = parse_args()
|
||||||
|
|
||||||
|
if args.verbose >= 2:
|
||||||
|
logger.setLevel(logging.DEBUG)
|
||||||
|
elif args.verbose == 1:
|
||||||
|
logger.setLevel(logging.INFO)
|
||||||
|
|
||||||
|
settings = load_settings(args.settings)
|
||||||
|
source = settings.get("source", {})
|
||||||
|
|
||||||
|
url = args.url or source.get("url")
|
||||||
|
token = args.token or source.get("token")
|
||||||
|
|
||||||
|
if not url or not token:
|
||||||
|
logger.error(
|
||||||
|
"Missing API URL or token — set source.url and source.token "
|
||||||
|
"in settings.json, or pass --url / --token."
|
||||||
|
)
|
||||||
|
sys.exit(1)
|
||||||
|
|
||||||
|
sync = OutlineSync(
|
||||||
|
base_url = url,
|
||||||
|
api_token = token,
|
||||||
|
vault_dir = Path(args.vault),
|
||||||
|
)
|
||||||
|
|
||||||
|
if args.command == "init":
|
||||||
|
ok = sync.cmd_init()
|
||||||
|
elif args.command == "pull":
|
||||||
|
ok = sync.cmd_pull()
|
||||||
|
elif args.command == "push":
|
||||||
|
ok = sync.cmd_push()
|
||||||
|
else:
|
||||||
|
ok = False
|
||||||
|
sys.exit(0 if ok else 1)
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
main()
|
||||||
14
pytest.ini
Normal file
14
pytest.ini
Normal file
@@ -0,0 +1,14 @@
|
|||||||
|
[pytest]
|
||||||
|
asyncio_mode = auto
|
||||||
|
testpaths = tests
|
||||||
|
python_files = test_*.py
|
||||||
|
python_classes = Test*
|
||||||
|
python_functions = test_*
|
||||||
|
|
||||||
|
markers =
|
||||||
|
integration: requires running infrastructure (WebDAV container, Outline API)
|
||||||
|
slow: longer-running tests
|
||||||
|
|
||||||
|
filterwarnings =
|
||||||
|
ignore::DeprecationWarning:httpx
|
||||||
|
ignore::pytest.PytestUnraisableExceptionWarning
|
||||||
230
sync.sh
Executable file
230
sync.sh
Executable file
@@ -0,0 +1,230 @@
|
|||||||
|
#!/usr/bin/env bash
|
||||||
|
#
|
||||||
|
# Outline ↔ Obsidian Sync
|
||||||
|
# Bash wrapper — delegates API/file work to Docker Python container,
|
||||||
|
# handles git operations directly on the host.
|
||||||
|
#
|
||||||
|
# Usage: ./sync.sh <command> [options]
|
||||||
|
#
|
||||||
|
# Commands (Phase 1):
|
||||||
|
# init Initialize vault from Outline content
|
||||||
|
# help Show this help
|
||||||
|
#
|
||||||
|
# Options:
|
||||||
|
# --vault DIR Vault directory (overrides settings.json sync.vault_dir)
|
||||||
|
# --settings FILE Path to settings file (default: ./settings.json)
|
||||||
|
# -v Verbose Python output
|
||||||
|
# -vv Debug Python output
|
||||||
|
#
|
||||||
|
|
||||||
|
set -euo pipefail
|
||||||
|
|
||||||
|
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||||
|
SETTINGS="$SCRIPT_DIR/settings.json"
|
||||||
|
|
||||||
|
# ── Colours ───────────────────────────────────────────────────────────────────
|
||||||
|
GREEN='\033[0;32m'
|
||||||
|
RED='\033[0;31m'
|
||||||
|
YELLOW='\033[1;33m'
|
||||||
|
BLUE='\033[0;34m'
|
||||||
|
NC='\033[0m'
|
||||||
|
|
||||||
|
# ── Helpers ───────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
die() { echo -e "${RED}✗ $*${NC}" >&2; exit 1; }
|
||||||
|
info() { echo -e "${BLUE}$*${NC}"; }
|
||||||
|
ok() { echo -e "${GREEN}✓ $*${NC}"; }
|
||||||
|
warn() { echo -e "${YELLOW}⚠ $*${NC}"; }
|
||||||
|
|
||||||
|
require_cmd() {
|
||||||
|
command -v "$1" &>/dev/null || die "Required command not found: $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
read_json() {
|
||||||
|
# read_json <file> <jq_path> e.g. read_json settings.json .sync.vault_dir
|
||||||
|
jq -r "$2 // empty" "$1" 2>/dev/null || true
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Argument parsing ──────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
COMMAND="${1:-help}"
|
||||||
|
shift || true
|
||||||
|
|
||||||
|
VAULT_ARG=""
|
||||||
|
VERBOSE_FLAG=""
|
||||||
|
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case "$1" in
|
||||||
|
--vault) VAULT_ARG="$2"; shift 2 ;;
|
||||||
|
--settings) SETTINGS="$2"; shift 2 ;;
|
||||||
|
-vv) VERBOSE_FLAG="-vv"; shift ;;
|
||||||
|
-v) VERBOSE_FLAG="-v"; shift ;;
|
||||||
|
--verbose) VERBOSE_FLAG="-v"; shift ;;
|
||||||
|
--) shift; break ;;
|
||||||
|
*) shift ;;
|
||||||
|
esac
|
||||||
|
done
|
||||||
|
|
||||||
|
# ── Config resolution ─────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
[[ -f "$SETTINGS" ]] || die "settings.json not found at $SETTINGS"
|
||||||
|
|
||||||
|
# Vault: CLI flag > settings.json > error
|
||||||
|
if [[ -n "$VAULT_ARG" ]]; then
|
||||||
|
VAULT_DIR="$VAULT_ARG"
|
||||||
|
else
|
||||||
|
VAULT_DIR="$(read_json "$SETTINGS" '.sync.vault_dir')"
|
||||||
|
fi
|
||||||
|
|
||||||
|
[[ -n "$VAULT_DIR" ]] || die \
|
||||||
|
"Vault directory not configured.\n" \
|
||||||
|
" Set sync.vault_dir in settings.json or pass --vault <dir>."
|
||||||
|
|
||||||
|
# ── Docker run helper ─────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
run_python() {
|
||||||
|
# run_python <python_args...>
|
||||||
|
# Mounts SCRIPT_DIR (read-only) as /work and VAULT_DIR as /vault.
|
||||||
|
docker run --rm \
|
||||||
|
--network domnet \
|
||||||
|
--user "$(id -u):$(id -g)" \
|
||||||
|
-e HOME=/tmp \
|
||||||
|
-v "$SCRIPT_DIR:/work:ro" \
|
||||||
|
-v "$VAULT_DIR:/vault" \
|
||||||
|
-w /work \
|
||||||
|
python:3.11-slim \
|
||||||
|
bash -c "pip install -qqq requests 2>/dev/null && python3 outline_sync.py $*"
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Commands ──────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
cmd_init() {
|
||||||
|
require_cmd git
|
||||||
|
require_cmd docker
|
||||||
|
require_cmd jq
|
||||||
|
|
||||||
|
echo
|
||||||
|
info "════════════════════════════════════════════════════════════"
|
||||||
|
info " OUTLINE SYNC — init"
|
||||||
|
info "════════════════════════════════════════════════════════════"
|
||||||
|
echo
|
||||||
|
echo " Vault: $VAULT_DIR"
|
||||||
|
echo " Settings: $SETTINGS"
|
||||||
|
echo
|
||||||
|
|
||||||
|
# Guard: refuse if .git already exists
|
||||||
|
if [[ -d "$VAULT_DIR/.git" ]]; then
|
||||||
|
die "Vault is already initialized at $VAULT_DIR\n" \
|
||||||
|
" Remove the directory first, or use a different --vault path."
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Create vault dir on the host BEFORE Docker mounts it.
|
||||||
|
# If Docker creates the mount point itself it does so as root,
|
||||||
|
# making the Python container (running as the host user) unable to write.
|
||||||
|
mkdir -p "$VAULT_DIR"
|
||||||
|
|
||||||
|
# ── Step 1: Python — export Outline content + write config files ──────────
|
||||||
|
echo
|
||||||
|
info "Step 1/3 Exporting Outline content..."
|
||||||
|
echo
|
||||||
|
|
||||||
|
if ! run_python "init --vault /vault --settings /work/settings.json $VERBOSE_FLAG"; then
|
||||||
|
# Clean up incomplete vault so the user can retry cleanly
|
||||||
|
if [[ -d "$VAULT_DIR" ]]; then
|
||||||
|
warn "Export failed — removing incomplete vault directory."
|
||||||
|
rm -rf "$VAULT_DIR"
|
||||||
|
fi
|
||||||
|
die "Export step failed. Check the error output above and retry."
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo
|
||||||
|
|
||||||
|
# ── Step 2: git init + first commit on outline branch ────────────────────
|
||||||
|
info "Step 2/3 Initializing git repository..."
|
||||||
|
|
||||||
|
# Init the repo
|
||||||
|
git -C "$VAULT_DIR" init --quiet
|
||||||
|
|
||||||
|
# Rename initial branch to 'outline' before the first commit.
|
||||||
|
# We use symbolic-ref for compatibility with git < 2.28 (no --initial-branch).
|
||||||
|
git -C "$VAULT_DIR" symbolic-ref HEAD refs/heads/outline
|
||||||
|
|
||||||
|
git -C "$VAULT_DIR" add --all
|
||||||
|
|
||||||
|
TIMESTAMP="$(date -u +%Y-%m-%dT%H:%M:%SZ)"
|
||||||
|
git \
|
||||||
|
-C "$VAULT_DIR" \
|
||||||
|
-c user.name="outline-sync" \
|
||||||
|
-c user.email="sync@local" \
|
||||||
|
commit \
|
||||||
|
--quiet \
|
||||||
|
-m "sync: initial import from Outline @ $TIMESTAMP"
|
||||||
|
|
||||||
|
ok "Committed to 'outline' branch"
|
||||||
|
|
||||||
|
# Create 'main' at the same commit (this is the working branch for Obsidian)
|
||||||
|
git -C "$VAULT_DIR" checkout --quiet -b main
|
||||||
|
|
||||||
|
ok "Created 'main' branch"
|
||||||
|
|
||||||
|
# ── Step 3: verify ────────────────────────────────────────────────────────
|
||||||
|
echo
|
||||||
|
info "Step 3/3 Verifying..."
|
||||||
|
|
||||||
|
local outline_sha main_sha
|
||||||
|
outline_sha="$(git -C "$VAULT_DIR" rev-parse outline)"
|
||||||
|
main_sha="$(git -C "$VAULT_DIR" rev-parse main)"
|
||||||
|
|
||||||
|
if [[ "$outline_sha" == "$main_sha" ]]; then
|
||||||
|
ok "Both branches at commit ${outline_sha:0:8}"
|
||||||
|
else
|
||||||
|
die "Branch mismatch after init:\n" \
|
||||||
|
" outline = $outline_sha\n" \
|
||||||
|
" main = $main_sha"
|
||||||
|
fi
|
||||||
|
|
||||||
|
local doc_count
|
||||||
|
doc_count="$(find "$VAULT_DIR" -name "*.md" | wc -l | tr -d ' ')"
|
||||||
|
ok "$doc_count markdown files committed"
|
||||||
|
|
||||||
|
echo
|
||||||
|
info "════════════════════════════════════════════════════════════"
|
||||||
|
echo -e " ${GREEN}Vault ready at:${NC} $VAULT_DIR"
|
||||||
|
echo
|
||||||
|
echo " Next steps:"
|
||||||
|
echo " 1. Open $VAULT_DIR as your Obsidian vault"
|
||||||
|
echo " 2. Install the 'Obsidian Git' plugin"
|
||||||
|
echo " 3. Configure it to auto-commit on the 'main' branch"
|
||||||
|
info "════════════════════════════════════════════════════════════"
|
||||||
|
echo
|
||||||
|
}
|
||||||
|
|
||||||
|
cmd_help() {
|
||||||
|
echo "Outline ↔ Obsidian Sync"
|
||||||
|
echo
|
||||||
|
echo "Usage: $0 <command> [options]"
|
||||||
|
echo
|
||||||
|
echo "Commands:"
|
||||||
|
echo " init Initialize vault from Outline (Phase 1)"
|
||||||
|
echo " help Show this help"
|
||||||
|
echo
|
||||||
|
echo "Options:"
|
||||||
|
echo " --vault DIR Vault directory (overrides settings.json sync.vault_dir)"
|
||||||
|
echo " --settings FILE Path to settings file (default: ./settings.json)"
|
||||||
|
echo " -v Verbose output"
|
||||||
|
echo " -vv Debug output"
|
||||||
|
echo
|
||||||
|
echo "Examples:"
|
||||||
|
echo " $0 init"
|
||||||
|
echo " $0 init --vault /data/my-vault"
|
||||||
|
echo " $0 init -v"
|
||||||
|
echo
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Dispatch ──────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
case "$COMMAND" in
|
||||||
|
init) cmd_init ;;
|
||||||
|
help) cmd_help ;;
|
||||||
|
*) die "Unknown command: '$COMMAND' — run '$0 help' for usage." ;;
|
||||||
|
esac
|
||||||
514
sync_tests.sh
Executable file
514
sync_tests.sh
Executable file
@@ -0,0 +1,514 @@
|
|||||||
|
#!/usr/bin/env bash
|
||||||
|
#
|
||||||
|
# Outline Sync — Automated Test Suite
|
||||||
|
# Phase 1: TEST-1.1 through TEST-1.11
|
||||||
|
#
|
||||||
|
# Usage:
|
||||||
|
# ./sync_tests.sh Run all Phase 1 tests
|
||||||
|
# ./sync_tests.sh --phase 1 Explicit phase selection
|
||||||
|
# ./sync_tests.sh --keep Keep test vault on failure (for debugging)
|
||||||
|
# ./sync_tests.sh -v Verbose — show sync.sh output
|
||||||
|
#
|
||||||
|
# Requires: git, docker, jq, python3 (for local JSON parsing)
|
||||||
|
#
|
||||||
|
# The test creates a dedicated collection in Outline (named _sync_test_<timestamp>),
|
||||||
|
# runs sync init into a temp vault, checks all assertions, then cleans up.
|
||||||
|
#
|
||||||
|
|
||||||
|
set -uo pipefail # No -e: we capture failures ourselves
|
||||||
|
|
||||||
|
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||||
|
SETTINGS="$SCRIPT_DIR/settings.json"
|
||||||
|
|
||||||
|
# ── Test state ────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
PASS=0
|
||||||
|
FAIL=0
|
||||||
|
TOTAL=0
|
||||||
|
FAILED_TESTS=()
|
||||||
|
|
||||||
|
TEST_TS="$(date +%Y%m%d_%H%M%S)"
|
||||||
|
TEST_VAULT="/tmp/outline-sync-test-$$"
|
||||||
|
TEST_COLLECTION_NAME="_sync_test_${TEST_TS}"
|
||||||
|
|
||||||
|
# IDs populated by setup_test_data()
|
||||||
|
TEST_COLLECTION_ID=""
|
||||||
|
TEST_DOC_ROOT_ID="" # "RootDoc One" — leaf at collection root
|
||||||
|
TEST_DOC_PARENT_ID="" # "Parent Doc" — has children
|
||||||
|
TEST_DOC_CHILD1_ID="" # "Child One" — has grandchild
|
||||||
|
TEST_DOC_CHILD2_ID="" # "Child Two" — leaf under Parent Doc
|
||||||
|
TEST_DOC_GRANDCHILD_ID=""# "Grandchild" — leaf under Child One
|
||||||
|
|
||||||
|
# ── CLI flags ─────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
PHASE_FILTER=""
|
||||||
|
KEEP_ON_FAIL=0
|
||||||
|
VERBOSE=0
|
||||||
|
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case "$1" in
|
||||||
|
--phase) PHASE_FILTER="$2"; shift 2 ;;
|
||||||
|
--keep) KEEP_ON_FAIL=1; shift ;;
|
||||||
|
-v|--verbose) VERBOSE=1; shift ;;
|
||||||
|
*) shift ;;
|
||||||
|
esac
|
||||||
|
done
|
||||||
|
|
||||||
|
# ── Colours ───────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
GREEN='\033[0;32m'
|
||||||
|
RED='\033[0;31m'
|
||||||
|
YELLOW='\033[1;33m'
|
||||||
|
BLUE='\033[0;34m'
|
||||||
|
NC='\033[0m'
|
||||||
|
|
||||||
|
# ── Assertion helpers ─────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
_assert() {
|
||||||
|
local name="$1" result="$2" detail="${3:-}"
|
||||||
|
TOTAL=$(( TOTAL + 1 ))
|
||||||
|
if [[ "$result" == "pass" ]]; then
|
||||||
|
echo -e " ${GREEN}✓${NC} $name"
|
||||||
|
PASS=$(( PASS + 1 ))
|
||||||
|
else
|
||||||
|
echo -e " ${RED}✗${NC} $name"
|
||||||
|
[[ -n "$detail" ]] && echo -e " ${RED}↳ $detail${NC}"
|
||||||
|
FAIL=$(( FAIL + 1 ))
|
||||||
|
FAILED_TESTS+=("$name")
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
assert_dir() {
|
||||||
|
local name="$1" path="$2"
|
||||||
|
[[ -d "$path" ]] \
|
||||||
|
&& _assert "$name" "pass" \
|
||||||
|
|| _assert "$name" "fail" "Directory not found: $path"
|
||||||
|
}
|
||||||
|
|
||||||
|
assert_file() {
|
||||||
|
local name="$1" path="$2"
|
||||||
|
[[ -f "$path" ]] \
|
||||||
|
&& _assert "$name" "pass" \
|
||||||
|
|| _assert "$name" "fail" "File not found: $path"
|
||||||
|
}
|
||||||
|
|
||||||
|
assert_contains() {
|
||||||
|
local name="$1" path="$2" pattern="$3"
|
||||||
|
grep -q "$pattern" "$path" 2>/dev/null \
|
||||||
|
&& _assert "$name" "pass" \
|
||||||
|
|| _assert "$name" "fail" "Pattern '$pattern' not found in $path"
|
||||||
|
}
|
||||||
|
|
||||||
|
assert_eq() {
|
||||||
|
local name="$1" got="$2" want="$3"
|
||||||
|
[[ "$got" == "$want" ]] \
|
||||||
|
&& _assert "$name" "pass" \
|
||||||
|
|| _assert "$name" "fail" "Expected '$want', got '$got'"
|
||||||
|
}
|
||||||
|
|
||||||
|
assert_nonzero_exit() {
|
||||||
|
# assert_nonzero_exit <name> <exit_code>
|
||||||
|
local name="$1" code="$2"
|
||||||
|
[[ "$code" -ne 0 ]] \
|
||||||
|
&& _assert "$name" "pass" \
|
||||||
|
|| _assert "$name" "fail" "Expected non-zero exit code, got 0"
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Docker API helper ─────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
# Read API credentials once
|
||||||
|
_API_URL="$(jq -r '.source.url' "$SETTINGS" 2>/dev/null)"
|
||||||
|
_API_TOKEN="$(jq -r '.source.token' "$SETTINGS" 2>/dev/null)"
|
||||||
|
|
||||||
|
api_py() {
|
||||||
|
# api_py [-v vault_path] <python_snippet>
|
||||||
|
# Runs Python inside Docker with domnet access.
|
||||||
|
# Optional -v mounts a vault directory as /vault (read-only).
|
||||||
|
local vault_mount=""
|
||||||
|
if [[ "$1" == "-v" ]]; then
|
||||||
|
vault_mount="-v $2:/vault:ro"
|
||||||
|
shift 2
|
||||||
|
fi
|
||||||
|
local code="$1"
|
||||||
|
|
||||||
|
# shellcheck disable=SC2086
|
||||||
|
docker run --rm \
|
||||||
|
--network domnet \
|
||||||
|
${vault_mount} \
|
||||||
|
-e OUTLINE_URL="$_API_URL" \
|
||||||
|
-e OUTLINE_TOKEN="$_API_TOKEN" \
|
||||||
|
python:3.11-slim \
|
||||||
|
python3 - <<PYEOF
|
||||||
|
import os, sys, json, glob, requests
|
||||||
|
|
||||||
|
url = os.environ["OUTLINE_URL"].rstrip("/")
|
||||||
|
token = os.environ["OUTLINE_TOKEN"]
|
||||||
|
headers = {
|
||||||
|
"Authorization": f"Bearer {token}",
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
}
|
||||||
|
|
||||||
|
def api(endpoint, data=None):
|
||||||
|
r = requests.post(f"{url}{endpoint}", headers=headers, json=data or {}, timeout=30)
|
||||||
|
r.raise_for_status()
|
||||||
|
return r.json()
|
||||||
|
|
||||||
|
$code
|
||||||
|
PYEOF
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Setup: create test data in Outline ────────────────────────────────────────
|
||||||
|
|
||||||
|
setup_test_data() {
|
||||||
|
echo
|
||||||
|
echo -e "${BLUE}Creating test data in Outline...${NC}"
|
||||||
|
|
||||||
|
local result
|
||||||
|
result="$(api_py "
|
||||||
|
coll_name = '${TEST_COLLECTION_NAME}'
|
||||||
|
|
||||||
|
# Collection
|
||||||
|
coll = api('/api/collections.create', {
|
||||||
|
'name': coll_name,
|
||||||
|
'permission': 'read_write',
|
||||||
|
})
|
||||||
|
coll_id = coll['data']['id']
|
||||||
|
|
||||||
|
# Root leaf document
|
||||||
|
doc_root = api('/api/documents.create', {
|
||||||
|
'collectionId': coll_id,
|
||||||
|
'title': 'RootDoc One',
|
||||||
|
'text': 'Root document one content.',
|
||||||
|
'publish': True,
|
||||||
|
})
|
||||||
|
doc_root_id = doc_root['data']['id']
|
||||||
|
|
||||||
|
# Parent document (will have children)
|
||||||
|
doc_parent = api('/api/documents.create', {
|
||||||
|
'collectionId': coll_id,
|
||||||
|
'title': 'Parent Doc',
|
||||||
|
'text': 'Parent document content.',
|
||||||
|
'publish': True,
|
||||||
|
})
|
||||||
|
doc_parent_id = doc_parent['data']['id']
|
||||||
|
|
||||||
|
# Child One (will have grandchild)
|
||||||
|
doc_child1 = api('/api/documents.create', {
|
||||||
|
'collectionId': coll_id,
|
||||||
|
'title': 'Child One',
|
||||||
|
'text': 'Child one content.',
|
||||||
|
'publish': True,
|
||||||
|
'parentDocumentId': doc_parent_id,
|
||||||
|
})
|
||||||
|
doc_child1_id = doc_child1['data']['id']
|
||||||
|
|
||||||
|
# Child Two (leaf)
|
||||||
|
doc_child2 = api('/api/documents.create', {
|
||||||
|
'collectionId': coll_id,
|
||||||
|
'title': 'Child Two',
|
||||||
|
'text': 'Child two content.',
|
||||||
|
'publish': True,
|
||||||
|
'parentDocumentId': doc_parent_id,
|
||||||
|
})
|
||||||
|
doc_child2_id = doc_child2['data']['id']
|
||||||
|
|
||||||
|
# Grandchild (leaf under Child One)
|
||||||
|
doc_gc = api('/api/documents.create', {
|
||||||
|
'collectionId': coll_id,
|
||||||
|
'title': 'Grandchild',
|
||||||
|
'text': 'Grandchild content.',
|
||||||
|
'publish': True,
|
||||||
|
'parentDocumentId': doc_child1_id,
|
||||||
|
})
|
||||||
|
doc_gc_id = doc_gc['data']['id']
|
||||||
|
|
||||||
|
print(json.dumps({
|
||||||
|
'collection_id': coll_id,
|
||||||
|
'doc_root_id': doc_root_id,
|
||||||
|
'doc_parent_id': doc_parent_id,
|
||||||
|
'doc_child1_id': doc_child1_id,
|
||||||
|
'doc_child2_id': doc_child2_id,
|
||||||
|
'doc_gc_id': doc_gc_id,
|
||||||
|
}))
|
||||||
|
")"
|
||||||
|
|
||||||
|
TEST_COLLECTION_ID="$( echo "$result" | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['collection_id'])")"
|
||||||
|
TEST_DOC_ROOT_ID="$( echo "$result" | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['doc_root_id'])")"
|
||||||
|
TEST_DOC_PARENT_ID="$( echo "$result" | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['doc_parent_id'])")"
|
||||||
|
TEST_DOC_CHILD1_ID="$( echo "$result" | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['doc_child1_id'])")"
|
||||||
|
TEST_DOC_CHILD2_ID="$( echo "$result" | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['doc_child2_id'])")"
|
||||||
|
TEST_DOC_GRANDCHILD_ID="$(echo "$result"| python3 -c "import sys,json; d=json.load(sys.stdin); print(d['doc_gc_id'])")"
|
||||||
|
|
||||||
|
echo -e " ${GREEN}✓${NC} Test collection: $TEST_COLLECTION_NAME"
|
||||||
|
echo -e " ${GREEN}✓${NC} 5 documents created (hierarchy: root, parent→child1→grandchild, child2)"
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Teardown ──────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
teardown() {
|
||||||
|
echo
|
||||||
|
echo -e "${BLUE}Cleaning up...${NC}"
|
||||||
|
|
||||||
|
if [[ -n "$TEST_COLLECTION_ID" ]]; then
|
||||||
|
api_py "api('/api/collections.delete', {'id': '$TEST_COLLECTION_ID'})" &>/dev/null \
|
||||||
|
&& echo -e " ${GREEN}✓${NC} Test collection deleted from Outline" \
|
||||||
|
|| echo -e " ${YELLOW}⚠${NC} Could not delete collection $TEST_COLLECTION_ID — delete manually"
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [[ -d "$TEST_VAULT" ]]; then
|
||||||
|
if [[ $FAIL -gt 0 && $KEEP_ON_FAIL -eq 1 ]]; then
|
||||||
|
echo -e " ${YELLOW}⚠${NC} Keeping test vault for inspection: $TEST_VAULT"
|
||||||
|
else
|
||||||
|
rm -rf "$TEST_VAULT"
|
||||||
|
echo -e " ${GREEN}✓${NC} Test vault removed"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Run sync init ─────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
run_init() {
|
||||||
|
echo
|
||||||
|
echo -e "${BLUE}Running sync init...${NC}"
|
||||||
|
echo
|
||||||
|
|
||||||
|
local verbose_flag=""
|
||||||
|
[[ $VERBOSE -eq 1 ]] && verbose_flag="-v"
|
||||||
|
|
||||||
|
# shellcheck disable=SC2086
|
||||||
|
if ! "$SCRIPT_DIR/sync.sh" init \
|
||||||
|
--vault "$TEST_VAULT" \
|
||||||
|
--settings "$SETTINGS" \
|
||||||
|
$verbose_flag; then
|
||||||
|
echo -e "${RED}✗ sync init failed — cannot run tests${NC}"
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Phase 1 tests ─────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
run_phase_1_tests() {
|
||||||
|
local COLL_DIR="$TEST_VAULT/$TEST_COLLECTION_NAME"
|
||||||
|
|
||||||
|
echo
|
||||||
|
echo -e "${BLUE}Phase 1 tests${NC}"
|
||||||
|
echo
|
||||||
|
|
||||||
|
# ── TEST-1.1: vault directory created ─────────────────────────────────────
|
||||||
|
assert_dir "TEST-1.1 vault directory created" "$TEST_VAULT"
|
||||||
|
|
||||||
|
# ── TEST-1.2: git repo with outline and main branches ─────────────────────
|
||||||
|
local branches
|
||||||
|
branches="$(git -C "$TEST_VAULT" branch 2>/dev/null || true)"
|
||||||
|
|
||||||
|
if git -C "$TEST_VAULT" rev-parse --git-dir &>/dev/null; then
|
||||||
|
_assert "TEST-1.2 git repo initialized" "pass"
|
||||||
|
else
|
||||||
|
_assert "TEST-1.2 git repo initialized" "fail" "No .git directory found"
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo "$branches" | grep -qE "^\*?\s+outline$" \
|
||||||
|
&& _assert "TEST-1.2 'outline' branch exists" "pass" \
|
||||||
|
|| _assert "TEST-1.2 'outline' branch exists" "fail" "outline branch not found in: $branches"
|
||||||
|
|
||||||
|
echo "$branches" | grep -qE "^\*?\s+main$" \
|
||||||
|
&& _assert "TEST-1.2 'main' branch exists" "pass" \
|
||||||
|
|| _assert "TEST-1.2 'main' branch exists" "fail" "main branch not found in: $branches"
|
||||||
|
|
||||||
|
# ── TEST-1.3: test collection folder created ───────────────────────────────
|
||||||
|
assert_dir "TEST-1.3 test collection folder exists" "$COLL_DIR"
|
||||||
|
|
||||||
|
# ── TEST-1.4: every .md file has frontmatter ──────────────────────────────
|
||||||
|
local md_count=0 missing_fm=0
|
||||||
|
while IFS= read -r -d '' f; do
|
||||||
|
md_count=$(( md_count + 1 ))
|
||||||
|
head -1 "$f" | grep -q "^---$" || missing_fm=$(( missing_fm + 1 ))
|
||||||
|
done < <(find "$TEST_VAULT" -name "*.md" -print0 2>/dev/null)
|
||||||
|
|
||||||
|
if [[ $md_count -gt 0 && $missing_fm -eq 0 ]]; then
|
||||||
|
_assert "TEST-1.4 all .md files have frontmatter (checked $md_count files)" "pass"
|
||||||
|
else
|
||||||
|
_assert "TEST-1.4 all .md files have frontmatter" "fail" \
|
||||||
|
"$missing_fm / $md_count files missing frontmatter"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ── TEST-1.5: frontmatter has required fields ──────────────────────────────
|
||||||
|
local missing_fields=0
|
||||||
|
while IFS= read -r -d '' f; do
|
||||||
|
for field in outline_id outline_collection_id outline_updated_at; do
|
||||||
|
grep -q "^${field}: " "$f" || {
|
||||||
|
missing_fields=$(( missing_fields + 1 ))
|
||||||
|
[[ $VERBOSE -eq 1 ]] && echo " missing '$field' in $f"
|
||||||
|
}
|
||||||
|
done
|
||||||
|
done < <(find "$TEST_VAULT" -name "*.md" -print0 2>/dev/null)
|
||||||
|
|
||||||
|
[[ $missing_fields -eq 0 ]] \
|
||||||
|
&& _assert "TEST-1.5 required frontmatter fields present in all files" "pass" \
|
||||||
|
|| _assert "TEST-1.5 required frontmatter fields present in all files" "fail" \
|
||||||
|
"$missing_fields missing field occurrences across all files"
|
||||||
|
|
||||||
|
# ── TEST-1.6: outline_id matches actual Outline API ───────────────────────
|
||||||
|
local api_result
|
||||||
|
api_result="$(api_py -v "$TEST_VAULT" "
|
||||||
|
import glob, os
|
||||||
|
|
||||||
|
results = []
|
||||||
|
for fpath in glob.glob('/vault/**/*.md', recursive=True):
|
||||||
|
with open(fpath) as fh:
|
||||||
|
content = fh.read()
|
||||||
|
|
||||||
|
if not content.startswith('---\n'):
|
||||||
|
continue
|
||||||
|
end = content.find('\n---\n', 4)
|
||||||
|
if end == -1:
|
||||||
|
continue
|
||||||
|
|
||||||
|
fm_text = content[4:end]
|
||||||
|
fm = dict(
|
||||||
|
line.split(': ', 1)
|
||||||
|
for line in fm_text.splitlines()
|
||||||
|
if ': ' in line
|
||||||
|
)
|
||||||
|
doc_id = fm.get('outline_id', '').strip()
|
||||||
|
if not doc_id:
|
||||||
|
results.append(f'MISSING_ID:{fpath}')
|
||||||
|
continue
|
||||||
|
|
||||||
|
try:
|
||||||
|
r = api('/api/documents.info', {'id': doc_id})
|
||||||
|
returned_id = r['data']['id']
|
||||||
|
if returned_id != doc_id:
|
||||||
|
results.append(f'MISMATCH:{fpath} has {doc_id} but API returned {returned_id}')
|
||||||
|
except Exception as e:
|
||||||
|
results.append(f'NOTFOUND:{fpath} id={doc_id} err={e}')
|
||||||
|
|
||||||
|
print('PASS' if not results else 'FAIL:' + ';'.join(results))
|
||||||
|
" 2>/dev/null)"
|
||||||
|
|
||||||
|
[[ "$api_result" == "PASS" ]] \
|
||||||
|
&& _assert "TEST-1.6 outline_id matches Outline API for all files" "pass" \
|
||||||
|
|| _assert "TEST-1.6 outline_id matches Outline API for all files" "fail" "$api_result"
|
||||||
|
|
||||||
|
# ── TEST-1.7: folder hierarchy matches Outline document tree ──────────────
|
||||||
|
#
|
||||||
|
# Expected layout for our test data:
|
||||||
|
# $COLL_DIR/
|
||||||
|
# RootDoc One.md ← leaf at root
|
||||||
|
# Parent Doc/
|
||||||
|
# Parent Doc.md ← parent (has children) → inside own folder
|
||||||
|
# Child One/
|
||||||
|
# Child One.md ← child1 (has grandchild) → inside own folder
|
||||||
|
# Grandchild.md ← leaf grandchild
|
||||||
|
# Child Two.md ← leaf child2 (no children) → flat file
|
||||||
|
|
||||||
|
assert_file "TEST-1.7 leaf root doc is flat file" \
|
||||||
|
"$COLL_DIR/RootDoc One.md"
|
||||||
|
|
||||||
|
assert_dir "TEST-1.7 parent-with-children gets its own subfolder" \
|
||||||
|
"$COLL_DIR/Parent Doc"
|
||||||
|
|
||||||
|
assert_file "TEST-1.7 parent doc file lives inside its subfolder" \
|
||||||
|
"$COLL_DIR/Parent Doc/Parent Doc.md"
|
||||||
|
|
||||||
|
assert_dir "TEST-1.7 child-with-grandchild gets its own subfolder" \
|
||||||
|
"$COLL_DIR/Parent Doc/Child One"
|
||||||
|
|
||||||
|
assert_file "TEST-1.7 child doc file lives inside its subfolder" \
|
||||||
|
"$COLL_DIR/Parent Doc/Child One/Child One.md"
|
||||||
|
|
||||||
|
assert_file "TEST-1.7 grandchild (leaf) is flat file in parent's folder" \
|
||||||
|
"$COLL_DIR/Parent Doc/Child One/Grandchild.md"
|
||||||
|
|
||||||
|
assert_file "TEST-1.7 leaf child sibling is flat file in parent's folder" \
|
||||||
|
"$COLL_DIR/Parent Doc/Child Two.md"
|
||||||
|
|
||||||
|
# ── TEST-1.8: settings.json in .gitignore ─────────────────────────────────
|
||||||
|
assert_contains "TEST-1.8 settings.json is gitignored" \
|
||||||
|
"$TEST_VAULT/.gitignore" "settings.json"
|
||||||
|
|
||||||
|
# ── TEST-1.9: .obsidian/ in .gitignore ────────────────────────────────────
|
||||||
|
assert_contains "TEST-1.9 .obsidian/ is gitignored" \
|
||||||
|
"$TEST_VAULT/.gitignore" ".obsidian/"
|
||||||
|
|
||||||
|
# ── TEST-1.10: outline and main branches at same commit ───────────────────
|
||||||
|
local outline_sha main_sha
|
||||||
|
outline_sha="$(git -C "$TEST_VAULT" rev-parse outline 2>/dev/null || true)"
|
||||||
|
main_sha="$( git -C "$TEST_VAULT" rev-parse main 2>/dev/null || true)"
|
||||||
|
assert_eq "TEST-1.10 outline and main branches point to same commit" \
|
||||||
|
"$outline_sha" "$main_sha"
|
||||||
|
|
||||||
|
# ── TEST-1.11: re-running init aborts with non-zero exit ──────────────────
|
||||||
|
local reinit_exit=0
|
||||||
|
"$SCRIPT_DIR/sync.sh" init \
|
||||||
|
--vault "$TEST_VAULT" \
|
||||||
|
--settings "$SETTINGS" \
|
||||||
|
&>/dev/null \
|
||||||
|
|| reinit_exit=$?
|
||||||
|
assert_nonzero_exit "TEST-1.11 re-running init on existing vault exits non-zero" \
|
||||||
|
"$reinit_exit"
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Summary ───────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
print_summary() {
|
||||||
|
echo
|
||||||
|
echo "════════════════════════════════════════════════════════════"
|
||||||
|
echo " TEST SUMMARY"
|
||||||
|
echo "════════════════════════════════════════════════════════════"
|
||||||
|
echo -e " Passed : ${GREEN}$PASS${NC}"
|
||||||
|
echo -e " Failed : ${RED}$FAIL${NC}"
|
||||||
|
echo -e " Total : $TOTAL"
|
||||||
|
|
||||||
|
if [[ ${#FAILED_TESTS[@]} -gt 0 ]]; then
|
||||||
|
echo
|
||||||
|
echo " Failed tests:"
|
||||||
|
for t in "${FAILED_TESTS[@]}"; do
|
||||||
|
echo -e " ${RED}✗${NC} $t"
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
echo "════════════════════════════════════════════════════════════"
|
||||||
|
echo
|
||||||
|
}
|
||||||
|
|
||||||
|
# ── Main ──────────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
main() {
|
||||||
|
echo
|
||||||
|
echo -e "${BLUE}════════════════════════════════════════════════════════════${NC}"
|
||||||
|
echo -e "${BLUE} OUTLINE SYNC — Test Suite (Phase 1)${NC}"
|
||||||
|
echo -e "${BLUE}════════════════════════════════════════════════════════════${NC}"
|
||||||
|
echo
|
||||||
|
echo -e " Settings : $SETTINGS"
|
||||||
|
echo -e " Vault : ${YELLOW}$TEST_VAULT${NC} (temporary)"
|
||||||
|
echo -e " Collection: ${YELLOW}$TEST_COLLECTION_NAME${NC} (temporary)"
|
||||||
|
echo
|
||||||
|
|
||||||
|
# Dependency checks
|
||||||
|
command -v git &>/dev/null || { echo -e "${RED}✗ git is required${NC}"; exit 1; }
|
||||||
|
command -v docker &>/dev/null || { echo -e "${RED}✗ docker is required${NC}"; exit 1; }
|
||||||
|
command -v jq &>/dev/null || { echo -e "${RED}✗ jq is required${NC}"; exit 1; }
|
||||||
|
command -v python3 &>/dev/null|| { echo -e "${RED}✗ python3 is required${NC}";exit 1; }
|
||||||
|
|
||||||
|
[[ -f "$SETTINGS" ]] || {
|
||||||
|
echo -e "${RED}✗ settings.json not found at $SETTINGS${NC}"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
# Register cleanup so it always runs, even on Ctrl-C
|
||||||
|
trap teardown EXIT
|
||||||
|
|
||||||
|
setup_test_data
|
||||||
|
run_init
|
||||||
|
|
||||||
|
if [[ -z "$PHASE_FILTER" || "$PHASE_FILTER" == "1" ]]; then
|
||||||
|
run_phase_1_tests
|
||||||
|
fi
|
||||||
|
|
||||||
|
print_summary
|
||||||
|
|
||||||
|
# Exit with failure code if any test failed
|
||||||
|
[[ $FAIL -eq 0 ]]
|
||||||
|
}
|
||||||
|
|
||||||
|
main
|
||||||
210
tests/USER_STORIES.md
Normal file
210
tests/USER_STORIES.md
Normal file
@@ -0,0 +1,210 @@
|
|||||||
|
# User Stories — Outline Sync Web UI
|
||||||
|
**Derived from:** WEBUI_PRD.md v2.0
|
||||||
|
**Date:** 2026-03-07
|
||||||
|
|
||||||
|
Each story follows the format: **As a [user], I want [goal] so that [benefit].**
|
||||||
|
Acceptance criteria map directly to automated test IDs in the corresponding `test_phase_*.py` files.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase A — WebDAV Container
|
||||||
|
|
||||||
|
**US-A1** — As a user with Obsidian running locally, I want my vault files to sync automatically to the VPS via WebDAV, so that I do not need terminal access for file transfer.
|
||||||
|
- AC: A file created in the vault directory is retrievable via WebDAV GET within the sync interval.
|
||||||
|
- AC: A file updated locally appears updated on the WebDAV server after sync.
|
||||||
|
- AC: Deleted files are removed from the WebDAV share.
|
||||||
|
- Tests: `test_phase_a_webdav.py::TestWebDAVFileOps`
|
||||||
|
|
||||||
|
**US-A2** — As a system administrator, I want the WebDAV endpoint protected by basic auth and Tailscale network isolation, so that the vault is not publicly accessible.
|
||||||
|
- AC: Unauthenticated requests return 401.
|
||||||
|
- AC: Requests with valid credentials return 200.
|
||||||
|
- AC: The WebDAV port is not bound to the public interface.
|
||||||
|
- Tests: `test_phase_a_webdav.py::TestWebDAVAuth`
|
||||||
|
|
||||||
|
**US-A3** — As a user, I want the `.git/` directory excluded from WebDAV access, so that git internals are not exposed or corrupted by Obsidian.
|
||||||
|
- AC: GET request to `/.git/` returns 403 or 404.
|
||||||
|
- AC: Obsidian plugin cannot overwrite `.git/` files via WebDAV.
|
||||||
|
- Tests: `test_phase_a_webdav.py::TestWebDAVGitProtection`
|
||||||
|
|
||||||
|
**US-A4** — As an Obsidian user, I want WebDAV to support bidirectional file sync (upload, download, delete), so that both push and pull directions work without manual steps.
|
||||||
|
- AC: PROPFIND returns correct file listing.
|
||||||
|
- AC: PUT creates/updates files.
|
||||||
|
- AC: DELETE removes files.
|
||||||
|
- Tests: `test_phase_a_webdav.py::TestWebDAVMethods`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase B — Read-Only Dashboard
|
||||||
|
|
||||||
|
**US-B1** — As a user, I want to open `https://sync.domverse.de` and immediately see the current vault state, so that I know if anything needs attention before syncing.
|
||||||
|
- AC: GET `/` returns HTTP 200 with HTML content.
|
||||||
|
- AC: Page contains vault status badge (Clean / Dirty / Conflicts).
|
||||||
|
- AC: Page shows count of pending local changes.
|
||||||
|
- Tests: `test_phase_b_dashboard.py::TestDashboardPage`
|
||||||
|
|
||||||
|
**US-B2** — As a user, I want to see when the vault was last pulled from Outline and last pushed to Outline, so that I can judge how stale my local state is.
|
||||||
|
- AC: GET `/status` returns JSON with `last_pull`, `last_push` timestamps.
|
||||||
|
- AC: Dashboard renders these timestamps in human-readable form.
|
||||||
|
- Tests: `test_phase_b_dashboard.py::TestStatusEndpoint`
|
||||||
|
|
||||||
|
**US-B3** — As a user, I want the dashboard to show a conflict warning badge when git merge conflicts are present, so that I do not accidentally push broken files.
|
||||||
|
- AC: When conflict files exist, `/status` includes `conflicts: N` where N > 0.
|
||||||
|
- AC: Dashboard shows warning banner with link to `/conflicts`.
|
||||||
|
- Tests: `test_phase_b_dashboard.py::TestConflictBadge`
|
||||||
|
|
||||||
|
**US-B4** — As a user, I want the dashboard to show how many files Obsidian has written via WebDAV that are not yet pushed to Outline, so that I know the push button's scope.
|
||||||
|
- AC: Pending count = `git diff outline..main --name-only | wc -l`.
|
||||||
|
- AC: Count updates on each page load without manual refresh.
|
||||||
|
- Tests: `test_phase_b_dashboard.py::TestPendingCount`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase C — Pull with Live Output
|
||||||
|
|
||||||
|
**US-C1** — As a user, I want to click "Get from Outline" and see live streaming output in the browser, so that I can monitor progress without terminal access.
|
||||||
|
- AC: POST `/pull` responds immediately with a `job_id`.
|
||||||
|
- AC: GET `/stream/{job_id}` returns `text/event-stream` content type.
|
||||||
|
- AC: SSE stream emits at least one `data:` event per document processed.
|
||||||
|
- AC: Stream ends with a `done` event containing a summary.
|
||||||
|
- Tests: `test_phase_c_pull.py::TestPullStreaming`
|
||||||
|
|
||||||
|
**US-C2** — As a user, I want the pull to fetch new Outline documents and update them in the vault, so that Obsidian shows the latest wiki content.
|
||||||
|
- AC: After pull, new documents appear as `.md` files in the vault.
|
||||||
|
- AC: Modified documents have updated content.
|
||||||
|
- AC: The `outline` branch is advanced to reflect the new Outline state.
|
||||||
|
- Tests: `test_phase_c_pull.py::TestPullContent`
|
||||||
|
|
||||||
|
**US-C3** — As a user, I want the pull operation to be idempotent when nothing changed in Outline, so that repeated pulls are safe and fast.
|
||||||
|
- AC: Pull with no Outline changes returns success with "0 changes" summary.
|
||||||
|
- AC: No git commits are made when there are no changes.
|
||||||
|
- Tests: `test_phase_c_pull.py::TestPullIdempotent`
|
||||||
|
|
||||||
|
**US-C4** — As a user, I want only one sync job running at a time, so that concurrent pull/push operations do not corrupt the vault.
|
||||||
|
- AC: Starting a pull while one is in progress returns HTTP 409 Conflict.
|
||||||
|
- AC: Job lock is released when the stream closes (done or error).
|
||||||
|
- Tests: `test_phase_c_pull.py::TestJobLock`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase D — Pending Changes View
|
||||||
|
|
||||||
|
**US-D1** — As a user, I want to see a structured list of pending changes before pushing, so that I can review what will be sent to Outline.
|
||||||
|
- AC: GET `/changes` returns 200 with a list of change objects.
|
||||||
|
- AC: Each item has a `path`, `status` (modified/added/renamed/deleted) and `action` (what the sync engine will do).
|
||||||
|
- Tests: `test_phase_d_changes.py::TestChangesEndpoint`
|
||||||
|
|
||||||
|
**US-D2** — As a user, I want modified files listed separately from new files and deleted files, so that I understand the scope of each change type.
|
||||||
|
- AC: `status=modified` for files changed since last outline branch commit.
|
||||||
|
- AC: `status=added` for files not on the outline branch at all.
|
||||||
|
- AC: `status=deleted` for files removed from main but still on outline.
|
||||||
|
- AC: `status=renamed` with both `from` and `to` paths.
|
||||||
|
- Tests: `test_phase_d_changes.py::TestChangeCategories`
|
||||||
|
|
||||||
|
**US-D3** — As a user, I want to preview the diff for a modified file inline, so that I can confirm the content before pushing.
|
||||||
|
- AC: GET `/diff/{encoded_path}` returns an HTML fragment with two columns.
|
||||||
|
- AC: Left column shows the outline branch version, right shows main branch version.
|
||||||
|
- AC: Added lines are highlighted green, removed lines red.
|
||||||
|
- Tests: `test_phase_d_changes.py::TestDiffPreview`
|
||||||
|
|
||||||
|
**US-D4** — As a user, I want deleted files shown as "skipped" when deletions are disabled in settings, so that I know why they are not being removed from Outline.
|
||||||
|
- AC: When `allow_deletions=false`, deleted files appear with `action=skip`.
|
||||||
|
- AC: Reason text explains deletions are disabled.
|
||||||
|
- Tests: `test_phase_d_changes.py::TestDeletedFilesSkipped`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase E — Push with Live Output
|
||||||
|
|
||||||
|
**US-E1** — As a user, I want to click "Send to Outline" and see live streaming output, so that I can monitor progress for each file.
|
||||||
|
- AC: POST `/push` returns a `job_id`.
|
||||||
|
- AC: SSE stream emits one event per file with status (created/updated/skipped/error).
|
||||||
|
- AC: Final event contains summary counts (created, updated, skipped, errors).
|
||||||
|
- Tests: `test_phase_e_push.py::TestPushStreaming`
|
||||||
|
|
||||||
|
**US-E2** — As a user, I want new Obsidian files to appear in Outline under the correct collection and parent, so that the hierarchy is preserved.
|
||||||
|
- AC: A file `Projekte/NewNote.md` (no frontmatter) is created in Outline under the "Projekte" collection.
|
||||||
|
- AC: After push, the file receives frontmatter with `outline_id`.
|
||||||
|
- AC: The updated file is committed and becomes readable via WebDAV.
|
||||||
|
- Tests: `test_phase_e_push.py::TestNewFileCreation`
|
||||||
|
|
||||||
|
**US-E3** — As a user, I want push to be blocked when there are unresolved conflicts, so that I cannot push broken files with conflict markers.
|
||||||
|
- AC: When `git ls-files -u` returns conflict files, POST `/push` returns HTTP 409.
|
||||||
|
- AC: Response body includes list of conflicting paths.
|
||||||
|
- Tests: `test_phase_e_push.py::TestPushBlockedByConflicts`
|
||||||
|
|
||||||
|
**US-E4** — As a user, I want a new top-level folder in Obsidian to create a new Outline collection automatically, so that new categories do not require manual Outline setup.
|
||||||
|
- AC: Folder `NewCollection/` not mapped to any existing collection → `collections.create` called.
|
||||||
|
- AC: Documents inside the new folder are created under the new collection.
|
||||||
|
- Tests: `test_phase_e_push.py::TestNewCollectionCreation`
|
||||||
|
|
||||||
|
**US-E5** — As a user, I want the push to handle renames (file moved/title changed) without deleting and recreating the document, so that Outline document history and URL are preserved.
|
||||||
|
- AC: Renamed file detected via git rename detection.
|
||||||
|
- AC: Outline `documents.update` called (not delete+create).
|
||||||
|
- Tests: `test_phase_e_push.py::TestRenameHandling`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase F — Conflict Resolution
|
||||||
|
|
||||||
|
**US-F1** — As a user, I want to see all version conflicts listed in the browser, so that I can resolve them without using git on the command line.
|
||||||
|
- AC: GET `/conflicts` returns list of conflict file paths.
|
||||||
|
- AC: Each item includes local timestamp and Outline edit timestamp.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestConflictsList`
|
||||||
|
|
||||||
|
**US-F2** — As a user, I want a side-by-side diff view per conflicting file, so that I can compare my Obsidian edit with the Outline edit before choosing.
|
||||||
|
- AC: GET `/diff/{encoded_path}` for a conflict file returns two-column HTML diff.
|
||||||
|
- AC: Diff is rendered using Python `difflib` or equivalent.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestConflictDiff`
|
||||||
|
|
||||||
|
**US-F3** — As a user, I want to click "Keep mine" to accept my local Obsidian edit, so that my changes win.
|
||||||
|
- AC: POST `/resolve` with `{file: "path", accept: "local"}` resolves conflict in favour of local.
|
||||||
|
- AC: Conflict markers are removed from the file.
|
||||||
|
- AC: File is committed to main branch.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestResolveLocal`
|
||||||
|
|
||||||
|
**US-F4** — As a user, I want to click "Keep Outline's" to accept the Outline version, so that the wiki state wins.
|
||||||
|
- AC: POST `/resolve` with `{file: "path", accept: "remote"}` resolves in favour of outline branch.
|
||||||
|
- AC: Conflict markers are removed.
|
||||||
|
- AC: File is committed to main branch.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestResolveRemote`
|
||||||
|
|
||||||
|
**US-F5** — As a user, I want the system to reject invalid file paths in resolve requests, so that an attacker cannot trigger arbitrary git operations via the UI.
|
||||||
|
- AC: `/resolve` with a path not in the conflict list returns HTTP 422.
|
||||||
|
- AC: Path traversal attempts (`../`) return HTTP 422.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestResolveValidation`
|
||||||
|
|
||||||
|
**US-F6** — As a user, I want to be redirected to the dashboard after resolving all conflicts, with "Push now available" displayed, so that the workflow continues naturally.
|
||||||
|
- AC: After last conflict resolved, GET `/conflicts` returns empty list.
|
||||||
|
- AC: Dashboard status badge updates to show clean/push-ready state.
|
||||||
|
- Tests: `test_phase_f_conflicts.py::TestAllConflictsResolved`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Phase G — Sync History
|
||||||
|
|
||||||
|
**US-G1** — As a user, I want to view a chronological history of all sync operations, so that I can audit what changed and when.
|
||||||
|
- AC: GET `/history` returns HTTP 200 with HTML content.
|
||||||
|
- AC: Sync entries are shown in reverse chronological order.
|
||||||
|
- AC: Each entry shows: timestamp, direction (pull/push), files affected, status.
|
||||||
|
- Tests: `test_phase_g_history.py::TestHistoryPage`
|
||||||
|
|
||||||
|
**US-G2** — As a user, I want the history sourced from `_sync_log.md`, so that it remains readable as a plain Obsidian note.
|
||||||
|
- AC: `_sync_log.md` in vault root is parsed into structured records.
|
||||||
|
- AC: Entries are displayed as an HTML table, not raw markdown.
|
||||||
|
- Tests: `test_phase_g_history.py::TestSyncLogParsing`
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## End-to-End Flows
|
||||||
|
|
||||||
|
**US-E2E1** — As a user, I want the full Obsidian → Outline flow to work without terminal access.
|
||||||
|
- Tests: `test_e2e.py::TestObsidianToOutlineFlow`
|
||||||
|
|
||||||
|
**US-E2E2** — As a user, I want the full Outline → Obsidian flow to work without terminal access.
|
||||||
|
- Tests: `test_e2e.py::TestOutlineToObsidianFlow`
|
||||||
|
|
||||||
|
**US-E2E3** — As a user, I want conflicts detected and resolvable end-to-end through the browser.
|
||||||
|
- Tests: `test_e2e.py::TestConflictResolutionFlow`
|
||||||
|
|
||||||
|
**US-E2E4** — As a user, I want a new file created in Obsidian to reach Outline with correct hierarchy, frontmatter written back, and the ID visible in Obsidian on the next sync.
|
||||||
|
- Tests: `test_e2e.py::TestNewFileRoundTrip`
|
||||||
270
tests/conftest.py
Normal file
270
tests/conftest.py
Normal file
@@ -0,0 +1,270 @@
|
|||||||
|
# Ensure the project root (webui.py) is importable before any test collection.
|
||||||
|
import sys
|
||||||
|
from pathlib import Path
|
||||||
|
_ROOT = Path(__file__).parent.parent
|
||||||
|
if str(_ROOT) not in sys.path:
|
||||||
|
sys.path.insert(0, str(_ROOT))
|
||||||
|
|
||||||
|
"""
|
||||||
|
Shared fixtures for the Outline Sync Web UI test suite.
|
||||||
|
|
||||||
|
All tests work against the FastAPI app defined in webui.py (Phase B+).
|
||||||
|
Phase A tests (WebDAV) are integration tests marked with @pytest.mark.integration
|
||||||
|
and require a running WebDAV container — they are skipped in unit-test runs.
|
||||||
|
|
||||||
|
Run unit tests only:
|
||||||
|
pytest tests/ -m "not integration"
|
||||||
|
|
||||||
|
Run integration tests (requires running WebDAV container):
|
||||||
|
pytest tests/ -m integration
|
||||||
|
|
||||||
|
Run everything:
|
||||||
|
pytest tests/
|
||||||
|
"""
|
||||||
|
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Generator
|
||||||
|
from unittest.mock import AsyncMock, MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import pytest_asyncio
|
||||||
|
from httpx import ASGITransport, AsyncClient
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# pytest configuration
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
pytest_plugins = ["pytest_asyncio"]
|
||||||
|
|
||||||
|
|
||||||
|
def pytest_configure(config):
|
||||||
|
config.addinivalue_line("markers", "integration: requires running infrastructure (WebDAV, Outline)")
|
||||||
|
config.addinivalue_line("markers", "slow: longer-running tests")
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Git helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def git(vault: Path, *args) -> str:
|
||||||
|
result = subprocess.run(
|
||||||
|
["git", "-C", str(vault), *args],
|
||||||
|
check=True, capture_output=True, text=True,
|
||||||
|
)
|
||||||
|
return result.stdout.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def git_config(vault: Path):
|
||||||
|
git(vault, "config", "user.email", "test@sync.local")
|
||||||
|
git(vault, "config", "user.name", "Test Runner")
|
||||||
|
|
||||||
|
|
||||||
|
def commit_all(vault: Path, message: str):
|
||||||
|
"""Stage everything in vault and commit."""
|
||||||
|
git(vault, "add", "-A")
|
||||||
|
try:
|
||||||
|
git(vault, "commit", "-m", message)
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass # nothing to commit — that's fine
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Auto-reset webui module state between tests
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@pytest.fixture(autouse=True)
|
||||||
|
def reset_webui_state():
|
||||||
|
"""Clear _jobs and _active_job on webui module after every test."""
|
||||||
|
yield
|
||||||
|
try:
|
||||||
|
import webui as _webui
|
||||||
|
_webui._jobs.clear()
|
||||||
|
_webui._active_job = None
|
||||||
|
except ImportError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Core fixtures
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def vault_dir(tmp_path) -> Path:
|
||||||
|
"""
|
||||||
|
Temporary directory simulating /outline-vault/.
|
||||||
|
Initialised as a git repo with 'outline' and 'main' branches.
|
||||||
|
"""
|
||||||
|
vault = tmp_path / "outline-vault"
|
||||||
|
vault.mkdir()
|
||||||
|
|
||||||
|
subprocess.run(["git", "init", str(vault)], check=True, capture_output=True)
|
||||||
|
git_config(vault)
|
||||||
|
git(vault, "checkout", "-b", "outline")
|
||||||
|
# Initial commit so branches can diverge
|
||||||
|
(vault / ".gitkeep").touch()
|
||||||
|
commit_all(vault, "initial")
|
||||||
|
git(vault, "checkout", "-b", "main")
|
||||||
|
|
||||||
|
return vault
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def populated_vault(vault_dir: Path) -> Path:
|
||||||
|
"""
|
||||||
|
Vault pre-loaded with sample markdown files on both branches,
|
||||||
|
simulating a state after a previous successful sync.
|
||||||
|
"""
|
||||||
|
# Create files on 'outline' branch — last known Outline state
|
||||||
|
git(vault_dir, "checkout", "outline")
|
||||||
|
(vault_dir / "Bewerbungen").mkdir()
|
||||||
|
(vault_dir / "Bewerbungen" / "CV.md").write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-cv-001
|
||||||
|
outline_collection_id: col-bew-001
|
||||||
|
outline_updated_at: 2026-01-10T12:00:00Z
|
||||||
|
---
|
||||||
|
# CV
|
||||||
|
Original content.
|
||||||
|
"""))
|
||||||
|
(vault_dir / "Infra").mkdir()
|
||||||
|
(vault_dir / "Infra" / "HomeLab.md").write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-hl-002
|
||||||
|
outline_collection_id: col-inf-001
|
||||||
|
outline_updated_at: 2026-01-10T12:00:00Z
|
||||||
|
---
|
||||||
|
# HomeLab
|
||||||
|
Server setup notes.
|
||||||
|
"""))
|
||||||
|
commit_all(vault_dir, "outline: sync 2026-01-10")
|
||||||
|
|
||||||
|
# Merge into 'main' so both branches are identical at start
|
||||||
|
git(vault_dir, "checkout", "main")
|
||||||
|
git(vault_dir, "merge", "outline", "--no-ff", "-m", "merge outline into main")
|
||||||
|
|
||||||
|
return vault_dir
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def settings_file(vault_dir: Path, tmp_path: Path) -> Path:
|
||||||
|
"""settings.json pointing at the temp vault and a test Outline URL."""
|
||||||
|
settings = {
|
||||||
|
"source": {
|
||||||
|
"url": "http://outline:3000",
|
||||||
|
"token": "test-api-token-abc123",
|
||||||
|
},
|
||||||
|
"sync": {
|
||||||
|
"vault_dir": str(vault_dir),
|
||||||
|
"allow_deletions": False,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
path = tmp_path / "settings.json"
|
||||||
|
path.write_text(json.dumps(settings, indent=2))
|
||||||
|
return path
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def sync_log(vault_dir: Path) -> Path:
|
||||||
|
"""Pre-populate _sync_log.md with a few history entries."""
|
||||||
|
log = vault_dir / "_sync_log.md"
|
||||||
|
log.write_text(textwrap.dedent("""\
|
||||||
|
# Sync Log
|
||||||
|
|
||||||
|
| Timestamp | Direction | Files | Status |
|
||||||
|
|-----------|-----------|-------|--------|
|
||||||
|
| 2026-03-04 08:00 | pull | 0 changes | ok |
|
||||||
|
| 2026-03-05 09:10 | push | 2 updated, 1 created | ok |
|
||||||
|
| 2026-03-06 14:32 | pull | 3 updated | ok |
|
||||||
|
"""))
|
||||||
|
return log
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# App fixtures (Phases B–G)
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def app(vault_dir: Path, settings_file: Path):
|
||||||
|
"""
|
||||||
|
FastAPI app instance with VAULT_DIR and SETTINGS_PATH overridden.
|
||||||
|
Skips the fixture (and all tests using it) if webui.py is not yet written.
|
||||||
|
"""
|
||||||
|
webui = pytest.importorskip("webui", reason="webui.py not yet implemented")
|
||||||
|
webui.VAULT_DIR = vault_dir
|
||||||
|
webui.SETTINGS_PATH = settings_file
|
||||||
|
return webui.app
|
||||||
|
|
||||||
|
|
||||||
|
@pytest_asyncio.fixture
|
||||||
|
async def client(app) -> Generator[AsyncClient, None, None]:
|
||||||
|
"""Async HTTP test client bound to the FastAPI app."""
|
||||||
|
async with AsyncClient(
|
||||||
|
transport=ASGITransport(app=app), base_url="http://testserver"
|
||||||
|
) as c:
|
||||||
|
yield c
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Vault state helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def vault_with_pending(populated_vault: Path) -> Path:
|
||||||
|
"""
|
||||||
|
Vault where main branch has local edits not yet pushed to Outline.
|
||||||
|
Simulates Obsidian having written files via WebDAV.
|
||||||
|
"""
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(cv.read_text() + "\n## New Section\nAdded in Obsidian.\n")
|
||||||
|
|
||||||
|
new_file = populated_vault / "Projekte" / "NewNote.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("# New Note\nWritten in Obsidian.\n")
|
||||||
|
|
||||||
|
commit_all(populated_vault, "obsidian: local edits")
|
||||||
|
return populated_vault
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def vault_with_conflict(populated_vault: Path) -> Path:
|
||||||
|
"""
|
||||||
|
Vault in a post-merge-conflict state: CV.md has conflict markers on main.
|
||||||
|
"""
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
|
||||||
|
# Edit on outline branch
|
||||||
|
git(populated_vault, "checkout", "outline")
|
||||||
|
cv.write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-cv-001
|
||||||
|
outline_collection_id: col-bew-001
|
||||||
|
outline_updated_at: 2026-03-06T11:03:00Z
|
||||||
|
---
|
||||||
|
# CV
|
||||||
|
Outline version with contact info updated.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "outline: CV contact update")
|
||||||
|
|
||||||
|
# Conflicting edit on main branch
|
||||||
|
git(populated_vault, "checkout", "main")
|
||||||
|
cv.write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-cv-001
|
||||||
|
outline_collection_id: col-bew-001
|
||||||
|
outline_updated_at: 2026-01-10T12:00:00Z
|
||||||
|
---
|
||||||
|
# CV
|
||||||
|
Local version with new section added.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "obsidian: CV new section")
|
||||||
|
|
||||||
|
# Attempt merge — this will produce a conflict
|
||||||
|
try:
|
||||||
|
git(populated_vault, "merge", "outline")
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass # expected: merge conflict
|
||||||
|
|
||||||
|
return populated_vault
|
||||||
37
tests/helpers.py
Normal file
37
tests/helpers.py
Normal file
@@ -0,0 +1,37 @@
|
|||||||
|
"""
|
||||||
|
Shared test helpers for the Outline Sync Web UI test suite.
|
||||||
|
Import directly: from tests.helpers import make_mock_process
|
||||||
|
"""
|
||||||
|
|
||||||
|
from unittest.mock import AsyncMock, MagicMock
|
||||||
|
|
||||||
|
|
||||||
|
class _AsyncLineIter:
|
||||||
|
"""Proper async iterator for mocking asyncio subprocess stdout."""
|
||||||
|
|
||||||
|
def __init__(self, lines: list[str]):
|
||||||
|
self._iter = iter(lines)
|
||||||
|
|
||||||
|
def __aiter__(self):
|
||||||
|
return self
|
||||||
|
|
||||||
|
async def __anext__(self) -> bytes:
|
||||||
|
try:
|
||||||
|
return (next(self._iter) + "\n").encode()
|
||||||
|
except StopIteration:
|
||||||
|
raise StopAsyncIteration
|
||||||
|
|
||||||
|
|
||||||
|
def make_mock_process(stdout_lines: list[str], returncode: int = 0) -> MagicMock:
|
||||||
|
"""
|
||||||
|
Build a mock asyncio subprocess whose stdout is a proper async iterable.
|
||||||
|
|
||||||
|
Usage in tests:
|
||||||
|
with patch("webui.spawn_sync_subprocess") as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done."])
|
||||||
|
"""
|
||||||
|
proc = MagicMock()
|
||||||
|
proc.returncode = returncode
|
||||||
|
proc.stdout = _AsyncLineIter(stdout_lines)
|
||||||
|
proc.wait = AsyncMock(return_value=returncode)
|
||||||
|
return proc
|
||||||
6
tests/requirements-test.txt
Normal file
6
tests/requirements-test.txt
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
pytest>=8.0
|
||||||
|
pytest-asyncio>=0.23
|
||||||
|
pytest-mock>=3.12
|
||||||
|
httpx>=0.27
|
||||||
|
anyio[trio]>=4.0
|
||||||
|
respx>=0.21
|
||||||
412
tests/test_e2e.py
Normal file
412
tests/test_e2e.py
Normal file
@@ -0,0 +1,412 @@
|
|||||||
|
"""
|
||||||
|
End-to-End Integration Tests — Full Workflow Scenarios
|
||||||
|
|
||||||
|
These tests simulate complete user workflows from start to finish.
|
||||||
|
They use real git operations against a temp vault but mock the Outline API
|
||||||
|
and WebDAV sync (since we do not have a live Outline server in CI).
|
||||||
|
|
||||||
|
For full E2E with live Outline, use @pytest.mark.integration and
|
||||||
|
set OUTLINE_URL / OUTLINE_TOKEN environment variables.
|
||||||
|
|
||||||
|
Scenarios covered:
|
||||||
|
E2E-1: Obsidian → Outline (new file with frontmatter writeback)
|
||||||
|
E2E-2: Outline → Obsidian (pull, file appears in vault)
|
||||||
|
E2E-3: Conflict detection and resolution in browser
|
||||||
|
E2E-4: New collection creation for unknown top-level folder
|
||||||
|
E2E-5: Concurrent safety (only one job at a time)
|
||||||
|
E2E-6: Full roundtrip (pull → edit → push → pull verifies no pending)
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import AsyncMock, MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
sys.path.insert(0, str(Path(__file__).parent))
|
||||||
|
from helpers import make_mock_process # noqa: E402
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def git(vault: Path, *args) -> str:
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(vault), *args],
|
||||||
|
check=True, capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def commit_all(vault: Path, message: str):
|
||||||
|
subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
|
||||||
|
try:
|
||||||
|
subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
async def wait_for_job_done(client, job_id: str, timeout: float = 5.0) -> list[dict]:
|
||||||
|
"""Consume the SSE stream and collect all events."""
|
||||||
|
events = []
|
||||||
|
async with client.stream("GET", f"/stream/{job_id}") as r:
|
||||||
|
async for line in r.aiter_lines():
|
||||||
|
if line.startswith("data:"):
|
||||||
|
try:
|
||||||
|
e = json.loads(line[5:].strip())
|
||||||
|
events.append(e)
|
||||||
|
if e.get("type") == "done":
|
||||||
|
break
|
||||||
|
except json.JSONDecodeError:
|
||||||
|
pass
|
||||||
|
return events
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-1: Obsidian → Outline (new file, frontmatter written back)
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestObsidianToOutlineFlow:
|
||||||
|
"""
|
||||||
|
Full flow: user creates file in Obsidian → WebDAV syncs to VPS →
|
||||||
|
user clicks Push → sync engine creates document in Outline →
|
||||||
|
frontmatter written back → file has outline_id.
|
||||||
|
"""
|
||||||
|
|
||||||
|
async def test_new_file_reaches_outline_and_gets_id(
|
||||||
|
self, client, populated_vault
|
||||||
|
):
|
||||||
|
# Step 1: User creates a new note in Obsidian (WebDAV already synced it)
|
||||||
|
new_file = populated_vault / "Projekte" / "E2E_NewDoc.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("# E2E New Doc\nContent written in Obsidian.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new doc via webdav")
|
||||||
|
|
||||||
|
# Step 2: Dashboard shows pending changes
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["pending_count"] >= 1, "Dashboard must show pending changes"
|
||||||
|
|
||||||
|
# Step 3: Changes page lists the new file
|
||||||
|
changes = (await client.get("/changes")).json()
|
||||||
|
new_item = next((i for i in changes if "E2E_NewDoc.md" in i["path"]), None)
|
||||||
|
assert new_item is not None, "New file must appear in /changes"
|
||||||
|
assert new_item["status"] == "added"
|
||||||
|
|
||||||
|
# Step 4: User pushes — sync engine creates document and writes back ID
|
||||||
|
fake_doc_id = "doc-e2e-new-001"
|
||||||
|
|
||||||
|
def fake_push_subprocess(*args, **kwargs):
|
||||||
|
new_file.write_text(textwrap.dedent(f"""\
|
||||||
|
---
|
||||||
|
outline_id: {fake_doc_id}
|
||||||
|
outline_collection_id: col-proj-001
|
||||||
|
outline_updated_at: 2026-03-07T10:00:00Z
|
||||||
|
---
|
||||||
|
# E2E New Doc
|
||||||
|
Content written in Obsidian.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "sync: frontmatter writeback")
|
||||||
|
return make_mock_process([
|
||||||
|
f"ok: Projekte/E2E_NewDoc.md created (id: {fake_doc_id})",
|
||||||
|
"Done. 1 created.",
|
||||||
|
])
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", side_effect=fake_push_subprocess):
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
done = next((e for e in events if e.get("type") == "done"), None)
|
||||||
|
assert done is not None
|
||||||
|
|
||||||
|
# Step 5: File now has outline_id (will be served via WebDAV to Obsidian)
|
||||||
|
content = new_file.read_text()
|
||||||
|
assert "outline_id" in content, "File must have outline_id after push"
|
||||||
|
assert fake_doc_id in content
|
||||||
|
|
||||||
|
# Step 6: No more pending changes
|
||||||
|
status2 = (await client.get("/status")).json()
|
||||||
|
# After frontmatter writeback commit, outline branch advances on push
|
||||||
|
# pending_count depends on implementation — just verify push succeeded
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-2: Outline → Obsidian (pull, file appears in vault)
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestOutlineToObsidianFlow:
|
||||||
|
"""
|
||||||
|
Full flow: document updated in Outline → user clicks Pull →
|
||||||
|
SSE streams progress → file updated in vault → WebDAV serves it →
|
||||||
|
Obsidian picks it up.
|
||||||
|
"""
|
||||||
|
|
||||||
|
async def test_pull_updates_existing_document(self, client, populated_vault):
|
||||||
|
# Initial state: CV.md exists in both branches (clean)
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["pending_count"] == 0
|
||||||
|
|
||||||
|
# Simulate pull: sync engine updates CV.md on outline branch
|
||||||
|
def fake_pull_subprocess(*args, **kwargs):
|
||||||
|
git(populated_vault, "checkout", "outline")
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-cv-001
|
||||||
|
outline_collection_id: col-bew-001
|
||||||
|
outline_updated_at: 2026-03-07T09:00:00Z
|
||||||
|
---
|
||||||
|
# CV
|
||||||
|
Updated in Outline with new contact info.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "outline: CV updated")
|
||||||
|
git(populated_vault, "checkout", "main")
|
||||||
|
git(populated_vault, "merge", "outline", "--no-ff", "-m", "merge outline")
|
||||||
|
return make_mock_process([
|
||||||
|
"ok: Bewerbungen/CV.md updated",
|
||||||
|
"Done. 1 updated.",
|
||||||
|
])
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", side_effect=fake_pull_subprocess):
|
||||||
|
r = await client.post("/pull")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
|
||||||
|
done = next((e for e in events if e.get("type") == "done"), None)
|
||||||
|
assert done is not None, "Pull must emit done event"
|
||||||
|
|
||||||
|
# CV.md should now have new content
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
assert "contact info" in cv.read_text(), (
|
||||||
|
"CV.md must be updated in vault after pull"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_pull_adds_new_document_to_vault(self, client, populated_vault):
|
||||||
|
"""New document created in Outline must appear as a file after pull."""
|
||||||
|
|
||||||
|
def fake_pull_subprocess(*args, **kwargs):
|
||||||
|
git(populated_vault, "checkout", "outline")
|
||||||
|
new_file = populated_vault / "Infra" / "NewServerDoc.md"
|
||||||
|
new_file.write_text(textwrap.dedent("""\
|
||||||
|
---
|
||||||
|
outline_id: doc-new-srv-001
|
||||||
|
outline_collection_id: col-inf-001
|
||||||
|
outline_updated_at: 2026-03-07T10:00:00Z
|
||||||
|
---
|
||||||
|
# New Server Doc
|
||||||
|
Created in Outline.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "outline: new server doc")
|
||||||
|
git(populated_vault, "checkout", "main")
|
||||||
|
git(populated_vault, "merge", "outline", "--no-ff", "-m", "merge")
|
||||||
|
return make_mock_process([
|
||||||
|
"ok: Infra/NewServerDoc.md created",
|
||||||
|
"Done. 1 created.",
|
||||||
|
])
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", side_effect=fake_pull_subprocess):
|
||||||
|
r = await client.post("/pull")
|
||||||
|
await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
|
||||||
|
new_file = populated_vault / "Infra" / "NewServerDoc.md"
|
||||||
|
assert new_file.exists(), "New document from Outline must appear in vault"
|
||||||
|
assert "outline_id: doc-new-srv-001" in new_file.read_text()
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-3: Conflict detection and resolution in browser
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestConflictResolutionFlow:
|
||||||
|
"""
|
||||||
|
Full flow: same file edited in both Obsidian and Outline →
|
||||||
|
pull detects conflict → conflicts page shows → user resolves →
|
||||||
|
push becomes available.
|
||||||
|
"""
|
||||||
|
|
||||||
|
async def test_full_conflict_resolution_flow(self, client, vault_with_conflict):
|
||||||
|
# Step 1: Verify conflicts are detected
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["conflicts"] > 0, "Conflicts must be detected after merge conflict"
|
||||||
|
|
||||||
|
# Step 2: Push is blocked
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code == 409, "Push must be blocked while conflicts exist"
|
||||||
|
|
||||||
|
# Step 3: Conflicts page lists the file
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
conflicts = r.json()
|
||||||
|
assert len(conflicts) > 0
|
||||||
|
conflict_path = conflicts[0]["path"]
|
||||||
|
|
||||||
|
# Step 4: User views the diff
|
||||||
|
import base64
|
||||||
|
encoded = base64.urlsafe_b64encode(conflict_path.encode()).decode()
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
assert r.status_code == 200
|
||||||
|
assert "text/html" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
# Step 5: User chooses "Keep mine" (local)
|
||||||
|
r = await client.post("/resolve", json={
|
||||||
|
"file": conflict_path,
|
||||||
|
"accept": "local",
|
||||||
|
})
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
# Step 6: No more conflicts
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
assert r.json() == [], "All conflicts must be resolved"
|
||||||
|
|
||||||
|
# Step 7: No conflict markers in file
|
||||||
|
content = (vault_with_conflict / conflict_path).read_text()
|
||||||
|
assert "<<<<<<<" not in content
|
||||||
|
|
||||||
|
# Step 8: Push is now allowed
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202), "Push must succeed after conflicts resolved"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-4: New collection creation
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestNewCollectionFlow:
|
||||||
|
"""
|
||||||
|
User creates a new top-level folder in Obsidian.
|
||||||
|
Push must create the collection AND the documents inside it.
|
||||||
|
"""
|
||||||
|
|
||||||
|
async def test_new_collection_created_on_push(self, client, populated_vault):
|
||||||
|
# Create new folder + file (simulating Obsidian + WebDAV sync)
|
||||||
|
new_dir = populated_vault / "NewProject"
|
||||||
|
new_dir.mkdir()
|
||||||
|
(new_dir / "Overview.md").write_text("# Overview\nNew project.\n")
|
||||||
|
(new_dir / "Notes.md").write_text("# Notes\nMeeting notes.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new collection NewProject")
|
||||||
|
|
||||||
|
# Changes must flag both files as added
|
||||||
|
changes = (await client.get("/changes")).json()
|
||||||
|
new_items = [i for i in changes if "NewProject" in i["path"]]
|
||||||
|
assert len(new_items) >= 2, "Both new files must appear in changes"
|
||||||
|
|
||||||
|
# Push — mock shows collection + documents created
|
||||||
|
fake_col_id = "col-newproject-001"
|
||||||
|
fake_push_lines = [
|
||||||
|
f"ok: collection 'NewProject' created (id: {fake_col_id})",
|
||||||
|
"ok: NewProject/Overview.md created (id: doc-overview-001)",
|
||||||
|
"ok: NewProject/Notes.md created (id: doc-notes-001)",
|
||||||
|
"Done. 2 created, 1 collection created.",
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(fake_push_lines)
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
all_text = json.dumps(events)
|
||||||
|
assert "NewProject" in all_text or "collection" in all_text, (
|
||||||
|
"New collection creation must be reflected in SSE output"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-5: Concurrency safety
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestConcurrencySafety:
|
||||||
|
|
||||||
|
async def test_only_one_sync_job_at_a_time(self, client, populated_vault):
|
||||||
|
"""
|
||||||
|
Starting a second sync while first is pending/running returns 409.
|
||||||
|
_active_job is set immediately by POST /pull, before the job starts.
|
||||||
|
"""
|
||||||
|
r1 = await client.post("/pull")
|
||||||
|
assert r1.status_code in (200, 202)
|
||||||
|
|
||||||
|
r2 = await client.post("/pull")
|
||||||
|
assert r2.status_code == 409, "Concurrent pull must be rejected"
|
||||||
|
|
||||||
|
r3 = await client.post("/push")
|
||||||
|
assert r3.status_code == 409, "Concurrent push must be rejected"
|
||||||
|
|
||||||
|
async def test_new_job_accepted_after_previous_completes(
|
||||||
|
self, client, populated_vault
|
||||||
|
):
|
||||||
|
"""After a job finishes, a new job must be accepted."""
|
||||||
|
# Keep patch active while draining so the task can call spawn_sync_subprocess
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||||
|
r1 = await client.post("/pull")
|
||||||
|
assert r1.status_code in (200, 202)
|
||||||
|
await wait_for_job_done(client, r1.json()["job_id"]) # drain → job completes
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||||
|
r2 = await client.post("/pull")
|
||||||
|
assert r2.status_code in (200, 202), "New job must be accepted after first completes"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# E2E-6: Full roundtrip — pull → edit → push → no pending
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestFullRoundtrip:
|
||||||
|
|
||||||
|
async def test_pull_edit_push_leaves_clean_state(self, client, populated_vault):
|
||||||
|
"""
|
||||||
|
Complete happy-path cycle:
|
||||||
|
1. Pull (no changes)
|
||||||
|
2. Edit a file (simulate Obsidian)
|
||||||
|
3. Push (sync engine updates Outline, writes back updated_at)
|
||||||
|
4. Dashboard shows clean state
|
||||||
|
"""
|
||||||
|
# Step 1: Pull — nothing new
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||||
|
r = await client.post("/pull")
|
||||||
|
await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
|
||||||
|
# Step 2: User edits CV.md in Obsidian (simulated via direct write)
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
original = cv.read_text()
|
||||||
|
cv.write_text(original.rstrip() + "\n\n## Skills\n- Python\n- Docker\n")
|
||||||
|
commit_all(populated_vault, "obsidian: add skills section")
|
||||||
|
|
||||||
|
pending = (await client.get("/status")).json()["pending_count"]
|
||||||
|
assert pending >= 1, "Edits must be pending after local change"
|
||||||
|
|
||||||
|
# Step 3: Push — sync engine updates Outline and refreshes updated_at
|
||||||
|
def fake_push(*args, **kwargs):
|
||||||
|
cv.write_text(cv.read_text().replace(
|
||||||
|
"outline_updated_at: 2026-01-10T12:00:00Z",
|
||||||
|
"outline_updated_at: 2026-03-07T11:30:00Z",
|
||||||
|
))
|
||||||
|
commit_all(populated_vault, "sync: advance outline branch after push")
|
||||||
|
# Advance outline branch to match
|
||||||
|
git(populated_vault, "checkout", "outline")
|
||||||
|
cv_outline = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv_outline.write_text(cv.read_text())
|
||||||
|
commit_all(populated_vault, "outline: updated from push")
|
||||||
|
git(populated_vault, "checkout", "main")
|
||||||
|
return make_mock_process(["ok: Bewerbungen/CV.md updated", "Done. 1 updated."])
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", side_effect=fake_push):
|
||||||
|
r = await client.post("/push")
|
||||||
|
events = await wait_for_job_done(client, r.json()["job_id"])
|
||||||
|
|
||||||
|
done = next((e for e in events if e.get("type") == "done"), None)
|
||||||
|
assert done is not None, "Push must emit done event"
|
||||||
|
|
||||||
|
# Step 4: Verify no pending changes remain (outline branch == main)
|
||||||
|
r_changes = await client.get("/changes")
|
||||||
|
# Changes should be 0 or only include the updated_at change
|
||||||
|
# which is an internal sync marker, not a user-content change
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["conflicts"] == 0, "No conflicts must remain after clean push"
|
||||||
221
tests/test_phase_a_webdav.py
Normal file
221
tests/test_phase_a_webdav.py
Normal file
@@ -0,0 +1,221 @@
|
|||||||
|
"""
|
||||||
|
Phase A — WebDAV Container Tests
|
||||||
|
|
||||||
|
Integration tests against a running obsidian-webdav container.
|
||||||
|
Skip these in CI unless the WebDAV service is available.
|
||||||
|
|
||||||
|
Run with:
|
||||||
|
pytest tests/test_phase_a_webdav.py -m integration -v
|
||||||
|
|
||||||
|
Environment variables required:
|
||||||
|
WEBDAV_URL e.g. http://100.x.x.x (Tailscale IP)
|
||||||
|
WEBDAV_USER basic-auth username (default: obsidian)
|
||||||
|
WEBDAV_PASS basic-auth password
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import requests
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Fixtures
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.integration
|
||||||
|
|
||||||
|
|
||||||
|
def webdav_base() -> str:
|
||||||
|
url = os.environ.get("WEBDAV_URL", "").rstrip("/")
|
||||||
|
if not url:
|
||||||
|
pytest.skip("WEBDAV_URL not set — skipping WebDAV integration tests")
|
||||||
|
return url
|
||||||
|
|
||||||
|
|
||||||
|
def webdav_auth():
|
||||||
|
return (
|
||||||
|
os.environ.get("WEBDAV_USER", "obsidian"),
|
||||||
|
os.environ.get("WEBDAV_PASS", ""),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def dav():
|
||||||
|
"""Requests session pre-configured for the WebDAV endpoint."""
|
||||||
|
s = requests.Session()
|
||||||
|
s.auth = webdav_auth()
|
||||||
|
return s
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def test_filename():
|
||||||
|
"""Unique filename so parallel test runs do not collide."""
|
||||||
|
return f"test_{uuid.uuid4().hex[:8]}.md"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-A2 — Authentication
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestWebDAVAuth:
|
||||||
|
|
||||||
|
def test_unauthenticated_request_returns_401(self):
|
||||||
|
"""GET without credentials must be rejected."""
|
||||||
|
r = requests.get(webdav_base())
|
||||||
|
assert r.status_code == 401, f"Expected 401, got {r.status_code}"
|
||||||
|
|
||||||
|
def test_wrong_password_returns_401(self):
|
||||||
|
r = requests.get(webdav_base(), auth=("obsidian", "wrong-password"))
|
||||||
|
assert r.status_code == 401
|
||||||
|
|
||||||
|
def test_valid_credentials_succeed(self, dav):
|
||||||
|
r = dav.request("PROPFIND", webdav_base(), headers={"Depth": "0"})
|
||||||
|
assert r.status_code in (200, 207), f"Expected 200/207, got {r.status_code}"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-A4 — WebDAV method support (PROPFIND, PUT, GET, DELETE)
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestWebDAVMethods:
|
||||||
|
|
||||||
|
def test_propfind_root_lists_contents(self, dav):
|
||||||
|
"""PROPFIND depth=1 must enumerate the root of the vault."""
|
||||||
|
r = dav.request(
|
||||||
|
"PROPFIND",
|
||||||
|
webdav_base(),
|
||||||
|
headers={"Depth": "1", "Content-Type": "application/xml"},
|
||||||
|
)
|
||||||
|
assert r.status_code in (200, 207)
|
||||||
|
assert len(r.content) > 0
|
||||||
|
|
||||||
|
def test_put_creates_file(self, dav, test_filename):
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
content = b"# Test Note\nCreated by automated test.\n"
|
||||||
|
r = dav.put(url, data=content)
|
||||||
|
assert r.status_code in (200, 201, 204), f"PUT failed: {r.status_code}"
|
||||||
|
|
||||||
|
# Verify it exists
|
||||||
|
r2 = dav.get(url)
|
||||||
|
assert r2.status_code == 200
|
||||||
|
assert b"# Test Note" in r2.content
|
||||||
|
|
||||||
|
# Cleanup
|
||||||
|
dav.delete(url)
|
||||||
|
|
||||||
|
def test_put_updates_existing_file(self, dav, test_filename):
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
dav.put(url, data=b"v1 content")
|
||||||
|
|
||||||
|
dav.put(url, data=b"v2 content updated")
|
||||||
|
r = dav.get(url)
|
||||||
|
assert b"v2 content" in r.content
|
||||||
|
|
||||||
|
dav.delete(url)
|
||||||
|
|
||||||
|
def test_delete_removes_file(self, dav, test_filename):
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
dav.put(url, data=b"temporary file")
|
||||||
|
|
||||||
|
r = dav.delete(url)
|
||||||
|
assert r.status_code in (200, 204)
|
||||||
|
|
||||||
|
r2 = dav.get(url)
|
||||||
|
assert r2.status_code == 404
|
||||||
|
|
||||||
|
def test_get_nonexistent_returns_404(self, dav):
|
||||||
|
r = dav.get(f"{webdav_base()}/does_not_exist_{uuid.uuid4().hex}.md")
|
||||||
|
assert r.status_code == 404
|
||||||
|
|
||||||
|
def test_mkcol_creates_subdirectory(self, dav):
|
||||||
|
dirname = f"test_dir_{uuid.uuid4().hex[:8]}"
|
||||||
|
r = dav.request("MKCOL", f"{webdav_base()}/{dirname}")
|
||||||
|
assert r.status_code in (200, 201, 207)
|
||||||
|
|
||||||
|
r2 = dav.request("PROPFIND", f"{webdav_base()}/{dirname}", headers={"Depth": "0"})
|
||||||
|
assert r2.status_code in (200, 207)
|
||||||
|
|
||||||
|
dav.delete(f"{webdav_base()}/{dirname}")
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-A3 — .git/ directory protection
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestWebDAVGitProtection:
|
||||||
|
|
||||||
|
def test_git_directory_is_inaccessible(self, dav):
|
||||||
|
"""The .git/ directory must not be served — nginx should deny it."""
|
||||||
|
r = dav.request("PROPFIND", f"{webdav_base()}/.git/", headers={"Depth": "0"})
|
||||||
|
assert r.status_code in (403, 404), (
|
||||||
|
f"Expected 403/404 for /.git/ but got {r.status_code}. "
|
||||||
|
"The WebDAV config must deny access to .git/"
|
||||||
|
)
|
||||||
|
|
||||||
|
def test_git_config_file_is_inaccessible(self, dav):
|
||||||
|
r = dav.get(f"{webdav_base()}/.git/config")
|
||||||
|
assert r.status_code in (403, 404)
|
||||||
|
|
||||||
|
def test_git_head_file_is_inaccessible(self, dav):
|
||||||
|
r = dav.get(f"{webdav_base()}/.git/HEAD")
|
||||||
|
assert r.status_code in (403, 404)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-A1 — Bidirectional sync simulation
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestWebDAVFileOps:
|
||||||
|
|
||||||
|
def test_create_read_delete_roundtrip(self, dav, test_filename):
|
||||||
|
"""Full create → read → delete cycle for a markdown file."""
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
body = "---\noutline_id: test-001\n---\n# Test\nContent.\n"
|
||||||
|
|
||||||
|
# Create
|
||||||
|
assert dav.put(url, data=body.encode()).status_code in (200, 201, 204)
|
||||||
|
|
||||||
|
# Read back — content must match
|
||||||
|
r = dav.get(url)
|
||||||
|
assert r.status_code == 200
|
||||||
|
assert "outline_id: test-001" in r.text
|
||||||
|
|
||||||
|
# Delete
|
||||||
|
assert dav.delete(url).status_code in (200, 204)
|
||||||
|
|
||||||
|
def test_unicode_content_preserved(self, dav, test_filename):
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
body = "# Ünïcödé Héadïng\n\nGerman: Straße, Chinese: 你好\n".encode("utf-8")
|
||||||
|
|
||||||
|
dav.put(url, data=body)
|
||||||
|
r = dav.get(url)
|
||||||
|
assert "Straße" in r.text
|
||||||
|
assert "你好" in r.text
|
||||||
|
|
||||||
|
dav.delete(url)
|
||||||
|
|
||||||
|
def test_large_file_survives_roundtrip(self, dav, test_filename):
|
||||||
|
url = f"{webdav_base()}/{test_filename}"
|
||||||
|
# 500 KB markdown file
|
||||||
|
body = ("# Big Note\n" + "x" * 500_000).encode()
|
||||||
|
|
||||||
|
dav.put(url, data=body)
|
||||||
|
r = dav.get(url)
|
||||||
|
assert len(r.content) >= 500_000
|
||||||
|
|
||||||
|
dav.delete(url)
|
||||||
|
|
||||||
|
def test_obsidian_settings_directory_can_be_excluded(self, dav):
|
||||||
|
"""
|
||||||
|
Verify we can PUT to a path we'd want to ignore (.obsidian/)
|
||||||
|
and then verify the nginx/webdav config (if configured) can block it.
|
||||||
|
This test documents expected behaviour; if .obsidian/ IS accessible,
|
||||||
|
it must be controlled at the Obsidian plugin level (ignore list).
|
||||||
|
"""
|
||||||
|
# This is an informational check, not a hard assertion —
|
||||||
|
# .obsidian/ exclusion is handled by the remotely-save plugin config.
|
||||||
|
r = dav.request("PROPFIND", f"{webdav_base()}/.obsidian/", headers={"Depth": "0"})
|
||||||
|
# 404 = not present (preferred), 403 = blocked, 207 = accessible
|
||||||
|
# All are valid — important thing is it is documented
|
||||||
|
assert r.status_code in (403, 404, 207)
|
||||||
178
tests/test_phase_b_dashboard.py
Normal file
178
tests/test_phase_b_dashboard.py
Normal file
@@ -0,0 +1,178 @@
|
|||||||
|
"""
|
||||||
|
Phase B — Read-Only Dashboard Tests
|
||||||
|
|
||||||
|
Tests for GET / (dashboard HTML) and GET /status (JSON API).
|
||||||
|
All tests use the FastAPI test client and mock git subprocess calls.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import textwrap
|
||||||
|
from unittest.mock import MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import pytest_asyncio
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def make_git_output(stdout: str = "", returncode: int = 0) -> MagicMock:
|
||||||
|
m = MagicMock()
|
||||||
|
m.stdout = stdout
|
||||||
|
m.returncode = returncode
|
||||||
|
return m
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-B1 — Dashboard page renders
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestDashboardPage:
|
||||||
|
|
||||||
|
async def test_get_root_returns_200(self, client):
|
||||||
|
r = await client.get("/")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_dashboard_returns_html(self, client):
|
||||||
|
r = await client.get("/")
|
||||||
|
assert "text/html" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_dashboard_contains_status_badge(self, client):
|
||||||
|
r = await client.get("/")
|
||||||
|
body = r.text.lower()
|
||||||
|
# At minimum one of these status words must appear
|
||||||
|
assert any(word in body for word in ("clean", "dirty", "conflict", "pending")), (
|
||||||
|
"Dashboard must show a vault status badge"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_dashboard_contains_pull_button(self, client):
|
||||||
|
r = await client.get("/")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "pull" in body or "get from outline" in body
|
||||||
|
|
||||||
|
async def test_dashboard_contains_push_button(self, client):
|
||||||
|
r = await client.get("/")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "push" in body or "send to outline" in body
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-B2 — Status endpoint returns structured JSON
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestStatusEndpoint:
|
||||||
|
|
||||||
|
async def test_status_returns_200(self, client):
|
||||||
|
r = await client.get("/status")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_status_returns_json(self, client):
|
||||||
|
r = await client.get("/status")
|
||||||
|
assert "application/json" in r.headers.get("content-type", "")
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data, dict)
|
||||||
|
|
||||||
|
async def test_status_has_required_fields(self, client):
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert "vault_status" in data, "Missing 'vault_status' field"
|
||||||
|
assert "pending_count" in data, "Missing 'pending_count' field"
|
||||||
|
assert "conflicts" in data, "Missing 'conflicts' field"
|
||||||
|
assert "last_pull" in data, "Missing 'last_pull' field"
|
||||||
|
assert "last_push" in data, "Missing 'last_push' field"
|
||||||
|
|
||||||
|
async def test_status_pending_count_is_integer(self, client):
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data["pending_count"], int)
|
||||||
|
assert data["pending_count"] >= 0
|
||||||
|
|
||||||
|
async def test_status_conflicts_is_integer(self, client):
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data["conflicts"], int)
|
||||||
|
assert data["conflicts"] >= 0
|
||||||
|
|
||||||
|
async def test_status_clean_vault(self, client, populated_vault):
|
||||||
|
"""
|
||||||
|
After a clean merge, pending_count should be 0 and conflicts 0.
|
||||||
|
"""
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert data["pending_count"] == 0
|
||||||
|
assert data["conflicts"] == 0
|
||||||
|
assert data["vault_status"] in ("clean", "ok", "synced")
|
||||||
|
|
||||||
|
async def test_status_with_pending_changes(self, client, vault_with_pending):
|
||||||
|
"""
|
||||||
|
vault_with_pending has local edits on main — pending_count must be > 0.
|
||||||
|
"""
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert data["pending_count"] > 0
|
||||||
|
|
||||||
|
async def test_status_vault_status_dirty_when_pending(self, client, vault_with_pending):
|
||||||
|
r = await client.get("/status")
|
||||||
|
data = r.json()
|
||||||
|
assert data["vault_status"] != "clean"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-B3 — Conflict warning badge
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestConflictBadge:
|
||||||
|
|
||||||
|
async def test_no_conflict_badge_when_clean(self, client, populated_vault):
|
||||||
|
r = await client.get("/")
|
||||||
|
# Should NOT contain a prominent conflict warning
|
||||||
|
# (checking for the absence of conflict count > 0 in dashboard)
|
||||||
|
data = (await client.get("/status")).json()
|
||||||
|
assert data["conflicts"] == 0
|
||||||
|
|
||||||
|
async def test_conflict_badge_visible_when_conflicts_exist(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["conflicts"] > 0, "Expected conflict count > 0"
|
||||||
|
|
||||||
|
r = await client.get("/")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "conflict" in body, "Dashboard must show conflict warning when conflicts exist"
|
||||||
|
|
||||||
|
async def test_conflict_badge_links_to_conflicts_page(self, client, vault_with_conflict):
|
||||||
|
r = await client.get("/")
|
||||||
|
assert "/conflicts" in r.text, "Conflict badge must link to /conflicts"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-B4 — Pending count reflects git diff
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPendingCount:
|
||||||
|
|
||||||
|
async def test_pending_count_zero_on_clean_vault(self, client, populated_vault):
|
||||||
|
r = await client.get("/status")
|
||||||
|
assert r.json()["pending_count"] == 0
|
||||||
|
|
||||||
|
async def test_pending_count_increases_with_local_edits(
|
||||||
|
self, client, vault_with_pending
|
||||||
|
):
|
||||||
|
r = await client.get("/status")
|
||||||
|
# We added 1 modified file + 1 new file in vault_with_pending fixture
|
||||||
|
assert r.json()["pending_count"] >= 2
|
||||||
|
|
||||||
|
async def test_pending_count_shown_in_push_button(self, client, vault_with_pending):
|
||||||
|
"""Push button label should reflect pending count."""
|
||||||
|
r = await client.get("/")
|
||||||
|
body = r.text
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
pending = status["pending_count"]
|
||||||
|
# The pending count must appear somewhere near the push button
|
||||||
|
assert str(pending) in body, (
|
||||||
|
f"Push button should show pending count ({pending}) in label"
|
||||||
|
)
|
||||||
266
tests/test_phase_c_pull.py
Normal file
266
tests/test_phase_c_pull.py
Normal file
@@ -0,0 +1,266 @@
|
|||||||
|
"""
|
||||||
|
Phase C — Pull with Live Output Tests
|
||||||
|
|
||||||
|
Tests for POST /pull (job start) and GET /stream/{job_id} (SSE streaming).
|
||||||
|
The sync subprocess is mocked so tests do not require a live Outline instance.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import json
|
||||||
|
import sys
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import AsyncMock, MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
sys.path.insert(0, str(Path(__file__).parent))
|
||||||
|
from helpers import make_mock_process # noqa: E402
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
async def consume_sse(client, job_id: str, max_events: int = 50) -> list[dict]:
|
||||||
|
"""Stream SSE events until 'done' or max_events reached."""
|
||||||
|
events = []
|
||||||
|
async with client.stream("GET", f"/stream/{job_id}") as r:
|
||||||
|
assert r.status_code == 200
|
||||||
|
assert "text/event-stream" in r.headers.get("content-type", "")
|
||||||
|
async for line in r.aiter_lines():
|
||||||
|
if line.startswith("data:"):
|
||||||
|
try:
|
||||||
|
events.append(json.loads(line[5:].strip()))
|
||||||
|
except json.JSONDecodeError:
|
||||||
|
events.append({"raw": line[5:].strip()})
|
||||||
|
if events and events[-1].get("type") == "done":
|
||||||
|
break
|
||||||
|
if len(events) >= max_events:
|
||||||
|
break
|
||||||
|
return events
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-C1 — POST /pull starts a job and returns a job_id
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPullJobCreation:
|
||||||
|
|
||||||
|
async def test_post_pull_returns_202(self, client):
|
||||||
|
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||||
|
r = await client.post("/pull")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
|
||||||
|
async def test_post_pull_returns_job_id(self, client):
|
||||||
|
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||||
|
r = await client.post("/pull")
|
||||||
|
data = r.json()
|
||||||
|
assert "job_id" in data, "Response must include a job_id"
|
||||||
|
assert isinstance(data["job_id"], str)
|
||||||
|
assert len(data["job_id"]) > 0
|
||||||
|
|
||||||
|
async def test_post_pull_returns_stream_url(self, client):
|
||||||
|
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||||
|
r = await client.post("/pull")
|
||||||
|
data = r.json()
|
||||||
|
# Either stream_url or job_id is sufficient to construct the SSE URL
|
||||||
|
assert "job_id" in data or "stream_url" in data
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-C1 — SSE stream emits progress events
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPullStreaming:
|
||||||
|
|
||||||
|
async def test_stream_content_type_is_sse(self, client):
|
||||||
|
"""GET /stream/{job_id} must return text/event-stream."""
|
||||||
|
with patch("webui.run_sync_job", new_callable=AsyncMock) as _mock_job:
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
|
||||||
|
async with client.stream("GET", f"/stream/{job_id}") as stream:
|
||||||
|
assert "text/event-stream" in stream.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_stream_emits_data_events(self, client):
|
||||||
|
"""Stream must yield at least one data event."""
|
||||||
|
pull_lines = [
|
||||||
|
"Fetching collections...",
|
||||||
|
"Processing Bewerbungen/CV.md",
|
||||||
|
"Processing Infra/HomeLab.md",
|
||||||
|
"Done. 2 updated, 0 created.",
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(pull_lines)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
assert len(events) >= 1, "Stream must emit at least one event"
|
||||||
|
|
||||||
|
async def test_stream_ends_with_done_event(self, client):
|
||||||
|
"""Last event in the stream must be type=done."""
|
||||||
|
pull_lines = [
|
||||||
|
"Fetching collections...",
|
||||||
|
"Done. 1 updated.",
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(pull_lines)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
done_events = [e for e in events if e.get("type") == "done"]
|
||||||
|
assert len(done_events) >= 1, "Stream must end with a 'done' event"
|
||||||
|
|
||||||
|
async def test_stream_done_event_contains_summary(self, client):
|
||||||
|
"""The done event must include summary statistics."""
|
||||||
|
pull_lines = [
|
||||||
|
"Done. 2 updated, 1 created, 0 errors.",
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(pull_lines)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
done = next((e for e in events if e.get("type") == "done"), None)
|
||||||
|
assert done is not None
|
||||||
|
# Summary can be in 'message', 'summary', or top-level 'data' text
|
||||||
|
summary_text = json.dumps(done)
|
||||||
|
assert any(word in summary_text for word in ("updated", "created", "done", "0")), (
|
||||||
|
"Done event must contain a summary"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_stream_includes_per_file_events(self, client):
|
||||||
|
"""Each processed file should generate its own progress event."""
|
||||||
|
pull_lines = [
|
||||||
|
"processing: Bewerbungen/CV.md",
|
||||||
|
"ok: Bewerbungen/CV.md updated",
|
||||||
|
"processing: Infra/HomeLab.md",
|
||||||
|
"ok: Infra/HomeLab.md updated",
|
||||||
|
"Done. 2 updated.",
|
||||||
|
]
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(pull_lines)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
all_text = json.dumps(events)
|
||||||
|
assert "CV.md" in all_text or "Bewerbungen" in all_text, (
|
||||||
|
"Stream events should reference processed files"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_stream_for_unknown_job_returns_404(self, client):
|
||||||
|
r = await client.get("/stream/nonexistent-job-id-xyz")
|
||||||
|
assert r.status_code == 404
|
||||||
|
|
||||||
|
async def test_failed_sync_emits_error_event(self, client):
|
||||||
|
"""If the sync process exits with non-zero, stream must emit an error event."""
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(
|
||||||
|
["Error: API connection failed"], returncode=1
|
||||||
|
)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
error_events = [e for e in events if e.get("type") in ("error", "done")]
|
||||||
|
assert any(
|
||||||
|
e.get("success") is False or e.get("type") == "error"
|
||||||
|
for e in error_events
|
||||||
|
), "Failed sync must emit an error event"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-C2 — Pull content actually updates vault files
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPullContent:
|
||||||
|
|
||||||
|
async def test_pull_advances_outline_branch(self, client, populated_vault):
|
||||||
|
"""
|
||||||
|
After a pull that introduces a new document, the outline branch
|
||||||
|
must have a new commit compared to before.
|
||||||
|
"""
|
||||||
|
import subprocess as sp
|
||||||
|
before = sp.run(
|
||||||
|
["git", "-C", str(populated_vault), "rev-parse", "outline"],
|
||||||
|
capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
# Simulate pull writing a new file to outline branch
|
||||||
|
new_file = populated_vault / "Projekte" / "FreshDoc.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("---\noutline_id: doc-new-001\n---\n# Fresh Doc\n")
|
||||||
|
|
||||||
|
import subprocess as sp2
|
||||||
|
sp2.run(["git", "-C", str(populated_vault), "checkout", "outline"], check=True, capture_output=True)
|
||||||
|
sp2.run(["git", "-C", str(populated_vault), "add", "-A"], check=True, capture_output=True)
|
||||||
|
sp2.run(["git", "-C", str(populated_vault), "commit", "-m", "outline: new doc"], check=True, capture_output=True)
|
||||||
|
after = sp2.run(
|
||||||
|
["git", "-C", str(populated_vault), "rev-parse", "outline"],
|
||||||
|
capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
assert before != after, "outline branch must advance after pull"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-C3 — Idempotent pull
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPullIdempotent:
|
||||||
|
|
||||||
|
async def test_pull_with_no_changes_returns_success(self, client):
|
||||||
|
"""A pull against an empty diff must succeed (not error)."""
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(
|
||||||
|
["No changes from Outline.", "Done. 0 updated."], returncode=0
|
||||||
|
)
|
||||||
|
r = await client.post("/pull")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
|
||||||
|
async def test_pull_twice_is_safe(self, client):
|
||||||
|
"""Two sequential pulls must both succeed."""
|
||||||
|
# Keep patch active until SSE stream finishes so the task can run
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||||
|
r1 = await client.post("/pull")
|
||||||
|
assert r1.status_code in (200, 202)
|
||||||
|
await consume_sse(client, r1.json()["job_id"]) # drain → job completes
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 0 updated."])
|
||||||
|
r2 = await client.post("/pull")
|
||||||
|
assert r2.status_code in (200, 202)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-C4 — Job lock prevents concurrent syncs
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestJobLock:
|
||||||
|
|
||||||
|
async def test_concurrent_pull_returns_409(self, client):
|
||||||
|
"""
|
||||||
|
Starting a second pull while the first is pending/running returns 409.
|
||||||
|
_active_job is set immediately when POST /pull is called.
|
||||||
|
"""
|
||||||
|
r1 = await client.post("/pull")
|
||||||
|
assert r1.status_code in (200, 202), "First pull must be accepted"
|
||||||
|
|
||||||
|
# _active_job is now set — second pull must be rejected
|
||||||
|
r2 = await client.post("/pull")
|
||||||
|
assert r2.status_code == 409, (
|
||||||
|
"Second pull while first is pending must return 409 Conflict"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_push_while_pull_running_returns_409(self, client):
|
||||||
|
"""Push is also blocked while a pull is pending/running."""
|
||||||
|
await client.post("/pull")
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code == 409
|
||||||
263
tests/test_phase_d_changes.py
Normal file
263
tests/test_phase_d_changes.py
Normal file
@@ -0,0 +1,263 @@
|
|||||||
|
"""
|
||||||
|
Phase D — Pending Changes View Tests
|
||||||
|
|
||||||
|
Tests for GET /changes (structured change list) and GET /diff/{path} (inline diff).
|
||||||
|
Git operations run against the real temp vault — no subprocess mocking needed here.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import base64
|
||||||
|
import subprocess
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def encode_path(path: str) -> str:
|
||||||
|
"""URL-safe base64 encoding of a path, matching what the app uses."""
|
||||||
|
return base64.urlsafe_b64encode(path.encode()).decode()
|
||||||
|
|
||||||
|
|
||||||
|
def git(vault: Path, *args) -> str:
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(vault), *args],
|
||||||
|
check=True, capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def commit_all(vault: Path, message: str):
|
||||||
|
subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
|
||||||
|
try:
|
||||||
|
subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-D1 — Changes endpoint structure
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestChangesEndpoint:
|
||||||
|
|
||||||
|
async def test_get_changes_returns_200(self, client):
|
||||||
|
r = await client.get("/changes")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_changes_returns_json(self, client):
|
||||||
|
r = await client.get("/changes")
|
||||||
|
assert "application/json" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_changes_returns_list(self, client):
|
||||||
|
r = await client.get("/changes")
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data, list)
|
||||||
|
|
||||||
|
async def test_changes_empty_when_clean(self, client, populated_vault):
|
||||||
|
r = await client.get("/changes")
|
||||||
|
assert r.json() == []
|
||||||
|
|
||||||
|
async def test_each_change_has_required_fields(self, client, vault_with_pending):
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
assert len(items) > 0, "Expected pending changes"
|
||||||
|
for item in items:
|
||||||
|
assert "path" in item, f"Missing 'path' in item: {item}"
|
||||||
|
assert "status" in item, f"Missing 'status' in item: {item}"
|
||||||
|
assert "action" in item, f"Missing 'action' in item: {item}"
|
||||||
|
|
||||||
|
async def test_status_values_are_valid(self, client, vault_with_pending):
|
||||||
|
valid_statuses = {"modified", "added", "deleted", "renamed"}
|
||||||
|
r = await client.get("/changes")
|
||||||
|
for item in r.json():
|
||||||
|
assert item["status"] in valid_statuses, (
|
||||||
|
f"Invalid status '{item['status']}' — must be one of {valid_statuses}"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-D2 — Change categories
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestChangeCategories:
|
||||||
|
|
||||||
|
async def test_modified_file_shown_as_modified(self, client, populated_vault):
|
||||||
|
# Edit an existing file on main
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(cv.read_text() + "\n## Appendix\nNew content.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: edit CV")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
cv_item = next((i for i in items if "CV.md" in i["path"]), None)
|
||||||
|
assert cv_item is not None, "CV.md must appear in changes"
|
||||||
|
assert cv_item["status"] == "modified"
|
||||||
|
|
||||||
|
async def test_new_file_shown_as_added(self, client, populated_vault):
|
||||||
|
new_file = populated_vault / "Projekte" / "NewDoc.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("# New Doc\nWritten in Obsidian.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new doc")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
new_item = next((i for i in items if "NewDoc.md" in i["path"]), None)
|
||||||
|
assert new_item is not None, "NewDoc.md must appear in changes"
|
||||||
|
assert new_item["status"] == "added"
|
||||||
|
|
||||||
|
async def test_deleted_file_shown_as_deleted(self, client, populated_vault):
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.unlink()
|
||||||
|
commit_all(populated_vault, "obsidian: delete CV")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
del_item = next((i for i in items if "CV.md" in i["path"]), None)
|
||||||
|
assert del_item is not None, "Deleted CV.md must appear in changes"
|
||||||
|
assert del_item["status"] == "deleted"
|
||||||
|
|
||||||
|
async def test_renamed_file_shown_as_renamed(self, client, populated_vault):
|
||||||
|
old_path = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
new_path = populated_vault / "Bewerbungen" / "Curriculum Vitae.md"
|
||||||
|
old_path.rename(new_path)
|
||||||
|
commit_all(populated_vault, "obsidian: rename CV")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
# Either old or new name should appear with renamed status
|
||||||
|
renamed = [i for i in items if i["status"] == "renamed"]
|
||||||
|
assert len(renamed) >= 1, "Renamed file must appear with status=renamed"
|
||||||
|
# Check both from_path and to_path since either may contain "CV"
|
||||||
|
all_paths = " ".join(
|
||||||
|
str(i.get("from_path", "")) + " " + str(i.get("to_path", i["path"]))
|
||||||
|
for i in renamed
|
||||||
|
)
|
||||||
|
assert "CV" in all_paths, "Renamed paths must reference the original filename"
|
||||||
|
|
||||||
|
async def test_renamed_item_has_from_and_to_paths(self, client, populated_vault):
|
||||||
|
old_path = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
new_path = populated_vault / "Bewerbungen" / "Resume.md"
|
||||||
|
old_path.rename(new_path)
|
||||||
|
commit_all(populated_vault, "obsidian: rename")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
renamed = [i for i in r.json() if i["status"] == "renamed"]
|
||||||
|
assert len(renamed) >= 1
|
||||||
|
item = renamed[0]
|
||||||
|
assert "from" in item or "from_path" in item, "Rename must include source path"
|
||||||
|
assert "to" in item or "to_path" in item, "Rename must include destination path"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-D3 — Diff preview
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestDiffPreview:
|
||||||
|
|
||||||
|
async def test_diff_endpoint_returns_200(self, client, populated_vault):
|
||||||
|
# Edit a file to create a diff
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
original = cv.read_text()
|
||||||
|
cv.write_text(original + "\n## New Section\n")
|
||||||
|
commit_all(populated_vault, "edit for diff")
|
||||||
|
|
||||||
|
encoded = encode_path("Bewerbungen/CV.md")
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_diff_returns_html_fragment(self, client, populated_vault):
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(cv.read_text() + "\nExtra line.\n")
|
||||||
|
commit_all(populated_vault, "edit for diff")
|
||||||
|
|
||||||
|
encoded = encode_path("Bewerbungen/CV.md")
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
assert "text/html" in r.headers.get("content-type", ""), (
|
||||||
|
"Diff endpoint must return HTML"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_diff_contains_two_columns(self, client, populated_vault):
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(cv.read_text() + "\nAdded line.\n")
|
||||||
|
commit_all(populated_vault, "edit for diff")
|
||||||
|
|
||||||
|
encoded = encode_path("Bewerbungen/CV.md")
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
body = r.text.lower()
|
||||||
|
# Two-column layout — check for table or grid structure
|
||||||
|
assert "table" in body or "column" in body or "diff" in body, (
|
||||||
|
"Diff HTML must contain a two-column comparison layout"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_diff_for_unknown_file_returns_404(self, client, populated_vault):
|
||||||
|
encoded = encode_path("DoesNotExist/ghost.md")
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
assert r.status_code == 404
|
||||||
|
|
||||||
|
async def test_diff_added_lines_have_distinct_marking(self, client, populated_vault):
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.write_text(cv.read_text() + "\nThis line was added.\n")
|
||||||
|
commit_all(populated_vault, "add line")
|
||||||
|
|
||||||
|
encoded = encode_path("Bewerbungen/CV.md")
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
body = r.text
|
||||||
|
# Added lines must be visually distinct (green class, + prefix, or ins tag)
|
||||||
|
# difflib.HtmlDiff marks added lines with class="diff_add"
|
||||||
|
assert any(marker in body for marker in (
|
||||||
|
'class="diff_add"', "diff_add", 'class="add"', "<ins>", "diff-add",
|
||||||
|
)), "Added lines must be visually marked in the diff"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-D4 — Deleted files skipped when allow_deletions=false
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestDeletedFilesSkipped:
|
||||||
|
|
||||||
|
async def test_deleted_file_action_is_skip_when_deletions_off(
|
||||||
|
self, client, populated_vault, settings_file
|
||||||
|
):
|
||||||
|
"""With allow_deletions=false in settings, deleted files must show action=skip."""
|
||||||
|
import json
|
||||||
|
settings = json.loads(settings_file.read_text())
|
||||||
|
settings["sync"]["allow_deletions"] = False
|
||||||
|
settings_file.write_text(json.dumps(settings))
|
||||||
|
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.unlink()
|
||||||
|
commit_all(populated_vault, "delete CV")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
del_item = next((i for i in items if "CV.md" in i["path"]), None)
|
||||||
|
assert del_item is not None
|
||||||
|
assert del_item["action"] in ("skip", "skipped"), (
|
||||||
|
"Deleted file must have action=skip when deletions are disabled"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_deleted_file_action_is_delete_when_deletions_on(
|
||||||
|
self, client, populated_vault, settings_file
|
||||||
|
):
|
||||||
|
"""With allow_deletions=true, deleted file action must be delete."""
|
||||||
|
import json
|
||||||
|
settings = json.loads(settings_file.read_text())
|
||||||
|
settings["sync"]["allow_deletions"] = True
|
||||||
|
settings_file.write_text(json.dumps(settings))
|
||||||
|
|
||||||
|
cv = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
cv.unlink()
|
||||||
|
commit_all(populated_vault, "delete CV")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
del_item = next((i for i in items if "CV.md" in i["path"]), None)
|
||||||
|
assert del_item is not None
|
||||||
|
assert del_item["action"] in ("delete", "archive"), (
|
||||||
|
"Deleted file must have action=delete when deletions are enabled"
|
||||||
|
)
|
||||||
290
tests/test_phase_e_push.py
Normal file
290
tests/test_phase_e_push.py
Normal file
@@ -0,0 +1,290 @@
|
|||||||
|
"""
|
||||||
|
Phase E — Push with Live Output Tests
|
||||||
|
|
||||||
|
Tests for POST /push (job start) and the SSE stream.
|
||||||
|
The Outline API is mocked so tests do not require a live Outline instance.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import AsyncMock, MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
sys.path.insert(0, str(Path(__file__).parent))
|
||||||
|
from helpers import make_mock_process # noqa: E402
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
async def consume_sse(client, job_id: str, max_events: int = 100) -> list[dict]:
|
||||||
|
events = []
|
||||||
|
async with client.stream("GET", f"/stream/{job_id}") as r:
|
||||||
|
async for line in r.aiter_lines():
|
||||||
|
if line.startswith("data:"):
|
||||||
|
try:
|
||||||
|
events.append(json.loads(line[5:].strip()))
|
||||||
|
except json.JSONDecodeError:
|
||||||
|
events.append({"raw": line[5:].strip()})
|
||||||
|
if events and events[-1].get("type") == "done":
|
||||||
|
break
|
||||||
|
return events
|
||||||
|
|
||||||
|
|
||||||
|
def git(vault: Path, *args) -> str:
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(vault), *args],
|
||||||
|
check=True, capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def commit_all(vault: Path, message: str):
|
||||||
|
subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
|
||||||
|
try:
|
||||||
|
subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-E1 — Push streaming
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPushStreaming:
|
||||||
|
|
||||||
|
async def test_post_push_returns_job_id(self, client, vault_with_pending):
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 2 updated, 1 created."])
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
assert "job_id" in r.json()
|
||||||
|
|
||||||
|
async def test_stream_content_type_is_sse(self, client, vault_with_pending):
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done."])
|
||||||
|
r = await client.post("/push")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
|
||||||
|
async with client.stream("GET", f"/stream/{job_id}") as stream:
|
||||||
|
assert "text/event-stream" in stream.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_stream_ends_with_done_event(self, client, vault_with_pending):
|
||||||
|
push_lines = [
|
||||||
|
"ok: Bewerbungen/CV.md updated",
|
||||||
|
"ok: Projekte/NewNote.md created (id: abc123)",
|
||||||
|
"Done. 1 updated, 1 created.",
|
||||||
|
]
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(push_lines)
|
||||||
|
r = await client.post("/push")
|
||||||
|
job_id = r.json()["job_id"]
|
||||||
|
events = await consume_sse(client, job_id)
|
||||||
|
|
||||||
|
done_events = [e for e in events if e.get("type") == "done"]
|
||||||
|
assert len(done_events) == 1
|
||||||
|
|
||||||
|
async def test_done_event_contains_summary_counts(self, client, vault_with_pending):
|
||||||
|
push_lines = ["Done. 1 updated, 1 created, 0 skipped, 0 errors."]
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(push_lines)
|
||||||
|
r = await client.post("/push")
|
||||||
|
events = await consume_sse(client, r.json()["job_id"])
|
||||||
|
done = next(e for e in events if e.get("type") == "done")
|
||||||
|
summary = json.dumps(done)
|
||||||
|
# Summary counts must appear somewhere in the done event
|
||||||
|
assert any(k in summary for k in ("updated", "created", "skipped", "errors")), (
|
||||||
|
"Done event must include summary counts"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_per_file_events_emitted(self, client, vault_with_pending):
|
||||||
|
push_lines = [
|
||||||
|
"processing: Bewerbungen/CV.md",
|
||||||
|
"ok: Bewerbungen/CV.md updated",
|
||||||
|
"processing: Projekte/NewNote.md",
|
||||||
|
"ok: Projekte/NewNote.md created (id: xyz789)",
|
||||||
|
"Done.",
|
||||||
|
]
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(push_lines)
|
||||||
|
r = await client.post("/push")
|
||||||
|
events = await consume_sse(client, r.json()["job_id"])
|
||||||
|
all_text = json.dumps(events)
|
||||||
|
assert "CV.md" in all_text, "Events should mention CV.md"
|
||||||
|
assert "NewNote.md" in all_text, "Events should mention NewNote.md"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-E2 — New file frontmatter writeback
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestNewFileCreation:
|
||||||
|
|
||||||
|
async def test_new_file_appears_in_pending_changes(self, client, populated_vault):
|
||||||
|
new_file = populated_vault / "Projekte" / "BrandNew.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("# Brand New\nContent without frontmatter.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new file")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
new_item = next((i for i in items if "BrandNew.md" in i["path"]), None)
|
||||||
|
assert new_item is not None
|
||||||
|
assert new_item["status"] == "added"
|
||||||
|
|
||||||
|
async def test_push_writes_frontmatter_back_to_new_file(
|
||||||
|
self, client, populated_vault
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
After push, a new file must have frontmatter with outline_id injected.
|
||||||
|
The mock simulates the sync engine writing back the ID.
|
||||||
|
"""
|
||||||
|
new_file = populated_vault / "Projekte" / "FrontmatterTest.md"
|
||||||
|
new_file.parent.mkdir(exist_ok=True)
|
||||||
|
new_file.write_text("# Frontmatter Test\nNo ID yet.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new file no frontmatter")
|
||||||
|
|
||||||
|
fake_id = "doc-new-frontmatter-001"
|
||||||
|
|
||||||
|
def fake_push(*args, **kwargs):
|
||||||
|
# Simulate sync engine writing frontmatter back
|
||||||
|
new_file.write_text(textwrap.dedent(f"""\
|
||||||
|
---
|
||||||
|
outline_id: {fake_id}
|
||||||
|
outline_collection_id: col-proj-001
|
||||||
|
---
|
||||||
|
# Frontmatter Test
|
||||||
|
No ID yet.
|
||||||
|
"""))
|
||||||
|
commit_all(populated_vault, "sync: write back frontmatter")
|
||||||
|
return make_mock_process([
|
||||||
|
f"ok: Projekte/FrontmatterTest.md created (id: {fake_id})",
|
||||||
|
"Done. 1 created.",
|
||||||
|
])
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", side_effect=fake_push):
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202)
|
||||||
|
await consume_sse(client, r.json()["job_id"])
|
||||||
|
|
||||||
|
content = new_file.read_text()
|
||||||
|
assert "outline_id" in content, "Sync engine must write outline_id back to new file"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-E3 — Push blocked by conflicts
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestPushBlockedByConflicts:
|
||||||
|
|
||||||
|
async def test_push_returns_409_when_conflicts_exist(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code == 409, (
|
||||||
|
"Push must return 409 Conflict when unresolved merge conflicts exist"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_push_409_response_includes_conflict_paths(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code == 409
|
||||||
|
body = r.json()
|
||||||
|
assert "conflicts" in body or "files" in body or "message" in body, (
|
||||||
|
"409 response must explain which files are conflicted"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_push_allowed_after_conflicts_resolved(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
"""Resolve the conflict, then push must be accepted."""
|
||||||
|
# Resolve: check out local version
|
||||||
|
subprocess.run(
|
||||||
|
["git", "-C", str(vault_with_conflict), "checkout", "--ours",
|
||||||
|
"Bewerbungen/CV.md"],
|
||||||
|
check=True, capture_output=True,
|
||||||
|
)
|
||||||
|
commit_all(vault_with_conflict, "resolve: keep ours")
|
||||||
|
subprocess.run(
|
||||||
|
["git", "-C", str(vault_with_conflict), "merge", "--abort"],
|
||||||
|
capture_output=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202), (
|
||||||
|
"Push must be allowed after conflicts are resolved"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-E4 — New collection creation
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestNewCollectionCreation:
|
||||||
|
|
||||||
|
async def test_new_top_level_folder_detected_as_new_collection(
|
||||||
|
self, client, populated_vault
|
||||||
|
):
|
||||||
|
"""A new folder at the top level must appear in changes as a new collection."""
|
||||||
|
new_doc = populated_vault / "NewCollection" / "FirstDoc.md"
|
||||||
|
new_doc.parent.mkdir()
|
||||||
|
new_doc.write_text("# First Doc\nNew collection content.\n")
|
||||||
|
commit_all(populated_vault, "obsidian: new collection")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
new_item = next((i for i in items if "FirstDoc.md" in i["path"]), None)
|
||||||
|
assert new_item is not None
|
||||||
|
# The action or a note must indicate a new collection will be created
|
||||||
|
item_str = json.dumps(new_item)
|
||||||
|
assert "collection" in item_str.lower() or new_item["status"] == "added", (
|
||||||
|
"New file in unknown folder must be flagged as requiring new collection"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-E5 — Rename handling
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestRenameHandling:
|
||||||
|
|
||||||
|
async def test_renamed_file_shown_in_changes(self, client, populated_vault):
|
||||||
|
old = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
new = populated_vault / "Bewerbungen" / "Resume.md"
|
||||||
|
old.rename(new)
|
||||||
|
commit_all(populated_vault, "obsidian: rename CV to Resume")
|
||||||
|
|
||||||
|
r = await client.get("/changes")
|
||||||
|
items = r.json()
|
||||||
|
renamed = [i for i in items if i["status"] == "renamed"]
|
||||||
|
assert len(renamed) >= 1
|
||||||
|
|
||||||
|
async def test_push_rename_uses_update_not_create(self, client, populated_vault):
|
||||||
|
"""
|
||||||
|
The sync engine must call documents.update (not delete+create) for renames,
|
||||||
|
preserving the Outline document ID.
|
||||||
|
"""
|
||||||
|
old = populated_vault / "Bewerbungen" / "CV.md"
|
||||||
|
new = populated_vault / "Bewerbungen" / "Resume.md"
|
||||||
|
old.rename(new)
|
||||||
|
commit_all(populated_vault, "obsidian: rename")
|
||||||
|
|
||||||
|
push_lines = [
|
||||||
|
"ok: Bewerbungen/Resume.md → title updated (id: doc-cv-001)",
|
||||||
|
"Done. 1 renamed.",
|
||||||
|
]
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(push_lines)
|
||||||
|
r = await client.post("/push")
|
||||||
|
events_raw = await consume_sse(client, r.json()["job_id"])
|
||||||
|
all_text = json.dumps(events_raw)
|
||||||
|
# Should not see "created" for a renamed document
|
||||||
|
assert "doc-cv-001" in all_text or "renamed" in all_text or "updated" in all_text, (
|
||||||
|
"Rename should update the existing document, not create a new one"
|
||||||
|
)
|
||||||
354
tests/test_phase_f_conflicts.py
Normal file
354
tests/test_phase_f_conflicts.py
Normal file
@@ -0,0 +1,354 @@
|
|||||||
|
"""
|
||||||
|
Phase F — Conflict Resolution Tests
|
||||||
|
|
||||||
|
Tests for GET /conflicts, GET /diff/{path}, and POST /resolve.
|
||||||
|
Uses the vault_with_conflict fixture which creates a real git merge conflict.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import base64
|
||||||
|
import json
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import AsyncMock, MagicMock, patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
sys.path.insert(0, str(__import__("pathlib").Path(__file__).parent))
|
||||||
|
from helpers import make_mock_process # noqa: E402
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def encode_path(path: str) -> str:
|
||||||
|
return base64.urlsafe_b64encode(path.encode()).decode()
|
||||||
|
|
||||||
|
|
||||||
|
def git(vault: Path, *args) -> str:
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(vault), *args],
|
||||||
|
check=True, capture_output=True, text=True,
|
||||||
|
).stdout.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def commit_all(vault: Path, message: str):
|
||||||
|
subprocess.run(["git", "-C", str(vault), "add", "-A"], check=True, capture_output=True)
|
||||||
|
try:
|
||||||
|
subprocess.run(["git", "-C", str(vault), "commit", "-m", message], check=True, capture_output=True)
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F1 — Conflicts list
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestConflictsList:
|
||||||
|
|
||||||
|
async def test_conflicts_returns_200(self, client):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_conflicts_returns_json(self, client):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
assert "application/json" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_conflicts_empty_when_clean(self, client, populated_vault):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data, list)
|
||||||
|
assert len(data) == 0
|
||||||
|
|
||||||
|
async def test_conflicts_lists_conflicted_files(self, client, vault_with_conflict):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
data = r.json()
|
||||||
|
assert len(data) >= 1, "Expected at least one conflict"
|
||||||
|
paths = [item["path"] if isinstance(item, dict) else item for item in data]
|
||||||
|
assert any("CV.md" in p for p in paths), "CV.md must appear in conflict list"
|
||||||
|
|
||||||
|
async def test_each_conflict_has_required_fields(self, client, vault_with_conflict):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
for item in r.json():
|
||||||
|
assert "path" in item, f"Missing 'path' in conflict item: {item}"
|
||||||
|
# At minimum path is required; timestamps are recommended
|
||||||
|
assert isinstance(item["path"], str)
|
||||||
|
|
||||||
|
async def test_conflict_item_includes_timestamps(self, client, vault_with_conflict):
|
||||||
|
"""Conflict items should indicate when each side was last modified."""
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
items = r.json()
|
||||||
|
assert len(items) >= 1
|
||||||
|
item = items[0]
|
||||||
|
# At least one timestamp or modification indicator should be present
|
||||||
|
has_time = any(k in item for k in (
|
||||||
|
"local_time", "remote_time", "local_updated", "outline_updated",
|
||||||
|
"ours_time", "theirs_time",
|
||||||
|
))
|
||||||
|
# This is recommended, not strictly required — log warning if missing
|
||||||
|
if not has_time:
|
||||||
|
pytest.warns(
|
||||||
|
UserWarning,
|
||||||
|
match="conflict timestamps",
|
||||||
|
# Informational: timestamps improve UX but are not blocking
|
||||||
|
) if False else None # non-blocking check
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F2 — Conflict diff view
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestConflictDiff:
|
||||||
|
|
||||||
|
async def test_diff_returns_200_for_conflict_file(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
conflict_path = r_conflicts.json()[0]["path"]
|
||||||
|
encoded = encode_path(conflict_path)
|
||||||
|
|
||||||
|
r = await client.get(f"/diff/{encoded}")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_diff_returns_html(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
r = await client.get(f"/diff/{encode_path(path)}")
|
||||||
|
assert "text/html" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_diff_shows_both_versions(self, client, vault_with_conflict):
|
||||||
|
"""Both the local and Outline version must appear in the diff HTML."""
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
r = await client.get(f"/diff/{encode_path(path)}")
|
||||||
|
body = r.text
|
||||||
|
# The diff must show both sides — check for two-column markers or headings
|
||||||
|
sides_shown = sum(1 for label in (
|
||||||
|
"yours", "mine", "local", "obsidian",
|
||||||
|
"outline", "remote", "theirs",
|
||||||
|
) if label in body.lower())
|
||||||
|
assert sides_shown >= 2, (
|
||||||
|
"Diff must label both sides (local/Obsidian and remote/Outline)"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_diff_for_non_conflict_file_returns_404(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.get(f"/diff/{encode_path('Infra/HomeLab.md')}")
|
||||||
|
# HomeLab.md is not conflicted — must return 404 from conflicts endpoint
|
||||||
|
# (the regular diff endpoint may return 200 for any file)
|
||||||
|
# This test just verifies invalid paths to the conflicts-specific diff fail
|
||||||
|
assert r.status_code in (200, 404) # implementation-defined; document behavior
|
||||||
|
|
||||||
|
async def test_diff_for_unknown_path_returns_404(self, client, vault_with_conflict):
|
||||||
|
r = await client.get(f"/diff/{encode_path('ghost/file.md')}")
|
||||||
|
assert r.status_code == 404
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F3 — Resolve: keep local version
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestResolveLocal:
|
||||||
|
|
||||||
|
async def test_resolve_local_returns_200(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
r = await client.post("/resolve", json={"file": path, "accept": "local"})
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_resolve_local_removes_conflict_markers(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
vault = vault_with_conflict
|
||||||
|
|
||||||
|
r = await client.post("/resolve", json={"file": path, "accept": "local"})
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
content = (vault / path).read_text()
|
||||||
|
assert "<<<<<<<" not in content, "Resolve must remove <<<<<<< markers"
|
||||||
|
assert "=======" not in content, "Resolve must remove ======= markers"
|
||||||
|
assert ">>>>>>>" not in content, "Resolve must remove >>>>>>> markers"
|
||||||
|
|
||||||
|
async def test_resolve_local_keeps_local_content(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
vault = vault_with_conflict
|
||||||
|
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "local"})
|
||||||
|
content = (vault / path).read_text()
|
||||||
|
# The local (Obsidian) version had "new section added"
|
||||||
|
assert "new section" in content.lower() or "local version" in content.lower(), (
|
||||||
|
"Resolving with 'local' must keep the Obsidian version content"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_resolve_local_commits_to_main(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
before = git(vault_with_conflict, "rev-parse", "HEAD")
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "local"})
|
||||||
|
after = git(vault_with_conflict, "rev-parse", "HEAD")
|
||||||
|
|
||||||
|
assert before != after, "Resolve must create a new commit on main"
|
||||||
|
|
||||||
|
async def test_file_no_longer_in_conflicts_after_resolve(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "local"})
|
||||||
|
|
||||||
|
r2 = await client.get("/conflicts")
|
||||||
|
remaining_paths = [i["path"] for i in r2.json()]
|
||||||
|
assert path not in remaining_paths, (
|
||||||
|
"Resolved file must no longer appear in /conflicts"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F4 — Resolve: keep Outline's version
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestResolveRemote:
|
||||||
|
|
||||||
|
async def test_resolve_remote_returns_200(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
r = await client.post("/resolve", json={"file": path, "accept": "remote"})
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_resolve_remote_removes_conflict_markers(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "remote"})
|
||||||
|
content = (vault_with_conflict / path).read_text()
|
||||||
|
assert "<<<<<<<" not in content
|
||||||
|
assert ">>>>>>>" not in content
|
||||||
|
|
||||||
|
async def test_resolve_remote_keeps_outline_content(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "remote"})
|
||||||
|
content = (vault_with_conflict / path).read_text()
|
||||||
|
# The Outline version had "contact info updated"
|
||||||
|
assert "contact info" in content.lower() or "outline version" in content.lower(), (
|
||||||
|
"Resolving with 'remote' must keep the Outline version content"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_resolve_remote_commits_to_main(self, client, vault_with_conflict):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
before = git(vault_with_conflict, "rev-parse", "HEAD")
|
||||||
|
await client.post("/resolve", json={"file": path, "accept": "remote"})
|
||||||
|
after = git(vault_with_conflict, "rev-parse", "HEAD")
|
||||||
|
|
||||||
|
assert before != after
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F5 — Input validation on /resolve
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestResolveValidation:
|
||||||
|
|
||||||
|
async def test_resolve_with_unknown_file_returns_422(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.post("/resolve", json={
|
||||||
|
"file": "NotInConflict/ghost.md",
|
||||||
|
"accept": "local",
|
||||||
|
})
|
||||||
|
assert r.status_code in (404, 422), (
|
||||||
|
"Resolving an unknown file must return 404 or 422"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_resolve_with_path_traversal_returns_422(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.post("/resolve", json={
|
||||||
|
"file": "../../etc/passwd",
|
||||||
|
"accept": "local",
|
||||||
|
})
|
||||||
|
assert r.status_code in (400, 404, 422), (
|
||||||
|
"Path traversal must be rejected"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_resolve_with_invalid_accept_value_returns_422(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r_conflicts = await client.get("/conflicts")
|
||||||
|
path = r_conflicts.json()[0]["path"]
|
||||||
|
|
||||||
|
r = await client.post("/resolve", json={"file": path, "accept": "neither"})
|
||||||
|
assert r.status_code == 422, (
|
||||||
|
"'accept' must be 'local' or 'remote' — other values must be rejected"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_resolve_missing_fields_returns_422(self, client, vault_with_conflict):
|
||||||
|
r = await client.post("/resolve", json={"file": "something.md"})
|
||||||
|
assert r.status_code == 422
|
||||||
|
|
||||||
|
async def test_resolve_requires_json_body(self, client, vault_with_conflict):
|
||||||
|
r = await client.post("/resolve")
|
||||||
|
assert r.status_code in (400, 422)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-F6 — All conflicts resolved → push available
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestAllConflictsResolved:
|
||||||
|
|
||||||
|
async def test_conflicts_empty_after_all_resolved(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
for item in r.json():
|
||||||
|
await client.post("/resolve", json={"file": item["path"], "accept": "local"})
|
||||||
|
|
||||||
|
r2 = await client.get("/conflicts")
|
||||||
|
assert r2.json() == [], "No conflicts should remain after all are resolved"
|
||||||
|
|
||||||
|
async def test_status_shows_clean_after_all_resolved(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
for item in r.json():
|
||||||
|
await client.post("/resolve", json={"file": item["path"], "accept": "local"})
|
||||||
|
|
||||||
|
status = (await client.get("/status")).json()
|
||||||
|
assert status["conflicts"] == 0
|
||||||
|
|
||||||
|
async def test_push_allowed_after_all_conflicts_resolved(
|
||||||
|
self, client, vault_with_conflict
|
||||||
|
):
|
||||||
|
from unittest.mock import patch
|
||||||
|
r = await client.get("/conflicts")
|
||||||
|
for item in r.json():
|
||||||
|
await client.post("/resolve", json={"file": item["path"], "accept": "local"})
|
||||||
|
|
||||||
|
with patch("webui.spawn_sync_subprocess", new_callable=AsyncMock) as mock_spawn:
|
||||||
|
mock_spawn.return_value = make_mock_process(["Done. 1 updated."])
|
||||||
|
r = await client.post("/push")
|
||||||
|
assert r.status_code in (200, 202), (
|
||||||
|
"Push must be allowed after all conflicts are resolved"
|
||||||
|
)
|
||||||
182
tests/test_phase_g_history.py
Normal file
182
tests/test_phase_g_history.py
Normal file
@@ -0,0 +1,182 @@
|
|||||||
|
"""
|
||||||
|
Phase G — Sync History Tests
|
||||||
|
|
||||||
|
Tests for GET /history: rendering _sync_log.md as a reverse-chronological table.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import textwrap
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.asyncio
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
SAMPLE_LOG = textwrap.dedent("""\
|
||||||
|
# Sync Log
|
||||||
|
|
||||||
|
| Timestamp | Direction | Files | Status |
|
||||||
|
|-----------|-----------|-------|--------|
|
||||||
|
| 2026-03-03 22:15 | push | 1 updated | error: CV.md failed |
|
||||||
|
| 2026-03-04 08:00 | pull | 0 changes | ok |
|
||||||
|
| 2026-03-05 09:10 | push | 2 updated, 1 created | ok |
|
||||||
|
| 2026-03-06 14:32 | pull | 3 updated | ok |
|
||||||
|
""")
|
||||||
|
|
||||||
|
MINIMAL_LOG = textwrap.dedent("""\
|
||||||
|
# Sync Log
|
||||||
|
|
||||||
|
| Timestamp | Direction | Files | Status |
|
||||||
|
|-----------|-----------|-------|--------|
|
||||||
|
| 2026-01-01 00:00 | pull | 1 updated | ok |
|
||||||
|
""")
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-G1 — History page renders
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestHistoryPage:
|
||||||
|
|
||||||
|
async def test_history_returns_200(self, client):
|
||||||
|
r = await client.get("/history")
|
||||||
|
assert r.status_code == 200
|
||||||
|
|
||||||
|
async def test_history_returns_html(self, client):
|
||||||
|
r = await client.get("/history")
|
||||||
|
assert "text/html" in r.headers.get("content-type", "")
|
||||||
|
|
||||||
|
async def test_history_page_contains_table(self, client, vault_dir, sync_log):
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "<table" in body, "History page must render an HTML table"
|
||||||
|
|
||||||
|
async def test_history_shows_direction_labels(self, client, vault_dir):
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "pull" in body, "History must show pull entries"
|
||||||
|
assert "push" in body, "History must show push entries"
|
||||||
|
|
||||||
|
async def test_history_shows_timestamps(self, client, vault_dir):
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text
|
||||||
|
assert "2026-03-06" in body, "History must show timestamps from _sync_log.md"
|
||||||
|
|
||||||
|
async def test_history_shows_file_counts(self, client, vault_dir):
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "updated" in body or "created" in body, (
|
||||||
|
"History must show file change counts"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_history_shows_status(self, client, vault_dir):
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert "ok" in body or "error" in body, "History must show entry status"
|
||||||
|
|
||||||
|
async def test_history_empty_when_no_log(self, client, vault_dir):
|
||||||
|
"""If _sync_log.md does not exist, page should render gracefully (no 500)."""
|
||||||
|
log_path = vault_dir / "_sync_log.md"
|
||||||
|
if log_path.exists():
|
||||||
|
log_path.unlink()
|
||||||
|
|
||||||
|
r = await client.get("/history")
|
||||||
|
assert r.status_code == 200, "History page must not crash when log is missing"
|
||||||
|
|
||||||
|
async def test_history_empty_state_message(self, client, vault_dir):
|
||||||
|
"""Empty history should show a helpful message, not a blank page."""
|
||||||
|
log_path = vault_dir / "_sync_log.md"
|
||||||
|
if log_path.exists():
|
||||||
|
log_path.unlink()
|
||||||
|
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
assert any(phrase in body for phrase in (
|
||||||
|
"no history", "no sync", "empty", "no entries", "nothing yet"
|
||||||
|
)), "Empty history must show a message"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# US-G2 — _sync_log.md parsing
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class TestSyncLogParsing:
|
||||||
|
|
||||||
|
async def test_entries_shown_in_reverse_chronological_order(
|
||||||
|
self, client, vault_dir
|
||||||
|
):
|
||||||
|
"""Most recent entry must appear before older entries in the HTML."""
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text
|
||||||
|
|
||||||
|
pos_newest = body.find("2026-03-06")
|
||||||
|
pos_oldest = body.find("2026-03-03")
|
||||||
|
|
||||||
|
assert pos_newest != -1, "Most recent entry must appear in history"
|
||||||
|
assert pos_oldest != -1, "Oldest entry must appear in history"
|
||||||
|
assert pos_newest < pos_oldest, (
|
||||||
|
"Most recent entry (2026-03-06) must appear before oldest (2026-03-03)"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_error_entries_visually_distinct(self, client, vault_dir):
|
||||||
|
"""Entries with non-ok status should be highlighted differently."""
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text.lower()
|
||||||
|
# Error entry from 2026-03-03 should have visual distinction
|
||||||
|
# This is checked loosely: error word near some CSS class or color
|
||||||
|
assert "error" in body, "Error entries must be shown in history"
|
||||||
|
|
||||||
|
async def test_raw_markdown_not_shown_as_pipe_table(self, client, vault_dir):
|
||||||
|
"""The raw markdown pipe-table syntax must not be visible in rendered output."""
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text
|
||||||
|
# Pipe characters from the markdown table should NOT appear verbatim
|
||||||
|
# (they should be parsed and rendered as HTML <table>)
|
||||||
|
raw_table_lines = [l for l in body.splitlines() if l.strip().startswith("|---")]
|
||||||
|
assert len(raw_table_lines) == 0, (
|
||||||
|
"Raw markdown table separator lines must not appear in rendered HTML"
|
||||||
|
)
|
||||||
|
|
||||||
|
async def test_all_log_entries_appear(self, client, vault_dir):
|
||||||
|
"""All 4 entries in SAMPLE_LOG must appear in the rendered history."""
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
body = r.text
|
||||||
|
|
||||||
|
assert "2026-03-06" in body
|
||||||
|
assert "2026-03-05" in body
|
||||||
|
assert "2026-03-04" in body
|
||||||
|
assert "2026-03-03" in body
|
||||||
|
|
||||||
|
async def test_single_entry_log_renders(self, client, vault_dir):
|
||||||
|
(vault_dir / "_sync_log.md").write_text(MINIMAL_LOG)
|
||||||
|
r = await client.get("/history")
|
||||||
|
assert r.status_code == 200
|
||||||
|
assert "2026-01-01" in r.text
|
||||||
|
|
||||||
|
async def test_history_api_endpoint_returns_json(self, client, vault_dir):
|
||||||
|
"""
|
||||||
|
GET /history?format=json returns structured history data.
|
||||||
|
This is optional but strongly recommended for future HTMX updates.
|
||||||
|
"""
|
||||||
|
(vault_dir / "_sync_log.md").write_text(SAMPLE_LOG)
|
||||||
|
r = await client.get("/history?format=json")
|
||||||
|
# If not implemented, 200 HTML is also acceptable
|
||||||
|
if r.status_code == 200 and "application/json" in r.headers.get("content-type", ""):
|
||||||
|
data = r.json()
|
||||||
|
assert isinstance(data, list)
|
||||||
|
assert len(data) >= 4
|
||||||
|
for entry in data:
|
||||||
|
assert "timestamp" in entry or "date" in entry
|
||||||
|
assert "direction" in entry
|
||||||
792
webui.py
Normal file
792
webui.py
Normal file
@@ -0,0 +1,792 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""
|
||||||
|
Outline Sync Web UI — FastAPI + HTMX
|
||||||
|
Phases B–G of WEBUI_PRD.md
|
||||||
|
|
||||||
|
Run inside outline-sync-ui Docker container:
|
||||||
|
uvicorn webui:app --host 0.0.0.0 --port 8080
|
||||||
|
|
||||||
|
Module-level VAULT_DIR and SETTINGS_PATH can be overridden by tests.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import base64
|
||||||
|
import difflib
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
import re
|
||||||
|
import subprocess
|
||||||
|
import uuid
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Optional
|
||||||
|
|
||||||
|
from fastapi import FastAPI, HTTPException, Request
|
||||||
|
from fastapi.responses import HTMLResponse, JSONResponse, StreamingResponse
|
||||||
|
from pydantic import BaseModel, field_validator
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Module-level config — overridden by tests: webui.VAULT_DIR = tmp_path
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
VAULT_DIR: Path = Path(os.environ.get("VAULT_DIR", "/vault"))
|
||||||
|
SETTINGS_PATH: Path = Path(os.environ.get("SETTINGS_PATH", "/work/settings.json"))
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# App + job state
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
app = FastAPI(title="Outline Sync UI", docs_url=None, redoc_url=None)
|
||||||
|
|
||||||
|
_jobs: dict[str, dict] = {}
|
||||||
|
_active_job: Optional[str] = None
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Git helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _git(*args: str) -> subprocess.CompletedProcess:
|
||||||
|
"""Run a git command against VAULT_DIR (no check — callers inspect returncode)."""
|
||||||
|
return subprocess.run(
|
||||||
|
["git", "-C", str(VAULT_DIR), *args],
|
||||||
|
capture_output=True,
|
||||||
|
text=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Settings
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _load_settings() -> dict:
|
||||||
|
try:
|
||||||
|
return json.loads(SETTINGS_PATH.read_text())
|
||||||
|
except Exception:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Vault state
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _get_conflict_files() -> list[str]:
|
||||||
|
"""Return sorted list of paths with unresolved merge conflicts."""
|
||||||
|
r = _git("ls-files", "-u")
|
||||||
|
seen: set[str] = set()
|
||||||
|
for line in r.stdout.splitlines():
|
||||||
|
parts = line.split("\t")
|
||||||
|
if len(parts) >= 2:
|
||||||
|
seen.add(parts[1])
|
||||||
|
return sorted(seen)
|
||||||
|
|
||||||
|
|
||||||
|
def _get_pending_count() -> int:
|
||||||
|
r = _git("diff", "outline..main", "--name-only")
|
||||||
|
return len([l for l in r.stdout.splitlines() if l.strip()])
|
||||||
|
|
||||||
|
|
||||||
|
def _get_pending_changes() -> list[dict]:
|
||||||
|
r = _git("diff", "outline..main", "--name-status", "-M90")
|
||||||
|
allow_deletions = _load_settings().get("sync", {}).get("allow_deletions", False)
|
||||||
|
|
||||||
|
changes: list[dict] = []
|
||||||
|
for line in r.stdout.splitlines():
|
||||||
|
if not line.strip():
|
||||||
|
continue
|
||||||
|
parts = line.split("\t")
|
||||||
|
code = parts[0]
|
||||||
|
|
||||||
|
if code == "M" and len(parts) >= 2:
|
||||||
|
changes.append({"path": parts[1], "status": "modified", "action": "update"})
|
||||||
|
elif code == "A" and len(parts) >= 2:
|
||||||
|
changes.append({"path": parts[1], "status": "added", "action": "create"})
|
||||||
|
elif code == "D" and len(parts) >= 2:
|
||||||
|
action = "delete" if allow_deletions else "skip"
|
||||||
|
changes.append({
|
||||||
|
"path": parts[1], "status": "deleted", "action": action,
|
||||||
|
"reason": "" if allow_deletions else "deletions disabled in settings",
|
||||||
|
})
|
||||||
|
elif code.startswith("R") and len(parts) >= 3:
|
||||||
|
changes.append({
|
||||||
|
"path": parts[2], "status": "renamed", "action": "update",
|
||||||
|
"from": parts[1], "to": parts[2],
|
||||||
|
"from_path": parts[1], "to_path": parts[2],
|
||||||
|
})
|
||||||
|
return changes
|
||||||
|
|
||||||
|
|
||||||
|
def _parse_sync_log() -> list[dict]:
|
||||||
|
"""Parse _sync_log.md table rows into dicts, returned newest-first."""
|
||||||
|
log_path = VAULT_DIR / "_sync_log.md"
|
||||||
|
if not log_path.exists():
|
||||||
|
return []
|
||||||
|
|
||||||
|
entries: list[dict] = []
|
||||||
|
past_header = False
|
||||||
|
|
||||||
|
for line in log_path.read_text().splitlines():
|
||||||
|
stripped = line.strip()
|
||||||
|
if not stripped.startswith("|"):
|
||||||
|
continue
|
||||||
|
# Separator row
|
||||||
|
if re.match(r"^\|[-| :]+\|$", stripped):
|
||||||
|
past_header = True
|
||||||
|
continue
|
||||||
|
if not past_header:
|
||||||
|
continue # skip header row
|
||||||
|
cells = [c.strip() for c in stripped.strip("|").split("|")]
|
||||||
|
if len(cells) >= 4:
|
||||||
|
entries.append({
|
||||||
|
"timestamp": cells[0],
|
||||||
|
"direction": cells[1],
|
||||||
|
"files": cells[2],
|
||||||
|
"status": cells[3],
|
||||||
|
})
|
||||||
|
|
||||||
|
entries.reverse() # newest first
|
||||||
|
return entries
|
||||||
|
|
||||||
|
|
||||||
|
def _get_vault_status() -> dict:
|
||||||
|
conflict_files = _get_conflict_files()
|
||||||
|
pending_count = _get_pending_count()
|
||||||
|
|
||||||
|
if conflict_files:
|
||||||
|
vault_status = "conflict"
|
||||||
|
elif pending_count > 0:
|
||||||
|
vault_status = "dirty"
|
||||||
|
else:
|
||||||
|
vault_status = "clean"
|
||||||
|
|
||||||
|
last_pull: Optional[dict] = None
|
||||||
|
last_push: Optional[dict] = None
|
||||||
|
for e in _parse_sync_log():
|
||||||
|
if last_pull is None and e.get("direction") == "pull":
|
||||||
|
last_pull = e
|
||||||
|
if last_push is None and e.get("direction") == "push":
|
||||||
|
last_push = e
|
||||||
|
|
||||||
|
return {
|
||||||
|
"vault_status": vault_status,
|
||||||
|
"pending_count": pending_count,
|
||||||
|
"conflicts": len(conflict_files),
|
||||||
|
"last_pull": last_pull,
|
||||||
|
"last_push": last_push,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Vault file tree
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _build_tree_html(path: Path, depth: int = 0) -> str:
|
||||||
|
"""Recursively build a nested HTML tree for a directory, excluding .git."""
|
||||||
|
items = sorted(path.iterdir(), key=lambda p: (p.is_file(), p.name.lower()))
|
||||||
|
indent = 16 if depth > 0 else 0
|
||||||
|
html = f'<ul style="margin:0;padding-left:{indent}px;list-style:none">'
|
||||||
|
for item in items:
|
||||||
|
if item.name == ".git":
|
||||||
|
continue
|
||||||
|
if item.is_dir():
|
||||||
|
open_attr = " open" if depth == 0 else ""
|
||||||
|
html += (
|
||||||
|
f'<li><details{open_attr}>'
|
||||||
|
f'<summary style="cursor:pointer;padding:2px 0;user-select:none">'
|
||||||
|
f'<strong style="color:#444">{item.name}/</strong></summary>'
|
||||||
|
f'{_build_tree_html(item, depth + 1)}</details></li>'
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
try:
|
||||||
|
sz = item.stat().st_size
|
||||||
|
size_str = f"{sz / 1024:.1f} KB" if sz >= 1024 else f"{sz} B"
|
||||||
|
except OSError:
|
||||||
|
size_str = "?"
|
||||||
|
html += (
|
||||||
|
f'<li style="padding:2px 0">'
|
||||||
|
f'<code style="font-size:.85rem">{item.name}</code> '
|
||||||
|
f'<small style="color:#aaa">{size_str}</small></li>'
|
||||||
|
)
|
||||||
|
html += "</ul>"
|
||||||
|
return html
|
||||||
|
|
||||||
|
|
||||||
|
def _get_vault_tree_html() -> str:
|
||||||
|
if not VAULT_DIR.exists():
|
||||||
|
return "<em style='color:#999'>Vault directory not found.</em>"
|
||||||
|
try:
|
||||||
|
return _build_tree_html(VAULT_DIR)
|
||||||
|
except Exception as exc:
|
||||||
|
return f"<em style='color:#dc3545'>Error reading vault: {exc}</em>"
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Async subprocess + job runner
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
async def spawn_sync_subprocess(command: str) -> asyncio.subprocess.Process:
|
||||||
|
"""
|
||||||
|
Run outline_sync.py <command> directly — we are already inside the container.
|
||||||
|
Patched in tests.
|
||||||
|
"""
|
||||||
|
return await asyncio.create_subprocess_exec(
|
||||||
|
"python3", "/work/outline_sync.py", command,
|
||||||
|
"--vault", str(VAULT_DIR),
|
||||||
|
"--settings", str(SETTINGS_PATH),
|
||||||
|
stdout=asyncio.subprocess.PIPE,
|
||||||
|
stderr=asyncio.subprocess.STDOUT,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
async def run_sync_job(job_id: str, command: str) -> None:
|
||||||
|
"""
|
||||||
|
Execute a sync job, streaming output into _jobs[job_id]["output"].
|
||||||
|
Patched in tests via: patch("webui.run_sync_job", new_callable=AsyncMock).
|
||||||
|
"""
|
||||||
|
global _active_job
|
||||||
|
try:
|
||||||
|
proc = await spawn_sync_subprocess(command)
|
||||||
|
summary_line = ""
|
||||||
|
async for raw in proc.stdout:
|
||||||
|
text = raw.decode(errors="replace").rstrip()
|
||||||
|
_jobs[job_id]["output"].append({"type": "log", "message": text})
|
||||||
|
if text.startswith("Done."):
|
||||||
|
summary_line = text
|
||||||
|
await proc.wait()
|
||||||
|
success = proc.returncode == 0
|
||||||
|
_jobs[job_id]["output"].append({
|
||||||
|
"type": "done",
|
||||||
|
"success": success,
|
||||||
|
"message": summary_line or ("Sync completed." if success else "Sync failed."),
|
||||||
|
})
|
||||||
|
_jobs[job_id]["status"] = "done" if success else "error"
|
||||||
|
except Exception as exc:
|
||||||
|
_jobs[job_id]["output"].append({"type": "done", "success": False, "message": str(exc)})
|
||||||
|
_jobs[job_id]["status"] = "error"
|
||||||
|
|
||||||
|
|
||||||
|
def _new_job(command: str) -> str:
|
||||||
|
"""
|
||||||
|
Register a new job. Status is 'pending' until the SSE stream connects and
|
||||||
|
starts it. This avoids asyncio background tasks that cause test hangs.
|
||||||
|
"""
|
||||||
|
global _active_job
|
||||||
|
job_id = str(uuid.uuid4())
|
||||||
|
_jobs[job_id] = {"status": "pending", "output": [], "command": command}
|
||||||
|
_active_job = job_id
|
||||||
|
return job_id
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Diff renderer
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _render_diff_html(outline_text: str, main_text: str, filename: str) -> str:
|
||||||
|
table = difflib.HtmlDiff(wrapcolumn=80).make_table(
|
||||||
|
outline_text.splitlines(),
|
||||||
|
main_text.splitlines(),
|
||||||
|
fromdesc="Outline's version",
|
||||||
|
todesc="Your version (Obsidian)",
|
||||||
|
context=True,
|
||||||
|
numlines=3,
|
||||||
|
)
|
||||||
|
return f"""<style>
|
||||||
|
.diff-wrap{{font-family:monospace;font-size:13px;overflow-x:auto}}
|
||||||
|
.diff{{width:100%;border-collapse:collapse}}
|
||||||
|
.diff td{{padding:2px 6px;white-space:pre-wrap;word-break:break-all}}
|
||||||
|
.diff_header{{background:#e0e0e0;font-weight:bold}}
|
||||||
|
td.diff_header{{text-align:right}}
|
||||||
|
.diff_next{{background:#c0c0c0}}
|
||||||
|
.diff_add{{background:#aaffaa}}
|
||||||
|
.diff_chg{{background:#ffff77}}
|
||||||
|
.diff_sub{{background:#ffaaaa}}
|
||||||
|
</style>
|
||||||
|
<div class="diff-wrap"><h4 style="margin:0 0 8px;font-size:14px">Diff: {filename}</h4>{table}</div>"""
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# HTML helpers
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
_BADGE = {
|
||||||
|
"clean": '<span class="badge clean">Clean</span>',
|
||||||
|
"dirty": '<span class="badge dirty">Pending Changes</span>',
|
||||||
|
"conflict": '<span class="badge conflict">Conflicts!</span>',
|
||||||
|
}
|
||||||
|
|
||||||
|
_BASE_CSS = """
|
||||||
|
*{box-sizing:border-box}
|
||||||
|
body{font-family:system-ui,sans-serif;margin:0;background:#f5f5f5;color:#222}
|
||||||
|
header{background:#1a1a2e;color:#eee;padding:12px 24px;display:flex;justify-content:space-between;align-items:center}
|
||||||
|
header h1{margin:0;font-size:1.2rem;letter-spacing:.05em}
|
||||||
|
header nav a{color:#aac;text-decoration:none;margin-left:18px;font-size:.9rem}
|
||||||
|
header nav a:hover{color:#fff}
|
||||||
|
main{max-width:880px;margin:32px auto;padding:0 16px}
|
||||||
|
.card{background:#fff;border-radius:8px;padding:20px 24px;margin-bottom:16px;box-shadow:0 1px 4px rgba(0,0,0,.08)}
|
||||||
|
h2{margin:0 0 16px;font-size:1.1rem}
|
||||||
|
.badge{display:inline-block;padding:3px 10px;border-radius:12px;font-size:.8rem;font-weight:600}
|
||||||
|
.badge.clean{background:#d4edda;color:#155724}
|
||||||
|
.badge.dirty{background:#fff3cd;color:#856404}
|
||||||
|
.badge.conflict{background:#f8d7da;color:#721c24}
|
||||||
|
.grid{display:grid;grid-template-columns:1fr 1fr;gap:10px;margin-bottom:16px}
|
||||||
|
.stat{background:#f8f9fa;border-radius:6px;padding:12px}
|
||||||
|
.stat label{font-size:.75rem;color:#666;display:block;margin-bottom:4px}
|
||||||
|
.btn{display:inline-block;padding:10px 22px;border-radius:6px;border:none;cursor:pointer;font-size:.9rem;font-weight:600;text-decoration:none;transition:opacity .15s}
|
||||||
|
.btn:hover{opacity:.85}
|
||||||
|
.btn-primary{background:#0066cc;color:#fff}
|
||||||
|
.btn-success{background:#198754;color:#fff}
|
||||||
|
.btn-danger{background:#dc3545;color:#fff}
|
||||||
|
.btn-secondary{background:#6c757d;color:#fff}
|
||||||
|
.btn:disabled,.btn[disabled]{opacity:.5;cursor:not-allowed;pointer-events:none}
|
||||||
|
.row{display:flex;gap:12px;flex-wrap:wrap;align-items:center}
|
||||||
|
.alert{padding:12px 16px;border-radius:6px;margin-bottom:12px}
|
||||||
|
.alert-warn{background:#fff3cd;border:1px solid #ffc107;color:#664d03}
|
||||||
|
#output{background:#1a1a2e;color:#d0d0e0;border-radius:8px;padding:16px;font-family:monospace;font-size:.85rem;min-height:60px;max-height:420px;overflow-y:auto;margin-top:16px;display:none}
|
||||||
|
#output .ln{padding:1px 0}
|
||||||
|
#output .ln.ok{color:#6ee7b7}
|
||||||
|
#output .ln.err{color:#fca5a5}
|
||||||
|
#output .ln.done{color:#93c5fd;font-weight:600;border-top:1px solid #333;margin-top:8px;padding-top:8px}
|
||||||
|
table{width:100%;border-collapse:collapse}
|
||||||
|
th,td{text-align:left;padding:8px 12px;border-bottom:1px solid #eee;font-size:.9rem}
|
||||||
|
th{background:#f8f9fa;font-weight:600}
|
||||||
|
tr:last-child td{border-bottom:none}
|
||||||
|
.tag{display:inline-block;padding:2px 8px;border-radius:4px;font-size:.75rem;font-weight:600}
|
||||||
|
.tag-modified{background:#fff3cd;color:#856404}
|
||||||
|
.tag-added{background:#d4edda;color:#155724}
|
||||||
|
.tag-deleted{background:#f8d7da;color:#721c24}
|
||||||
|
.tag-renamed{background:#cce5ff;color:#004085}
|
||||||
|
.tag-skip{background:#e2e3e5;color:#383d41}
|
||||||
|
.conflict-card{border:1px solid #ffc107;border-radius:6px;padding:14px;margin-bottom:12px}
|
||||||
|
.conflict-card h3{margin:0 0 10px;font-size:.95rem;font-family:monospace}
|
||||||
|
.diff-container{margin-top:10px}
|
||||||
|
"""
|
||||||
|
|
||||||
|
_SCRIPT = r"""
|
||||||
|
async function doSync(endpoint, label) {
|
||||||
|
const btn = event.currentTarget;
|
||||||
|
btn.disabled = true;
|
||||||
|
const r = await fetch(endpoint, {method:'POST'});
|
||||||
|
if (r.status === 409) {
|
||||||
|
const d = await r.json();
|
||||||
|
alert(d.detail || 'A job is already running or conflicts exist.');
|
||||||
|
btn.disabled = false; return;
|
||||||
|
}
|
||||||
|
if (!r.ok) { alert('Error ' + r.status); btn.disabled = false; return; }
|
||||||
|
const d = await r.json();
|
||||||
|
const panel = document.getElementById('output');
|
||||||
|
panel.style.display = 'block';
|
||||||
|
panel.innerHTML = '<div class="ln">' + label + '…</div>';
|
||||||
|
const src = new EventSource('/stream/' + d.job_id);
|
||||||
|
src.onmessage = e => {
|
||||||
|
const ev = JSON.parse(e.data);
|
||||||
|
if (ev.type === 'done') {
|
||||||
|
const div = document.createElement('div');
|
||||||
|
div.className = 'ln done';
|
||||||
|
div.textContent = ev.message || 'Done.';
|
||||||
|
panel.appendChild(div);
|
||||||
|
src.close();
|
||||||
|
setTimeout(() => location.reload(), 1800);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
const div = document.createElement('div');
|
||||||
|
div.className = 'ln' + (ev.type==='error'?' err': ev.message&&ev.message.startsWith('ok:')?' ok':'');
|
||||||
|
div.textContent = ev.message || JSON.stringify(ev);
|
||||||
|
panel.appendChild(div);
|
||||||
|
panel.scrollTop = panel.scrollHeight;
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
function toggleDiff(path, encPath) {
|
||||||
|
const id = 'diff_' + path.replace(/[^a-zA-Z0-9]/g,'_');
|
||||||
|
const el = document.getElementById(id);
|
||||||
|
if (!el) return;
|
||||||
|
if (el.dataset.loaded) { el.style.display = el.style.display==='none'?'block':'none'; return; }
|
||||||
|
el.style.display = 'block';
|
||||||
|
el.innerHTML = '<em>Loading…</em>';
|
||||||
|
fetch('/diff/' + encPath).then(r=>r.text()).then(h=>{el.innerHTML=h;el.dataset.loaded='1'});
|
||||||
|
}
|
||||||
|
|
||||||
|
async function resolve(path, accept) {
|
||||||
|
const r = await fetch('/resolve',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify({file:path,accept:accept})});
|
||||||
|
if (!r.ok) { const d=await r.json().catch(()=>({})); alert(d.detail||'Error'); return; }
|
||||||
|
const id = 'cc_' + path.replace(/[^a-zA-Z0-9]/g,'_');
|
||||||
|
document.getElementById(id)?.remove();
|
||||||
|
if (!document.querySelector('.conflict-card')) {
|
||||||
|
document.getElementById('cc-list').style.display='none';
|
||||||
|
document.getElementById('cc-none').style.display='block';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
def _page(title: str, body: str) -> str:
|
||||||
|
return f"""<!DOCTYPE html>
|
||||||
|
<html lang="en"><head><meta charset="UTF-8">
|
||||||
|
<meta name="viewport" content="width=device-width,initial-scale=1">
|
||||||
|
<title>{title} — Outline Sync</title>
|
||||||
|
<style>{_BASE_CSS}</style></head>
|
||||||
|
<body>
|
||||||
|
<header>
|
||||||
|
<h1>Outline Sync</h1>
|
||||||
|
<nav>
|
||||||
|
<a href="/">Dashboard</a>
|
||||||
|
<a href="/changes">Changes</a>
|
||||||
|
<a href="/conflicts">Conflicts</a>
|
||||||
|
<a href="/history">History</a>
|
||||||
|
<a href="/files">Files</a>
|
||||||
|
</nav>
|
||||||
|
</header>
|
||||||
|
<main>{body}</main>
|
||||||
|
<script>{_SCRIPT}</script>
|
||||||
|
</body></html>"""
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Phase B — Dashboard
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
def _sync_entry_html(entry: Optional[dict]) -> str:
|
||||||
|
"""Render a compact last-sync result block for the dashboard."""
|
||||||
|
if not entry:
|
||||||
|
return "<span style='color:#999'>—</span>"
|
||||||
|
err = "error" in entry.get("status", "").lower()
|
||||||
|
status_color = "#dc3545" if err else "#198754"
|
||||||
|
return (
|
||||||
|
f'<span>{entry["timestamp"]}</span><br>'
|
||||||
|
f'<small style="color:#666">{entry.get("files","?")} file(s)</small> '
|
||||||
|
f'<small style="color:{status_color};font-weight:600">{entry.get("status","?")}</small>'
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@app.get("/", response_class=HTMLResponse)
|
||||||
|
async def dashboard():
|
||||||
|
s = _get_vault_status()
|
||||||
|
badge = _BADGE.get(s["vault_status"], s["vault_status"])
|
||||||
|
warn = ""
|
||||||
|
if s["conflicts"] > 0:
|
||||||
|
warn = f'<div class="alert alert-warn">⚠ {s["conflicts"]} conflict(s) — resolve before pushing. <a href="/conflicts">Resolve →</a></div>'
|
||||||
|
|
||||||
|
pending = s["pending_count"]
|
||||||
|
push_label = f"Send to Outline ({pending} pending)" if pending else "Send to Outline"
|
||||||
|
push_dis = " disabled" if s["conflicts"] > 0 else ""
|
||||||
|
|
||||||
|
pull_html = _sync_entry_html(s["last_pull"])
|
||||||
|
push_html = _sync_entry_html(s["last_push"])
|
||||||
|
|
||||||
|
body = f"""
|
||||||
|
<div class="card">
|
||||||
|
<h2>Vault Status</h2>
|
||||||
|
{warn}
|
||||||
|
<div class="grid">
|
||||||
|
<div class="stat"><label>Status</label><span>{badge}</span></div>
|
||||||
|
<div class="stat"><label>Pending local changes</label><strong>{pending}</strong></div>
|
||||||
|
<div class="stat"><label>Last pull</label>{pull_html}</div>
|
||||||
|
<div class="stat"><label>Last push</label>{push_html}</div>
|
||||||
|
</div>
|
||||||
|
<div class="row">
|
||||||
|
<button class="btn btn-primary" onclick="doSync('/pull','Pulling from Outline')">Get from Outline</button>
|
||||||
|
<button class="btn btn-success" onclick="doSync('/push','Sending to Outline')"{push_dis}>{push_label}</button>
|
||||||
|
<a href="/changes" class="btn btn-secondary">Preview Changes</a>
|
||||||
|
</div>
|
||||||
|
<div id="output"></div>
|
||||||
|
</div>"""
|
||||||
|
return HTMLResponse(_page("Dashboard", body))
|
||||||
|
|
||||||
|
|
||||||
|
@app.get("/status")
|
||||||
|
async def vault_status():
|
||||||
|
s = _get_vault_status()
|
||||||
|
# Flatten last_pull / last_push to timestamps for backward-compat JSON consumers
|
||||||
|
return JSONResponse({
|
||||||
|
**s,
|
||||||
|
"last_pull": s["last_pull"]["timestamp"] if s["last_pull"] else None,
|
||||||
|
"last_push": s["last_push"]["timestamp"] if s["last_push"] else None,
|
||||||
|
})
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Phase C/E — Pull & Push
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@app.post("/pull")
|
||||||
|
async def start_pull():
|
||||||
|
if _active_job is not None:
|
||||||
|
raise HTTPException(status_code=409, detail="A sync job is already running")
|
||||||
|
job_id = _new_job("pull")
|
||||||
|
return {"job_id": job_id, "stream_url": f"/stream/{job_id}"}
|
||||||
|
|
||||||
|
|
||||||
|
@app.post("/push")
|
||||||
|
async def start_push():
|
||||||
|
if _active_job is not None:
|
||||||
|
raise HTTPException(status_code=409, detail="A sync job is already running")
|
||||||
|
conflicts = _get_conflict_files()
|
||||||
|
if conflicts:
|
||||||
|
return JSONResponse(
|
||||||
|
status_code=409,
|
||||||
|
content={
|
||||||
|
"detail": "Unresolved conflicts must be resolved before pushing",
|
||||||
|
"conflicts": conflicts,
|
||||||
|
"message": "Resolve conflicts before pushing",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
job_id = _new_job("push")
|
||||||
|
return {"job_id": job_id, "stream_url": f"/stream/{job_id}"}
|
||||||
|
|
||||||
|
|
||||||
|
@app.get("/stream/{job_id}")
|
||||||
|
async def stream_job(job_id: str):
|
||||||
|
if job_id not in _jobs:
|
||||||
|
raise HTTPException(status_code=404, detail="Job not found")
|
||||||
|
|
||||||
|
job = _jobs[job_id]
|
||||||
|
|
||||||
|
async def _generate():
|
||||||
|
global _active_job
|
||||||
|
# Start the job the moment the first client connects to the stream.
|
||||||
|
if job["status"] == "pending":
|
||||||
|
job["status"] = "running"
|
||||||
|
try:
|
||||||
|
await run_sync_job(job_id, job["command"])
|
||||||
|
except Exception as exc:
|
||||||
|
job["output"].append({"type": "done", "success": False, "message": str(exc)})
|
||||||
|
job["status"] = "error"
|
||||||
|
finally:
|
||||||
|
_active_job = None
|
||||||
|
|
||||||
|
# Stream all buffered output (job already ran inline above).
|
||||||
|
for event in job["output"]:
|
||||||
|
yield f"data: {json.dumps(event)}\n\n"
|
||||||
|
if event.get("type") == "done":
|
||||||
|
return
|
||||||
|
|
||||||
|
# Fallback: if job was already running when we connected, poll for new output.
|
||||||
|
cursor = 0
|
||||||
|
while True:
|
||||||
|
buf = job.get("output", [])
|
||||||
|
while cursor < len(buf):
|
||||||
|
yield f"data: {json.dumps(buf[cursor])}\n\n"
|
||||||
|
if buf[cursor].get("type") == "done":
|
||||||
|
return
|
||||||
|
cursor += 1
|
||||||
|
if job.get("status") in ("done", "error") and cursor >= len(buf):
|
||||||
|
return
|
||||||
|
await asyncio.sleep(0.05)
|
||||||
|
|
||||||
|
return StreamingResponse(
|
||||||
|
_generate(),
|
||||||
|
media_type="text/event-stream",
|
||||||
|
headers={"Cache-Control": "no-cache", "X-Accel-Buffering": "no"},
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Phase D — Pending Changes & Diff
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@app.get("/changes")
|
||||||
|
async def changes(request: Request):
|
||||||
|
items = _get_pending_changes()
|
||||||
|
if "text/html" not in request.headers.get("accept", ""):
|
||||||
|
return JSONResponse(items)
|
||||||
|
|
||||||
|
if not items:
|
||||||
|
rows = '<tr><td colspan="3" style="color:#999;text-align:center;padding:20px">No pending changes — vault is in sync.</td></tr>'
|
||||||
|
else:
|
||||||
|
rows = ""
|
||||||
|
for item in items:
|
||||||
|
st = item["status"]
|
||||||
|
tag = f'<span class="tag tag-{st}">{st}</span>'
|
||||||
|
path = item["path"]
|
||||||
|
safe = re.sub(r"[^a-zA-Z0-9]", "_", path)
|
||||||
|
enc = base64.urlsafe_b64encode(path.encode()).decode().rstrip("=")
|
||||||
|
|
||||||
|
display = f'{item.get("from_path","")} → {item.get("to_path",path)}' if st == "renamed" else path
|
||||||
|
action = item.get("action", "")
|
||||||
|
if st == "deleted" and action == "skip":
|
||||||
|
act_cell = f'<span class="tag tag-skip">skip</span> <small style="color:#999">{item.get("reason","")}</small>'
|
||||||
|
else:
|
||||||
|
act_cell = action
|
||||||
|
|
||||||
|
diff_btn = ""
|
||||||
|
if st == "modified":
|
||||||
|
diff_btn = f'<br><a href="#" onclick="toggleDiff(\'{path}\',\'{enc}\');return false" style="font-size:.8rem;color:#0066cc">preview diff</a><div id="diff_{safe}" class="diff-container" style="display:none"></div>'
|
||||||
|
|
||||||
|
rows += f"<tr><td><code>{display}</code>{diff_btn}</td><td>{tag}</td><td>{act_cell}</td></tr>"
|
||||||
|
|
||||||
|
body = f"""
|
||||||
|
<div class="card">
|
||||||
|
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:12px">
|
||||||
|
<h2 style="margin:0">Pending Changes ({len(items)})</h2>
|
||||||
|
<button class="btn btn-success" onclick="doSync('/push','Sending to Outline')">Send to Outline</button>
|
||||||
|
</div>
|
||||||
|
<table>
|
||||||
|
<thead><tr><th>File</th><th>Status</th><th>Action</th></tr></thead>
|
||||||
|
<tbody>{rows}</tbody>
|
||||||
|
</table>
|
||||||
|
<div id="output"></div>
|
||||||
|
</div>"""
|
||||||
|
return HTMLResponse(_page("Pending Changes", body))
|
||||||
|
|
||||||
|
|
||||||
|
@app.get("/diff/{encoded_path}", response_class=HTMLResponse)
|
||||||
|
async def get_diff(encoded_path: str):
|
||||||
|
try:
|
||||||
|
padded = encoded_path + "=" * (-len(encoded_path) % 4)
|
||||||
|
path = base64.urlsafe_b64decode(padded.encode()).decode()
|
||||||
|
except Exception:
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid path encoding")
|
||||||
|
|
||||||
|
if ".." in path or path.startswith("/"):
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid path")
|
||||||
|
|
||||||
|
r_outline = _git("show", f"outline:{path}")
|
||||||
|
outline_text = r_outline.stdout if r_outline.returncode == 0 else ""
|
||||||
|
|
||||||
|
r_main = _git("show", f"HEAD:{path}")
|
||||||
|
if r_main.returncode != 0:
|
||||||
|
raise HTTPException(status_code=404, detail="File not found in main branch")
|
||||||
|
|
||||||
|
return HTMLResponse(_render_diff_html(outline_text, r_main.stdout, Path(path).name))
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Phase F — Conflict Resolution
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
class ResolveRequest(BaseModel):
|
||||||
|
file: str
|
||||||
|
accept: str
|
||||||
|
|
||||||
|
@field_validator("accept")
|
||||||
|
@classmethod
|
||||||
|
def _check_accept(cls, v: str) -> str:
|
||||||
|
if v not in ("local", "remote"):
|
||||||
|
raise ValueError("accept must be 'local' or 'remote'")
|
||||||
|
return v
|
||||||
|
|
||||||
|
@field_validator("file")
|
||||||
|
@classmethod
|
||||||
|
def _check_file(cls, v: str) -> str:
|
||||||
|
if ".." in v or v.startswith("/"):
|
||||||
|
raise ValueError("Path traversal not allowed")
|
||||||
|
return v
|
||||||
|
|
||||||
|
|
||||||
|
@app.get("/conflicts")
|
||||||
|
async def list_conflicts(request: Request):
|
||||||
|
conflict_paths = _get_conflict_files()
|
||||||
|
if "text/html" not in request.headers.get("accept", ""):
|
||||||
|
return JSONResponse([{"path": p} for p in conflict_paths])
|
||||||
|
|
||||||
|
if not conflict_paths:
|
||||||
|
inner = '<p style="color:#155724">All conflicts resolved. You can now push.</p><a href="/" class="btn btn-success">Back to Dashboard</a>'
|
||||||
|
cc_none_display, cc_list_display = "block", "none"
|
||||||
|
cards = ""
|
||||||
|
else:
|
||||||
|
cc_none_display, cc_list_display = "none", "block"
|
||||||
|
cards = ""
|
||||||
|
for path in conflict_paths:
|
||||||
|
safe = re.sub(r"[^a-zA-Z0-9]", "_", path)
|
||||||
|
enc = base64.urlsafe_b64encode(path.encode()).decode().rstrip("=")
|
||||||
|
cards += f"""
|
||||||
|
<div class="conflict-card" id="cc_{safe}">
|
||||||
|
<h3>{path}</h3>
|
||||||
|
<div class="row">
|
||||||
|
<a href="#" class="btn btn-secondary" style="font-size:.85rem" onclick="toggleDiff('{path}','{enc}');return false">Show Diff</a>
|
||||||
|
<button class="btn btn-primary" onclick="resolve('{path}','local')">Keep Mine</button>
|
||||||
|
<button class="btn btn-danger" onclick="resolve('{path}','remote')">Keep Outline's</button>
|
||||||
|
</div>
|
||||||
|
<div id="diff_{safe}" class="diff-container" style="display:none"></div>
|
||||||
|
</div>"""
|
||||||
|
inner = ""
|
||||||
|
|
||||||
|
body = f"""
|
||||||
|
<div class="card">
|
||||||
|
<h2>Version Conflicts ({len(conflict_paths)})</h2>
|
||||||
|
<p style="color:#666;margin-top:0">Same document edited in both Obsidian and Outline.</p>
|
||||||
|
<div id="cc-none" style="display:{cc_none_display}">{inner if not conflict_paths else '<p style="color:#155724">All conflicts resolved.</p><a href="/" class="btn btn-success">Back to Dashboard</a>'}</div>
|
||||||
|
<div id="cc-list" style="display:{cc_list_display}">{cards}</div>
|
||||||
|
</div>"""
|
||||||
|
return HTMLResponse(_page("Conflicts", body))
|
||||||
|
|
||||||
|
|
||||||
|
@app.post("/resolve")
|
||||||
|
async def resolve_conflict(req: ResolveRequest):
|
||||||
|
conflict_paths = _get_conflict_files()
|
||||||
|
if req.file not in conflict_paths:
|
||||||
|
raise HTTPException(status_code=404, detail=f"'{req.file}' is not in the conflict list")
|
||||||
|
|
||||||
|
side = "--ours" if req.accept == "local" else "--theirs"
|
||||||
|
r = _git("checkout", side, req.file)
|
||||||
|
if r.returncode != 0:
|
||||||
|
raise HTTPException(status_code=500, detail=f"git checkout failed: {r.stderr.strip()}")
|
||||||
|
|
||||||
|
_git("add", req.file)
|
||||||
|
_git("commit", "-m", f"resolve({req.accept}): {req.file}",
|
||||||
|
"--author", "Outline Sync UI <sync@local>")
|
||||||
|
|
||||||
|
return {"ok": True, "file": req.file, "accepted": req.accept}
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Phase G — Sync History
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@app.get("/history")
|
||||||
|
async def sync_history(request: Request):
|
||||||
|
entries = _parse_sync_log()
|
||||||
|
fmt = request.query_params.get("format")
|
||||||
|
|
||||||
|
if fmt == "json":
|
||||||
|
return JSONResponse(entries)
|
||||||
|
if "application/json" in request.headers.get("accept", "") \
|
||||||
|
and "text/html" not in request.headers.get("accept", ""):
|
||||||
|
return JSONResponse(entries)
|
||||||
|
|
||||||
|
if not entries:
|
||||||
|
table_body = '<p style="color:#999;text-align:center;padding:24px">No sync history yet.</p>'
|
||||||
|
else:
|
||||||
|
rows = ""
|
||||||
|
for e in entries:
|
||||||
|
err = "error" in e.get("status", "").lower()
|
||||||
|
st_style = ' style="color:#dc3545;font-weight:600"' if err else ""
|
||||||
|
icon = "↓ pull" if e.get("direction") == "pull" else "↑ push"
|
||||||
|
rows += f"<tr><td>{e.get('timestamp','—')}</td><td>{icon}</td><td>{e.get('files','—')}</td><td{st_style}>{e.get('status','—')}</td></tr>"
|
||||||
|
table_body = f"<table><thead><tr><th>Timestamp</th><th>Direction</th><th>Files Changed</th><th>Status</th></tr></thead><tbody>{rows}</tbody></table>"
|
||||||
|
|
||||||
|
body = f'<div class="card"><h2>Sync History</h2>{table_body}</div>'
|
||||||
|
return HTMLResponse(_page("History", body))
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# File browser
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
@app.get("/files", response_class=HTMLResponse)
|
||||||
|
async def file_browser():
|
||||||
|
tree_html = _get_vault_tree_html()
|
||||||
|
# Count files (excluding .git)
|
||||||
|
file_count = 0
|
||||||
|
if VAULT_DIR.exists():
|
||||||
|
file_count = sum(
|
||||||
|
1 for p in VAULT_DIR.rglob("*")
|
||||||
|
if p.is_file() and ".git" not in p.parts
|
||||||
|
)
|
||||||
|
body = f"""
|
||||||
|
<div class="card">
|
||||||
|
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:12px">
|
||||||
|
<h2 style="margin:0">Vault Files ({file_count})</h2>
|
||||||
|
<small style="color:#999"><code>{VAULT_DIR}</code></small>
|
||||||
|
</div>
|
||||||
|
<div style="font-size:.9rem;line-height:1.6">{tree_html}</div>
|
||||||
|
</div>"""
|
||||||
|
return HTMLResponse(_page("Files", body))
|
||||||
|
|
||||||
|
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
# Entry point
|
||||||
|
# ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
import uvicorn
|
||||||
|
uvicorn.run("webui:app", host="0.0.0.0", port=8080, reload=False)
|
||||||
Reference in New Issue
Block a user