# Database Access

CommandLane stores all data in a local SQLite database. This guide explains how to access your data directly for backups, exports, or custom queries.
## Database Location

Default location: `pkb_data.db` in your project directory.

A custom location can be configured via:

```json
{
  "storage": {
    "sqlite_db_path": "/path/to/custom.db"
  }
}
```
## Connecting to the Database

### SQLite CLI

```sh
# Open database
sqlite3 pkb_data.db

# Enable headers and column mode for readability
.headers on
.mode column

# List all tables
.tables

# Show schema
.schema entries
```
### Python

```python
import sqlite3
from pathlib import Path

db_path = Path("pkb_data.db")
conn = sqlite3.connect(db_path)
conn.row_factory = sqlite3.Row  # Access columns by name

cursor = conn.execute("SELECT * FROM entries LIMIT 5")
for row in cursor:
    print(dict(row))

conn.close()
```
### DB Browser for SQLite

For a visual interface, use DB Browser for SQLite:

- Download and install
- File → Open Database
- Select `pkb_data.db`
## Database Schema

### entries

Main table storing all captured entries.

| Column | Type | Description |
|---|---|---|
| id | TEXT | Unique identifier (UUID) |
| body | TEXT | Entry content |
| entry_type | TEXT | `task`, `note`, or `decision` |
| status | TEXT | Task status (null for notes) |
| due | TEXT | Due date (ISO format) |
| tags | TEXT | JSON array of tags |
| source_app | TEXT | App the entry was captured from |
| source_title | TEXT | Window title at capture time |
| timestamp | TEXT | Creation timestamp (ISO) |
| updated_at | TEXT | Last update timestamp |
| notion_page_id | TEXT | Notion sync ID (if synced) |
| last_synced_to_notion | TEXT | Last Notion sync time |
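Since `tags` holds JSON text and the date columns hold ISO-8601 strings, a small decoding step turns a row into native Python values. A minimal sketch using an in-memory stand-in for the table (the sample values are illustrative):

```python
import json
import sqlite3
from datetime import datetime

# In-memory stand-in with a subset of the entries columns
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE entries (id TEXT, body TEXT, tags TEXT, due TEXT)")
conn.execute(
    "INSERT INTO entries VALUES (?, ?, ?, ?)",
    ("abc-123", "Review Q3 budget", '["project:finance", "urgent"]',
     "2025-01-15T09:00:00"),
)

row = conn.execute("SELECT * FROM entries").fetchone()
tags = json.loads(row["tags"])            # JSON text -> Python list
due = datetime.fromisoformat(row["due"])  # ISO text -> datetime
print(tags, due.date())
conn.close()
```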
### embeddings

Vector embeddings for semantic search.

| Column | Type | Description |
|---|---|---|
| id | TEXT | Entry ID (foreign key) |
| embedding | BLOB | Vector embedding data |
| model | TEXT | Embedding model used |
### conversations

Chat conversation metadata.

| Column | Type | Description |
|---|---|---|
| id | TEXT | Conversation UUID |
| title | TEXT | Conversation title |
| created_at | TEXT | Creation timestamp |
| updated_at | TEXT | Last message timestamp |
| provider | TEXT | AI provider used |
| model | TEXT | Model used |
### messages

Individual chat messages.

| Column | Type | Description |
|---|---|---|
| id | TEXT | Message UUID |
| conversation_id | TEXT | Parent conversation |
| role | TEXT | `user` or `assistant` |
| content | TEXT | Message text |
| created_at | TEXT | Timestamp |
### migration_metadata

Schema version tracking.

| Column | Type | Description |
|---|---|---|
| key | TEXT | Metadata key |
| value | TEXT | Metadata value |
## Useful Queries

### Recent Entries

```sql
SELECT body, entry_type, timestamp
FROM entries
ORDER BY timestamp DESC
LIMIT 10;
```
### Overdue Tasks

```sql
SELECT body, due, status
FROM entries
WHERE entry_type = 'task'
  AND due < datetime('now')
  AND status NOT IN ('done', 'archived')
ORDER BY due;
```
### Tasks Due Today

```sql
SELECT body, due, status
FROM entries
WHERE entry_type = 'task'
  AND date(due) = date('now')
ORDER BY due;
```
### Entries by Source App

```sql
SELECT source_app, COUNT(*) AS count
FROM entries
GROUP BY source_app
ORDER BY count DESC;
```
### Tasks by Status

```sql
SELECT status, COUNT(*) AS count
FROM entries
WHERE entry_type = 'task'
GROUP BY status;
```
### Weekly Activity

```sql
SELECT date(timestamp) AS day, COUNT(*) AS captures
FROM entries
WHERE timestamp > datetime('now', '-7 days')
GROUP BY date(timestamp)
ORDER BY day;
```
### Search by Tag

```sql
SELECT body, tags
FROM entries
WHERE tags LIKE '%"project:finance"%';
```
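The `LIKE` pattern works for simple lookups, but SQLite's bundled JSON1 functions can expand the `tags` array into rows, which gives exact tag matching and per-tag aggregation that `LIKE` cannot do. A minimal sketch against an in-memory stand-in, assuming your SQLite build includes JSON1 (it is compiled in by default in recent versions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (body TEXT, tags TEXT)")
conn.executemany(
    "INSERT INTO entries VALUES (?, ?)",
    [
        ("Budget review", '["project:finance"]'),
        ("Offsite notes", '["travel", "project:ops"]'),
    ],
)

# json_each expands each JSON array into rows, so tags are matched whole
matches = [
    r[0] for r in conn.execute(
        """
        SELECT e.body
        FROM entries e, json_each(e.tags)
        WHERE json_each.value = ?
        """,
        ("project:finance",),
    )
]

# The same trick supports counting entries per tag
tag_counts = conn.execute(
    """
    SELECT json_each.value AS tag, COUNT(*) AS n
    FROM entries, json_each(entries.tags)
    GROUP BY tag
    ORDER BY n DESC
    """
).fetchall()
print(matches, tag_counts)
conn.close()
```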
### Notion Sync Status

```sql
SELECT id, body, notion_page_id, last_synced_to_notion
FROM entries
WHERE notion_page_id IS NOT NULL
ORDER BY last_synced_to_notion DESC;
```
### Unsynced Entries

```sql
SELECT id, body, timestamp
FROM entries
WHERE notion_page_id IS NULL
  AND entry_type IN ('task', 'note')
ORDER BY timestamp DESC;
```
### Chat History

```sql
SELECT c.title, c.model, COUNT(m.id) AS messages
FROM conversations c
LEFT JOIN messages m ON m.conversation_id = c.id
GROUP BY c.id
ORDER BY c.updated_at DESC;
```
### Full Conversation

```sql
SELECT role, content, created_at
FROM messages
WHERE conversation_id = 'your-conversation-id'
ORDER BY created_at;
```
## Full-Text Search

If FTS5 is enabled, you can use full-text search:

```sql
-- Search entries
SELECT * FROM entries_fts
WHERE entries_fts MATCH 'budget meeting';

-- Ranked search
SELECT *, rank
FROM entries_fts
WHERE entries_fts MATCH 'project planning'
ORDER BY rank;
```
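To find out whether the FTS table exists before querying it, check `sqlite_master`, where FTS5 virtual tables are listed like ordinary tables. A minimal sketch against an in-memory stand-in (the `entries_fts(body)` definition here is an assumption, not CommandLane's actual FTS schema), assuming your Python build's SQLite includes FTS5:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical stand-in for how entries_fts might be defined
conn.execute("CREATE VIRTUAL TABLE entries_fts USING fts5(body)")
conn.execute("INSERT INTO entries_fts VALUES ('budget meeting notes')")

# Probe sqlite_master before querying, so missing FTS fails gracefully
exists = conn.execute(
    "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = 'entries_fts'"
).fetchone() is not None
print("entries_fts present:", exists)

hits = []
if exists:
    hits = conn.execute(
        "SELECT body FROM entries_fts WHERE entries_fts MATCH 'budget'"
    ).fetchall()
print(hits)
conn.close()
```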
## Backup Strategies

### Simple Copy

```powershell
# Stop CommandLane first, then:
Copy-Item pkb_data.db pkb_data_backup_$(Get-Date -Format "yyyy-MM-dd").db
```

### SQLite Backup Command

```sh
sqlite3 pkb_data.db ".backup backup.db"
```
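The same online backup can be driven from Python via `sqlite3.Connection.backup` (available since Python 3.7), which uses SQLite's backup API and therefore yields a consistent copy even if the source database is in use. A minimal sketch (the `backup.db` filename is just an example):

```python
import sqlite3

# Connection.backup copies the whole database page by page through
# SQLite's online backup API
src = sqlite3.connect("pkb_data.db")
dst = sqlite3.connect("backup.db")
src.backup(dst)
dst.close()
src.close()
```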
### Automated Daily Backup

Create a PowerShell script `backup-db.ps1`:

```powershell
$source = "$env:USERPROFILE\.cmdlane\pkb_data.db"
$dest = "$env:USERPROFILE\.cmdlane\backups"
$date = Get-Date -Format "yyyy-MM-dd"

New-Item -ItemType Directory -Path $dest -Force

Copy-Item $source "$dest\pkb_data_$date.db"

# Keep only the last 7 backups
Get-ChildItem $dest | Sort-Object CreationTime -Descending |
    Select-Object -Skip 7 | Remove-Item
```

Schedule with Task Scheduler for daily backups.
## Export to JSON

Export all entries to JSON:

```python
import sqlite3
import json
from pathlib import Path

conn = sqlite3.connect("pkb_data.db")
conn.row_factory = sqlite3.Row

cursor = conn.execute("SELECT * FROM entries")
entries = [dict(row) for row in cursor]

Path("entries_export.json").write_text(
    json.dumps(entries, indent=2, default=str)
)
print(f"Exported {len(entries)} entries")
```
## Export to Markdown

```python
import sqlite3
from pathlib import Path

conn = sqlite3.connect("pkb_data.db")
conn.row_factory = sqlite3.Row

cursor = conn.execute("""
    SELECT * FROM entries
    WHERE entry_type = 'task'
    ORDER BY timestamp DESC
""")

output = ["# Tasks Export\n"]
for row in cursor:
    status = "x" if row["status"] == "done" else " "
    output.append(f"- [{status}] {row['body']}")
    if row["due"]:
        output.append(f"  - Due: {row['due']}")

Path("tasks_export.md").write_text("\n".join(output))
```
## Data Integrity

### Check Database Integrity

```sql
PRAGMA integrity_check;
```

### Vacuum Database

Reclaim space after deletions:

```sql
VACUUM;
```

### Analyze for Query Optimization

```sql
ANALYZE;
```
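All three statements can also be run from Python, which is handy for a scripted health check. A minimal sketch; note that `integrity_check` returns a single `ok` row when the database is healthy and one row per problem otherwise:

```python
import sqlite3

conn = sqlite3.connect("pkb_data.db")

# A healthy database yields exactly one row containing 'ok'
result = conn.execute("PRAGMA integrity_check").fetchall()
healthy = result == [("ok",)]
print("healthy:", healthy)

# VACUUM and ANALYZE can be issued the same way
conn.execute("VACUUM")
conn.execute("ANALYZE")
conn.close()
```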
## Safety Notes

For most use cases, treat the database as read-only. Modifications should go through CommandLane's API to maintain data integrity.

### Safe Modifications

If you must modify data directly:

- **Backup first**: Always back up before making changes
- **Stop CommandLane**: Close the app to avoid conflicts
- **Use transactions**: Wrap changes in `BEGIN`/`COMMIT`
- **Validate**: Check results before committing
```sql
BEGIN TRANSACTION;

UPDATE entries SET status = 'done' WHERE id = 'specific-id';

-- Verify the change
SELECT * FROM entries WHERE id = 'specific-id';

-- If correct:
COMMIT;
-- If wrong:
ROLLBACK;
```
## WAL Mode

CommandLane uses WAL (Write-Ahead Logging) mode for better concurrent access:

- Main database: `pkb_data.db`
- WAL file: `pkb_data.db-wal`
- SHM file: `pkb_data.db-shm`

When backing up, include all three files, or run `PRAGMA wal_checkpoint(TRUNCATE)` first to merge the WAL into the main database.
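The checkpoint-then-copy approach can be sketched in Python as follows. Note this is only safe while no other process is writing to the database; the snapshot filename is just an example:

```python
import shutil
import sqlite3

# Fold the WAL back into the main file so a plain file copy is complete
conn = sqlite3.connect("pkb_data.db")
conn.execute("PRAGMA wal_checkpoint(TRUNCATE)")
conn.close()

shutil.copy("pkb_data.db", "pkb_data_snapshot.db")
```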