Your books. Locked up properly.
No SOC 2 theater. No "bank-grade encryption" in 60-point type. This page explains, in plain words, what we actually do with your data, where it lives, and the things we haven't gotten around to yet.
Six things, done properly.
Instead of a wall of logos, here's what happens to your data from the moment it leaves your browser to the moment we restore it at 3am on a Tuesday.
Encrypted between your browser and ours.
Everything that leaves your machine runs over TLS 1.3. We're on the HSTS preload list, so even a bookmark you typed in 2019 upgrades to HTTPS before the connection opens. Certificates auto-rotate every 60 days through Let's Encrypt. Our cipher suites are the ones Mozilla calls "modern," and we cut off TLS 1.1 in 2023.
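If you prefer reading config to prose, here's roughly what that policy looks like. This is an illustrative TypeScript sketch, not our actual edge setup, and the certificate paths are made up:

```typescript
// Illustrative sketch only; not our real edge config, and the paths are stand-ins.
import { readFileSync } from "node:fs";
import { createServer } from "node:https";

const server = createServer(
  {
    key: readFileSync("/etc/ssl/private/example.key"),
    cert: readFileSync("/etc/ssl/certs/example.pem"),
    minVersion: "TLSv1.3", // refuse anything older than TLS 1.3
  },
  (req, res) => {
    // HSTS: the browser remembers to upgrade to HTTPS on its own;
    // "preload" is what gets a domain onto the preload list in the first place.
    res.setHeader(
      "Strict-Transport-Security",
      "max-age=63072000; includeSubDomains; preload"
    );
    res.writeHead(200);
    res.end("ok");
  }
);

server.listen(443);
```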
Encrypted on disk. Per-tenant keys.
Your invoices, clients, attachments, and time entries are encrypted with AES-256 before they touch the database. Every workspace has its own key, derived from a master key that lives in AWS KMS. If a disk image leaked, an attacker would get ciphertext and a shrug.
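For the curious, the pattern is plain envelope encryption. The sketch below is TypeScript rather than the PHP we actually run, and the key ARN, field names, and schema are stand-ins, but the shape is the same: one data key per workspace, generated under the master key in KMS, then AES-256-GCM on the payload before it hits the database.

```typescript
// Envelope-encryption sketch (the app itself is PHP); MASTER_KEY_ARN and the
// field layout are stand-ins, not our real schema.
import { KMSClient, GenerateDataKeyCommand } from "@aws-sdk/client-kms";
import { createCipheriv, randomBytes } from "node:crypto";

const kms = new KMSClient({ region: "eu-central-1" });

// One data key per workspace, generated under the master key that lives in KMS.
// KMS hands the key back twice: plaintext (use now, never store) and
// encrypted under the master key (store next to the workspace).
async function workspaceDataKey(workspaceId: string) {
  const out = await kms.send(
    new GenerateDataKeyCommand({
      KeyId: process.env.MASTER_KEY_ARN!,
      KeySpec: "AES_256",
      EncryptionContext: { workspace: workspaceId }, // binds the key to the tenant
    })
  );
  return {
    plaintext: Buffer.from(out.Plaintext!),
    wrapped: Buffer.from(out.CiphertextBlob!),
  };
}

// AES-256-GCM on the payload before it ever touches the database.
function encryptRow(payload: Buffer, dataKey: Buffer) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", dataKey, iv);
  const ciphertext = Buffer.concat([cipher.update(payload), cipher.final()]);
  // iv, ciphertext, and tag are stored; without the data key they're just noise.
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}
```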
Bruno. On purpose, when you ask.
Three humans have production access: Bruno, Marta (ops), and Carl (on-call backup). Every access event is logged, signed, and emailed to the other two. If you open a support ticket and we need to look at your workspace, we ask first, in the ticket, in writing. You can say no. Most problems don't actually need us to look.
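What "logged, signed, and emailed to the other two" means in practice is a tamper-evident trail. A rough sketch of the idea, with the signing key and field names as stand-ins rather than our production code:

```typescript
// Sketch of a tamper-evident access trail; the signing key and the
// "email the other two" step are stand-ins, not the real implementation.
import { createHmac, timingSafeEqual } from "node:crypto";

interface AccessEvent {
  who: "bruno" | "marta" | "carl";
  workspaceId: string;
  ticketId: string;  // access is tied to the ticket where consent was given
  reason: string;
  at: string;        // ISO timestamp
}

// HMAC over the serialized event: the other two can recompute it and notice
// if an entry was edited or dropped after the fact.
function signEvent(event: AccessEvent, signingKey: string): string {
  return createHmac("sha256", signingKey)
    .update(JSON.stringify(event))
    .digest("hex");
}

function verifyEvent(event: AccessEvent, signature: string, signingKey: string): boolean {
  const expected = Buffer.from(signEvent(event, signingKey), "hex");
  const given = Buffer.from(signature, "hex");
  return expected.length === given.length && timingSafeEqual(expected, given);
}
```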
Daily, encrypted, in two regions.
Databases snapshot every 6 hours; file storage replicates continuously. Backups land in us-east-1 and eu-central-1, encrypted with keys not used for anything else. On the second Tuesday of every month we restore the production database into a blank environment from cold storage. If it doesn't boot, we don't leave until it does.
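One way to check that the restored copy actually came up is to make it answer real queries. The sketch below assumes a Postgres database, the "pg" client, and made-up table and column names; the real drill may look different.

```typescript
// Restore-drill check sketch: Postgres, the "pg" client, and the table and
// column names are all assumptions, not a description of our real setup.
import { Client } from "pg";

async function verifyRestore(connectionString: string): Promise<void> {
  // connectionString points at the blank drill environment, never at production.
  const db = new Client({ connectionString });
  await db.connect();
  try {
    const count = await db.query("SELECT count(*) AS n FROM invoices");
    const newest = await db.query("SELECT max(created_at) AS latest FROM invoices");
    console.log(`restored invoices: ${count.rows[0].n}, newest: ${newest.rows[0].latest}`);
  } finally {
    await db.end();
  }
}
```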
Hetzner for compute. AWS for storage.
Gingerbread (hosted) runs on dedicated Hetzner servers in Falkenstein and Helsinki. We picked Hetzner because it's cheap, fast, and run by engineers instead of MBAs. Long-term storage, backups, and the secrets layer sit in AWS, because some things you want boringly conservative. Data residency is EU by default, or US if you pick that region at signup.
Small team, quiet dependencies.
Gingerbread is ~54k lines of PHP and ~18k lines of TypeScript. We depend on 31 direct npm packages and 22 Composer packages, and we read every changelog before we bump a version. Dependabot opens PRs, CI runs a Semgrep pass, Playwright re-runs the critical-path tests, and we merge and ship.
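For a sense of what "critical-path tests" means, here's the flavour of check Playwright re-runs on every bump. The URL, account, and labels below are illustrative, not our real suite:

```typescript
// The flavour of critical-path check that re-runs on every dependency bump.
// URL, credentials, and selectors are illustrative, not the real test suite.
import { test, expect } from "@playwright/test";

test("a logged-in user can create and send an invoice", async ({ page }) => {
  await page.goto("https://staging.example.com/login");
  await page.getByLabel("Email").fill("drill@example.com");
  await page.getByLabel("Password").fill(process.env.DRILL_PASSWORD!);
  await page.getByRole("button", { name: "Log in" }).click();

  await page.getByRole("link", { name: "New invoice" }).click();
  await page.getByLabel("Client").fill("Acme GmbH");
  await page.getByRole("button", { name: "Send invoice" }).click();

  // If this ever fails after a bump, the bump does not ship.
  await expect(page.getByText("Invoice sent")).toBeVisible();
});
```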
What we haven't done yet.
Every other security page pretends these don't exist. We'd rather you know up-front so you can make the call.
No SOC 2 report.
Coming late 2026. We'd spend $50k a year so three employees of a large bank could ignore it. When we hit 500 business customers we'll start the Type I. Email us if you need the letter in the meantime. We'll work with you.
No HIPAA BAA.
Not planned. Gingerbread isn't built for protected health information. If you're a therapist invoicing for sessions, you're fine. If you're storing diagnostic notes in client fields, please don't.
No SSO/SAML yet.
In staging. Regular 2FA works. SAML is built and sitting in staging. It'll ship when we find five customers who actually want it. If that's you, reply to any email.
No bug bounty yet.
Ad-hoc for now. We pay researchers case-by-case through the report form. A proper program is on the list once we figure out the triage bandwidth. In 2025 we paid out $8,400 across 11 reports.
The whole log. No redactions.
Every incident, drill, and dependency patch worth talking about since we started publishing this log in March 2024.
- Production database restored into a blank environment from cold storage. 2h 14m, clean boot.
- Dependency bumped within 6 hours of advisory. No exposure in our uploader; patched preemptively.
- Hetzner network flap. Traffic failed over to Helsinki automatically. Total user-visible downtime: 47 minutes on EU workspaces.
- Reported via the form, triaged in 40 min, patched within 18 hours. $1,200 paid to the researcher. Postmortem published.
- Ran a full dependency review, dropped 4 transitive deps, no findings.
- Rotated the master key behind the per-workspace keys. Zero customer impact, ~3 min of elevated tail latency.
What we store, where, and for how long.
No dark patterns. If you want any row deleted, email us and we'll confirm in writing once it's gone.
| What | Where it lives | Kept for | Who can see it |
|---|---|---|---|
| Business data (invoices, clients, tasks, time) | Encrypted in your workspace DB | Until you delete it, or 30 days after account closure | You. Your teammates. Us only with ticket consent. |
| File attachments | S3, SSE-KMS, per-workspace bucket prefix | Same as business data | Same. Never indexed, scanned, or used for ML. |
| Payment info | Never touches our servers. Stripe tokenizes it at the browser. | Stripe's policy. Not ours. | We see last four, brand, and expiry. That's it. |
| Support emails & tickets | Fastmail (EU), separate from the product DB | 3 years, then archived | Bruno, Marta. Redacted after 90 days unless needed. |
| Login & session logs | Product DB, hashed IP, 90-day TTL | 90 days | You can see yours in Settings → Security. |
| Web analytics | Plausible, marketing site only | Aggregated, 24 months | No cookies, no fingerprinting, no tracking on authed pages. |
Found something? Write to us.
Give us a reasonable window (usually 90 days) before going public. In return you'll get a human reply within a business day, credit in the changelog, and a payout if the finding was material.
If your books matter, host them yourself.
The self-hosted version runs on your server, with your backups, behind your firewall. If a security team has to sign off, this is usually the easy answer.