Vivian Voss

The Screenshot Diary

security architecture freebsd tooling

Not in the Brief ⊣ Episode 03

Open Windows 11 on a Copilot+ PC. Navigate to Settings, Privacy & security, Recall & snapshots. The switch is there. The feature is opt-in today. It was not opt-in when it was first announced in May 2024, and the first version stored its snapshot database mostly in cleartext on disk. That part of the history is not in the brief any more; it is part of the architecture's biography.

This is the third episode of Not in the Brief. The first two looked at the browser layer: Chrome's local AI model that any web page can call without a permission prompt, and Edge's password vault that decrypts itself eagerly at launch and leaves the cleartext in memory for the session. The third looks at the operating-system layer, where Microsoft built a feature that records the screen continuously, indexes the recording with a local AI model, and offers natural-language search across the user's past activity.

The feature is called Recall. The architecture is genuinely interesting. The history of how it arrived at the current architecture is more interesting still.

The Feature

Recall takes snapshots of whatever is on the screen at regular intervals, stores them encrypted on the local disk, and applies a local AI model to extract text (OCR) and semantic embeddings from each snapshot. The user can then open Recall and type a natural-language query: "find the page about cloud egress costs I read last Tuesday", "show me the email about the meeting on Friday", "what was that diagram I saw three weeks ago". Recall returns matching snapshots from the user's own past activity, with the timeline scrubbable in either direction.
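
The search mechanics are easier to see in miniature. The sketch below is not Recall's implementation, whose index format and models are not public; it is only the generic pattern the description above implies, written in Python: one record per snapshot holding OCR text and an embedding vector, and a query answered by cosine similarity over those vectors. The embed() function here is a throwaway placeholder standing in for a real on-device embedding model.

```python
# Toy sketch of the snapshot-search pattern: one record per snapshot
# (timestamp, OCR text, embedding), queried by cosine similarity.
# embed() is a stand-in for an on-device model; none of this mirrors Microsoft's code.
from dataclasses import dataclass
from datetime import datetime
import math

def embed(text: str, dims: int = 64) -> list[float]:
    """Placeholder embedding: hashed bag-of-words, L2-normalised."""
    vec = [0.0] * dims
    for token in text.lower().split():
        vec[hash(token) % dims] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

@dataclass
class Snapshot:
    taken_at: datetime
    ocr_text: str            # text the OCR pass extracted from the screen image
    embedding: list[float]   # semantic vector used for natural-language search

def search(index: list[Snapshot], query: str, top_k: int = 3) -> list[Snapshot]:
    """Rank snapshots by cosine similarity between query and snapshot vectors."""
    q = embed(query)
    scored = sorted(index, key=lambda s: -sum(a * b for a, b in zip(q, s.embedding)))
    return scored[:top_k]

# Usage: index a couple of snapshots, then ask a natural-language question.
index = [
    Snapshot(datetime(2026, 3, 3, 14, 2), text, embed(text))
    for text in ("Invoice for cloud egress costs, March",
                 "Chat with Sam about Friday's meeting")
]
print(search(index, "cloud egress costs")[0].ocr_text)
```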

The capability is, technically, a real piece of work. On-device OCR over a continuous screen capture, with semantic embedding indexing fast enough for natural-language query, on the user's laptop without a cloud round-trip, is the kind of thing that would have been a research paper five years ago. The neural-processing-unit (NPU) hardware on Copilot+ PCs makes it tractable. NPUs rated at 40 trillion operations per second or higher are the hardware threshold; the major laptop manufacturers (Microsoft, Dell, HP, ASUS, Lenovo, Samsung) ship Copilot+ models across most of their current portfolios.

What the feature produces, when enabled, is a continuous catalogue of what appeared on the screen. Every email read, every document edited, every page browsed, every video frame paused at, every chat window left open, every banking session, every medical-record viewer, every private message: indexed and queryable in natural language. The catalogue is held on the user's own disk, by software the user did not write, and addressed by a query interface the user did not specify.

The Introduction

The feature was announced at Microsoft Build in May 2024. The original design was:

  • Default on. Recall would activate on every Copilot+ PC at first run. No setup prompt, no consent dialog. The user would receive a notification informing them the feature was active
  • Storage in cleartext, mostly. Snapshots and the searchable index database were stored on the local disk in a SQLite database that was, in the words of the security researchers who looked at it, "mostly cleartext". The OCR text was unencrypted; the snapshots themselves were not deeply protected against an attacker with access to the user account
  • No additional authentication for access. Once a user was logged in, any process running in the user's context could read the Recall database

The security community, predictably, did not love this. Researcher Alexander Hagenah, based in Zurich, published a proof-of-concept tool called TotalRecall in June 2024 that extracted the Recall database trivially, ran SQL queries on the OCR text, and demonstrated that an attacker who could exfiltrate a small SQLite file from a target machine could have a complete searchable history of the target's recent screen activity. The story moved from the security press to the mainstream press within days.

Microsoft, to its credit, did not double down on the first design. The feature was withdrawn from the Copilot+ launch in June 2024. The company announced a redesign through summer 2024, missed an October 2024 release window, and shipped a redesigned preview to Windows Insiders in November 2024. The redesigned feature reached general availability on Copilot+ PCs in April 2025.

Twenty-Three Months of Architecture, in Public

  • May 2024: Build announcement; default on, mostly cleartext
  • Jun 2024: withdrawn after Hagenah's TotalRecall
  • Nov 2024: redesigned preview to Insiders
  • Apr 2025: general availability; VBS Enclave, TPM, Hello, opt-in default
  • Mar 2026: TotalRecall Reloaded reads AIXHost.exe
  • Apr 2026: Weston's response, "not a bypass"

The opt-in default is the result of public pressure, not of original design.

The redesign is, by the standards of what the original feature was, a real change in architecture. It is the second draft, and the second draft is the one that ships. The first draft is the one that survives in the architecture's biography, however, because the decision to ship a continuous screen-capture catalogue by default with mostly-cleartext storage was a deliberate decision by a large and competent engineering organisation. The second draft did not unmake the first decision. It rewrote the storage layer.

The Mechanics

The post-redesign architecture splits cleanly into two halves: the vault, and what happens after the vault has been opened. Both halves are important.

The Vault

The architecture, as documented by Microsoft on the Windows Experience Blog and Microsoft Learn, rests on four pillars.

The Four Pillars Inside the Vault

  • VBS Enclave: a virtualised execution context whose memory is invisible to the rest of Windows, including the kernel running outside it
  • AES-256-GCM at rest: snapshots and the vector index encrypted with per-record keys; the Snapshot Store on disk is fully encrypted at idle
  • TPM-bound keys: unlocking material sealed against the device's TPM; removing the disk yields nothing decryptable
  • Windows Hello: Enhanced Sign-In Security; a biometric or PIN gates the enclave's release, with session timeout and re-authorisation
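
The second pillar is a standard construction, and a small sketch makes the shape concrete. What follows is only the generic per-record AES-256-GCM pattern, written with the Python cryptography package; it is not Microsoft's code, and the part that matters most in Recall, key generation and sealing inside the VBS Enclave against the TPM, is precisely the part a user-space script cannot reproduce.

```python
# Generic per-record AES-256-GCM at rest, as the second pillar describes in the abstract.
# Illustration of the primitive only: in Recall the keys live inside the VBS Enclave
# and are sealed to the TPM, neither of which this script emulates.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def encrypt_record(plaintext: bytes, record_id: bytes) -> tuple[bytes, bytes, bytes]:
    """Encrypt one snapshot record under its own fresh 256-bit key."""
    key = AESGCM.generate_key(bit_length=256)   # per-record key
    nonce = os.urandom(12)                      # 96-bit nonce, unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, record_id)  # record_id as AAD
    return key, nonce, ciphertext

def decrypt_record(key: bytes, nonce: bytes, ciphertext: bytes, record_id: bytes) -> bytes:
    return AESGCM(key).decrypt(nonce, ciphertext, record_id)

key, nonce, blob = encrypt_record(b"OCR text of one snapshot", b"snapshot-0001")
assert decrypt_record(key, nonce, blob, b"snapshot-0001") == b"OCR text of one snapshot"
```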

Inside that envelope, the vault is solid. The keys are unreachable from outside the enclave. The data on disk is unreadable without the TPM. The unlock requires a biometric or PIN. The original cleartext-database criticism is, on the current architecture, no longer accurate. Microsoft did the engineering work and the engineering work is good.

That is the part of the story that ends with "the vault is solid".

The Wall

The other part of the story is what happens after authentication. The Recall timeline, once the user has authenticated with Windows Hello and the enclave has decrypted the relevant snapshots, has to actually show the snapshots on the screen. Showing the snapshots requires that decrypted pixel data, decrypted OCR text and decrypted metadata leave the enclave and enter ordinary Windows processes that render UI to the user.

The process that does the rendering is called AIXHost.exe. It runs in the user's session at ordinary user privilege. It is not inside the VBS Enclave. The decrypted snapshot content lives inside its address space for as long as the timeline is open.

In March 2026, Alexander Hagenah, the same researcher who built the 2024 TotalRecall tool, published a successor called TotalRecall Reloaded. The new tool:

  • Runs as an ordinary user, with no administrative privileges, no kernel exploit, and no privilege escalation
  • Does not attempt to break the VBS Enclave or extract any keys from it
  • Does not bypass Windows Hello in any meaningful sense; the tool requires that the legitimate user has already authenticated
  • Injects into AIXHost.exe, the Recall timeline-rendering process, after the legitimate user has authenticated with Windows Hello
  • Reads decrypted screenshots, OCR text, and metadata directly from AIXHost.exe's address space, for as long as the timeline is open

In Hagenah's summary: "The vault door is titanium. The wall next to it is drywall."

Vault & Wall: the Trust Boundary Diagram

  • The vault: VBS Enclave, AES-256-GCM at rest, TPM-bound keys, a Windows Hello gate; the encrypted Snapshot Store is unreadable at idle
  • The trust boundary: Windows Hello; everything that crosses it is decrypted
  • The wall: AIXHost.exe, a user-session process at ordinary privilege that renders the Recall timeline and holds decrypted content in memory, read by TotalRecall Reloaded via user-level injection with no admin rights, no kernel exploit and no key extraction

"The vault door is titanium. The wall next to it is drywall." (Hagenah, 2026) Microsoft (Weston): "consistent with intended protections, does not represent a bypass."

Microsoft's response, attributed to David Weston, Corporate Vice President for Microsoft Security, was published in The Verge and other outlets in April 2026. The position is consistent: the demonstrated access pattern is "consistent with intended protections and existing controls, and does not represent a bypass of a security boundary or unauthorised access to data". In other words, the architecture is functioning as designed; the decrypted-after-Hello content is, by design, accessible to processes running in the authenticated user's session; Microsoft is not classifying this as a vulnerability.

That is, on its own terms, an internally coherent position. The enclave is for key material and at-rest encryption; once the user has authorised access, the data is by design in the user's session and shares the user's threat surface. The position is defensible.

It is also incomplete in the way that the entire architecture, taken as a system, has been incomplete since the beginning: the design is correct from the inside out, and silent on the brief.

A Note on the Other Side of the Stack

Recall is a Windows feature. On a Unix-style operating system such as FreeBSD or any Linux distribution, the equivalent does not exist; there is no system service that continuously captures the screen, indexes it with a local AI model, and exposes a natural-language search interface across the user's past activity. The capability is absent for the simple reason that nobody on the kernel or base-system side built it. A screen recorder can be installed (pkg install ffmpeg on FreeBSD gives the user the building blocks), but installing it is an explicit act, the daemon is named, and the captured content sits in a file the user named in a directory the user chose. The architectural difference matters for the awareness question: on a Unix system, the absence of capture is the default; on a Copilot+ Windows machine, the presence of the capture capability is the default, and the user's choice is whether to switch it on.

The Risk

The architectural risk has two layers. The first is the post-authentication accessibility problem demonstrated by TotalRecall Reloaded: a user who is running malicious software (an installed application that turned hostile, a compromised browser extension, an unfortunate click on a phishing link) and who authenticates to Recall is, on the current architecture, exposing decrypted snapshot content to that software. This is, on Microsoft's stated threat model, working as intended. It is also, on any reasonable user expectation, not the trade-off the user signed up for when they clicked "Yes, save snapshots".

The second layer is structural, and it is the more important one.

What Recall Actually Catalogues

The feature, where enabled, produces a third-party catalogue of every second on the user's own machine. Three properties of that catalogue are worth stating in full:

Three Properties of the Catalogue

  • Continuous capture, at machine cadence: every page read long enough, every document edited, every video paused, every chat left open
  • Queried by software the user did not write: the user owns the disk; the index format, query interface and integration surface are owned by Microsoft
  • No context distinction by default: banking sessions, medical-record viewers, holiday photos, recipe sites; exclusions are enumerated by the user

The opt-in switch is the consent record. The presence of the switch, on every Copilot+ machine, is the default that survived public review. Consider what that means: the engineering case for the feature, after extensive public scrutiny, was strong enough that the company concluded the right answer was to ship the feature opt-in on every supported machine rather than not ship it at all. The history is in the brief. The decision is in the brief. The user, in 2026, encounters the result of that decision as a Settings switch.

What the User Actually Signed Up For

Consider a user who buys a Copilot+ laptop in 2026. They unbox it, run the Windows setup, and at some point during setup they see a Recall dialog. The dialog says, in essence, "Recall captures snapshots of your screen so you can search later. Do you want to enable it?" The default option is "Not now". The user clicks "Not now" or "Yes". The dialog goes away.

The user has now made one of two choices: opt in to Recall, or not. Both choices are documented and have a consent record. Neither choice involved being told that:

  • The opt-in dialog defaults to disabled because the first design did not default to disabled, and the first design's default was changed only after public outcry
  • The Recall switch is present on the machine whether or not the user has heard of the feature; the existence of the option is itself the default that survived public review
  • The post-authentication threat model accepts that decrypted content lives in user-session processes that can be read by other user-session code. This is by design; the boundary ends at Windows Hello
  • What is captured is everything visually present on the screen, by default, indexed by software the user did not write, in a format the user does not own

None of those facts are hidden. Microsoft has published all of them. They are in the brief, if one is willing to read Microsoft Learn end to end. They are not in the brief in the practical sense: the user, on installing the operating system, was not given a chance to weigh them against the convenience benefit.

The architecture is what it is. The point of awareness is to know that it is what it is.

How to See It

The verification path is straightforward on any current Copilot+ PC.

Settings, Privacy & security, Recall & snapshots. The master switch lives here. If the toggle is off, no snapshots are being saved and the database is empty. If the toggle is on, snapshots are being captured at the configured interval (by default, roughly every few seconds when activity is detected). The same page exposes the storage cap (how much disk Recall is allowed to use) and the retention horizon (how far back snapshots are kept).

The Recall app, in-app Settings, exclusions. Per-app and per-website exclusions are configured here. Browsers in private/incognito mode are automatically excluded from snapshots, by design. Passwords and credit-card fields are likewise automatically masked. Other categories of sensitive content (medical-record viewers, banking apps, internal corporate tools) the user must exclude manually.

Group Policy, on Pro/Enterprise editions. Open Group Policy Editor (gpedit.msc). Navigate to User Configuration, Administrative Templates, Windows Components, Windows AI. The policy "Turn off saving snapshots for Windows" disables the snapshot-saving capability across the user's session, irrespective of the user's per-machine choice. The corresponding registry value is HKCU\Software\Policies\Microsoft\Windows\WindowsAI\DisableAIDataAnalysis (DWORD, value 1).
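
The same policy state can be confirmed from a script. The sketch below uses Python's standard winreg module to read the value named above; it is read-only, and an absent key simply means no policy has been set for the current user.

```python
# Read the Recall policy value (HKCU\Software\Policies\Microsoft\Windows\WindowsAI).
# Read-only check; a missing key means no policy has been set for this user.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\WindowsAI"

try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, POLICY_KEY) as key:
        value, _ = winreg.QueryValueEx(key, "DisableAIDataAnalysis")
        print(f"DisableAIDataAnalysis = {value} (1 means snapshot saving is disabled by policy)")
except FileNotFoundError:
    print("No WindowsAI policy present: snapshot saving is not disabled by policy.")
```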

Intune / MDM, on managed devices. The WindowsAI configuration service provider exposes the same setting under ./User/Vendor/MSFT/Policy/Config/WindowsAI/DisableAIDataAnalysis. Managed devices have Recall disabled and the feature removed by default; the policy is the explicit setting an administrator can use to confirm, or enforce, that disabled state.

Forensic confirmation. If the master switch is off and the policy is enabled, the Recall database directory (typically under %LOCALAPPDATA%\CoreAIPlatform.00\UKP) should be empty or absent. If snapshots have been captured, the directory contains encrypted files; their content cannot be read without the user's Windows Hello authentication.
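
That forensic check is scriptable too. Assuming the snapshot store lives beneath the directory named above (the exact path can vary across Windows builds), the sketch below reports whether it exists and roughly how much it holds; it does not, and cannot, read any of the encrypted content.

```python
# Report whether the Recall snapshot store directory exists and how much it holds.
# The path is the one named in the text; it may vary across Windows builds.
import os
from pathlib import Path

store = Path(os.environ.get("LOCALAPPDATA", "")) / "CoreAIPlatform.00" / "UKP"

if not store.exists():
    print(f"{store} is absent: no snapshot store on this profile.")
else:
    files = [p for p in store.rglob("*") if p.is_file()]
    total = sum(p.stat().st_size for p in files)
    print(f"{store}: {len(files)} files, {total / 1_048_576:.1f} MiB (content encrypted at rest).")
```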

The honest qualification: even with Recall disabled, a Copilot+ PC has the capability to enable it at any time. The hardware is on the machine. The software is on the machine. The decision is the user's, or the device administrator's, or in some configurations, anyone who can persuade either of them to enable the toggle.

The Pattern Across the Series

Three episodes in, the pattern is coming into focus. The browser layer (Chrome's local AI model, Edge's eagerly decrypted password vault) and the operating-system layer (Recall) share a structural quality: a feature was added to the user's machine, the user's consent was obtained or assumed at a level that does not match the depth of the feature, and the architectural boundary that the vendor describes as sufficient does not match the boundary the user reasonably assumed.

In all three cases the company is, by its own terms, correct. Chrome's local model is local; it does not exfiltrate. Edge's password vault is, by Microsoft's threat model, secure against attackers who do not already have administrative access. Recall's enclave is, by Microsoft's threat model, secure against attackers who cannot defeat Windows Hello. In all three cases the user, by their own reasonable terms, was not told that this was the trade-off the vendor was making.

The series is not arguing that the features should not exist. The features are real, useful, and in some respects represent genuine engineering progress. The series is arguing that the consent record and the architecture should match the brief, and that where they do not match, the user should be told.

The looking is the entire point. Once a user has looked, the choice becomes theirs.

Closing

The diary is on every Copilot+ PC shipped today. On most of them, the cover is closed and the diary is empty; the user has chosen not to open it. On a growing number of them, the user has clicked "Yes, save snapshots" without quite understanding that this opens a continuously updating searchable record of everything that has appeared on their screen, kept under a vault door that is genuinely strong and a wall next to the vault that is, on Hagenah's evidence, drywall.

That is an architecture. It is also, by every available record, an opt-in architecture, achieved through public pressure rather than by original design. The history is part of the brief now. The looking, as the series keeps saying, is not difficult. It just has to start.