window-tip
Exploring the fusion of AI and Windows innovation — from GPT-powered PowerToys to Azure-based automation and DirectML acceleration. A tech-driven journal revealing how intelligent tools redefine productivity, diagnostics, and development on Windows 11.

Microsoft Pushes Windows 11 "Secure by Default": What It Means for App and Driver Signing

Microsoft is reportedly moving toward a "Secure by Default" model for Windows 11, which would restrict the operating system to running only properly signed applications and drivers out of the box. While the feature is expected to ship with a toggle to disable it, the proposal has sparked a wide-ranging debate about security, freedom, and where Windows is heading long-term.

What Is Code Signing, and Why Does It Matter?

Code signing is the practice of digitally attaching a cryptographic certificate to software, identifying who built it. It is not a guarantee of safety — a signed application can still behave maliciously — but it does establish accountability. Think of it as authentication rather than authorization: it tells you who wrote the code, not whether that code is trustworthy.

Driver signing is already mandatory on a standard Windows boot. What Microsoft appears to be considering is extending that model to user-mode applications, which has not been required until now.
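The identity-not-safety distinction is easy to see in code. The sketch below is a deliberately simplified model: real code signing (Authenticode, for instance) uses asymmetric signatures and a certificate chain to a trusted authority, whereas here a shared HMAC key merely stands in for the publisher's signing key so the example stays dependency-free. The publisher name and key are hypothetical.

```python
import hashlib
import hmac

# Toy registry of publisher keys. In a real scheme this role is played by
# certificates chaining to a trusted CA, not shared secrets.
PUBLISHER_KEYS = {"Example Corp": b"example-corp-signing-key"}  # hypothetical

def sign(binary: bytes, publisher: str) -> bytes:
    """'Sign' a binary: bind its hash to a publisher identity."""
    digest = hashlib.sha256(binary).digest()
    return hmac.new(PUBLISHER_KEYS[publisher], digest, hashlib.sha256).digest()

def verify(binary: bytes, publisher: str, signature: bytes) -> bool:
    """Verification proves *who* signed these exact bytes --
    it says nothing about whether the code is safe to run."""
    expected = sign(binary, publisher)
    return hmac.compare_digest(expected, signature)

app = b"\x90\x90 arbitrary machine code"
sig = sign(app, "Example Corp")
verify(app, "Example Corp", sig)                 # True: identity confirmed
verify(app + b"tampered", "Example Corp", sig)   # False: bytes changed
```

Note that `verify` returns True for any signed binary, malicious or not — which is precisely why a signature is accountability, not a safety guarantee.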

The Case for Default Signing Requirements

Proponents argue that most reputable software is already signed. Popular open-source tools like Krita ship with valid certificates, and major commercial applications have required signing for years across Windows, macOS, and mobile platforms. For everyday users who are not equipped to evaluate unknown executables, a default that blocks unsigned software would meaningfully reduce exposure to malware. A user who cannot find the toggle to disable the restriction probably should not be running unverified binaries anyway.

The comparison to mobile operating systems is frequently raised. Android requires all APKs to be signed — even debug builds receive an automatically generated certificate through the development toolchain — and iOS has required code signing since its inception. This has not eliminated malware on either platform, but it has raised the cost and complexity of distributing it.

The Case Against, and the Concerns That Follow

The loudest objections come from independent developers and the open-source community. Obtaining a trusted code-signing certificate currently costs roughly $100 to $400 per year, and recent changes to industry requirements — no longer limited to Extended Validation certificates — mandate that private keys be stored in hardware security modules or tokens, which complicates automated build pipelines. For a hobby developer distributing free software, that cost is a significant barrier. The result is that useful, legitimate software can end up flagged by security tools purely because the developer could not afford, or did not prioritize, signing.

There is also a structural concern about what signing requirements actually accomplish at scale. A certificate proves identity, not intent. Threat actors have historically obtained legitimate certificates through theft or by purchasing them fraudulently. Signed malware is not a hypothetical — it is a documented and recurring problem. Treating a signing requirement as a security boundary rather than an accountability mechanism overstates what the technology can deliver.

Beyond the technical objections, critics point to a longer arc. The argument is that a default restriction, once normalized, is easier to make permanent. Telemetry showing that a minority of users ever disable a feature becomes justification for removing the option in a future version. Windows S Mode follows exactly that logic: it locks users to the Microsoft Store, and while an escape hatch exists today, the commercial incentive to close it is obvious. Whether Microsoft would eventually pursue a similar path for signing enforcement is speculative, but the concern is not unreasonable given the pattern.

The Kernel-Level Anticheat Problem

A parallel discussion concerns kernel-level anticheat software used in competitive games. Several major titles require anticheat components that run at the kernel level, giving them deep system access in exchange for stronger cheat detection. Critics argue this is a significant and often underappreciated security risk — kernel compromises are among the most serious a system can suffer, and anticheat vendors are not immune to supply chain attacks or vulnerabilities.

Counter-Strike 2 is frequently cited as a counterexample: its default anticheat operates at the user level, and the game runs natively on Linux. Professional leagues that require stronger guarantees, such as FACEIT, deploy their own kernel-level solution as a deliberate choice rather than a platform default. The question of whether a stricter signing and sandboxing regime would pressure game developers to move away from kernel-level anticheat remains open.

More information on FACEIT's anticheat history and design can be found at FACEIT's official support page.

Sandboxing and File System Access

Several participants in the debate have raised a related and arguably more impactful concern: the lack of file system isolation for Windows applications. On a typical Windows installation, any executable the user launches can read and write everything the user's account can access, across the entire file system. This contrasts with mobile operating systems and, increasingly, macOS, where applications must declare and be granted specific permissions for directory access.

The practical consequences are visible in software like certain creative suites that scatter configuration and cache files across multiple unrelated directories, making clean uninstallation difficult. A permission model that required applications to declare their intended file system footprint — and that surfaced that footprint to users — would address a different but equally real category of problem.
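A declared-footprint model of the kind described above can be sketched in a few lines. The manifest below is hypothetical — Windows has no such mechanism for classic Win32 applications today — and the directory names are illustrative only.

```python
from pathlib import Path

# Hypothetical per-app manifest: the only directories this app declares
# it will touch. (Illustrative; no such manifest exists for Win32 apps.)
DECLARED_FOOTPRINT = [
    Path.home() / "AppData" / "Local" / "ExampleApp",
    Path.home() / "Documents" / "ExampleApp",
]

def access_allowed(target: str) -> bool:
    """Permit access only inside the declared footprint.

    Resolving the path first defeats '..' traversal and symlink tricks.
    """
    resolved = Path(target).resolve()
    return any(resolved.is_relative_to(d.resolve()) for d in DECLARED_FOOTPRINT)

inside = Path.home() / "Documents" / "ExampleApp" / "settings.json"
escape = Path.home() / "Documents" / "ExampleApp" / ".." / ".." / "secret.txt"
assert access_allowed(str(inside))       # inside the declared footprint
assert not access_allowed(str(escape))   # traversal resolves outside it
```

An enforcement model like this would also make uninstallation tractable: the declared footprint doubles as a complete list of what to delete.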

Where This Fits in the Broader Landscape

The move toward hardware attestation and signed software environments is not unique to Microsoft. Several governments have passed or are considering legislation that effectively requires devices used for banking or official identity applications to be in a verified, unmodified state. Vietnam has enacted rules requiring banks to block access from rooted or bootloader-unlocked devices. The European Union's digital identity framework includes similar provisions. The concern among civil liberties advocates is that a platform-level signing requirement on Windows would make it technically straightforward for software — including government-mandated applications — to verify and enforce the policy.

The Free Software Foundation has written at length about the risks of trusted computing architectures and what it characterizes as the tension between user control and platform attestation. Their analysis is available at gnu.org.

The Current State and What to Watch

Windows 11 already warns users before running unsigned executables, and Microsoft Defender SmartScreen flags unrecognized software. The proposed change would make blocking, rather than warning, the default behavior for unsigned binaries, while preserving an option to disable it. Driver signing enforcement on standard boot has been in place for years, and that precedent suggests the underlying infrastructure for expanding enforcement already exists.
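The shift from warning to blocking, with an escape-hatch toggle, amounts to a small decision table. The function below sketches the semantics as described in reports of the proposal — it is an assumption about the behavior, not Microsoft's actual logic.

```python
from enum import Enum

class Verdict(Enum):
    RUN = "run"      # signed: launch normally
    WARN = "warn"    # today's default for unsigned binaries
    BLOCK = "block"  # proposed default for unsigned binaries

def launch_policy(signed: bool, secure_by_default: bool,
                  toggle_disabled: bool) -> Verdict:
    """Assumed semantics of the proposal, not an actual implementation."""
    if signed:
        return Verdict.RUN
    if not secure_by_default or toggle_disabled:
        return Verdict.WARN   # current behavior, or enforcement switched off
    return Verdict.BLOCK      # the proposed new default
```

The open question from the paragraph above maps onto a single parameter: whether `toggle_disabled` remains available in future releases.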

The meaningful question is not whether the initial release will include a disable toggle — it almost certainly will — but whether that toggle persists across future versions. The history of Windows feature rollouts, from the Microsoft account requirement to S Mode, suggests that initially optional restrictions tend to become progressively harder to escape. Whether this particular change follows that trajectory will depend on both Microsoft's commercial incentives and the degree of pushback from developers and enterprise customers who depend on unsigned tooling.
