Welcome to SwedenCpp
Latest blogs, videos, podcasts and releases in one stream
Friday, April 17, 2026
High-Performance Distributed Systems in Modern C++ | Boost.Asio & Beast Deep Dive🎥CppOnline
Every C++ Test Framework should wrap this CMake command!🎥Refactoring Bitcoin
Algebraic Path Problems Done Quick: Or how to find the best* path from one talk to another🎥CppCon
The Nim Programming Language🎥Northwest C++ Users Group
Why is there no `boost_discover_tests` in CMake?🎥Refactoring Bitcoin
Level Up! Procedural Game Music and Audio - Chris Nash - ADC 2025🎥audiodevcon
What Happens During C++ Compilation 🤔 #coding #shorts🎥CppOnline
Four Ways to Avoid Being Trapped
In the previous article, we looked at how senior developers slowly drift into roles they didn’t consciously choose.📝The Dev Ladder
Thursday, April 16, 2026
What’s up with window message 0x0091? We’re getting it with unexpected parameters
Trespassing on system messages.📝The Old New Thing
Leverage AI Agents to Refactor and Modernize C++ Code - Jubin Chheda - CppCon 2025🎥CppCon
Hacking and Securing C++ Marcell Juhasz - Meeting C++ 2025🎥MeetingCpp
Why AI Developers Are Missing The Fundamentals #ai #coding #programming🎥CppOnline
Zig, hot reload, and ABI trouble
I was recently overcome by the idea of porting some C code of mine to Zig. In the process, I think I learned a thing or two about situations in which Zig is struggling to replace C for me. The short version is: Zig is pleasant until you need lots...📝Sebastian Schöner
Wednesday, April 15, 2026
GEOINT Symposium 2026
At the USGIF GEOINT Symposium 2026, the leading event for the geospatial intelligence community, Kitware will present its latest advancements in AI test and evaluation (T&E), computer vision, and interactive visualization. Our work supports national security missions by enabling organizations to better analyze complex data and make informed decisions with confidence.📝Kitware Inc
Why is there a long delay between a thread exiting and the WaitForSingleObject returning?
Maybe it didn't really exit.📝The Old New Thing
C++ Insights - Episode 73: Things you can do effortlessly with C++20s concepts🎥Andreas Fertig
From C+ to C++: Modernizing a GameBoy Emulator - Tom Tesch - CppCon 2025🎥CppCon
Accelerate UI Development - Seamless Designer-Developer Collaboration with Web Tools - Ryan Wardell🎥audiodevcon
The Compile-Time Trick Most Developers Miss #programming #tutorial🎥CppOnline
C++ Insights Episode 73: Things you can do effortlessly with C++20s concepts
I published a new C++ insights episode: Things you can do effortlessly with C++20s concepts. In this episode, you'll learn how C++20s concepts help you to write less code that's easier to maintain at the same time. Andreas📝AndreasFertig.com
MrDocs Bootstrap: One Script to Build Them All
When new developers joined the MrDocs team, we expected the usual ramp-up: learning the codebase, understanding the architecture, and getting comfortable with the review process. What we did not expect was that building and testing the project would be the hardest part. People dedicated to the project full-time spent weeks just trying to get a working build. Even when they succeeded, each person ended up with their own set of workarounds: a custom script here, a patched flag there, an undocumented environment variable somewhere else. One unrelated commit from someone else could silently break another developer’s local setup. And even after all of that, they didn’t know how to run the commands to test the project.

As the complexity grew, we naturally reached for a package manager. We adopted vcpkg, but over time we discovered that our problem was too complex for what any package manager is designed to handle. The build type combinations, the sanitizer propagation, the cross-platform toolchain differences, and the IDE configurations: these are workflow problems that kept accumulating. That realization, combined with an onboarding crisis where new contributors could not build the project at all, led us to write our own bootstrap script.

The idea was not unfamiliar: at the C++ Alliance, we work closely with the Boost libraries, and Boost has shipped a bootstrap script for years. We knew the pattern worked. We just needed to apply it to our own dependency problem.

This post explains why robust C++ workflows are fundamentally difficult, not only for dependency management but also for supporting multiple platforms, compilers, and testing configurations. It describes what we learned from our experience with vcpkg and how a bootstrap script solved the problem for MrDocs.
Contents:
- Why Dependency Management Is Hard: A Combinatorial Explosion; Why C++ Makes It Worse
- What Went Wrong for MrDocs: Where vcpkg Fell Short; The Problems No Package Manager Solves; Five Workflows and Counting
- The Bootstrap Script: How It Evolved; Key Design Decisions
- What We Learned

Why Dependency Management Is Hard

A Combinatorial Explosion

Suppose your project depends on Package A >=1.0 and Package B >=2.0, but every available version of A >=1.0 requires B < 2.0. No selection of versions satisfies all constraints at once, and finding one when it does exist is a search over a combinatorially growing space of version combinations.

Why C++ Makes It Worse

Compiler and standard library pairings add yet another dimension. On Linux, Clang typically builds against the system GCC's libstdc++, and a given Clang release does not necessarily support every libstdc++ version. So a developer using Clang 20 on a fresh Ubuntu machine gets build errors from the standard library, not from their own code. Testing every Clang version with every GCC libstdc++ is infeasible, but specific combinations matter, and the mismatch is not obvious to the developer when it happens.

- Platform explosion: Windows/Linux/macOS multiplied by Debug/Release/OptimizedDebug, GCC/Clang/MSVC/AppleClang, shared/static, and sanitizer variants creates a combinatorial explosion of configurations that all need to be tested. Each platform also has its own quirks: git symlinks behave differently on Windows, Ninja availability varies, and even the way you specify compiler flags differs between MSVC and GCC/Clang.
- Conditional dependencies: in C++, build options frequently add or remove entire dependencies. An image processing library might support PNG, JPEG, and WebP, each requiring its own codec library. Enabling or disabling a format changes the dependency graph. Build scripts also commonly look for host dependencies (system libraries for talking to the OS, GPU, or network) that you are not expected to build yourself but that must be present on the machine. The dependency graph is not static; it depends on the configuration.
- Closed-source dependencies: all of the problems above assume you have the source code and can rebuild with the correct flags. Sometimes you do not. When a dependency is distributed only as a pre-built binary, there is no way to adjust the ABI, propagate sanitizer flags, or change the build type. If it was compiled with incompatible settings, there is nothing you can do about it. It becomes a hard constraint on the entire system.

[Diagram: mindmap of the C++ dependency problem. No standard format (built from source, closed-source binaries); compatibility (ABI, API/templates, build type/CRT); propagation (viral flags, viral macros, sanitizers, categorical options); dependencies (conditional on build options, host/system libraries, closed-source binaries); platform (toolchain setup, compiler + stdlib combos, combinatorial explosion).]

In C++, the general case involves so many dimensions that no existing tool handles all of them well.

What about CPS?

The Common Package Specification (CPS) is an interesting effort to standardize how C++ packages are consumed. A .cps file describes everything a build system needs to find and link against an already-built package: include paths, library paths, compiler flags. This is valuable, but it operates at the point of consumption, where we have already made all the decisions about platform, compiler, build type, and sanitizers. It assumes the dependency has already been built in a compatible way. It does not describe how to build the dependency with the correct flags in the first place.

For example, if we need AddressSanitizer, all dependencies must be built with ASan instrumentation. A CPS file tells us how to consume a package that was built with ASan, but it does not know how to rebuild that package with ASan if it was not. The problems described above are all about making those upstream decisions correctly, which happens before CPS enters the picture.

What Went Wrong for MrDocs

MrDocs depends on LLVM, Duktape, Lua, and libxml2 (and previously also fmt). Over time, three categories of problems accumulated.
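The unsatisfiable A/B version conflict described under A Combinatorial Explosion can be made concrete with a toy resolver. This is a hedged sketch over a hypothetical two-package registry, not any real package manager's algorithm:

```python
# Illustrative registry: every available A >= 1.0 pins B < 2.0.
A_RELEASES = {
    "1.0": lambda b: b < (2, 0),  # constraint that A 1.0 places on B
    "1.1": lambda b: b < (2, 0),
}
B_RELEASES = [(1, 5), (2, 0), (2, 1)]

def resolve(project_needs_b):
    """Brute-force search for an (A, B) pair satisfying every constraint."""
    for a_version, a_allows in A_RELEASES.items():
        for b_version in B_RELEASES:
            if project_needs_b(b_version) and a_allows(b_version):
                return a_version, b_version
    return None  # the dependency graph is unsatisfiable

# The project requires B >= 2.0, so no consistent selection exists:
print(resolve(lambda b: b >= (2, 0)))  # → None
# Relax the project's constraint and a solution appears:
print(resolve(lambda b: b >= (1, 0)))  # → ('1.0', (1, 5))
```

Real resolvers face the same search over thousands of packages and constraints, which is why resolution either fails loudly, as here, or silently picks a combination you did not expect.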
Where vcpkg Fell Short

For over a year, we used vcpkg to manage these dependencies. MrDocs is a tool, not a library, so we only needed vcpkg for acquiring our own dependencies rather than for making ourselves easy to consume downstream. It worked at first, but the complexity of our workflows gradually outgrew what vcpkg was designed to handle:

- Build types: MrDocs developers frequently need a Debug build with optimization enabled because the codebase is large enough that an unoptimized debug build is painfully slow. On MSVC, Debug and Release are ABI-incompatible, so a “Debug with optimization” configuration does not fit neatly into vcpkg’s Debug/Release binary model.
- Patches and dual paths: vcpkg applies patches to libraries that do not follow CMake conventions. This meant we had to support two ways to find the same library: the vcpkg-patched version and the upstream version. When libraries do follow CMake conventions, we do not need vcpkg as much. But when they do not, the patches make vcpkg less useful rather than more. Contributors kept opening PRs proposing yet another way to locate a dependency. In a build script, every new path is expensive to test.
- Rigid baseline: vcpkg’s baseline model pins all libraries to a single snapshot. We are tightly coupled to a specific LLVM commit, so we could not use vcpkg for LLVM from the start. That alone meant vcpkg could only manage a subset of our dependencies. On top of that, when fmt bumped a major version and broke downstream consumers, it showed that the baseline approach is too rigid for projects that use a few unrelated libraries. Sometimes the entire baseline would be updated and libraries we had no reason to touch just got upgraded, introducing unexpected breakage. Different developers also had different baseline expectations, so the same vcpkg.json could produce different results depending on when someone last updated.
- Missing dependencies: some dependencies were not in vcpkg at all, or not configured the way we needed them. LLVM is the classic example: we need a specific commit, built with specific flags. Tools do not provide their own vcpkg integration; everything is centralized in the vcpkg repository. This forced us into mixed-source dependency management where some deps come from vcpkg and some from custom scripts.
- No variant support: when we needed sanitizer builds (ASan, MSan, UBSan, TSan), vcpkg had nothing to offer. It knows Debug and Release. Building sanitized variants required custom scripts or custom environment variables to pass the information to the package manager internally.
- Manifest vs. classic mode: vcpkg offers two modes for specifying dependencies. Some users simply did not like one of the modes, and we had so many complaints that we ended up supporting both. Unlike npm’s local and global modes, vcpkg’s manifest and classic modes do not play well together, so supporting both effectively meant maintaining two separate dependency workflows.

The vcpkg team has done outstanding work on a genuinely difficult problem, and vcpkg handles a lot of it well. Many of these limitations may simply be the best anyone can do given the complexity of the language. Most of the problems listed above do have external solutions: you can set custom triplets, configure environment variables, pass flags manually, and configure build types from outside vcpkg. That is how we handled it for a long time. The issue is that those solutions live outside the vcpkg workflow. We owned that part, and maintaining it was hard. Having vcpkg in the equation meant one more workflow to support, even when the problem was not vcpkg’s fault. The accumulated complexity of maintaining vcpkg alongside our own custom scripts is what eventually became unsustainable.
The Problems No Package Manager Solves

- Dependency acquisition at configure time: we once had FetchContent as an optional alternative to find_package, so CMake could download dependencies if they were not already present. A team member’s internet went down during a build and CMake failed. The reaction was strong: nobody should be required to have internet to compile a project they already downloaded. The feature was removed entirely. This reinforced that dependency acquisition needed to be a separate, explicit step that completes before the build system even runs.
- IDE integration: developers had to manually configure run configurations for CLion, VS Code, or Visual Studio, and those configurations broke whenever the application changed, build options were added, or targets were renamed.
- Platform-specific toolchain setup: on macOS with Homebrew Clang, the standard tool paths (llvm-ar, llvm-ranlib, ld.lld) are not where the system expects them. On Windows, MSVC requires a Developer Command Prompt with specific environment variables. Setting up either of these correctly from scratch is its own project.
- Debugger integration: there was no automated way to set up LLDB formatters or GDB pretty printers for Clang and MrDocs symbols. Developers working on the AST had to inspect raw memory layouts.
- The sheer volume of instructions: the build script should not assume a package manager, so you end up documenting both the manual and the package manager path. For each dependency, for each variant (sanitizers, special build types), for each platform. When the package manager path does not work for a given configuration, the developer falls back to the manual path, and that path has to be maintained too.

Five Workflows and Counting

The proliferation was gradual. We started with manual CMake commands, then added FetchContent as an alternative, then adopted vcpkg, then had to support both vcpkg modes, then needed custom CI scripts.
By mid-2025, we had accumulated five different workflows for installing dependencies:

- Manual CMake: the original path, configuring everything by hand
- FetchContent: later removed after the internet incident
- vcpkg (manifest mode): the “official” package manager path
- vcpkg (classic mode): because some users did not like manifest mode
- Custom CI scripts: CI uses its own language to describe workflows, and there was no single command that could configure all possible build variants

[Diagram: flowchart of a new developer facing the question “Which workflow?” with five answers: manual CMake, FetchContent, vcpkg manifest, vcpkg classic, CI scripts.]

We tried to create a set of instructions that would describe what the user could do for each dependency. For each dependency, we would explain each of the ways to fetch and build it: manual, vcpkg manifest, vcpkg classic. On top of that, for each special variant (sanitizer builds, special build type combinations), there would be yet another set of instructions per dependency per workflow. The documentation grew combinatorially, and people got lost.

The Bootstrap Script

The core principle was separation of concerns: CMake builds the project, but something else manages the dependencies. The bootstrap script fills that gap.

Before:

# Clone and build LLVM (specific commit)
git clone https://github.com/llvm/llvm-project.git
cd llvm-project && git checkout dc4cef81d47c...
cmake -S llvm -B build -DCMAKE_BUILD_TYPE=Release ...
cmake --build build
cmake --install build
cd ..
# Download and build Duktape
curl -L https://github.com/.../duktape-2.7.0.tar.xz | tar xJ
cmake -S duktape -B duktape/build ...
cmake --build duktape/build
cmake --install duktape/build
# Repeat for libxml2, Lua...
# Then configure MrDocs with all the install paths
cmake -S mrdocs -B mrdocs/build \
  -DLLVM_ROOT=/path/to/llvm/install \
  -Dduktape_ROOT=/path/to/duktape/install \
  -Dlibxml2_ROOT=/path/to/libxml2/install \
  ...
cmake --build mrdocs/build

After:

python bootstrap.py

The script handles everything else:

- Probes MSVC (Windows only): detects and imports the Visual Studio development environment
- Checks system prerequisites: validates that cmake, git, python, and a C/C++ compiler are available
- Sets up compilers: resolves compiler paths, detects Homebrew Clang on macOS
- Configures build options: prompts for build type, sanitizer, and preset name (or accepts defaults in non-interactive mode for CI)
- Probes compilers: runs a dummy CMake project to extract the compiler ID, version, and capabilities before building anything
- Sets up Ninja: finds or downloads the Ninja build system
- Installs dependencies: fetches and builds Duktape, Lua, libxml2, and LLVM in topological order, each with the correct flags for the chosen configuration
- Generates CMake presets: writes a CMakeUserPresets.json with all dependency paths, compiler configuration, and IDE settings
- Generates IDE configurations: run/debug configs for CLion, VS Code, and Visual Studio, plus debugger pretty printers
- Builds MrDocs: configures, builds, and optionally installs MrDocs using the generated presets
- Runs tests: executes the test suite in parallel

[Diagram: sequence diagram of bootstrap.py. The developer runs python bootstrap.py; the script probes the MSVC environment (Windows), checks prerequisites (cmake, git, compiler), sets up compilers and Ninja, prompts for build type, sanitizer, and preset, probes the compiler ID and version, fetches and builds dependencies, generates CMakeUserPresets.json and IDE/debugger configs, builds and installs MrDocs, and runs the tests.]

How It Evolved

The first commit landed on July 16, 2025. Over the next eight months, the script went through seven distinct phases of development across roughly 57 commits.

[Diagram: timeline of bootstrap.py evolution. Jul 2025: foundation and UX; Aug 2025: IDE configs, sanitizers, and Windows; Sep 2025: developer tooling and LLDB; Dec 2025: modularization into package; Mar 2026: CI integration.]

The first week (July 16–19) was about getting the one-liner to work at all: the core workflow, colored prompts, parallel test execution, and the first installation docs.

Phase 1: Foundation (July 16–19, 2025)

521cc704 build: bootstrap script
e32bb36e build: bootstrap uses another path for mrdocs source when not already called from source directory
e7e3ef51 build: bootstrap build options list valid types
75c28e45 build: bootstrap prompts use colors
c156a05f build: bootstrap removes redundant flags
c14f071b build: bootstrap runs tests in parallel
1a9de28c docs: one-liner installation instructions
76611f93 build: bootstrap paths use cmake relative path shortcuts

The second and third weeks turned the script into a development environment setup tool by generating IDE run configurations for CLion, VS Code, and Visual Studio. By the end of July, the script also supported custom compilers, sanitizer builds, and Homebrew Clang on macOS.
Phase 2: IDE Integration (July 22–28, 2025)

502cfbd8 build: bootstrap generates debug configurations
b546c260 build: bootstrap dependency refresh run configurations
83525d38 build: bootstrap documentation run configurations
2cfdd19e build: bootstrap website run configurations
ca4b04d3 build: bootstrap MrDocs self-reference run configuration
b5f53bd9 build: bootstrap XML lint run configurations

Phase 3: Build Variants and Sanitizers (July 29–August 1, 2025)

0a751acd build: bootstrap supports custom compilers
ff62919f build: LLVM runtimes come from presets
2b757fac build: bootstrap debug presets with release dependencies
0d179e84 build: installation workflow uses Ninja for all projects
3d8fa853 build: installation workflow supports sanitizers
26cec9d8 build: installation workflow supports homebrew clang

August was the cross-platform month. Windows support required probing vcvarsall.bat, handling Visual Studio tool paths, and ensuring git symlinks worked. Paths were made relocatable so CMakeUserPresets.json files could be shared across machines.

Phase 4: Cross-Platform Polish (August 2025)

fc2aa2d6 build: external include directories are relocatable
21c206b9 build: bootstrap vscode run configurations
d2f9c204 build: Visual Studio run configurations
0ca523e7 build: bootstrap supports default Visual Studio tool paths on Windows
4b79ef41 build(bootstrap): probe vcvarsall environment
4d705c96 build(bootstrap): ensure git symlinks
524e7923 build(bootstrap): visual studio run configurations and tasks
94a5b799 build(bootstrap): remove dependency build directories after installation

September and October added developer tooling: LLDB data formatters for Clang and MrDocs symbols, pretty printer configurations, libcxx hardening mode, and the style guide documentation.
Phase 5: Developer Tooling (September–October 2025)

fc98559a build(bootstrap): include pretty printers configuration
069bd8f4 feat(lldb): LLDB data formatters
1b39fdd7 fix(lldb): clang ast formatters
988e9ebc build(bootstrap): config info for docs
f48bbd2f build: bootstrap enables libcxx hardening mode
5e16e3fa Fix support for clang cl-mode driver (#1069)

By December, the monolithic 2,700-line bootstrap.py was refactored into a proper Python package under util/bootstrap/ with 20+ modules organized by concern: core/ (platform detection, options, UI), configs/ (IDE run configurations), presets/ (CMake preset generation), recipes/ (dependency building), and tools/ (compiler detection). The package also includes its own test suite, which means one person changing the bootstrap script for their platform is not going to break it for someone else on a different platform.

Phase 6: Modularization (November–December 2025)

0d4a8459 build(bootstrap): modularize recipes
7ba4699b build(bootstrap): transition banner
99d61207 build(bootstrap): handle empty input and “none” in prompt retry
e3b3fd02 build(bootstrap): convert script into package structure

In March 2026, the bootstrap script replaced the custom CI dependency scripts. This was a major milestone: users, developers, and CI now all use the same tool. CI was simplified significantly because the dependency steps are no longer custom shell commands maintained separately. And because CI runs the bootstrap on every push, the script itself is continuously tested across all platforms. If the bootstrap breaks on any platform, CI catches it immediately.

Phase 7: CI Integration (2026)

6cee4af2 use system libs by default (#1077)
9b4fafbf ci: dependency steps use bootstrap script

Key Design Decisions

Several technical challenges required careful design. Here are the most interesting ones.

Flag propagation. Not all flags should reach all dependencies, and the propagation rules vary per flag type and per dependency. Some sanitizers require all dependencies to be instrumented, while others only need compile-time checks. Build type does not always propagate (libxml2 is always built as Release). Compiler paths always propagate. The script evaluates each dependency individually and checks ABI compatibility before deciding whether to honor or coerce the build type.

Windows ABI handling. On MSVC, Debug and Release are ABI-incompatible at the CRT level. When the script detects a mismatch, it coerces the dependency build to “OptimizedDebug” (Debug ABI with /O2 optimization). This is different from RelWithDebInfo, which uses the Release ABI with debug symbols and will not link with a Debug MrDocs.

Cross-platform compiler detection. On Linux, compiler detection is straightforward. On macOS with Homebrew Clang, the script detects and injects the correct llvm-ar, llvm-ranlib, ld.lld, and libc++ paths, which are not on the default search path. On Windows, the script locates Visual Studio via vswhere.exe, runs vcvarsall.bat with debug output, and parses the environment variables into Python for all subsequent CMake calls.

CMake preset generation. After building dependencies, the script generates a CMakeUserPresets.json with all dependency paths, compiler configuration, and platform conditions. Paths are made relocatable by replacing absolute prefixes with CMake variables (${sourceDir}, ${sourceParentDir}, $env{HOME}).

IDE run configurations. The script generates ready-to-use configurations for CLion, VS Code, and Visual Studio: building and debugging MrDocs, running tests, generating documentation, refreshing dependencies, generating config info and YAML schemas, validating XML output, running MrDocs on Boost libraries (auto-discovered), and reformatting source files. CMake custom commands can create build targets, but you cannot debug them from the IDE.

Recipe system. Dependencies are defined as JSON recipe files with source URLs, build steps, and dependency relationships. The bootstrap topologically sorts them and builds them in order. Each recipe tracks its state with a stamp file (recipe version, git ref, platform, build parameters). If any parameter changes, the dependency is rebuilt. The stamp system also generates CI cache keys like llvm-abc1234-release-ubuntu-24.04-clang-19-ASan.

Refresh command. Because of the stamp system, a developer can run the bootstrap with --refresh-all at any time. The script re-evaluates all stamps and rebuilds only the dependencies that are out of date with whatever configurations are needed. This makes updating dependencies after a configuration change (new sanitizer, different compiler, updated LLVM commit) a single command rather than a manual process of figuring out which dependencies need rebuilding.

What We Learned

Users, developers, and CI now all use the same tool. Users get a one-liner installation. Developers get IDE run configurations and debugger integration. CI gets non-interactive mode with sanitizer support. The exact same code path that builds dependencies on a developer’s laptop now builds dependencies in CI.

Separation of concerns. When your project’s requirements are complex enough (multiple build types, sanitizer variants, cross-platform quirks, heavy dependencies like LLVM), a custom script that owns the entire dependency lifecycle is simpler than trying to make a general-purpose tool handle every edge case. Existing tools solve the general case well. Our specific combination of requirements needed something tailored.

C++ has no unified build workflow. Every platform has its own conventions for finding compilers, setting up environments, and linking libraries. Just finding and setting up MSVC from a script is a project in itself.

New contributors can start working immediately. Before the bootstrap, getting a working build could take days. Now it takes a single command, and the IDE configurations are included.
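The recipe ordering and stamp mechanism from the Key Design Decisions can be sketched in a few lines. This is a simplified illustration, not the actual MrDocs bootstrap code; the recipe fields and names are hypothetical:

```python
import hashlib
import json

def stamp(recipe):
    """Hash the parameters whose change should trigger a rebuild."""
    key = json.dumps(
        {k: recipe[k] for k in ("name", "ref", "platform", "build")},
        sort_keys=True,
    )
    return hashlib.sha256(key.encode()).hexdigest()[:12]

def topo_order(recipes):
    """Depth-first topological sort: dependencies build before dependents.
    (Cycle detection is omitted for brevity.)"""
    order, seen = [], set()
    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in recipes[name].get("deps", ()):
            visit(dep)
        order.append(name)
    for name in recipes:
        visit(name)
    return order

recipes = {
    "mrdocs": {"deps": ["llvm", "duktape", "lua", "libxml2"]},
    "llvm": {"deps": []},
    "duktape": {"deps": []},
    "lua": {"deps": []},
    "libxml2": {"deps": []},
}
print(topo_order(recipes))  # every dependency precedes "mrdocs"

llvm = {"name": "llvm", "ref": "abc1234", "platform": "ubuntu-24.04",
        "build": {"type": "Release", "sanitizer": "ASan"}}
print(stamp(llvm))  # changes whenever ref, platform, or build params change
```

Comparing a recipe's current stamp against the one stored on disk is what lets a refresh rebuild only what is out of date, and the same hash doubles as a CI cache key.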
We still have small glitches as new compilers and platforms appear, but each fix is a localized change in one module rather than a cross-cutting update to five independent workflows. The complete bootstrap package is available in the MrDocs repository.📝The C++ Alliance
Tuesday, April 14, 2026
Elegant D by Walter Bright - Opening Keynote - D Language Symposium 2026 - Talk 1 of 8🎥Mike Shah
Let CTest discover GTest Test Cases🎥Refactoring Bitcoin
Why was there a red telephone at every receptionist desk?
Not a direct line to Bill Gates's office.📝The Old New Thing
CTRACK: C++ Performance Tracking and Bottleneck Discovery - Grischa Hauser - CppCon 2025🎥CppCon
MSVC Build Tools Version 14.51 Release Candidate Now Available
Try out the final preview of MSVC Build Tools v14.51📝C++ Team Blog
What C++ Developers Fear Most #cplusplus #concurrency #mindblown🎥CppOnline
ActiViz on ARM64
Announcing ARM64 support for ActiViz📝Kitware Inc
Open-Enrollment classes in 2026
The year 2026 is already four months old. I hope you had a good start to this year. If you're interested in working with me at one of my public classes, here is what you need to know. I will give a one-day online workshop on Safe and Efficient C …📝AndreasFertig.com
Monday, April 13, 2026
Suspend and Resume: How C++20 Coroutines Actually Work - Lieven de Cock - C++Online 2026🎥CppOnline
Understanding CTest's Entry Point🎥Refactoring Bitcoin
Finding a duplicated item in an array of N integers in the range 1 to N − 1
Taking advantage of special characteristics of the array.📝The Old New Thing
C++ Weekly - Ep 528 - Protecting From Fallthrough🎥Jason Turner
Persistence Squared: Persisting Persistent Data Structures - Juan Pedro Bolivar Puente - CppCon 2025🎥CppCon
C++26 Reflection Is Insane #cplusplus #coding🎥CppOnline
Building Better Software through Cross-Functional Collaboration - Matt Morton - ADC 2025🎥audiodevcon
std::pmr::generator, a generator without heap allocation
C++23 std::generator heap-allocates the coroutine frame by default. With std::pmr::generator you can run the generator with a stack arena instead.📝Engineering the Craft
What I learned from improving Unity's Mono codegen, part 2
My current sidequest is to improve the codegen for Unity games (and the editor) running on Mono. This post is a continuation of last week’s post about this journey. To LLVM or not to LLVM? As demonstrated last time, Mono’s codegen is often lackluster. The first decision I had to...📝Sebastian Schöner
Can we finally use C++ Modules in 2026?
Kinda? Maybe? It's complicated.📝Mathieu Ropert
Sunday, April 12, 2026
Get a List of Tests from the GTest Binary🎥Refactoring Bitcoin
April's Overload Journal has been published.
The April 2026 ACCU Overload journal has been published and should arrive at members' addresses in the next few days. Overload 192 and previous issues of Overload can be accessed via the Journals menu.📝ACCU
Dockable Editor Panels | Devlog #6 | Pard Engine🎥PardCode
The concurrency skill everyone needs #programming #dev🎥CppOnline
Saturday, April 11, 2026
Speed of Thought🎥Matt Godbolt
Watch Your Threads Fall Into Perfect Sync #cpp #concurrency #cpp20🎥CppOnline
Automatically add Tests by Parsing GTest Sources🎥Refactoring Bitcoin
Lecture 21. Multithreaded Queues (MIPT, 2025-2026).🎥Konstantin Vladimirov
Change of Plans: Building a C++ SaaS in 2 Weeks vertical🎥Kea Sigma Delta
Change of Plans: Building a C++ SaaS in 2 Weeks🎥Kea Sigma Delta
`auto{x} != auto(x)`
Recently it was asked: What’s the difference between the expressions auto(x) and auto{x} in C++23?📝Arthur O’Dwyer
Preventing Integer Overflow in Physical Computations
Integers overflow. That is not a controversial statement. What is surprising is how easily overflow can hide behind the abstraction of a units library. Most developers immediately think of explicit or implicit scaling operations — calling .in(unit) to convert a quantity, constructing a quantity from a different unit, or assigning between quantities with different units. These are indeed places where overflow can occur, and the library cannot prevent it at compile time when the values are only known at runtime. But at least these operations are visible in your code: you wrote the conversion, you asked for the scaling, and you can reason about whether the multiplication or division might overflow your integer type.
The far more insidious problem is what happens when you don't ask for a conversion. When you write 1 * m + 1 * ft, the library must automatically convert both operands to a common unit before performing the addition. That conversion — which you never explicitly requested — involves multiplication or division by scaling factors. With integer representations, those scaling operations can overflow silently, producing garbage results that propagate through your calculations undetected.
No compile-time programming can prevent this. The values are only known at runtime. But very few libraries provide proper tools to detect it. This article explains why that limitation is real, how other libraries have tried to work around it, and what mp-units provides to close the gap as tightly as the language allows.📝mp-units
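To make the failure mode concrete, here is a sketch in Python of what a checked unit-scaling operation has to guard against. This is not mp-units code; the helper names are hypothetical, and the wraparound emulates typical two's-complement behavior (in C++, signed overflow is formally undefined):

```python
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def wrap_i32(x):
    """Emulate 32-bit two's-complement wraparound (the silent-garbage case)."""
    return (x - INT32_MIN) % 2**32 + INT32_MIN

def checked_scale(value, num, den=1):
    """Scale an int32 quantity by num/den, raising instead of wrapping."""
    scaled = value * num // den
    if not INT32_MIN <= scaled <= INT32_MAX:
        raise OverflowError(f"{value} * {num}/{den} does not fit in int32")
    return scaled

# Converting 3,000,000 m to mm (factor 1000) exceeds INT32_MAX = 2,147,483,647:
print(wrap_i32(3_000_000 * 1000))  # → -1294967296, silent garbage
print(checked_scale(2_000, 1000))  # → 2000000, fits fine
try:
    checked_scale(3_000_000, 1000)  # the checked path refuses to wrap
except OverflowError as err:
    print("detected:", err)
```

The point of the article is exactly this gap: the unchecked path produces a plausible-looking negative number, while a runtime check is the only place the error can be caught at all.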