Sometimes working on a Drupal contributed module requires making changes to the module’s composer.json so that you can update a dependency. This blog post looks at how to accomplish that in a local development environment.
At Tag1, we believe in proving AI within our own work before recommending it to clients. This post is part of our AI Applied content series, where team members share real stories of how they're using Artificial Intelligence and the insights and lessons they learn along the way. Here, Charles Tanton (Software Engineer) explores how AI supported his improvements to the Optional Field Widget Ordering issue in Drupal core by accelerating progress on a long-stalled, 10-year-old problem.
This project focused on using AI to help resolve a long-standing Drupal core issue: Allow multiple field widgets to not use tabledrag (#2264739). The goal was simple but impactful: make it possible to disable drag-and-drop ordering for specific multi-value field widgets, instead of forcing all of them to be orderable. For years, the only practical option was a brittle patch that touched many core files and often broke on core updates, creating recurring maintenance work that nobody enjoyed. By pushing this core fix forward with AI as a coding partner, the aim is to remove that maintenance burden for good and give site builders more control over the form UX.
The core problem was that Drupal automatically renders multi-value fields in a table with tabledrag behavior, even when reordering is not needed. That table-based structure makes theming harder, complicates responsive layouts, and adds JavaScript overhead for no real benefit in many use cases. Our only workaround was a large, fragile patch from this very issue that had to be kept in sync across Drupal core releases by hand.
This AI Applied project set out to change that by getting a clean, configurable solution into core. The work included writing an improved merge request, updating the issue summary, and adding thorough test coverage, all geared toward making the change easy to understand, review, and eventually commit.
The first step was using AI to explore alternatives to the existing proposed fix and to see if there was a better architectural direction. After looking at the options together with AI, we confirmed that the original "orderable" setting approach was still the best fit, and then focused on strengthening it, especially around configuration schema. A key enhancement was the introduction of a shared field.widget.settings.base config schema so widgets could inherit the new orderable boolean cleanly instead of each re-defining it.
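As an illustrative sketch only (the exact schema text in the merge request may differ, and the widget type shown here is just an example), the shared base type and a widget schema inheriting from it could look something like:

```yaml
# Hypothetical sketch of the shared base config schema; the actual
# merge request may define this differently.
field.widget.settings.base:
  type: mapping
  label: 'Common field widget settings'
  mapping:
    orderable:
      type: boolean
      label: 'Whether users can reorder values via drag-and-drop'

# A widget's settings schema can then inherit the orderable key
# instead of re-defining it:
field.widget.settings.string_textfield:
  type: field.widget.settings.base
  label: 'Text field widget settings'
  mapping:
    size:
      type: integer
      label: 'Size of textfield'
```

Because config schema types can extend one another by naming a parent type, every multi-value widget picks up the `orderable` boolean from the base definition, which is what keeps the per-widget schemas small.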
Across the project, AI helped with:
theming work on the field-multiple-value-without-order-form template. AI also sped up UI work by letting me paste screenshots into the coding environment so it could adjust CSS more accurately, instead of iterating blindly.
Over time, I shifted from an earlier extension to a smoother Claude Code setup, supported by installing the ddev-claude-code integration so the assistant could run directly inside the DDEV container with GitLab CLI access.
A few workflow patterns turned out to be especially helpful on this project:
Dedicated context folder
I added a .claude folder in the repo to hold plans, reference snippets, and docs like the Drupal config schema guide. This let me carry context between sessions and ask the AI to "open and update" specific plan files instead of re-explaining everything.
Plan-driven development
For larger theming or testing tasks, I asked the AI to first write a plan to a file (for example, @.claude/theming-plan-22-nov.md), then execute it step by step, pausing after each item for review. That structure made it much easier to course-correct early rather than cleaning up after a big batch of code.
Voice prompts for speed
Using dictation for prompts in the terminal helped reduce friction for small, repetitive questions or instructions. It was surprisingly effective for "talking through" next steps in a more natural way.
Deep reasoning prompts
Adding a keyword like "ultrathink" in key prompts encouraged the AI to reason more thoroughly before proposing code, which was particularly useful for tricky config and test design work.
On top of that, the ddev-claude-code installation was straightforward and lives right alongside normal Drupal tooling:
composer config extra.drupal-scaffold.allowed-packages --json --merge '["drupal/claude_code"]'
composer require --dev drupal/claude_code
ddev add-on get FreelyGive/ddev-claude-code
ddev restart
ddev claude
The issue is still open, but the state of the work is very different from when this effort began. The implementation now uses a clean configuration schema pattern, has broad widget coverage, and includes extensive tests and documentation in the issue summary. Remaining tasks are mostly about theming and documentation polish before the merge request is ready for final review and potential commit.
This was my first serious use of AI for coding, so it naturally took longer than it would now that I have more experience. The upside is that it gave me a strong foundation for using AI as a regular programming tool, and it has since become part of my daily workflow.
If I were starting this same issue again today, I would:
Contributions to Drupal core like this one tend to benefit every Drupal site over time. In this case, having an "Orderable" toggle for multi-value widgets will simplify maintenance by removing the need for a large, fragile patch and will improve form UX options for site builders. It's the kind of change that quietly pays off for years through cleaner upgrades and more flexible theming.
More broadly, this project is a concrete example of how teams can use AI to move long-standing open-source issues forward. It is especially valuable where the work involves a mix of architecture decisions, broad test coverage, and tedious but important updates across multiple components.
One of the most important lessons from this project is that expert oversight is essential. I saw multiple cases where the AI made questionable choices or leaned on weak assumptions in the code, which I only caught because I was reading closely and testing manually. Without that attention, the "help" would have turned into extra rework later.
Used well, AI acts as a powerful accelerator: it drafts, refactors, and suggests, while you stay accountable for direction, quality, and correctness. This project helped me build that mindset and gave me the confidence to make AI a normal part of my engineering toolkit. It also reaffirmed how valuable it is to practice with AI on internal or infrastructure-oriented work like this before applying it in higher-risk contexts.
This post is part of Tag1’s AI Applied series, where we share how we're using AI inside our own work before bringing it to clients. Our goal is to be transparent about what works, what doesn’t, and what we are still figuring out, so that together, we can build a more practical, responsible path for AI adoption.
Want to bring practical, proven AI adoption strategies to your organization? Let's start a conversation! We'd love to hear from you.
Image by FranFrank96 from Shutterstock
I don't usually fail at making my life easier, but hey, it's a whole new world lately. To try my hand at LLM during my trial of AI-assisted coding, I wanted to see if I could customize an LLM for a specific task: assigning user credit on Drupal Core issues. Depending on the complexity, activity, and number of contributors involved it can take me anywhere between 30 seconds and 30 minutes to assign credit when I commit an issue to Drupal Core. Maybe I could automate some of it?
Drupal Commerce lets you serve both retail customers and business buyers from a single installation. Same products. Shared checkout flow. Same user experience—just adapted to the relationship. Different users can see different prices, payment options, and catalogs.
You don’t need a separate platform. You don’t need a different domain. You don’t need another way to manage content. Drupal Commerce already has the tools to support both B2C and B2B on the same website, using the same codebase, delivering a unified experience to all of your customers.
Even better, you can build a B2B portal in Drupal without any code. The capabilities are already there in existing features and modules.
On February 26th, I’ll show you how to do it. We’ll walk through building a B2B purchasing portal using Commerce Kickstart as a base. Sign up now.
Drupal core's mail module has been a mess for a long time and has seemingly not kept up with the modernization of the rest of the stack. Using the hook system to send emails feels archaic; therefore, a while ago, we started developing a module that:
We have been using and improving Mail Composer and would love for that work to be reused and further built upon.
Sending an email with Mail Composer is as simple as:
/** @var \Drupal\mail_composer\Manager $manager */
$manager = \Drupal::service('mail_composer.manager');
$manager
->compose()
->setFrom('foobar@foo.bar')
->setTo('foo@bar.bar')
->setSubject('Test subject')
->setBody(['This is the body of the email.'])
->send();

Neat, isn't it?
A close look at Charles Andrew Revkin: how an international upbringing quietly shaped a digital leader at UICC and his impact on global cancer initiatives.
Here are some tips for debugging broken CSS in Drupal.
Meet Meridian, the newest Dripyard theme. We’re really excited about this release, as many hours went into Meridian along with updates to our other themes.
My favorite “feature” of Dripyard themes is flexibility. We market each theme toward a specific vertical, but in practice they are highly versatile. You can easily change the look and feel of an entire site by adjusting color schemes, border radiuses, and imagery.
Pictures are worth a thousand words, so we built our site to showcase multiple demos.
read moreDrupal CMS 2.0 launched January 28. We asked Pam Barone—CTO of Technocrat and Product Owner of Drupal CMS—to talk about what's new and what she's most excited for people to try.
Drupal CMS 1.0 was really a proof of concept, to show that we could create a version of Drupal that bundled all of the best practices that many sites were using, and that the community would come together to make it happen in a short amount of time. We did prove the concept, but the 1.0 release did not represent any major innovations, because we were mostly just packaging functionality and tools that we already had and were familiar with. That is not to downplay the accomplishment at all, because it was a huge leap forward for the project, and it provided the foundation for the next steps.
With 2.0, we are introducing two big new concepts: Drupal Canvas and site templates. These represent another huge leap for the project, each in different ways, as we continue with the strategy to empower marketers to create exceptional digital experiences without relying on developers.
Drupal Canvas! I am so excited about Canvas and can’t wait to get it into the hands of our end users. There were times during the development of 2.0 when I was working in the Canvas editor and I thought, ‘Wow, I’m actually having fun!’ I can’t say I remember thinking that with previous Drupal page building tools.
And it’s not just about end users; one of the goals of 2.0 is to introduce Canvas within the community and showcase its potential. It’s a paradigm shift, and this level of change is always challenging, but after trying it out and getting familiar with the concepts, I think it’ll be clear that it’s worth it.
Site templates are near-feature-complete starting points for Drupal sites based on specific use cases. They provide a content model, some example content, a polished look and feel, as well as the functionality you would expect based on the use case. The first site template – Byte, which is included in Drupal CMS 2.0 – is for a SaaS-based product marketing site. It includes all of the baseline functionality from 1.0, plus Canvas-powered landing pages, a blog, a newsletter signup and contact form, and a new theme with a dark style.
During the development of 1.0, we realized that we couldn’t build something that was both generic and useful. Either we would have to build something simple that would be widely applicable, or we would be making a lot of assumptions about the site’s content model and functionality, and providing things that many users wouldn't want.
We decided that in order to really make it easy to launch sites, we had to provide many different starting points, across many use cases. By identifying the use case and being opinionated about how to solve it, site templates can start you off with 95 percent of what you need to launch.
Of course, that assumes there is a site template for your use case – which means we’re going to need a lot of them. We’re currently working with a group of Drupal agencies who have signed up for a pilot to develop new site templates for the launch of the site template Marketplace.
The most obvious thing is just that it provides marketers with a modern, intuitive visual page builder of the kind that any competitive platform needs to have. Up until now, adopting Drupal meant getting its many benefits but compromising on the user experience, because the page building tools were clunky. With Canvas, that compromise is gone. We can provide the experience that marketers have come to expect.
In some ways it feels like we are playing catch-up, especially since it’s still early (the first release was in December) and there are some big gaps. But it also feels like a new era for Drupal, and the enthusiasm and pace of adoption so far is really encouraging. So I think we don’t really even know yet what changes will come, because when the community is presented with a new way to build cool things, the possibilities are endless.
One of the benefits of using Drupal is that it can be integrated with pretty much anything, and all of the common integrations have modules to make it easier. But they always require some configuration, and it can be tricky to figure out. With recipes, we can add default configuration, and we can prompt for the necessary details, so you don’t have to go hunting around for where to add them.
Drupal CMS 1.0 included two integrations that use the recipe prompt already, for Google Analytics and the AI Assistant. They’re pretty simple in that you are just adding an ID or an API key, but they still are a big improvement over the manual setup.
For 2.0, with site templates, we have the opportunity to include additional integrations that are relevant to the use case and wanted to tackle something a bit more complicated. Byte ships with a newsletter signup that uses a webform out of the box, and has an optional “Recommended add-on” to integrate with Mailchimp. The Mailchimp module already did most of the heavy lifting, but we worked with the maintainers to develop a recipe that configures the module (and its submodules), and once you authenticate your site with Mailchimp, will automatically create signup blocks for each of your audiences. From there, you can add them to any page via the Canvas editor.
We think that easy integrations are going to be really critical to making site templates attractive as an offering, so we are planning to continue working on that.
The initial site templates are very intentionally on the “making easy things less hard” side. Not only is it a totally new concept, but they are leveraging Canvas, which is also new. So we thought that the best chance for success would be to keep it simple and try to really nail the use cases. Once we’ve all built a few, and we’ve gotten feedback from real users, we can move into the more complex sites where Drupal thrives.
Try it now: drupal.org/drupal-cms/trial
Download: drupal.org/download
Learn more: drupal.org/drupal-cms
Twenty-five years in. Still building.
Information gathering, content writing, proofreading, SEO optimization, tag preparation – all these tasks consume a significant portion of the editorial team’s time. What if you could reduce this research time by up to 90% through automated content creation? In this article, I present a practical Drupal setup that uses AI-powered modules to generate editorial content with minimal manual input. This includes automatic information retrieval based on the title, tag generation, content creation, and detailed data fetching – all directly in your CMS, without switching between different tools. Read on or watch the episode from the Nowoczesny Drupal series.
We're excited to announce DDEV v1.25.0, featuring a completely revised Windows installer, XHGui as the default profiler, and updated system defaults including a move to Debian Trixie.
This release represents contributions from the entire DDEV community, with your suggestions, bug reports, code contributions, and financial support making it possible.
Default versions updated:
These updates mostly affect new projects. Existing projects typically continue to work without changes.
- ddev-webserver and ddev-ssh-agent

Major new features:
- ddev share command with a new cloudflared share provider for free sharing options. See new docs.
- ddev utility xdebug-diagnose helps troubleshoot Xdebug issues. See (draft) Xdebug Understanding and Troubleshooting
- ddev utility mutagen-diagnose helps debug Mutagen issues. See (draft) Mutagen Functionality and Debugging
- ddev snapshot now uses zstd instead of gzip for significantly faster exports and restores, thanks @deviantintegral
- ddev-frankenphp with many improvements. See updated (draft) Using FrankenPHP with DDEV.
- Only .ddev/traefik/config/<projectname>.yaml is used (all other files are ignored)

After upgrading to v1.25.0, follow these steps:
- ddev poweroff (DDEV will prompt you for this)
- ddev config --auto on each project to update to current configuration
- ddev add-on list --installed to see your add-ons, then update them as needed
- ddev delete images to remove old Docker image versions

If your project has custom Dockerfiles or uses webimage_extra_packages and ddev start shows any problems, you may have a little work to do, but most projects are unaffected.
What to do: Test your project after upgrading. See Debian Trixie release notes for known issues.
Note: DDEV already includes the tzdata-legacy package to handle removed timezones in Debian Trixie, so no action is needed for timezone-related changes.
If you use XHProf profiling, it now defaults to XHGui mode instead of prepend mode.
What to do: If you prefer the previous prepend mode, run:
ddev config global --xhprof-mode=prepend
If you use custom nginx modules, the package names and module loading have changed. DDEV now uses nginx bundled with Debian Trixie instead of maintaining an extra dependency on the nginx.org repository.
What to do: Update your nginx module configuration.
Example: Adding NJS (JavaScript) support to nginx in DDEV v1.25.0+:
ddev config --webimage-extra-packages="libnginx-mod-http-js,libnginx-mod-stream,libnginx-mod-stream-js" --ddev-version-constraint='>=v1.25.0'
cat <<'EOF' > .ddev/web-build/Dockerfile.nginx
RUN sed -i '1i load_module modules/ngx_stream_module.so;\nload_module modules/ngx_http_js_module.so;\nload_module modules/ngx_stream_js_module.so;\n' /etc/nginx/nginx.conf
EOF
If you use these commands, you'll need to switch:
- ddev nvm: Switch to nodejs_version or install the ddev-nvm add-on
- ddev service: Use ddev add-on to install/remove services

ddev config flags

If you use these flags in scripts, update them:
- --mutagen-enabled → --performance-mode=mutagen
- --upload-dir → --upload-dirs
- --http-port → --router-http-port
- --https-port → --router-https-port
- --mailhog-port → --mailpit-http-port
- --mailhog-https-port → --mailpit-https-port
- --projectname → --project-name
- --projecttype, --apptype → --project-type
- --sitename → --project-name
- --image-defaults → --web-image-default

If you have custom Traefik configuration, note that:
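As a rough sketch of updating such a script mechanically (the file name and its contents below are made up for illustration, and only two of the renames are shown), a sed pass can rewrite the old flag names in place:

```shell
# Hypothetical local script using pre-v1.25 flag names
# (file name and contents are illustrative only).
printf 'ddev config --mutagen-enabled --http-port=8080\n' > /tmp/ddev-setup.sh

# Rewrite the renamed flags in place.
sed -i \
  -e 's/--mutagen-enabled/--performance-mode=mutagen/' \
  -e 's/--http-port/--router-http-port/' \
  /tmp/ddev-setup.sh

cat /tmp/ddev-setup.sh
# → ddev config --performance-mode=mutagen --router-http-port=8080
```

Because --http-port is a prefix of nothing else in this example, a plain substitution is safe here; for scripts that mix several of the renamed flags, apply the longer flag names first.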
- Only `.ddev/traefik/config/<projectname>.yaml` is used (other files are ignored)
- Global custom configuration lives in `$HOME/.ddev/traefik/custom-global-config/`

What to do if you have extra Traefik files: consolidate them into `.ddev/traefik/config/<projectname>.yaml` and remove the `#ddev-generated` comment from it.

Note: ddev-router no longer stops automatically when the last project stops. Use `ddev poweroff` to stop it manually.
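Consolidating an extra Traefik file could look like the sketch below. The helper name and the file paths passed to it are placeholders; adjust them for your project.

```shell
# Sketch: fold an extra Traefik file into the single file DDEV reads,
# then drop the #ddev-generated marker so DDEV won't overwrite it.
merge_traefik_config() {
  extra="$1"
  main="$2"
  cat "$extra" >> "$main"          # append the extra config
  sed -i '/#ddev-generated/d' "$main"  # remove the generated marker (GNU sed)
}
```

Usage might be `merge_traefik_config .ddev/traefik/config/extra.yaml .ddev/traefik/config/mysite.yaml`, where both file names are examples.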
If you're on traditional Windows (not WSL2): The installer may prompt you to uninstall the previous system-wide installation before installing the new per-user version.
This release includes many other improvements:
- `ddev add-on list` and `ddev add-on search`
- Tab completion for `ddev add-on get <TAB>`
- `$HOME/.ddev/homeadditions/.ssh/config.d` support

See the full release notes for complete details.
From the entire team, thanks for using, promoting, contributing, and supporting DDEV!
If you have questions, reach out in any of the support channels.
Follow our blog, Bluesky, LinkedIn, Mastodon, and join us on Discord. Sign up for the monthly newsletter.
This article was edited and refined with assistance from Claude Code.
TL;DR: DDEV supports Podman and Docker Rootless as of v1.25.0. Podman and Docker Rootless are a bit more trouble than the recommended traditional Docker providers and have some serious trade-offs. With Podman on macOS you can't use the normal default ports 80 and 443. With Docker Rootless on Linux you can't bind-mount directories, so the entire project has to be Mutagen-synced. But Podman Rootless on Linux is pretty solid.
Jump to setup instructions: Linux/WSL2 · macOS · Windows
Note: This support is experimental. Report issues on the DDEV issue tracker.
A common misconception is that Podman is the only open-source alternative to Docker Desktop. This is not true. There are several fully open-source alternatives available on every platform:
All of these work with DDEV. The main reason to choose Podman specifically is if your organization forbids Docker entirely or if you want rootless operation by default.
Podman is rootless by default, making it the simplest option for secure container environments. Traditional Docker requires root daemons, which can be a security concern in corporate environments with strict policies. (Note that DDEV is targeted at local development, where there are few risks of specialized attacks using this vector anyway.)
Podman's rootless approach runs the daemon without elevated privileges:
While DDEV already runs containers as unprivileged users, Podman eliminates the need for a root daemon entirely.
Docker Rootless provides the same security benefits as Podman Rootless while maintaining full Docker compatibility. It runs the daemon without root privileges, offering:
Unlike Podman which is rootless by default, Docker Rootless requires special setup to enable. Choose this option if you want to stay with Docker but need rootless security.
The primary focus for this article is Linux and WSL2 (we have test coverage for Linux only for now). Most features and configurations are well-tested on these platforms.
Before diving into setup, consider whether you need an alternative to traditional Docker:
| Runtime | Why would you do this? | Key trade-offs | Performance | Setup | Recommendation |
|---|---|---|---|---|---|
| Traditional Docker | Standard, widely-used option | None | Excellent | Simple | Recommended for most users |
| Docker Rootless | Security requirement for rootless daemon | Must use `--no-bind-mounts` (everything via Mutagen), can't use default workflow | Moderate (Mutagen overhead) | Moderate | Only if rootless security is required |
| Podman Rootful | Organization forbids Docker | Slower than Docker, different behavior | Slower than Docker | Moderate | Only if Docker not allowed |
| Podman Rootless | Organization forbids Docker + want rootless security | May need sysctl changes for ports <1024, slower than Docker | Slower than Docker | Moderate | Only if Docker not allowed and rootless required |
Bottom line: Stick with traditional Docker unless organizational policy or security requirements force you to use an alternative. The alternatives work, but have significant trade-offs.
Install Podman using your distribution's package manager. See the official Podman installation guide for Linux.
# Ubuntu/Debian
sudo apt-get update && sudo apt-get install podman
# Fedora
sudo dnf install --refresh podman
Note: Some distributions ship outdated Podman versions; Ubuntu 24.04, for example, has Podman 4.9.3. We recommend Podman 5.0 or newer, because our automated tests did not succeed with Podman 4.x, but you can still use 4.x and ignore the warning on `ddev start`.
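A quick way to check which major version your distribution provides is to parse the version string. This is a sketch: the sample string stands in for real `podman --version` output, and the parsing assumes the usual "podman version X.Y.Z" format.

```shell
# Sketch: extract the major version from `podman --version`-style output
# and warn when it is below 5.
podman_major() {
  printf '%s\n' "$1" | sed 's/[^0-9]*\([0-9][0-9]*\).*/\1/'
}

sample="podman version 4.9.3"   # stand-in for: podman --version
if [ "$(podman_major "$sample")" -lt 5 ]; then
  echo "Podman 5.0+ recommended; 4.x works but ddev start will warn"
fi
```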
You can also install Podman Desktop if you prefer a GUI.
For more information, see the Podman tutorials.
Podman provides a Docker-compatible API, which means you can use the Docker CLI as a frontend for Podman. This approach offers several benefits:
- Keep using familiar `docker` commands while Podman handles the actual container operations
- DDEV itself needs no changes, since it drives everything through the `docker` command

Install only the CLI:
# Ubuntu/Debian
sudo apt-get update && sudo apt-get install docker-ce-cli
# Fedora
sudo dnf install --refresh docker-ce-cli
Note: You don't need to install docker-ce (the Docker engine).
This is the recommended configuration for most users.
Prepare the system by configuring subuid and subgid ranges and enabling userns options; see the Arch Linux Wiki for details:
# Add subuid and subgid ranges if they don't exist for the current user
grep "^$(id -un):\|^$(id -u):" /etc/subuid >/dev/null 2>&1 || sudo usermod --add-subuids 100000-165535 $(whoami)
grep "^$(id -un):\|^$(id -u):" /etc/subgid >/dev/null 2>&1 || sudo usermod --add-subgids 100000-165535 $(whoami)
# Propagate changes to subuid and subgid
podman system migrate
# Debian requires setting unprivileged_userns_clone
if [ -f /proc/sys/kernel/unprivileged_userns_clone ]; then
if [ "1" != "$(cat /proc/sys/kernel/unprivileged_userns_clone)" ]; then
echo 'kernel.unprivileged_userns_clone=1' | sudo tee -a /etc/sysctl.d/60-rootless.conf
sudo sysctl --system
fi
fi
# Fedora requires setting max_user_namespaces
if [ -f /proc/sys/user/max_user_namespaces ]; then
if [ "0" = "$(cat /proc/sys/user/max_user_namespaces)" ]; then
echo 'user.max_user_namespaces=28633' | sudo tee -a /etc/sysctl.d/60-rootless.conf
sudo sysctl --system
fi
fi
# Allow privileged port access if needed
if [ -f /proc/sys/net/ipv4/ip_unprivileged_port_start ]; then
if [ "1024" = "$(cat /proc/sys/net/ipv4/ip_unprivileged_port_start)" ]; then
echo 'net.ipv4.ip_unprivileged_port_start=0' | sudo tee -a /etc/sysctl.d/60-rootless.conf
sudo sysctl --system
fi
fi
Enable the Podman socket and verify it's running (Podman socket activation documentation):
systemctl --user enable --now podman.socket
# You should see `/run/user/1000/podman/podman.sock` (the number may vary):
ls $XDG_RUNTIME_DIR/podman/podman.sock
# You can also check the socket path with:
podman info --format '{{.Host.RemoteSocket.Path}}'
Configure Docker API to use Podman (Podman rootless tutorial):
# View existing contexts
docker context ls
# Create Podman rootless context
docker context create podman-rootless \
--description "Podman (rootless)" \
--docker host="unix://$XDG_RUNTIME_DIR/podman/podman.sock"
# Switch to the new context
docker context use podman-rootless
# Verify it works
docker ps
Proceed with DDEV installation.
Podman Rootless is significantly slower than Docker. See these resources:
To improve performance, install fuse-overlayfs and configure the overlay storage driver:
Install fuse-overlayfs:
# Ubuntu/Debian
sudo apt-get update && sudo apt-get install fuse-overlayfs
# Fedora
sudo dnf install --refresh fuse-overlayfs
Configure storage:
mkdir -p ~/.config/containers
cat << 'EOF' > ~/.config/containers/storage.conf
[storage]
driver = "overlay"
[storage.options.overlay]
mount_program = "/usr/bin/fuse-overlayfs"
EOF
Warning: If you already have Podman containers, images, or volumes, you'll need to reset Podman for this change to take effect:
podman system reset
This removes all existing containers, images, and volumes (similar to docker system prune -a).
Rootless Podman is recommended. Only use rootful Podman if your setup specifically requires it.
To configure rootful Podman:
1. Create a `podman` group (`sudo groupadd podman`) and add your user to it (`sudo usermod -aG podman $USER`).
2. Enable the socket: `sudo systemctl enable --now podman.socket`
3. Create a context: `docker context create podman-rootful --description "Podman (root)" --docker host="unix:///var/run/podman/podman.sock"`
4. Switch to it: `docker context use podman-rootful`

Docker Rootless on Linux offers rootless security with full Docker compatibility.
Follow the official Docker Rootless installation guide.
Configure system:
# Allow privileged port access if needed
if [ -f /proc/sys/net/ipv4/ip_unprivileged_port_start ]; then
if [ "1024" = "$(cat /proc/sys/net/ipv4/ip_unprivileged_port_start)" ]; then
echo 'net.ipv4.ip_unprivileged_port_start=0' | sudo tee -a /etc/sysctl.d/60-rootless.conf
sudo sysctl --system
fi
fi
# Allow loopback connections (needed for working Xdebug)
# See https://github.com/moby/moby/issues/47684#issuecomment-2166149845
mkdir -p ~/.config/systemd/user/docker.service.d
cat << 'EOF' > ~/.config/systemd/user/docker.service.d/override.conf
[Service]
Environment="DOCKERD_ROOTLESS_ROOTLESSKIT_DISABLE_HOST_LOOPBACK=false"
EOF
Enable the Docker socket, and verify it's running:
systemctl --user enable --now docker.socket
# You should see `/run/user/1000/docker.sock` (the number may vary):
ls $XDG_RUNTIME_DIR/docker.sock
Configure Docker API to use Docker rootless:
# View existing contexts
docker context ls
# Create rootless context if it doesn't exist
docker context inspect rootless >/dev/null 2>&1 || \
docker context create rootless \
--description "Rootless runtime socket" \
--docker host="unix://$XDG_RUNTIME_DIR/docker.sock"
# Switch to the context
docker context use rootless
# Verify it works
docker ps
Proceed with DDEV installation.
Docker Rootless requires no-bind-mounts mode
Docker Rootless has a limitation with bind mounts that affects DDEV. You must enable no-bind-mounts mode:
ddev config global --no-bind-mounts=true
Why this is needed:
Docker Rootless sets ownership for bind mounts to root inside containers. This is a known issue:
The root user inside the container maps to your host user, but many services will not run as root.
Podman Rootless fixes this with the --userns=keep-id option, which keeps user IDs the same. Docker Rootless does not have this option.
The no-bind-mounts mode fixes this by using Mutagen for the web container.
macOS users can use Podman and Podman Desktop, but setup has its own challenges. Docker Rootless is not available on macOS.
| Runtime | Why would you do this? | Key trade-offs | Performance | Setup | Recommendation |
|---|---|---|---|---|---|
| Traditional Docker | Standard, widely-used option | None | Excellent | Simple | Recommended for most users |
| Podman | Avoid Docker entirely (organizational policy) | Cannot use ports 80/443 (must use 8080/8443 instead), different behavior | Slower than Docker | Moderate | Only if Docker not allowed |
Bottom line: Use traditional Docker (OrbStack, Docker Desktop, Lima, Colima, or Rancher Desktop) unless your organization forbids it. The inability to use standard ports 80/443 with Podman creates a significantly different development experience.
Install Podman using Homebrew:
brew install podman
Or install Podman Desktop if you prefer a GUI.
For more information, see the official Podman installation guide for macOS and Podman tutorials.
Podman provides a Docker-compatible API, which means you can use the Docker CLI as a frontend for Podman. This approach offers several benefits:
- Keep using familiar `docker` commands while Podman handles the actual container operations
- DDEV itself needs no changes, since it drives everything through the `docker` command

Install only the CLI:

brew install docker
Handle privileged ports (<1024):
Important: Podman on macOS cannot bind to privileged ports (80/443). You must configure DDEV to use unprivileged ports:
ddev config global --router-http-port=8080 \
--router-https-port=8443
This means your DDEV projects will be accessible at https://yourproject.ddev.site:8443 instead of the standard https://yourproject.ddev.site.
Note: switching to rootful mode with podman machine set --rootful --user-mode-networking=false doesn't help with privileged ports because the --user-mode-networking=false flag is not supported on macOS (it's only available for WSL).
Initialize and start the Podman machine:
# check `podman machine init -h` for more options
podman machine init --memory 8192
podman machine start
Check for the Podman socket path using podman machine inspect:
~ % podman machine inspect
...
"ConnectionInfo": {
"PodmanSocket": {
"Path": "/var/folders/z5/lhpyjf2n7xj2djl0bw_7kb3m0000gn/T/podman/podman-machine-default-api.sock"
},
"PodmanPipe": null
},
...
Configure Docker CLI to use Podman. Choose one of two approaches:
Option 1: Create a Docker context (recommended, more flexible):
# Create Podman context (path to socket may vary)
# Use the socket path from `podman machine inspect` output
docker context create podman-rootless \
--description "Podman (rootless)" \
--docker host="unix:///var/folders/z5/lhpyjf2n7xj2djl0bw_7kb3m0000gn/T/podman/podman-machine-default-api.sock"
# Switch to the new context
docker context use podman-rootless
# Verify it works
docker ps
This approach uses Docker contexts to switch between different container runtimes without modifying system sockets. This is more flexible if you want to use multiple Docker providers.
Option 2: Use the default Docker socket (simpler, but less flexible):
# Install podman-mac-helper
# Use the command from `podman machine start` output
sudo /opt/homebrew/Cellar/podman/5.7.1/bin/podman-mac-helper install
podman machine stop
podman machine start
# Verify it works
docker ps
Proceed with DDEV installation.
Windows users can use Podman Desktop, but setup has its own challenges. Docker Rootless is not available on traditional Windows (it works in WSL2, see the Linux and WSL2 section).
| Runtime | Why would you do this? | Key trade-offs | Performance | Setup | Recommendation |
|---|---|---|---|---|---|
| Traditional Docker | Standard, widely-used option | None | Excellent | Simple | Recommended for most users |
| Podman | Avoid Docker entirely (organizational policy) | Different behavior, less mature on Windows | Slower than Docker | Moderate | Only if Docker not allowed |
Bottom line: Use traditional Docker (Docker Desktop or alternatives) unless your organization forbids it. Podman on Windows works but is less mature than on Linux.
Install Podman Desktop, which includes Podman.
Alternatively, install Podman directly following the official Podman installation guide for Windows.
For more information, see the Podman tutorials.
The setup and configuration follow similar patterns to the Linux/WSL2 setup, but with Podman Desktop managing the VM for you. Follow the Linux and WSL2 instructions.
You can run Docker and Podman sockets simultaneously and switch between them using Docker contexts.
For example, here's a system with four active Docker contexts:
$ docker context ls
NAME DESCRIPTION DOCKER ENDPOINT
default Current DOCKER_HOST based configuration unix:///var/run/docker.sock
podman Podman (rootful) unix:///var/run/podman/podman.sock
podman-rootless * Podman (rootless) unix:///run/user/1000/podman/podman.sock
rootless Rootless runtime socket unix:///run/user/1000/docker.sock
Switch between them with:
docker context use "<context-name>"
Note: Running both Docker and Podman in rootful mode at the same time may cause network conflicts. See Podman and Docker network problem on Fedora 41.
DDEV automatically detects your active container runtime. To switch:
Stop DDEV projects:
ddev poweroff
Switch Docker context or change the DOCKER_HOST environment variable
Start your project:
ddev start
| Feature | Standard Docker | Docker Rootless | Podman Rootful | Podman Rootless |
|---|---|---|---|---|
| Platform Support | All | Linux, WSL2 | All | All |
| Rootless Daemon | ❌ | ✅ | ❌ | ✅ |
| Has automated testing in DDEV | ✅ | ✅ | ❌ | ✅ |
| Mutagen | ✅ | ✅ | ✅ | ✅ |
| Bind Mounts | ✅ | ❌, requires no-bind-mounts | ✅ | ✅ (with `--userns=keep-id`) |
| Performance | Excellent | Moderate (because of no-bind-mounts) | Slow compared to Docker | Slow compared to Docker |
| Privileged Ports (<1024) | Works by default | Requires sysctl config | Works by default | Requires sysctl config or may not work |
| Setup Complexity | Simple | Moderate | Moderate | Moderate |
| Maturity | Most mature | Experimental | Experimental | Experimental |
| Recommended For | Most users | Docker users needing rootless | Organizations that forbid Docker | Organizations that forbid Docker |
Use one of the many standard Docker providers if no organizational policy or security requirement rules them out. This is the recommended option for the vast majority of users.
Use Podman Rootless if your organization forbids Docker and you want rootless operation.
Use Podman Rootful if your organization forbids Docker and your setup specifically requires rootful mode.
Use Docker Rootless if you need a rootless daemon but want to keep full Docker compatibility.
Supporting Podman and Docker Rootless required major changes to DDEV's Docker integration:
- Context handling called `docker context inspect` directly, which doesn't work with Podman. We switched to using the `docker/cli` library to handle context operations properly.
- The `ddev auth ssh` command used to call `docker run` directly. We rewrote it to use the Docker API, making it compatible with alternative runtimes.
- DDEV's generated docker-compose files used the legacy `links` and `external_links` directives. We removed these legacy features and modernized DDEV's compose file generation.

These changes enabled Podman and Docker Rootless support. These features were developed together because Podman's primary use case is rootless operation. Once DDEV could handle rootless runtimes, supporting both became natural. They share the same security model and similar technical constraints.
This Podman and Docker Rootless support was made possible by community financial support. The changes required hundreds of hours of development, code reviews, and testing.
DDEV relies on support from individuals and organizations who use it. With Podman rootless support, DDEV now works in corporate environments where Docker Desktop is not allowed. If you or your organization uses DDEV, please consider sponsoring the project to help keep DDEV free and open source.
DDEV now supports Podman and Docker Rootless as experimental features. This opens DDEV to corporate environments where traditional Docker is not allowed.
DDEV automatically detects your runtime and handles the complexity. Whether you choose Podman for rootless security, Docker Rootless for compatibility, or standard Docker, setup is straightforward.
This article was edited and refined with assistance from Claude Code.
Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, serves millions of people, providing a central platform for global engagement.
Project overview
The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.
The 2025 campaign achieved remarkable global engagement:
These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.
Explore more about the campaign and join the global action at World Cancer Day.
Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.
"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform."
Charles Andrew Revkin, Senior Digital Strategy Manager, Union for International Cancer Control (UICC)
To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.
For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.
The global fight against cancer requires collective action, and Drupal plays an important role in enabling that engagement. By managing large-scale data, complex interactive features, and high-traffic performance, the platform ensures that the campaign can reach millions of people, foster participation, and support socially impactful initiatives year after year.
Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, powers millions of people, providing a central platform for global engagement.
Project overview
|
The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.
The 2025 campaign achieved remarkable global engagement:
These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.
Explore more about the campaign and join the global action at World Cancer Day.
Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.
"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform."
Charles Andrew Revkin Senior Digital Strategy Manager Union for International Cancer Control ( UICC)
To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.
For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.
The global fight against cancer requires collective action, and Drupal plays an important role in enabling that engagement. By managing large-scale data, complex interactive features, and high-traffic performance, the platform ensures that the campaign can reach millions of people, foster participation, and support socially impactful initiatives year after year.
Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, powers millions of people, providing a central platform for global engagement.
Project overview
|
The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.
The 2025 campaign achieved remarkable global engagement:
These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.
Explore more about the campaign and join the global action at World Cancer Day.
Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.
"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform."
Charles Andrew Revkin Senior Digital Strategy Manager Union for International Cancer Control ( UICC)
To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.
For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.
The global fight against cancer requires collective action, and Drupal plays an important role in enabling that engagement. By managing large-scale data, complex interactive features, and high-traffic performance, the platform ensures that the campaign can reach millions of people, foster participation, and support socially impactful initiatives year after year.
Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, powers millions of people, providing a central platform for global engagement.
Project overview
|
The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.
The 2025 campaign achieved remarkable global engagement:
These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.
Explore more about the campaign and join the global action at World Cancer Day.
Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.
"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform."
Charles Andrew Revkin Senior Digital Strategy Manager Union for International Cancer Control ( UICC)
To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.
For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.
The global fight against cancer requires collective action, and Drupal plays an important role in enabling that engagement. By managing large-scale data, complex interactive features, and high-traffic performance, the platform ensures that the campaign can reach millions of people, foster participation, and support socially impactful initiatives year after year.
Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, powers millions of people, providing a central platform for global engagement.
Project overview
|
The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.
The 2025 campaign achieved remarkable global engagement:
These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.
Explore more about the campaign and join the global action at World Cancer Day.
Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.
"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform."
Charles Andrew Revkin Senior Digital Strategy Manager Union for International Cancer Control ( UICC)
To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.
For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.
Today we are talking about Development Workflows, Agentic Agents, and how they work together with guests Andy Giles & Matt Glaman. We'll also cover Drupal Canvas CLI as our module of the week.
For show notes visit: https://www.talkingDrupal.com/538
Topics
Matt Glaman - mglaman.dev mglaman
Hosts
Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Andy Giles - dripyard.com andyg5000
MOTW Correspondent
Martin Anderson-Clutz - mandclu.com mandclu
Five years after the idea first surfaced, Drupal CMS 2.0 has arrived, with a clear focus on the early experience. Released on 28 January 2026, the update introduces real-time page editing via Drupal Canvas, a templating system with sector-specific defaults, and optional AI guidance. It’s not a reinvention of Drupal. It’s a response to what new users most often struggle with: getting started quickly without sacrificing long-term flexibility.
The release is built on Drupal Core 11.3, bringing the platform’s biggest performance gains in over a decade—up to 33% faster request handling. Canvas replaces the standard editing workflow with a drag-and-drop interface, powered by the new Mercury component system. The first template, Byte, is preconfigured for SaaS marketing sites and installs in under three minutes. Optional AI tools support page scaffolding, alt text generation, and guided content modelling, with integration available for amazee.ai, OpenAI, and Anthropic.
On launch day, Dries Buytaert called the release “power without complexity,” noting that it changes the starting point, not the system. Contributed module compatibility is preserved, and features from Drupal CMS 1, like automatic updates and the Gin admin UI, remain intact. For teams evaluating Drupal in 2026, CMS 2.0 sets a clearer baseline: real output, faster, with less overhead.
We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now. For timely updates, follow us on LinkedIn, Twitter, Bluesky, and Facebook. You can also join us on Drupal Slack at #thedroptimes.
Thank you.
Kazima Abbas
Sub-editor
The Drop Times
Drupal’s default workflow doesn’t always meet user needs. When you have to review content before publishing, the Content Moderation module becomes a key tool. It allows you to define and control content workflows. Discover how the Content Moderation module can help you manage content in Drupal.
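As a rough illustration of what the Content Moderation module lets you define, a workflow is simply exported configuration. The sketch below is a minimal, illustrative fragment of a `content_moderation` workflow config entity with two states and one transition; the IDs and labels are examples for this post, not a complete export from a real site:

```yaml
# workflows.workflow.editorial.yml — a minimal sketch, not a full export.
# State and transition names here are illustrative.
id: editorial
label: Editorial
type: content_moderation
type_settings:
  states:
    draft:
      label: Draft
      published: false
      default_revision: false
    published:
      label: Published
      published: true
      default_revision: true
  transitions:
    publish:
      label: Publish
      from:
        - draft
      to: published
```

Once a workflow like this is applied to a content type, editors can only move content between states through the transitions you define, which is what makes review-before-publish enforceable.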
Heading to your very first DrupalCon? Lucky you. There’s nothing quite like that first DrupalCon — the energy, the discoveries, the “wow, I’m really here” feeling. Chicago, “The Windy City,” warmly welcomes you to see which way the wind is blowing in Drupal: latest trends, community initiatives, practical know-how, and hands-on tips.
At DrupalCon Chicago 2026, you’ll connect with fellow Drupal users and builders, swap stories, and finally match faces to names you may have seen online. You’ll meet the contributors behind the features that shape Drupal today, and they’re easy to talk to. Just come to their session or catch them nearby. The Drupal community is made up of people who enjoy sharing what they know and helping move Drupal forward together.
With so much happening at DrupalCon 2026 and an impressive choice of sessions, it can be hard to know where to start, especially on your first visit. The sessions below are hand-picked for first-time attendees and offer a balanced mix of context, inspiration, and practical takeaways.
Want to discover the real gems of Drupal that everyone’s buzzing about? This helps you stay oriented and gives you shared reference points for meaningful conversations, collaboration, and deeper exploration throughout DrupalCon and after it.
First, meet Drupal CMS — a special, ready-to-go version of Drupal built with usability in mind. It’s designed so non‑tech users can jump right in and enjoy smooth, out‑of‑the‑box experiences.
Drupal CMS 1.0 wowed the community with its pre‑built feature sets called Recipes, smart AI tools, easier admin navigation, and friendlier content editing. Now, it’s time for Drupal CMS 2.0 to shine, and you have a chance to hear an insightful session about it.
Guided by top contributors to the project, Pamela Barone (pameeela) and Cristina Chumillas (ckrina), you’ll explore the standout features that make Drupal CMS 2.0 special. Among them certainly is Drupal Canvas — a new-generation page builder. Another feature that will definitely be discussed is the newly-implemented Site Templates that enable you to kickstart pre-configured sites for specific use industries or use cases.
Besides the ready‑to‑use features in Drupal CMS 2.0, you’ll hear about the areas of ongoing work, plans, and ways to contribute. And don’t wait too long to grab your seat — Pamela and Cristina’s sessions are known to pack the room, with people standing just to catch the insights.
AI adoption is yet another topic that will help first-time attendees feel in the loop with the Drupal community. Recent DrupalCons have featured jaw‑dropping demos: AI building page layouts from a prompt, migrating content between sites, generating webforms from a sketch, and much more. In Drupal CMS, AI is baked into the concept itself, with agents, assistants, and automators designed to take on the heavy lifting.
The Drupal AI Initiative was launched in 2025 to organize, coordinate, and strategically guide AI adoption. It is in full swing, so it’s the perfect moment to attend this compelling session by its maintainers. Discover what AI capabilities have become available thanks to the Initiative, what to look forward to, and how to get involved.
And if you’re curious, stick around for the open “Ask Me Anything” segment in this session. James Abrahams (yautja_cetanu), Christoph Breidert (breidert), Dominique De Cooman (domidc), and Paul Johnson (pdjohnson) will be ready to answer your questions and share insights.
We’ve already touched on Drupal Canvas, but it deserves a moment of its own. Canvas is on track to become the primary way page layouts are built across the entire Drupal ecosystem, so knowing how it works is a must.
Built with React, Drupal Canvas brings a visual, component-driven approach to Drupal. Among its features are:
Don’t miss this session by Lauri Timmanee (lauriii), one of the key maintainers and product leads behind Drupal Canvas. This is your chance to discover how it works, explore real demos, and see the exciting features to look forward to.
Open‑source thrives because people show up — and in Drupal, every action counts. A small fix, a quick test, a bit of feedback — these tiny sparks can light up big changes. That’s the magic of contributing: each step adds to something larger, something shared.
In Drupal, those efforts don’t go unnoticed. Credits on drupal.org are one way your work is recognized — but the real reward is the respect and connection you’ll earn from a community that values every contribution.
DrupalCon is the perfect place to start contributing. Think of it as a launchpad — a welcoming space where you can learn, experiment, and make your first mark on Drupal.
Walk into a DrupalCon contribution workshop, and you’ll feel it right away — the buzz of laptops opening, sticky notes being scribbled, and people leaning in to help each other. It’s not just a session, it’s a hive of energy where newcomers and veterans sit side by side to move Drupal forward.
When it comes to discovering contribution opportunities, you might also find it very useful to attend this lively session by two seasoned and famous Drupal contributors. Mike Herchel (mherchel) and Matt Glaman (mglaman) will pull back the curtain on how contributions happen that might eventually become epic in Drupal.
So how does a future contribution start? Maybe you spot a bug, or you feel the urge to improve how something works. From that moment of drive, the journey begins — identifying the issue, pitching your idea to the right people, assembling a team, doing the work, navigating communication hurdles, and finally pushing your contribution across the finish line.
You’ll hear real stories of stubborn bug fixes, ambitious features, and the persistence it takes to get changes into Drupal core, Drupal CMS, or major contributed projects. Expect practical advice, case studies that show the highs and lows, plenty of humor, and the kind of motivation that makes you want to do something epic yourself.
The central keynote of DrupalCon is a can’t‑miss session for everyone. For first‑time attendees, it’s an especially exciting chance to see Drupal’s founder in person and hear his insights.
You’ll get a firsthand look at the features, initiatives, and updates preparing to define Drupal’s next chapter. It’s a moment to see the bigger picture, feel the energy of the community, and glimpse what lies ahead together.
Each year, the Driesnote comes with its own creative theme — from space missions to Drupal villages — always kept secret until the big reveal. Whatever the theme this year, the Driesnote is guaranteed to be a breathtaking performance, delivered by the one speaker who knows how to keep the audience engaged, fascinated, and full of anticipation.
The Driesnote is where DrupalCon truly begins — vision, energy, and surprises from Drupal’s founder. Grab your seat in the big auditorium, right where the whole community will be gathering.
The skill‑sharing spirit of the Drupal community shines brightest when welcoming new talent. Seasoned gurus are happy to help newcomers learn Drupal.
Can you really learn Drupal in a single day? You’ll keep uncovering its powerful site‑building capabilities as your journey continues, but one day can give you a real taste of Drupal — enough to explore its fundamentals and see what makes it one of the world’s leading open‑source CMS platforms.
Drupal in a Day at DrupalCon Chicago 2026 is a free, hands‑on workshop designed for beginners. This includes university or college students, or just anyone who is curious about Drupal and wants to see how it all comes together. No prior experience needed — just bring your laptop and a bit of curiosity.
Guided by experienced Drupalers, you’ll roll up your sleeves to build a site from scratch, pick up practical skills, and leave with a certificate, new connections, and the confidence to dive deeper. Who knows — this could be the first step toward a future Drupal career, where you’ll be the one teaching others or contributing to the next big Drupal feature.
Spots are limited, so register early if you want to join in.
These sessions can help you get your bearings, spark new ideas, and show how the pieces of Drupal fit together today, and where they’re headed next. In addition to the sessions on this list, there is a great variety of others you might enjoy depending on your background. Pick what catches your interest, follow your curiosity, and leave room for a few surprises along the way.
Besides the sessions, it’s a great idea to visit the Expo Hall for informal chats with solution partners and companies using Drupal. Many first‑time attendees find the networking between sessions just as valuable as the sessions themselves.
With its welcoming spirit, DrupalCon has a way of turning first sessions and first conversations into lasting connections. Make your first visit exciting, and let your journey with Drupal be truly epic!
Hackathons have always meant one thing to me: code, caffeine, and controlled chaos. You show up with a rough idea, write as much code as humanly possible in a short time, and hope it holds together long enough for a demo. The Drupal AI Hackathon – Play to Impact: 2026 edition changed that assumption completely.
Lots of people working in technology have a choice between working for clients or working for consultants. We work on one side of the relationship thinking how nice it would be to have the advantages of being on the other; the proverbial grass always seems greener.
I spent a little more than ten years as a client before I became a consultant. I spent just a bit longer as a consultant before becoming a client again. There are things I’ve learned in each role that help me do the other better. To ensure a mutually beneficial engagement it is helpful to understand the perspective of the other team.
The goals of a consultant and a client organization are misaligned. That doesn’t mean you can’t do great things together, but if you don’t understand the goals of your partner you are likely to step on each other’s toes.
Clients look to consultants for one of two primary reasons:
We either need to complete a project that our team does not have the time to tackle, or we need expertise it does not make sense for us to keep on staff. Sometimes we’re looking to reduce costs by having a group of part-time people fill the roles of a smaller number of full-time team members.
I like to have my team’s staffing level sufficient to complete all day-to-day tasks and to bring in outside help to take on special projects. Other people like to have consultants around consistently to provide the outside perspective and the diverse expertise that consultants bring. Both of those strategies are foundationally aimed at those two needs.
A smart client wants to spend the money needed to be successful, but not more. We want the most value for our money we can possibly get.
Consultants have a different pair of primary goals:
Some consultants will protest that their goal is to solve clients’ problems through good work. My perspective is that those are ways to achieve those two goals. Some clients are happy when you do good work (but not all). All clients are paying to have a problem solved (see above).
Profit motive isn’t evil or wrong – even when supporting nonprofits and other socially beneficial institutions (having spent much of my career in nonprofits, we think about this a lot). Consultants need to make money to stay afloat. A consulting firm has people to pay, overhead to manage, and founders/investors to reward. Independent consultants need to eat, pay their mortgage, and so on. The larger the firm, the more pressure there is for larger profit margins.
To get new clients, consultants need “referable” clients. That means having clients who are so happy with the work done that they will serve as a reference. I wish that always meant creating the best solution possible. What it means in practice is building the solution that makes the client happy. As a consultant I gave clients my best advice, and when they disagreed and insisted on a different solution, we built that instead. If they ran out of money along the way we still tried to keep them happy, even if we had to duct-tape the last bits.
In the end, consultants build what clients pay for, and that’s not always the best solution.
With consultants trying to make the most money they can, and clients trying to get a successful solution for the least money, there is an inherent tension in the system. Still, there is a balance to be had, where everyone wins, and great things happen. The trick is to make it a healthy tension that forces everyone to be better. Finding that balance doesn’t require that everyone involved has spent time on the other side of the relationship, but it certainly helps.
When you understand the needs and goals of the other side of the relationship you can adjust your approach to make sure everyone is aligned to win.
One of the things I learned along the way was that a lot of the advice given to new consultants contradicts what I knew from being a client. Spending time as a client gives you insights into how to best serve customers that many pure consultants don’t understand.
When you work at an organization that hires consultants you see the different approaches taken by different firms. You learn your preferences about what you like and don’t like in a consulting partner. While no one style is the best fit for everyone, it’s unlikely that you are so unique that there aren’t lots of other people who like that same style.
Default to the Golden Rule: treat clients the way you wanted to be treated by consultants.
You can’t always do that 100% of the way – sure, as a client I want everything free, but that’s not reasonable. But consistently approaching clients the way I would have wanted to be treated went a long way toward smoothing over challenges.
Start there, and over time you’ll learn to adapt your approach when specific clients prefer a different style.
Do. Not. Lie. To. Me.
Do not guess without admitting it. If I wanted made up answers, I’d ask an AI.
Consultants always want to appear to be the expert in the room, and so they feel they have to answer every question. Too often that leads to consultants making up answers to show how smart they are; clients will catch you eventually.
One of the best ways to build trust with a client is to admit when you don’t know the answer to a question, and then come back later with the answer. Do not say “I don’t know” and leave it there; go for some form of “I will need to go look that up/ask around/figure that out.”
Great consultants find solutions; they don’t always have the answer right away. We can wait for you to do some research when we stump you. That is a lot easier to explain than having to walk back a wrong answer.
Clients should always have an outcome in mind that supports their work. Consultants are focused on the solution they are building. When everything is going well, that solution is what the client needs to support their work. If those stop being the same thing you have a very big problem.
Both clients and consultants can easily forget to consistently re-check that alignment. As a client and as a consultant I’ve been part of projects where the delivered solution didn’t solve the actual problem – even when it fulfilled the spec and SOW. These moments frequently lead to energetic discussions that often become loud. No one wins when that happens.
Regularly check with the client, and with yourself, to see if the solution will solve the client’s problem. When you see misalignment raise your hand early and often.
Of course consultants know and learn stuff that isn’t obvious to any given client. Consultants bring wider experiences, different perspectives, and a different energy to a project. That is part of what makes them valuable. Clients should hire a consultant they trust, and listen to their consultant. Think hard before deciding you know better.
As clients we tend to learn deeply about the tools we use and our work. Consultants work on a lot of projects with a lot of clients. Along the way they use a lot of tools and see a lot of ideas. That creates a culture of, and a need for, constant learning. Often they are learning about things that don’t seem useful right away.
The higher the role you have as a consultant, the more you are expected to be at least conversant about technology you haven’t used yet. You also need to be conversant about the work of your clients. That’s a lot of learning.
I had good learning habits going into being a consultant. They served me extremely well as a consultant, and are serving me well again as a client.
The broad knowledge of a consultant is extremely useful and everyone benefits from more people knowing more stuff. Having that breadth of knowledge also helps when you do run into the places where you don’t know something. It gives you the confidence that you can go learn the next thing you need to know quickly (see Be honest about your limitations above).
Consultants are always working within time and budget constraints – usually tight ones. That forces them to learn to be efficient. Sometimes that means they cut corners (see next section); usually it just means they move fast. Good consultants have a high degree of dexterity with their tools, they learn to line up their work to knock out tasks, and they learn what’s needed and what’s just nice to have.
New consultants often feel like they are sprinting all the time, but experienced consultants learn to balance the sprints with jogging. The pace is nearly always high (at least if sales are going well), but it still ebbs and flows. Consultants learn to hit their deadlines, but rarely are ready to deliver early.
As a consultant, a deadline far in the future gave me time to do careful work, balance other clients, do research, or just take time off. Far-off deadlines gave me time to recover from sprints and make sure I had the energy for high-intensity moments. That intensity is important to driving client success – but hitting the deadline is more important.
Hitting deadlines is important for clients too. Consultants need you to hit your deadlines so they can balance their workload to hit theirs. They may also have penalties embedded in the contract (see Read the Contract below) that could cost you time or money over the course of your project.
Okay, this isn’t something just consultants know, but it is something consultants often learn to deal with the hard way.
Consultants need a solution that meets the requirements, fits in the budget, and pleases the client. They are not there to create a solution that is perfect, or even elegant. In any project there is a balance to be had between carefully polished and just barely good enough to be successful. Consultants learn to thread that needle. As long as the project is successful, that’s a good thing.
I have seen developers spend hours, days, even months, trying to build to the perfect level of abstraction, with the perfect naming conventions, and drive for the perfect code, only to have the project fail because it’s overdue, over budget, and was outmoded by someone who worked twice as fast.
Yes, we all want good solutions to our technical problems. But no solution is going to be perfect. You should aim for perfection and know you are going to miss. When you learn to accept that, it’ll be easier to move forward and be successful.
For all that each side brings something different to the table, there are habits that everyone should have as part of their role. There are lessons I learned, or was taught, in both roles that are super important.
Everyone on a project benefits from having working knowledge of the contract. In the end, when push comes to shove, all that matters is the words on the paper. You can usually avoid the pushing and shoving by understanding what everyone agreed to up front.
The biggest issues I’ve seen on consulting projects were when one side or the other didn’t pay attention to the agreement.
Sometimes this happens because everyone is working in good faith, and no one remembers to amend the agreement when needs changed. In those cases you can often recover by continuing to work with each other in good faith.
Sometimes this happens when someone signed a contract they didn’t read and understand. I once had a client yell at me because I added a paragraph to the contract outlining the resources they were responsible for providing and he didn’t read it before we asked him for those resources (these clauses are really standard, and the one I wrote was extremely simple).
If everyone on the team takes the time to read and understand the contract it greatly reduces friction. Clients who understand the bounds and assumptions in a contract are able to get the most from their vendor without creating tension. Consultants who track the required deliverables of the contract don’t frustrate clients by skipping required elements. It doesn’t take long, and the more contracts you read the faster you’ll be at reading the next one.
Once you have read a bunch of contracts you’ll know what’s normal and what’s not. At this point, if I don’t understand the contract language I see that as a red flag even before I send it out for legal review.
Projects go best when everyone is open about what problems exist and then pivots to solving them.
Technology should be deployed to solve problems. That means starting by talking about problems. Being problem-focused at the start makes it easy to get hung up talking only about those problems, or about new problems that come up while solving the first one.
Having a good problem statement is critical to creating good solutions. But once you have the problem outlined you need to focus on solving it. Yes, raise problems, concerns, challenges, threats, weaknesses, etc. Talk openly about all those things. Then make the pivot into problem solving mode once the issue is well understood.
The best projects come together when everyone collaborates on finding the best solutions to the problems at hand.
Everyone needs to focus on the quality of the outcome. Consultants, for all their fast moving creation of imperfect solutions, must still do good work. Clients should hold their vendors, and themselves, to high standards.
Every message that goes back and forth is a chance for a misunderstanding that gets in the way. Every input into discovery and every deliverable is a chance for gaps to form. If anyone takes their eye off the ball, mistakes can happen and the solution no longer threads the quality needle correctly.
Mistakes will happen, and everyone will have to help course correct. But the higher the quality of the work done before the mistake, the faster the recovery and the better the overall solution the client will get.
One final note on the way out. If you are trying to decide between being a consultant or being a client, I recommend the switch – whichever you are today, try being the other if you haven’t yet. Not everyone loves both roles, and different roles have been right for me at different times.
As a client I loved what I did. We were helping make the world better. I was pushing things forward and helping the organizations succeed. But eventually the things they needed me to learn, and the pace I wanted to grow, weren’t aligned to the organization’s needs.
I’d been there a decade, I left on great terms, but it was time to go.
When I first became a consultant it was exciting. I got to work on a variety of projects, with more technologies than any one organization generally needs. The pace was higher and I was frequently pushing myself in new directions. Consulting gave me insights into how different organizations worked (for both better and worse). And I made more money.
Interesting work, exciting environment, more money, great!
As a consultant I spent less time in each position, the billable grind was exhausting, and I missed being focused. When I returned to the client side, I got to focus again. I have one org to worry about, one set of organizational politics to understand, and so on. I get to learn the work of the organization deeply again and really understand the market we serve. In my case I, again, got more money – but that was as much luck as anything; consultants are often paid better than in-house team members.
Focused work, no billable hours target, calmer work environment, great!
Each really does have its advantages. But so does understanding what it’s like to be the person on the other side of the relationship. Try them both, learn from both, and decide what’s the best fit for you.
We're still being clobbered by the migration of projects from GitHub to Drupal.org, making work a lot slower as we try to work and keep track of issues/tasks in two places.
On the 27th of January 70+ developers, designers, UX, project leads joined forces in nine teams to attend the European Commission hackathon called Play to impact at The One building in the heart of the European Commission's executive arm in Brussels.
Article by Marcus Johansson.
The two tasks for the teams were clear - build something that helps the content editor using AI or build something that helps reimagine how websites are created in Canvas.
While the tasks were mainly around the development of new features and modules, other criteria were scored as well, including a final PowerPoint presentation in front of everyone. This meant a multidisciplinary team was needed to have a chance to win.
Another criterion was that you had to use Mistral AI for your solution. Mistral, a powerhouse of European AI innovation in large language models, was a sponsor of the event. Mistral is one of the key companies behind digitally sovereign AI solutions in Europe.
They helped make sure that all the teams had enough credits to develop and show off their impressive solutions using likewise impressive models, and they also provided on-site support and served on the jury when selecting the winners.
amazee.ai and DrupalForge/Devpanel were also sponsoring the event, making sure that the provider setup was smooth for the teams and that the teams were given platforms where they could deploy their solutions for the jury to test.
The teams hard at work
The event was the second time the Commission had held a hackathon specifically around Drupal and AI, and this time it was a two-day event, meaning people had much more time to prepare, plan, code, and present their solutions.
This time there were also prep events where you could ask actual stakeholders, like editors of the platforms, what main problems they were facing.
As one of the core maintainers of the AI module, seeing the number of people using something you helped create was a feeling of pride, joy, and satisfaction. And as someone who was on site to help technically for the second year running, two things stood out to me:
Group photo of most of the participants and organizers. Photo credit: Antonio De Marco.
On the second day all the teams had to stop at the deadline of 14:40 and have their presentation ready, code committed and Drupal instances set up.
After that started the presentation round, where each of the teams had exactly five minutes to present their solutions to the jury and answer questions from the jury. The jury consisted of people from the European Commission, one person representing Mistral, Tim Lehnen from the Drupal Association and Jamie Abrahams from the AI Initiative.
Bram ten Hove and Ronald te Brake presenting their ACE! Solution.
The winner in the end was team #4, aptly named Token Burners, who ended up building a solution that spawned not just one actual contributed module, but two! They also had a very impressive presentation.
We now have FlowDrop Agents, which puts the AI Agents we have had in Drupal into the awesome workflow management system FlowDrop, and also FlowDrop Node Sessions, which allows workflows to be initialized via a Drupal entity.
The winning team Token Burners and the hackathon jury.
From my point of view the hackathon was a huge success - the energy in the room, the collaboration, the brainstorming was just impressive.
A huge thanks to the organizers Sabina La Felice, Monika Vladimirova, Raquel Fialho, Antonio De Marco, and Rosa Ordinana-Calabuig, and to the European Commission in general for such a great event!
On the 27th of January 70+ developers, designers, UX, project leads joined forces in nine teams to attend the European Commission hackathon called Play to impact at The One building in the heart of the European Commission's executive arm in Brussels.
Article by Marcus Johansson.
The two tasks for the teams were clear: build something that helps content editors using AI, or build something that reimagines how websites are created in Canvas.
While the tasks were mainly around developing new features and modules, other criteria were scored as well, including a final PowerPoint presentation in front of everyone. This meant that a multidisciplinary team was needed to have a chance to win.
Another criterion was that you had to use Mistral AI for your solution. Mistral, the powerhouse of European AI innovation in large language models, was a sponsor of the event and is one of the key companies behind digitally sovereign AI solutions in Europe.
They helped make sure that all the teams had enough credits to develop and show off their impressive solutions using likewise impressive models, and they also provided on-site support and served on the jury when selecting the winners.
amazee.ai and DrupalForge/Devpanel were also sponsoring the event, making sure that the provider setup was smooth for the teams and that they were given platforms where they could deploy their solutions for the jury to test.
The teams hard at work.
The event was the second time the Commission had held a hackathon specifically around Drupal and AI, and this time it was a two-day event, meaning people had much more time to prepare, plan, code, and present their solutions.
This time there were also prep events where you could ask actual stakeholders, like editors of the platforms, about the main problems they were facing.
As one of the core maintainers of the AI module, seeing the number of people using something you helped create was a feeling of pride, joy, and satisfaction. And as someone who was on site to help technically for the second year running, two things stood out to me:
Group photo of most of the participants and organizers. Photo credit: Antonio De Marco.
On the second day, all the teams had to stop at the 14:40 deadline and have their presentations ready, code committed, and Drupal instances set up.
After that came the presentation round, where each team had exactly five minutes to present its solution and answer questions from the jury. The jury consisted of people from the European Commission, a representative of Mistral, Tim Lehnen from the Drupal Association, and Jamie Abrahams from the AI Initiative.
Bram ten Hove and Ronald te Brake presenting their ACE! solution.
The winner in the end was team #4, aptly named Token Burners, whose solution ended up spawning not just one actual contributed module, but two! They also gave a very impressive presentation.
We now have FlowDrop Agents, which brings the AI Agents we have had in Drupal into the awesome workflow management system FlowDrop, and FlowDrop Node Sessions, which allows workflows to be initialized via a Drupal entity.
The winning team Token Burners and the hackathon jury.
From my point of view the hackathon was a huge success: the energy in the room, the collaboration, and the brainstorming were just impressive.
A huge thanks to the organizers Sabina La Felice, Monika Vladimirova, Raquel Fialho, Antonio De Marco and Rosa Ordinana-Calabuig, and to the European Commission in general, for such a great event!
AI document processing is transforming content management in Drupal. Through integration with AI Automators, Unstructured.io, and GPT models, editorial teams can automate tedious tasks like metadata extraction, taxonomy matching, and summary generation. This case study reveals how BetterRegulation implemented AI document processing in their Drupal 11 platform, achieving 95%+ accuracy and 50% editorial time savings.
AI makes it cheaper to contribute to Open Source, but it's not making life easier for maintainers. More contributions are flowing in, but the burden of evaluating them still falls on the same small group of people. That asymmetric pressure risks breaking maintainers.
Daniel Stenberg, who maintains curl, just ended the curl project's bug bounty program. The program had worked well for years. But in 2025, fewer than one in twenty submissions turned out to be real bugs.
In a post called "Death by a thousand slops", Stenberg described the toll on curl's seven-person security team: each report engaged three to four people, sometimes for hours, only to find nothing real. He wrote about the "emotional toll" of "mind-numbing stupidities".
Stenberg's response was pragmatic. He didn't ban AI. He ended the bug bounty. That alone removed most of the incentive to flood the project with low-quality reports.
Drupal doesn't have a bug bounty, but it still has incentives: contribution credit, reputation, and visibility all matter. Those incentives can attract low-quality contributions too, and the cost of sorting them out often lands on maintainers.
We've seen some AI slop in Drupal, though not at the scale curl experienced. But our maintainers are stretched thin, and they see what is happening to other projects.
Some have deep concerns about AI itself: its environmental cost, its impact on their craft, and the unresolved legal and ethical questions around how it was trained. Others worry about security vulnerabilities slipping through. And for some, it's simply demoralizing to watch something they built with care become a target for high-volume, low-quality contributions.
These concerns are legitimate, and they deserve to be heard. Some of them, like AI's environmental cost or its relationship to Open Web values, also deserve deeper discussion than I can give them here.
That tension shows up in conversations about AI in Drupal Core. People hesitate around AGENTS.md files and adaptable modules because they worry about inviting more contributions without adding more capacity to evaluate them.
This is the AI-induced asymmetric pressure showing up in our community. I understand the hesitation. Some feel they've already seen enough low-quality AI contributions to know where this leads. When we get this wrong, maintainers are the ones who pay. They've earned the right to be skeptical.
I feel caught between two truths.
On one side, maintainers hold everything together. If they burn out or leave, Drupal is in serious trouble. We can't ask them to absorb more work without first creating relief.
On the other side, the people who depend on Drupal are watching other platforms accelerate. If we move too slowly, they'll look elsewhere.
Both are true. Protecting maintainers and accelerating innovation shouldn't be opposites, but right now they feel that way. As Drupal's project lead, my job is to help us find a path that honors both.
I should be honest about where I stand. I've been writing software with AI tools for over a year now. I've had real successes. I've also watched some of the most experienced Drupal contributors become dramatically more productive with AI, doing things they could not have done without it. That perspective comes from direct experience, not hype.
But having a perspective is not the same as having all the answers. And leadership doesn't mean dragging people where they don't want to go. It means pointing a direction with care, staying open to evidence, and never abandoning the people who hold the project together.
New technology has a way of lowering barriers, and lower barriers always come with tradeoffs. I saw this early in my career. I was writing low-level C for embedded systems by day, and after work I'd come home and work on websites with Drupal and PHP. It was thrilling, and a stark contrast to my day job. You could build in an evening what took days in C.
I remember that excitement. The early web coming alive. I hadn't felt the same excitement in 25 years, until AI.
PHP brought in hobbyists and self-taught developers, people learning as they went. Many of them built careers here. But it also meant that a lot of early PHP code had serious security problems. The language got blamed, and many experts dismissed it entirely. Some still do.
The answer wasn't rejecting PHP for enabling low-quality code. The answer was frameworks, better security practices, and shared standards.
AI is a different technology, but I see the same patterns. It lowers barriers and will bring in new contributors who aren't experts yet. And like scripting languages, AI is here to stay. The question isn't whether AI is coming to Open Source. It's how we make it work.
The curl story doesn't end there. In October 2025, a researcher named Joshua Rogers used AI-powered code analysis tools to submit hundreds of potential issues. Stenberg was "amazed by the quality and insights". He and a fellow maintainer merged about 50 fixes from the initial batch alone.
Earlier this week, a security startup called AISLE announced they had used AI to find 12 zero-days in the latest OpenSSL security release. OpenSSL is one of the most scrutinized codebases on the planet. It encrypts most of the internet. Some of the bugs AISLE found had been hiding for over 25 years. They also reported over 30 valid security issues to curl.
The difference between this and the slop flooding Stenberg's inbox wasn't the use of AI. It was expertise and intent. Rogers and AISLE used AI to amplify deep knowledge. The low-quality reports used AI to replace expertise that wasn't there, chasing volume instead of insight.
AI created new burden for maintainers. But used well, it may also be part of the relief.
I reached out to Daniel Stenberg this week to compare notes. He's navigating the same tensions inside the curl project, with maintainers who are skeptical, if not outright negative, toward AI.
His approach is simple. Rather than pushing tools on his team, he tests them on himself. He uses AI review tools on his own pull requests to understand their strengths and limits, and to show where they actually help. The goal is to find useful applications without forcing anyone else to adopt them.
The curl team does use AI-powered analyzers today because, as Stenberg puts it, "they have proven to find things no other analyzers do". The tools earned their place.
That is a model I'd like us to try in Drupal. Experiments should stay with willing contributors, and the burden of proof should remain with the experimenters. Nothing should become a new expectation for maintainers until it has demonstrated real, repeatable value.
That does not mean we should wait. If we want evidence instead of opinions, we have to create it. Contributors should experiment on their own work first. When something helps, show it. When something doesn't, share that too. We need honest results, not just positive ones. Maintainers don't have to adopt anything, but when someone shows up with real results, it's worth a look.
Not all low-quality contributions come from bad faith. Many contributors are learning, experimenting, and trying to help. They want what is best for Drupal. A welcoming environment means building the guidelines and culture to help them succeed, with or without AI, not making them afraid to try.
I believe AI tools are part of how we create relief. I also know that is a hard sell to someone already stretched thin, or dealing with AI slop, or wrestling with what AI means for their craft. The people we most want to help are often the most skeptical, and they have good reason to be.
I'm going to do my part. I'll seek out contributors who are experimenting with AI tools and share what they're learning, what works, what doesn't, and what surprises them. I'll try some of these tools myself before asking anyone else to. And I'll keep writing about what I find, including the failures.
If you're experimenting with AI tools, I'd love to hear about it. I've opened an issue on Drupal.org to collect real-world experiences from contributors. Share what you're learning in the issue, or write about it on your own blog and link it there. I'll report back on what we learn on my blog or at DrupalCon.
This isn't just Drupal's challenge. Every large Open Source project is navigating the same tension between enthusiasm for AI and real concern about its impact.
But wherever this goes, one principle should guide us: protect your maintainers. They're a rare asset, hard to replace and easy to lose. Any path forward that burns them out isn't a path forward at all.
I believe Drupal will be stronger with AI tools, not weaker. I believe we can reduce maintainer burden rather than add to it. But getting there will take experimentation, honest results, and collaboration. That is the direction I want to point us in. Let's keep an open mind and let evidence and adoption speak for themselves.
Thanks to phenaproxima, Tim Lehnen, Gábor Hojtsy, Scott Falconer, Théodore Biadala, Jürgen Haas and Alex Bronstein for reviewing my draft.
January 28, 2026 – Today marks one of the biggest evolutions in Drupal's 25-year history.
Drupal CMS 2.0 launches with Drupal Canvas, AI-powered tools, and introduces a component system along with the first site template that enables marketing teams to launch fully branded, professional websites in days instead of weeks. Built on Drupal core, it maintains the enterprise-grade security, scalability, and flexibility Drupal is known for.
Try it now → drupal.org/drupal-cms
Drupal CMS 2.0 is built on top of Drupal Core 11.3, which brings the biggest performance improvement in a decade, allowing you to serve 26-33% more requests with the same setup.
We are introducing Drupal Canvas as the default editing experience. Drag components onto pages with live preview and real-time editing. No more switching between admin forms and preview windows for your landing pages – build directly on the page. No Drupal knowledge required to get started.
The custom-built Mercury component library provides common building blocks like cards, testimonials, heroes, menus, and accordions.
We are introducing site templates that provide feature-complete starting points for specific use cases. Byte is the first template included with Drupal CMS 2.0. It is preconfigured as a marketing site for a SaaS product, with a blog, newsletter signup, pricing pages, and a contact form, all in an elegant dark design. All built with Canvas. Installs in under 3 minutes.
Recipe-based integrations automate complex configurations:
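For readers curious what a recipe looks like under the hood, here is a minimal sketch of a `recipe.yml` file, the format Drupal core uses to declare modules to install and configuration to import or adjust. The module names, recipe names, and config values below are illustrative assumptions, not taken from Drupal CMS 2.0:

```yaml
# Hypothetical recipe.yml sketch; all names below are illustrative only.
name: 'Example marketing blog'
description: 'Sets up a simple blog section.'
type: 'Site'
# Other recipes this one builds on.
recipes:
  - core/recipes/article_content_type
# Modules to enable when the recipe is applied.
install:
  - pathauto
  - metatag
config:
  # Import all configuration shipped by this module.
  import:
    pathauto: '*'
  # Adjust existing configuration after import.
  actions:
    system.site:
      simpleConfigUpdate:
        page.front: /blog
```

A recipe like this can be applied from the site root with core's recipe runner, e.g. `php core/scripts/drupal recipe path/to/recipe`, after which the declared modules are enabled and the configuration changes take effect.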
AI tools (optional):
Plus all of these proven goodies from Drupal CMS 1 (January 2025):
Drupal CMS 2.0 would not have been possible without the innovations in Drupal core and the visual tools and components built specifically for this release. Thanks to the hundreds of contributors across dozens of organizations. Special thanks to the AI initiative partners, and everyone who tested, filed issues, and pushed boundaries outward.
This is community-driven development at scale.
Try it now: drupal.org/drupal-cms/trial
Download: drupal.org/download
Learn more: drupal.org/drupal-cms
Twenty-five years in. Still building.
Drupal CMS builds on Drupal Core with full ecosystem compatibility, adding visual building tools, AI assistance, and industry-specific templates. Learn more →