In this episode of Semaphore Uncut, we chat with Brian Douglas: former GitHub developer advocate, founder of OpenSauced, and now part of the Linux Foundation's CNCF ecosystem team.
Brian shares his journey from helping developers contribute to open source more effectively to leading a startup acquired by the Linux Foundation, along with his latest focus: how AI tooling and open source infrastructure are reshaping developer workflows and, ultimately, the future of software delivery.
From GitHub to OpenSauced
Brian began his career as a software engineer and joined GitHub's developer relations team, helping shape the early community of maintainers and contributors.
While there, he noticed a problem: developers contributing to open source lacked a clear way to track their pull requests across projects.
"I wanted a CRM for pull requests: a single place to see where my contributions were, what stage they were in, and what was missing."
That idea became OpenSauced, a platform offering developer insights into open source projects. After several years of building and serving enterprise clients, OpenSauced was acquired by the Linux Foundation and integrated into the LFX platform, which helps organizations manage and grow open source ecosystems.
Joining the Linux Foundation and CNCF
Now, at the Cloud Native Computing Foundation (CNCF), Brian works on ecosystem and user engagement, connecting enterprises like Apple, Capital One, and J.P. Morgan with open source cloud technologies.
"Our job is to get enterprises not just to consume open source, but to contribute: through code, funding, and sharing case studies."
For leaders managing modern infrastructure, this is the center of gravity: Kubernetes, ArgoCD, and the rest of the CNCF stack underpin the world's software delivery systems.
AI Tooling Comes to Infrastructure
Brian sees AI as the next transformative layer on top of open source infrastructure, similar to how CI/CD reshaped delivery pipelines a decade ago.
"Kubernetes is already the substrate for the cloud. The next step is making it the substrate for AI."
New tools, from K8sGPT (AI-assisted debugging for clusters) to RunAI and SF Compute (GPU scheduling and distributed compute), are extending DevOps patterns into AI infrastructure.
That means engineers will soon deploy AI clusters across regions, manage GPU resources as code, and integrate model evaluation directly into CI/CD pipelines.
"We're moving from calling APIs to running our own small, fine-tuned models. That's where cost control and performance live."
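To make "GPU resources as code" concrete, here is a minimal sketch using the official Kubernetes Python client to declare a pod that requests a GPU. It assumes a cluster with the NVIDIA device plugin installed; the pod name, image, and `ml` namespace are illustrative, not prescribed by Brian.

```python
# Declaring a GPU workload as code with the Kubernetes Python client.
# Assumes the NVIDIA device plugin exposes "nvidia.com/gpu" on cluster nodes.
from kubernetes import client, config

config.load_kube_config()  # use local kubeconfig; in-cluster config also works

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="inference-worker", labels={"app": "llm"}),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="model-server",
                image="ghcr.io/example/model-server:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    # GPUs are scheduled like any other resource: declare the
                    # limit, and the scheduler places the pod on a GPU node.
                    limits={"nvidia.com/gpu": "1", "memory": "16Gi"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="ml", body=pod)
```

Because the pod spec is just code, it can be reviewed, versioned, and rolled out through the same CI/CD pipeline as any other infrastructure change.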
AI Engineers and the New Workflow
Brian describes a growing divide, and eventual convergence, between traditional ML practitioners and the new generation of AI engineers:
- ML engineers: deep in data science, training, and evaluation.
- AI engineers: focused on shipping features quickly with APIs and frameworks.
Over time, these roles will merge, as infrastructure becomes the bridge between experimentation and production.
"Think of it like DevOps for AI: teams deploying and monitoring models, not just writing prompts."
From Copilot to Autopilot
Brian compares todayâs developers to airline pilots:
"Pilots don't do barrel rolls; they supervise automation. Engineers will soon do the same."
He cautions against what he calls "vibe coding": blindly accepting AI-generated code without understanding it.
The future developerâs skill lies in:
- Designing guardrails for AI assistants (rules, style guides, system prompts).
- Running AI evals in CI to prevent regressions (see the sketch after this list).
- Understanding why a fix works, not just letting the model "patch and go."
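Running AI evals in CI can be as simple as a pytest suite that pins down expected model behavior. In this minimal sketch, `generate()` is a stub standing in for whatever model call or prompt chain a team actually ships, and the cases and assertions are illustrative.

```python
# A minimal AI eval suite run in CI alongside ordinary unit tests.
import pytest

def generate(prompt: str) -> str:
    """Stub standing in for the production model call or prompt chain under test."""
    return "To recover safely, run a rollback of the last release and watch the dashboard."

# Golden cases: a prompt plus substrings the answer must (or must not) contain.
EVAL_CASES = [
    {"prompt": "How do I recover from a bad deploy?", "must_include": ["rollback"]},
    {"prompt": "Summarize our on-call escalation policy.", "must_exclude": ["i don't know"]},
]

@pytest.mark.parametrize("case", EVAL_CASES)
def test_model_behavior(case):
    answer = generate(case["prompt"]).lower()
    for needle in case.get("must_include", []):
        assert needle in answer, f"missing expected content: {needle!r}"
    for needle in case.get("must_exclude", []):
        assert needle not in answer, f"regression: unwanted content {needle!r}"
```

Run in the same pipeline as unit tests, a failing eval blocks the merge just like any other regression.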
These new workflows will raise the engineering floor: smaller teams, fewer repetitive tasks, more focus on architecture, testing, and governance.
Open Source, Context, and "Policy-as-Prompt"
One of Brian's recurring themes is context. Whether in open source or enterprise teams, knowledge needs to be encoded into the tools developers use.
He's experimenting with policy-as-prompt: embedding organizational rules, API versions, and style conventions into the assistant's context window.
"Instead of Post-it notes next to your merge button, you build the rules into your AI copilot."
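As a rough sketch of what policy-as-prompt can look like in practice (the file names and message format are assumptions, not a specific product's API): organizational rules live in version-controlled docs and get folded into the assistant's system prompt on every request.

```python
# Policy-as-prompt: build the assistant's system prompt from versioned policy docs.
from pathlib import Path

# Version-controlled policy documents; paths are illustrative.
POLICY_FILES = ["docs/api-versions.md", "docs/style-guide.md", "docs/security-rules.md"]

def build_system_prompt() -> str:
    """Fold team policies into a single system prompt for the coding assistant."""
    sections = []
    for name in POLICY_FILES:
        path = Path(name)
        if path.exists():
            sections.append(f"## {path.name}\n{path.read_text()}")
    return (
        "You are this team's coding assistant. Follow these policies exactly:\n\n"
        + "\n\n".join(sections)
    )

# Every request to the assistant carries the encoded policy as context
# (generic chat-message shape shown here).
messages = [
    {"role": "system", "content": build_system_prompt()},
    {"role": "user", "content": "Add a new endpoint to the billing service."},
]
```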
Paired with retrieval-augmented generation (RAG), this approach lets teams query their entire history of projects and incidents:
"Have we solved this before? Who fixed it? Which version broke it?"
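A back-of-the-envelope sketch of that kind of retrieval, using TF-IDF as a stand-in for a real embedding index; the history snippets and the downstream model call are illustrative.

```python
# Naive retrieval over project history: rank past notes against a question,
# then pass the best matches to the model as context.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative project history: incident notes, postmortems, PR descriptions.
HISTORY = [
    "Payment API timeouts were fixed by raising the connection pool size.",
    "Pinned the Kubernetes client after a breaking change in a minor release.",
    "An ArgoCD sync loop was caused by a duplicated label selector.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k history entries most similar to the question."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(HISTORY + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [HISTORY[i] for i in scores.argsort()[::-1][:k]]

# The retrieved context is then prepended to the model prompt, e.g.
# "Given these past fixes: ..., answer: Have we solved this before?"
context = retrieve("Have we hit a breaking Kubernetes client change before?")
```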
The Human Side of Open Source + AI
Despite the automation narrative, Brian believes human creativity and curiosity will remain central to engineering culture.
"There'll always be people who love building by hand, just like vinyl records never went away."
The opportunity for leaders is to build teams that supervise automation effectively: crafting standards, encoding institutional knowledge, and using open source as a multiplier for innovation.
Looking Ahead
For Brian, the next decade of engineering looks less like writing code and more like orchestrating systems that write, test, and deploy code, all running on open infrastructure.
"Today you're a cloud engineer. Tomorrow you're an AI-enabled cloud engineer. That's where we're headed."
As AI merges with CI/CD, the most successful teams will be those who:
- Treat AI evals like tests and prompts like code.
- Run AI workflows on Kubernetes for scale and reliability.
- Keep humans in the loop for context, ethics, and judgment.
Follow Brian Douglas
LinkedIn
GitHub: bdougie
Website
