Tools shape culture.
Writing changed memory. Spreadsheets changed business reasoning. Smartphones changed attention. Collaboration software changed the tempo of work.
AI is changing something more specific: the speed at which people can shape new tools around their own work.
That does not make AI a mystical co-creator. It makes it a powerful toolmaking surface.
People can describe a workflow, generate a prototype, draft a template, build an analysis helper, or create a process that would previously have required more technical support. That is a real cultural shift.
It also raises a practical question: when toolmaking gets easier, how do we keep the tools responsible?
Toolmaking used to require a specialist
For most of software history, a workflow had to be translated through a specialist.
The operator knew the pain. The founder knew the customer. The engineer knew how to build the tool. Good products emerged when those perspectives met well.
AI compresses parts of that translation.
A non-engineer can now sketch a workflow in natural language. A founder can generate a prototype. An operator can create a checklist or reporting tool. An engineer can move faster through boilerplate and edge-case exploration.
The bottleneck shifts from "Can this be built?" toward "Is this the right thing to build, and can it be trusted?"
The danger of effortless tools
Cheap creation brings a quality problem.
When every team can generate its own helper, dashboard, template, or automation, the world gets more tools. Not necessarily better tools.
The risk is tool sprawl:
- duplicated workflows,
- inconsistent logic,
- unclear ownership,
- hidden assumptions,
- fragile automations,
- outputs nobody knows how to verify.
AI does not remove product discipline. It makes product discipline more important.
Toolmaking needs shared context
The most useful tools are built close to the work.
They understand the customer, the decision history, the constraints, and the people who will use them. This is why shared context matters. If AI helps create a workflow in a private tab, the team may never see the assumptions behind it.
In a shared workspace, the toolmaking process can be visible:
- what problem the tool is meant to solve,
- what source context shaped it,
- who reviewed it,
- what it is allowed to do,
- what output becomes durable.
That turns AI-assisted toolmaking from a private shortcut into a team capability.
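One lightweight way to make that visibility concrete is a small manifest attached to each AI-assisted tool. This is a hypothetical sketch, not a prescription — the `ToolManifest` class and its field names are illustrative assumptions, not any particular product's schema:

```python
from dataclasses import dataclass


@dataclass
class ToolManifest:
    """Hypothetical record of the shared context behind an AI-assisted tool."""

    problem: str                 # what problem the tool is meant to solve
    source_context: list[str]    # what source context shaped it
    reviewed_by: str             # who reviewed it
    allowed_actions: list[str]   # what it is allowed to do
    durable_outputs: list[str]   # what output becomes durable

    def can(self, action: str) -> bool:
        # Bounded action: anything not explicitly listed is refused.
        return action in self.allowed_actions


# Example: a team-built ticket summarizer with a visible, reviewed scope.
manifest = ToolManifest(
    problem="Summarize weekly support tickets for the ops channel",
    source_context=["support-runbook", "ticket-taxonomy-notes"],
    reviewed_by="ops-lead",
    allowed_actions=["read_tickets", "post_summary"],
    durable_outputs=["weekly-summary-doc"],
)

print(manifest.can("post_summary"))   # within the reviewed scope
print(manifest.can("close_tickets"))  # outside it, refused by default
```

The point is not this particular schema; it is that the five questions above become answerable fields rather than assumptions hidden in a private tab.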
The new skill is judgment
As AI lowers the cost of building, judgment becomes more valuable.
Can you define the problem clearly? Can you tell a useful prototype from a polished distraction? Can you see where a workflow creates risk? Can you decide what should be automated and what should remain human-owned?
These are the skills that matter in a toolmaking culture.
The best teams will not be the ones that generate the most tools. They will be the ones that create the few tools that fit the work, respect the context, and improve the outcome.
What shared workspaces should enable
The opportunity is not simply to give people AI inside chat.
It is to help people and AI agents shape work together in shared context.
That includes tool-like outputs: briefs, checklists, templates, plans, follow-up paths, and integrations. But those outputs need visible ownership and bounded action.
The product should make it easier to build useful tools without creating hidden automation debt.
That is the cultural shift I care about.
Not AI replacing human toolmakers.
People becoming better toolmakers because AI helps them work from clearer shared context.

