Most studio sites are brochures. They list services, name clients, post a few case studies, and hand the visitor a contact form. The good ones do this beautifully. The bad ones do it badly. But the structure is the same — and the structure, increasingly, is the problem.
For most of the agencies a buyer will look at this week, the website is the same shape: a hero promising digital experiences, a strip of services, a row of client logos, three or four hand-picked case studies, an awards bar, a footer. There’s nothing wrong with any of those pieces. They’re just the same pieces. They were the same pieces in 2014. They will be the same pieces in 2030.
The brochure model assumes a buyer who is already convinced and just needs reassurance. It works when the buyer arrived through a referral or a portfolio they already trusted, and the site’s job is to confirm rather than persuade. For a senior buyer doing cold research — comparing five agencies on a Tuesday afternoon — the brochure converts no one, because every brochure looks identical, and identical means forgettable.
The studios that stand out from the brochure pack — the ones senior buyers screenshot and pass around in Slacks — share a small set of structural moves. They expose more of the operation than the brochure does. They publish what they’re building, what they got wrong, what they measure, how they bill. They turn the website itself into a kind of working artefact.
This essay is about those moves. Not the visual ones — every site has a typography opinion. The structural ones. The surfaces that, taken together, make a studio site read as true rather than performed.
The default studio site is a brochure
Pull up the websites of any ten digital agencies in your city and stack them side by side. Within thirty seconds, you can see the template. Hero headline (some variant of “we build digital experiences that move the needle”), services strip, work showcase, testimonial carousel, contact form. Sometimes a manifesto, sometimes a process diagram, almost always an awards strip in the footer.
The visual identity across these sites genuinely differs. One studio loves serifs, another loves brutalist mono, a third loves seamless WebGL hovers. But the information architecture is identical. Every page surface is the same page surface. The buyer can’t learn anything new about the studio without booking a call — which, of course, is the explicit point.
Every page on a brochure site exists to drive the contact form. A site that exists to drive the contact form converts the buyer who already wants to fill out the contact form.
This shape is fine — it’s not lazy, it’s learned. Most agencies got here by A/B testing themselves toward conversion, and the conversion is real. The problem is that the test population was already-warm visitors. For them, the brochure is enough. For everyone else — the senior buyer doing cold research, the engineer evaluating whether you’ll respect their stack, the founder who wants to know whether you ship — the brochure says nothing.
What it says nothing about is the operation. Whether the studio actually ships, and at what rhythm. What it shipped last week. What it got wrong. What it measures. Whether the people writing the headlines also write the code. The brochure performs capability; it doesn’t prove reliability.
Trust surfaces
The studios that read as different have a small set of pages the brochure omits. They’re not flashy. They’re not where the studio shows off the WebGL hero. They’re the pages where the studio shows the receipts.
A live dashboard
Most studio sites claim a Lighthouse score in passing — 99 / 100, of course — and move on. A trust surface publishes the number live, with the build SHA, the deploy environment, the time since the last commit, and Web Vitals captured in the visitor’s own session.[1] The visitor clicks view source and the number is verifiable in their browser’s own dev tools.
The dashboard is one move; what makes it a trust move is that it’s public. Every existing client of the studio can see the numbers their site is hitting. Every prospect can see what the studio’s own site hits before they commit to a retainer. The number is the same number. There is no demo build hiding the production reality.
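As a sketch only: the payload behind such a dashboard might look something like the following. The field names, the `timeSince` helper, and the sample values are illustrative assumptions, not this site’s actual implementation; in the browser, the session Web Vitals would arrive via the web-vitals library’s callbacks.

```javascript
// Hypothetical shape for a live dashboard payload. Field names are
// illustrative, not a real API.
function timeSince(isoTimestamp, now = Date.now()) {
  // Render "time since last commit" as a coarse human-readable string.
  const mins = Math.floor((now - Date.parse(isoTimestamp)) / 60000);
  if (mins < 60) return `${mins}m ago`;
  if (mins < 60 * 24) return `${Math.floor(mins / 60)}h ago`;
  return `${Math.floor(mins / (60 * 24))}d ago`;
}

function buildDashboard({ sha, env, lastCommitAt, vitals }, now = Date.now()) {
  return {
    build: { sha: sha.slice(0, 7), env },     // short SHA + deploy environment
    lastCommit: timeSince(lastCommitAt, now), // e.g. "3h ago"
    vitals,                                   // captured in the visitor's session
  };
}
```

In a real deployment the SHA and commit timestamp would be injected at build time (for instance from CI environment variables), so the page cannot display anything other than what was actually deployed.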
An honest roadmap
Roadmaps inside companies have always existed; what’s new is publishing them. A page with three columns — Now, Next, Exploring — under the studio’s own URL is a trust move because it puts the studio on a clock. Anything in Now has to ship or get demoted. Anything in Next needs a credible path. Anything in Exploring is research, no commitment, but it’s research the studio is willing to say out loud.
An error log
The least-published trust surface, and the most powerful. Every studio has shipped mistakes. The ones who admit it publicly turn each mistake into a reputation asset.[2] A page like /errors with five entries — what we shipped, what was wrong, what fixed it, what we learnt — does more for senior buyers than three new case studies.
The trick is that the lessons can’t be theatrical. “We shipped too late” is theatre. “We framed an ambient looping reel as a live preview, and replaced it with a static screenshot the day a client called us out on it” is a real mistake worth learning from. Specificity is what separates the page from a humility-flex.
A working tool
A free tool that does something genuinely useful — runs a Lighthouse audit, compares against a competitor, generates a proposal — converts colder traffic than any case study can. The visitor doesn’t just read about your craft; they experience it. They leave either thinking these people know what they’re doing or they fixed something I couldn’t. Both lead to the contact form by a different route.
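For the Lighthouse-audit flavour of tool, the plumbing can be surprisingly thin. The sketch below uses Google’s public PageSpeed Insights v5 API (the endpoint and the `lighthouseResult.categories.performance.score` field are part of the published API; the helper names are my own invention):

```javascript
// Sketch: run a Lighthouse performance audit via the public
// PageSpeed Insights v5 API. Helper names are illustrative.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

function buildAuditUrl(siteUrl, apiKey) {
  const params = new URLSearchParams({
    url: siteUrl,
    category: 'performance',
    strategy: 'mobile',
    key: apiKey,
  });
  return `${PSI_ENDPOINT}?${params}`;
}

function extractScore(psiResponse) {
  // PSI reports category scores as 0–1; Lighthouse displays 0–100.
  return Math.round(psiResponse.lighthouseResult.categories.performance.score * 100);
}

// In a real tool:
//   const res = await fetch(buildAuditUrl(targetUrl, apiKey));
//   const score = extractScore(await res.json());
```

The design choice that matters is returning the result before asking for anything — the moment the tool gates its output behind an email field, it stops being a trust surface and becomes lead-gen wearing one’s clothes.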
Receipts as positioning
Every claim a studio makes is either a receipt or noise. “5.0 Google rating” with a link to the live Google Business Profile is a receipt. “5.0 Google rating” in a vacuum is noise. The visitor has no way to verify it, no way to know if it’s current, no way to evaluate the population the average covers.
The receipt move on a case study is to attach a small “?” next to every quantitative claim. Click it: a short editorial note explains how the number was measured, and where applicable, a link out to the source — Google Maps, the public Stripe receipt, the analytics dashboard screenshot. The cost is editorial. The benefit is that the studio stops claiming things it can’t back.
The asymmetry matters: a metric without a receipt becomes more credible after this convention is in place, not less, because the studio has demonstrably published only the metrics it can defend. The ones it leaves un-annotated read as the unverifiable ones — and the studio admits it rather than fudging.
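The convention above is simple enough to sketch as data. Everything here — the shape, the field names, the sample claims — is a hypothetical illustration, not a real schema:

```javascript
// Sketch: receipts as data. Each annotated claim carries a note on how
// the number was measured and, where possible, a verifiable source link.
const claims = [
  {
    text: '5.0 Google rating',
    receipt: {
      method: 'Average of all public reviews as of the stated date.',
      source: 'https://maps.google.com/?cid=example', // hypothetical link
    },
  },
  { text: 'Loved by our clients' }, // no receipt: stays un-annotated
];

// Only claims with receipts get the "?" affordance; the rest render
// plain, which is itself the honest signal.
const verifiable = claims.filter((c) => c.receipt);
```

The filter is the whole point: the page never fakes an annotation, so an un-annotated claim visibly admits it can’t be verified.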
The risk of theatre
Every move on this list has a counterfeit version. A dashboard with a hand-set Lighthouse number that hasn’t been re-measured in a year is a counterfeit dashboard. A roadmap that lists ten aspirational items and never moves is a counterfeit roadmap. An error log with one entry from 2022 about “we used to be too perfectionist” is a counterfeit error log. A free tool that asks for an email before it returns the result is a counterfeit free tool.
Counterfeit trust surfaces are worse than no trust surface. The visitor recognises the shape — they’ve seen studios like Linear do it for real, and they know what real looks like — so the knock-off reads as cynical. The studio loses on two fronts: it doesn’t convert the cold traffic the brochure was already failing on, and it actively repels the senior buyer who can spot the seams.
Counterfeit transparency, in other words, is worse than opacity: the visitor’s pattern-match for the real thing is precise enough to catch the fake.
The way to avoid theatre is to start small and let the pages stay incomplete in public. A dashboard with two numbers is more credible than a dashboard with twelve, half of which are stale. A roadmap with three real items beats ten aspirational ones. A working tool that does one specific thing well beats a tool that promises everything and delivers a marketing pitch.
The discipline is harder than it looks. The temptation, once the page exists, is to keep adding. The discipline is to keep removing — anything that isn’t actively being maintained, anything that doesn’t earn its place every week.
What this costs
Every trust surface on this list carries a real maintenance cost, and the brochure model exists in part because the brochure is cheaper to maintain than the alternatives.[3] A dashboard that goes stale is a trust violation. A roadmap that lies is worse than no roadmap. An error log that nobody updates is a museum, not a living document.
The studios that invest in this kind of architecture are spending real time on it. Time that doesn’t directly bill to a client engagement. Time that the brochure-only agency is using to take another project, or to send another round of cold outreach.
The reason it’s worth doing anyway is that the brochure’s cost is invisible — paid in lost cold traffic the agency never measures because those visitors never converted to a contact-form submission. The trust architecture’s cost is visible: it shows up on the team’s calendar. But the return is buyers who arrived already convinced because the website did the convincing.
A note on this site
Everything described above lives on this site. The dashboard is at /pulse, with a real Lighthouse number, real Web Vitals, real commits. The roadmap is at /roadmap, with three columns. The error log is at /errors, and the entries are real. The free tool is at /audit, and it really does run on Google’s API.
We’re writing this in part to be honest about the choices, and in part because the architecture is its own argument. The studios that read as the top 1% to cold traffic aren’t there because their hero animation is more sophisticated. They’re there because the rest of the site does the work the hero only claims to.
If you’re building one, we’d encourage you to start with the smallest credible version of a trust surface, ship it, and let it stay incomplete in public until the next thing earns its place. The brochure isn’t going anywhere. But the room to differentiate inside it is getting narrower every year.
Notes
- [1] Web Vitals captured client-side use the web-vitals library and reflect the visitor’s own session, not a synthetic test rig. This means the dashboard reads back what their hardware + their connection actually delivered — which is the only number that matters.
- [2] The asymmetry: every studio site says “we ship great work.” Almost none publish what they got wrong. The studios that do read as honest because the social cost of publishing a mistake is real, and almost no one pays it.
- [3] A reasonable rule of thumb: budget 1–2% of the studio’s ongoing engineering capacity for trust-surface maintenance. That’s the amount that keeps things current without eating into the work that pays for it.