Building Software for People Who Don't Complain
By Brian Ochan | 07/02/2026

A quiet assumption runs through many of the systems we build: that when something breaks, users will complain. Over time, I’ve learned how fragile that assumption is.
Some people don’t complain, not because things work, but because they’ve learned not to. Complaining assumes access to the right channels, enough context to articulate a problem, the confidence to speak up, time to invest in the process, and trust that doing so will change anything. Those assumptions are easy to miss if you’ve always had them.
I grew up adapting to systems rather than escalating issues within them. In Uganda, where infrastructure can be unreliable and services are sometimes switched off without notice, working around broken systems is often more rational than pushing against them. Silence, in many cases, isn't resignation; it's efficiency. That experience has shaped how I see systems today.

In software, this shows up everywhere. Products are built with feedback loops that favor loud users. Metrics reward reported issues while ignoring hidden friction, and internal teams equate a lack of complaints with things working well. Silence is interpreted as success, but it often just means we’re not listening in the right way.
While contracting for a government agency back home a while ago, we worked on an application intended to improve the day-to-day work of a specific group of users. The tool was designed to give them clearer tasks, show when and where work was happening, and provide management with useful metrics, such as the time it took to complete certain tasks. On paper, it solved real problems.

Adoption, however, was low. Our initial assumption was resistance to change. But after spending time with the users, a very different picture emerged. The application was perceived as an invasion of privacy. The same features we saw as clarity and accountability were interpreted as surveillance. Reporting “when” and “where” work happened not only limited their ability to operate outside narrowly defined tasks, it also gave management insight into their location during work hours. The system’s intent and the users’ reality were misaligned in ways we hadn’t anticipated.
That gap was instructive. Since then, I’ve become more attentive to building for silence, not just feedback. I treat non-usage as a signal. I spend more time observing how systems are used, or, more importantly, quietly avoided, rather than waiting for reports to surface problems. One question now follows me into most system design discussions: What would stop someone from telling us this?
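To make "non-usage as a signal" concrete: one lightweight version of this is to look for people who tried a product, went quiet, and never said a word about why. The sketch below is illustrative only; the record shapes, field names, and the 14-day threshold are my assumptions, not a description of any real system I've built.

```python
from datetime import datetime, timedelta

# Hypothetical per-user records: when they onboarded, when they were last
# active, and whether they ever filed feedback. In practice these would come
# from an analytics store; the shapes here are assumptions for illustration.
users = [
    {"id": "u1", "last_active": datetime(2026, 1, 6), "feedback_count": 0},
    {"id": "u2", "last_active": datetime(2026, 2, 1), "feedback_count": 3},
]

def silently_churned(user, now, quiet_after=timedelta(days=14)):
    """Flag users who went quiet without ever complaining.

    Silence is the interesting case: no recent activity AND no feedback.
    A churned user who filed a complaint at least told us something.
    """
    went_quiet = now - user["last_active"] > quiet_after
    never_spoke_up = user["feedback_count"] == 0
    return went_quiet and never_spoke_up

now = datetime(2026, 2, 20)
quiet_users = [u["id"] for u in users if silently_churned(u, now)]
print(quiet_users)  # ['u1'] — the silence we might otherwise read as "no problems"
```

The point of a query like this isn't the numbers themselves; it's that it turns silence into something a team is forced to go and ask about.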
Systems reflect who they are built for, and silence often maps to a power imbalance. When using a system requires confidence, time, or a willingness to expose oneself, it quietly excludes the very people it claims to serve. Good systems don’t require bravery to use.
I’m still learning how to design and build digital products for people who don’t complain. That lens now shapes every system I touch.
This is part of an ongoing attempt to think more clearly in public.