Titus is a Senior Principal Scientist at Adobe, focusing on Developer Experience. He has served on the C++ standards committee, chairing the working group for the design and evolution of the C++ standard library. He has also served on the ACM/IEEE/AAAI CS2023 steering committee, helping set curriculum requirements for Computer Science undergraduate degrees, with a focus on software engineering. Titus was a thought leader at Google for many years, focusing on C++, software engineering practice, technical debt, and culture. He is the lead author of Software Engineering at Google (O'Reilly, 2020).
Titus on LinkedIn
"We're in the business of developer productivity. We're just wise enough to know that you can't measure productivity, and developer experience is the best proxy for it. The answer to developer experience is not donuts and ponies. It's the right tools, processes, and the right culture.", says Titus in this episode of the Hangar DX podcast where he talks about fear in tech teams and the importance of psychological safety.
I just cannot find research anywhere that doesn't say psychological safety and a culture focused on growth and learning for frontline developer teams are the primary indicators, predictors, and foundations of technical success. Even in the DORA research and Accelerate, and all of these things that we've been talking about for 10 years, we find that at the root of team and organizational success, you have to focus on culture.
Even before DORA, there was Project Aristotle. The researchers' initial hypothesis was that success would come from getting well-funded teams of experts and eliminating all of the hurdles in their way. They could not find any statistical backing for that. Instead, the research showed that you have to establish psychological safety and then get people to actually follow through on their commitments.
They came up with bullet-point recommendations on how to make teams successful, but nobody remembers what numbers three and four are. Most of the time, you don't even remember number two. Because if you get the psychological safety thing right, the rest of it just starts to fall into place.
Psychological safety is the state where everyone on the team feels comfortable expressing an idea that might not work, admitting that they don't know something, making an honest, novel mistake, asking questions, not fitting in with the group, or not knowing something that everyone else in the group knows.
That sounds great, but it's not something you get for free. It takes some active effort to get there. In a group with good psychological safety, there is a substantially larger emphasis on what we need to learn, not on who did the wrong thing.
The fact that, at least in some circles, we can use "psychological safety" as a term and people know what we're talking about indicates that we're moving in the right direction.
But there is also a rising fear coming with the advent of generative AI, crystallized in the question: are we all about to lose our jobs because the machines are coming for us?
A research paper from Dr. Cat Hicks says that 44% of the engineering community believes that AI is going to take their jobs. On teams that have proper psychological safety and a learning culture built in, that number is less than half. And their use of those tools is more playful, useful, and valuable.
The discourse around AI and AI coding assistants still boils down to the following: If you want this to work, all roads lead to culture.
The only thing that might be more foundational and more predictive of team and organizational success than culture is documentation. And that's not user-facing documentation, it's internal documentation.
How do we want to treat each other? What do we commit ourselves to? How do we use our internal tools? What are the policies around sick time or picking up a kid? Just getting those things written down is part of the culture.
There are really nice and pretty short survey instruments that you can use as a team exercise. Surveys of 12, 15, or 20 questions will give you a pretty solid signal on how this is actually going.
But there is another indicator: people not speaking up. Even if they know the answer, they won't volunteer it until they're asked. And that's indicative of something that's a little concerning.
Many new managers, especially technical leaders, haven't been given any leadership training. They don't know that it's okay to have these types of discussions. One of the bits of homework that I really like to give teams is to focus on learning to have value discussions.
Start with the campsite rule: the idea of leaving the code better than you found it. Find 10 or 15 minutes in team stand-ups or meetings and pose a hypothetical question: I spent a full day implementing this feature or fixing this bug, and knowing what I know in retrospect, I could have done it in two hours. How much time should I commit to leaving the code better than I found it?
If you don't have that discussion in the open and agree on an answer, then in everyone's head the safe thing is always to pay lip service to the campsite rule while, in practice, no one does anything. The code just gets worse and worse, and then everything is terrible and you have to rewrite.
The tech industry is way too obsessed with measuring everything, with numbers, averages, net promoter scores, and Likert scales.
If you want to know whether your team is doing a good job, ask them how effective they think they are compared to that other team. Get peer reviews from other teams and from your clients, talk to people, and stop pretending that you can average everything.
I don't think that there is a way to objectively define that.
One way to know is to ask a lot of questions: of your partner teams, of the people whose infrastructure you rely on, of the customer teams or the clients. Be skeptical. Go poke in the corners that you're afraid to poke in to get the honest feedback that you actually need.
When you’re on a really high-performing team, you kind of know it. The buzz is amazing. It's an emotional high. Your team's reputation in the organization around you is obvious.
We can't let fear dominate our engineering decisions. It's not an option. It's a requirement to bust those ghosts. You have to go reconsecrate the graveyard. Figure out what it takes to make that a system that isn't scary to change.
If you're running full tilt on just trying to produce without focusing on getting better, you're setting yourself up for burnout and horrible technical debt.
Everyone had just as much panic about losing their jobs when higher-level languages like COBOL and C++ came along. Those were also going to put programmers out of a job, because we wouldn't have to write assembly anymore; the code practically writes itself. The rise of generative AI software engineering tools is a great opportunity to devote more time and attention to how we do our jobs, whether it's with generative AI or not.
00:00 The Importance of Culture in Team Success
04:51 Understanding Psychological Safety
09:54 Signs of Low Psychological Safety
14:52 Balancing Urgency and Psychological Safety
19:55 Top-Down vs. Bottom-Up Culture
24:49 The Role of Managers in Fostering Safety
29:55 Measuring Team Performance and Psychological Safety
34:52 Technical Debt and Psychological Safety
40:09 The Future of Engineering Culture and AI
• At the root of team success is a focus on culture.
• Psychological safety allows team members to express ideas without fear.
• Teams with psychological safety are more effective and innovative.
• Managers should actively seek feedback from their teams.
• Documentation is crucial for establishing team norms and reducing fear.
• Signs of low psychological safety include lack of questions and engagement.
• Balancing urgent tasks with time for improvement is essential.
• Top-down leadership can coexist with a culture of psychological safety.
• Technical debt can create fear and hinder team performance.
• AI tools can provide opportunities to discuss broader cultural issues.