Cogent Conversations: Episode 23

How Teams Prioritise Work and Releases

Welcome to Season 3 of Cogent Conversations, where we’ll be taking a deeper dive into the 2021 Australia & New Zealand Product Teams Report.

This is the second year we’ve published the report, featuring insights from 100 tech companies across Australia and New Zealand, and helping us answer the question we get asked most at Cogent: “how do other people do it?” when it comes to the way product teams work best together.

In this episode Adam Murray, a Product Principal here at Cogent, digs a little deeper into how high-performing teams work together. Hear him interview two highly experienced Cogent experts on the topic about their experience, opinions and recommendations.

LIZ BLINK
Lead Product Manager & Community Engagement

LEENA HA
Senior Developer

“While the majority of leaders (53%) thought the frequency of releases was about right in the 2020 survey, this year the majority (58%) would like to see frequency increased.

In contrast, 46% of team members believe their cadence of releases is about right.”

Access your own copy of the 2021 Product Teams Report here for more insights and recommendations. To keep up to date with what is happening at Cogent, including when new episodes of this podcast are released, you can subscribe via email or follow us on Twitter.

Listen to the Episodes

Subscribe and download now via:

Apple Podcasts

Spotify

RSS Feed

Download the full 50-page report

Full Episode Transcript

Adam Murray: Welcome to Season 3 of the Cogent Conversations podcast, in which we take a deeper dive into our 2021 Australia and New Zealand Product Teams Report. This is the second year we’ve published the report, featuring insights from 100 tech companies across Australia and New Zealand, and helping us answer the question we get asked the most: “How do other people do it?” when it comes to the way product teams work best together.

I’m Adam Murray, a principal here at Cogent, and in each episode, I’ll be digging a little deeper into a key theme of the report, gaining further insights and recommendations from some of our experts here at Cogent. You can download the free 50-page report by visiting cogent.co/podcastreport. Let’s get into it.

Hello and welcome to the fourth episode in this season where we are understanding more of the detail behind the product teams report. And it’s exciting to be talking to people from Cogent. I’m Adam Murray, one of the principals here at Cogent in the area of strategy. In this episode, I’m sitting down with Lena Ha, a software engineer at Cogent, and Liz Blink, a product manager here, to talk about how teams prioritise work and releases. 

I want to reflect on the report a little bit, and one of the things that jumped out to me between last year and this year is the rise in the influence of what the customer gets to say—or the customer’s voice in what gets prioritised into a particular release, as compared to, say, the influence of leaders in the organisation. And this seems pretty interesting; it seems like a fairly significant change. Liz, how might leaders be feeling about this?

Liz: Well, I’m going to guess they’re completely excited, surely! Or hopefully they’ve begun to understand that the experience or knowledge that they’ve had, and one of the reasons they’re in a leadership position, is valuable in addition to the information they’re learning from customer research or the customer input to what we should do next. And once you start combining that, I would hope they’re feeling that their impact is multiplied, because the benefit of that experience with what the customer actually needs, wants, will pay for, is now coming together and you’re getting an exponential outcome. So that’s what I’d hope they’re feeling.

I’m sure it’s a bit disconcerting to sometimes realise you might not be right. Because some of what’s underlying in not listening to the customer, or in feeling that your perspective has more value, is a little bit about a sense of ego and confidence that you’re right, and being open to hearing there’s another way to do it could be a little bit disconcerting. So I’m pleased to see they’re starting to embrace that.

Adam Murray: That’s particularly so, I suppose, if leaders have got a few runs on the board early on, and there’s a bit of a sense of, “Oh, I’m onto something here, and I think I know.” And a lot of the time leaders do know as well—I don’t think there’s any question about that—but what I’m hearing from you is that you think this is a good shift.

Liz: I think it’s a good shift. And I think it’s a little bit around the fact that it’s runs on the board—you were right at a point in time—and there are definitely some truisms about humans that we argue haven’t changed since we first stood up straight and moved on in evolution. But I think there’s a lot to be said for the fact that the conditions of the world today are actually changing the way we might act—and how our customers might act—from yesterday to today. And that’s what we’ve got to stay open to versus being reasonably confident about some things that remain the same.

Adam Murray: Yeah. So being open to more inputs influencing our decisions makes them more salient, accurate and important. Lena, from your perspective as a developer, this change in the customer’s voice being prioritised over a leader’s voice—what does this mean for you? How is this affecting your day-to-day?

Lena: Yeah. How it affects my day-to-day as a developer—yeah, we love coding, but then how do we know that what we are shipping is important? So having customer involvement, and then seeing their reaction when something’s shipped, really creates a lot of meaning in my life. Especially for other developers, it shows that the changes or improvements they’ve made have an impact at a customer level, rather than just hearing that from a leader’s voice.

Adam Murray: Yeah. And I think one of the themes that came up in the previous episode actually was that feeling that we all want, but developers in particular, given maybe the track record of a lot of their code not ever being used, perhaps, or not finding a sense of usefulness. There’s a sense of contribution and effectiveness that we all want, and you seem to be saying that hearing more of the customer’s voice enables more of that, is that right?

Lena: Yes. Hundred percent. Yeah.

Adam Murray: Yeah. Cool. You’ve both reflected on … we touched on it earlier in this conversation, how perhaps it’s a bit more unusual to have a surprisingly good prioritisation story than a story about when things have gone not so well. Liz, there seems to be a bit of an art and a science to this. Can you talk about how those things are mixed? Is it intuition? Is it methodology? Tell us a bit more about that.

Liz: I might go first, Lena, because I’d love to hear more about the techniques your team are using, and I think it’s a really interesting one. In the long time that I’ve been doing product, I have a number of spreadsheets that are about doing prioritisation with some kind of number that you’d rank along various parameters—and I’ve used none of them. Because I think there is a moment where folk might be convinced that if you can put a number against everything and then just get it in a nice, neat order, number one will make sense. But I’ve seen that when people try and do that, then they’re like, “But actually the thing I wanted to go first didn’t end up at the top.” And then you argue about how to actually put it at the top anyway.

So I think the idea that you can rationalise things that aren’t actually entirely rational is where one needs to be careful. And it’s a tricky one to find a formula that will work, that overcomes that idea that, “At the end of the day, my opinion, which is intuition and based on gut feel, I don’t know how to articulate very well,” or not me personally, but those I’m talking to don’t know how to put it into words, but they know it’s the right thing to do. And I think that’s where we play a role in starting to work on those conversations—to understand what’s going on with what they actually want to see go first, and pull out what that meaning is; to get them to talk about value in language everyone can understand.

And now, as you are talking through what should happen next, the value conversation has opened up and you don’t need numbers. Some part of it—effort, time to get it done—is important; I wouldn’t say you can ignore that, but the value, that’s the stuff that starts to get really interesting. That’s the language and that’s the conversation you want to get to. That’s where I call it art, because it’s emotive.

Adam Murray: Yeah. And there’s something interesting: you talked about intuition. For me, intuition comes from spending a long time with something and really percolating in the data or playing with the tool or understanding more of the context. I don’t know if we talk a lot about that in terms of software development or prioritisation in particular, but it’s interesting that you touched on it there. Can you elaborate on that a little bit more?

Liz: Well, I think that’s where it helps if your product team is listening to customer interviews every day, or every week, or monthly, depending on the scale of your product or service. I hate to make people feel bad, as I can’t quite find how to get it in there weekly, but I’d advocate that the more frequently you can do it, the more data points you’re putting into your head that are this qualitative information versus the quantitative information. And that will help you with the lightbulb of innovation, or the, “Oh, I’ve suddenly understood why it’s not working and why everything that we have been trying isn’t coming to fruition the way we thought it would.”

And it’s those data points that you need to have in your head, or the leap will not happen. So you need to be doing regular customer conversations, interviews, testing, prototyping, for that to suddenly come together. The random reference over here, and the way someone else spoke about it over there, won’t sound the same, but after you keep listening and listening and listening, you’ll suddenly realise they are the same things. There’s also the underlying work of running the particular prototype test you were looking for, where you can see a consistent response from your customers that says, “Yep, that’s the right one to go ahead with versus this one,” or the better one, and then you back it up with quantitative results that show that it follows through that way. But you need the unsynthesised information to keep going in, and that will lead to that leap.

Adam Murray: Yeah. There’s a really strong element of listening there, good-quality listening, to what you’re talking about. Lena, Liz referred a little bit to perhaps getting into some of the methodologies that you’ve used in particular; I’d like to ask you about that. But maybe also can you elaborate on where you sit on this art versus science spectrum of prioritisation? Or is there a spectrum, or do they need to go hand in hand?

Lena: I’d say hand in hand, a hundred percent. I find what makes a really good mix of art and science is that when you ship something, you want to gather as much information as you can data-wise, to really then understand the next steps based on data. You don’t want to be going on intuition alone these days. That’s why we have data science—like Google Analytics, to see where users are really clicking and really spending their time—to really understand where we should be investing more of our efforts.
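
To put a little flesh on that, here is a rough, hypothetical sketch (in TypeScript, not taken from any Cogent project) of the kind of click tracking Lena mentions: firing a Google Analytics 4 event when a user clicks a feature, so the team can later see which parts of the product actually get used. The element ID, feature name and wiring below are invented for the example.

```typescript
// A hypothetical sketch of click tracking with GA4's gtag.js.
// Assumes the standard gtag.js snippet is already loaded on the page;
// the element ID and feature name are invented for illustration.

declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, string | number>
): void;

function trackFeatureClick(featureName: string): void {
  // 'select_content' is one of GA4's recommended event names; content_type
  // and item_id are the parameters documented for it.
  gtag('event', 'select_content', {
    content_type: 'feature',
    item_id: featureName,
  });
}

// Hypothetical usage: wire the tracker to a button in the UI.
document
  .querySelector('#export-report-button')
  ?.addEventListener('click', () => trackFeatureClick('export_report'));
```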

Adam Murray: Yeah. Great. Yeah, there is a lot more data available, so accessing and using it in an effective way … certainly if we’ve got that, let’s use it well. What about some of the actual methodologies and techniques that you employ as you go about your day-to-day work?

Lena: My favourite is really just getting all the stakeholders into one room. It’s a really good learning exercise, because all the stakeholders get to understand what the other stakeholders are going through, and understand their problems as well, rather than interviewing them one on one. And also ensuring that there’s psychological safety, because you could have a stakeholder who’s a bit louder than the others. And part of our role is to really facilitate the conversations inside that room with all the stakeholders, ensuring that everyone has a voice and feels heard. And that’s how we can really maximise solving problems for them.

And so once we gather them all in the same room, we talk about the problems that they’re going through, understand their background and understand what sort of language they use in their field as well, so we can start talking in their language. And then on top of that, once we’ve got all that, then we start moving towards, “What does the idea of goals or success look like to you?” And then we take all that information away and we summarise it. You can summarise it on a massive piece of paper—having some visuals also really works well—and then bring it back to the stakeholders again, all in summary, and make sure that we’re aligned with them, that we’re on the same page: “These are your problems, these are your goals. These are what you see as success.” And if everybody’s like, “Yep,” and we’re all aligned, then we start moving towards interviewing customers or users to really understand from their perspective as well.

And then we start understanding, “Okay. Now we understand the full problem. We’re going to start coming up with a bunch of stories or mini deliverables in a quantifiable way as well.” And then once we’ve got all our stories or mini features, we bring all the stakeholders back in again and we go, “Okay, what does MVP look like?” Because you could have the blue sky and everything, but obviously every client has a budget in mind and we don’t want to be selling everything and then be like, “Yep, this is how much it’s going to cost,” and they’re just like, “Oh man, we can’t afford that.” So that’s why we do MVP. And we sometimes try and push back and say, “Do you really need that? Do you want that?” Sorry, I’m just so excited! I’m really passionate about this field, but basically we ask them a series of questions to really make sure that we actually need a particular story or mini deliverable, and once we’ve got that all sorted, we then talk about return on investment, start prioritising those stories and seeing which ones have the highest return on investment versus the lowest. And then we start shipping it.
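
As a concrete illustration of that last step, the sketch below ranks hypothetical stories by estimated value over estimated effort. It is not Cogent’s method, and, as Liz notes earlier, a score like this is a conversation starter rather than a substitute for the value discussion; the stories and numbers are invented.

```typescript
// A minimal, hypothetical sketch of ROI-style ranking: score each story by
// estimated value over estimated effort and sort. In practice the value
// scores come out of the stakeholder conversations described above.

interface Story {
  title: string;
  estimatedValue: number;  // agreed value score, e.g. 1-10
  estimatedEffort: number; // rough effort, e.g. ideal days
}

function rankByRoi(stories: Story[]): Story[] {
  // Higher value-per-effort first.
  return [...stories].sort(
    (a, b) =>
      b.estimatedValue / b.estimatedEffort -
      a.estimatedValue / a.estimatedEffort
  );
}

const backlog: Story[] = [
  { title: 'Self-serve password reset', estimatedValue: 8, estimatedEffort: 3 },
  { title: 'Export report as CSV', estimatedValue: 5, estimatedEffort: 2 },
  { title: 'Redesign settings page', estimatedValue: 4, estimatedEffort: 8 },
];

for (const story of rankByRoi(backlog)) {
  console.log(
    `${story.title}: ${(story.estimatedValue / story.estimatedEffort).toFixed(2)}`
  );
}
```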

And yeah, that’s the exciting stuff. I find this way of working really works well. In my experience, clients always come back and they’re like, “Yeah, we’re just so happy that it’s gone as well as expected.”

Adam Murray: Yeah. Awesome. I can see that passion coming out. I might ask you a little bit about why or where that’s there. 

Liz, Lena was talking about some of the tools we have historically used, like paper, to do this kind of thing. We do all love paper. As we know, most of us are working in quite a distributed setting at the moment, which seems to be something that will continue. And it came out in the report that specific tools that are targeted only at roadmapping don’t seem to be the flavour of the month, and people are finding other ways to do this. What do you make of that? And what are you noticing in your own experience about the type of tools that people are using?

Liz: I know there are plenty of software applications that claim to be roadmapping tools, and the fact that it didn’t appear from our respondents that they were using those is a reflection on product-market fit somewhere else. But I think it’s actually got to do with the paper-to-something transition. I think Lena was really on point there.

And I think that the fact that Mural and Miro are the predominant tools in what was being used comes down to, I’m going to assume, the visual way in which you can talk through what you should do next. It avoids, actually, getting into a spreadsheet and trying to pretend that if you can do some math on it, it’ll all work itself out. It’s actually about being really visual. It’s about filling in that gap and still doing it on a wall, and the ease with which, theoretically, you can move around where things are sitting, because you’ve started to understand the value behind or through those stories and conversations you’re having. You’re like, “Oh, so you actually mean this should sit here?” And you can just drag it around and go, “Now, is that making sense? Is this the picture you were looking for?” And people can go, “Yeah, that’s what I was thinking it should do.”

But you’ve also extracted some of the language around the why, not only just a list or even a good board of work. If it doesn’t have that meaning, if it doesn’t have the why, it will still feel a bit unclear for the team to follow through on. But it’s those conversations that are really important, and I think that’s why those tools are appearing the most.

Lena: Yeah, hundred percent. Using Mural, or even Paperbase, really gets the stakeholders or the customers involved in the whole process. So it’s definitely a great way of making sure everyone is actively participating in these types of sessions.

Liz: I completely agree. I think it’s … I was going to say co-design, but that collaboration. Everyone else feels like they can jump in and play on that board as well, whereas if you’re using other types of applications it’s a bit like, “I don’t understand how to use this, so I can’t touch that.” And suddenly you’ve made yourself a little bit too tightly in control, a micromanager of those things, versus collaboration and co-design. And so people can just jump in and they’re like, “I don’t want that over here. It’s going over here.” You can see it happening, and then you can follow through on the conversation.

Adam Murray: There’s a whole other half to this conversation that we’re supposed to be having, and we’re probably going to have a small portion of it, which is around releasing and release frequency and reasons. So what makes a good release? There were some stats in the report there about the frequency with which people are releasing, which hasn’t changed that much year on year, but it seems like there is a slight trend towards people releasing more frequently as well. But Lena, what makes a good release?

Lena: A good release in my eyes would be a small release. From a technical perspective, releasing things small allows you to roll back. Sometimes if you do a big release, you ship it to production, and then all of a sudden it’s just not working as it should; it’d be really hard to recover from that because you’d be impacting a lot of people. So shipping something small, even doing A/B testing as well while you’re at it, really makes sure that things do go as smoothly as they should. And it’s also a good way to recover if they don’t.
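
One common way to keep releases small and easy to back out of, in the spirit of what Lena describes, is to put the new code path behind a feature flag and roll it out to a small percentage of users first. The sketch below is hypothetical and not tied to any particular flagging tool; the config shape and checkout example are invented for illustration.

```typescript
// A hypothetical sketch of a percentage rollout behind a flag, so a bad
// release can be "rolled back" by turning the flag off rather than
// redeploying.

interface FlagConfig {
  enabled: boolean;
  rolloutPercentage: number; // 0-100: share of users who see the new path
}

// Deterministic bucketing: the same user always lands in the same bucket,
// which also gives you stable groups for A/B comparisons.
function isInRollout(userId: string, config: FlagConfig): boolean {
  if (!config.enabled) return false;
  let bucket = 0;
  for (const char of userId) {
    bucket = (bucket * 31 + char.charCodeAt(0)) % 100;
  }
  return bucket < config.rolloutPercentage;
}

// Start at 5%, watch the metrics, then widen the rollout or switch it off.
const newCheckoutFlag: FlagConfig = { enabled: true, rolloutPercentage: 5 };

function renderCheckout(userId: string): string {
  return isInRollout(userId, newCheckoutFlag)
    ? 'new-checkout-flow'      // the small new release under test
    : 'current-checkout-flow'; // existing behaviour as the safe fallback
}

console.log(renderCheckout('user-42'));
```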

Adam Murray: Yep. And Lena, what about the other things that people can do as a team? There’s the fast shipping of things and shipping small things, and it feeds well into prioritisation as well: if we ship small things, we ship faster, we get that feedback faster, and then we can make better prioritisation decisions and we’re not guessing too far into the future. What other things can the team do to ensure that a release is successful, over and above shipping frequently?

Lena: Sometimes we have deadlines looming over us and we feel the need to go as fast as possible. And when we go too fast, sometimes we take shortcuts, and it leads to no good. For example, from a development perspective, we can release a lot of things without what they call test-driven development, having guards in place to ensure that things are operating as they should. And if we don’t have those guards in place, we can make some really bad mistakes when we release into production. So bugs occur, maybe the login page stops working, and that really stagnates the pace within the team of releasing good stuff.
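
For readers less familiar with the “guards” Lena is referring to, here is a minimal, hypothetical test-driven-development example in a Jest style: the tests describe the behaviour the login check must keep, so a rushed change that breaks it fails before it reaches production. The validateLogin function and its rules are invented for illustration, not taken from any real project.

```typescript
// A minimal, hypothetical TDD-style guard around a login check.
// A real suite would cover the team's actual requirements.

import { describe, expect, test } from '@jest/globals';

// Hypothetical function under test.
function validateLogin(email: string, password: string): boolean {
  return email.includes('@') && password.length >= 8;
}

describe('validateLogin', () => {
  test('accepts a well-formed email and a long enough password', () => {
    expect(validateLogin('person@example.com', 'correct-horse')).toBe(true);
  });

  test('rejects an empty password', () => {
    expect(validateLogin('person@example.com', '')).toBe(false);
  });

  test('rejects an email address without an @', () => {
    expect(validateLogin('not-an-email', 'correct-horse')).toBe(false);
  });
});
```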

And so in my eyes, a good release is making sure things are done right. So making sure that we are meeting customer requirements, stakeholder requirements. Then making sure that everybody in your multidisciplinary team is involved, so making sure UX, UI, PMs, everybody is working collaboratively together to ship it. And then release it, grab data to make sure that it is working as it should. And then start again, really assess. 

For example, we’ve released something, we’ve grabbed all the data. All right, now we have the data, we know what’s important, let’s reassess our stories or mini deliverables and reprioritise them again, and then work on the next thing. And just going into that loop really gets things going.

Then when you do a release, what I like to do every sprint is to do a fantastic release. So showcasing to the actual stakeholders the work that we’ve been doing for the last week or last two weeks, explaining why we did it a certain way, how we did it. It could be like an educational piece to really have everybody involved and understand the value that they’re getting out of it. I think that’s really it. And that in itself really builds a strong rapport with the customers and stakeholders.

Adam Murray: Awesome. Thank you, Lena. Liz, from your product management perspective and releases, Lena’s touched on some of the things there that the product managers might be doing. But can you tell us a bit about what makes a good release from your point of view?

Liz: Yeah. There are a couple of things I want to build on from what Lena said. I liked the language around the release: make sure it’s right, or it’s good. You won’t ship everything without some bugs, but what you’re trying to do is make sure that what you do put out there is valuable. It’s not shit. And by that I mean it’s important to also understand what the purpose of the release is. What is it actually going to deliver that will have impact for the customer?

And if that’s really clear, then you might wait a week if you realise … or you won’t insist on shipping for shipping’s sake. You’ll wait, because something’s come up in the code, there’s something wrong with the design, or whatever aspect might not be right. If it’s not going to give that customer value, don’t ship it just for the sake of shipping it—move it out a week.

And I think the only reason I pause for a moment is that it’s tricky if you’re in a team where deadlines and dates and launch dates are important, and you need to have something ready for a date-based event. The Melbourne Cup will run anyway, whether or not you’ve shipped whatever you might need around it. Or it might be a sale, or, as in one of my previous gigs, a product that needed to go live and be available for sale by a start date, so we had to be ready.

And it just means to me that you’ve got to shift and be careful about scope—back to prioritisation—to ensure you hit the deadline. That’s still got value, because again, if you drop it and it’s bug-filled, et cetera, you’ve wasted your moment to hit that deadline. And so you’ve got to be careful not to obsess about that, but be clear about what it means for the constraints you should work within to ensure you’re getting those pieces. And I think all of that—the test-driven development, being really confident that when you get to live it’s going to be in a good state—helps with all of those things. You don’t suddenly turn the switch on and watch all the lights flicker, and then you’re like, “No!”

So I think working together in those styles really helps with that when you’ve got something that has a deadline that you need to make sure you work with. But ideally you understand the purpose, that MVP definition that Lena keeps referring to. And if you haven’t hit it, just move the release by a week. It’ll be okay.

Adam Murray: Yeah. Very good. Well, we do need to wrap up, so I’ve got one more question for each of you. Lena, for those people who are doing this well, doing prioritisation well and perhaps doing releases well, what can they do to improve on that good level? What are some things that teams can start to think about that they might incorporate into the way they do things?

Lena: Yeah, nowadays I don’t see it happening too often, but getting feedback when you release something is so important. I don’t think we do that enough. How do we know that it’s going as smoothly as it should? Is it impacting users as much as it should? And just refining the release itself: making tweaks to it to ensure that it’s actually working as it should. 

And on top of that, speaking of the deadlines Liz mentioned, we should also check in with our teammates. I think culture is important within the company and also with stakeholders. Just asking everybody how they’re going, do they need help? It’s the collaboration aspect—just checking in really does help make the releases a lot better. Now, with the pandemic, we can end up working in silos because we’re working from home, so it’s just a reminder that we should make sure we are in it together and work together as a team.

Adam Murray: Yeah. I love that. And for you, Liz: for teams that are perhaps at the other end, and listening to this feeling a bit discouraged, what are some small things they can try to implement? Some small steps in this direction, towards improving the way they prioritise or do releases?

Liz: To be honest, my advice would be to find a way to talk to a customer once a week, or read the customer feedback that’s coming through your support or help team each week. If you just take an hour a week to see some of that, and you can build it up over time, that’s what I’d ultimately advocate for. But if you started with an hour a week of that information, you are now starting to set yourself up with your own inspiration, your own motivation and your own knowledge, which leads to having some confidence to introduce it into conversations with leaders or stakeholders whose perspective might be missing that information. And you only need a few tickets from the help desk that talk about a problem in the product to go, “Wow, actually there’s some stuff we could do here as well.” That’s the simplest place to start.

And then you can pick a friend—depending on your type of software, there’s probably someone out there that you know in your friendship group, and it’s quite safe and easy to just say, “What do you think about this?” And just build up your confidence to maybe start to invite customers from your help tickets or the site itself, and give yourself more and more information like that. Mostly, it’ll just make your job so much more fun. Because yeah, that’s why we’re here, right? That’s why we build products: to serve our customers. And then eventually it leads into that confidence to speak up around why you want to advocate for something, because you’ve got that knowledge.

Adam Murray: Yeah. Beautiful. Well, thank you both for that. Thank you, Lena; thank you, Liz. And thank you all for listening, too. If you’d like to learn more about how teams across Australia and New Zealand are doing things, you can download the free 50-page report by visiting cogent.co/podcastreport. Until next time!