The David Bland Hypothesis: Assumption Mapping Before Testing Business Ideas Facilitates Better Product Decisions

David Bland, founder of Precoil, explains how important it is to get cross-functional teams to agree on the assumptions that must be true for your business to succeed, and discusses different approaches to testing those assumptions.


David Bland is the founder of Precoil and the co-author of the new book, Testing Business Ideas. In this episode of the Product Science Podcast, we talk about how important it is to get cross-functional teams to agree on the assumptions that must be true for your business to succeed, and discuss different approaches to testing those assumptions.

Subscribe for the full episode on Apple, Google Play, Spotify, Stitcher, and more. Love what you hear? Leave us a review; it means a lot.


Questions We Explore in This Episode

What did David learn from sticking with his first startup for eight years? How does your industry impact what it means to shift to agile and lean processes? What lessons did he learn from being in a startup that made a major pivot from B2C to B2B? What are the dangers of ignoring evidence, and how do organizations find ways to explain it away? What do you do when you realize your initial plan won’t work?

How do you tell people they’re wrong without shutting them down? How does David use assumptions mapping? Why is it vital to always have a cross-functional team in the room? Why is it so important to get your team to agree on what wrong assumptions will kill your product, before you even test them? How did working through these exercises lead David and Alex to write Testing Business Ideas?

How can crowdfunding help validate the fundamental question of whether people will pay for this? What other tools can you use to test price and viability? How can product management embrace their influence on their organization’s business model? What happens when product and marketing have a disconnect, and how do you work on that? Why are product managers drawn from so many different backgrounds, and how do we train them better?

What can you learn from manually delivering value to your customers? How can you use that evidence to create effective automation? Why should you look at the feasibility side of your product with the same curiosity you look at the viability side? What bad habits does David run into in giving interviews? What are the keys to doing effective desirability testing?

How do you separate infrequently used but essential features from things your customers won’t miss? How do you segment your customers to learn more about their behavior? What are leading companies doing to get out ahead on presenting the right environment to the right customers, and how is that influenced by good experiment design? How do good product managers look at data? How do business leaders help support this style of work?

Quotes From David Bland in This Episode

How important is this to us succeeding? When we write down our evidence, really, really fascinating conversations occur where this is a kind of knowledge share.
If there's nothing there and they won't pay enough, we can't go forward, and I think that people don't always want to face that early on either.
It's tricky because you want to drive success, but bring other people along with you for the journey. And so it's a balance of facilitation, but also having an opinion on which direction to go.

Transcription

Holly Hester-Reilly:

Hi, and welcome to the Product Science Podcast, where we're helping startup founders and product leaders build high-growth products, teams, and companies through real conversations with the people who have tried it and aren't afraid to share lessons learned from their failures along the way. I'm your host, Holly Hester-Reilly, founder and CEO of H2R Product Science.


Holly Hester-Reilly:

This week on the Product Science Podcast, I'm super excited to share a conversation with David Bland. David is the founder of Precoil, which I believe I first came across on Medium. I don't know if that's the place most people find you, but I still remember the article, “Don't Get Dollar Shave Clubbed.” He's currently getting ready to release a book with Alex Osterwalder on testing business ideas. I am super excited to talk with David today about that, his path, and what he's up to these days. So welcome, David.


David Bland:

Thanks for having me.


Holly Hester-Reilly:

So I often like to kind of begin at the beginning. And I'm curious, I know you've been working in the industry for a while. How did you first come into this world of high growth products?


David Bland:

Yeah, it's been kind of a winding journey. I trace it back to school. I went to school for design, right when the .com bubble was happening, and I thought, wow, I'm just going to create this website for a startup or something, retire early, like in my mid 20s maybe, and life's going to be amazing. It didn't play out that way. But I did join a startup coming out of school, and I learned really quickly that designing amazing things is only a really small part of your job at a startup. You have to do so many different things, so I messed around learning how to code, learning how to do other things, doing sales; I had to level up really, really quickly. Our startup was in financial services. I was there for about eight years, and we were acquired for $16 million in 2006. I kind of grew up there and learned a lot of really hard lessons about what works and what doesn't, as far as product goes, and platform goes. I think the big learning for me from that experience, the one that really influenced my career, was that we thought we were B2C, business to consumer, and we ended up being B2B, business to business. That's a pretty big pivot for a startup, and it wasn't until we made that pivot that we were really successful. We were about out of money before we made it. So I think that's really influenced my career over the years. Then I bounced around a couple of other startups, and about nine or 10 years ago I made the switch to advising and consulting. Really, what I've been doing is taking everything I've learned by experiencing it and packaging it up in a way where people don't get stuck anymore when they're trying to make their way forward: does the business or product have a fit or not, and how do we test it?


Holly Hester-Reilly:

Awesome. Eight years at that first startup that's a long run.


David Bland:

It is, probably a little too long. There was a part toward the end where I was like, you know, I've been doing this for so long. And it was the first startup I joined. I wasn't a co-founder, but I joined really early; I think I was employee number five or something like that. It was such an experience, and I kind of grew up there, but I learned some bad habits at that startup too. We got thrown into agile with no training, right, around 2000-2001. It was like, "We're agile," and okay, I don't know what that means. And it was a high pressure environment. We were trying to release stuff really quickly, but we were in financial services, so when you're transferring a billion dollars of premiums a month, you mess that up and your startup is gone. It was really interesting when I joined a couple of other startups, like, oh, these are actually very, very different worlds depending on what you join. I certainly learned [inaudible 00:04:17]. It really, again, influenced my career, because I did almost everything there. And once you have that experience, it's really tough to go, well, I'm going to do a very narrow, specific thing and only that.


Holly Hester-Reilly:

It sure is. How big was that startup when you left it?


David Bland:

I don't even remember. Probably around 100 people or so. We kept it pretty small, and we [inaudible] a platform, so the platform was the scaling part; we really didn't need a lot of people to run it. But I actually don't remember the exact headcount. We were acquired, and I've been on both sides of acquisitions. I've been at a startup that got acquired, and I've been at a bigger company that acquired a startup, so I have empathy for both sides as far as how those situations play out. But it was an amazing experience for me. It helped shape the way I think and the way I approach problems.


Holly Hester-Reilly:

Yeah, I'm totally sure it was. Is there anything you can kind of pinpoint that helps illustrate that? Like, is there a lesson from those years that you find yourself constantly teaching other people because they haven't learned it themselves yet?


David Bland:

Yeah, I mean, we didn't really have a lot of the language that we have now with regards to business and strategy and product. I say we pivoted, but back then we didn't call it a pivot. We didn't have a lot of the analytics that we have now, either. For example, back when we started creating websites and all this stuff, we just measured hits. What's a hit? It was really just hits went up, hits went down, compared to now, where you have all this really intricate data available to you. I think the lessons I keep teaching are lessons around: have an opinion about where you should go, but be informed by data and by what you're measuring. You know, the only reason we ended up pivoting to B2B was that no matter how many nights I stayed up building tools for consumers, they didn't want the product. It was very disheartening, because I literally slept at the office, did not have a very good work-life balance at all, and yet it didn't matter how amazing what I designed looked or what we put out there in front of them; they just always turned to a financial advisor to make a purchase. It wasn't until our founder was smart enough to say, well, why don't we go after financial advisors, they're the ones always making these purchases. And the other two startups I joined, I won't deep dive into them, but we didn't really use the data to make an informed decision. We just kept explaining it all away and saying, oh, you know, they don't get it, we'll just keep building and keep building, and those things didn't do so well. I think they didn't do well because we kept ignoring everything and talking ourselves into persevering: we're going to be fine. So I feel like I keep teaching those lessons: certainly have a vision and a direction, but be open to the idea that you might be wrong. And if you're still really passionate about the space, and you pivot in a way where you're just helping bring it to life, then go for it. Those are the lessons I keep teaching over and over again.


Holly Hester-Reilly:

Yeah, and that makes sense. I think sometimes people are scared to let themselves see or understand whether people really are interested in what they're building or will actually use it, or really the other side, right? Like, if there's a lack of interest, they just keep explaining it away. I've seen that as well. It's definitely easier to point to it and say, that is what I see here, when you've lived through it before and can say, yeah, that's what it looked like over there, and that didn't work.


David Bland:

Yeah, I mean, it's a really tough spot to be in, because depending on your role in that company, you're kind of owning this and you want to see it succeed. But sometimes maybe it shouldn't succeed. Maybe there isn't really a there there, and it's okay to park it and try something else. I had this really awkward situation at one startup where I realized, about a year in, that if I was really successful with what I was trying to do, it would kill the startup. Because of literally how we built everything and how we integrated with partners, if I was super successful at my job, that startup would die. So at that point I was like, no, I'm just gonna moonwalk away from this. And you have to have that awareness too, because not everyone does; we just want to plow forward and be successful no matter what we're focused on. You do have to have a bigger picture in mind, because sometimes being really successful may mean that you're actually killing the company or killing the business model. So it's certainly a journey. I just feel like it's this balance between being driven and having a vision, but also being open to the idea of being wrong, and open to other things that may influence your thinking, for sure.


Holly Hester-Reilly:

Yeah, absolutely. So is that something that you're really helping people with the tactics of? Because I feel like it's one thing to say, do that, but I know you've been working on giving more details on how to do certain experiments and when to do which kinds. Tell us more about that, and how it helps people figure out how to walk that balance line.

David Bland:

Yeah, I mean, that's a great question. So when I'm working with companies, people don't like hearing they're wrong, and it's certainly not my place to come in and act like I'm the expert in their thing and tell them they're wrong, even though I've worked in a bunch of different spaces over the years. No one ever wants to experience that anyway. So I try to lead them to this realization. You get product, design, and engineering together, leadership in a room, and we have this structured conversation. Basically what I do is an exercise called assumptions mapping. I learned it from Josh Seiden and Jeff Gothelf, who wrote Lean UX and Sense & Respond; we used to work together at Neo here in San Francisco, and I've customized it over the years. All it is, and it's part of Google's design sprint now, is a two-by-two that kind of forces people to have this conversation about what's really risky and what isn't. It's also in the book. Basically, what I try to do is facilitate that conversation, and I do it in a way where people are writing, not just talking. When they're mapping things out on the level of, how important is this to us succeeding, and do we have evidence or no evidence that supports this thing we wrote down, really, really fascinating conversations occur where this is kind of a knowledge share, right? Because if you have a cross-functional team in the room, it's not just slanted towards product, or just towards engineering, or just towards design. If you just have design in the room, for example, you're going to come out with a kind of distorted view of what's really risky in your product, so having all three represented, I think, is very, very important.
And so I do a lot of that facilitation to get people to come to the realization of, we all agree as a group that this stuff, if proven wrong, will kill our product and potentially kill our business. So what are we going to do about it? It was a big aha moment for me to get people to that point where we can all agree in a room that this stuff, if proven wrong, will kill our initiative. But then the next big question was, now what? What do we do? I kept giving kind of the same advice over and over again of, well, here are the experiments available to you, and based on this kind of risk, you can do this and this and this. And it finally just dawned on Alex and myself: let's just write a book that helps lay out that mapping. It's not going to be perfect, but it helps people connect the dots. So, hey, if we have a bunch of desirability risk, here are some things available to me to go test. If I have viability risk, that's his realm, you know, the kind of "should we do it" stuff around cost and revenue; here are some things available. And if we have a bunch of feasibility risk, can we actually deliver this, here are some things available. So really, this book is kind of like a conversation, pushing that conversation forward a little bit, where we say, hey, this is advice we've been giving teams for years and years, but here's a guide, or a library, to get you started. Because if you only know how to do interviews and surveys and landing pages, it's really going to restrict what you can learn and the kind of evidence you're going to generate, because mostly those things are gonna tell you nothing about feasibility, and maybe nothing about viability, depending on how you use them. So that's what's been our focus for at least the last year: can we put this stuff together in a package where people can look at it and go, okay, we all agreed on this kind of risk, here are some things we can go try, which one should we run?


Holly Hester-Reilly:

Yeah, that's awesome. There are so many things in there, but the first one that I want to spell out a little further is: tell us more about why it's useful to get all of these cross-functional stakeholders to agree that if this assumption is wrong, we'll have a big problem, before you go and figure out whether the assumption is wrong.


David Bland:

Well, I think the premise behind this is basically that our risk moves around, and depending on where we are, the risk can change. It doesn't always play out this way, but usually if I'm going into something completely new, a lot of the risk is around desirability, because you probably haven't talked to any customers or performed much research at all. But then really quickly it'll move over to viability and feasibility: okay, maybe they have this problem, but do we have a business model that supports this? Does it make sense for the strategic direction of the company? And then, can we do this? And "can we" doesn't mean just engineering; you might have to bring in legal or compliance or governance or somebody that understands regulatory constraints as well. I think maybe in design thinking we fall into this trap of, well, feasibility only means engineering and tech, and it doesn't. I meet a ton of healthcare startups that have regulatory risk, you know, they go up against an FDA process or HIPAA or something else. So while it may technically, feasibly work, they can't actually succeed because of regulatory constraints. So I'm a big believer in having all three in the room. And I think I'm somewhat biased there too, in that in previous engagements and everything, we always erred towards having those folks together really early on. Because when you're running tests, you probably need some design in the room to make it look legitimate. And you may have to build something to test it, but it's not necessarily this elegant, scalable, infinitely secure thing.
It could be something very small to test. So having that leadership in the room together to understand and align, I think, is really important. Because, like I said, if you skew towards just one and start going down the path, let's say as a product manager I think these are the risks and we're going to go validate those, then without design and engineering in the room there may be a bigger risk that you're not even aware of. It becomes this local optimization, where you're optimizing to solve for this thing, but there might be another really big risky thing that you're not considering that you need to work on first. So I think over the years I've come to this realization: whatever you're working on, you kind of need to align all three, otherwise you might go down this rabbit hole of experiments where you're validating something that, in the big picture, isn't necessarily what you have to work on in, you know, the next three to six months.


Holly Hester-Reilly:

Yeah, that's awesome. I think it's a really good example of using all of the value of those leaders, and not just their execution value. You really want their brains, you really want their input, their thoughts about what's risky and what's not. And then, yes, that can also help you design and implement tests. So then tell us more about, I think in particular I'm curious to hear about, viability testing, because I think that's something that probably gets the least attention in the product management community. I certainly have had conversations with other product managers where we talked about how we would do some of these assessments to figure out if something would make sense for the business and be viable and whatnot, and a lot of times I find that people are almost hesitant to put their name to it. So I'm curious to hear what experiences you've had around that and how you help people through it.


David Bland:

Yeah, there's a lot of anxiety around testing viability. For some reason, I feel like there might be more anxiety there than around any other part. I think it's because of pricing and testing price, and the idea that we're confusing people, and then maybe the realization that if there's nothing there and they won't pay enough, we can't go forward. I think people don't always want to face that early on either. But there are certainly options out there. I just did a webinar recently where we talked a lot about crowdfunding, for example. I think early on crowdfunding was this thing just for startups, and what I'm noticing is that more corporations are actually crowdfunding things now. So rather than have this conversation over and over again internally about whether people will pay for this at all, at this price point, I'm seeing teams go outside and raise funds externally. They still have the backing of the corporate brand, but they do, you know, an Indiegogo campaign that says, okay, here's what we're doing. They probably lead with their little labs brand, but they still say powered by whatever the corporation is. They set up a crowdfunding campaign, they do the videos, they answer the questions that people ask, and they run it like a campaign, and then they start to see evidence of, okay, is this viable at all? Will people actually even support this? If it works, great, they keep rolling with it, or they might bring it more front and center with the brand. If it doesn't, they kind of quietly shut it down and say, hey, this was us trying to gauge demand, and clearly we need to do some more work, and it might come back in a different form later on.


David Bland:

So it's really interesting for me to see crowdfunding play a bigger role now. But you could also do pre-sales or mock sales, and there are a lot of things you could do on landing pages where you're price testing different product tiers or service tiers: hey, we have this tier, this tier, and this tier. When they click on one, it doesn't mean you bill them right away; it could be, we're still working on this, we're not ready, do you want to give us your email for when we roll out this tier? There are some case studies in the book of companies that did that too. But I think where it gets a little awkward, or maybe uncomfortable, for product managers is that it becomes really obvious that you're influencing the business there. And I think product managers should embrace that, because I feel that is part of the role: product and business are very integrated. It's almost like a system. You can't have this amazing product and a terrible business model, because you'll fail, and you also can't have an amazing business model and a terrible product, because you're going to fail. So I think as product managers grow and embrace experimentation and mature, they're going to find that they have this responsibility to do more of this testing to help educate the business. I've seen it happen in the past where people build an amazing product and kind of throw it over the wall and say, okay, figure out how to charge for this. That never ends well, by the way, because you're asking somebody with almost no context on all the experiments you ran and how you've grown and developed this product to just come up with a model for it and sell it. So I do think we need to be testing for viability earlier on.
Certainly the R&D groups and innovation labs I deal with at big companies are being asked to come up with business models now, which terrifies them, by the way, because it was pretty much a business-model-free zone in the past, and so they need to level up. There are certainly some tools and everything out there that they can use now, but yeah, I think there's just anxiety around viability. I am a big believer in testing that earlier on if possible, and then letting it inform the conversation. When you go bigger later and you want to scale something, you want to know you've tested the model out.


Holly Hester-Reilly:

Yeah. A lot of things you said in there really hit home for me. I will admit that early in my product management career, I also fell into that trap of being like, well, I'm just gonna figure out what's a really great product and someone else will figure out how we make money with it. And then I watched there be some amount of mismatch between the business thinkers' plans on that and my thoughts on that, and those mismatches just came from different perspectives on the whole universe, like different perspectives on which people we were talking to, what we were hearing, and what insights we saw. I think that really comes back to why it's so important to have all these cross-functional stakeholders, because we all need each other to come up with the best, most clear understanding of the truth.


David Bland:

Yeah, I mean, just imagine [inaudible 00:22:09]. It's really interesting, you know, even if you can't agree on the customer segment, for example. I've seen even marketing and product disagree there at big companies, where the CMO thinks their customer is X and the CPO or VP of product feels like the customer is Y. Just imagine what kind of dysfunction even that would introduce into the overall success of the product. Because if your go-to-market strategy has a completely different value prop and messaging and everything, no one's going to convert when they come over and see that thing you built for a very different segment. Things can get decoupled really quickly, and it can be a problem. So I am a believer in, early on, just trying to be more well-rounded in our testing, so we can help inform conversations when we do find something, rather than later on trying to bring people up to speed and expecting them to create something amazing without context, which I think is just super hard for anybody to do.


Holly Hester-Reilly:

Yeah, totally. I would say I've lived through many versions of what you just described as well. Thankfully, I think we managed to sort of rectify it. But, you know, earlier marketing campaigns, the ones where we were trying to test things, where it's like, that's not the same value proposition that product was building towards, it doesn't go so well. You don't want that.


David Bland:

I totally agree. And again, it goes back to locally optimizing, right? So if you're optimizing for, let's say, click-through rates on your ads, and you get this amazing click-through rate, like 3% or something crazy like that, and then they come over to a page and the page says something different, because product and design have been working on that page, you're going to see your conversion rate drop to almost zero. It's going to be this weird dynamic where we're optimizing for click-through over here and optimizing for conversions over here, but it's the combination of the two that matters, because people are just going to bail when they come over and see something that feels very jolting, like, what is this, this is not what I thought this was going to be. And same thing with the business model: when they click in to pay for it and the price is wildly off from their expectations, they're not going to purchase either. So I do think there's an element of aligning around all of that, and part of that is company culture, of course, but you do have to be really, really careful about locally optimizing, because then the overall thing fails. It doesn't fail because no one did their job well; it fails because we didn't do it all well together.


Holly Hester-Reilly:

Yeah. Although I suppose you could argue that somebody who was leading at the top didn't do their job well if all the functions weren't aligned.


David Bland:

True, true.


Holly Hester-Reilly:

Yeah, I mean, I totally agree, and I love how you talked about the local optimization. I think that happens a lot in organizations that aren't as practiced at cross-functional work, and it feels like as a product manager today, we spend a lot of our time working on the communication and collaboration skills that are needed for good cross-functional work.


David Bland:

Yeah, and it's a tricky balance, because obviously you want to drive this thing to success, but you want to bring other people along with you for the journey. So it's a balance of facilitation, but also having an opinion on which direction to go. Product managers have a really tough job, and one of the reasons I hope this book helps, by the way, is that I go into these companies and have a bunch of product managers in my workshops and my classes, and I'll start asking them, while they're getting their coffee and everything, how they became product managers. Their answers are really interesting, because it's usually, well, I did something really creative over here, and it was a success, and then they said, okay, now you're going to be a product manager. And it's like, here's a copy of Marty's book and maybe some blogs to read, congrats, you're a product manager. They come from very different backgrounds, like analysts, testers, scientists, marketing people, research people; they come from all walks of life. It's really amazing to me that they're really good at creative problem solving; that's obviously a part of it. But I think we're just not doing a great job of preparing folks, because it's a relatively new kind of role in many organizations that are trying to be more product focused. So I'm hoping this book and other books out there really help give people more breadth, as far as, okay, at least I have a collection of things that'll help me here. Because that's a really tough spot to be in, where you did something once that was successful, because you're creative and you're really good at problem solving and pattern matching, and then all of a sudden, well, now you're a product manager. It's like, now what?


Holly Hester-Reilly:

Now do it again.


David Bland:

It's like that famous one-hit-wonder situation: a music artist has one hit song, and then they put him or her in the studio again and say, okay, just recreate that magic. And it's like... well, it was a bunch of things that helped you create that magic, and I don't know if I can recreate it. So it's really a tough situation for product managers to be in.


Holly Hester-Reilly:

Yeah, for sure. I also want to hear more about what kinds of experiments you recommend for feasibility, because I know that's also not all about the interviews and the surveys and the landing page tests. So tell us a little bit about that side.


David Bland:

Yeah. So with regard to feasibility, it's really interesting, because I do broaden the definition, and Alex and I really had to align on this early in the book writing process. It's not just technical; it's regulatory, it's can you find the right partners and suppliers. So it's expanding the scope of what feasible means to: can you support the infrastructure to make this work? What I'm finding is that experiments such as concierge and Wizard of Oz, where we manually deliver the value to the end customer, can help us figure out feasibility. And I know they don't scale, and I know you're doing it manually. But what I set forth in the book is: look, when you're performing some of these experiments and delivering something manually to a customer, take some time to document what happened. Create your little kanban board of what all the steps were and how long it took you to deliver that value. Call out where you got hung up: if we got to one step and then realized there was a long wait time, where we got blocked trying to do something, how did that impact things? Then sit together as a team and start talking about what we learned from manually delivering that value to the customer, because if you're going to automate something, you need to inform that automation and that design with real evidence. So rather than just whiteboarding in a room and saying, okay, wouldn't it be awesome if we just built this thing this way, and then going and building it, use some of these more creative experiments to inform that conversation. So I've been deliberately steering people to those types of experiments. And the pushback I usually get is, "We don't have time." That is a very dangerous conversation, by the way, because it's almost like, well, we don't have time to manually deliver this stuff, so we're just going to jump to building all the infrastructure and making sure it scales to a million customers. So there is an element of walking people back and going, okay, well, if we did this manually, even if we timebox it for a couple of weeks, we will learn so much that will help inform feasibility.
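As a minimal sketch of the kind of documentation David describes here, you could log each step of one manual delivery and tally hands-on time versus blocked time. The step names and durations below are entirely hypothetical, not from the book:

```python
from datetime import timedelta

# Hypothetical log of one concierge-style manual delivery.
# Each entry: (step name, hands-on time, blocked/waiting time).
steps = [
    ("collect customer request", timedelta(minutes=15), timedelta(0)),
    ("prepare the deliverable",  timedelta(hours=2),    timedelta(0)),
    ("pull partner data",        timedelta(minutes=10), timedelta(days=2)),
    ("deliver and follow up",    timedelta(minutes=30), timedelta(0)),
]

def summarize(steps):
    """Total hands-on time, total blocked time, and the worst hang-up."""
    total_work = sum((work for _, work, _ in steps), timedelta())
    total_wait = sum((wait for _, _, wait in steps), timedelta())
    bottleneck = max(steps, key=lambda s: s[2])[0]
    return total_work, total_wait, bottleneck

work, wait, bottleneck = summarize(steps)
print(f"hands-on: {work}, blocked: {wait}, biggest hang-up: {bottleneck}")
```

Even a crude tally like this makes the "call out where you got hung up" conversation concrete: two days blocked on a partner dwarfs the three hours of actual work, which is exactly the kind of evidence that should inform any automation design.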


David Bland:

And then there are some other experiments in the library as well. Something like a letter of intent, where you can actually go backstage and start validating your back-end stuff. Okay, we have a partner we need to bring in to deliver this product, because we can't do this part ourselves. Can we do interviews there, but also have them craft an LOI, which is not legally binding, but gets them to the point of putting in writing what they can deliver on? Can we create some kind of data sheets that go into the specifications of what we need? So we have a conversation with people on the back end: hey, we need you to play along, and this is what we would need from you. And then there's some traditional agile stuff, like spikes from Extreme Programming, where you can just timebox: okay, we're going to go see whether we can feasibly work with this software. So there's some software stuff in there too. But I do think we need to start looking at the feasibility side with almost the same curiosity that we bring to the desirability side. Because it's not just write the specs and then go build the thing. It's more about, what can we experiment our way through to make sure this will actually work? Because in the end, it has to deliver value to the customer, and they're all related; it'll impact viability. If it's really expensive, you have to charge more, which might shrink your market. So these things aren't in isolation. But I do think we're in danger of just coming up with a big idea, feasibility-wise, then breaking it up and just building. We need to keep taking a more experiment-driven approach there: can we work through this together and use what we learn to inform the design of the thing we're trying to deliver?


Holly Hester-Reilly:

Yeah, absolutely. I love that you started with things other than spikes, because that was the first thing that came to my mind: well, you have the engineers do a spike. But there's so much more to it, and I love that. And maybe this is different across startups versus enterprises, I don't know, but a lot of the product managers I work with often discount the non-technical sides of feasibility. They're just so focused on the engineering possibilities, but if the legal realities won't allow it, or the partnerships that need to be built are going to take nine months, they aren't prepared to handle that. So giving them tools to help with that sounds really awesome.


David Bland:

Yeah, and I see this play out in different ways. I judged a startup competition in Utah, an annual competition, and one of the teams that came through was building a device that needed FDA approval. Technically they could build that device, but the money required to go through the whole approval process, even as a startup with limited resources, they just didn't have. So they did some research and realized they could do a version of the product that doesn't need FDA approval. I thought that was really fascinating, because technically, yeah, they could build the full version too. But then they realized, well, there's an entire market for this thing without this feature, and we can bootstrap, and once we have enough revenue, we can go through the FDA approval process and build the thing with that part too. So there are all these really interesting options available to product now, where you can look at a situation and say, technically, can we build this thing? Well, maybe, but what are the things around that that could influence it? Especially if you're in a heavily regulated environment, you need to keep those things in mind and not discount the fact that there might be options that don't require those approvals right away. It's really fascinating to see how people can creatively problem-solve their way through that, and not just focus on the "can we build it" question over and over again.


Holly Hester-Reilly:

Yeah, that's really awesome. Cool. So since we went through those two pieces of it, we should come back and talk about desirability testing. There's a part of me that assumes our listeners all know all about that, but that might not be true. So tell me about your favorite ways to test desirability.


David Bland:

Desirability is interesting. I think the worst form of it is, "Do you want that?" There are all kinds of things wrong with that question, by the way: it's closed-ended, it's leading, all those things. So even though I keep coming back to saying you need more than interviews, I'm finding time and time again that we're all really bad at interviews. Even myself, I fall into these bad habits; it's just so hard not to bias people. It's little things like putting the answer to the question in the question, or the way you ask it. I had this really humbling experience where I was experimenting in retail, literally in the store doing intercept interviews with customers and teaching this team how to do interviews. Then I did an interview and, of course, I biased the person. Even me. And they were like, wait, you biased them, and I said, no, don't do what I just did. So all of us need practice doing interviews. It's so hard. But beyond that, there are all kinds of really interesting options available to test desirability now, especially with how fast we can prototype, even paper prototypes or clickable prototypes. There are all these things we can use to shape the conversation that still touch desirability. It's not just us speaking to people; we have things around us to support the conversation or frame it.


David Bland:

I've also really liked some of the in-app things you can do now, just lightly touching people. I'm a big fan of Sean Ellis, so I really love the Sean Ellis test, where the question is basically, how disappointed would you be if this went away? And the answers are really simple, almost like the two extremes: I'd be very disappointed, or I don't care. From a desirability-testing standpoint, this can be super interesting. So imagine people in your product who have used it a couple of times. Why are they coming back? You can't talk to all of them, but you're really curious: do they even care about this? And sometimes you'll get real learning from that, something like, "I would be very, very disappointed if this went away, and here's why." And you can start understanding, oh wait, this is our unique value prop; this is why people keep coming back. Then at the other extreme, if you get, "Well, I really don't care if this goes away, I just use it because of x, y, z," you can start digging in deeper: okay, what could we do to make this a better experience, something more enjoyable, so that you'd be satisfied and even passionate about it? So there are all these cool little techniques available to us today where we can lightly touch people and get a glimpse into desirability, versus the traditional "let's always go back to the long-form interview." It's really interesting to see the responses.
And then you can always hack these in a way where, instead of having radio buttons about what else people did, or who the other competitors are, you can just have an open-ended text field and have them explain what they did. It takes longer, and it's kind of a pain to sort through, but you can get all this amazing desirability input just from having people explain.
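The scoring behind the Sean Ellis question David describes is simple enough to sketch in a few lines. The response labels and sample data below are hypothetical; the 40% "very disappointed" benchmark often cited alongside this test is Sean Ellis's commonly quoted threshold, not something from this conversation:

```python
from collections import Counter

# Hypothetical survey results for the Sean Ellis question:
# "How disappointed would you be if this product went away?"
responses = [
    "very disappointed", "somewhat disappointed", "not disappointed",
    "very disappointed", "very disappointed", "somewhat disappointed",
    "very disappointed", "not disappointed", "very disappointed",
    "somewhat disappointed",
]

def very_disappointed_share(responses):
    """Fraction of respondents who'd be 'very disappointed' to lose the product."""
    counts = Counter(responses)
    return counts["very disappointed"] / len(responses)

share = very_disappointed_share(responses)
print(f"{share:.0%} would be very disappointed")  # 5 of 10 -> 50%
```

As David notes, the number matters less than the open-ended "here's why" follow-ups, which is where the unique-value-prop learning actually comes from.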


David Bland:

I did that recently with a travel company, and we realized we were competing with Groupon and completely different jobs. It's kind of like jobs-to-be-done theory: people are hiring these other companies to do this job instead of this product. And those competitors weren't really even on our radar until we asked customers in ways where they could offer up that information. So yeah, with desirability, I think it just comes back to not biasing it, not asking closed-ended, leading questions in your surveys or interviews or whatever form you're using, taking advantage of the technology available to us today, and having supporting artifacts that help pull learning out of people. It's really pretty fun to do.


Holly Hester-Reilly:

Yeah, there are so many things you can do today. I'm super curious about the Sean Ellis test. I know he calls it the product/market fit test, and I don't necessarily believe that's exactly what it's testing, but it is a really interesting one. One of the things you said is really fascinating: how often have you come across interesting responses from people whom the data shows are regularly using the product, but whose answer was, "I don't actually care if this gets taken away tomorrow"?


David Bland:

Yeah, I have to say, it's really interesting. It almost becomes a way to segment customers. In the more advanced product companies I go into, it's no longer just "we have a target customer." It's: for each class of customer, we have a segment where we've defined what it means for them to be active in our product. And all of them have a different cadence and usage pattern, and they're measuring different things. It's pretty fascinating. I just went to a talk from Adobe in San Francisco recently, and they do this with all their products too. So it's really interesting to see how companies are taking these slices of customer and saying, okay, for this type of customer, this behavioral segment that goes in and uses the product this way, here are the things they do and at what cadence, and if we're going to move the needle, this is where we should focus our experimentation. And for this other kind of customer that comes in and uses it that way, this is what we should focus on; this is what active means to them. You can pull from Pirate Metrics as a baseline: acquisition, activation, retention, referral, revenue. AARRR, that's where the pirate joke comes from. When you start doing things like the Sean Ellis test and other ways to poke at how satisfied people are, you can start understanding the segments and types of customer. It doesn't mean you necessarily want them to use it every day, or that they're ever going to use it every day in the way you envision, but you can create a more informed persona around the segment that comes in and uses it a specific way. So it's somewhat subjective. But I do think, with the data and technology available today, you can take a more informed approach: if I ask these questions and people respond a certain way, then I can start building out profiles of the types of folks who use it a certain way. And then maybe that's enough, or maybe it isn't and they're going to be super disappointed all the time. So it's really hard, but I do like how companies are working through this, using the desirability stuff and even throwing in NLP and different algorithms to start understanding bigger behavioral patterns, almost quantitative stuff behind the qualitative.


Holly Hester-Reilly:

Yeah, interesting. One of the things that made me think of is that less advanced companies often don't have an allowance for the idea that there are some people who shouldn't be trying to use their product more frequently, because the value they get isn't about frequency of use. Sometimes teams just try to increase that simple-to-think-about metric, like how frequently people use the product, when a test like "how upset would you be?" can help the whole team realize that maybe a user touches it once a week, and that's perfect, and they'd still be really upset if it was taken away. That's a different kind of user.


David Bland:

Yeah, I agree, and I think more advanced companies can target their experiments, too. It's one thing to toggle something on for everybody and toggle it off for everybody; almost everyone can do that now. But can you toggle it on for a specific kind of customer in a certain spot? Products are coming around to do that now, and you don't have to know how to code all the things; you can just integrate existing experimentation products to help with targeting. But I think more advanced companies don't just want to toggle something on and see it for everybody. They want to toggle it on for a specific instance, based on a behavior pattern for a customer segment, and get learning from there first. Because sometimes it's not something you want data from everybody on. It's just really interesting to me to see, because if you design an experiment with a team, and then they don't have the ability to show it to somebody at the right time, it's really frustrating for them. They feel like, well, how long do we have to wait to learn? We can't actually turn this on at the right point of the flow, for the right customer segment; we just don't have that infrastructure. So there is an investment here that I think more product-led companies are making, where they can really fine-tune. Amazon's an extreme example of that, but they're really trying to fine-tune how to put things out there in a very specific way, rather than saying, well, we're going to test something, but we're going to get a lot of noise from a bunch of customers who don't want this [inaudible 00:42:56], which also frustrates them, by the way. So it's really interesting to see that play out. And I do feel like the companies that do this well are getting further and further ahead of the companies that don't, and the gap is growing. It's not shrinking.
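The segment-targeted toggling David describes reduces, at its core, to a predicate evaluated per user. Here's a minimal sketch; the segment names, field names, and the activity rule are all invented for illustration, and real experimentation products layer on rollout percentages, exposure logging, and so on:

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    segment: str           # behavioral segment, e.g. "weekly_planner"
    sessions_last_30d: int

# Hypothetical targeting rule: enable the experimental flow only for
# sufficiently active users in the "weekly_planner" segment, instead
# of toggling it on for everybody.
def in_experiment(user: User) -> bool:
    return user.segment == "weekly_planner" and user.sessions_last_30d >= 4

users = [
    User("a1", "weekly_planner", 6),
    User("b2", "daily_power_user", 28),
    User("c3", "weekly_planner", 1),
]

cohort = [u.user_id for u in users if in_experiment(u)]
print(cohort)  # only a1 qualifies
```

The point of the design is exactly what David raises: you learn from the segment the experiment is about, rather than collecting noise (and frustration) from everyone else.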


Holly Hester-Reilly:

Yeah, absolutely. And at the end of the day, that comes back to good experiment design and investing in the infrastructure for it. If you've got some bright young product managers or designers coming up with a great design, and they constantly hit a wall of "well, you can't implement that here," then that's no good either. But if you have both sides, the infrastructure and then the skill and knowledge in how to use it, the team can really take off, well beyond all the people who are still just looking at averages and vanity metrics.


David Bland:

Yeah, for sure. And I do think that, as a product manager, you don't have to be a data analyst. You don't have to have a deep analyst background, but you need to be able to look at data, and you need to be able to pair with other people in the company who can help answer the questions you have. Especially for little things, like prioritizing your features: you think a feature would be amazing, but you don't know the cut of the data, the pool of customers it would apply to, and how many of them there really are. Having an analyst, or somebody, pair with you will help you make a more informed decision as a product manager, because you could say, "I think this is an amazing idea," and then do the analysis and realize the segment only had like ten customers in it.


David Bland:

Now, for B2B that's a different story, but for B2C it's like, oh, geez, I thought there were so many more of those people. So I do think it's really interesting in product to have a vision and all these skills, but also to be able to pair with people who can give you the data back. It's a really powerful combination.


Holly Hester-Reilly:

Yeah, I think so too. Are there any other key takeaways from the book that you want to point people towards?


David Bland:

Yes. So the majority of the book is the experiment library; we have over 200 pages of that, so there's a lot there. It's pretty dense. However, there's some material around it that I was very passionate about including, because I didn't want it to just be a list of stuff. So in the book, we talk about things like team design. We talk about some of the things we discussed earlier in this podcast, around product, design, and engineering, but also the behaviors those teams need to exhibit and the environment they need to live within. Because you can take the most amazing cross-functional team ever, put them in an environment where they're micromanaged and on ten different products at once, and it's still going to fail. So I've given guidance in there, and also some things like the ceremonies that help support working this way in a repeatable fashion. It's amazing to run one experiment, but how do you keep doing that? Because chances are your risk is going to move around, and through the life cycle of the product you're going to need different kinds of experiments. So how do you make that repeatable? I have the ceremonies I recommend in there as well, and a little bit on leadership and org design.


David Bland:

I can't solve for everything in the book, of course. But the idea of how to lead through this style of work is really interesting, because if you're always leading with answers, and you're very expert-directive in your leadership style, it's going to slowly undermine what you're trying to do in building a culture of experimentation. There's a little bit in there about funding, too. One of the really interesting triggers I have seen is, if you couple this to internal seed funding, then you can say, we're going to fund this product for six weeks or twelve weeks, measure along the way, then come back, show what we've learned, and invest in it again. Especially for your horizon-three, new-product stuff, that can be a really interesting way to couple how you make investment decisions to how you experiment. So there's this material all around the library. The library is the majority of the book, don't get me wrong, but there's also extra content on team design, ceremonies, work design, leadership, and funding, to give you a wrapper of: okay, this is an amazing library, but I want to do this more than once and make it a repeatable part of our culture, so what questions are going to come up, and what guidance can I give? Overall, the biggest takeaway is that this is hard. It's really, really hard to do in a repeatable way over time, and I think it can be exhausting. So what are some ways you can make it fun? What are some ways you can tap into the creativity of your team, but also drive business outcomes? There's more than just a list of experiments in the book, is what I'm trying to say.


Holly Hester-Reilly:

Yeah, that's awesome. And I know that for a lot of the concepts we just talked about, there are additional resources out there as well. So if you get a sense of, okay, that one seems like a challenge for me at my organization, you can know to dive deeper into it, because it really is a whole system. I think that's the message I just heard: making this work isn't just one person with their lab notebook, right? You've got to build the whole system, and here are all the things you've got to think about. That's awesome. I'm super excited to see the book come out and be able to recommend it to people; I think it'll be really fun. I like watching as these things evolve. Some of what you talked about is a little like when a scientific field advances: at one point in time, no one's doing a particular type of analysis, and then all of a sudden everyone's doing it, and we start having a shared vocabulary for it, and we're able to spread it, talk about it, and figure out when it goes well and when it doesn't. This sounds really helpful for that.


David Bland:

Yeah, I'm happy to be invited to this, and again, I've witnessed this just in the product management community: the words we use have changed over even the last three to five years. I kind of grew up in the lean startup community, where it was Lean Startup all the things, and it was very niche. Then I go to product management conferences and see build, measure, learn, and experimenting our way through risk, and I'm like, this is amazing. Product gets this. So if I had to leave people with something to think about today, it's that it's almost the responsibility of product managers to push this thinking forward now. It's been embraced by the product management community; we've decided we have to experiment to learn and to have successful products, and we have to make it repeatable. So it's really on product managers, I think, to keep pushing the ideas forward.


Holly Hester-Reilly:

Yeah. Awesome and how can people find you if they want to follow you or learn more?


David Bland:

Yeah, I'm pretty active on Twitter at @davidjbland. I also have my own company, Precoil, at precoil.com, where you can find me. And as you found me on Medium, I try to write and have a lot of stuff out there just to help people, so I'm giving a lot of the content away most of the time. So just find me, and if you have questions, hit me up. I'm pretty responsive to email, so far anyway. Yeah, just hit me up if anyone gets stuck.


Holly Hester-Reilly:

Awesome. Alright. Well, thanks so much, David. It's been a pleasure to talk with you about all of this and I hope our listeners are able to go out and do even more and better experimentation.


David Bland:

Thanks for having me.


Holly Hester-Reilly:

The Product Science Podcast is brought to you by H2R Product Science. We teach startup founders and product leaders how to use the product science method to discover the strongest product opportunities and lay the foundations for high-growth products, teams, and businesses. Learn more at h2rproductscience.com. Enjoying this episode? Don't forget to subscribe so you don't miss next week's episode. I also encourage you to visit productsciencepodcast.com to sign up for more information and resources from me and our guests. If you love the show, a rating and review would be greatly appreciated. Thank you!
