Igniting growth in product teams

(upbeat music) - Thank you, Meagan. This is the best audience ever. I'm also glad that the audience is not huge, because after this morning's sessions hearing about storytelling, I wish I had heard that a week ago, because today you're not gonna get cups or anything like that from me. So yes, I am Peter Ikladious, I am the Director of Growth at IBM. The little secret that Meagan just shared is that after 12 years in New York, nine of which were at IBM, I'm back in Australia for family reasons, and so this is my last official event as an IBMer. So if any of this interests you or tickles your fancy, definitely hit me up afterwards. But coming back to IBM. Just to set some context about where growth fits in IBM and how it gets applied. IBM is a pretty decent-sized company. It has just shy of 400,000 employees, it's in 170 countries, and I once tried to do an audit of all the products that IBM did, and we just couldn't get to the numbers. We're in the tens of thousands of SKUs. So that gives you some context. The products IBM is typically well known for are its mainframe and its server hardware. But then it also has services, a lot of consulting and outsourcing business, which is very large here in Australia. And the other part, where we've focused a lot of energy on growth, is the software space. So IBM Cloud, which is a competitor to Azure and AWS, but also security, quantum, IoT, blockchain, and a whole suite of software products around that. So that's where IBM is. I'm going to spend the balance of this time really just sharing some stories and anecdotes about what we learned doing growth at IBM, what worked and what horribly failed. So hopefully you can take some pointers away from that. And before I get into it, I just wanted to put this up. Shaun Clowes started growth at Atlassian in 2014; he's now over at Metromile.
And he has a great way of distinguishing growth product managers from non-growth, or core, product managers. Core product managers, he says, focus on the big problems: okay, what's the market fit, what's the problem you're trying to solve. Growth product managers say, well, now that you know who your audience is, let's get as many people in that audience to be successful in those products. So he has a really cool distinction. I'm gonna touch on that and then go deeper into where growth has touched in IBM's case: not just product but marketing, sales, client success, support, and so on. One little tidbit, I'm gonna put a little plug here: if you're interested in the latest going on in product, there's a great podcast from a company called Roadmunk called Product to Product. It's got some great people, great topics, they really go into detail in there. So, as I mentioned, this is about the story of growth at IBM. Let me start with the mission. Our mission of growth in IBM is to build an engagement model, IBM engaging with their end users, so that the end users trust and rely on every interaction they have with IBM, irrespective of whether it's a digital interaction or a human one. So that's really critical: the growth mission isn't about growth. Nowhere are we talking about numbers increasing or anything like that; it's really entirely user centric. And what you see there is three images. That's the growth team one year in at IBM, so it was about eight or nine of us to begin with. That was after a year. After two years is the second one. And the third one, that's just the New York based team, it's actually now a global team, earlier this year. So we've definitely seen a lot of growth, and a lot of evolution in the way we've done growth, and that's gonna be one of the things we talk about. So to begin, I'm going to walk through that journey.
Or before I start telling you of that journey, I just wanna first touch on the question of: what is growth? I think we hear it a lot. It actually has a very specific definition, at least the way that I see it and the way that we use it in IBM. Because fundamentally what we say is, big-G Growth is not little-g growth. Growing is everyone's responsibility in an organisation; whether you're in product or in sales, you gotta grow, what's your year-on-year growth, that's a core mission. That's not what I mean by growth in this presentation. Growth is something very specific. It's data centric, it's user focused, it's hypothesis driven, and it's agile. So data centric: it's all about pure data. It really is. The data informs what you want to do and then what you've done. Then it's hypothesis driven. It's not just about, okay, let's see what the data tells us; a lot of it is, we'll now make a guess. What do you think it should be, what do you think these changes will do. So there is a lot of hypothesis creation and ideation that happens in our growth process. Agile goes without saying. And then the last part is it's user centric. At the end of the day we focus on the individual user and then we scale that up. Okay, so if 33 users did this, and 25 of those users did that, how does the math work? Those eight users, why did they drop off? What happened to those eight users? And those 25 users, what made them so successful? So it's very important that we're very user centric. So the first thing on growth is we've got those four defining attributes of big-G Growth. The second thing, which we hear a reasonable amount: it's not marketing. When I started this mission in IBM, I had marketers saying, that's my job, why are you doing it? It's not. Marketing has a certain function.
Marketing will often do elements of growth, but growth itself is much broader: it impacts product, it impacts sales, it obviously impacts marketing, and it also impacts post-sales, customer success and support. So there's a very big differentiation. So I've kind of gone through a couple of things of what growth isn't. The question is, okay, then what the hell is it? The first thing, for a legacy organisation, a company that's been through a couple of iterations and evolutions, is it's digital transformation. We've heard that something like 80% of all digital transformation projects fail. It's because they're all ancient. Growth is the execution arm if you want to do digital transformation. It's very specific, it's very clear; as I said, it's got those four key attributes that allow digital transformation to execute. And the way that we started growth at IBM was through the digital transformation lens. So then how does this apply to digital natives? A lot of companies, born on the web, SaaS natives, they're already digital, so what is it for them? Well, you're trying to solve for this problem. The biggest issue I still see in small to large companies is the hot potato problem. As a user, the potato, you're handed from marketing content to a nurture stream, thrown to the product, and from there on to support. And you feel like everyone's just throwing you around. You really wanna solve that problem. Growth solves that. So for digital natives, growth actually translates into being modern product, go-to-market, and user experience, really bridging all of that together. So that's the definition of what growth is. And I'm gonna go through now a few chapters of our wonderful story and see how we go. As I go, there are a few takeaways that I'm going to share. So the first one is: to engage digitally, just do growth. Okay, so if you wanna know how to have a really great digital experience, do growth. In the B2C world, you can say I'm pure digital.
In the B2B world, as we'll go through in some of these stories, you actually need to have a digital or online experience as well as a human experience, but that all has to be meshed together. So you still need to do growth to mesh that together. So, very important: to engage digitally, do growth. So, let's start. I'm gonna take us back to chapter one. 2015, happy, naive times. I was tasked to build out and fix a portfolio of products in IBM. And the general manager at the time said, here's my business, stunted growth, how can you transform this portfolio? Great, enter growth. Started with two people and an amazing strategy. I know it's an amazing strategy 'cause I wrote it myself. It really had a three-pronged approach. It said: think horizontally, think vertically, and change the culture. Think horizontally: think about the customer journey, define it, be deliberate about the customer journey across all the touch points. Think vertically, which means find the common touch points that are failing, fix those, and have corrective actions in place. And then change the culture: you've gotta change the way product marketing works. You've gotta change the way digital marketing works. You've gotta change the way our product managers are working, engineering is working, design is working, so it's more cohesive. It was quite ahead of its time for IBM. And what we wanted to do was focus the majority of our effort on the user journey. We really wanted to focus on the horizontal, 'cause if we could nail that, everything else would just fall into place. So we wanted to target that. And there was a hypothesis we were testing, saying, actually, with the consumerization of the enterprise, our B2B buyers want to buy self-serve. That was our theory. So we started design thinking workshops, and we locked people in rooms. Notice, no windows. We invested in 3M and got lots of Post-it notes out there.
In this room we would have design, engineering, product management, support, and client success. And we would just hash it out for about three days. We would do competitive analysis, we'd look at their journeys, we would define our own journeys, and we'd produce things that look like this. Beautiful. They're almost like information architectures cum journeys. In this case, this was for one of our products called SPSS, and we would guide them through it and we would interject: okay, when does the email come out, when does the content come out. We really thought this was the way to do it. And we came up with 30, 60, 90 day plans. We had a kanban board, agile plans, risk mitigations, we had a stakeholder engagement plan, we had this nailed. (laughing) Side bar: I have twin girls who at the time were about four years old, and the movie Frozen came out. I've seen it 27 times, so I can recite it almost word for word. In Frozen, at the start, you hear Anna and Elsa singing, ♪ Do you want to build a snowman ♪ and it's beautiful, and it's wonderful, and it's happy, and Elsa's making snow inside the house, and it's really, really cool. And it's at that point that my personal and professional lives merged, or coalesced. Because what happens next in Frozen is Anna and Elsa's parents die, as did our initiative. (laughing) We desperately tried to bring things together, and people were working, so it wasn't that people weren't doing work, actions were happening, but it didn't come together. We got some successes. For example, we had so many legacy systems that when you registered for a product, it went through so many systems that an end user would receive 14 emails. The 14th email told them to ignore the first 13. (laughing) We had to go and reverse engineer the IP address of the machine just to find out where the emails were coming from. That kind of tells you some of the hygiene factors we had to work on. But needless to say, we didn't get the results.
We really struggled. We didn't get the results. And everyone was very disillusioned by that: what happened, why did this go wrong? And so we sat back, in a fit of rage, frustration, and angst, and tried to understand what failed. There were two things that fundamentally failed. Number one, prioritisation. Number two, data. From a prioritisation perspective, we had all those teams together in the room, and they were all, quote, aligned. But they were aligned within their own strategic priorities. So marketing had their own priorities, saying, we're gonna unify all the websites around this, and as long as your thing fits in there we'll tweak and adjust it. So it was half-assed. The same with the sales models: yeah, that's great, but we've got certain tiering models and certain structures. So really the alignment was a problem, the prioritisation of the alignment. The second issue, around data, is I could tell you the number of visits and sessions to a webpage, I could even tell you the click-through rate, but I couldn't tell you what users were doing. We really had poor data. You had the classic setup: imagine you had Google Analytics on the website, then something different in the product, if you had anything in the product at all, and you really struggled. So the second takeaway, before I talk about what we did, is really good journeys need aligned group priorities and good data too. Now, show of hands, which of these two is easier to fix? If you think it's aligned group priorities, put your hand up. If you think it's good data, put your hand up. All right, you know where we're going: we picked good data. (laughing) So chapter two is we doubled down on data. We said, you know what, we're not gonna fix organisational molasses; as I said, it's a 400,000 person company, what are we gonna do? So we wanted to double down on data. As I mentioned, we have a scale issue.
So every time we come up with a solution in IBM, it has to work across hundreds of product teams, which implies thousands of users. So we needed a solution that, even though we didn't know exactly what the end game was, wouldn't require us to go back to all of those engineering teams, all the product teams, marketing teams, and say, could you just redo something. So what we did is we worked with a company called Segment, and we came up with a standard and said, this is going to be the user tracking solution. For us, that was one of the core things to get the data together. The second thing that we wanted to, oh yeah, not the second thing, just the only thing. Then we said, let's work with three product teams in IBM. One was Cloud, one was the now no longer existing analytics group, and the third one was the Watson Data Platform team. And we picked one product, a database product which at the time was called dashDB, and we said: you've got to instrument, you're gonna use Segment, you're gonna plug it in, and you're gonna tell us what users are doing. Just get the data in and tell us, like, when they click, when they query, when they do this, when they do that. We didn't know what to do. We just said to the engineers, go crazy. And they did, and that was really cool. But then there are obviously a lot of checks, and with the release cycle in engineering, it took three months. What's literally two or three lines of code to put in there still took three months to get live. But we did that, and it was really, really exciting when we went live. Because then we produced charts like this. We did this dashboard, and if you take a look at it, if you squint really hard you can probably see some of it, it's basics. It's number of logins, number of active users, maybe some basic feature usage, by time of day or day of week. Really simple stuff.
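The "two or three lines of code" he mentions are essentially calls to a standardised tracking API with an event name and properties. Here is a minimal sketch of that idea; the event names, properties, and the in-memory sink are hypothetical, standing in for what the real implementation would do via Segment's SDK (e.g. its `analytics.track(user_id, event, properties)` call):

```python
from datetime import datetime, timezone

# Hypothetical in-memory event sink, standing in for a call to
# Segment's track API. Every event follows one standard schema.
EVENTS = []

def track(user_id, event, properties=None):
    """Record one user action against the shared event schema."""
    EVENTS.append({
        "userId": user_id,
        "event": event,
        "properties": properties or {},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Instrumenting dashDB-style actions like those in the talk
# (names invented for illustration):
track("user-42", "Database Created", {"plan": "trial"})
track("user-42", "Query Run", {"rows_returned": 118})

# The basic dashboard metrics fall straight out of the raw events.
active_users = len({e["userId"] for e in EVENTS})
print(active_users)  # 1 distinct active user so far
```

The point of locking down the schema is exactly this: once every product emits the same shape of event, one dashboard, and later one automation engine, can be reused across all of them.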
And then all of a sudden the eyebrows started to raise, like, oh, this suddenly got a little bit interesting. About that time there was a new product IBM was working on, which today is called Watson Studio. It was still, what is it, a twinkle in the eye of the product managers. They were still working out engineering. When they saw this, they said, we want to do this from the beginning. So before they even got to alpha they had instrumented, and they said, I want to get this level of data, I want to get this level of sophistication, I wanna know what our users are doing. One of the real takeaways from there, and kind of how you know you're onto a good thing: from their starting point, which was roughly around March or April, through to their big signature launch in September of that year, they redesigned the product three times because of the data they were getting. Never before in IBM had anyone done anything like that; never before had anyone relied on actual user data to optimise their product. We had some amazing design teams who were focusing more on the qualitative type of research, but there was no real data to say what people were actually doing. So this was huge. It was real justification for why we were going through all of this. So for us, the third takeaway is: focusing on data makes you interesting, but not yet valuable. Okay. We had the eyebrows, but that was about it. So we fast forward a little bit and come to chapter three, engaging our users with real data. About that time, and we heard from the Intercom speaker earlier today, Intercom was rising. It was like the only chat environment at the time, pre-Drift, pre-chatbots. It just allowed you to chat: there was a little icon on the right with a smiley face, you clicked it and you could chat, and it was fantastic. And we thought, let's give it a shot, this is really interesting, let's experiment with engagement.
Like, how do we engage with users? That was the first thing. The second thing we wanted to experiment with was growth hacking. We'd hired a couple of growth hackers and we were still trying to work out what growth hacking means in an enterprise like IBM. It's great when you own everything and you're a one, or two, or three, or even five product company, but when you're a multi-hundred product company with such distinct silos of disciplines, how do you make this work? So we tried to pull all of that together. Growth hacking continued: trying little experiments, seeing what would work, trying to build that partnership. So we definitely had a theme of building partnerships out within the organisation. And on the engagement side we really wanted to do chat, so we said let's give it a shot. You know, we don't wanna be laggards, we don't wanna be perceived as old and dull IBM, we wanna be new and innovative IBM. And so we trained up maybe 40 or 50 people to do chat for two of our products. We got product managers, engineers, and some support staff, and we got them on there, and we were able to provide generally 24/7 worldwide coverage with those 40 people. We trained them with certain scripts, we kind of tested questions. And then we let that run for about three months. Was it gonna work, wasn't it gonna work? I don't know, let's find out. So after three months we reached out to one of our data scientists and said, can you just run the data and see what happens. And this is what happened, oops, as I knock my microphone. Users who chatted with IBM people used our products two to four times more. And finally the penny dropped. It was at this point we knew where the value was. Now, this is still correlation, it's not causal. But it proved both points, whichever way: chatting to us drove more usage, or high-value users want to chat with us. Remember how I said we were saying everyone's gonna go self-serve? This just disproved that.
This just kind of threw that to the wayside. Actually, a B2B product can natively be quite sophisticated and quite challenging; you can't have pure digital. So this was huge. And if you take a look on the left, the transcripts. The chats were, how do I dot, dot, dot. Okay, so this one is, how do I export a notebook as HTML. And so there was a conversation, a dialogue, and some really material improvements. And when this happened, everyone's eyes popped. This is where product managers said, okay, I want this, and general managers said, okay, how much is this gonna cost, and how much will I get out of it? It was very quick. So, how do you scale? The takeaway, number four, is: interacting with users is the real value driver. In growth, as I mentioned earlier, our mission is user centric, and it's because of this. When you can optimise the way you interact with users and they feel the love, this is where you actually get value. So we did all that, and as I said, hundreds of products, and we already had 40 people trained up on just two products. If we multiplied that out, there is no way we could train up that many people. Number one, it would be prohibitively expensive; number two, the quality drops. Once you get beyond 100 people, well, you know what it's like to call a call centre that has 5,000 people. It's like, you know what, mixed bag, I'm gonna call up a second time, maybe I'll get a different person to help solve my problem. We didn't want to go down that path. So then the question is, what do we do? Chapter four: conversing with users at scale. This part introduced the concept of, how do I automate, and what does automation mean? When we distilled the dialogues down, we wanted to get a better understanding of what was going on. And so we introduced a few concepts here, three in total. Number one was data standards, number two was milestones, and number three was automation.
So first, across all the products, teams had gone and done their own data standards. So we said, enough of that. We now have a good enough idea of what users are doing, what their behaviours are, and what information is important to us, so let's lock down those data standards. So we locked those down. The second thing we did is we defined this concept of a milestone. Earlier today, and even yesterday, the whole jobs-to-be-done idea came up. So jobs to be done is: my job to be done is to have a holiday in the Blue Mountains. But that's broken up into, will I need a car, I need to get fuel, I need to get all those other things, and pack the car, and use, let's say, quote-unquote products along the way. Milestones are those sub-elements that get you on the way to success in a product. So in the software example of, let's say, a database, there are essentially three key milestones in a database product. Number one, I created the database. Milestone one. Milestone two, load data into the database. Milestone three, query data out of the database. That's it. Every product, we said, has between three and five or six milestones, end of story. If you've got more, you've gone too far into the weeds. If you have fewer, you haven't distilled how to get that user to the point of value. And that triggered us to go into the third point, around automation. Using the milestone model we were able to say, great, you've done milestone one, why don't you now go and do milestone two. Great, you've done milestone two, now try and do milestone three. And you gave them directive instructions. A very behavioural trigger, real time; it was in-app and email. And there was an extra chat component if they still needed more help. And that was really powerful. So we set up a whole series of rules. And because we had standardised, we could actually start scaling this across, at that point, about 25 to 30 products. So we were really trying to get this up to scale.
And so we did that, we ran it, and then we came to our data scientist and said, hey Caitlin, how do the numbers look? This is how they looked. The conversion rate from trial to paid improved four times. Our product adoption, which for us was post-purchase, how many people actually used the product afterwards, improved by 30%. And billable usage, which is a proxy for revenue for us, improved 17%. And if you look at the rules, I put some proxy meta-code up there, it was that simple. None of this is rocket science. This was: if one, then two; if two, then three. Very, very, very, very simple rules. There was no AI, there was no ML; just the basics drove that performance. So at this point we had nailed what the value point was. And this generated a tonne of success, because we saw these results, but it also generated a tonne of demand. Before, we had been begging product teams to work with us; now the product teams were coming to us and saying, hey, can we do it? We had more demand than we could actually support. So that was a really key shift. Takeaway number five here is: the only way to scale value is through automation. Okay. So where does that take us? Let me fast forward us a bit to the present. I'm about to share our org structure as it stands today. What you've seen so far is about a three and a half year journey. What started out with two people, myself plus two people, is now a hub team of 35 people, and north of 1,500 people across IBM doing this stuff. So massive scale over a period of three and a half years. I'm showing you the org structure not because this is the right org structure, but to share with you what we've come to after evolving five times. Every time I went through a chapter, we reorged the team.
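The "if one, then two; if two, then three" rules he describes can be sketched as a tiny rule engine: given the milestones a user has completed, nudge them toward the next one. The milestone names and nudge copy below are hypothetical, modelled on the database example from the talk, not the actual IBM rules:

```python
# Ordered milestones for a hypothetical database product,
# following the talk's three-milestone example.
MILESTONES = ["create_database", "load_data", "query_data"]

# Hypothetical nudge copy, delivered in-app and by email.
NUDGES = {
    "create_database": "Welcome! Start by creating your first database.",
    "load_data": "Nice, your database is ready. Try loading some data.",
    "query_data": "Data's in. Run your first query to see results.",
}

def next_nudge(completed):
    """Return the nudge for the first milestone the user hasn't hit,
    or None once they've reached the point of value."""
    for milestone in MILESTONES:
        if milestone not in completed:
            return NUDGES[milestone]
    return None

print(next_nudge({"create_database"}))  # nudges toward loading data
print(next_nudge(set(MILESTONES)))      # None, user is successful
```

Because the data standards guarantee every product reports milestones the same way, the same handful of rules scales across dozens of products, which is exactly why no AI or ML was needed to get those results.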
So today what you've got is these four core teams with a foundational element at the bottom. At the bottom you have engineering and data science, so they inform and empower everyone. The first box is growth platform. That's a classic product management and engineering team which focuses on the technology and capabilities that make this stack work. You have growth and nurture services: a set of specialists who know how to do growth hacking, how to build campaigns, how to build nurtures. It's like an internal agency that helps product teams evolve, helps them deliver campaigns and mature their product. So everything from, how do I build my own growth programme, through to, how do I turn my transactional product into a SaaS product, through to some tactical, how do I build a campaign to do A, B, C. They support not just product teams but marketing teams, client success teams, sales teams. There's an adoption and learning team, which focuses on training, just basic capabilities. And then journey systems, which, if you remember chapter one, was the horizontal; we've spun that out, and we brought in an amazing leader for that group, who actually did all the journeys for the U.S. Department of Veterans Affairs, and they've built out an amazing model that transcends all of IBM. So that's how we drive growth in IBM. Within six to 12 months that will change, I guarantee it; every six to 12 months we change. I think the big lesson out of this is, if a 107-year-old company like IBM can pivot in that period, then everyone else should be able to achieve that. But to do that requires a real level of ambidextrousness, if that's a word, dexterity, at the leadership level but also in the people that you have. If you bring in highly specialised people to do roles, they may struggle when you say, okay, well, we're pivoting a little bit.
So you gotta really think about how you get that balance so that a growth team really does have that two-sided mentality. So takeaway number six, a big one: expect to pivot and evolve very regularly. As I said, every six to 12 months; it's pretty important. The other part that's probably worth mentioning is you kinda need to have an agile organisation. What does that mean? You need to be agile. I think the term I heard in the previous session was, don't do agile, be agile. That's this. Growth really needs to be agile at the core, not only in the way that you work with your product teams, marketing teams, sales teams, and so on, but also in the way that you yourself are operating. So, fantastic, sounds great, we got to today, driving value, or so we think. But there's a part where we say, there's always more. Three or four years ago we were part of the product team. The growth team in IBM has gone through several phases, several evolutions. One was reporting into the product team, another was reporting into marketing, and now it's a standalone entity that's a peer of product and marketing. And the question is, well, then what do you do beyond the product? So I'm gonna show you a couple of quick snippets of what we're doing with client success. Based on all this, we came up with an engagement framework for client success. A really important part of B2B SaaS is, how do you have a good client success engagement model? And so we came up with a model that has these four words: arm, alert, engage, and automate. This is where we decided to get more sophisticated with how we use the data, how we use that automation we spoke about earlier, how we activate chat. So in arming, we use Gainsight. We took a lot of that usage data and fed it to the client success managers so they could see the things that were relevant to them.
You see the milestones I was talking about; they're now visible to the client success manager. Now they know what the account is doing. Alerting, and this is where I say we got more sophisticated: we built some churn prediction models. They said, based on your usage, what's your chance of attrition, or your chance of expansion. So we can now build those models and then alert the client success manager, not with big numbers and metrics, but a simple call to action: something's happening, go talk to them. It's literally a checklist that comes up for them in Gainsight. Engage: we found there was an issue with credit card billing, the credit card swipe on Cloud. Because of some policies, 40% of people were being rejected, not for fraud, just for other reasons. And so we put in a combination of an in-app message, like, oh, oops, something happened with your credit card, do you wanna chat now? As a result, that drove multiple millions of dollars of ARR, just from a simple intervention like that. And then automation. When someone signs up for, or when someone has now paid for a product, we build that automation to introduce them to their client success manager, so even though it was automated, it felt personalised. So, very simple things that expanded well beyond the original remit of growth, which was just, let's get product and marketing to work together. So really this takeaway is: simply satellite out into all the disciplines.
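The "alert" step can be sketched very simply. The real system used churn-prediction models, but even plain threshold rules over the same usage data can turn metrics into the short call-to-action checklist he describes surfacing in Gainsight. The field names, thresholds, and messages below are invented for illustration:

```python
def churn_alert(account):
    """Return a checklist of calls to action for the client success
    manager, or None if nothing needs attention. Thresholds are
    illustrative, not the real model."""
    actions = []
    if account["logins_last_30d"] == 0:
        actions.append("No logins in 30 days -- check in with the account.")
    if account["milestones_completed"] < 2:
        actions.append("Stalled before point of value -- offer onboarding help.")
    if account["billable_usage_trend"] < -0.25:
        actions.append("Usage down sharply -- churn risk, go talk to them.")
    return actions or None

at_risk = {"logins_last_30d": 0, "milestones_completed": 1,
           "billable_usage_trend": -0.4}
healthy = {"logins_last_30d": 22, "milestones_completed": 3,
           "billable_usage_trend": 0.1}

print(len(churn_alert(at_risk)))  # 3 items on the CSM's checklist
print(churn_alert(healthy))       # None
```

The design point is the same one the talk makes: the CSM never sees raw numbers and metrics, only a short list of things to do next.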
Really we've got a hub and spoke model. As you saw in the picture, there's the hub, and we have spokes in client success, spokes in marketing, spokes in all of our business units, so in security, in Cloud, in blockchain, in IoT. And we also have spokes in the geographies: we have a team actually out here in Asia Pacific, a team out in Europe, a small one in MEA, a small one in Latin America, that are actually executing as well, so that the core technology, standards, capability, and skillset remain in that core centre of excellence, and you've got these spoke teams that are feeding off it, learning, and also being agile and learning from it. So given all that, was it all worth it? Like, what's the huge value here? The first thing I can say is we got awards, so we're pretty excited about that. The one on the left is a Braze award; we just got awarded that two nights ago in New York, for the best partner use case of the year. We actually use Segment, Amplitude, and Braze together, and the way we use them, they think it's pretty sophisticated, so we like that. And it's driven some great value. The one on the right is from Segment, for the data impact award. When you see the values in a second, you'll understand why, but this has driven a tonne of value, and so from Segment we received that award about a month ago as well. It's great to get awards. And awards are what pays our bills, right? (laughing) We actually ran the numbers as well. Here I'm showing a chart where purple is the investment we've been spending, and green is the return. Okay, and these are dollars, U.S. dollars. But you don't know that because there's no vertical axis. Just trust me on that. So, chapter one, it was all cost. We spent money, it didn't give us anything. By the time we started doubling down on data there were some returns, limited, but some returns.
When we started engaging, having chat, we started to see some increases. At automation, chapter four, you can see we've now doubled: we now had a two-to-one ROI. And today we're north of a six or seven to one ROI. These numbers are actually probably about nine months out of date; this year we're on track to be north of ten to one. So that's the performance we've been able to get as we've grown. And most people think, it's IBM, you've got tonnes of people, tonnes of money, tonnes of resources. No. At the beginning I was told to do it with two people. And as you get those little successes you build, and build, and build, and at some point you hit a tipping point. And so for us, yeah, we've made money. So with that, let me close out and summarise the key takeaways. Number one: to engage digitally, do growth, big-G Growth. That was data centric, user focused, hypothesis driven, and agile. Two: for good journeys you need aligned group priorities and good data. Three: focusing on data makes you interesting, but not really yet valuable. Four: interacting with users is the real value driver. If you take nothing else away, just take that one; it's all about users. Number five: you only scale value through automation. Six: expect to pivot as a growth team and evolve very regularly, to reflect and adapt to your organisation, and to reflect the cultural change that's also happening in your organisation. And lastly: satellite out into all disciplines. Don't constrain yourself. Don't think growth is, I'm just here to fix one thing; the whole customer journey is yours to take. And with that, thank you very much. (crowd clapping) (upbeat music)