Framework is a podcast about the process of researching, planning and building that goes into bringing a product to market.

If you’d like to hear someone else’s product story on Framework — or tell your own — we’d love to hear from you.

Framework is hosted and produced by Rob Hayes and Tom Creighton © 2018



Eric Puigmarti is a Product Designer helping startups and established companies design human-centered digital experiences. In this episode we talk about his work on Payment Schedules at Freshbooks.

Framework: You're listening to Framework, where we dig into the planning, research, design, and development work that goes into bringing a product to market. I'm Rob Hayes – and I'm Tom Creighton, and today we're talking about payment schedules with Eric Puigmarti. Eric, how about a short intro to yourself and where you work?

Eric Puigmarti: Hi guys. Thanks for having me. My name is Eric, I'm a UX designer at Freshbooks.

F: So: payment schedules. What is the product problem that you're trying to solve by building this?

E: So at Freshbooks, we're building a world where small service-based businesses can successfully run their businesses without having to learn accounting. That's the platform on which Freshbooks was conceived – and a big part of that for businesses is being able to bill their clients for the work that they're doing. Freshbooks today already has invoicing and other billing tools, and payment schedules was conceived as an extension to that, and an optimization of certain processes and workflows that we've started to see our customers deal with within our platform.

F: What kind of problems were your customers running into that you discovered along the way that really put you on the path towards building out payment schedules?

E: Early last year, we did a lot of ethnographic research where our teams actually went out of our office – to meet customers in their homes, in their workplaces, in co-working spaces – to start to get to know our customers a lot better. In that research, we uncovered many different insights about our customers when it comes to running their businesses. Things like: how they get paid for the work that they do, and some of the challenges with asking for money. A lot of freelancers and small businesses really struggle with that kind of relationship – asking for money is a really challenging problem for them – and a big part of that is that a lot of the tools within Freshbooks aren’t flexible enough to match how they want to bill their clients for the work that they're doing. A lot of the insights that drove this were born from really talking to customers and seeing how they interact with our tools. The big problem that arose specifically is that a lot of our clients who work over long periods of time or do large projects often have to bill in these large portions, which can be a big challenge. As a client, if I'm hiring someone to build my website and they bill me for ten grand, it's very likely that I won't be able to pay that all at once. So the solution that we came to was a way to break down this huge project into smaller pieces, to make it easier for everyone to manage that payment process.

F: So this wasn't particularly a new problem that you were uncovering in your research so much as creating a piece of functionality that everyone could potentially use.

E: Yeah – it's not necessarily a new problem, but it was kind of an extension to the solution that Freshbooks already solves as well.

F: So when your team is actually going out to do that ethnographic research, are you going out with a particular problem in mind to research, or are you more just looking at your users’ workflow to see what kind of problems and patterns pop up as you watch them in their day-to-day?

E: It's a little bit of both. At the beginning of last year we started with some base understanding of our customers – who our customers are, what they know, what they do, and how they run their businesses – just to expose some overall issues and potential opportunities for us to investigate. So that was the first step, and then a lot of the product teams at Freshbooks went out last year to do a lot of this generative sort of research. Then we took all that research back and started to identify a few key areas that we might want to explore, and that helped us dig a little bit deeper into the specific parts of the product that each of our teams focuses on.

F: So each team took a different tack on what was interesting to their functionality or their vertical within Freshbooks?

E: That's correct. Each of the product teams has a specific mandate or focus area within the product – not specifically a set of features, but sort of a general sentiment of a different part of the product. So as each of the teams went out and did this very generative research, we started to figure out if there were learnings or insights that other teams had gathered that could help us figure out potential problems to start exploring for that given year.

F: So who on your team gets involved in this? Is it just design leading this, or do you have researchers? Is engineering involved in any way?

E: This was a bit of a new process for us, starting last year. Initially it was just the designers and the product managers from each of the teams across the org. There are about 10 different product teams – so 10 separate teams went out to do this research. So it was just design and product, essentially.

F: Can you just give us a bit of a walkthrough of what your research activities look like? I'm kind of curious how you recruit and who you recruit, and what types of research methods you use for this.

E: A lot of this generative research, we actually call ‘customer intimacy’. It was really an exercise to get a little bit intimate, to get to know our customers. It's a great term – we didn't come up with it. But what we did essentially – you know, Freshbooks is a Toronto-based startup – so we reached out to a list of thousands of customers that are local or in the GTA, and we essentially sent out an email saying, ‘Hey, Freshbooks is really striving to get to know our customers better’. Over the years, we’ve found this process to be very useful for helping us identify potential ways that we can improve our products. So we reached out to over a thousand customers, asking them to invite us into their homes, their office, a coffee shop – wherever they work – so we could get a better sense of what they do and actually see the environment in which they work day-to-day. We sent out an email just to recruit them, and then once they signed up, we ran through all the logistics in terms of finding an appropriate time to meet with them, and obviously figuring out transportation – how we get around the city to meet all these different customers. My team specifically, I think we met with about 12 to 15 different customers within a couple of weeks.

F: I've actually gotten one of those emails before and I didn't respond because I wasn't sure how much I wanted people judging me as I sat around my home office in my jogging pants.

E: That's definitely a big challenge in doing this sort of research – getting to an understanding of what's the value for our customers in us doing this, but also what can we get out of it, and how can we make this a comfortable and enjoyable experience for both parties? So that's really key for us, for people to open up to their homes or their offices or even just take time out of their day to meet with us.

F: I imagine you got a lot of really good data from seeing people use the product in the place where they actually use it. How did you take that back and then synthesize it, and eventually end up with the idea of payment schedules – versus any other kind of solution or approach to the problem that you found?

E: It took several weeks, to be honest, for all the teams who went out to do this research to come back and synthesize all of it. A lot of that lives in a Dropbox Paper library where all of our teams can access each other's research. We started pulling out key insights and things that we learned along the way that pertained to our team, while also looking across the org to see if those insights matched up with others'. It took a number of weeks just to go through all the data and all that research. We took photos of people's offices, we recorded all the audio – so we had to go back through some of our notes – but at the end we had a lot of these really key insights that, to each of the individual teams, seemed like opportunities to explore further, and that helped us set up the next few quarters of work to be investigated. That doesn't mean we were necessarily going to commit to building any of these things – it was more a focusing tool for us to figure out: okay, we think there are opportunities in X, Y, and Z places, so we're going to commit to exploring that a little bit this year.

F: That's interesting. So you'll actually go back to the office and compare notes with all the other teams there. Just for my context and our listeners': what team are you on? How is it defined?

E: I’m on the Payments team which actually spreads across a couple teams, but essentially that's the area within Freshbooks that helps our customers get paid for the work that they do, and also bill their clients effortlessly for the time that they track for a given project.

F: So you're actually seeing that a customer's workflow probably cuts across almost every team in the company, so the problems that you're running into don't fit nicely into just the scope of your team.

E: Yeah, a lot of the insights that we gathered span across so many different areas within the product, so it's really important for us to be able to access each other's research. So much of it was just spread across different areas within the product specifically.

F: You mentioned that this is a bit of a new way of approaching feature development for Freshbooks. Did you have this very collaborative process in place previously, even outside the context of research?

E: Not necessarily! At Freshbooks we've been – in the last two or three years – really rebuilding and redesigning our entire platform from the ground up, through research, through design, and through development, all the way across the board. So the past two years for our teams have really been about building this foundation for the future of Freshbooks. Once we got to the point where the new platform was released and accessible to customers, the product development team – which encompasses design and product – looked across the board and said: how do we know where to go next? Like, where do we start? So in some sense, it was kind of a blank slate for us to start building on top of this platform that we poured all this hard work into. The best way for us to level-set was to do a lot of this generative research and start to figure out what problems our customers are trying to solve today, and whether there are ways we can help them meet those goals.

F: You said that payment schedules was one of the areas of opportunity that you identified amongst a handful of things. How do you go about planning, prioritizing, and building out the roadmap in terms of which problems you tackle first?

E: One of the big themes that our team specifically highlighted – one thing that we identified – is that Freshbooks wasn't flexible enough to meet the billing needs of a variety of different types of customers. The way a development shop bills their clients is different than a recording studio, or a marketing agency, or someone who sells pool toys. So all of these different businesses have their own way of running, and of billing their customers. That was the big theme which we identified, and we thought: hey, there are many things that we can explore within that. We committed to that theme, and within it there were several different insights we could dive into. At Freshbooks, our product teams are very autonomous in the way that we choose what to commit to. Once we had that theme, we had a few key areas in which we could explore. Some of them we explored a little bit and did some further research on, and before actually coming to a solution we agreed this wasn't something we wanted to invest in. So we would just keep it on the back burner and then move on to the next project to investigate. We actually went through several different small experiments where we tested some of these features – or others that we just researched and realized weren't a good opportunity – before getting to payment schedules, which kind of evolved out of that.

F: So your team's actually going through some validation – some solution validation – on problems before committing full resources and your team's time to building out a proper product release?

E: Yeah! Another big thing that our team adopted last year as well – there's a book by Ash Maurya, “Running Lean”, that talks about this process of developing a plan to validate a given problem. So once we had a key area to investigate, we went out to talk to customers, to dive into a given problem and figure out: is this something that we can solve, and is there a market for this? And then once we had more information about that, we could start to figure out: okay, here are some of the problems that we identified – let's go back to customers and figure out which of these problems are the biggest pain for them. Once we have a set of problems identified, we can figure out which ones are really dire – which ones our customers really need us to solve for them. So that helps us prioritize some of these things as we go along, before even getting to any sort of ideation or development for any of these potential solutions.

F: It sounds like you or your team picked an off-the-shelf framework to help you think through these ideas and validate their worth to both Freshbooks and to your customers. How much did you have to adjust it in terms of how well it aligns with what Freshbooks is trying to do? Or could you just run with it from the get-go?

E: A lot of the team just ran with it from the get-go and changed it as needed. There weren't many significant changes to the process. What we did find using that sort of framework is that it took a little longer than outlined in the book, where he goes through things very quickly, in a matter of two or three weeks. So the timing was a little bit off, but the overall process and the different stages to validate an idea aligned pretty closely with how we wanted to start tackling these sorts of problems for our customers.

F: So for payment schedules, how did your team go about validating this or prototyping it, so to speak, before you knew that you wanted to invest in the solution full-time?

E: Previously, in the old platform of Freshbooks, there was a feature called partial payments which only half-solved the way that customers were looking to bill their clients. What that did, essentially – Rob, if I invoiced you for a thousand dollars and sent that invoice to you, you could choose how much of that $1,000 you wanted to pay. Which, obviously, has a lot of flexibility to it. But at the same time, you could put in a dollar, which wouldn't necessarily be great for me trying to collect $1,000. A lot of our customers who moved to the new platform already knew that there was some sort of existing functionality which we didn't put in the new platform. We got a lot of customer support feedback from customers who were looking for some sort of functionality like this. That was a big indicator for us that this was a good opportunity to explore. What we actually did was a quick analysis of those types of customers, to figure out: are there some key criteria or attributes about these customers who bill their clients in this way, where we could start to find trends? We actually looked through a database to figure out some of the key customer segments that were requesting this sort of functionality, and then we went out to do more of a deep dive into these problems with these types of customers, to figure out: is this a problem that this general type of customer has? That kind of helped us find the right people to validate this process with specifically.

F: So when you did that deep dive, did your assumptions actually get validated, or did you have a larger takeaway that maybe you hadn't quite hit it right on the head?

E: I think the problem was fairly closely validated to what we expected going in, but the actual solution – the way we would actually implement it – was quite different than what we expected. We knew that we wanted to allow our customers to break down an invoice into smaller chunks – when I'm working on a big project, for example. But a lot of the other key things we didn't think about were, say: how am I going to break that down? So some of the technical feasibility was something we weren't really sure about, and also the flexibility of a lot of other features that would need to interact with this. For example, if I'm breaking an invoice into four installments, some of the things that customers really needed, if this was going to be the new way of billing: I want to send a notification to my client before an upcoming payment, for example. Or just remind them before their payments are due, or have a clear way of letting them know – here are the four payments that we agreed on within a given project beforehand. So a lot of these key ‘nice to have’ features started to emerge that we didn't expect going in.

F: Having balled up all this research, and validated and fine-tuned based on that further feedback, how do you then take all of this very actionable information and planning and sell it up the chain? What's the process to get the green light, or get the sign-off, within Freshbooks?

E: I think one way to look at it: in other companies, the way of going through a development process is you do a little bit of research, then you design this feature or this new thing you're working on, then it goes into development, and it could be weeks if not several months before it actually gets shipped and is in the hands of customers. So there's a huge amount of risk in that – maybe the feature isn't built the way that it's supposed to be, or maybe the feature actually isn't needed for these types of customers, or the market’s wrong. A big part of this as well is pricing – can we charge for this feature? So there are a lot of risks along the way, and the way that we sold this process to the rest of the company is that we have a lot of these signposts along the way that help us de-risk some of these assumptions, so that we don't work on a project for six or eight months and then release it with little to no benefit for the company, but also for our customers. We flipped it on its head – if we can go out and validate a given project in a week before actually building the thing, that's a huge win for us. Having these key check-ins with customers along the way helps us validate each given assumption. We're de-risking the possibility of launching something that just doesn't fit what our customers need and want.

F: It sounds like your team has a ton of autonomy to identify which problems they want to work on, but you're talking about selling this into the organization. Who in particular do you need to convince that this is a worthy investment of your team's time?

E: Our product managers, who have their own direct reports; the VP of Product; as well as the rest of the executive team, which includes the heads of other departments within the company. Once we have a key problem that we believe we want to pursue for a given time, it gets pitched to the VP of Product, and then we get feedback on it as we move along, and then that goes into an OKR for a given quarter or a given set time within the year – that we're going to commit to researching, or investigating, or building this feature in this much time in a given year.

F: So it sounds like this is all part of a larger planning exercise to identify items and get them on the roadmap. From what you're describing, the time period – from the generative ethnographic exercise you're doing, to selling items through to the VP of Product, to when they actually show up on your roadmap – covers a span of a couple of months or even more. Is that the right way to think about it?

E: Yeah, that's right.

F: So, just backing up maybe a couple of minutes: you mentioned framing this up in terms of OKRs, and also having some guardrails around how you were approaching this problem in terms of de-risking, which is really interesting to me. I think most organizations probably have those kinds of internal levers that problems need to be framed up around. I don't know how deep you can go on this, but I'm really interested to know: in the world of Freshbooks, how closely aligned with those OKRs does a new solution have to be before it's approved or built, or what have you?

E: I think that definitely depends on the different teams and what their focus areas are. For my team specifically, given that we're in the payments space, a lot of it is tied to some of the key metrics that we’re responsible for, which for us is online payments, for example. So: actually allowing our customers to get paid online and do transactions online. There's a financial gain in a feature that might pertain to that, as well as other key metrics. As I mentioned earlier, one of the bigger challenges we had was not having a flexible way to bill clients. So as a result, that could contribute to retention of customers, or churn. The best way for us to frame these projects is in terms of what sort of investment we would make, and what metric we think we could actually move for the better for the company, in order to have enough value to pursue it for a given time.

F: So before you kick off a project, you've got a measurable goal defined that your team is looking to hit with this feature. You're putting a line in the sand to say: we believe this is going to effect X change in our online payments behavior.

E: Yeah, we set those benchmarks within the process of going through and validating any given project we work on.

F: It seems that kind of behavior is missing from a lot of product development work done in this world, where the actual measurable outcome isn't really clearly defined and understood by the team before they break ground on building the feature.

E: Exactly – and I think that can also help teams identify how much they want to invest in any given solution, if there's a huge potential to increase a certain metric. It's payments in our case: if the feature we build has a huge potential to increase the volume of payments, then we can figure out – okay, how much of an investment do we want to make before we get to the point where we realize whether this is actually working or not? So it could be a very lean approach, or it could be several months: we actually want to build this correctly, so we're going to spend a lot of time building it from the ground up, very strategically, with some security knowing that it can have a big impact. Versus other examples where we don't know if it's going to have a huge impact – we might invest less time in that, but we still want to validate the problem.

F: So there's some sort of ROI calculation that determines how much time and resources your team should sink into getting this project out the door. Did you get any pushback from the team or leadership on this project being a priority? Or even on it necessarily having the same value that you saw?

E: Not necessarily. I think a big factor in this was all the customer support feedback that we had going into it. We knew there was a strong enough need, and it was a visible need – it wasn't just us going out and researching this new thing. For the rest of the org, seeing that our customers were requesting this thing meant there was some sort of need already identified. I think that kind of helped propel us forward a little bit; we didn't necessarily get a lot of pushback as we were going through. There was definitely a lot of great feedback from the VP of Product and other people within the org at the different stages of the process – level-setting with them to make sure we checked in with customer support to ensure what we were proposing actually made sense from what they hear from customers, or chatting with our finance team and projecting what sort of impact this might have. There are definitely a lot of check-ins across the org happening kind of organically at Freshbooks, but I don't think we hit a lot of friction specifically as we went about building out this feature.

F: Okay, so you are convinced that this is a priority – you've managed to convince the team that this is a priority as well. You've defined out all of the validation milestones that you want to hit along the way. So what is the first step in the build process here? How do you actually begin to break ground on a project?

E: Right. Taking all that customer research that we started off with, and all the insights, we break that down into a lean UX sprint. What that means is we take this problem – we have all these insights from customer support, from our research, from other people in the org, that help us define what the key problem is – and then we do some solution brainstorming where a product manager, a designer, and a developer or another technical person from our team are part of that ideation, along with other members throughout the org with different specialties that might be useful to this sort of ideation. We go through a brainstorming exercise to figure out: how might Freshbooks solve this problem? That process goes through a lean UX sprint, which typically takes about a week, to get to a high-fidelity MVP. Then what we typically do as well is have our development team do some feasibility and scoping on the proposed solution before we actually go forward and validate it with customers. We try to get development as involved as possible so that we don't go weeks and weeks ahead, only to have to start over or circle back to things that just aren't possible to build.

F: So as part of either the ideation or the feasibility study, is there a notion of breaking this feature that you've identified and planned out into a V1, or an MVP? Or is there the idea of an indivisible feature – like it has to be at least at a certain level before it launches?

E: The way we went about it for this feature: when we did the brainstorming exercise, it was kind of ‘this is what the full solution looks like’, and once we had that idea, we could figure out whether there were a few lines we could draw – this might be phase one, this might be phase two, this might be a nice-to-have for the future. And then what we actually did was bring this MVP idea back to our customers before we built anything. We actually try to pitch that idea, with that phase one, figuring out: with this set of features, is this enough to solve the problem? That way, if they say no, we can identify whether we actually need this other piece. We can have a better idea of what will encompass a lean MVP – enough for us to get into market with customers using it, for us to actually figure out how it's doing and whether or not it solves the problem. We go back to customers in that research phase to figure out exactly what this product looks like, without having to encompass the whole end-to-end experience.

F: Are you putting artifacts in front of them at this point or is it just conversational interviews that you're running?

E: At the end of that one week of brainstorming, we actually develop a high-fidelity prototype. It's typically an InVision prototype, which we use as an artifact to pitch the product to our customers before we build.

F: Are you going back to the same audience throughout this process or do you bring in new customers with each successive round of interviews and research that you do?

E: We typically do go back to the same customers. From the first part of the project, where customers identified this problem, we'll go back to the ones where the solution still maps to that problem and figure out: hey, this way of solving it – is this the right approach? Is this actually beneficial for the way that you run your business? That helps us do a level-set and get consistency within the whole project, rather than going out to a new set of customers, which could fail for any number of different factors.

F: So that kind of feels like another form of de-risking, in terms of measuring your predictions around the MVP against some real-world usage. How do you, as a team – or even as part of another ideation cycle – manage changes or iterations that you get after that first check-in with actual customers?

E: So once we get that feedback from our customers, we do a level-set and figure out what sort of changes we need to make up to this point – all the things we talked about. We haven't actually written any code yet, so our ability to make changes and pivot a little bit is quite flexible at this point. It was really easy for us to tweak things as we saw fit, before our development team started building the foundation for the feature-to-be.

F: Even if they haven't written any code, is the development team involved at this point in any capacity just even to provide their input?

E: Yes, for sure. They're definitely a huge part of that ideation session, and then they also set up their own sidebar where they do a bit of a deep dive into some of the scoping and feasibility of potential solutions, and figure out what sort of challenges they might have on the development side going forward with a given solution. They start to do a little bit of a spike into that while they're also doing other sorts of work at the same time.

F: I guess at some point you've done the work of the MVP and going back out, and you've actually built the dang thing. How much love do features like this get as they actually get built and launched? What's the process around revisiting or rethinking stuff that you have actually built?

E: For this feature specifically, we did a soft launch where – out of the 20 or so customers that we did some research with earlier on – we invited them to get beta access to this feature. We released it to about a dozen or so customers, and we gave them access to the fairly lean version of the feature for three to four weeks. We started to find some of the key bugs that came up, and figure out what sort of additions, other features, or other parts of the solution might be missing. That de-risked the big launch for us as well. We learned a lot along the way within that month of beta testing the feature, both internally and, as I mentioned, with this group of customers, which helped us iron out a couple of key things that we wanted to tighten up a little bit before mainlining this feature to the rest of our customers.

F: So during that initial beta release, was there anything foundational that you discovered was wrong about your product decisions, or was it a lot of just kind of sanding down the rough edges and discovering little bugs here and there?

E: Out of the beta month there weren't a lot of big things. There were a lot of technical workflow issues and usability things that we noticed, that we wanted to optimize for a more polished shipped product. But there weren't any major changes that we needed to make along the way. It was mostly just finessing some of the key things that either we didn't include on purpose, or things that came out that we knew were must-have additions to the feature before the rest of our customers would be able to use it for their own business.

F: So what are the milestones that you needed to hit with the beta release before your team actually felt confident and pushed out a full release to your customers?

E: We were monitoring the activity of this beta group very closely. We set up a number of metrics and analytics and things that we were keeping an eye on. Throughout that process we used a tool called Fullstory, which actually records our customers using certain parts of the app. We got to watch customers using this feature on video and troubleshoot along the way, to see how they're actually using the tool – so we could actually see how long it takes someone to use this feature. Do they add it and then remove it because it wasn't what they expected? Are there any other bugs that arise within that? That was a good way for us to keep track and, within that month, figure out some of the things that we thought needed to be improved. At the end of that beta period we reached out to the same group of customers, just to get some qualitative feedback as well, to figure out: is this something they would actually use, or is this not quite there yet?

F: How have you enjoyed using Fullstory? I've never actually had the chance to use it.

E: We almost exclusively use it when we're launching something new. It's great to build these key playlists of different features, so we can actually go through them and see if there are usability issues in any area of the product. We can spot bugs, we can follow up with customers – our customer support team uses it very heavily, actually. If a customer calls in or emails in with a problem, they can actually go through Fullstory and go back to the point in history where this thing occurred, or where the customer was having trouble, and then they can actually see the whole session in a video recording. That's really helpful for troubleshooting these sorts of things where typically, if someone's writing an email or talking over the phone, it's hard to describe what they're seeing. It's almost live – it's maybe a minute or two delayed from what's actually happening. It's pretty incredible for that.

F: All right, so payment schedules is live now, right? And you sent us the blog post. I was going through Freshbooks last night prepping all my taxes, and I kept coming across payment schedules – it looks like the kind of thing I can work into my life in 2018. What's been the user response to it so far?

E: The way we did that internally: on a Thursday or Friday, we essentially turned on this feature for all of our customers, but we didn't actually have any marketing communications go out for another week or two within that time frame. As soon as we launched it, it was a pretty hidden feature – it was just a link, essentially, on the invoice – but within minutes we saw dozens and dozens of payment schedules being added and created on invoices. So as soon as we launched it, we were glued to Fullstory seeing how it was doing, and we saw this huge spike. For us that was an amazing thing. We weren't expecting it to have a lot of discoverability without a product update saying, hey, here's this new beautiful feature, go try it out. That was actually really awesome for all of our team that spent a lot of time building it, to get that instant feedback. It was great. We got a lot of emails into support saying 'I've been waiting for this feature for so long'. We actually got a number of tweets from customers as well that said 'These are the things that make Freshbooks really amazing, that make me love you even more'. Hearing that was definitely great feedback for our team – it warms your heart for sure. Unless you're very close with your customer support team, a lot of people – designers and developers – don't always hear the positive things that customers are saying, so being able to relay that back to our team was really great for overall morale, and for feeling like the work we're doing is actually having this impact for our customers. Seeing that impact was really big for all of us.

F: We talked earlier about how you were framing this up in terms of an actual tangible metric to move, or to increase, or what have you. So it's live! How close were you? How did it actually go?

E: It's still TBD. It's a little bit early to report on some of those metrics. We released this feature to customers back at the beginning of December, and a lot of our customers bill their clients on a monthly basis. Whenever we launch a feature we often have to wait at least a one-month cycle for customers to potentially even see the feature, let alone use it, before we can measure some of that – but so far we're pretty close. We're hitting the key targets that we had, given that December is kind of a low month for a lot of our customers. In January, once the new year starts, it kind of spikes back up. So we'll have to report back on that, but it's definitely getting a lot of exposure – a little bit more than we were expecting. It's a really good sign for us.

F: How is your team actually tracking the product usage data, and how are they exposed to it – I guess that's the better question. Do they have TVs on the walls with Google Analytics and product usage data, or are the PMs sending around reports every week? How's the team getting exposed to this?

E: Our project managers on each team are responsible for building these dashboards for themselves and for the team. Often at our daily stand-ups they'll report how features we've launched are doing, but the dashboards are also accessible to others within the org, so that's one of the key ways that we keep everyone up to date. Occasionally, we do org-wide all-hands where we share success or failure stories of things we've launched and the experiences within the product – so other people in the org can learn from the process that we've gone through as well. I'm sure in the months to come we'll probably be presenting this to the rest of the org: the process, the good and bad things that we learned going through it, where we're at with the feature, and how it's doing today.

F: So your team is actually using your daily stand-ups not just to look ahead at the work to be done, but also to look back at the work that has been done, the stuff that's been released – just seeing how it's performing, how people are reacting to it. That seems unique. I'm not sure enough teams really bake looking back on the work and measuring the work into their process.

E: Yeah, I think it's a really big thing in product development – it's easy to ship something and kind of move on to the next project and not look back.

F: Yeah. I assume that was successful, everyone probably loves that right? 😅

E: It's probably been pretty good. So that's one of the things that we've been trying to focus on as well: how do we keep the work that we've completed, or something that we've launched, top of mind, and continue to keep an eye on it to see how it's doing and figure out if there are ways in which we might want to come back to it as we go forward? Over time we might identify that, hey, there's another problem with this, or this caused a problem in a different place within the app. Keeping these top of mind helps us figure out whether we need to revisit this project, or whether there are things that we left behind that we might want to pick up.

F: So with this new end-to-end process, starting with research – which you said was pretty new for your team, pretty new for the organization – do you think that allowed you to be more accurate in how you thought this would turn out?

E: I think so. As a designer, I can definitely say that I felt a little bit more confident going through this process, because it de-risks a lot of the work that design often does in the product development process. I felt more comfortable as we were going through it, and as our development teams were building this feature, that we weren't building it one way that might not work for other people, or going down a rabbit hole in some sense. So I felt a lot more comfortable just always having those check-ins with customers along the way, to ensure that we were on the right track and that what we were focusing on was actually hitting those key needs for them. I think that really helped me feel more confident going through this, rather than getting to launch day for the feature and just biting your nails, trying to figure out: is this gonna do well or not?

F: Based on that, what would you say is the one big takeaway that you had from this experience? What's the one big learning you got from pushing payment schedules out the door?

E: I think the big learning is that this was a little bit rough because it was only the first or second time going through this process – but as you go through these different stages with new ideas, you can really go through all those steps in a very short amount of time. There are a lot of ways to trim down the overhead in this process. That was a big learning for us: that we could actually do this process again for another new feature in probably half the time or less, so we can speed it up as we become more comfortable with it. So, looking forward to new opportunities for us – I think we can really speed up the process. That's pretty exciting.

F: So that'll do it for us for this episode. If you enjoyed this it helps a lot if you leave a review or rating on iTunes or recommend this podcast to a friend – and a big thank you to Eric for joining us today and to you for tuning in. If there's anybody you'd like to hear tell their product story on Framework, or if you'd like to tell a product story of your own, please do get in touch. All the contact details can be found on our website.