
Staff First Technology Approach Webinar Recording

Webinar Transcript

Please note, some of the webinar has been edited for clarity and time. 

Just a little bit about Now It Matters. We are an SI partner with Salesforce. We've been helping nonprofits with their technology needs for the last 10 years, and we've been implementing all versions of Salesforce for all sorts of nonprofits. We're very success-focused, we're mission-driven, and we're industry experts.


 

My name is Tim Lockie. I'm the CEO-founder-janitor here at Now It Matters. I started Now It Matters 10 years ago, and I've been running it since, or it's been running me, depending on the day. I have a degree in economics focused on econometric modeling, which is where I really grew to appreciate the value and importance of data. I have 27 years of nonprofit experience, which is just incredible to believe, but it's true. I'm a Salesforce MVP and have four Salesforce certifications. I'm also a dad and husband, and you've got my email there if you want to reach out.

What we're going to be talking about today is that we need another way to do the kind of work that we've been doing. We're going to talk about "Is it usually this hard?", compare this technology to riding a bike, and then talk a lot about change capacity and disruption. We'll discuss more tortoise, less hare, and the process behind that, and then we'll get a little bit practical at the end.

 

 

So we want to start by saying there's another way, because technology has been rolling out, as an implementation process, in this general format, and I've been doing this for 10 years. I had to learn it when I first started. I learned the value of each of these components and grew to appreciate their importance. I think almost all of us have seen this kind of diagram before: the kickoff, discovery, the must-haves, user acceptance testing, you put the legacy data in, you have a training, and it goes to support when you get to the things you forgot. And then you backfill data, and on and on, and do another phase. This is the general rollout, and it's remained largely unchanged.

A graph, created by Salesforce.org, which shows the reasons why implementations fail to meet objectives. The majority are people-related.

But when I start talking with new clients, there's something in the back of my mind that is troubling to me, which is that well over half of all implementations fail to meet objectives. We grabbed this slide from Salesforce.org, and it has been rolling around for a long time. I started to be more and more bothered by this, so I just want to park here for a minute and think about it with you. On this failure to meet objectives, the blue note tells you that the items in red are people-related. So resistance by employees: 82%. That means that 82% of the projects are met with resistance by employees. 72% are met with inadequate sponsorship, 65% with unrealistic expectations. And then when you start looking at the blue items, poor project management is almost always a people-related issue, not just a technical issue, so I sort of feel like we should count project management. Business case not compelling? That's fine. But the project team lacking the right skills, that's a people-related issue as well. There's also scope expansion uncertainty, which is technical. No organizational change plan is very clearly people-related. No horizontal process view, that's people-related. And then IT perspective not integrated: if you dig into why it's not integrated, it's also almost always a people or process issue, not usually a technology issue. Which means that over half of the time, these sorts of projects don't meet expectations. They fail to meet expectations or objectives.

I think that's significant because when we're working with our clients and our nonprofits, this is often one of the largest, if not the largest, expenses they've ever incurred. Switching to a different platform, doing a large implementation, even some small implementations are still one of the largest items on that year's budget, or that they've ever done. And so when I'm working with nonprofits and our clients, and I'm telling them all about how wonderful this technology will be, I believe it, and we've seen it, and this is definitely not the ratio of our implementations; it's not like Now It Matters implements at a 70% failure rate. But it is still there in the water that these implementations often just fail to meet objectives. And last year, we started to rethink: why is this, and does it need to be this way? We started to think about the question that clients often ask us, which is: is it usually this hard? What is usual? They're usually asking this across the other clients that we've worked with: is it usually this difficult? And we've learned to say: it doesn't matter.

For our clients, the real question is: what's usual for you? Are you experienced as an organization at rolling out technology? Or are you more of a beginner? Being a beginner is nothing to be embarrassed about. That's fine. It's just a spectrum. There are organizations that have done more of this and organizations that have done less of this. So the "usual" you should be asking about is not what happens with other organizations, but how often you do this.

I started to relate this to biking. When you think about an experienced biker, they're able to think about the future; they can think about speed and steering and brakes and obstacles. So in this photo, I'm wondering if Little Dude is going to run in front of my wife Jenny, and I'm also wondering if Mozzie, which is the little dog, is going to bite Little Dude's ankles, because that's what Mozzie does anytime Little Dude runs in front of her. And you're probably all wondering why the big dog's name is Little Dude, which is a great question and a story for another time.
All that to say that experienced bikers think about the future, and they can think about other things. An experienced organization that has rolled out technology multiple times is also able to think about the future and think about other things. They can think about scale. They can think about how this will apply across multiple sites they may have. They have a CIO and probably a dedicated team, which means that this is the work that team will do; it's what they were hired to do, and they know how to do it. There's probably an established process in place that you can pick up, put in another spot, and change the names. And basically that process, whether it's an approval process, a way to move things forward, or a decision-making process, is migratable once it's established. There's specialization, which means that there is staff that does not do this and staff that does this kind of work. And there's probably a baseline amount of revenue supporting this type of work, creating the margin for it to happen.

But if you're an inexperienced biker, then you think about very different things. You think about what is happening right now; you're focused on the now and you're focused on not falling over. You want to keep your balance. So you think about things like momentum and steering and pedaling, and you're not focused down the road. You're just focused on staying upright. I'd forgotten about this until I started thinking about this slide and looking at this kid: you have to have somebody else give you a push, because that momentum starts with somebody else when you're first starting out. It comes from pedaling after that, but that initial momentum needs to come from someone else. And I think that's important. You just forget these things when you become more experienced. So an organization that is less experienced in rolling out software will experience things like this: there is no CIO. In fact, we've worked with organizations who need to be told that CIO stands for Chief Information Officer, and that's fine and nothing to be embarrassed about. It's not their world and not what they've needed before now; it just means that they are less experienced. All of the work that needs to be done to implement this software is going to be added to the existing duties of staff who are probably already busy, and on a shoestring budget. The data are probably incomplete. The data will very likely come from multiple sources, and it will have to be processed. Nancy in accounting has been guarding that general ledger software from day one, and nobody has access to it, so it hasn't been integrated with any of the fundraising information. Those data are incomplete, and you'll have to match them up and look for duplicates and all that. And then there's a low individual ROI. For each person working on the project, getting it rolled out is added to their duties, and it also isn't quite making sense to them yet. So they're investing a lot of time from their already busy schedule to roll out software that does not help them do their job. All it does is create more problems for them. And because of all of that, it's hard to see how the software really will help in the long run. The marketing slides look great, but three months in, when you're looking at a lot of data, it's hard to feel like, "Oh, this is going to work, we're going to pull this off." So those are very different experiences that organizations with less experience will have when they're thinking about rolling out software.

Thinking about bikes, we maybe start to think: we have been focused so much on the technology piece that we may have forgotten to think about the riding itself, which is the main thing we probably should be focusing on. Given that most technology implementations fail because of issues that are less about the technology and more about the people using it, it's kind of like saying we should be teaching people how to ride bikes instead of focusing on how to assemble bikes, because when you know how to ride, you can go wherever you want. So in this analogy, we think of the riding as knowing how to use technology, and the bike as the technology itself.

We see this a lot of times when we encounter ideas that add up to a statement like this: "There's nothing wrong with the system, except it doesn't work." Sign-off happened, user acceptance testing happened. When everything flowed along a certain set of conditions, it all worked well. But when they started to take their bikes off-road a little bit, suddenly it kind of all fell apart. And the ways that this happens can be really subtle, but really important.

 

Examples of this are things like an executive director who needs a dashboard or a report on a recurring basis to show either to funders or to the board, but because of issues in the data, it has to be reviewed for errors three or four times. So even though this is a recurring need every quarter, you still have to pull all the data out of the system, put it into spreadsheets, and have someone review it to make sure that you've got all the right names on there and that anonymous donors remain anonymous. And that happens every time. So the system's working, but it's sort of also not. Or the value of gratitude isn't reflected because of a small process error that doesn't send a thank-you note, or a task to create a thank-you letter, to the staff who should be doing that. Again, the system works, and it sort of doesn't. Or program staff take attendance for a class and need to see that information the next day to know, "Okay, who was here?" and plan some of what they will be doing the next class. But the data input happens in the back office by someone entering that data, and that only happens once a week, so by the time you've got that data in, it's too late to really be any good. These are the sorts of issues that cause users to feel like there's something wrong with the system: "It doesn't really help me. It's not really doing what it's supposed to do." I think you can read through all of these and see the parallels, so I'll skip them. But the last one here, I think, is worth noting: the admin, who really wants to do their best work and make sure that users can use everything the system has to offer, doesn't know what's not working. There may be really simple changes they could make, changes that are fast and easy, or maybe even a little bit involved but still worth doing, if they knew about them. But trying to get that information to the admin can be really challenging. So these are all ways you can have a system where there's nothing wrong, but it still doesn't work.

And that leads to organizations thinking, we know we need to change, but we don't know where to start. Oftentimes, honestly, the next step is to say the issue is with our system, and so we need to change our system, which is when they get in contact with us or Salesforce and start to say, okay, we should change our database, or we should upgrade something in our database, or we should install a new app in our database. And maybe that's important, maybe that's good. But there needs to be some kind of change that does not necessarily involve massive technology change, because maybe the problem isn't the technology. I was trying to explain this to my mom and was using a lot of this language, which did not make sense to her, and she let me know it did not make sense. I know that until I've got it framed in a way that my mom really understands, I've got more work to do. So finally it came back to where I said: Mom, imagine that you've got a membership to a gym, but you're not seeing results, and so you decide that you want to switch gyms. Now, the issue here is that if you're not using gym A, you're probably also not going to be using gym B. The weights are probably pretty much the same, the equipment's probably pretty much the same. The issue isn't the gym; the issue is that your habits around that gym aren't going to support the kind of habits you need to make that change and see results. So if you think about the technology as the gym and the apps as the equipment, one thing to say is that if teams don't have the right behavior around technology, they'll just be swapping gym A for gym B. Those will just be different CRMs, only the difference is that it'll be a several-hundred-thousand-dollar transition instead of just a new gym membership, and that is actually really problematic. So we think that instead of framing the problem as a technology issue, we need to frame it as a staffing issue or a change issue. We started to look at the idea of change saturation, and I think we kind of backed into it, but it led us into a really interesting way of understanding change. Change saturation is the ratio between the amount of disruption and the amount of capacity for change.

So if you think about the amount of capacity that an organization has for change as a coffee cup, and you think about the amount of change disruption as coffee, then change saturation is going to happen when you've got more coffee than cup, as can be evidenced here. Now, change disruption does not have to come from the technology. It can be in people's personal lives. It can be because of a global pandemic, or it can be the combination of those, where the global pandemic is causing people to be doing homeschooling at home instead of the jobs they were expecting to do. That is just disruption. Any kind of change creates disruption, and if the disruption exceeds the capacity for that change, then you are going to run into issues with change saturation. When I started, we did not think about how to frame technology change and implementing technology in a way that pays attention to change saturation for organizations.

And so, if we were, we would look at something like this. If an organization has a high change saturation point, it means they can handle a lot of change. This is kind of a typical disruption line: you need to do something new, at first there's some disruption, and then it levels out. And if the capacity for change is higher than the disruption, then you have a high change saturation point.

 

And if you are in the opposite situation, where your disruption increases and exceeds your capacity for change, you have a lower saturation point. Of course, there's no real math in this, except that you start to observe that people get really frustrated because of the disruption involved.

 

 

 

And we've observed this in the 10 years that we've been doing this, and we observe it because people sometimes say opposite things about the same project. We hear that something's moving too fast, and we sometimes hear that things are moving too slowly.

 

 

So we take that same disruption line and put it into a project. Think about development and program staff who are having to work with the project stakeholders to implement technology. It's added to their duties; they have more meetings that are taking time away from other parts of their work. They're having to sort through data questions, be involved in discovery calls, and do user acceptance testing. All of these create a high amount of disruption. Now, in the long run, these changes are going to decrease over time, they'll balance out again, and the benefits will be really significant, and nobody's really arguing that.

But what happens is, if you're talking with a board or the executive team, what you can find is that the implementation line on the map, the quantity of technology implemented over time, is relatively small for the amount of work that's been done. So if you're on the board, you're tapping your foot saying, why is this taking so long? We're only 25% through phase one, we've barely done any of this, and we've got a lot of other stuff pending on it. What can be invisible is the cost to staff because of the disruption that's being introduced. This is how you end up with really, really different expectations and different experiences. As an implementer, I was never really taught to think about what a project is doing in terms of disruption.

And in fact, it looks something like this: you start with the technology, you do the kickoff, the discovery, the must-haves; we've already gone through all of that. This is the moment where the staff start to use the new system. This is the very tech-first approach to rolling out software. So if you have a 700-hour project and then you take two days for training, the amount of training relative to the amount of project is really, really small, and you will for sure end up with a low change saturation point and people feeling really overwhelmed, because they haven't really engaged with the technology. And that's actually part of the process: it's technology first, and the staff will get it later.

But we think that there's a better way to do this. We think that we can actually introduce a methodology of implementing that has very different outcomes. So imagine if, 12 months from now, instead of going through that process, the same organization were steering tactical work done by an admin, so they could direct the work that admin was doing. Imagine that they had confidence in the error records they were targeting: they could identify error records, target them for correction, and know that they were getting fixed over time. Imagine if, in 12 months, they had the ability to redirect technology toward new initiatives, so instead of starting with a kickoff or an RFP, they started with a different frame of reference and approached those new initiatives with a lot more confidence and clearer expectations. Imagine if there were a defined roadmap of changes with timelines and expectations, so that staff could see what was going on. Imagine if, in 12 months, you had engaged staff, not just your admins but all staff, who know the process for submitting feedback and believe that if they submit feedback, it will lead to change. And then lastly, imagine measurable KPIs with planned and actual results. Now, this is in the existing software. This is not rolling out new software. This is the existing system that they have doing these things. We think this is possible, and we think this is the right approach to take. We think that if you start with a staff-first approach, you'll end up with these results, and you'll end up in a position where you can decide what's next and decide on the timing to roll out an implementation if that's needed. But you aren't starting with the statement that we need new technology. You start with the statement that we need change, and that change needs to be behavioral change that we're making for our staff and with our staff in their current software. And then, if you do need a different system, that's fine, you can migrate all of the ways that you're currently behaving with that technology.

So we've created five steps that we take to help organizations do this. We have a much more fully laid-out process, but it boils down to these essential things. The first thing we do is slow it down: we slow down everybody's expectations and say, look, we want to win in the long run, not the short run. Then we simplify complaining, assemble a team, create a schedule, and measure success. I'm listing these because I realized I actually have a slide for each of them, so let's talk about them.

So the first one is to slow it down. We do think we want more tortoise, less hare. I think about the tortoise and the hare a lot, because I am completely a hare. I just run from thing to thing and, you know, lose the race, but I'll be running toward quick wins all over the place. We want to slow it down and nail a long-term win rather than short-term quick wins. And then we want to simplify complaining, but one other thing to say about slowing down first: the purpose of slowing it down is so that the disruption doesn't hit change saturation levels that are too high for staff to absorb. Basically, we want to say we're going to make changes in a more controlled, less disruptive manner.

The second thing we do is simplify complaining: we make it really, really easy to complain. We think that complaining about systems is a source of hope, that it is currency for change management when people start complaining. Now, I know there's constructive criticism and all of that, and I'm really not getting into that. What we're trying to say is that every time someone tells you what is not working, that is a gift, because you can say, okay, we have an option to change. And the surprising thing is that two things happen with people who complain about the system. One is they feel better; there's just something about venting that's helpful. The second is that your team doesn't actually have to fix it for it to be effective and to draw more complaining out from people, and maybe I should say feedback, to get more feedback from your end users. Really, what that means is that if you can convince people that when they raise their hand and say something's not working well, it goes on the list and it will be evaluated, that is enough to make people feel like, okay, this is worth doing. So you don't necessarily have to fix it right away, but you do have to communicate and show them that it's on the list.

The next one is to assemble a team. I think there are a lot of different ways to think about this, and I started to get very detailed. But I actually think, at the end of the day, you probably know best for your organization what kind of a team to assemble. This could be your power users or your influencer users or your stakeholders; there are lots of different ways to talk about it. You basically want some cross-section of people who are using the system, and you want to create a space for them to talk about what's working in the system and to know what changes are happening. That's really the magic there. How you assemble it and who's on it, I think there's a lot of nuance to that, and that decision is important. But I also feel like it's so contextual that in your organization it will probably be apparent who should be on that team. It also doesn't need to be earth-shattering. That's the whole point of taking your time and slowing it down. You can start with two or three users who are frustrated about something and work with them. And then when that's resolved, maybe they stay on the team or maybe they don't. But having a team for whom this is why they're getting together, and this is what they're looking at, makes a really, really significant difference.

The next thing on here is to create a schedule. The schedule is for a couple of things. One is to say when that team is going to meet. Another is to identify at what frequency, and with what expectations, you should expect to see changes. If you want to say, in a couple of months we're going to make this change, then you've got a place to put that, because you've got a schedule. When you create a schedule, you'll actually start to naturally form a roadmap from it. I'm not getting into how to form a roadmap or how to talk about that; it will probably just start to emerge. But you do have to create a schedule so that people know what to expect. That's kind of the key here: to help manage expectations so that staff feel like they have some say in what's happening.

And then the last one, which is really easy to forgo, is to measure success. It will not feel like it's worth it at the time, but do it, and you'll find that it is the thing that actually generates a lot of the excitement to keep on doing this. What I mean by measuring success is actually pretty small things. It can be how many cases were created from people submitting feedback. It can be how many records used to not have an address and now have one. The point here is that you need to track and write down what this team is trying to accomplish, and be able to say we did or we didn't accomplish it. It's kind of like assembling your team: there's nuance to it, there's importance to it, and it really does need to be done. I thought about pulling up a slide with SMART goals and everything that's included in those, but actually it's one of those things where, if you start doing it and keep doing it, a lot will emerge around it, and it'll probably start good enough and get better over time. So we think it's important to measure that success.

That's it. Those are the five simple things that we recommend. Of course, these aren't the only things it will take. But one thing to note here is that these are things your team can do in an existing system, which means you're able to do this without an RFP, without a new contract of some kind. This is in the system that you currently have. You'll also notice I've never said the word Salesforce. You can do this in whatever system you're working in, because it's staff first and technology second. The important thing here is to recognize what it looks like to focus on helping your staff get to a new level. That's not dependent on technology, although you'll use technology to accomplish it, and I'm not taking the time to spell that piece out. But these are the five things: slow it down; simplify the complaining, so people can complain in less than 30 seconds; assemble a team that will work on this; create a schedule for both the work that you're going to do and when that team will get together; and then make sure to measure what you're trying to accomplish and whether you accomplished it or not. Maybe I should say measure success and failure; it's actually really important to measure both of those.

We've turned this into a service ourselves that we call Guidance, and we've learned a lot about guidance by doing this. One thing I will say is that there are teams out there that may get started on this and may not be able to do it by themselves, and they need someone else to work with them. As an implementation partner, one of the shifts we've made is to say we focus on staff first, technology second, which means that unless we are working with a team around making sure we're creating momentum, steering, and engagement for their team, we're not going to do a big build there, we're not going to do an implementation for them, because it doesn't make sense for us to just do the implementation. The point isn't the technology. The point is the staff using that technology, and creating this momentum, steering, and engagement will also support the builds. All of this down here is just the things that we were doing before, the things that every implementation partner is doing, you know, the discovery and the builds, and of course that needs to be done well and should be done, so you're not hitting a 70% failure rate on implementations. But that alone isn't enough. Really, the goal here is to transform how staff are engaging with technology and the difference it's making for them.

And so what we've found is that we take 12 months and we teach users how to submit feedback; you should see a lot of similarity here. We work with the admins to make sure that they're closing cases and cleaning up targeted data. We work with stakeholders to provide strategic oversight on what's happening. And then we measure goals and KPIs, whether those are set and whether they're met. I will say it is a complex process that requires a lot of nuance, but it's not rocket science. It is possible to attain, and if you're finding that you can't, we'd love for you to reach out and talk with us about what that looks like. But if you can do it without us and you can make this kind of transformation with technology inside of your own system, we think that's great. And if you work on these things before you start an implementation project, if you know that you're going to be working in a new system and you're going to be making that transition, all of these behaviors will help immensely. And you'll have started them before you're in the new technology, which means that you will decrease the disruption that your staff are experiencing.