Predicting Professional Services Variability with PSA Software

Categories: How To, Professional Services, Projector, Tutorials

One of the reasons I love doing what I do is that it gives me the opportunity to work with a huge variety of professional services firms, from startups employing just a couple of founders to larger organizations with thousands of consultants scattered around the globe. No matter where they sit along the maturity curve, each organization faces its own challenges, ranging from the basics of tracking and invoicing time and expenses, to estimating revenue based on backlog, to balancing the supply and demand for resources. To address these challenges, we built Projector, a Professional Services Automation (PSA) solution, and what’s fun about my job is watching organizations put Projector to work on those problems. What’s even more fun is when an organization starts to push the boundaries of what we originally designed Projector for, and we discover together that the system can handle that need and more.

I happened to be talking with one such organization about how they could solve the problem of understanding how variability affected their revenue projections. They were already using a disciplined approach of using scheduled hours, bill rates, and resources in Projector to project utilization, revenue, and profitability, but they wanted to take things to the next level. They knew that their projections for next month were typically more accurate than projections for six months into the future, and that the additional work they were likely to win tended to push the further-out projections higher rather than lower. They also wanted to understand whether they were getting better or worse at projecting revenue as their processes matured and their team gained experience. It turns out that Projector has all the information needed to provide this perspective, as well as the ability to visualize the data concisely enough to make it understandable, as long as you think about things the right way.

To get started, we first took historical snapshots of revenue projected into the future and watched as those projections tightened as the periods drew closer. After gathering a decent sample set and comparing those projections against the revenue that actually materialized, we were able to develop a forecast variance model that represented the organization’s current revenue projection capabilities:

From this, we could see that, on average, actual revenue ended up a few percentage points lower than what was projected a month or two in advance, while current forecasts represented only a fraction of the actual revenue that would eventually be realized several months out. We could also see, as expected, that the variance grew the further projections extended into the future. With sufficient data, we could even estimate 68% (±1σ) and 95% (±2σ) confidence intervals, leading to the above graph that looks suspiciously like StormTracker 2000.
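The mechanics of a model like this are simpler than they might sound. Here is a minimal sketch of the idea in Python, using entirely made-up snapshot numbers (real data would come from periodically archived Projector projections, and this is not the Advanced Analytics Module itself, just the underlying arithmetic): for each forecast horizon, take the ratio of actual revenue to projected revenue across historical snapshots, then use the mean and standard deviation of those ratios as the variance model.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical historical snapshots: (months_ahead, projected_$, actual_$).
# Illustrative values only, chosen to mirror the pattern described above:
# near-term forecasts run slightly high; far-out forecasts run well low.
snapshots = [
    (1, 950_000, 930_000), (1, 1_020_000, 990_000), (1, 880_000, 870_000),
    (3, 800_000, 890_000), (3, 750_000, 810_000), (3, 820_000, 925_000),
    (6, 600_000, 780_000), (6, 550_000, 650_000), (6, 640_000, 830_000),
]

def variance_model(snaps):
    """For each horizon, return (mean, std dev) of actual/projected ratios."""
    by_horizon = defaultdict(list)
    for months, projected, actual in snaps:
        by_horizon[months].append(actual / projected)
    return {m: (mean(r), stdev(r)) for m, r in sorted(by_horizon.items())}

model = variance_model(snapshots)
for months, (mu, sigma) in model.items():
    print(f"{months} mo out: actuals averaged x{mu:.2f} of projection, "
          f"~68% band x{mu - sigma:.2f}-x{mu + sigma:.2f}, "   # ±1σ
          f"~95% band x{mu - 2*sigma:.2f}-x{mu + 2*sigma:.2f}")  # ±2σ
```

Plotting those widening bands per horizon is what produces the StormTracker-style cone.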

With this forecast variance behavior modeled in Projector’s Advanced Analytics Module, we could then provide a dashboard that showed what the projected revenue range would likely be within a certain confidence interval. This forecast was based on what hours, rates, and resources were scheduled within Projector at any point in time plus the variance model:

As such, the organization knew that even though Projector only “knew” about $800K in revenue projected six months out, past performance suggested the actual revenue would likely be closer to $1MM. Based on the forecast variance model, the management team could also easily see the range the revenue was likely to fall within, and the level of statistical confidence behind that projection. Finally, the organization could recalibrate those confidence intervals as often as they wished to see whether their forecasting processes were getting better or worse, and automatically apply the revised variance model to their revenue dashboards.
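Applying a calibrated variance model to the dashboard is then a matter of multiplying the scheduled revenue by the historical ratio for that horizon and widening the result by the appropriate number of standard deviations. A sketch, again with illustrative multipliers rather than anything derived from real data:

```python
# Hypothetical calibrated model: months_ahead -> (mean actual/projected
# ratio, std dev of that ratio). Illustrative values only.
VARIANCE_MODEL = {1: (0.98, 0.01), 3: (1.11, 0.03), 6: (1.25, 0.07)}

def expected_revenue(scheduled, months_ahead, model=VARIANCE_MODEL):
    """Adjust scheduled revenue by the historical ratio for this horizon.

    Returns (expected, low, high), where low/high bound a ~95% (±2σ)
    confidence interval.
    """
    mu, sigma = model[months_ahead]
    return (scheduled * mu,
            scheduled * (mu - 2 * sigma),
            scheduled * (mu + 2 * sigma))

exp, lo, hi = expected_revenue(800_000, 6)
print(f"Scheduled $800K six months out -> expected ${exp:,.0f} "
      f"(~95% CI ${lo:,.0f} to ${hi:,.0f})")
```

With a 1.25× six-month multiplier, $800K of scheduled revenue lands at the $1MM figure described above, with the ±2σ band giving management the range around it.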

OK, I will be the first to admit that the discussion above is a very numbers-driven approach that may appeal to me (because I’m a bit of a geek) more than it does to you. It’s not for everyone. My point in describing this variance forecasting process is not that everyone should rush out immediately and start applying stochastic modeling techniques to their projections or running Monte Carlo simulations to predict whether they’ll achieve their revenue targets. Rather, the point is that high-achieving organizations are always looking for ways to improve their performance, and those methods will vary widely based upon what the firm is doing today and what it’s looking to achieve.

You may be doing a great job today with capturing time and expenses and invoicing clients, but are you able to predict the future? You may have started to calculate SWAGs of future revenue based on backlog or remaining budgets, but how about the next step of projecting how you’re going to meet that demand? You may be doing a fantastic job of deterministically projecting revenue and utilization, but how about taking things to the next level by starting to factor in predictable variance?

Which, of course, brings up three questions. Where are you headed? Will your PSA be able to keep up with you as you go there? Will your PSA partner actually help lead the way? Something to think about, especially when the only certainty is variability.
