I harbor no illusion that the current state of artificial intelligence and machine learning is advanced enough to be an effective substitute for a talented, experienced, passionate project manager. For me, a more interesting question is whether these techniques can be used as a tool to help improve success rates within the project management discipline. Could artificial intelligence and machine learning identify the most effective PMs on your staff? Can professional development dollars have a greater impact when artificial intelligence helps inform investment? Moreover, can AI techniques help pair PMs with the projects they are the most likely to be successful at delivering? Here’s the story of how we looked to answer a few of these questions with one of our clients.
Now, I love a good caper movie. The planning of a heist in clever detail, the improvisation when things inevitably go awry, and the gradual gelling of an improbable team from a set of disparate personalities. Often, those personalities are rooted in each individual's role within the team: the con artist, the forger, the safecracker, the heavy. I thought it might be fun to use this analogy to explore the world of project management and how we used artificial intelligence and machine learning to gain some insight into it. This post introduces the tools we used, the AI-driven analysis, and the five profiles of project managers that we identified.
I was recently talking with a client who uses our Professional Services Automation product, Projector. Our conversation turned to some of the advantages of tracking historical time in the same system used to forecast projected resource schedules. In this organization, each project manager is responsible for identifying the staffing needs for their projects. As we began to analyze the data, we realized that each PM’s resourcing forecast, as compared to the work their team actually did, was quite unique. So unique, in fact, that we found we could identify individual project managers by looking at their forecasting “fingerprints.”
From there, we thought it would be interesting to group these unique patterns into five different base profiles. To make the analysis of these fingerprints less manual, we turned to some artificial intelligence mechanisms to crunch the numbers. In particular, we taught a neural network-based machine learning algorithm to analyze the fingerprints and identify the base profile that best described each PM.
Delivery managers can use any of a myriad of metrics to measure how successful their PMs are. The data we looked at in this instance was how accurately and consistently each PM forecast their projects. For several months, we studied how each PM projected their projects' resourcing needs up to 12 weeks into the future. Once those weeks passed, we compared the forecast against the actual hours that the project teams reported in Projector.
Using some basic statistical analysis, we produced a series of graphs showing the average number of actual hours reported (the solid white line) as compared to the baseline of the PM’s forecasts (the solid yellow line). We also plotted the variation (the dotted white lines, representing the ±1σ and ±2σ bands) to get a sense of how tight each PM’s projections were. Finally, we compared actuals against projections one week into the future, two weeks into the future, all the way out to 12 weeks into the future, and came up with something like this:
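The statistics behind those graphs are straightforward to reproduce. Here is a minimal sketch, assuming a hypothetical table with one row per forecast: the PM, the planning horizon (how many weeks ahead the forecast was made), the forecast hours, and the actual hours later reported. The column names and toy numbers are illustrative, not the client's real data.

```python
import pandas as pd

# Hypothetical input: one row per (pm, horizon) forecast, pairing the
# hours a PM projected `horizon` weeks ahead with the hours the team
# actually reported once that week arrived.
records = pd.DataFrame({
    "pm":       ["alice"] * 6,
    "horizon":  [1, 1, 1, 12, 12, 12],        # weeks into the future
    "forecast": [100, 80, 120, 100, 90, 110],
    "actual":   [ 98, 85, 118, 140, 60, 150],
})

# Express actuals relative to the forecast baseline, so the forecast
# itself plots as a flat line at zero (the solid yellow line).
records["delta"] = records["actual"] - records["forecast"]

# Per-PM, per-horizon mean (the solid white line) and standard
# deviation of that delta.
stats = (records.groupby(["pm", "horizon"])["delta"]
                .agg(["mean", "std"])
                .reset_index())

# The dotted bands: mean ± 1σ and mean ± 2σ at each horizon.
for k in (1, 2):
    stats[f"upper_{k}s"] = stats["mean"] + k * stats["std"]
    stats[f"lower_{k}s"] = stats["mean"] - k * stats["std"]

print(stats)
```

With data like this, a tight near-term band that widens at longer horizons would look like the Deliverer graphs described below, while a persistently negative mean would look like a Hoarder's.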
We then took a look at a handful of these graphs and grouped similar-looking ones into different categories. This became the training set that we fed into our machine learning mechanism. We then turned that neural network-based algorithm loose on a wider data set to categorize all the remaining project managers.
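The classification step can be sketched in a few lines. This is not the client's actual model; it is a toy illustration using scikit-learn's MLPClassifier, assuming each PM's fingerprint has been flattened into a feature vector of per-horizon mean and standard deviation values, with synthetic data standing in for the hand-labeled graphs.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

PROFILES = ["Deliverer", "Hoarder", "Optimist", "Controller", "Fixer"]

def synthetic_fingerprint(profile):
    """Flatten a 12-horizon (mean delta, std delta) curve into a
    24-element feature vector. The shapes loosely mimic the profiles
    described in the post; the numbers themselves are invented."""
    horizons = np.arange(1, 13)
    if profile == "Hoarder":        # actuals consistently below forecast
        mean = -20 + rng.normal(0, 2, 12)
        std = 25 + rng.normal(0, 2, 12)
    elif profile == "Optimist":     # actuals consistently above forecast
        mean = 15 + rng.normal(0, 2, 12)
        std = 25 + rng.normal(0, 2, 12)
    elif profile == "Controller":   # "always perfect" forecasts
        mean = rng.normal(0, 0.1, 12)
        std = np.abs(rng.normal(0, 0.1, 12))
    elif profile == "Fixer":        # huge variation at every horizon
        mean = rng.normal(0, 10, 12)
        std = 60 + rng.normal(0, 5, 12)
    else:                           # Deliverer: tight, widening late
        mean = rng.normal(0, 1, 12)
        std = 5 + horizons * 0.8 + rng.normal(0, 1, 12)
    return np.concatenate([mean, std])

# Hand-labeled training set: a few fingerprints per profile.
X = np.array([synthetic_fingerprint(p) for p in PROFILES for _ in range(30)])
y = np.array([p for p in PROFILES for _ in range(30)])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Turn the trained network loose on a new, unlabeled PM's fingerprint.
new_pm = synthetic_fingerprint("Hoarder").reshape(1, -1)
print(clf.predict(new_pm))
```

The real exercise worked the same way in outline: label a handful of fingerprints by hand, train on them, then classify the rest of the PM population automatically.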
We named the first profile we looked at the Deliverers. These PMs were the bread and butter of the organization—diligent, thoughtful, realistic planners who kept their resourcing needs up to date. They had good visibility into what their team needed to work on, especially on a six- to eight-week planning horizon. Deliverers were also eminently capable of dealing with the inevitable surprises as they came up.
Deliverers often were not working on the highest risk, most complex, or longest running projects. The short-term nature of their projects led to an increase in variability in the 11-12 week planning horizon. As such, they were largely responsible for the short- to mid-term predictability that the organization was looking for.
Our second profile emerged quickly. In this profile, the white line (average actual hours) consistently sat below the yellow line (projected hours) at all planning horizons. We also saw a wide variation that didn’t change much, whether in the near term or far in the future (the dotted white lines). Looking closer at this profile, we identified PMs who were less experienced than the Deliverer cohort. We characterized this set of project managers as Hoarders. Hoarders were a little less adept at risk identification and risk mitigation. As a result, they were more prone to letting surprises throw off their planning and execution. This, in turn, led to the large variation.
On the plus side, Hoarders had a sense of the impact unidentified and unmanaged risks could have on their projects. As such, they tended to be overly conservative when defining resourcing needs. This showed up as the consistent difference between average actual versus projected hours.
This tendency to project higher resourcing needs than what was actually needed helped ensure successful delivery of projects managed by Hoarders. However, this conservatism wreaked havoc with the organization’s overall utilization. Resources spoken for by Hoarders “just in case” appeared unavailable to work on other projects. They then ended up underutilized when those worst-case scenarios didn’t play out.
Our third profile, the Optimists, tended, like the Hoarders, to be less experienced than the Deliverers. This resulted in some of the same wide variations we saw with Hoarders. Optimists, however, tended to use more resources than they planned because, unlike Hoarders, they tended not to anticipate risks at all. Rather, they assumed that everything would go according to plan. This meant that their teams constantly found themselves in unplanned firefighting mode.
What is ironic with Optimists is that many organizations help them to develop professionally by providing tools, methodologies, and training to help identify and surface risks. As they develop better risk identification skills, however, Optimists are in danger of turning into Hoarders unless they simultaneously develop risk mitigation and management capabilities.
So, as we’ve seen with some of the previous profiles, too much variation and too large a consistent difference between projections and actuals is a bad thing. That must mean that zero variation and zero difference is good, right?
Maybe not. Meet the Controller. The Controller’s data was unusual, to say the least. Her projections were always perfect. Her teams always reported hours exactly equal to what she predicted. Her projects were never over budget (and never under budget, for that matter).
While there was some speculation about whether she was truly psychic, after some further research, we found out that she had a unique way of managing her teams. She sent out a weekly email to each team member with the number of hours that person was scheduled to work on her projects for the week. The email was, of course, conveniently timed to arrive just before the week’s time entry reporting deadline. No wonder she was never over budget. But, she (and the organization as a whole) was also prevented from learning anything about better estimation, delivery, or risk management techniques. Her project management style was effectively encouraging her teams to lie about what work they truly did.
Finally, we came upon a profile characterized by huge variation that, to me, spoke to an inability to forecast resourcing needs with any level of accuracy. My first instinct was to recommend remedial estimation, planning, and risk management training. I also considered suggesting additional domain familiarization to help them better understand the projects they were managing.
Turns out, these were already the best trained, most thoroughly ramped, most highly experienced PMs in the company. They were the managers that the organization pointed at projects that were already in trouble. They were the veteran PMs expected to help delivery teams get themselves out of a tight spot. They were the people who knew how to roll up their sleeves, get their hands dirty, and get a project back on track. They were the Fixers. Rather than expecting to be put on personal performance improvement plans, Fixers were often the PMs who could expect the largest bonuses at the end of the year.
So, what's the bottom line on using artificial intelligence to identify effective project managers? Obviously, the organization should be looking at much more than these forecasting accuracy profiles to make hiring, firing, staffing, bonusing, and professional development decisions.
What this exercise did do, however, was point to some of the real benefits that arise from managing time tracking and resource scheduling in the same system. These profiles, along with the artificial intelligence and machine learning categorization models used to characterize individual PMs, may help identify where training is needed or where particular expertise is hidden. Incorporated into the right projection models, these profiles may even help improve revenue forecasting by taking into account natural variability.
Finally, the approach also pointed out some of the potential pitfalls of relying solely on data without having a deep understanding of the business, the people, and the culture. In particular, it highlights how much of a human endeavor the services industry is and how managing a professional services business is as much an art as it is a science. At the end of the day, achieving the right outcome is, much like your favorite caper movie, a combination of having the right plan, the right tools, and the right team.
If you’re interested in learning more about PSA software for services organizations, take a look at our recently published eBook, Professional Services Automation: A Quick Primer. In it, you’ll find additional information about how Professional Services Automation solutions can improve the performance of a services organization. You’ll also see information about some of the decision points you’ll face when selecting a PSA tool, some of the trends we’re seeing in the PSA market, and much more.