Planning to Win vs. Wondering Why You Lost

Did you ever have a job that went sideways and bled cash like a stuck pig? Wonder why all your technicians appear busy yet the P&L doesn’t reflect it? Most of the time, we use the rearview mirror to view the world in 20/20 hindsight. The only problem is that measuring lagging indicators like actual job hours and performance after the fact will always yield disappointing results unless we also define and focus on leading indicators.

What’s the difference between a leading and a lagging indicator? Lagging indicators tell us how the game went, and they tend to be the easiest to pick out (think Monday morning quarterbacking). The score is on the scoreboard, and there’s no way to affect the outcome. Leading indicators, on the other hand, are harder to identify and require much more discipline to measure and refine before game time. In football parlance, your 40-yard dash time and bench press max are good leading indicators of future performance. In Home Technology Professional (HTP) terms, a good interview, a personality assessment, on-the-job training, and peer feedback can serve the same purpose.

Let’s say we define leading indicators in the HTP world as personality assessment results (DISC, Myers-Briggs, etc.), peer feedback, and overall cultural fit. Do those sound as easy to measure as a 40 time or a bench max? Heck no! It’s no wonder we’re constantly dashing ourselves against the rocks, drawn by the easy allure of the lagging indicator. Lagging indicators are easy to see, easy to measure, and somewhat helpful in predicting future performance, but an organization driving with the rearview mirror is going to run into a wall sooner or later.

At Livewire, we started implementing a utilization forecast a few weeks ago. The idea is very simple. Our schedule is consistently booked solid two to three weeks out. On the Friday before each week, our project coordinator updates a report showing our forecasted utilization for the coming week. This allows us to schedule more billable hours for technicians to offset the warranty hours associated with our service department. We’re looking for an overall utilization rate of 62.5 percent (25 hours billed out of 40 hours worked). So far, our forecasting has been optimistic: we forecasted 69 percent, 65 percent, and 79 percent for the last three weeks, and the actual results have averaged about 10 percent below forecast due to jobs being rescheduled or other unexpected changes. Seeing this data has been at once enlightening and nauseating. In less than three weeks, we’ve learned to keep a better stable of last-minute work on hand, and we’ve adopted a new policy of sending technicians home if there isn’t work to do (not ideal, and hopefully something we never do).
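To make the arithmetic concrete, here’s a minimal sketch in Python. The 62.5 percent target comes from the paragraph above; the hour counts are hypothetical examples, not our actual schedule data.

    # Minimal sketch of the weekly utilization math (hypothetical hours).
    TARGET = 25 / 40  # 25 billed hours in a 40-hour week = 62.5 percent

    def utilization(billed_hours, worked_hours):
        """Share of hours worked that were billed to clients."""
        return billed_hours / worked_hours if worked_hours else 0.0

    # Friday forecast: hours already on next week's schedule vs. hours on the clock.
    forecast = utilization(billed_hours=27.5, worked_hours=40)  # about 69 percent
    # After the week: what actually got billed once jobs slipped or rescheduled.
    actual = utilization(billed_hours=23.5, worked_hours=40)

    print(f"Forecast {forecast:.1%} | Actual {actual:.1%} | Target {TARGET:.1%}")

Rolled up across every technician, those two numbers (hours billed and hours worked) are all the report really needs.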

The only challenge with the forecasting effort so far is that we haven’t figured out how to automate it. Our current solution (Zoho CRM) does pretty much everything except proposals and decent labor utilization reporting. That’s coming soon, and I can’t wait! We have a TV channel constantly running in our warehouse showing the sales leaderboard (using Zoho Motivator and a Raspberry Pi). Ideally, we’ll get to the point where we have a utilization channel as easy to understand as that sales scoreboard. After all, a high utilization score should be as easy to read as a batting average or any other quick performance statistic. Our industry doesn’t have a metric for measuring installation performance, and I vote we work together to get a utilization score rolled out ASAP. Law firms hire new associates based on hours billed at previous jobs, and football teams hire based on 40 times and bench maxes. Why should we be any different? It shouldn’t be that difficult to get a quick sense of someone’s capability. If we can ask potential sales candidates how much they grew the business at their old jobs, why shouldn’t prospective technicians be able to speak to their own productivity in quantifiable terms?
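For illustration only (the technicians and hour totals below are made up), a utilization score could read just like a batting average:

    # Sketch: a utilization "batting average" per technician (hypothetical data).
    techs = [
        ("Tech A", 1050, 1680),  # name, hours billed, hours worked (year to date)
        ("Tech B", 980, 1680),
    ]

    for name, billed, worked in sorted(techs, key=lambda t: t[1] / t[2], reverse=True):
        print(f"{name}  .{round(1000 * billed / worked):03d}")  # prints .625, .583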

The biggest challenge in getting the utilization score rolled out in a meaningful way is service or warranty work. A technician shouldn’t be penalized because they’re working on jobs that aren’t generating any more revenue for the company. To this end, we created a house account called “Livewire Service” that counts toward their utilization score the same way a billable client would.
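In scorekeeping terms, the rule is simply that hours booked to the house account count as productive time. Here’s a minimal sketch of the tally (job names and hours are hypothetical):

    # Sketch: warranty hours on the house account count like billable hours.
    week = [
        ("Smith Residence", 18.0),   # billable client job
        ("Livewire Service", 6.0),   # warranty work, house account
        ("Unassigned", 8.0),         # bench time, shop cleanup, etc.
    ]

    worked = sum(hours for _, hours in week)
    productive = sum(hours for job, hours in week if job != "Unassigned")
    print(f"Utilization: {productive / worked:.1%}")  # 24 of 32 hours = 75.0%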

What do you think of rolling out the utilization score as a standard across CEDIA? You’ll never look at a resume the same way again.

Stay frosty and see you in the field.

Click here to download a PDF file showing an example of our forecast report.
