I have been trying out projection systems lately to see how I can display changing content on walls. I got a handheld projector that has a battery and can connect via HDMI or USB, but it has only about 400 lumens. I also bought a larger plug-in projector with 3500 lumens, but I returned it because it was a mistake to think it could do the job.
I have also been working on financial projection systems lately and trying out different methods for doing these projections. I am increasingly convinced that the tools we have to do this suck. Specifically, they suck up a lot of time and effort relative to the yields they produce.
More information in than value out
This brings me to another issue that I think is at the heart of many modern systems for supporting business growth. They take in a lot of information and give back a little. And in the arena of financial projections, this seems truer than in most other fields. So-called big data analytics (where we take in a little information from a lot of people and share automated analysis back out to lots of people) tends to provide rich opportunities across a lot of spaces. But in the financial projections space, it's really about the minutiae, or it's largely meaningless.
Somehow, the ideal situation is where we use the vast leverage of information technology and content to allow little effort to generate big results. But when it comes to financial projections, today it takes lots of expertise and provides almost no leverage.
As an example, I had perhaps 8 hours of meetings over the last week which involved perhaps 100 hours of other people's time to generate what amounts to a few tens of numbers. And each number is an approximation of an estimate based on a bunch of guesses. And the people in the room didn't generate any of the numbers based on some vast store of data and the customization of those numbers using analytics. Instead, Neil and I made some guesses based on nebulous information about what might happen and what has happened, and when pushed after not answering the questions several times, we got a grudging yes-like maybe when it came to accepting the numbers as adequate to the tasks at hand.
Which reminds me of a conversation with an attorney yesterday about starting a fund. The answers to my questions were not particularly helpful in getting me closer to what I wanted to do. But they were effective in telling me what it would cost to answer my questions. Yet I already knew the answers to my questions, but I had to ask them because someone else had to learn the answers. And I had to learn some other things as well.
It's about the expertise
Neil and I are working through more of these projections, and we will soon arrive at a WAG (Wild Ass Guess) at a projection that will be presented to investors and look reasonable. And over time, we may be able to teach the folks who have to answer questions about these projections how to answer those questions so as to be honest and informative without telling anyone that this is anything but a WAG.
Here's the thing. Every financial projection for every early stage company I have ever seen or been part of is a WAG. There are different reasons for this, but in essence, nobody seems to have enough real data to do any better than a WAG at the problem. At the end of the day, all we really have is the expertise of the team and their history of hitting actual numbers. To quote from Harry Potter:
This is why we count on the expertise of the team to make reasonable WAGs and then adapt when all hell breaks loose.
The next model?
But of course this does not have to always be true. And it's not generally true of more mature enterprises. At the level of the individual contributor, it is often true when they start out, but as they build their book, they steady out, as do organizations that have found their product-market fit. Until all hell breaks loose again. And then it's back to the quality of the team.
The approach I have been taking lately, with limited success so far, but we will see, is to model the business and adapt the model based on facts over time. We are always projecting possibilities, and we are always adapting the model, but the model needs less adaptation and has more predictive power over time. Until it breaks. Then all hell breaks loose. Until we fix it, and adapt the model.
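To make the adapt-as-you-go idea concrete, here is a minimal, hypothetical sketch in Python. It is not my actual model; the class name, numbers, and the exponential-smoothing update are all illustrative assumptions. The point is the shape of the loop: project forward from a guessed growth rate (the WAG), then blend observed actuals into the parameter so the model drifts toward reality over time.

```python
# Illustrative sketch only: project revenue from a guessed growth rate,
# then update the guess as actuals arrive (simple exponential smoothing).

class AdaptiveProjection:
    def __init__(self, start_revenue, guessed_growth, learning_rate=0.5):
        self.revenue = start_revenue        # last known actual revenue
        self.growth = guessed_growth        # the initial WAG, e.g. 0.10 = 10% per period
        self.learning_rate = learning_rate  # how quickly actuals override the guess

    def project(self, periods):
        """Project revenue forward using the current growth estimate."""
        out, r = [], self.revenue
        for _ in range(periods):
            r *= 1 + self.growth
            out.append(round(r, 2))
        return out

    def record_actual(self, actual_revenue):
        """Blend the observed growth into the model's growth estimate."""
        observed_growth = actual_revenue / self.revenue - 1
        self.growth += self.learning_rate * (observed_growth - self.growth)
        self.revenue = actual_revenue

model = AdaptiveProjection(start_revenue=100_000, guessed_growth=0.10)
print(model.project(3))       # the initial WAG-based projection
model.record_actual(105_000)  # actual growth was 5%, not the guessed 10%
print(model.project(3))       # revised projection from the updated estimate
```

When the business breaks the model's assumptions entirely (all hell breaks loose), no parameter update saves you; the structure itself has to be rebuilt, which is exactly the point about falling back on the team.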
I am now going to predict the future:
Prediction is not projection, of course, but I don't see big data helping me out here. I am hoping (and I try) to use big data to get me better guesses at some parameters. For example, I just got a subscription to the underlying data at Crunchbase which I hope to be able to analyze to get parameters for models. But in truth, the available data is unlikely to help me at the level of details I need to model any particular company.
Questions like what percentage of sales will be of what product or service, especially for novel products and services, seem unlikely to be answered by anything other than actually trying to sell them. And different sales people with different marketing team members will produce wildly different results. So again it comes down to the team.
That dog won't hunt
Some WAGs have a chance at working, while others have no chance of working. One of the main things we do when we look at companies is to look at their projections and detect BS. When we see BS, we ask questions, and mostly what we find is one of two things:
For clarity, I don't claim to know which companies will succeed and which will not. But I can predict that some companies will fail at what they are trying to do, and usually that has a lot to do with their projections and their ability to answer my questions about them.
A call to action
Here's your chance to talk about your WAG with folks who have seen a lot of dogs.
We all use WAGs for our projections at early stages. But some dogs just won't hunt.
Copyright(c) Fred Cohen, 2021 - All Rights Reserved