Subject: Re: Value Trend
In the same vein, how about the data used in this one: how was the 1.5 picked?
...
And last step: take that WMA and multiply by 1.50. Why 1.50? I looked for the multiple that gave the best fit to price data.


I guess my explanation wasn't very clear...

The WMA calculation is just a yardstick of value. An arbitrary number that is intended to rise at the same rate as observable intrinsic value.
Phrased another way, it's a number which, multiplied by an unknown constant, would give you the true intrinsic value.

Rather than trying to estimate what the correct constant would be for true intrinsic value, I picked the constant (a bit lower, I presume) that best fit the history of market price data over the last several years.
Coincidentally that came out to almost precisely 1.50 times the WMA smoothing calculation.
So, in the interval I considered (mainly 2008 to date, with a bit of consideration for earlier dates), the price was above the 1.5 line about half the time.
I used RMS error to get the multiple which was the best fit to the price history, and out popped 1.5.
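
In rough Python terms, that fit could be done along these lines (just a sketch: the array names and candidate grid are illustrative, and since it isn't specified above whether the error is taken on raw prices or log ratios, plain price-minus-scaled-WMA errors are assumed):

    import numpy as np

    def best_fit_multiple(price, wma, candidates=np.arange(1.00, 2.01, 0.01)):
        # Grid-search the scale factor m that minimizes the RMS error
        # between m * WMA and the observed price series.
        price = np.asarray(price, dtype=float)
        wma = np.asarray(wma, dtype=float)
        rms = [np.sqrt(np.mean((price - m * wma) ** 2)) for m in candidates]
        return float(candidates[int(np.argmin(rms))])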

The practical use is this: if the current price is below the current WMA calculation scaled by 1.50, the implication is that it's cheaper than usual and better-than-usual shortish-term results should be expected.
Assuming multiples and growth rates remain somewhat similar to those in recent years, of course.
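
As a trivial sketch of that rule of thumb (the names are mine, purely illustrative):

    def cheaper_than_usual(current_price, current_wma, multiple=1.50):
        # "Cheap" here just means the price sits below the scaled value yardstick.
        return current_price < multiple * current_wma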


Also, are the results, e.g. the forward return bucket results, fairly robust to changes in the WMA16? That's sort of an open-ended question; to be more specific, in the least sophisticated case, would even a plain SMA hopefully still show a significant trend?

Yes and no.
It's not sensitive in terms of how well it "worked" in the past, but I chose the 16-quarter figure quite carefully.
With more smoothing from a longer lookback, you start to get a line that doesn't react for many years when the growth rate changes.
Since the growth rate has been remarkably steady in the past, this wouldn't have been a problem, but the whole idea was to find something that was just smooth enough but not too smooth.
With smoothing that uses too little data (shorter history), the problem is that it starts to squiggle around. You get quite visible dips in recessions.
At the extreme, you're simply using quarterly book per share as reported.
Four years (which I later changed to 16 quarters, since I had the quarterly data) seemed a sensible compromise.
Dips in short term mark-to-market book per share during bear markets tend to come out as flat spots, sometimes with VERY slowly growing value per share in the early part of the dip.
That sounds sensible to me.
I personally don't think that the true value of Berkshire drops in recessions (or at least, hasn't in the past) so I'm looking for a metric which chimes with that belief.
But no more optimistic than that, no more.
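
For what it's worth, a 16-quarter weighted moving average of book per share could be sketched like this in Python. Treat the linear most-recent-weighted-highest weighting as an assumption, since the exact weighting scheme isn't spelled out above:

    import numpy as np

    def wma16(book_per_share, n=16):
        # Weighted moving average of the last n quarterly book-per-share
        # figures, with the most recent quarter weighted highest.
        weights = np.arange(1, n + 1)                  # 1, 2, ..., n
        recent = np.asarray(book_per_share[-n:], dtype=float)
        return float(np.dot(recent, weights) / weights.sum())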

Incidentally, the smoothing method was created some time ago, as a tool to solve a different problem: how much stock can you sell for income while never running out of money?
The answer is based on this thinking:
"... the fairly simple notion that ultimately what you can withdraw is a function of how much value your portfolio is generating.
If you value it sensibly on a regular basis, you can withdraw any gain and never run out of money."
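
Read literally, that rule could be sketched as follows (one reading only; the linked posts describe the actual method and its details):

    def withdrawal_this_period(smoothed_value_now, smoothed_value_prev):
        # Withdraw only the gain in the sensibly smoothed portfolio value;
        # if the smoothed value didn't rise this period, withdraw nothing.
        return max(0.0, smoothed_value_now - smoothed_value_prev)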


Original post from 2015
http://www.datahelper.com/mi/s...
A follow-up on how it would have worked as a "safe withdrawal rate" method
http://www.datahelper.com/mi/s...

Jim