So $NVIDIA(NVDA)$ has pretty much "schooled" all the click-bait commentators waffling on about the so-called AI bubble. As I said in one of my articles yesterday, it's pure speculation over fact. The facts presented themselves earlier today when Jensen said they have sold out of high-end chips. But to me this was well telegraphed. That's why I pretty much ignore most analysts and market commentators. I listen to the industry and the companies, and actually to a lot of writers here on Tiger Trade who know the facts.
But today I want to address another one of the silly concepts being circulated by the sheep of Wall Street: AI chip depreciation.
This keeps recirculating as an argument for an AI bubble: the fact that the big AI compute businesses like $Alphabet(GOOG)$ have gone from depreciating their chips over 3 years to depreciating them over up to 6 years. To this I say foobar!
Why? For a start, the accounting must follow GAAP. And I would conclude that while a chip may have a useful life of only 3 years for AI training or inference, it still has a use in less demanding compute for, um, idk, maybe another 3 years.
Second, I would argue that it's not really in a company's best interest to depreciate an asset over 6 years rather than 3. Why? Think about it. Depreciation itself is a non-cash expense, but a slower depreciation schedule still hits cash flow. Why? Because you report more profit and therefore pay more tax.
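The tax argument above is easy to check with back-of-envelope numbers. A minimal sketch, with entirely made-up figures and an assumed flat tax rate, and simplifying by assuming book and tax depreciation move together (in reality tax authorities use their own schedules):

```python
# Hypothetical illustration: one chip purchase, two straight-line
# depreciation schedules, and the year-1 cash tax consequence.
# All figures ($m) and the tax rate are assumptions for illustration.

def year1_tax(capex, life_years, profit_before_dep, tax_rate):
    """Year-1 cash tax under straight-line depreciation."""
    depreciation = capex / life_years              # annual non-cash expense
    taxable_profit = profit_before_dep - depreciation
    return taxable_profit * tax_rate

capex = 600.0    # cost of the chips (hypothetical)
profit = 500.0   # operating profit before depreciation (hypothetical)
rate = 0.21      # assumed corporate tax rate

tax_3yr = year1_tax(capex, 3, profit, rate)   # faster write-off
tax_6yr = year1_tax(capex, 6, profit, rate)   # slower write-off

print(f"Tax with 3-year life: {tax_3yr:.1f}")  # 63.0
print(f"Tax with 6-year life: {tax_6yr:.1f}")  # 84.0
```

Stretching the schedule from 3 to 6 years raises reported profit in the early years, but it also raises the cash tax bill — which is exactly why it's a strange move for a company supposedly trying to paper over a bubble.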
Thirdly, let's face it... who cares? Depreciation, amortization, goodwill — all those so-called "accounting tricks" don't matter much to me. Personally, and I'm not alone, I look at EBITDA. When buying I also discount goodwill to zero. And I'm way more interested in the cash flow statements than the income statements.
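For anyone new to the metric: EBITDA simply adds the non-operating and non-cash charges back onto net income, which is why it sidesteps the whole depreciation-schedule debate. A quick sketch with hypothetical figures:

```python
# EBITDA = earnings before interest, taxes, depreciation and amortization.
# Computed bottom-up by adding those items back to net income.
# All figures below are hypothetical.

def ebitda(net_income, interest, taxes, depreciation, amortization):
    return net_income + interest + taxes + depreciation + amortization

# Hypothetical income statement ($m):
result = ebitda(net_income=100.0, interest=10.0, taxes=25.0,
                depreciation=40.0, amortization=5.0)
print(result)  # 180.0
```

Note how the depreciation line gets added straight back: whether a company spreads a chip over 3 years or 6, its EBITDA is the same.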
Food for thought? I'd love your comments fellow tigers.
@TigerWire @MojoStellar @TigerTrade @Ragz @MillionaireTiger @SPACE ROCKET