Michael Burry's concern about a 2-3 year amortization schedule applies to the GPUs performing the most advanced LLM training: frontier work, like scientists mapping the genome or PhD-level mathematics. But once an AI agent has been trained to handle basic repetitive tasks (writing code, resetting passwords for customer service, uploading an application, scanning for the same phishing email), that workload is all inference, and it can run on ASICs such as AWS Trainium with no need for a high-power GPU. Those chips draw less power and can perform inference tasks for five, six, even eight years. This is $Amazon.com(AMZN)$ 's sweet spot; this, in my opinion, is the huge opportunity.
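To see why the amortization window matters, here is a minimal straight-line depreciation sketch. The dollar figures and useful lives below are illustrative assumptions for the comparison, not numbers from Burry or Amazon:

```python
# Hypothetical straight-line depreciation comparison.
# Assumptions (not sourced): $25,000 frontier-training GPU with a 3-year life,
# $10,000 inference ASIC with a 6-year life.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation: equal expense in each year of useful life."""
    return cost / useful_life_years

gpu_annual = annual_depreciation(25_000, 3)   # short life, big annual hit
asic_annual = annual_depreciation(10_000, 6)  # long life, small annual hit

print(f"GPU annual depreciation:  ${gpu_annual:,.0f}")
print(f"ASIC annual depreciation: ${asic_annual:,.0f}")
```

Under these assumed numbers the GPU expenses roughly five times as much per year as the ASIC, which is the heart of the argument: stretching useful life from 3 to 6-8 years dramatically lowers the annual cost of serving inference.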

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial product, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is provided for general information purposes only and does not take into account your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
