Here are the biggest analyst moves in the area of artificial intelligence (AI) for this week.
1. Intel introduces new chips
Intel Corporation (NASDAQ:INTC) held its “AI Everywhere” event on Thursday, where it unveiled its latest innovations. The event marked the official launch of the company’s 5th Gen Xeon server CPUs, codenamed Emerald Rapids.
Moreover, Intel introduced AI-optimized Core Ultra CPUs, part of the Meteor Lake series, featuring a chiplet-based design with a CPU, embedded GPU, and NPU for enhanced AI capabilities.
“Await more proof points on INTC’s ability to close the performance gap in server, demonstrate the ability to be more competitive on AI, and execute on foundry,” KeyBanc analysts said.
2. Nvidia is now an “AI budget buy” stock
While Bernstein analysts took note of the rising price of Nvidia (NASDAQ:NVDA) shares, they also said the stock is now the cheapest it has been since late 2018.
Moreover, the stock trades at a discount to the semiconductor-focused SOX index for the first time in nine years.
"We cannot say for sure that NVIDIA will never encounter some sort of air pocket. But we remain very bullish on the long term opportunity in front of them, and continue to believe that in 5 years or 10 years we will all be talking about an industry that is far larger than the numbers being bandied about today,” analysts said.
“And at current prices NVDA would trade at ~30x on a $15 EPS, which actually doesn’t seem unreasonable if one believed in that future and viewed it as a trough. This stock is, and will always be, volatile, but we think you still should be there.”
3. UBS analyses Meta’s GenAI search opportunity
UBS analysts remain very bullish on Meta Platforms' (NASDAQ:META) AI opportunity. Meta recently published a blog post on AI updates, which the analysts said supports their hypothesis that the “company stands to grow into more search revenue, expanding Bing search capabilities into more of its AI personalities.”
“We estimate that Meta could unlock $16.9B of potential search revs from GenAI apps and $2 in potential EPS contribution at scale (next 3-5 years). Near term we see continued Reels inflection, better measurement/attribution, engagement growth, and click-to-message ads as core drivers,” analysts said.
“We expect that Meta's genAI chatbot monetization story likely becomes more of an investor focus in 2024, driving the multiple.”
4. Broadcom has ‘best-in-class profitability’
Broadcom Inc (NASDAQ:AVGO) saw its price target raised to $1,250 per share at Bank of America (NYSE:BAC) after the company finalized its acquisition of VMware (NYSE:VMW). VMware could drive AVGO EPS toward $60 in ‘25E, the analysts added.
“VMW can contribute double-digit growth for multiple years, as more workloads/activity are consistently being supported virtually using VMware tech. We view this as key for AVGO’s infra software strategy, as VMware (60% of segment sales) could accelerate segment growth from 4%-5% to 7%-8% long-term,” analysts said.
On AVGO’s AI opportunity, the analysts added:
“As model complexity grows towards hundreds of billions of parameters, hardware needs can change, making strong product development relationships with hyperscalers essential. In turn, this could support share gains of custom silicon over general purpose accelerators.”
5. Who’s winning the AI servers race?
Bernstein analysts weighed in on the AI servers race, which includes companies like Dell (NYSE:DELL) and Hewlett Packard Enterprise (NYSE:HPE).
“We are more confident that Dell’s AI server orders are truly incremental, while it is less certain that HPE’s are. This appears somewhat reflected in next year’s guidance, where Dell expects revenues to be above its long term model,” analysts wrote to clients.
“Stepping back, we note that Dell and HPE appear to continue to lag behind server ODMs like Quanta and Wistron in the AI server business. We also worry that AI servers will have lower margins, and with high risk of a digestion.”
“That said, if AI inferencing occurs on premise or at the edge, as opposed to remaining in the cloud, it could meaningfully benefit AI server OEMs, and we are more constructive on the medium-to-longer term AI server opportunity for OEMs.”