AMD revenue drops 9% to $5.4 bn in first quarter

AMD today announced revenue of $5.4 billion (–9 percent), gross margin of 44 percent, operating loss of $145 million and net loss of $139 million for the first quarter of 2023.
AMD’s Data Center and Embedded segments contributed more than 50 percent of revenue in the first quarter.

AMD generated revenue of $1.295 billion (flat) from Data Center, $739 million (–65 percent) from Client, $1.757 billion (–6 percent) from Gaming and $1.562 billion (+163 percent) from Embedded during the first quarter.

“We launched multiple leadership products and made significant progress accelerating our AI roadmap and customer engagements in the quarter,” AMD Chair and CEO Lisa Su said in the company’s earnings report.

“Longer-term, we see significant growth opportunities as we successfully deliver our roadmaps, execute our strategic data center and embedded priorities and accelerate adoption of our AI portfolio.”

AMD CFO Jean Hu said that for the second quarter the company expects sequential growth in the Data Center and Client segments, offset by modest declines in the Gaming and Embedded segments.

AMD expects revenue to be approximately $5.3 billion, plus or minus $300 million, with gross margin of approximately 50 percent for the second quarter of 2023.

Su told investors that the first quarter was the bottom of the market for the company’s PC business and for the industry.

Su said part of that growth will come from the MI300, a chip that will compete with Nvidia’s flagship artificial intelligence processors, adding that customer interest in the chip is growing.

“We believe that we will start ramping revenue in the fourth quarter with cloud AI customers, and then it’ll be more meaningful in 2024,” Su said. “Success for us is having a significant part of the AI overall opportunity.”

Nvidia holds the bulk of the AI market, and analysts believe its position is firmly entrenched.

“MI300 will be used primarily on special projects or on a case-by-case basis,” said Summit Insights Group analyst Kinngai Chan. “The MI300 is likely to be inferior to Nvidia’s latest H100 data center chip for large language model applications, such as ChatGPT.”