It has been a busy seven months for AI chip startup Mythic AI. Seven months ago, the company unveiled its first M1108 Analog Matrix Processor (AMP) for AI inference. A month ago, Mythic AI announced a new $70 million Series C investment round to bring its chips to mass production and to develop its next hardware and software products.
Today, Monday, June 7, the startup announced the arrival of its second chip design, the new M1076 AMP, which has a smaller form factor and lower power requirements than the original M1108 chip, making it a better fit for smaller AI devices and applications.
Packing up to 25 TOPS in a single chip, the M1076 AMP will be available in three form factors: a stand-alone chip, a compact 22mm x 30mm PCIe M.2 A+E key card, or a PCIe card loaded with 16 M1076 chips delivering a combined 400 TOPS and 1.28 billion weights while consuming 75 watts, according to the company.
The arrival of the new chip only seven months after the launch of the first M1108 chip was expected, driven by many potential customers' need for a smaller, more energy-efficient chip, Tim Vehling, senior vice president of product and business development at Mythic AI, told Corporate AI.
“We’re basically showing that we can derive and generate different versions of the technology quite easily” to meet customer needs, Vehling said. “One of the promises of the architecture is that you can quickly come up with larger or smaller versions and that’s what it shows.”
So far, Mythic AI has not shipped any chips to customers, he said, but the plan is to ship M1076 chips by the end of 2021 or early 2022.
“We have sampled the original M1108 chips with customers,” Vehling said. “We will be delivering the first samples of this new M1076 product next month. We will see which version customers go into production with. But I think this version is ideal from a size, performance and power perspective.”
The second chip was planned from the start to give customers a variety of chips to choose from for use in their products, he said.
“It turns out that in a PC world, the larger size [of the original M1108 chip] isn’t a problem, but if you want to have a strong presence in the embedded world, with embedded products using Arm-based processors,” the components have to be smaller, Vehling said. “As we engage customers, we basically have to make our product a lot more scalable across different form factors.”
Mythic AI is fabless, with all of its chips manufactured by UMC Japan.
Additional AI chips from Mythic AI are in the works and are expected in the future, Vehling said.
“If you look at the architecture, what we call a tile architecture, you can see an array of these little tiles — the exact same tile is reproduced across the chip,” he said. “So we can certainly size what is needed based on the applications that we see. One of the advantages of our architecture is that we use cheaper, older 40-nanometer process technology, so it is quite easy for us to make versions or derivatives. It is not a big effort for us.”
The M1076 AMP is designed for use in endpoints as well as server applications across a wide range of uses, including smart cities, industrial applications, enterprise applications, consumer devices and more. The M1076 AMP is also suitable for video analytics workloads, including object detection, classification and depth estimation for machine vision, autonomous drones, surveillance cameras and network video recording applications. It can also support AR/VR applications with low-latency human body pose estimation, which should drive future smart fitness, gaming and collaborative robotics devices, according to the company.
The company’s ME1076 PCIe M.2 A+E Key and MM1076 PCIe M.2 M Key cards are expected to be available for customer evaluation starting in July 2021.
Smaller chip makes good sense: analysts
Offering the new, smaller M1076 AMP chip is a smart move for Mythic AI, Dan Olds, research director for Intersect360 Research, told Corporate AI. “This is their second chip design, but it seems to better meet customers’ size and usage requirements.”
Offering it in multiple configurations and performance levels is also a good thing, he said. “For AI inference, this gives customers a wide range of form factors: they can use the stand-alone processor in their own sensors to provide on-board inference capability, or use the large 16-processor board in a data center. There is also a third option that sits in an M.2 slot on a motherboard, adding more options for customers. All of them are very energy efficient, which is a requirement for anything remote at the edge.”
Like all the other AI chip companies competing to create the next big innovation in AI chips, Mythic AI is chasing market leader Nvidia, which has made a fortune producing GPU compute accelerators, Olds said. “Nvidia’s GPU revenues have more than doubled in the past year, which puts a huge target on their backs.”
But while there is a lot of competition, Mythic AI is well positioned, he said.
“Mythic, with its analog processors, is ahead of the game when it comes to inference processors,” Olds said. “This is a very innovative approach and not one likely to be copied. If they can execute and really prove their worth, they could be one of the big winners in the AI gold rush. One point in their favor is that they recently closed another round of funding, this time for $70 million, which will be used to ramp up production and, from what I can understand, establish a commercial presence worldwide.”
Another analyst, Alex Norton, senior technology analyst and head of data analytics for HPC and emerging technologies at Hyperion Research, said he was particularly excited about Mythic AI’s different approach to designing AI chips.
“The key point that grabs me about their announcement and their platform is the chip’s power consumption,” Norton said. “AI accelerators and processors can be very power hungry, so building one that is more power efficient is an important distinction for Mythic, especially since they target inference applications at the edge, where power may be more limited than in a data center environment.”
More importantly, he added, the company listens to its potential customers when planning its products.
“Mythic’s announcement highlights that AI chip startups are focused on specific applications and use cases for their technology, working closely with end users to deliver products tailored to their needs,” Norton said. “The low-power, compact form factor fits their use cases exactly.”
Including the $70 million in Series C funding that Mythic AI raised in May, the company has now raised $165.2 million.