Just six months after unveiling its first AI inferencing processor, Mythic AI has announced a $70 million Series C funding round to begin mass production of its chips and to develop its next generation of hardware and software products.

In November, the company announced the M1108 Analog Matrix Processor (AMP), aimed at edge AI deployments across a wide range of applications, including manufacturing, video surveillance, smart cities, smart homes, AR/VR, and drones.

For a company that is nine years old and has zero sales, it has some heavy hitters behind it. The new investment round was led by venture fund giant BlackRock and Hewlett Packard Enterprise (HPE). Other investors include Alumni Ventures Group and UDC Ventures.

Mythic M1108

Mythic AI said it will use the latest funding to accelerate its plans to begin mass production of the M1108 while expanding its support to customers globally, building up its software offerings, and developing the next generation of its hardware platform.

Inference is the second step in machine learning, following training, and has far lower compute requirements. Training calls for the massive horsepower of GPUs, FPGAs, and CPUs, but inference, which applies an already-trained model to new data, is a far lighter workload for which a full CPU is overkill.

By way of comparison, Intel's early stab at an inference processor, the Nervana (since discontinued), consumed as little as 10 watts. A CPU consumes around 200 watts and a GPU up to 500 watts. The M1108 uses as little as 4 watts, so you can see why it might be ideal for a low-power edge deployment.

The M1108 chips are analog processors designed to provide high performance with low power requirements.
At the heart of the M1108 is the Mythic Analog Compute Engine (ACE), which performs analog compute-in-memory: deep neural-network models execute on-chip, with weight parameters stored on the chip itself and no external DRAM required.

Each Mythic ACE is complemented by a digital subsystem that includes a 32-bit RISC-V nano processor, a SIMD vector engine, 64KB of SRAM, and a high-throughput network-on-chip router.

The processor uses the M.2 form factor, which has become popular among SSD designs. An M.2 card is about the size of a stick of gum and plugs into the motherboard, lying flat. Depending on the motherboard, M.2 uses PCI Express Gen3 or Gen4. The Mythic processor board has a four-lane PCIe interface with up to 2GB/s of bandwidth.

Device makers and original-equipment manufacturers (OEMs) can choose from the single-chip M1108 Mythic AMP or a variety of PCIe card configurations, including the M.2 M Key and M.2 A+E Key form factors, to fit a variety of needs.

On the software side, the M1108 supports standard machine-learning frameworks such as PyTorch, TensorFlow 2.0, and Caffe.

Mythic has not said when it plans to bring its products to market.
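To illustrate the training-versus-inference distinction, here is a minimal forward pass in plain Python (a sketch with made-up weights, not Mythic's actual software stack): once training has fixed a model's weights, inference reduces to multiply-accumulate arithmetic, the same matrix-vector math that analog compute-in-memory performs against weights stored on-chip.

```python
# Sketch with hypothetical weights: inference is multiply-accumulate over
# parameters that were fixed at training time.

def relu(x):
    # Standard rectifier activation applied element-wise.
    return [max(0.0, v) for v in x]

def dense(weights, bias, inputs):
    """One fully connected layer: a row of multiply-accumulates per neuron."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, bias)]

# Pretend these weights came from an offline training run on a GPU.
W1 = [[0.5, -0.2, 0.1],
      [0.3, 0.8, -0.5]]
b1 = [0.1, -0.1]
W2 = [[1.0, -1.0]]
b2 = [0.0]

def infer(inputs):
    # Inference: a fixed sequence of matrix-vector products, no learning.
    hidden = relu(dense(W1, b1, inputs))
    return dense(W2, b2, hidden)

print(infer([1.0, 2.0, 3.0]))  # a single-element list, approximately [0.2]
```

The point of the sketch is that nothing here updates: every operation is a deterministic multiply-add over stored parameters, which is why inference can run on far lower-power hardware than training.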