Packing Large AI Into Small Embedded Systems


About this listen

Not every microcontroller can handle artificial intelligence and machine learning (AI/ML) chores. Simplifying the models is one way to squeeze algorithms into a more compact embedded compute engine. Another way is to pair the microcontroller with an AI accelerator like Femtosense's Sparse Processing Unit (SPU) SPU-001 and take advantage of sparsity in AI/ML models.
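The sparsity the episode refers to means that many weights (and activations) in a trained model are zero or near zero, so hardware that skips them saves compute and memory. As an illustration only, and not Femtosense's actual method, here is a minimal NumPy sketch of magnitude pruning, one common way to induce weight sparsity; the threshold and matrix size are arbitrary assumptions:

```python
import numpy as np

def prune_weights(weights: np.ndarray, threshold: float) -> np.ndarray:
    """Zero out weights whose magnitude falls below the threshold."""
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def sparsity(weights: np.ndarray) -> float:
    """Fraction of zero entries; a sparse accelerator can skip these."""
    return float(np.mean(weights == 0.0))

# Illustrative random weight matrix standing in for a trained layer.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(64, 64)).astype(np.float32)

w_pruned = prune_weights(w, threshold=1.0)
print(f"sparsity after pruning: {sparsity(w_pruned):.0%}")
```

The pruned matrix can then be stored in a compressed sparse format, which is one reason sparse models fit into small, low-power accelerators.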

In this episode, Sam Fok, CEO of Femtosense, talks about AI/ML at the edge, the company's dual-sparsity design, and how the small, low-power SPU-001 can augment a host processor.
