MathWorks, Altera partner to enhance wireless system development


Collaboration boosts 5G/6G efficiency, cuts costs with AI compression.

MathWorks and Altera, an Intel company, announced a collaboration to accelerate wireless development for Altera FPGAs. The partnership enables wireless systems engineers to use AI-based autoencoders to compress Channel State Information (CSI) data, significantly reducing fronthaul traffic and bandwidth requirements. Engineers working on 5G and 6G wireless communications systems can now cut costs while preserving user data integrity and maintaining the reliability and performance standards their systems require.
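The idea behind autoencoder-based CSI compression is that the transmitter sends only a small learned latent vector over the fronthaul, and the receiver reconstructs the full CSI report from it. The sketch below is purely illustrative and not MathWorks' or Altera's implementation: it stands in for a trained autoencoder with a random orthonormal linear encoder/decoder pair, and the CSI vector length and latent size are assumed values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical CSI report: complex channel coefficients flattened into a
# real-valued feature vector (sizes are assumptions for illustration).
n_features = 128   # assumed CSI vector length
latent_dim = 16    # assumed compressed size (8x reduction)

# Stand-in for a trained autoencoder: a random orthonormal linear encoder
# with its transpose as the decoder (a PCA-like projection). A real CSI
# autoencoder would use learned nonlinear encoder/decoder networks.
q, _ = np.linalg.qr(rng.standard_normal((n_features, latent_dim)))

def encode(x):
    # 128 values -> 16 values actually sent over the fronthaul
    return q.T @ x

def decode(z):
    # 16 values -> 128-value reconstruction at the far end
    return q @ z

csi = rng.standard_normal(n_features)
latent = encode(csi)
reconstructed = decode(latent)

ratio = n_features / latent.size
print(f"fronthaul payload reduced {ratio:.0f}x "
      f"({latent.size} of {n_features} values transmitted)")
```

The bandwidth saving comes directly from the latent size: only `latent_dim` values cross the fronthaul per report, and reconstruction quality is what the autoencoder is trained to preserve.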

MathWorks offers a comprehensive tool suite for AI and wireless development, with particular support for Altera FPGAs. Deep Learning HDL Toolbox addresses the needs of engineers implementing deep learning networks on FPGA hardware. Building on the capabilities of HDL Coder, the toolbox lets users customize, build, and deploy an efficient, high-performance Deep Learning Processor IP Core. Support for standard networks and layers improves both performance and flexibility in wireless applications.

FPGA AI Suite offers push-button generation of custom AI inference accelerator IP for Altera FPGAs using the OpenVINO toolkit, drawing on pre-trained AI models from popular industry frameworks. It also helps FPGA developers integrate the AI inference accelerator IP seamlessly into FPGA designs using Quartus Prime Software FPGA flows. Together, the Deep Learning Toolbox and the OpenVINO toolkit create a streamlined path for developers to optimize AI inference on Altera FPGAs.

For more information, visit mathworks.com.
