ReRAM-based Machine Learning (Computing and Networks)
Description:
The transition towards exascale computing has resulted in major transformations in computing paradigms. The need to analyze and respond to such large data sets has led to the adoption of machine learning (ML) and deep learning (DL) methods in a wide range of applications.
One of the major challenges is fetching data from memory and writing it back without hitting the memory-wall bottleneck. To address this concern, in-memory computing (IMC) and supporting frameworks have been introduced. IMC methods offer ultra-low-power operation and high-density embedded storage. Resistive Random-Access Memory (ReRAM) appears to be the most promising IMC technology due to its minimal leakage power, reduced power consumption, and small hardware footprint, as well as its compatibility with CMOS technology, which is widely used in industry.
In this book, the authors introduce ReRAM techniques for performing distributed computing using IMC accelerators, present ReRAM-based IMC architectures that can perform the computations of ML and data-intensive applications, and describe strategies for mapping ML designs onto hardware accelerators.
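To give a flavor of the kind of computation the book covers: the core operation of a ReRAM-based IMC accelerator is analog matrix-vector multiplication, where weights are stored as cell conductances and column currents sum the products via Ohm's and Kirchhoff's laws. The sketch below is illustrative only (it is not code from the book); the function name, conductance range, and linear weight-to-conductance mapping are assumptions chosen for clarity.

```python
import numpy as np

# Illustrative sketch, not from the book: a ReRAM crossbar computes a
# matrix-vector product in the analog domain. Each cell stores a weight
# as a conductance G[i, j]; applying input voltages V on the rows yields
# column currents I_j = sum_i V_i * G[i, j].

def crossbar_mvm(weights, inputs, g_min=1e-6, g_max=1e-4):
    """Map weights into an assumed conductance window [g_min, g_max]
    (siemens) and return the resulting column currents."""
    w = np.asarray(weights, dtype=float)
    v = np.asarray(inputs, dtype=float)
    # Linearly map weight values into the device conductance window.
    w_min, w_max = w.min(), w.max()
    scale = (g_max - g_min) / (w_max - w_min) if w_max > w_min else 0.0
    g = g_min + (w - w_min) * scale
    # Kirchhoff's current law: each column current is the dot product
    # of the input voltage vector with that column of conductances.
    return v @ g

# Example: a 2x2 weight matrix driven by two input voltages.
currents = crossbar_mvm([[0.2, 0.8], [0.5, 0.1]], [1.0, 0.5])
```

Real accelerators must additionally handle device non-idealities (limited conductance levels, wire resistance, read noise), which is part of what makes the hardware-mapping strategies discussed in the book nontrivial.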
The book serves as a bridge between researchers in the computing domain (algorithm designers for ML and DL) and computing hardware designers.
Best prices to buy, sell, or rent ISBN 9781839530814
Frequently Asked Questions about ReRAM-based Machine Learning (Computing and Networks)
The price for the book starts from $128.62 on Amazon, and it is currently available from 12 sellers.
If you’re interested in selling back the ReRAM-based Machine Learning (Computing and Networks) book, you can always look it up on BookScouter for the best deal. BookScouter checks 30+ buyback vendors with a single search and gives you current buyback prices instantly.