Samsung has announced a new NPU technology that makes on-device AI faster, more energy efficient, and smaller on the chip. Thanks to Quantization Interval Learning, 4-bit neural networks can be created that retain the accuracy of a 32-bit network. Using fewer bits significantly reduces the number of computations and the hardware that carries them out - Samsung says it can achieve the same results 8x faster while reducing the number of transistors 40x to 120x. This will make for NPUs that are faster and use less power, but can carry out familiar tasks such as object...
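To give a rough sense of how low-bit quantization works, here is a minimal sketch of plain symmetric 4-bit quantization in Python. Note this is a hypothetical illustration, not Samsung's method: Quantization Interval Learning learns the quantization intervals during training, whereas this sketch simply scales weights by their maximum absolute value.

```python
import numpy as np

def quantize_4bit(weights):
    """Map float32 weights to 4-bit signed integers (a toy illustration).

    With 4 signed bits we keep the symmetric integer levels -7..7
    (one level is left unused to keep the range symmetric).
    """
    levels = 2 ** (4 - 1) - 1  # 7
    scale = np.max(np.abs(weights)) / levels
    q = np.clip(np.round(weights / scale), -levels, levels).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights from the 4-bit codes."""
    return q.astype(np.float32) * scale

# Example: quantize a small random weight vector and measure the error
rng = np.random.default_rng(0)
w = rng.standard_normal(8).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize(q, s)
# Each reconstructed weight is within half a quantization step of the original
```

Storing 4-bit codes instead of 32-bit floats cuts weight memory by 8x, and integer arithmetic at that width needs far less silicon than float32 multiply-accumulate units, which is where the transistor savings come from.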



from GSMArena.com - Latest articles https://ift.tt/30aUoGS
via IFTTT
