Not too big: Machine learning tames huge data sets

A machine-learning algorithm demonstrated that it can process data exceeding a computer's available memory by identifying a massive data set's key features and dividing the data into manageable batches that don't overwhelm the hardware. The algorithm set a world record for factorizing huge data sets during a test run on the world's fifth-fastest supercomputer. Equally at home on a laptop or a supercomputer, the highly scalable algorithm removes the hardware bottlenecks that have kept data-rich applications, in cancer research, satellite imagery, social media networks, national security science and earthquake research, to name just a few, from being processed in full.
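The article does not include code, but the core idea, streaming manageable batches of an enormous matrix through memory while building a compact factorization, can be sketched. The Python sketch below uses mini-batch non-negative matrix factorization over a NumPy memory-mapped file; the function name, file layout, batch size and other parameters are illustrative assumptions, not the researchers' published implementation.

```python
# A minimal sketch of out-of-core matrix factorization by row batches.
# Assumption: the data set is a dense matrix stored as raw float32 on disk;
# only one batch of rows is ever resident in memory at a time.
import numpy as np

def batched_nmf(path, shape, rank=8, batch_rows=256, iters=30, eps=1e-9):
    """Factorize X (on disk) into non-negative W (m x rank) and H (rank x n),
    touching only one row batch of X per step."""
    m, n = shape
    X = np.memmap(path, dtype=np.float32, mode="r", shape=shape)  # never fully loaded
    rng = np.random.default_rng(0)
    W = rng.random((m, rank), dtype=np.float32)
    H = rng.random((rank, n), dtype=np.float32)

    for _ in range(iters):
        HHt = H @ H.T                                # small (rank x rank) matrix
        WtX = np.zeros((rank, n), dtype=np.float32)  # accumulators for the H update
        WtW = np.zeros((rank, rank), dtype=np.float32)

        for start in range(0, m, batch_rows):
            stop = min(start + batch_rows, m)
            Xb = np.asarray(X[start:stop])           # only this batch is read into RAM
            Wb = W[start:stop]                       # view: updates write back into W
            # multiplicative update for this batch's rows of W
            Wb *= (Xb @ H.T) / (Wb @ HHt + eps)
            # accumulate the statistics needed for the global H update
            WtX += Wb.T @ Xb
            WtW += Wb.T @ Wb

        H *= WtX / (WtW @ H + eps)                   # single global H update per pass
    return W, H

if __name__ == "__main__":
    # Tiny synthetic demo: write a random 2000 x 500 matrix to disk, then factorize it.
    demo_shape = (2000, 500)
    demo = np.random.default_rng(1).random(demo_shape, dtype=np.float32)
    demo.tofile("demo_matrix.bin")
    W, H = batched_nmf("demo_matrix.bin", demo_shape, rank=8)
    err = np.linalg.norm(demo - W @ H) / np.linalg.norm(demo)
    print(f"relative reconstruction error: {err:.3f}")
```

The point of the batching pattern is that peak memory is set by the batch size and the small factor matrices, not by the size of the full data set, which is why the same approach can run on a laptop or scale across a supercomputer's nodes.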

