Computational Storage adds AI to eDiscovery
Computational Storage allows distributed compute to run locally on an NVMe SSD. Within the SSD controller, we have embedded a quad-core Arm subsystem capable of running an OS and applications natively on the drive, right where the data resides. All of this happens independently of the host CPU and GPU. The net result is that applications run faster, the system consumes less power, and costs drop as overall system efficiency increases.
The key to the technology is a simple programming model that lets real-world users easily implement applications that take advantage of it. Using standard, open-source technologies, Dan Pollack of Data Storage Science shows how he implemented a seamless and efficient AI platform for eDiscovery. Files come into the system in a wide range of formats and are pushed to an S3 object store. The drives themselves, without host intervention, are then triggered to run OCR on each file and output a text-searchable object, which is indexed.
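The flow above can be sketched as a tiny simulation: a put into the object store triggers an OCR hook (standing in for the on-drive engine), and the resulting text feeds a searchable index. Every name here is illustrative; the actual NGD Systems and S3 APIs are not shown.

```python
# Hypothetical sketch of the ingest pipeline: an object-store "put"
# triggers an OCR handler, whose text output lands in a search index.
# In the real system the OCR runs on the drive's Arm subsystem, not
# on the host CPU.

class ObjectStore:
    """Simulates an S3-style bucket whose drives run an OCR hook on write."""

    def __init__(self, ocr_hook):
        self.objects = {}
        self.index = {}           # word -> set of object keys
        self.ocr_hook = ocr_hook  # stand-in for the in-storage OCR engine

    def put(self, key, blob):
        self.objects[key] = blob
        text = self.ocr_hook(blob)  # OCR fires automatically on ingest
        for word in text.lower().split():
            self.index.setdefault(word, set()).add(key)

    def search(self, word):
        return sorted(self.index.get(word.lower(), set()))


def fake_ocr(blob):
    # Placeholder for real OCR: pretend the blob is already text.
    return blob.decode("utf-8", errors="ignore")


store = ObjectStore(fake_ocr)
store.put("doc1.tiff", b"Quarterly earnings report")
store.put("doc2.pdf", b"Earnings call transcript")
print(store.search("earnings"))  # -> ['doc1.tiff', 'doc2.pdf']
```

The point of the design is the trigger: the host only writes objects; indexing is a side effect of storage, so no extra CPU cycles are spent.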
No heavy lifting, no additional CPU cycles, no GPUs necessary. Simply let the application run on Computational Storage, and everything you store on that system is indexed and searchable.
Computational Storage is easy to implement if you use NGD Systems NVMe Computational Storage SSDs. Contact us (or Dan) if you would like to learn more about this or any of the hundreds of other applications that Computational Storage can enable.