Deep Learning & AI Inference on Cloud Transport Devices?
Google Cloud Next is around the corner and the buzz around edge devices and cloud computing hasn’t abated.
At the AWS Summit in New York last week, Amazon announced support for compute on Snowball. Snowball Edge now supports Amazon EC2 instance types as well as AWS Lambda functions, which opens up a range of possibilities: customers can now develop custom applications for pre-processing data at the edge, and an entire ecosystem is likely to build up around the capability. Common use cases include data migration, data transport, image collation, IoT sensor stream capture, and machine learning.
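As an illustration, a pre-processing function on the device could follow the standard Lambda handler model. The sketch below is hypothetical: the `preprocess` helper, the event shape, and the field names are assumptions for illustration, not part of any announced API.

```python
import json

def preprocess(record):
    # Hypothetical pre-processing step: keep only the fields the cloud
    # pipeline needs, shrinking the payload before transport.
    return {"id": record["id"], "value": record["value"]}

def handler(event, context):
    # Standard Lambda entry point; on Snowball Edge, functions can be
    # triggered by object actions against the device's local endpoint.
    records = json.loads(event["body"])
    slim = [preprocess(r) for r in records]
    return {"statusCode": 200, "body": json.dumps(slim)}
```

Dropping unneeded fields (or compressing, deduplicating, or filtering) before the device ships means less data to transfer and ingest on the cloud side.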
Earlier this summer, Microsoft also announced Azure Data Box, which had been in preview for some time.
Most of these edge devices are capped at around 100 TB. So what if your data set is larger than that, or new edge data arrives faster than it can be pre-processed, transported, and ingested on the cloud side? CloudLanes can help: it recently announced support for multiple PODs with enhanced security and data guardianship for larger data sets. The combination of enhanced security with edge computing on large data sets could well change the paradigm of cloud computing.
So, what should we expect at Google Cloud Next? Is deep-learning pattern matching and inference on cloud transport devices just around the corner?