Industry News – AWS re:Invent Week Two Recap
Week two of AWS’ three-week virtual re:Invent conference comes to a close today. At this year’s event, AWS has already debuted a number of innovations and new products. This week’s keynote from Swami Sivasubramanian, VP of Amazon AI, was all about machine learning (ML), and the overall theme, as Sivasubramanian noted, was making ML easier, or in his words, “writing without a line of code.” Here are our key takeaways:
SageMaker Takes Center Stage
Introduced at re:Invent 2017, Amazon SageMaker aims to make ML easier. Each year, AWS has continued to add functionality to make that a reality, and this year was no different. Over the past 12 months, AWS has released 50 enhancements for SageMaker, making it one of the company’s fastest-growing products. Here are four SageMaker announcements made this week that will accelerate ML adoption:
- Amazon SageMaker will now parallelize workloads for you, providing an 80% reduction in the time it takes to process data and train models. Parallelizing data access and model training has traditionally been difficult to achieve without deep expertise. This improvement will make it much easier and faster to identify patterns in data.
- AWS announced Amazon SageMaker Data Wrangler to speed up the process of preparing data for use in ML. Typically, 80% of the total time spent on an ML project is devoted to data preparation. Data Wrangler is designed to cut that time significantly by applying pre-built data clean-up transformations and letting you visualize the data for examination before use.
- We also saw the introduction of Amazon SageMaker Pipelines, which allows the creation of fully automated workflows for ML projects. It brings continuous integration (CI) and continuous delivery (CD) practices to the ML process, automating the deployment of ML models without having to write code and removing much of the manual work required of developers.
- AWS also launched Amazon SageMaker Edge Manager, which, according to AWS, enables you to “optimize ML models for edge devices.” With Edge Manager, users can manage multiple models deployed to edge devices and continuously monitor those models. This week’s announcement builds upon SageMaker Neo, released by AWS several years ago, which only allowed users to manage one model deployed to an edge device.
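SageMaker’s distributed training runs as a managed service, so its mechanics stay behind the API. As a purely conceptual sketch of the data-parallel idea behind that 80% speedup claim, the toy Python below (all names are ours, not SageMaker’s) shards a dataset across worker processes and combines the partial results:

```python
from concurrent.futures import ProcessPoolExecutor


def process_shard(shard):
    # Stand-in for per-worker computation (here: a sum of squares);
    # in real training this would be a forward/backward pass on a batch.
    return sum(x * x for x in shard)


def parallel_process(data, workers=2):
    # Split the dataset into one shard per worker, process the shards
    # concurrently, then combine the partial results into one answer.
    shards = [data[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(process_shard, shards)
    return sum(partials)


if __name__ == "__main__":
    data = list(range(1_000))
    # The parallel result matches the sequential one; only wall-clock
    # time changes as the shard work runs concurrently.
    assert parallel_process(data) == sum(x * x for x in data)
```

The takeaway is only the shape of the approach: split, compute in parallel, recombine. SageMaker handles the hard parts (cross-machine data distribution and gradient synchronization) that this single-machine sketch ignores.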
ML Becomes More Accessible to Database and BI Developers
Gone are the days of having to export data to an ML tool before running models and analysis in SageMaker. AWS has demonstrated a continued commitment to enabling database and BI developers to use ML features directly from a database engine. While several tools supporting this goal, including Athena and Aurora, have been around since May, AWS announced several new developments this week that make ML even more accessible and easier to use for database developers.
- Improvements to the Amazon SageMaker Autopilot tool automate much of the work involved in creating ML models. Autopilot crunches data to “build, train and tune the best ML models based on your data.” By automating the modeling process, database and BI developers can leverage ML without extensive technical knowledge.
- The new Amazon QuickSight Q allows non-technical staff to query data without having to learn a query language. In yet another effort to bring ML to the masses, AWS has made it possible for users to ask questions in natural language, which QuickSight Q then uses ML to translate into a machine-processable query. This eliminates the need for users to learn how to write queries, saving valuable time and enabling non-technical professionals to benefit from the technology. As AWS puts it, “Ask questions of your data and receive answers in seconds.” Now that applies to more than just technical folks with deep expertise.
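QuickSight Q’s translation is performed by ML models AWS does not expose, but the end-to-end idea of turning a question into a query can be shown with a deliberately naive stand-in. The hypothetical `question_to_sql` function below only matches a few hardcoded keywords; it illustrates the interface, not how Q actually works:

```python
# Toy mapping from metric words in a question to SQL aggregate functions.
# A real service learns this mapping with ML; we hardcode it.
AGGREGATES = {"total": "SUM", "average": "AVG", "how many": "COUNT"}


def question_to_sql(question, table="sales", column="revenue"):
    """Translate a narrow class of English questions into SQL.

    `table` and `column` are illustrative defaults; QuickSight Q infers
    these from your datasets rather than taking them as parameters.
    """
    q = question.lower()
    for phrase, func in AGGREGATES.items():
        if phrase in q:
            return f"SELECT {func}({column}) FROM {table};"
    raise ValueError("question not understood")
```

For example, `question_to_sql("What was the total revenue last quarter?")` yields `SELECT SUM(revenue) FROM sales;`. The real service handles vastly broader phrasing, filters, and time ranges; the point here is only the contract: natural language in, executable query out.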
Focus on Pre-Built Solutions
AWS also announced several pre-built solutions this week, including:
- AWS introduced three “Lookouts” that can detect anomalies. The first, Amazon Lookout for Metrics, uses ML to detect anomalies in streams of metric data in IT systems. Here again, no ML experience is required to benefit from this tool. Building upon that idea, Amazon Lookout for Equipment also monitors for anomalies, but specifically in industrial equipment. Lookout for Equipment can monitor up to 300 sensors per device to detect problems before failure occurs. Finally, Amazon Lookout for Vision can spot differences in photos taken over time to know when something isn’t quite right. All three Lookout products are currently in preview.
- Two other featured products, Amazon Monitron and AWS Panorama, enable organizations to extend ML to legacy equipment and industrial settings. Monitron brings ML to older equipment through stick-on sensors; data from those sensors is fed into an ML model that scans for issues. Panorama is an ML appliance that, once plugged into an existing network, can “discover, connect to, and process video from networked cameras and run simultaneous machine learning models per stream.” AWS also announced a software development kit (SDK) that lets camera manufacturers embed Panorama’s features directly into their equipment.
- Finally, Amazon HealthLake gives healthcare companies and providers an all-in-one tool to consolidate data in a single place, analyze it, and produce reports. HealthLake is a HIPAA-compliant service built on a pre-built data lake that can ingest structured sensor data and combine it with unstructured notes and reports for reporting. The goal of HealthLake is to allow for “clinical data analysis powered by ML to improve care and reduce costs.” HealthLake is currently in preview.
Be sure to check back next week for our recap of AWS re:Invent’s final week! For a recap of week one, click here. To learn more about how Navisite can help you migrate and optimize your business on AWS, visit our AWS Managed Services page.