You can transition faster to the cloud with Nokia & AWS. Here’s how
How do you define ‘the cloud’? Some may say “compute, storage and network elsewhere.” Others have different definitions altogether.
After attending the recent Amazon Web Services (AWS) re:Invent 2017 conference in Las Vegas, it’s no surprise the definition isn’t nailed down, given how dramatically cloud computing has changed in the past five years. I saw firsthand just how much AWS’s advanced cloud computing offerings have evolved, with discussions spanning a host of new topics such as container management, serverless computing, data stream management and other new approaches.
Cloud computing doesn’t stop at advanced techniques to lower costs and simplify management of cloud resources. Now it’s moving into the cutting-edge world of computer vision, machine learning and edge computing. Customers around the world are excited about how these technologies can impact the way they operate their enterprise.
I was invited to speak at the conference about how technologies such as computer vision, machine learning and edge computing are enabling video analytics solutions. Let me illustrate what a typical video analytics solution offers today and how it compares to Nokia’s approach.
Nokia IMPACT Scene Analytics introduced at AWS re:Invent 2017
A typical network camera can produce 10 GB of traffic per day across the network. If you are connecting those cameras over a LAN (on a campus, at an airport, etc.) and you have enough of them locally, you probably send all those 24x7x365 video streams to a large local video analytics server, which then distills them down to the roughly one percent of the data that is actually relevant to users.
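To put that in numbers, here is a back-of-the-envelope calculation. The 10 GB/day and one-percent figures come from the paragraph above; the four-camera count anticipates the intersection example below and is otherwise an assumption:

```python
# Back-of-the-envelope: why streaming raw video over a WAN doesn't scale.
CAMERAS = 4               # e.g., one traffic intersection (assumed count)
GB_PER_CAMERA_DAY = 10    # raw traffic per network camera, per day
RELEVANT_FRACTION = 0.01  # roughly one percent of the data is relevant

raw_gb_per_day = CAMERAS * GB_PER_CAMERA_DAY
relevant_gb_per_day = raw_gb_per_day * RELEVANT_FRACTION
wasted_gb_per_day = raw_gb_per_day - relevant_gb_per_day

print(f"Raw uplink traffic:     {raw_gb_per_day} GB/day")
print(f"Actually relevant data: {relevant_gb_per_day:.1f} GB/day")
print(f"Transported needlessly: {wasted_gb_per_day:.1f} GB/day")
```

Even at this modest scale, almost all of the uplink bandwidth is spent moving video nobody will ever look at.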
The problem is that this model doesn’t scale well over a WAN. Think of a typical traffic intersection with four cameras: wired and wireless connectivity is too expensive to transport those non-stop video streams to a central video analytics server. So deciding where to apply the analytics depends on three “laws”:
- Law of physics – anomaly alerts don’t travel faster than the speed of light; there are real-world latency considerations
- Law of economics – there are financial constraints on compute, network and storage
- Law of the land – some use cases dictate where the data may reside, e.g., government data, private data, etc.
Time to move video analytics to the network edge
The solution is to put some video analytics at the edge of the network, close to the center of activity. If you can analyze the non-stop video streams at the edge and send only the relevant data up to the cloud, you can drastically reduce the uplink connectivity, storage and compute needed. As a result, total cost of ownership (TCO) decreases significantly.
This is where Nokia IMPACT Scene Analytics and AWS Greengrass work nicely together. IMPACT Scene Analytics is based on years of Bell Labs research and uses machine learning to detect anomalous motion. AWS Greengrass facilitates pushing object classification models from the cloud to the edge, allowing the edge compute to identify objects locally. The result is that the edge function can send real-time alerts of anomalous activity (such as a bicycle illegally crossing an intersection) along with metadata about the event (bicycle detected), while dramatically reducing the transport, storage and compute costs of traditional video analytics solutions.
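The edge-filtering pattern described above can be sketched roughly like this. This is a minimal illustrative sketch, not Nokia’s or AWS’s actual code: the `Detection` record, confidence threshold and `publish_to_cloud` callback are all hypothetical stand-ins for the locally-run model and the Greengrass-to-cloud messaging:

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Detection:
    label: str        # e.g., "bicycle", classified locally at the edge
    confidence: float
    anomalous: bool   # flagged by the local motion-analysis model

def edge_filter(detections: Iterable[Detection],
                publish_to_cloud: Callable[[dict], None],
                min_confidence: float = 0.8) -> int:
    """Run at the edge: forward only anomalous, confident events upstream.

    Raw video never leaves the site; only small metadata records do,
    which is what cuts the transport, storage and compute costs.
    """
    sent = 0
    for d in detections:
        if d.anomalous and d.confidence >= min_confidence:
            publish_to_cloud({"event": "anomaly", "object": d.label,
                              "confidence": d.confidence})
            sent += 1
    return sent

# Example: three detections, only one worth sending upstream.
events = []
n = edge_filter(
    [Detection("car", 0.95, False),      # normal traffic: suppressed
     Detection("bicycle", 0.91, True),   # illegal crossing: forwarded
     Detection("bicycle", 0.40, True)],  # low confidence: suppressed
    publish_to_cloud=events.append)
```

In a real deployment the callback would publish over MQTT to the cloud, but the shape of the win is the same: a few bytes of metadata per event instead of gigabytes of continuous video.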
Using edge video analytics can drastically reduce transport, storage and compute costs, while providing real-time anomaly alerting.
Is it a hot dog? Or not a hot dog?
I also attended a hands-on workshop at re:Invent, where I developed my own machine learning model for object detection, deployed it to an edge gateway and created a real-time action based on that model. I used the AWS DeepLens developer kit, which incorporates all the tools required to develop a video analytics capability in one package, eliminating the heavy lifting often required for this type of work.
In a nod to a recent episode of “Silicon Valley,” the situation comedy series about a startup in the valley, and just like in the series, I was able to train my project to detect something as specific and random as a hot dog! I probably won’t be retiring on sales of that app, but you get the idea… the potential is enormous.
Watch this video to learn more about Greengrass: At the Cutting Edge: AWS IOT & Greengrass for Multi-Access Edge
Learn more about Nokia IMPACT Scene Analytics on our website
Share your thoughts on this topic by replying below – or join the Twitter discussion with @nokianetworks using #Cloud #aws #DeepLens #reInvent #EdgeComputing #Analytics