Secure AI models deployment at the edge with OpenVINO™ on Ubuntu containers

Key Value
image
image_width
image_height
meta_image https://assets.ubuntu.com/v1/422c53b2-intel_webinar.jpg
meta_copydoc
banner_class grad
webinar_code 517110

Building a secure software supply chain is the challenge facing developers of tomorrow’s applications. Deep learning models built in frameworks like PyTorch and TensorFlow require complex pipelines that draw on a wide range of resources: training the models, then testing, optimising, and deploying them in OCI images with OpenVINO™. This webinar focuses on the container deployment phase, building on the work done by Intel and Canonical to provide container images that are not only developer-friendly but also secure and stable.
By combining Ubuntu containers with the OpenVINO™ Model Server, developers can deploy and manage inference-as-a-service on a range of Intel® platforms using Kubernetes – from the edge to the cloud.
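As a rough sketch of what such a deployment can look like, a minimal Kubernetes Deployment for the OpenVINO™ Model Server might resemble the following. The model name, model path, and storage choice are illustrative assumptions for the example, not details from the webinar:

```yaml
# Minimal illustrative Deployment for OpenVINO Model Server (OVMS).
# Model name, model path, and volume are placeholder assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ovms
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ovms
  template:
    metadata:
      labels:
        app: ovms
    spec:
      containers:
        - name: ovms
          image: openvino/model_server:latest
          args:
            - "--model_name=resnet"         # assumed example model
            - "--model_path=/models/resnet" # assumed model location
            - "--port=9000"                 # gRPC inference port
          ports:
            - containerPort: 9000
          volumeMounts:
            - name: models
              mountPath: /models
      volumes:
        - name: models
          emptyDir: {}                      # placeholder; use real model storage
```

In practice the model files would come from persistent or cloud storage rather than an `emptyDir` volume, and a Service would expose the gRPC port to clients.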

Speakers:

  • Miłosz Żeglarski, Deep Learning Software Engineer, Intel
  • Ryan Loney, OpenVINO Product Manager, Intel
  • Valentin Viennot, Product Manager, Canonical