Time and Date: 2:30pm-3:30pm, Oct 11th, 2023
Location: Track 7, Room Jeanne D’Arc, Taets Art & Event Park, Amsterdam, Netherlands
Building an AI factory demands an integrated suite of tools. Such a setup empowers large teams, consisting of data scientists, researchers, data engineers, analysts, and other AI professionals, to collaboratively build AI models with an emphasis on scalability and methodical approaches.
However, creating an AI factory is challenging due to the intricate interplay of hardware and software. NVIDIA and Canonical have forged an end-to-end solution to simplify the development and deployment of machine learning models. This solution facilitates collaboration among AI developers, while abstracting complexities related to hardware, drivers, and other foundational software.
In this workshop, participants will discover:
- How to scale AI efforts, transitioning from locally running AI containers to an open-source MLOps platform, powered by NVIDIA AI containers and Charmed Kubeflow.
- The customization and deployment process of large language models (LLMs), starting with a foundational model.
- An in-depth look at the components used, emphasising security enhancements, reproducibility, and integration techniques.
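The first step above, running an AI container locally, can be sketched as follows. This is an illustrative example only: the image tag is one published in the NVIDIA NGC catalog around the time of the workshop, and the command assumes Docker with the NVIDIA Container Toolkit is already installed.

```shell
# Pull and run an NVIDIA PyTorch container from the NGC catalog
# (example tag; requires Docker and the NVIDIA Container Toolkit)
docker run --rm -it --gpus all \
    nvcr.io/nvidia/pytorch:23.09-py3 \
    python -c "import torch; print(torch.cuda.is_available())"
```

From a starting point like this, the workshop shows how the same workloads move onto Charmed Kubeflow for team-scale, reproducible pipelines.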
By the workshop’s conclusion, attendees will:
- Possess a comprehensive understanding of a stack combining both hardware and software tailored for LLMs.
- Be adept at using open-source tools for developing and deploying LLMs.
- Grasp methods to navigate and surmount AI scaling challenges.
- Have the opportunity to direct questions to NVIDIA and Canonical experts in real time.
Agenda:
Duration: 60 minutes
- Introduction to LLMs
- Architecture of the end-to-end solution from hardware layer to open source libraries
- Demo of the solution
- Walkthrough of customization & deployment of the LLM for the chosen use case
- Live Q&A
What you need to bring:
A laptop to start building your project hands-on:
- Access to a cloud-based environment or a machine with 32GB of RAM
- MicroK8s installed (works on Mac/Windows/Linux); follow the tutorial at: https://microk8s.io/docs/install-alternatives
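On Ubuntu, the MicroK8s setup can be sketched as below, roughly following Canonical's Charmed Kubeflow quickstart; the channel versions and the MetalLB address range are illustrative assumptions, so check the linked tutorial and the Charmed Kubeflow documentation for values current to your system (Mac and Windows use the alternative installers from the tutorial instead of snap).

```shell
# Install MicroK8s from a snap and enable the add-ons
# Charmed Kubeflow relies on (channel and MetalLB range are examples)
sudo snap install microk8s --classic
sudo microk8s enable dns hostpath-storage metallb:10.64.140.43-10.64.140.49

# Deploy Charmed Kubeflow with Juju on top of MicroK8s
sudo snap install juju --classic
juju bootstrap microk8s
juju add-model kubeflow
juju deploy kubeflow --trust
```

Having this environment ready before the session lets you follow the hands-on walkthrough instead of waiting on downloads.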