
Meet us at World AI 2023

Andreea Munteanu

on 14 September 2023



The Canonical AI Roadshow has started. Meet us around the globe.

Date: 11-12 October 2023

Location: Taets Art & Event Park, Amsterdam, Netherlands

Booth: A24

The Canonical AI Roadshow is taking off. Generative AI, large language models (LLMs) and predictive analytics are shaping the future of technology. Experience the latest advances in these areas at the World AI Summit, one of the largest global AI events.

At the conference, AI leaders, developers, creators and students will discover how to use open source technologies to elevate their AI story. From getting started to enabling large teams to build reproducible AI projects, we will cover a variety of topics during the event.

Canonical AI Roadshow at World AI Summit

World AI Summit gathers the global AI ecosystem to discuss the latest innovations from the industry, the biggest challenges that enterprises face, and groundbreaking stories about AI.

Our team of experts will travel to Amsterdam as part of the Canonical AI Roadshow to give talks about large language models (LLMs), deliver a joint workshop with NVIDIA and answer your questions about AI, machine learning operations (MLOps) and open source. From genAI to predictive analytics, we will showcase a wide variety of demos for different industries. 

Bonus: Everyone who visits our booth can interact with a conversational assistant, similar to ChatGPT, built fully with open source components, to learn more about Ubuntu and Canonical. 

Build your LLM factory with NVIDIA and Canonical

Building an AI factory requires both powerful hardware and an integrated suite of tools. An AI factory should empower enterprises to run AI at scale, enabling professionals to collaboratively develop and deploy machine learning models. NVIDIA and Canonical put together an end-to-end solution to simplify these activities, addressing the entire ML lifecycle.
Michael Balint, Senior Manager at NVIDIA, Maciej Mazur, AI/ML Principal Engineer, and I have put together an exciting workshop on building an AI factory with open source tooling. We will zoom in on a use case that showcases large language models on DGX, with Charmed Kubeflow and NGC containers.
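To give a flavour of what this looks like in practice, here is a minimal sketch (not the workshop material itself) of how an LLM fine-tuning step could be expressed as a Kubeflow pipeline that runs an NVIDIA NGC container on GPU nodes. It assumes the Kubeflow Pipelines v1 SDK; the container tag, the finetune.py script, the dataset URI and the GPU count are illustrative assumptions.

```python
# A hypothetical Kubeflow pipeline: one fine-tuning step running inside an
# NGC PyTorch container on GPU-backed nodes (e.g. a DGX system).
from kfp import dsl
import kfp.compiler as compiler


@dsl.pipeline(
    name="llm-finetune",
    description="Fine-tune a language model in an NGC container on GPU nodes",
)
def llm_finetune_pipeline(dataset_uri: str = "s3://datasets/corpus"):
    finetune = dsl.ContainerOp(
        name="finetune",
        image="nvcr.io/nvidia/pytorch:23.08-py3",  # example NGC image tag
        command=["python", "finetune.py"],          # hypothetical training script
        arguments=["--dataset", dataset_uri],
    )
    finetune.set_gpu_limit(8)  # e.g. one DGX node worth of GPUs


if __name__ == "__main__":
    # Compile to a pipeline package that can be uploaded to Charmed Kubeflow.
    compiler.Compiler().compile(llm_finetune_pipeline, "llm_finetune.yaml")
```

The resulting llm_finetune.yaml can be uploaded through the Kubeflow Pipelines UI or API and scheduled onto GPU nodes by Kubernetes.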

LLMs from 0 to hero

Large language models (LLMs) are gaining popularity. Maciej Mazur and I will deliver a session on automating LLM fine-tuning and developing chat-based and multimodal model assistants. The session will highlight the potential of LLMs across different industries, touching on common pitfalls and how to overcome them.

This is a technical deep dive into large language models, which will refer to open source tooling such as Kubeflow, MLflow and Spark. At the end of this talk, you will know how to use LLMs with open source tools, and have a better understanding of the major challenges they present.
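As a small taste of that tooling, here is a minimal sketch of tracking an LLM fine-tuning run with MLflow so experiments stay reproducible across a team. It is not the talk's content: the tracking URI, experiment name, base model, hyperparameters and metric value are all illustrative assumptions.

```python
# Hypothetical example: log the parameters and an evaluation metric of an
# LLM fine-tuning run to an MLflow tracking server.
import mlflow

mlflow.set_tracking_uri("http://mlflow.example.internal:5000")  # assumed server
mlflow.set_experiment("llm-finetuning")

with mlflow.start_run(run_name="flan-t5-customer-support"):
    mlflow.log_param("base_model", "google/flan-t5-base")  # example base model
    mlflow.log_param("learning_rate", 2e-5)
    mlflow.log_param("epochs", 3)
    # ... fine-tuning would happen here (e.g. a Hugging Face Trainer loop) ...
    mlflow.log_metric("eval_loss", 1.23)  # placeholder value
```

Every run logged this way shows up in the MLflow UI, making it easy to compare fine-tuning experiments and pick the best checkpoint.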

What is the Canonical AI Roadshow?

The Canonical AI Roadshow is a series of events highlighting generative AI use cases powered by open source software. The roadshow will take us around the globe between mid-September and mid-November to talk about the latest innovations from the industry, demo some of the most exciting use cases from various industries, including financial services, telco and oil and gas, and answer questions about AI, MLOps, big data and more. We will stop in:

  • Europe 
  • North America 
    • Austin, Texas
    • Chicago, Illinois
    • Las Vegas, Nevada
  • Middle East and Africa  
  • Central and South America
    • São Paulo, Brazil
