Category: machine learning

Let’s meet at AI4 and talk about AI infrastructure with open source

Date: 11–13 August 2025 | Booth: 353

You know the old saying: what happens in Vegas… transforms your AI journey with trusted open source. On August 11–13, Canonical is back at AI4 2025 to share the secrets of building secure, scalable AI infrastructure to accelerate every stage of your machine learning […]

Accelerating AI with open source machine learning infrastructure

The landscape of artificial intelligence is rapidly evolving, demanding robust and scalable infrastructure. To meet these challenges, we’ve developed a comprehensive reference architecture (RA) that combines open source tools with cutting-edge hardware. This architecture, built on Canonical’s MicroK8s and Charmed Kubeflow, running on Dell PowerEdge R7525 servers, and accelerated by NVIDIA NIM microservices, […]
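To give a flavour of how such an architecture is consumed once deployed, the short Python sketch below queries an NVIDIA NIM microservice through its OpenAI-compatible API. The endpoint URL and model identifier are assumptions for illustration only, not values taken from the reference architecture.

```python
# Minimal sketch: querying a locally hosted NVIDIA NIM microservice.
# Assumptions: a NIM container is already running and exposes an
# OpenAI-compatible endpoint at http://localhost:8000/v1, and the model
# name below is a placeholder for whichever NIM you deployed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # placeholder model identifier
    messages=[{"role": "user", "content": "Summarise what Charmed Kubeflow is in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```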

Experiment Tracking with MLflow in Canonical’s Data Science Stack

Welcome back, data scientists! In my previous post, we explored how easy it is to set up a machine learning environment with Canonical’s Data Science Stack (DSS) and run your first model using Hugging Face’s Smol Course. Today, let’s take it a step further with experiment tracking. Experimentation is at the heart of data science, […]
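To make the idea concrete, here is a minimal MLflow tracking sketch in Python. The tracking URI, experiment name and logged values are placeholders, assuming an MLflow tracking server (such as the one bundled with DSS) is reachable.

```python
# Minimal sketch: logging an experiment run to MLflow.
# Assumptions: an MLflow tracking server is reachable at the URI below
# (placeholder), and the parameter/metric values are illustrative only.
import mlflow

mlflow.set_tracking_uri("http://localhost:5000")   # placeholder tracking server URI
mlflow.set_experiment("smol-course-experiments")   # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("learning_rate", 3e-4)
    mlflow.log_param("epochs", 3)
    # ... train your model here, then record what you measured ...
    mlflow.log_metric("validation_loss", 0.42)     # illustrative value
```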

Large language models (LLMs): what, why, how?

Large language models (LLMs) are machine learning models specialised in understanding and generating natural language. They became famous once ChatGPT was widely adopted around the world, but they have applications beyond chatbots: LLMs are well suited to generating translations or content summaries. This blog will explain large language models (LLMs), including their benefits, challenges, famous projects and what the […]
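As a small illustration of the summarisation use case, the sketch below runs a compact open LLM through Hugging Face’s transformers pipeline. The model name is an assumption; any instruction-tuned checkpoint you have access to would work the same way.

```python
# Minimal sketch: one-shot summarisation with a small open LLM.
# Assumption: the model name is a placeholder for a compact
# instruction-tuned checkpoint; swap in whatever you have available.
from transformers import pipeline

generator = pipeline("text-generation", model="HuggingFaceTB/SmolLM2-135M-Instruct")

prompt = (
    "Summarise in one sentence: Large language models are machine learning models "
    "trained on huge text corpora; beyond chatbots they can translate, summarise "
    "and answer questions about text.\nSummary:"
)
result = generator(prompt, max_new_tokens=40, do_sample=False)
print(result[0]["generated_text"])
```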