
Aurimas Griciūnas

Chief Product Officer
NEPTUNE AI

PRESENTATION TITLE:

Observability in LLMOps pipeline - different levels of scale

PRESENTATION SUMMARY:

The term LLMOps has been around for just under two years. In that time, we have gone from using fine-tuned foundation models to running complex Agentic AI systems in production. Many breakthroughs are yet to be made, and hardware limitations remain to be overcome.


In this talk, I will walk you through the LLMOps pipeline as if we were building an AI system from scratch, from building and fine-tuning foundation models to observing complex Agentic AI systems in production. I will emphasize the different scalability requirements for observability infrastructure and tooling at each step and explain why you might want to invest in it.

About | Aurimas Griciūnas

Aurimas is the Chief Product Officer at neptune.ai, where he and his team are building the most scalable experiment tracker on the market.

He has over a decade of work experience in various data-related fields, including Data Analytics, Data Science, Machine Learning, Data Engineering, and Cloud Engineering. For more than four years, he has led teams working with Data and Infrastructure.

Aurimas also shares his data and infrastructure knowledge as the Founder and CEO of SwirlAI.