Llama for Python Programmers

Self Paced • Online
Presented by Michigan Online

Learn how to leverage the Llama 2 large language model (LLM) and how open-source LLMs can run on self-hosted hardware, made possible through techniques such as quantization using the llama.cpp package. In this online course, you will explore how Meta’s Llama 2 fits into the larger AI ecosystem and how you can use it to develop Python-based LLM applications. Gain hands-on skills in improving and constraining Llama 2 output, and learn how to achieve more robust data interchange between Python application code and LLM inference. Understand the different Llama 2 model variants, how they were trained, and how to interact with these models in Python.
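
The course centers on running Llama 2 locally from Python via llama.cpp. As a rough illustration of the kind of workflow involved, below is a minimal sketch using the llama-cpp-python bindings to load a quantized Llama 2 model and generate a completion; the model file name and parameter values are assumptions for illustration, not course materials.

# Minimal sketch (assumption): querying a locally stored, quantized Llama 2
# model with the llama-cpp-python bindings (pip install llama-cpp-python).
from llama_cpp import Llama

# Path to a quantized GGUF model file downloaded separately (hypothetical name).
llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",  # 4-bit quantized weights
    n_ctx=2048,  # context window size in tokens
)

# Run a single completion and read back the generated text.
response = llm(
    "Q: What does quantization do for a large language model? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(response["choices"][0]["text"])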

This course is free for U-M alumni, students, faculty, and staff.
