Resources
The following is a list of resources for this class. Click on the titles to learn more.
The two textbooks recommended for this class were chosen for their complementary focus on both theoretical foundations and practical applications of large language models (LLMs) in real-world scenarios.
This guide provides a comprehensive walkthrough for setting up and utilizing Docker Desktop on Windows and macOS systems. It details specific hardware requirements, such as virtualization support and memory needs, alongside step-by-step instructions for graphical and command-line installations.
An Application Programming Interface (API) is a set of rules and protocols that enables communication between different software systems, acting as a bridge to facilitate data exchange and interaction.
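As an illustration of this request/response exchange, the following minimal Python sketch calls a hypothetical REST endpoint (the `api.example.com` URL and its `city` parameter are placeholders, not a real service) and parses the JSON data it returns:

```python
import json
import urllib.request

# Hypothetical endpoint for illustration only; a real API's documented URL
# and parameters would go here.
url = "https://api.example.com/v1/weather?city=Boston"

# The client sends an HTTP request; the API responds with structured JSON,
# which is the "data exchange" the two systems agree on.
with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))

print(data)
```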
No-code tools empower users to create autonomous AI systems without requiring programming expertise.
These platforms provide intuitive interfaces, making it easier to design and deploy AI-driven applications.
A list of open-source frameworks you can use to build your own agentic AI application:
Cross-origin resource sharing (CORS) is a mechanism for integrating applications. CORS defines a way for client web applications loaded in one domain to interact with resources in a different domain. This configuration is needed to allow local model providers to serve web applications that were loaded from a different site.
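To make the mechanism concrete, here is a minimal sketch of a local HTTP server (standard-library Python, not any particular model provider's code) that sends the CORS response headers a browser checks before letting a page from another origin read the response. The wildcard origin is for illustration; in practice you would list only the sites you trust, and local providers typically expose a similar setting (for example, Ollama reads allowed origins from the OLLAMA_ORIGINS environment variable).

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class CORSHandler(BaseHTTPRequestHandler):
    def _send_cors_headers(self):
        # The browser inspects these headers before exposing the response
        # to a page that was loaded from a different origin.
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")

    def do_OPTIONS(self):
        # Preflight request the browser sends before certain cross-origin calls.
        self.send_response(204)
        self._send_cors_headers()
        self.end_headers()

    def do_GET(self):
        self.send_response(200)
        self._send_cors_headers()
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "ok"}')

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CORSHandler).serve_forever()
```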
Ollama is a tool designed to host large language models (LLMs) locally on macOS, Linux, and Windows systems, offering OpenAI-compatible APIs for seamless integration with existing applications. By running models locally, Ollama eliminates the need for cloud-based services, ensuring privacy and reducing costs.
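Because Ollama exposes an OpenAI-compatible API, existing client code can be pointed at the local server with only a base-URL change. The sketch below assumes Ollama is running on its default port (11434) and that a model such as "llama3" has already been pulled locally; substitute whichever model you have installed.

```python
from openai import OpenAI

# Point the standard OpenAI client at the local Ollama server instead of the cloud.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # the client requires a key, but Ollama does not validate it
)

response = client.chat.completions.create(
    model="llama3",  # assumes this model has been pulled locally
    messages=[{"role": "user", "content": "Explain what an LLM is in one sentence."}],
)

print(response.choices[0].message.content)
```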