Resources
The following is a list of resources for this class. Click on the titles to learn more.
The textbooks recommended for this class were chosen for their complementary focus on both theoretical foundations and practical applications of large language models (LLMs) in real-world scenarios.
AWS PartyRock is a free, no-code platform for experimenting with generative AI. While the demonstration is not a functional RAG system, it illustrates some of the underlying concepts.
The Syllabi Analysis DAIS is a Document-Driven Agentic Intelligence System that ingests the complete collection of class syllabi from all colleges and programs at Georgia State University, provided as PDF documents, and transforms them into a queryable semantic knowledge graph. The system is designed to serve academic program coordinators and curriculum analysts who need to ask complex questions that span hundreds of documents, such as which courses across the university cover a particular topic, how different colleges address the use of generative AI in their policies, what prerequisite chains lead to advanced courses, and where overlaps or gaps exist between programs.
An Application Programming Interface (API) is a set of rules and protocols that enables communication between different software systems, acting as a bridge to facilitate data exchange and interaction.
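As a concrete illustration, the sketch below shows the client side of a typical JSON web API: a structured request to a documented endpoint, followed by parsing the structured reply. The endpoint URL, the bearer token, and the response body are all hypothetical placeholders; the response is simulated rather than fetched over the network.

```python
import json
import urllib.request

# Build a request to a hypothetical JSON API endpoint. The URL and the
# Authorization token are placeholders, not a real service.
request = urllib.request.Request(
    "https://api.example.com/v1/courses?topic=generative+ai",
    headers={
        "Accept": "application/json",        # ask the server for JSON
        "Authorization": "Bearer MY_TOKEN",  # placeholder credential
    },
)

# The API contract says the server answers with structured data the
# client can parse. Here we simulate the response body instead of
# making a live network call.
simulated_body = '{"courses": [{"id": "CIS8695", "title": "Generative AI"}]}'
payload = json.loads(simulated_body)
print(payload["courses"][0]["id"])
```

The value of the API is the contract itself: as long as both sides agree on the endpoint, headers, and JSON shape, the two systems can evolve independently.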
No-code tools empower users to create autonomous AI systems without requiring programming expertise.
These platforms provide intuitive interfaces, making it easier to design and deploy AI-driven applications.
A list of open-source frameworks you can use to build your own agentic AI application:
Cross-origin resource sharing (CORS) is a mechanism for integrating applications. CORS defines a way for client web applications loaded in one domain to interact with resources in a different domain. This configuration is needed so that local model providers can serve web applications that were loaded from a different site.
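The server-side half of CORS can be sketched as a simple allow-list check: given the `Origin` header of an incoming browser request, the server either echoes it back in the CORS response headers or sends nothing, in which case the browser blocks the cross-origin response. The origins in the allow-list below are hypothetical examples.

```python
# Hypothetical allow-list of web origins permitted to call this server.
ALLOWED_ORIGINS = {"https://app.example.com", "http://localhost:3000"}

def cors_headers(origin: str) -> dict:
    """Return CORS response headers for a permitted origin, or {} to deny."""
    if origin not in ALLOWED_ORIGINS:
        return {}  # no CORS headers: the browser blocks the response
    return {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "GET, POST",
        "Access-Control-Allow-Headers": "Content-Type, Authorization",
    }

print(cors_headers("http://localhost:3000")["Access-Control-Allow-Origin"])
```

Local model servers expose the same idea through configuration; Ollama, for instance, reads its allowed origins from the `OLLAMA_ORIGINS` environment variable.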
Ollama is a tool designed to host large language models (LLMs) locally on macOS, Linux, and Windows systems, offering OpenAI-compatible APIs for seamless integration with existing applications. By running models locally, Ollama eliminates the need for cloud-based services, ensuring privacy and reducing costs.
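The sketch below shows what "OpenAI-compatible" means in practice: the same chat-completions request shape that cloud clients send can be pointed at the local Ollama server, which by default listens on `http://localhost:11434`. The model name `llama3.2` is an example and must match a model you have pulled locally; the live call is commented out so the sketch has no side effects.

```python
import json
import urllib.request

# An OpenAI-style chat-completions payload. "llama3.2" is an example
# model name; substitute any model pulled into your local Ollama.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "What is RAG in one sentence?"}],
}
request = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With an Ollama server running, the call below returns an OpenAI-style
# chat completion. It is commented out to keep the sketch offline.
# with urllib.request.urlopen(request) as response:
#     reply = json.loads(response.read())
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint mirrors the OpenAI API, existing OpenAI client libraries can typically be reused by changing only the base URL, which is what makes the local swap-in seamless.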