Workshop: Chatbot Context Workflow: Exploring Advanced Search and LLM Integration
Workshop 1 - Vector and Hybrid Query Search Using Vector Embeddings in Azure AI Search
In this hands-on workshop, participants will explore the capabilities of Azure AI Search for implementing advanced vector and hybrid search techniques. Participants will learn to create a search index schema optimized for vector embeddings and configure vector search to enable high-precision information retrieval.
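For example, the index schema and vector search configuration might look like the following sketch. It assumes the azure-search-documents Python SDK (11.4 or later), 1536-dimensional embeddings (e.g. text-embedding-ada-002), and placeholder names such as workshop-index and hnsw-config.

```python
# A minimal sketch of the index-creation step, not a definitive implementation.
# Endpoint, key, and index names are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import (
    SearchIndex, SimpleField, SearchableField, SearchField, SearchFieldDataType,
    VectorSearch, HnswAlgorithmConfiguration, ExhaustiveKnnAlgorithmConfiguration,
    VectorSearchProfile, SemanticConfiguration, SemanticPrioritizedFields,
    SemanticField, SemanticSearch,
)

index_client = SearchIndexClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    credential=AzureKeyCredential("<your-admin-key>"),            # placeholder
)

fields = [
    SimpleField(name="id", type=SearchFieldDataType.String, key=True),
    SearchableField(name="content", type=SearchFieldDataType.String),
    SearchField(
        name="embedding",
        type=SearchFieldDataType.Collection(SearchFieldDataType.Single),
        searchable=True,
        vector_search_dimensions=1536,               # assumed embedding size
        vector_search_profile_name="vector-profile",
    ),
]

# Register both an HNSW (approximate) and an exhaustive KNN algorithm so either
# can be selected at query time.
vector_search = VectorSearch(
    algorithms=[
        HnswAlgorithmConfiguration(name="hnsw-config"),
        ExhaustiveKnnAlgorithmConfiguration(name="eknn-config"),
    ],
    profiles=[
        VectorSearchProfile(name="vector-profile", algorithm_configuration_name="hnsw-config")
    ],
)

# A semantic configuration enables the semantic and semantic hybrid queries used later.
semantic_search = SemanticSearch(
    configurations=[
        SemanticConfiguration(
            name="semantic-config",
            prioritized_fields=SemanticPrioritizedFields(
                content_fields=[SemanticField(field_name="content")]
            ),
        )
    ]
)

index = SearchIndex(
    name="workshop-index",
    fields=fields,
    vector_search=vector_search,
    semantic_search=semantic_search,
)
index_client.create_or_update_index(index)
```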
Through step-by-step guidance, participants will:
- Understand how to insert text and embeddings into a vector store using Azure AI Search (see the sketches after this list).
- Perform vector similarity searches and exhaustive k-nearest neighbor (KNN) searches for exact matches.
- Dive into multi-vector search techniques, including unweighted and weighted approaches, for enhanced retrieval flexibility.
- Combine semantic and vector search in hybrid configurations to deliver contextualized and meaning-aware results.
- Leverage semantic hybrid search to extract the most relevant documents with precision and clarity.
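As a starting point, the sketch below shows one way to load text and embeddings into the index created above. The embed() helper, the Azure OpenAI deployment name, and the endpoints are hypothetical placeholders.

```python
# A sketch of inserting text plus embeddings into the vector store, assuming the
# workshop-index created earlier. embed() is a hypothetical helper wrapping an
# Azure OpenAI embeddings deployment; any embedding model could be substituted.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",  # placeholder
    api_key="<your-openai-key>",                                       # placeholder
    api_version="2024-02-01",
)

def embed(text: str) -> list[float]:
    """Hypothetical helper: return the embedding vector for a piece of text."""
    response = openai_client.embeddings.create(
        input=[text],
        model="<your-embedding-deployment>",  # deployment name, e.g. text-embedding-ada-002
    )
    return response.data[0].embedding

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    index_name="workshop-index",
    credential=AzureKeyCredential("<your-key>"),                  # placeholder
)

documents = [
    {"id": "1", "content": "Azure AI Search supports vector and hybrid queries."},
    {"id": "2", "content": "LangSmith helps trace and debug LLM applications."},
]
# Attach an embedding to each document before uploading it to the vector store.
for doc in documents:
    doc["embedding"] = embed(doc["content"])

result = search_client.upload_documents(documents=documents)
print([r.succeeded for r in result])
```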
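Building on that client, the next sketch illustrates pure vector retrieval: an approximate (HNSW) similarity search, an exhaustive KNN search for exact matches, and a weighted multi-vector query. The weight parameter assumes a recent SDK and service API version; embed() and search_client are the hypothetical objects from the previous sketch.

```python
# Pure vector queries against the index above. search_client and embed() come
# from the previous sketch; all names remain placeholders.
from azure.search.documents.models import VectorizedQuery

query_vector = embed("How do I combine keyword and vector search?")

# 1. Vector similarity search using the approximate (HNSW) algorithm.
results = search_client.search(
    search_text=None,
    vector_queries=[
        VectorizedQuery(vector=query_vector, k_nearest_neighbors=3, fields="embedding")
    ],
    select=["id", "content"],
)

# 2. Exhaustive KNN: scan every vector for exact nearest neighbors (slower, precise).
exact_results = search_client.search(
    search_text=None,
    vector_queries=[
        VectorizedQuery(
            vector=query_vector, k_nearest_neighbors=3, fields="embedding", exhaustive=True
        )
    ],
    select=["id", "content"],
)

# 3. Multi-vector search: several query vectors, optionally weighted, fused into one ranking.
#    The weight parameter assumes a recent azure-search-documents version.
vector_a = embed("hybrid search")
vector_b = embed("semantic ranking")
multi_results = search_client.search(
    search_text=None,
    vector_queries=[
        VectorizedQuery(vector=vector_a, k_nearest_neighbors=3, fields="embedding", weight=2.0),
        VectorizedQuery(vector=vector_b, k_nearest_neighbors=3, fields="embedding", weight=0.5),
    ],
    select=["id", "content"],
)

for hit in results:
    print(hit["id"], hit["@search.score"], hit["content"])
```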
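Finally, the sketch below combines keyword and vector retrieval into a hybrid query and then adds the semantic ranker for a semantic hybrid query, reusing the semantic-config defined when the index was created.

```python
# Hybrid and semantic hybrid queries, reusing the hypothetical search_client and
# embed() from the earlier sketches.
from azure.search.documents.models import VectorizedQuery

question = "Which service combines keyword and vector retrieval?"
vector_query = VectorizedQuery(vector=embed(question), k_nearest_neighbors=3, fields="embedding")

# Hybrid search: full-text (BM25) and vector results are fused into a single ranking.
hybrid_results = search_client.search(
    search_text=question,
    vector_queries=[vector_query],
    select=["id", "content"],
    top=3,
)

# Semantic hybrid search: the fused results are re-ranked by the semantic ranker
# configured as "semantic-config" on the index.
semantic_results = search_client.search(
    search_text=question,
    vector_queries=[vector_query],
    query_type="semantic",
    semantic_configuration_name="semantic-config",
    select=["id", "content"],
    top=3,
)

for hit in semantic_results:
    print(hit["@search.reranker_score"], hit["content"])
```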
Large Language Models (LLMs) can engage in human-like conversations, a feature that popularized them immediately upon their introduction. The ability of LLMs to interpret unstructured human input also prompted developers to explore their use in managing program flow. This workshop explores the key elements of designing agentic LLMs that converse with systems instead of humans.
Participants will learn about output formatting and output parsing, ensuring the LLM returns output that the system can interpret. Participants will also use LangSmith to analyze LLM invocations, a useful tool for debugging programs that call an LLM multiple times in quick succession.
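A minimal sketch of what this can look like with LangChain is shown below: a Pydantic schema defines the machine-readable output, a PydanticOutputParser validates it, and LangSmith tracing is switched on through environment variables. The deployment, key, and project names are placeholders, and the RouteDecision schema is purely illustrative.

```python
# A sketch of structured output parsing plus LangSmith tracing, assuming
# langchain-core, langchain-openai, and an Azure OpenAI chat deployment.
# AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY are assumed to be set.
import os

from pydantic import BaseModel, Field
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import AzureChatOpenAI

# These environment variables send every LLM invocation in the chain to LangSmith,
# where each call can be inspected and debugged individually.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-key>"      # placeholder
os.environ["LANGCHAIN_PROJECT"] = "chatbot-context-workshop"  # placeholder

class RouteDecision(BaseModel):
    """Illustrative schema: the machine-readable answer the system expects from the LLM."""
    action: str = Field(description="One of: 'search', 'answer', 'clarify'")
    query: str = Field(description="The query to run if action is 'search'")

parser = PydanticOutputParser(pydantic_object=RouteDecision)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You route user requests for a search chatbot.\n{format_instructions}"),
    ("human", "{user_input}"),
]).partial(format_instructions=parser.get_format_instructions())

llm = AzureChatOpenAI(
    azure_deployment="<your-chat-deployment>",  # placeholder, e.g. a gpt-4o deployment
    api_version="2024-02-01",
)

# prompt -> LLM -> parser: the parser raises an error if the model's output does
# not match the schema, so the calling code never has to interpret free-form text.
chain = prompt | llm | parser

decision = chain.invoke({"user_input": "Find docs about hybrid search in Azure AI Search."})
print(decision.action, decision.query)
```

Each invocation of this chain appears as a trace in the configured LangSmith project, which makes it practical to inspect the prompt, raw model output, and parsed result when the chain is run many times in quick succession.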