Azure AI Foundry is a comprehensive platform that streamlines AI solution development and deployment. In this chapter, you'll discover how to use hubs for building and testing AI solutions, projects for organizing and deploying your work, and tools for managing resources, all while following responsible AI practices.
Azure AI services provide a comprehensive suite of out-of-the-box and customizable AI tools, APIs, and pre-trained models that detect sentiment, recognize speakers, analyze images, and more. Azure AI Foundry brings these services together into a single, unified development environment.
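As a small taste of those ready-made services, here is a minimal sketch that calls the sentiment detection capability of the Azure AI Language service through the azure-ai-textanalytics Python SDK; the endpoint and key are placeholders for your own resource.

```python
# Minimal sketch: sentiment detection with the Azure AI Language service.
# The endpoint and key are placeholders for your own Azure AI resource.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

documents = ["The new dashboard is fantastic, but the login flow is confusing."]
result = client.analyze_sentiment(documents=documents)[0]

print(result.sentiment)                      # overall sentiment, e.g. "mixed"
for sentence in result.sentences:
    print(sentence.text, sentence.sentiment)  # per-sentence breakdown
```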
This module introduces Azure OpenAI and the GPT family of Large Language Models (LLMs). You'll learn about the available LLM models, how to configure and use them in the Azure Portal, and the Transformer architecture behind models like GPT-4. The latest GPT models support Function Calling, which connects them to external tools, services, or code, allowing you to create AI-powered Copilots. Additionally, you'll discover how Azure OpenAI provides a secure way to use LLMs without exposing your company's private data.
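To make Function Calling concrete, here is a minimal sketch using the openai Python SDK against an Azure OpenAI deployment; the endpoint, key, API version, deployment name, and the get_order_status function are illustrative placeholders, not values from the course.

```python
# Minimal sketch: declaring a function (tool) that a GPT model may decide to call.
# Endpoint, key, API version and deployment name are placeholders.
import json
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-key>",
    api_version="2024-06-01",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",          # hypothetical function exposed to the model
        "description": "Look up the shipping status of a customer order.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="<your-gpt-deployment>",           # name of your model deployment
    messages=[{"role": "user", "content": "Where is order 1234?"}],
    tools=tools,
)

# If the model decides to call the function, it returns the name and JSON arguments;
# your own code then executes the function and feeds the result back to the model.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
```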
The cost and quality of your AI-powered app depend largely on your choice of AI model and how you deploy it. Learn about the available model catalog, featuring state-of-the-art Azure OpenAI models and open-source models from Hugging Face, Meta, Google, Microsoft, Mistral, and many more.
Azure AI Search enables the Retrieval Augmented Generation (RAG) design pattern, enhancing an LLM's knowledge with your own company-specific data. This chapter explores the RAG design pattern by incorporating Azure AI Search into your LangChain/Prompt Flow Python applications.
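As a sketch of what that looks like in code, the snippet below wires an Azure AI Search vector index into a LangChain retrieval step; the index name, deployment names, and keys are placeholders, and Azure OpenAI credentials are assumed to be set via the usual environment variables.

```python
# Minimal RAG sketch with LangChain and Azure AI Search (placeholder names and keys).
# Azure OpenAI credentials are read from AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY
# and OPENAI_API_VERSION environment variables.
from langchain_openai import AzureOpenAIEmbeddings, AzureChatOpenAI
from langchain_community.vectorstores.azuresearch import AzureSearch

embeddings = AzureOpenAIEmbeddings(azure_deployment="<embedding-deployment>")

# Vector index holding your company-specific documents.
vector_store = AzureSearch(
    azure_search_endpoint="https://<your-search>.search.windows.net",
    azure_search_key="<search-key>",
    index_name="company-docs",
    embedding_function=embeddings.embed_query,
)

question = "What is our refund policy?"
docs = vector_store.similarity_search(question, k=3)      # retrieval step
context = "\n\n".join(d.page_content for d in docs)

llm = AzureChatOpenAI(azure_deployment="<chat-deployment>")
answer = llm.invoke(                                        # augmented generation step
    f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```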
This chapter covers two powerful Python libraries for AI applications: Prompt Flow and LangChain. Prompt Flow streamlines the design and deployment of prompt-based workflows, optimizing AI processes. LangChain simplifies building applications with LLMs through open-source components and quick integrations. Learn how to expose Python functions to LLMs as plugins so they can interact with the outside world, enabling you to build your own Copilots!
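Here is a minimal sketch of that idea in LangChain: a plain Python function is exposed to the model as a tool; the get_weather function and the deployment name are hypothetical placeholders, and credentials are assumed to come from environment variables.

```python
# Minimal sketch: exposing a Python function to an LLM as a LangChain tool.
# The deployment name is a placeholder; Azure OpenAI credentials come from env vars.
from langchain_core.tools import tool
from langchain_openai import AzureChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""      # hypothetical example tool
    return f"It is 21°C and sunny in {city}."

llm = AzureChatOpenAI(azure_deployment="<chat-deployment>")
llm_with_tools = llm.bind_tools([get_weather])

# The model does not run the function itself; it tells you which tool to call
# and with which arguments, and your code performs the call.
response = llm_with_tools.invoke("What's the weather like in Ghent?")
print(response.tool_calls)   # e.g. [{'name': 'get_weather', 'args': {'city': 'Ghent'}, ...}]
```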
In this chapter, you'll explore advanced techniques for controlling the model's output, transforming generic responses into precise, valuable results. Additionally, the chapter covers emerging design patterns in the field of generative AI app development that help you increase the quality of model responses and reduce costs.
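A small sketch of what "controlling the output" can mean in practice: sampling parameters and JSON mode on an Azure OpenAI chat completion. The endpoint, key, API version, and deployment name below are placeholders.

```python
# Minimal sketch: steering output with sampling parameters and JSON mode.
# Endpoint, key, API version and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-key>",
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="<your-gpt-deployment>",
    messages=[
        {"role": "system", "content": "Extract the product and sentiment as JSON."},
        {"role": "user", "content": "The new headset sounds great but feels cheap."},
    ],
    temperature=0,                            # low temperature -> more deterministic output
    max_tokens=100,                           # cap response length (and cost)
    response_format={"type": "json_object"},  # ask the model to return valid JSON
)

print(response.choices[0].message.content)
```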
How can you ensure an LLM provides relevant and coherent answers to users' questions using the correct information? How do you prevent an LLM from responding inappropriately? Discover the answers to these questions and more by exploring evaluation metrics in Azure AI Foundry and the Azure AI Content Safety service.
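For a first impression of the Content Safety side, here is a minimal sketch that screens a piece of text with the azure-ai-contentsafety Python SDK; the endpoint and key are placeholders for your own resource.

```python
# Minimal sketch: screening text with the Azure AI Content Safety service.
# The endpoint and key are placeholders for your own Content Safety resource.
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

result = client.analyze_text(AnalyzeTextOptions(text="Some user-generated text to check."))

# Each harm category (hate, violence, sexual, self-harm) gets a severity score
# that you can compare against your own thresholds before showing or storing the text.
for item in result.categories_analysis:
    print(item.category, item.severity)
```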
Ensuring your AI app behaves as expected doesn't end at deployment. It's crucial to monitor its interactions with users while it's running in production. Learn how Azure AI Foundry integrates with industry standards like OpenTelemetry to give you a clear and transparent view of your app's behavior.
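To illustrate the OpenTelemetry side, here is a minimal sketch that wraps an LLM call in a trace span; the tracer name, attribute names, and the call_llm stub are illustrative placeholders, and exporting the spans (for example to Azure Monitor) is assumed to be configured elsewhere.

```python
# Minimal sketch: wrapping an LLM call in an OpenTelemetry span.
# Tracer name and span attributes are illustrative; exporter setup is assumed elsewhere.
from opentelemetry import trace

tracer = trace.get_tracer("my-ai-app")           # hypothetical service name

def call_llm(question: str) -> str:
    # Placeholder for your actual model call.
    return "stub answer"

def answer_question(question: str) -> str:
    with tracer.start_as_current_span("answer_question") as span:
        span.set_attribute("app.prompt.length", len(question))
        answer = call_llm(question)
        span.set_attribute("app.response.length", len(answer))
        return answer

print(answer_question("What does this course cover?"))
```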
This chapter explores the advantages of fine-tuning pre-trained LLMs for higher accuracy and customized behavior compared to Retrieval Augmented Generation (RAG). While RAG offers dynamic updates and cost-effectiveness, fine-tuning provides superior precision for specialized tasks, making it ideal for achieving domain-specific results.
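For orientation, a fine-tuning job on Azure OpenAI can be started with a few SDK calls; the sketch below uses the openai Python SDK, where the training file name, base model, endpoint, key, and API version are placeholders you would adapt to your own environment.

```python
# Minimal sketch: launching a fine-tuning job on Azure OpenAI.
# File name, base model, endpoint, key and API version are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-key>",
    api_version="2024-06-01",
)

# Upload a JSONL dataset of example conversations (chat format).
training_file = client.files.create(
    file=open("training_data.jsonl", "rb"),
    purpose="fine-tune",
)

# Start the fine-tuning job on a fine-tunable base model (assumed name).
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-35-turbo-0125",
)
print(job.id, job.status)
```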
This course equips participants to design, develop, and deploy AI solutions using Azure AI Foundry. You'll learn to collaborate on projects, manage resources, and use advanced AI techniques such as prompt engineering, Retrieval Augmented Generation, and Python AI orchestration frameworks like Prompt Flow and LangChain. The course also covers fine-tuning models for accuracy, ensuring responsible AI practices, and monitoring applications in production.
This course is designed for developers, data scientists, and AI operators looking to leverage the full AI app development toolset provided by Azure AI Foundry. A basic understanding of Python is recommended.