In this chapter you will get a short overview of what AI is exactly and what you can do with it.
Azure AI Foundry is a comprehensive platform that streamlines AI solution development and deployment. In this chapter, discover how to use hubs for building and testing AI solutions, projects for grouping and deploying AI apps, and tools for managing resources, all while ensuring responsible AI practices are followed.
Azure AI Services provides a comprehensive suite of out-of-the-box and customizable AI tools, APIs, and pre-trained models that detect sentiment, recognize speakers, understand images, and much more. Azure AI Foundry brings these services together into a single, unified environment.
This chapter introduces Azure OpenAI and the GPT family of Large Language Models (LLMs). You'll learn about the available models, how to configure and use them in the Azure Portal, and the Transformer architecture behind models like GPT-4o. The latest GPT models offer Function Calling, enabling connections to external tools, services, or code, which allows you to create AI-powered Copilots. Additionally, you'll discover how Azure OpenAI provides a secure way to use LLMs without exposing your company's private data.
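As a taste of what working with Azure OpenAI looks like from C#, here is a minimal sketch assuming the Azure.AI.OpenAI 2.x SDK; the endpoint, API key and deployment name ("gpt-4o") are placeholders for values from your own Azure OpenAI resource.

```csharp
using System;
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI.Chat;

// Placeholder values: use the endpoint, key and deployment name of your own Azure OpenAI resource.
var client = new AzureOpenAIClient(
    new Uri("https://my-resource.openai.azure.com/"),
    new ApiKeyCredential("<your-api-key>"));

// "gpt-4o" is the name you gave the model deployment in Azure AI Foundry.
ChatClient chat = client.GetChatClient("gpt-4o");

ChatCompletion completion = chat.CompleteChat(
    new SystemChatMessage("You are a helpful assistant for .NET developers."),
    new UserChatMessage("Explain what a token is in one sentence."));

Console.WriteLine(completion.Content[0].Text);
```

Function Calling builds on the same client: you describe your own tools in the request options, and the model answers with a structured request to call one of them instead of plain text.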
Semantic Kernel is an open-source SDK backed by Microsoft that seamlessly integrates Large Language Models from providers such as OpenAI and Azure OpenAI with programming languages like C#. It allows you to use natural language input to a Large Language Model to invoke and interact with your own custom code.
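As a rough illustration of that idea, the sketch below registers a small C# class as a Semantic Kernel plugin and lets the model decide when to call it. The deployment name, endpoint and API key are placeholders, and details such as FunctionChoiceBehavior can vary slightly between SDK versions.

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Build a kernel backed by an Azure OpenAI chat deployment (placeholder values).
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion("gpt-4o", "https://my-resource.openai.azure.com/", "<your-api-key>");
builder.Plugins.AddFromType<TimePlugin>();
var kernel = builder.Build();

// Let the model decide when to invoke the C# function below to answer the question.
var settings = new OpenAIPromptExecutionSettings { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() };
var result = await kernel.InvokePromptAsync("What time is it in UTC right now?", new KernelArguments(settings));
Console.WriteLine(result);

// Custom C# code exposed to the model as a plugin function.
public class TimePlugin
{
    [KernelFunction, Description("Returns the current UTC date and time.")]
    public string GetUtcNow() => DateTime.UtcNow.ToString("u");
}
```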
This chapter explores the Model Context Protocol (MCP), an open standard revolutionizing how applications provide context to LLMs. MCP acts as a 'USB-C for AI,' standardizing connections between LLMs and various data sources or tools. Crucially, MCP empowers companies to define, once and for all, precisely how their proprietary data and tools are utilized by AI systems.
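Under the hood, MCP messages are JSON-RPC 2.0. The sketch below does not use the official MCP C# SDK; it simply builds the kind of request a client sends to ask an MCP server which tools it exposes, to give a feel for the protocol.

```csharp
using System;
using System.Text.Json;

// MCP messages follow JSON-RPC 2.0; "tools/list" asks an MCP server which tools it offers.
var request = new
{
    jsonrpc = "2.0",
    id = 1,
    method = "tools/list"
};

string payload = JsonSerializer.Serialize(request, new JsonSerializerOptions { WriteIndented = true });
Console.WriteLine(payload);

// In a real client this payload is sent to the server over stdio or HTTP,
// and the response lists tool names, descriptions and JSON schemas for their parameters.
```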
Vector search is a powerful technique that allows you to retrieve semantically related data from large datasets such as company documents or databases. This chapter will teach you how vector search works and how it lets you find relevant information without depending on exact keyword matches or on the language the information in the dataset is written in.
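The core idea can be sketched in a few lines of C#: every text is turned into an embedding vector by a model (hard-coded toy numbers here), and relevance is measured with cosine similarity between vectors instead of keyword overlap.

```csharp
using System;
using System.Linq;

// Toy embedding vectors; in practice these come from an embedding model such as text-embedding-3-small.
float[] query = { 0.9f, 0.1f, 0.3f };
float[] docA  = { 0.8f, 0.2f, 0.4f };   // semantically close to the query
float[] docB  = { 0.1f, 0.9f, 0.0f };   // semantically unrelated

static float CosineSimilarity(float[] a, float[] b)
{
    float dot = a.Zip(b, (x, y) => x * y).Sum();
    float magA = MathF.Sqrt(a.Sum(x => x * x));
    float magB = MathF.Sqrt(b.Sum(x => x * x));
    return dot / (magA * magB);
}

Console.WriteLine($"query vs docA: {CosineSimilarity(query, docA):F3}");  // higher score
Console.WriteLine($"query vs docB: {CosineSimilarity(query, docB):F3}");  // lower score
```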
Azure AI Search facilitates the adoption of the Retrieval Augmented Generation (RAG) design pattern. This approach retrieves pertinent information from a data source and uses it to augment the knowledge of generative AI models. This combination of retrieval and generation sets a new standard for AI-driven search solutions.
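As a rough sketch of the RAG flow with the Azure.Search.Documents SDK: retrieve matching documents, build an augmented prompt, then hand it to the chat model. The service endpoint, index name, key and the "content" field are placeholder assumptions about your own search index.

```csharp
using System;
using System.Text;
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;

// Placeholder values for your own Azure AI Search service and index.
var searchClient = new SearchClient(
    new Uri("https://my-search-service.search.windows.net"),
    "company-docs",
    new AzureKeyCredential("<your-query-key>"));

// 1. Retrieve: find documents relevant to the user's question.
string question = "How many vacation days do employees get?";
SearchResults<SearchDocument> results = searchClient.Search<SearchDocument>(question);

// 2. Augment: put the retrieved text into the prompt ("content" is an assumed field name).
var context = new StringBuilder();
foreach (SearchResult<SearchDocument> result in results.GetResults())
{
    context.AppendLine(result.Document["content"]?.ToString());
}

string prompt = $"Answer using only the sources below.\n\nSources:\n{context}\n\nQuestion: {question}";

// 3. Generate: send the augmented prompt to the chat model, as in the Azure OpenAI sketch above.
Console.WriteLine(prompt);
```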
In this chapter, you'll explore advanced techniques that allow you to control the model's output, transforming generic responses into precise, valuable results. Additionally, the chapter covers emerging design patterns in the field of Gen AI app development that help you increase the quality of model responses and reduce costs.
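One simple example of such a technique is combining a strict system prompt with conservative sampling settings. The sketch below assumes the same Azure OpenAI setup as earlier; exact option names such as MaxOutputTokenCount can differ between SDK versions.

```csharp
using System;
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI.Chat;

// Placeholder endpoint, key and deployment name, as in the earlier sketch.
ChatClient chat = new AzureOpenAIClient(
        new Uri("https://my-resource.openai.azure.com/"),
        new ApiKeyCredential("<your-api-key>"))
    .GetChatClient("gpt-4o");

// A strict system prompt plus conservative sampling settings turn a chatty model into a predictable one.
var options = new ChatCompletionOptions
{
    Temperature = 0.1f,          // less randomness, more repeatable answers
    MaxOutputTokenCount = 100    // cap the length (and cost) of the response
};

ChatCompletion completion = chat.CompleteChat(
    new ChatMessage[]
    {
        new SystemChatMessage("Answer with valid JSON only, using the shape {\"summary\": \"...\"}."),
        new UserChatMessage("Summarize the benefits of retrieval augmented generation in one sentence.")
    },
    options);

Console.WriteLine(completion.Content[0].Text);
```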
This chapter introduces building agentic AI systems. Learn what agents are, suitable use cases, essential design foundations including models, tools, and instructions, different orchestration patterns, and the importance of implementing robust guardrails and human oversight mechanisms.
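To make the orchestration idea concrete, here is a schematic, framework-agnostic agent loop; CallModelAsync and RunToolAsync are hypothetical stand-ins rather than a real SDK API, and the step limit plays the role of a simple guardrail.

```csharp
using System;
using System.Threading.Tasks;

Console.WriteLine(await AgentLoop.RunAsync("Find the company's travel policy and summarize it."));

public record AgentStep(bool IsFinalAnswer, string Content, string? ToolName = null, string? ToolInput = null);

public static class AgentLoop
{
    public static async Task<string> RunAsync(string task, int maxSteps = 5)
    {
        string context = task;
        for (int step = 0; step < maxSteps; step++)             // guardrail: bound the number of iterations
        {
            AgentStep next = await CallModelAsync(context);     // the model decides: final answer or tool call
            if (next.IsFinalAnswer)
                return next.Content;

            string observation = await RunToolAsync(next.ToolName!, next.ToolInput!);
            context += $"\nTool {next.ToolName} returned: {observation}";
        }
        return "Stopped: step limit reached, escalate to a human.";  // human oversight fallback
    }

    // Hypothetical stand-ins for an LLM call and a tool invocation.
    private static Task<AgentStep> CallModelAsync(string context) =>
        Task.FromResult(new AgentStep(true, $"(model answer based on: {context})"));

    private static Task<string> RunToolAsync(string tool, string input) =>
        Task.FromResult($"(result of {tool}({input}))");
}
```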
The cost and quality of your AI-powered app depend largely on your choice of AI model and how you deploy it. Learn about the available model catalog, featuring state-of-the-art Azure OpenAI models and open-source models from Hugging Face, Meta, Google, Microsoft, Mistral, and many more.
This chapter empowers you to bring powerful AI capabilities to end-user environments like mobile devices, personal computers and browsers, improving scalability, cost and performance. Additionally, you will learn how to deploy and host your own open-source Language Models behind an API that you fully control.
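For the self-hosted part, a common setup is an open-source model served behind an OpenAI-compatible endpoint (Ollama is used here as an assumed example); the URL, key and model name are placeholders.

```csharp
using System;
using System.ClientModel;
using OpenAI;
using OpenAI.Chat;

// Assumption: a locally hosted open-source model (for example Phi via Ollama) exposes
// an OpenAI-compatible endpoint; the URL, key and model name below are placeholders.
var client = new OpenAIClient(
    new ApiKeyCredential("not-needed-for-local"),
    new OpenAIClientOptions { Endpoint = new Uri("http://localhost:11434/v1") });

ChatClient chat = client.GetChatClient("phi3");
ChatCompletion completion = chat.CompleteChat("Give one reason to run a language model on-device.");
Console.WriteLine(completion.Content[0].Text);
```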
How can you ensure an LLM provides relevant and coherent answers to users' questions using the correct information? How do you prevent an LLM from responding inappropriately? Discover the answers to these questions and more by exploring the evaluation metrics in Azure AI Foundry and the Azure AI Content Safety service.
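As a preview of the Content Safety part, here is a minimal sketch with the Azure.AI.ContentSafety SDK; the endpoint and key are placeholders for your own resource.

```csharp
using System;
using Azure;
using Azure.AI.ContentSafety;

// Placeholder values for your own Azure AI Content Safety resource.
var client = new ContentSafetyClient(
    new Uri("https://my-content-safety.cognitiveservices.azure.com/"),
    new AzureKeyCredential("<your-api-key>"));

// Analyze a piece of text (for example a user prompt or a model response) before showing it to users.
AnalyzeTextResult result = client.AnalyzeText(new AnalyzeTextOptions("Some text produced by the model."));

foreach (TextCategoriesAnalysis category in result.CategoriesAnalysis)
{
    // Severity 0 means safe; higher values indicate increasingly harmful content in that category.
    Console.WriteLine($"{category.Category}: severity {category.Severity}");
}
```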
In this course, you will learn to seamlessly integrate pre-built AI services and Large Language Models such as ChatGPT and Phi into your .NET development projects. You will become familiar with Azure AI Foundry, Microsoft's unified portal for managing, testing, moderating and deploying AI models. The course will teach you how to use your own data with Large Language Models using Azure AI Search. Furthermore, you will gain hands-on experience with AI libraries such as Semantic Kernel. This course will equip you with the skills to integrate advanced AI capabilities into your software solutions without needing to be a data scientist.
This course targets professional C# developers who want to get started with the Microsoft AI platform, now known as Azure AI Foundry. Participants need a good understanding of C# and preferably some experience with Microsoft Azure. This is not a course for data scientists who want to build their own AI models or understand how existing AI models work internally.