Foundry Local enables local execution of large language models (LLMs) directly on your Windows device, as part of Microsoft Foundry on Windows. It's a good option when you need to implement an AI scenario that isn't covered by the Windows AI APIs.
This on-device AI inference solution provides privacy, customization, and cost benefits compared to cloud-based alternatives. Best of all, it fits into your existing workflows and applications with an easy-to-use CLI and REST API.
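Because Foundry Local exposes an OpenAI-compatible REST endpoint on your local machine, you can call it with a plain HTTP request. The sketch below shows the shape of a chat-completion request; the port and model alias are assumptions for illustration — check the output of the Foundry Local CLI on your machine for the actual endpoint and the models you've downloaded.

```python
import json
import urllib.request

# Assumed local endpoint; Foundry Local prints the real one when the
# service starts, so substitute yours here.
BASE_URL = "http://localhost:5273/v1"

# Example model alias -- replace with a model you've pulled locally.
payload = {
    "model": "phi-3.5-mini",
    "messages": [
        {"role": "user", "content": "Summarize on-device inference in one sentence."}
    ],
}

body = json.dumps(payload).encode("utf-8")

# Uncomment to send the request against a running Foundry Local service:
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request format follows the OpenAI chat-completions convention, existing clients and SDKs that let you override the base URL can typically point at the local service unchanged.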
Note
Full documentation for Foundry Local — including installation, model management, the REST API, and SDK reference — is maintained in the Azure AI Foundry documentation. The link below takes you there; use your browser's back button or the breadcrumb to return to the Windows AI docs at any time.
If you're not sure whether Foundry Local is the right choice for your scenario, see Choose your Windows AI solution before continuing.
For more information on Foundry Local, see the Foundry Local documentation.