Tomaz Kastrun continues a series on Azure AI. First up is an introduction to Prompt Flow:
Prompt flow in Azure AI Foundry is a development tool for designing flows (streamlines) for the complete end-to-end development cycle of an LLM-based AI application. You can create, iterate, test, orchestrate, debug, and monitor your flows.
After that, we get a demonstration of Prompt Flow in Python:
Prompty gives you the ability to create an end-to-end solution, like RAG, where you can chat with an LLM over an article or document, or where you can ask it to classify the input data (a list of URLs, …)
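For a rough sense of what that looks like in code, here is a minimal sketch, assuming the prompty Python package and its execute helper, an Azure OpenAI connection configured through environment variables, and a hypothetical classify_urls.prompty file (sketched further below); this is not the code from the post.

```python
# Minimal sketch: run a hypothetical classify_urls.prompty file with the
# prompty package (assumed installed, e.g. `pip install "prompty[azure]"`).
import prompty
import prompty.azure  # registers the Azure OpenAI invoker

# Execute the prompt, passing values for the template's declared inputs.
response = prompty.execute(
    "classify_urls.prompty",  # hypothetical file; see the sketch further below
    inputs={"urls": "https://example.com, https://example.org"},
)
print(response)
```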
Prompty is a markdown file, structured in YAML, that encapsulates a series of metadata fields pivotal for defining the model's configuration and the inputs. After this front matter comes the prompt template, articulated in the Jinja format.
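To make that layout concrete, here is a minimal, self-contained sketch (plain Python with PyYAML and Jinja2, not the prompty library itself) that embeds a hypothetical classification .prompty file as a string and splits it into its YAML front matter and Jinja prompt template; the metadata field names and the deployment name are illustrative assumptions rather than content from the post.

```python
# Illustration of the .prompty layout: YAML front matter (model configuration
# and declared inputs) followed by a Jinja prompt template.
import yaml                   # pip install pyyaml
from jinja2 import Template   # pip install jinja2

# Hypothetical .prompty content; field names and deployment name are assumptions.
PROMPTY_FILE = """\
---
name: classify_urls
description: Classify a list of URLs into categories
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o   # assumed deployment name
  parameters:
    max_tokens: 256
inputs:
  urls:
    type: string
---
system:
You are an assistant that classifies URLs into categories.

user:
Classify the following input data:
{{ urls }}
"""

# Split the YAML front matter from the Jinja prompt template.
_, front_matter, template_text = PROMPTY_FILE.split("---", 2)

metadata = yaml.safe_load(front_matter)   # model configuration and declared inputs
prompt = Template(template_text).render(urls="https://example.com, https://example.org")

print(metadata["model"]["configuration"])
print(prompt)
```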