.Net: New Feature: Support for structured output through Prompty file

name: Feature request
about: Adding support for structured output through Prompty

Currently, we can set the `response_format` property programmatically in code, as shown below:
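A minimal sketch of that programmatic configuration, assuming Semantic Kernel's .NET OpenAI connector (`OpenAIPromptExecutionSettings`); the `InvoiceData` fields here are illustrative placeholders:

```csharp
using System;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Passing a .NET type as the response format asks the connector to derive a
// matching JSON schema, so the model's reply deserializes into InvoiceData.
var executionSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = typeof(InvoiceData),
    Temperature = 0.0,
};

// Illustrative target type; a real InvoiceData class would carry the full
// set of invoice fields.
public sealed class InvoiceData
{
    public string InvoiceNumber { get; set; } = string.Empty;
    public DateTime InvoiceDate { get; set; }
    public decimal TotalAmount { get; set; }
}
```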
This configuration ensures that the response from the Large Language Model (LLM) aligns with the specified format (InvoiceData in this example).

Request:
I was wondering whether it would be possible to do the same through a Prompty template, for example:
```yaml
name: InvoiceDocumentExtraction
description: This prompt is used to extract information from an invoice document.
authors:
  - HK
model:
  api: chat
  configuration:
    type: azure_openai
  parameters:
    max_tokens: 16384
    temperature: 0.0
    top_p: 0.0
    frequency_penalty: 0.0
    presence_penalty: 0.0
    response_format: <reference to a file>
```
The file could reference a predefined response format (e.g., an InvoiceData class or schema), ensuring a consistent structure in the LLM's output. This is particularly useful for tasks like extracting structured data from documents.
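As an illustration of what such a referenced file could contain, here is a sketch in the JSON Schema envelope used by OpenAI-style structured outputs; the file name (e.g., invoice_data.json) and the field list are hypothetical:

```json
{
  "type": "json_schema",
  "json_schema": {
    "name": "InvoiceData",
    "strict": true,
    "schema": {
      "type": "object",
      "properties": {
        "invoice_number": { "type": "string" },
        "invoice_date": { "type": "string", "description": "ISO 8601 date" },
        "total_amount": { "type": "number" }
      },
      "required": ["invoice_number", "invoice_date", "total_amount"],
      "additionalProperties": false
    }
  }
}
```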
I see Prompty supports the following: