ContentSafetyEvaluator(String, IDictionary<String,String>) Constructor
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
An abstract base class that can be used to implement IEvaluators that utilize the
Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of a variety of
unsafe content, such as protected material, vulnerable code, and harmful content.
protected:
ContentSafetyEvaluator(System::String ^ contentSafetyServiceAnnotationTask, System::Collections::Generic::IDictionary<System::String ^, System::String ^> ^ metricNames);
protected ContentSafetyEvaluator(string contentSafetyServiceAnnotationTask, System.Collections.Generic.IDictionary<string,string> metricNames);
new Microsoft.Extensions.AI.Evaluation.Safety.ContentSafetyEvaluator : string * System.Collections.Generic.IDictionary<string, string> -> Microsoft.Extensions.AI.Evaluation.Safety.ContentSafetyEvaluator
Protected Sub New (contentSafetyServiceAnnotationTask As String, metricNames As IDictionary(Of String, String))
Parameters
- contentSafetyServiceAnnotationTask
- String
The name of the annotation task that should be used when communicating with the Azure AI Foundry Evaluation service to perform evaluations.
- metricNames
- IDictionary<String,String>
A dictionary that maps the names of the metrics used when communicating with the Azure AI Foundry Evaluation service to the Names of the EvaluationMetrics returned by this IEvaluator.
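Because the constructor is protected, it is only called from a derived class. The sketch below illustrates the shape of such a call; the class name, annotation task string, and metric mapping are hypothetical examples, not values defined by the library, and depending on the library version additional IEvaluator members (such as EvaluationMetricNames) may also need to be implemented.

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Hypothetical derived evaluator; the task name and metric names below
// are illustrative placeholders, not values prescribed by the service.
public sealed class ExampleHarmEvaluator : ContentSafetyEvaluator
{
    public ExampleHarmEvaluator()
        : base(
            // Assumed annotation task name for the Azure AI Foundry
            // Evaluation service request.
            contentSafetyServiceAnnotationTask: "content harm",
            // Maps the service-side metric name to the Name of the
            // EvaluationMetric this IEvaluator reports.
            metricNames: new Dictionary<string, string>
            {
                ["violence"] = "Violence",
            })
    {
    }
}
```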