Class CVFaceDetectedEventArgs

Namespace
VisioForge.Core.Types.Events
Assembly
VisioForge.Core.dll

Provides data for events that indicate the detection of faces in a video frame. This class encapsulates an array of VisioForge.Core.Types.VideoProcessing.CVFace objects, each representing a detected face, along with the timestamp of the frame.

public class CVFaceDetectedEventArgs : EventArgs

Inheritance

object → EventArgs → CVFaceDetectedEventArgs

Inherited Members

EventArgs.Empty, object.Equals(object), object.GetHashCode(), object.GetType(), object.MemberwiseClone(), object.ReferenceEquals(object, object)

Examples

// Assume an event handler is registered for a face detection event.
public void OnFaceDetected(object sender, CVFaceDetectedEventArgs e)
{
    Console.WriteLine($"Faces detected at {e.TimeStamp}: {e.Faces.Length}");

    foreach (var face in e.Faces)
    {
        Console.WriteLine($"  Face: {face.Rectangle} (Confidence: {face.Confidence})");
        // Further process face data, e.g., draw bounding boxes, identify emotions.
    }
}

Remarks

This event argument is typically used in computer vision applications for facial recognition, emotion analysis, or audience engagement tracking. Face detection uses trained machine learning models (Haar cascades, HOG descriptors, or deep learning models such as MTCNN and RetinaFace) to locate faces. Each CVFace object contains bounding box coordinates, a confidence score, and potentially facial landmarks (eye, nose, and mouth positions).

The Faces array may contain multiple faces when several people are visible in the frame. TimeStamp provides temporal context for tracking face movement across frames or synchronizing with other events. Confidence scores help filter false positives when appropriate thresholds are set. Detection accuracy is affected by face angle, lighting conditions, partial occlusion, and image resolution.

Applications include access control, attendance tracking, emotion analysis, demographic analytics, and interactive experiences. Consider implementing face tracking across frames to maintain identity consistency and reduce redundant processing.

Privacy considerations are important when processing facial data; ensure compliance with applicable regulations (GDPR, CCPA, etc.). Events are raised on background processing threads, requiring thread-safe handling and UI marshaling.
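Because the event arrives on a background processing thread, a handler that touches UI elements must marshal back to the UI thread. A minimal sketch, assuming a WPF window with a TextBlock named StatusText (a hypothetical control name; WinForms would use Control.Invoke instead):

```csharp
using System;
using System.Windows;
using System.Windows.Threading;

public partial class MainWindow : Window
{
    private void OnFaceDetected(object sender, CVFaceDetectedEventArgs e)
    {
        // Dispatch the UI update instead of touching controls from the
        // background thread that raised the event.
        Dispatcher.BeginInvoke(DispatcherPriority.Background, new Action(() =>
        {
            StatusText.Text = $"{e.Faces.Length} face(s) at {e.TimeStamp}";
        }));
    }
}
```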

Constructors

CVFaceDetectedEventArgs(CVFace[], TimeSpan)

Initializes a new instance of the VisioForge.Core.Types.Events.CVFaceDetectedEventArgs class.

public CVFaceDetectedEventArgs(CVFace[] faces, TimeSpan timeStamp)

Parameters

faces CVFace[]

An array of VisioForge.Core.Types.VideoProcessing.CVFace objects representing the detected faces.

timeStamp TimeSpan

The timestamp of the frame where faces were detected.
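For instance, the constructor can be invoked directly to exercise a handler without running a live pipeline (a sketch; the CVFace contents are elided, since constructing CVFace instances is documented on its own reference page):

```csharp
using System;

// Sketch: drive a handler with synthetic event args, e.g. in a unit test.
CVFace[] faces = Array.Empty<CVFace>(); // populate with CVFace instances as needed
var args = new CVFaceDetectedEventArgs(faces, TimeSpan.FromMilliseconds(1500));

// Invoke the handler from the Examples section directly.
OnFaceDetected(this, args);
```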

Properties

Faces

Gets an array of VisioForge.Core.Types.VideoProcessing.CVFace objects, each representing a detected face in the frame.

public CVFace[] Faces { get; }

Property Value

CVFace[]

TimeStamp

Gets the timestamp of the video frame at which the faces were detected.

public TimeSpan TimeStamp { get; }

Property Value

TimeSpan
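As noted in Remarks, confidence thresholds help filter false positives. A hedged sketch that assumes CVFace exposes the Confidence member shown in the Examples section (the 0.6 threshold is illustrative and should be tuned per model):

```csharp
using System;
using System.Linq;

private const double MinConfidence = 0.6; // illustrative threshold

private void OnFaceDetected(object sender, CVFaceDetectedEventArgs e)
{
    // Keep only detections at or above the confidence threshold.
    CVFace[] reliable = e.Faces.Where(f => f.Confidence >= MinConfidence).ToArray();
    Console.WriteLine($"{reliable.Length} of {e.Faces.Length} faces kept at {e.TimeStamp}");
}
```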

Methods

ToString()

Returns a string that represents the current VisioForge.Core.Types.Events.CVFaceDetectedEventArgs instance. The string includes the number of detected faces and details for each face.

public override string ToString()

Returns

string

A String that represents this instance.
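Because the override already summarizes the face count and per-face details, it is convenient for quick diagnostics, for example:

```csharp
private void OnFaceDetected(object sender, CVFaceDetectedEventArgs e)
{
    // The override emits the number of detected faces and their details.
    Console.WriteLine(e.ToString());
}
```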