Namespace VisioForge.Core.Types
Namespaces
- VisioForge.Core.Types.AudioEffects
- VisioForge.Core.Types.Events
- VisioForge.Core.Types.MediaInfo
- VisioForge.Core.Types.VideoProcessing
- VisioForge.Core.Types.X
Classes
- AudioFrame: Represents a single frame of audio data, including its raw data, format information, and timing. This class is used to pass audio data between different components of the framework.
- DataFrame: Represents a generic frame of data, which can be used for various media types, not limited to audio or video. This class provides a buffer for raw data along with timing and format information. It includes methods for manual memory management of the underlying data buffer.
- MediaFileTags: Represents a comprehensive collection of metadata tags for a media file. This class aggregates various pieces of information commonly found in audio and video file tags (e.g., ID3, MP4 tags, Vorbis Comments).
- RAWAudioFrameEventArgs: Provides data for events that deliver raw audio frames. This class encapsulates a VisioForge.Core.Types.RAWAudioFrame, allowing it to be passed as an argument in event handlers.
- RAWVideoColorSpaceHelper: Provides helper methods for working with VisioForge.Core.Types.RAWVideoColorSpace. This static class facilitates conversions between different color space representations and calculates image stride.
- RAWVideoFrameEventArgs: Provides data for events that deliver raw video frames. This class encapsulates a VisioForge.Core.Types.RAWVideoFrame, allowing it to be passed as an argument in event handlers.
- Rect: Represents a rectangular area defined by its left, top, right, and bottom coordinates. This class provides properties to access the dimensions and check if the rectangle is empty.
- Size: Represents the size of a two-dimensional object, typically in pixels. This class encapsulates a width and a height, providing a convenient way to manage dimensions.
Structs
- RAWAudioFrame: Represents a raw audio frame, designed for direct memory manipulation and interop scenarios. This struct holds a pointer to the raw audio data, its size, and basic audio format information, along with timing details.
- RAWBaseAudioInfo: Represents fundamental information about a raw audio stream. This struct provides details such as bits per sample, number of channels, sample rate, and audio format.
- RAWBaseVideoInfo: Represents fundamental information about a raw video stream or frame. This struct provides details such as width, height, color space, stride, and frame rate.
- RAWImage: Represents a raw image frame, providing direct access to image data in unmanaged memory. This struct is designed for high-performance image processing and interoperation with native code.
- RAWVideoFrame: Represents a raw video frame, designed for direct memory manipulation and interop scenarios. This struct holds a pointer to the raw video data, its size, and basic video format information, along with timing details.
- VFRectIntl: Represents an internal rectangular area defined by its left, top, right, and bottom coordinates. This struct is primarily used for interoperation with native code due to its sequential memory layout.
- VFSizeIntl: Represents an internal size structure defined by its width and height. This struct is primarily used for interoperation with native code due to its sequential memory layout.
- VideoFrame: Represents a single frame of video data, including its raw data, format information, and timing. This struct provides a managed way to work with video frames, abstracting some of the complexities of unmanaged memory.
- VideoFrameRate: Represents a video frame rate, typically expressed as a fraction (numerator/denominator) or a floating-point value. This struct provides methods for comparison, conversion, and common frame rate constants.
Interfaces
- IBitmap: Represents a platform-agnostic bitmap image. This interface is used as a common wrapper for different bitmap implementations across various platforms (e.g., Windows, Android, iOS).
- IMediaPlayerControls: Defines a standard set of controls for a media player component. This interface provides a common API for controlling playback of media files, including play, pause, stop, seek, and volume adjustments. It also includes asynchronous versions of most methods for use in responsive UI applications.
- IStatusOverlay: Defines an interface for a status overlay, which is used to display real-time information on top of a video stream. This can include details like frame rate, resolution, timecode, recording status, or custom text provided by the application.
- IVRVideoControl: Defines an interface for controlling Virtual Reality (VR) video playback parameters. This interface allows manipulation of camera orientation (yaw, pitch, roll), field of view, and the VR projection mode.
- IVideoCaptureControls: Defines a standard set of controls for a video capture component. This interface provides a common API for managing the lifecycle of a video capture session, including starting, pausing, and stopping. It also provides access to audio volume controls during capture.
- IVideoEffectsControls: Defines a standard set of controls for applying basic real-time video effects. This interface provides methods to adjust common image properties such as brightness, contrast, and saturation, as well as apply simple transformations like grayscale, flip, and invert.
- IVideoView: Defines a platform-agnostic interface for a video rendering view. This interface abstracts the underlying UI control used for displaying video, allowing the core engine to remain independent of any specific UI framework (e.g., WinForms, WPF, Android, iOS).
- IVideoViewNativeRendering: Defines an interface for video view components that support switching to a native rendering backend. This is typically used to improve performance in demanding scenarios, such as playing high-resolution (4K/8K) video or using hardware-accelerated GPU decoding.
- IVideoViewPushFrame: Defines an interface for video views that support direct pushing of video frames for rendering. This interface is crucial for scenarios where raw video data needs to be displayed without relying on a traditional media pipeline (e.g., for custom video processing or real-time frame injection).
Enums
- AudioFormat: Defines the audio format for raw audio data, specifying whether it is Pulse-Code Modulation (PCM), IEEE floating-point, or encoded. This enumeration is crucial for configuring audio streams and ensuring correct data interpretation throughout the framework.
- DebugLevel: Specifies the severity level for logging and debug messages. This enumeration is used to filter the verbosity of diagnostic output.
- FFMPEGCodecID: Defines a comprehensive list of codec identifiers from the FFmpeg library. These IDs are used to uniquely identify specific audio, video, subtitle, and data compression formats.
- GIFResizeStretchMode: Defines how a GIF frame should be resized or stretched to fit a target dimension.
- LicenseLevel: Defines the different licensing tiers available for VisioForge products. Each level typically unlocks a different set of features or capabilities within the framework.
- MediaFileTagTypes: Specifies the types of metadata tags that can be present in a media file. This enumeration is a bit field, meaning its members can be combined using bitwise operations to indicate the presence of multiple tag types.
- MessageBoxDialogButtons: Specifies constants defining the buttons to display on a message box. This enumeration is used to control the user's interaction options within a dialog.
- MessageBoxResult: Specifies constants defining the result of a message box dialog. This enumeration is used to identify which button a user clicked on a message box.
- MouseButton: Specifies constants that define which mouse button was pressed. This enumeration is used in mouse-related events to identify the specific button involved in the event.
- PlaybackState: Defines the possible states of a media playback or capture session. This enumeration is used to indicate whether the media is actively playing, paused, or in an idle state.
- RAWVideoColorSpace: Defines various raw video color space formats. This enumeration is used to specify how pixel data is arranged and interpreted in raw video frames.
- VRMode: Specifies the different Virtual Reality (VR) projection modes supported for video playback. Not all modes may be supported by every component (e.g., specific video blocks, effects, or renderers).
- VideoRendererEVRDeinterlaceMode: Specifies the deinterlacing modes available for the Enhanced Video Renderer (EVR). Deinterlacing converts interlaced video (where each frame consists of two fields captured at different times) into progressive video (where each frame contains all lines of a single moment in time).
- VideoRendererMode: Specifies the video rendering engine to be used for displaying video frames. The choice of renderer can impact performance, compatibility, and available features.
- VideoRendererStretchMode: Specifies how video frames are scaled or stretched to fit the display area of a video renderer. This enumeration controls the aspect ratio preservation and filling behavior.
- VideoStreamType: Defines the type of a video stream, typically used in frame grabber events or when managing multiple video sources. This enumeration helps distinguish between main video, preview, capture, and Picture-in-Picture (PIP) streams.
- VideoViewUIEngine: Specifies the UI framework or rendering technology used by a video view component. This enumeration helps the framework adapt its rendering pipeline to different platforms and UI environments.