<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Denis Svinarchuk</title>
    <description>The latest articles on DEV Community by Denis Svinarchuk (@denissvinarchuk).</description>
    <link>https://dev.to/denissvinarchuk</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1327778%2F0bf41f71-b834-44bf-b051-8a1169bfd85a.png</url>
      <title>DEV Community: Denis Svinarchuk</title>
      <link>https://dev.to/denissvinarchuk</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/denissvinarchuk"/>
    <language>en</language>
    <item>
      <title>How to Create an Application to Determine the Palette and Dominant Colors of an Image</title>
      <dc:creator>Denis Svinarchuk</dc:creator>
      <pubDate>Fri, 12 Apr 2024 11:54:30 +0000</pubDate>
      <link>https://dev.to/denissvinarchuk/how-to-create-an-application-to-determine-the-palette-and-dominant-colors-of-an-image-821</link>
      <guid>https://dev.to/denissvinarchuk/how-to-create-an-application-to-determine-the-palette-and-dominant-colors-of-an-image-821</guid>
      <description>&lt;p&gt;A key technique in image editing applications is determining the dominant colors. The identification of these dominant colors is essential for creating an image's palette, which, in turn, accurately reflects the effect of selected tools and the results of editing.&lt;/p&gt;

&lt;p&gt;This article will delve into how to determine an image's palette within a restricted color space, identify an image's dominant colors, and differentiate between a palette and dominant colors.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Palette&lt;/strong&gt;&lt;br&gt;
The image palette typically refers to all the colors present in the original image. In essence, it captures the entire range of color shades, expressed on the numerical scale of the color-stimulus signal values.&lt;/p&gt;

&lt;p&gt;If we agree on modeling a signal with a certain level of accuracy, then the range of these signal values will represent our available color palette.&lt;/p&gt;

&lt;p&gt;Each unique representation of an image maps its colors onto a subset of this space. In digital signal processing (and an image is also a signal), a continuous quantity is usually handled through its discrete representation.&lt;/p&gt;

&lt;p&gt;Thus, our image palette can be viewed as a subset of all the colors in the image represented by a discrete map. Each color can be indexed and assigned a specific value in one of the color spaces. For our purposes, we'll use the RGB color model.&lt;/p&gt;

&lt;p&gt;A significant challenge in presenting the image palette this way is the sheer size of the visible color space. We can hardly analyze the entire image palette by hand, and it rarely makes sense to store an image represented by all of its original colors. Instead, we reduce their number to a reasonable limit.&lt;/p&gt;

&lt;p&gt;This process, known as color quantization, lowers the number of colors from the full set representable in an image to a smaller one. For instance, 14-bit raw color data might be rendered as an 8-bit JPEG, or a 16-bit TIFF converted to an 8-bit PNG.&lt;/p&gt;
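&lt;p&gt;As a toy illustration (not code from this project), uniform quantization of a single 8-bit channel down to 3 bits can be sketched in Python; the function name quantize_channel is hypothetical:&lt;/p&gt;

```python
def quantize_channel(value, bits=3):
    """Uniformly quantize an 8-bit channel value down to `bits` bits,
    then expand the index back to the 8-bit range for display."""
    levels = 2 ** bits - 1               # highest index on the reduced scale
    index = round(value / 255 * levels)  # nearest level on the reduced scale
    return round(index * 255 / levels)   # representative 8-bit value

# A 3-bit channel keeps only 8 of the original 256 shades, so a full
# RGB image collapses to at most 8 * 8 * 8 = 512 distinct colors.
reduced_shades = sorted({quantize_channel(v) for v in range(256)})
```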

&lt;p&gt;A straightforward solution to determine an image's primary colors is to envision the process of color quantization to a very limited set, such as 8 or even 3 colors. This gives us insight into the primary colors within the image, making them more noticeable and memorable.&lt;/p&gt;

&lt;p&gt;Ideally, the colors in the final image should be harmonious, following the principles outlined by &lt;a href="https://en.wikipedia.org/wiki/Johannes_Itten"&gt;Johannes Itten&lt;/a&gt; or &lt;a href="https://en.wikipedia.org/wiki/Mikhail_Matyushin"&gt;Mikhail Matyushin&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;By identifying the dominant colors of an image, we can create a tool that allows us to visualize the image's "harmony." This can also apply to the harmony of the image after filtering, or "synthetic harmonization."&lt;/p&gt;

&lt;p&gt;Below, we will make an attempt to develop such a tool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3k3untrz9k83h3lujq4.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3k3untrz9k83h3lujq4.jpeg" alt="Palette source cubehistogram" width="800" height="564"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Median Cut&lt;/strong&gt;&lt;br&gt;
A widely used and high-quality algorithm for compressing an image's palette is the Median Cut algorithm. With some modifications, it underlies palette reduction in many image pipelines, for example when converting images to indexed formats such as GIF or 8-bit PNG.&lt;/p&gt;

&lt;p&gt;The fundamental principle of the Median Cut algorithm is to sequentially split the image's cubic (RGB) histogram at the median, so that each resulting subcube contains approximately the same number of pixels.&lt;/p&gt;

&lt;p&gt;Once the division process reaches a predetermined number of subcubes, we calculate the average color of each cube and map it to the corresponding colors of the original image. This process effectively reduces the number of colors in the original image, allowing us to compress the image into a smaller file size.&lt;/p&gt;
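&lt;p&gt;A minimal CPU sketch of this scheme (illustrative Python, not the framework's GPU implementation; median_cut is a hypothetical name, and the sketch is simplified, e.g. it always splits the most populated box):&lt;/p&gt;

```python
def median_cut(pixels, n_colors):
    """Repeatedly split the pixel list at the median of the channel
    with the widest spread, until n_colors boxes remain."""
    boxes = [list(pixels)]
    for _ in range(n_colors - 1):
        box = max(boxes, key=len)        # split the most populated box
        if len(box) == 1:
            break                        # nothing left to split
        # channel (0=R, 1=G, 2=B) with the widest value range
        spreads = [max(p[c] for p in box) - min(p[c] for p in box)
                   for c in range(3)]
        c = spreads.index(max(spreads))
        box.sort(key=lambda p, c=c: p[c])
        mid = len(box) // 2              # the median split point
        boxes.remove(box)
        boxes.extend([box[:mid], box[mid:]])
    # the average color of each box becomes one palette entry
    return [tuple(sum(p[c] for p in box) // len(box) for c in range(3))
            for box in boxes]
```

&lt;p&gt;Two clearly separated color clusters, for example, collapse to their two average colors.&lt;/p&gt;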

&lt;p&gt;It might seem at first glance that the initial part of this algorithm could solve our problem of identifying the primary or dominant colors of an image. However, a potential issue arises when the number of colors into which we divide the image is too few, and we're only analyzing a small number of colors.&lt;/p&gt;

&lt;p&gt;We might end up identifying colors that don't actually exist in the image because the colors are excessively averaged. We could choose the color with the maximum bin from each cube and label it as dominant, but then it wouldn't constitute a palette anymore.&lt;/p&gt;

&lt;p&gt;So, we'll reserve this approach for determining a compressed version of the image's palette, which can also be used as a tool for visually analyzing the image in terms of its "harmonious utility." To identify the dominant colors, we'll employ statistical analysis: searching for local maxima in the same cubic histogram.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Local Maxima&lt;/strong&gt;&lt;br&gt;
To identify the local maxima, we will write dedicated code based on the &lt;a href="https://github.com/pixelogik/ColorCube"&gt;ColorCube&lt;/a&gt; approach; its author, who is also a skilled artist, describes the algorithm well. Essentially, we first gather the image statistics into the same three-dimensional RGB histogram used in the Median Cut algorithm.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgp8i5itakic4v8aeqey4.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgp8i5itakic4v8aeqey4.jpeg" alt="RGB space" width="311" height="241"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each cell of the cubic histogram stores the number of pixels that fall into it, along with the per-channel sum of their color values. Since the histogram is limited to a resolution of 32x32x32 (30x30x30 in the original implementation), the accumulated sums make computing a cell's average color trivial.&lt;/p&gt;
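&lt;p&gt;In illustrative Python (hypothetical names, not the framework API), the accumulation step might look like this; integer division by 8 keeps the top 5 bits of each 8-bit channel, giving 32 cells per axis:&lt;/p&gt;

```python
def build_cube_histogram(pixels):
    """Accumulate pixel counts and per-channel sums in a 32x32x32 cube.
    Keys are cell coordinates; values are [count, sum_r, sum_g, sum_b]."""
    cube = {}
    for r, g, b in pixels:
        key = (r // 8, g // 8, b // 8)   # keep the top 5 bits of each channel
        cell = cube.setdefault(key, [0, 0, 0, 0])
        cell[0] += 1
        cell[1] += r
        cell[2] += g
        cell[3] += b
    return cube

def cell_average(cell):
    """Average color of a cell - cheap thanks to the accumulated sums."""
    n, sr, sg, sb = cell
    return (sr // n, sg // n, sb // n)

# Three pixels; the first two share a cell:
cube = build_cube_histogram([(10, 20, 30), (12, 22, 28), (200, 10, 10)])
```

&lt;p&gt;Here cube[(1, 2, 3)] accumulates the first two pixels, and cell_average returns (11, 21, 29) for it.&lt;/p&gt;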

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnze1fehdq469m903d12t.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnze1fehdq469m903d12t.jpeg" alt="Histogram" width="294" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We then search for the local maximum by thoroughly exploring the entire space and comparing it with neighboring cells.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgvm274fk08hgwcbajxqk.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgvm274fk08hgwcbajxqk.jpeg" alt="Local maxima" width="225" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Following this, we iteratively reduce the number of local maxima to the required amount, discarding similar colors with less weight. For all remaining local maxima, we calculate the average colors of all values included in the list.&lt;/p&gt;
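&lt;p&gt;The search-and-prune steps can be sketched as follows (illustrative Python over a map from cell coordinates to pixel counts; the names and the pruning distance are invented for the example, and the real ColorCube code also tracks per-cell color sums for the final averaging):&lt;/p&gt;

```python
from itertools import product
from operator import gt, ge

# the 26 neighbouring offsets of a cell in the 3D histogram
OFFSETS = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]

def local_maxima(counts):
    """Cells whose count strictly exceeds all 26 neighbouring cells."""
    maxima = []
    for (i, j, k), n in counts.items():
        if all(gt(n, counts.get((i + di, j + dj, k + dk), 0))
               for di, dj, dk in OFFSETS):
            maxima.append(((i, j, k), n))
    return maxima

def prune_similar(maxima, keep, min_dist=4):
    """Keep the heaviest maxima, discarding nearby (similar) colours."""
    result = []
    for cell, n in sorted(maxima, key=lambda m: -m[1]):
        # keep a maximum only if it is far enough from every kept one
        if all(ge(sum((a - b) ** 2 for a, b in zip(cell, c)), min_dist ** 2)
               for c, _ in result):
            result.append((cell, n))
        if len(result) == keep:
            break
    return result

counts = {(5, 5, 5): 100, (5, 5, 6): 10, (6, 6, 6): 90, (20, 20, 20): 50}
peaks = local_maxima(counts)
```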

&lt;p&gt;Since the color density in local maxima is higher, and the difference between pixel values is smaller than in cubes from the Median Cut, the colors will be more akin to those present in the main image and will more accurately represent its dominant colors.&lt;/p&gt;

&lt;p&gt;This reveals the primary difference between the two models: acquiring a "compressed palette" versus searching for local maxima or dominant colors. By mapping the primary "compressed palette" image, we will create a new one that maintains the same color balance as the main image, albeit in a vastly truncated form.&lt;/p&gt;

&lt;p&gt;On the other hand, dominant colors only describe the composition of the colors primarily present in the image. They cannot be transformed into anything new with a suitable color balance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implementation&lt;/strong&gt;&lt;br&gt;
Using this task as an example, we will demonstrate how straightforward it is to develop a ready-to-use application for image analysis and manipulation using the &lt;a href="https://bitbucket.org/degrader/improcessing"&gt;IMProcessing Framework&lt;/a&gt;. We will begin with functionalities that are absent in other engines.&lt;/p&gt;

&lt;p&gt;For instance, the framework has the capacity to read Adobe .cube files with pre-existing CLUTs and can extract a three-dimensional cubic histogram from an image in real-time.&lt;/p&gt;

&lt;p&gt;Capitalizing on this, we aim to create an application that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Load files in JFIF (JPEG) format.&lt;/li&gt;
&lt;li&gt;“Normalize” the original image.&lt;/li&gt;
&lt;li&gt;Control the intensity of the “normalizer.”&lt;/li&gt;
&lt;li&gt;Incorporate an arbitrary LUT from an Adobe .cube file into the processing.&lt;/li&gt;
&lt;li&gt;Manage the intensity of CLUT's impact.&lt;/li&gt;
&lt;li&gt;Display a linear histogram of the image.&lt;/li&gt;
&lt;li&gt;Display the “compressed palette” and dominant colors of the image, as well as their numerical representation as an (r, g, b) triplet.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The end product of our construction will resemble this interactive toy:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzyvnqq78zmrqkgttr90z.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzyvnqq78zmrqkgttr90z.gif" alt="Interactive model" width="800" height="486"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, it's clear how the image's simple “normalization” positively influences the diversity of the final palette, re-distributing dominant colors into a more varied and harmonious set.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"Normalizer"&lt;/strong&gt;&lt;br&gt;
We will assemble a filter from two pre-existing ones:&lt;/p&gt;

&lt;p&gt;IMPContrastFilter - stretches the image histogram to specified boundaries.&lt;/p&gt;

&lt;p&gt;IMPAutoWBFilter - performs automatic white balance correction based on the search for the average color and the correction of spurious tones in the image. This is essentially a slight modification of an idea borrowed from &lt;a href="https://zhur74.livejournal.com/44023.html"&gt;Andrey Zhuravlev’s blog&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import IMProcessing

/// Image filter
public class IMPTestFilter:IMPFilter {

    /// We will use a contrast control filter through histogram stretching
    var contrastFilter:IMPContrastFilter!

    /// Auto white balance filter
    var awbFilter:IMPAutoWBFilter!

    /// Image Linear Histogram Analyzer
    var sourceAnalyzer:IMPHistogramAnalyzer!

    /// Solver of the histogram analyzer for calculating the lightness boundaries of the image
    let rangeSolver = IMPHistogramRangeSolver()

    public required init(context: IMPContext) {
        super.init(context: context)

        //  Initialize filters in context
        contrastFilter = IMPContrastFilter(context: context)
        awbFilter = IMPAutoWBFilter(context: context)

        // Add filters to the stack
        addFilter(contrastFilter)
        addFilter(awbFilter)

        // Initialize the histogram analyzer
        sourceAnalyzer = IMPHistogramAnalyzer(context: self.context)

        // Add a light boundary search solver to the analyzer
        sourceAnalyzer.addSolver(rangeSolver)

        // Add an observing handler to the filter
        // to pass the current image frame to the analyzer
        addSourceObserver { (source) -&amp;gt; Void in
            self.sourceAnalyzer.source = source
        }

        // Add an observing handler to the analyzer for updated analysis results
        sourceAnalyzer.addUpdateObserver({ (histogram) -&amp;gt; Void in
            // set the lightness boundaries in the contrast filter each time the image changes
            self.contrastFilter.adjustment.minimum = self.rangeSolver.minimum
            self.contrastFilter.adjustment.maximum = self.rangeSolver.maximum
        })

    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
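&lt;p&gt;Numerically, the two filters amount to a linear histogram stretch followed by a per-channel white-balance gain. A minimal CPU sketch, assuming a simple gray-world model for the AWB step (the framework's actual correction is more elaborate):&lt;/p&gt;

```python
def stretch(v, lo, hi):
    """Linear histogram stretch: map [lo, hi] onto [0, 1], clipping outside."""
    return min(1.0, max(0.0, (v - lo) / (hi - lo)))

def gray_world_gains(pixels):
    """Gray-world white balance: per-channel gains that pull each
    channel mean towards the overall mean."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3
    return [gray / m if m else 1.0 for m in means]
```

&lt;p&gt;After applying the gains, all three channel means coincide, which removes a uniform color cast.&lt;/p&gt;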



&lt;p&gt;&lt;strong&gt;Palette Solver&lt;/strong&gt;&lt;br&gt;
The IMProcessing Framework stands out from many similar platforms due to its unique organization of calculations. It uses a dedicated group of filters that essentially are not processing filters.&lt;/p&gt;

&lt;p&gt;Objects of these classes do not modify the image in either the color or the spatial domain. Instead, they perform calculations and expose metrics for analysis in dedicated solvers that address specific problems.&lt;/p&gt;

&lt;p&gt;For instance, an object of the IMPHistogramAnalyzer class can simultaneously add multiple solvers that calculate the average color of the image, the light range, zonal division, etc.&lt;/p&gt;

&lt;p&gt;We extend the analysis of IMPHistogramCubeAnalyzer with our own solver to calculate the palette and the list of dominant colors. The calculation results are then displayed in an updated NSTableView.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import IMProcessing

///  Types of distribution of image color accents
///
///  - palette:   palette for quantizing image colors.
///               calculated using the median-cut transformation scheme:
///               http://www.leptonica.com/papers/mediancut.pdf
///  - dominants: calculation of dominant colors of an image by searching for local maxima
///               color distribution density functions:
///               https://github.com/pixelogik/ColorCube
///
public enum IMPPaletteType{
    case palette
    case dominants
}

/// Solver of the cubic color histogram analyzer IMPHistogramCubeAnalyzer
public class IMPPaletteSolver: IMPHistogramCubeSolver {

    /// Maximum number of palette colors for analysis
    public var maxColors = Int(8)

    /// List of found colors
    public var colors = [IMPColor]()

    /// Palette type
    public var type = IMPPaletteType.dominants

    /// Solver handler
    /// - parameter analyzer: link to the analyzer
    /// - parameter histogram: cubic histogram of the image
    /// - parameter imageSize: image size
    public func analizerDidUpdate(analizer: IMPHistogramCubeAnalyzer, histogram: IMPHistogramCube, imageSize: CGSize) {

        var p = [float3]()
        if type == .palette{
            p = histogram.cube.palette(count: maxColors)
        }
        else if type == .dominants {
            p = histogram.cube.dominantColors(count: maxColors)
        }

        colors.removeAll()

        for c in p {
            colors.append(IMPColor(color: float4(rgb: c, a: 1)))
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;We Assemble All the Components in the View Controller&lt;/strong&gt;&lt;br&gt;
The controller will need the main application filter, which we'll call IMPTestFilter; a CLUT filter named IMPLutFilter; the ready-to-use IMPHistogramView for displaying a "regular" histogram; and the cubic histogram analyzer IMPHistogramCubeAnalyzer, to which we will attach our solver, the IMPPaletteSolver.&lt;/p&gt;

&lt;p&gt;Lastly, we'll use IMPImageView as the main window for displaying the image, and the common IMPContext is the key class used by all constructors of the framework.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class ViewController: NSViewController {

    //
    // Processing context
    //
    let context = IMPContext()
    //
    // Window for presenting the loaded image
    //
    var imageView:IMPImageView!

    var pannelScrollView = NSScrollView()

    //
    // Window for displaying the image histogram
    //
    var histogramView:IMPHistogramView!

    //
    // NSTableView - views of a list of colors from the palette
    //
    var paletteView:IMPPaletteListView!

    //
    // Main filter
    //
    var filter:IMPTestFilter!

    //
    // CLUT filter from Adobe Cube files
    //
    var lutFilter:IMPLutFilter?

    //
    // Analyzer of a cubic histogram of an image in RGB space
    //
    var histograCube:IMPHistogramCubeAnalyzer!

    //
    // Our solver for finding colors
    //
    var paletteSolver = IMPPaletteSolver()

    var paletteTypeChooser:NSSegmentedControl!

    override func viewDidLoad() {

        super.viewDidLoad()

        configurePannel()

        //
        // Initialize the objects we need
        //

        filter = IMPTestFilter(context: context)

        histograCube = IMPHistogramCubeAnalyzer(context: context)
        histograCube.addSolver(paletteSolver)

        imageView = IMPImageView(context: context, frame: view.bounds)
        imageView.filter = filter
        imageView.backgroundColor = IMPColor(color: IMPPrefs.colors.background)

        //
        // Add another handler to monitor the original image
        // (another one was added in the main filter IMPTestFilter)
        //
        filter.addSourceObserver { (source) -&amp;gt; Void in
            //
            // to minimize calculations, the analyzer will compress the image to 1000px on the wide side
            //
            if let size = source.texture?.size {
                let scale = 1000/max(size.width,size.height)
                self.histograCube.downScaleFactor = scale.float
            }
        }

        // Add an observer to the filter to process the filtering results
        //
        filter.addDestinationObserver { (destination) -&amp;gt; Void in

            // pass the image to the histogram indicator
            self.histogramView.source = destination

            // pass the result to the cubic histogram analyzer
            self.histograCube.source = destination
        }

        //
        // The results of updating the analyzer calculation are displayed in the color list window
        //
        histograCube.addUpdateObserver { (histogram) -&amp;gt; Void in
            self.asyncChanges({ () -&amp;gt; Void in
                self.paletteView.colorList = self.paletteSolver.colors
            })
        }

        view.addSubview(imageView)

....

        IMPDocument.sharedInstance.addDocumentObserver { (file, type) -&amp;gt; Void in
            if type == .Image {
                do{
                    //
                    // Load the file and associate it with the filter source
                    //
                    self.imageView.source = try IMPImageProvider(context: self.imageView.context, file: file)
                    self.asyncChanges({ () -&amp;gt; Void in
                        self.zoomFit()
                    })
                }
                catch let error as NSError {
                    self.asyncChanges({ () -&amp;gt; Void in
                        let alert = NSAlert(error: error)
                        alert.runModal()
                    })
                }
            }
            else if type == .LUT {
                do {

                    //
                    // Initialize the CLUT descriptor
                    //
                    var description = IMPImageProvider.LutDescription()
                    //
                    // Load CLUT
                    //
                    let lutProvider = try IMPImageProvider(context: self.context, cubeFile: file, description: &amp;amp;description)

                    if let lut = self.lutFilter{
                        //
                        // If a CLUT filter has been added, update its LUT table from the file with the received descriptor
                        //
                        lut.update(lutProvider, description:description)
                    }
                    else{
                        //
                        // Create a new LUT filter
                        //
                        self.lutFilter = IMPLutFilter(context: self.context, lut: lutProvider, description: description)
                    }

                    //
                    // Add a LUT filter, if this filter has already been added nothing happens
                    //
                    self.filter.addFilter(self.lutFilter!)
                }
                catch let error as NSError {
                    self.asyncChanges({ () -&amp;gt; Void in
                        let alert = NSAlert(error: error)
                        alert.runModal()
                    })
                }
            }
        }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, photo processing is becoming simpler and more accessible, making it possible for virtually anyone to master the process.&lt;/p&gt;

&lt;p&gt;The entire project can be downloaded, built, and tested from the ImageMetalling project &lt;a href="https://github.com/dnevera/ImageMetalling"&gt;repository&lt;/a&gt;: &lt;a href="https://github.com/dnevera/ImageMetalling/tree/master/ImageMetalling-08"&gt;ImageMetalling-08&lt;/a&gt;. For a successful build, the &lt;a href="https://sourceforge.net/projects/libjpeg-turbo"&gt;libjpeg-turbo&lt;/a&gt; library for working with JPEG (JFIF) files must be installed locally.&lt;/p&gt;

&lt;p&gt;Currently, it is the fastest widely used implementation of support for this format.&lt;/p&gt;

</description>
      <category>imageprocessing</category>
      <category>plugins</category>
      <category>appdevelopment</category>
      <category>programming</category>
    </item>
    <item>
      <title>The Easy Way to Develop Your Own Apple Metal Plugin and Integrate It into Davinci Resolve</title>
      <dc:creator>Denis Svinarchuk</dc:creator>
      <pubDate>Wed, 06 Mar 2024 10:26:08 +0000</pubDate>
      <link>https://dev.to/denissvinarchuk/the-easy-way-to-develop-your-own-apple-metal-plugin-and-integrate-it-into-davinci-resolve-k60</link>
      <guid>https://dev.to/denissvinarchuk/the-easy-way-to-develop-your-own-apple-metal-plugin-and-integrate-it-into-davinci-resolve-k60</guid>
      <description>&lt;p&gt;OFX, aka &lt;a href="http://openeffects.org/" rel="noopener noreferrer"&gt;OFX Image Processing API&lt;/a&gt;, is an open standard for creating 2D visual effects and video compositing. It operates in a plugin-like application development model. Essentially, it serves as both a Host - an application providing a set of methods, and a Plug-in - an application or module implementing this set. This configuration offers the potential for unlimited expansion of the host application's functionality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DaVinci Resolve and Metal&lt;/strong&gt;&lt;br&gt;
Applications such as Final Cut X and DaVinci Resolve Studio, starting from version 16, fully support Apple Metal pipelines. Similar to &lt;a href="https://en.wikipedia.org/wiki/OpenCL" rel="noopener noreferrer"&gt;OpenCL&lt;/a&gt; and Cuda, in the case of OFX, you can obtain a descriptor or handler of a platform-specific command queue. The host system also takes responsibility for allocating a pool of such queues and balancing calculations on them. Moreover, it places the source and target image clip data in GPU memory, significantly simplifying the development of extensible functionality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;OFX Version Support in Resolve&lt;/strong&gt;&lt;br&gt;
With Resolve, things are slightly more complicated. DaVinci announces support for OFX v1.4, albeit with some limitations. Specifically, some methods for working with interface functions are not available for use. To determine which method is available, OFX allows you to examine the supported suite through key/value queries. Publishing methods in the plugin code is based on &lt;a href="https://github.com/ofxa/openfx" rel="noopener noreferrer"&gt;C calls&lt;/a&gt;. But we will use the OpenFXS C++ shell adapted for C++17. For convenience, I've compiled everything into one repository: &lt;a href="https://github.com/dehancer/dehancer-external" rel="noopener noreferrer"&gt;dehancer-external&lt;/a&gt; taken from the open source &lt;a href="https://www.dehancer.com/" rel="noopener noreferrer"&gt;Dehancer project&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;OFXS Concept&lt;/strong&gt;&lt;br&gt;
In this project, I will be using OpenFXS, a C++ extension to OpenFX that was originally written by &lt;a href="https://en.wikipedia.org/wiki/The_Foundry_Visionmongers" rel="noopener noreferrer"&gt;Bruno Nicoletti&lt;/a&gt; and has become popular over time in commercial and open-source video processing projects. The original &lt;a href="https://github.com/ofxa/openfx/tree/master/Support" rel="noopener noreferrer"&gt;OpenFXS&lt;/a&gt; was not adapted to modern C++ dialects, so I updated it to make it compatible with &lt;a href="https://github.com/dehancer/dehancer-external" rel="noopener noreferrer"&gt;C++17&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;OFX, and consequently OFXS, is a standalone software module that the host program loads dynamically: essentially a dynamic library loaded when the main application starts. Like OFX, OpenFXS must publish method signatures, which is done through a single exported C method. To start developing with OpenFXS, you need to become familiar with a few common classes used to build new functionality; in a new project, you typically inherit from these classes and implement or override some virtual methods. To create your own plugin for the host system, let's start with the following public classes and one method:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;OFX::PluginFactoryHelper&lt;/strong&gt; is a basic template for creating a plugin's data structure suite and control panel (although it can be left empty). The inherited class creates a singleton object that registers a set of parameters and presets in the host system, with which the developer registers his module;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::ParamSetDescriptor&lt;/strong&gt; - base container class for creating and storing structure properties;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::ImageEffectDescriptor&lt;/strong&gt; - a container of properties used when manipulating graphic data when calling data processing procedures. Used by the host application to save the context of processing parameters in the internal database and work with plugin properties defined for each of its instances;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::ParamSet&lt;/strong&gt; - a set of settings that allows you to manipulate the registered data structure;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::ImageEffect&lt;/strong&gt; - a set of settings for effects on graphical data, inherited from OFX::ParamSet;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::MultiThread::Processor&lt;/strong&gt; - in the child class, it is necessary to implement data stream processing: images or videos;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OFX::Plugin::getPluginIDs&lt;/strong&gt; - method of registering a plugin (factory) in the host application.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;False Color&lt;/strong&gt;&lt;br&gt;
One feature that distinguishes the process of shooting video from simply capturing an image in a photo is the dynamic change of scenes and lighting of both scenes as a whole and areas in the image. This determines the way exposure is controlled during the shooting process. In digital video, there is a control monitor mode for operators in which the exposure level of areas is mapped into a limited set of zones, each tinted with its own color. This mode is sometimes called "predator" or False Color mode. The scales are usually referenced to the IRE scale. Such a monitor allows you to see the exposure zones and avoid significant mistakes when setting camera shooting parameters. Something similar in meaning is used when exposing in photography - &lt;a href="https://en.wikipedia.org/wiki/Zone_System" rel="noopener noreferrer"&gt;zoning&lt;/a&gt; according to &lt;a href="https://en.wikipedia.org/wiki/Ansel_Adams" rel="noopener noreferrer"&gt;Adams&lt;/a&gt;, for example. You can measure a specific target with an exposure meter and see in which zone it is located, and in real time we see the zones, neatly tinted for ease of perception. The number of zones is determined by the objectives and capabilities of the control monitor. For instance, a monitor used with &lt;a href="https://en.wikipedia.org/wiki/Arri_Alexa" rel="noopener noreferrer"&gt;Arri Alexa&lt;/a&gt; cameras can incorporate up to 6 zones.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4us8xat7vr7c9yiop31h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4us8xat7vr7c9yiop31h.png" alt="Software “predator” version with 16 zones"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Adding extensions&lt;/strong&gt;&lt;br&gt;
Before proceeding with the example, we need to add a few simple proxy classes that let OpenFXS serve as a platform for handing source data, such as Metal textures, to our processing code. These classes include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::Image&lt;/strong&gt;: A proxy class for OFX clip data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::Image2Texture&lt;/strong&gt;: A functor for transferring data from the clip buffer into a Metal texture. DaVinci can hand the plugin a buffer with any structure and packing of image channel values, and the result must be returned in a similar form. To simplify work with the stream format in OFX, you can ask the host to prepare data of a specific type in advance. I will use floats packed as RGBA - red/green/blue/alpha.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::ImageFromTexture&lt;/strong&gt;: A reverse functor for transferring the stream back into the host system buffer. As you can see, there is room for significant optimization here if you teach the Metal compute kernels to work directly with the buffer rather than with an intermediate texture.&lt;/li&gt;
&lt;/ul&gt;
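&lt;p&gt;To make the role of the transfer functor concrete, here is a CPU-only stand-in for what &lt;strong&gt;imetalling::Image2Texture&lt;/strong&gt; does (a sketch under stated assumptions: &lt;code&gt;FakeTexture&lt;/code&gt; is a plain struct of mine, not a real MTLTexture, and the real functor issues a GPU upload instead of a memcpy):&lt;/p&gt;

```cpp
#include <cstddef>
#include <cstring>
#include <vector>

// Models a Metal texture's storage for illustration only.
struct FakeTexture {
    std::size_t width = 0, height = 0;
    std::vector<float> pixels; // RGBA floats, row-major
};

// The OFX host hands the plugin a tightly packed RGBA float buffer;
// we copy it into the texture stand-in. The real Image2Texture would
// upload the rows to the GPU via the Metal command queue instead.
static void imageToTexture(const float* rgba, std::size_t width,
                           std::size_t height, FakeTexture& out) {
    out.width = width;
    out.height = height;
    out.pixels.resize(width * height * 4);
    // For a tightly packed buffer, one copy models the whole transfer.
    std::memcpy(out.pixels.data(), rgba, out.pixels.size() * sizeof(float));
}
```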

&lt;p&gt;We inherit the OFXS base classes and write our functionality without going into the details of how the Metal core works:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::falsecolor::Processor&lt;/strong&gt;: Here we implement the stream transformation and initiate the processing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::falsecolor::Factory&lt;/strong&gt;: This will be our specific part of the suite description for the plugin. We need to implement several mandatory calls related to setting up the structure and create an instance of the OFX::ImageEffect class with specific functionality, which we divide into two subclasses in the implementation: Interaction and Plugin.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::falsecolor::Interaction&lt;/strong&gt;: Implementation of the interactive part of working with effects. Essentially, this is the implementation of only virtual methods from OFX::ImageEffect related to processing changes in plugin parameters.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::falsecolor::Plugin&lt;/strong&gt;: Implementation of the stream rendering, that is, launching imetalling::falsecolor::Processor.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Additionally, we will need several utility classes built on top of Metal to logically separate the host code and the kernel code on MSL. These include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::Function&lt;/strong&gt;: A base class that encapsulates work with the Metal command queue. Its main parameter is the name of the kernel in the MSL code; the class itself acts as the executor of the kernel call.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::Kernel&lt;/strong&gt;: A general class for transforming a source texture into a target texture, extending Function simply to set the parameters for calling the MSL kernel.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::PassKernel&lt;/strong&gt;: A pass-through (bypass) kernel that copies the source texture to the target unchanged.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;imetalling::FalseColorKernel&lt;/strong&gt;: Our main functional class, a "predator" emulator that posterizes (quantizes) the image down to a specified number of colors.&lt;/li&gt;
&lt;/ul&gt;
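&lt;p&gt;The layering above can be sketched structurally. This is a skeleton only, with no real Metal calls; the actual classes also carry the command queue and the textures, and the MSL entry-point name for the pass-through kernel is my invention:&lt;/p&gt;

```cpp
#include <string>
#include <utility>

// Function knows only which MSL kernel it represents and acts as the
// executor of the kernel call in the real code.
class Function {
public:
    explicit Function(std::string kernelName) : kernelName_(std::move(kernelName)) {}
    virtual ~Function() = default;
    const std::string &kernelName() const { return kernelName_; }
private:
    std::string kernelName_;
};

// Kernel adds the source-texture-to-target-texture transformation.
class Kernel : public Function {
public:
    using Function::Function;
    // In the real classes this encodes the MSL kernel onto the command queue.
    virtual void process() {}
};

// Concrete kernels pick the MSL entry point.
class PassKernel : public Kernel {
public:
    PassKernel() : Kernel("kernel_passthrough") {} // illustrative name
};

class FalseColorKernel : public Kernel {
public:
    FalseColorKernel() : Kernel("kernel_falseColor") {}
};
```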

&lt;p&gt;The kernel code for the "predator" mode could look like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

static constant float3 kIMP_Y_YUV_factor = {0.2125, 0.7154, 0.0721};
constexpr sampler baseSampler(address::clamp_to_edge, filter::linear, coord::normalized);
 
inline float when_eq(float x, float y) {
  return 1.0 - abs(sign(x - y));
}
 
static inline float4 sampledColor(
        texture2d&amp;lt;float, access::sample&amp;gt; inTexture,
        texture2d&amp;lt;float, access::write&amp;gt; outTexture,
        uint2 gid
){
  float w = outTexture.get_width();
  return mix(inTexture.sample(baseSampler, float2(gid) * float2(1.0/(w-1.0), 1.0/float(outTexture.get_height()-1))),
             inTexture.read(gid),
             when_eq(inTexture.get_width(), w) // when widths are equal, read the exact texture color
  );
}
 
kernel void kernel_falseColor(
        texture2d&amp;lt;float, access::sample&amp;gt; inTexture [[texture(0)]],
        texture2d&amp;lt;float, access::write&amp;gt; outTexture [[texture(1)]],
        device float3* color_map [[ buffer(0) ]],
        constant uint&amp;amp; level [[ buffer(1) ]],
        uint2 gid [[thread_position_in_grid]])
{
  float4  inColor = sampledColor(inTexture,outTexture,gid);
  float luminance = dot(inColor.rgb, kIMP_Y_YUV_factor);
  uint      index = clamp(uint(luminance*(level-1)),uint(0),uint(level-1));
  float4    color = float4(1);
 
  if (index&amp;lt;level)
    color.rgb = color_map[index];
 
  outTexture.write(color,gid);
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
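&lt;p&gt;The zone index computed by the kernel is easy to unit-test off the GPU. A C++ reference for the same index math (the function name is mine; the luma weights and the clamp into [0, level-1] match the MSL above):&lt;/p&gt;

```cpp
#include <algorithm>
#include <cstdint>

// CPU reference for the index computation in kernel_falseColor:
// luminance from the same Y'UV-style weights, scaled to the zone count
// and clamped, exactly as the MSL kernel does before the color_map lookup.
static std::uint32_t falseColorIndex(float r, float g, float b,
                                     std::uint32_t level) {
    const float luminance = 0.2125f * r + 0.7154f * g + 0.0721f * b;
    const float scaled = std::max(0.0f, luminance * float(level - 1));
    return std::min(static_cast<std::uint32_t>(scaled), level - 1);
}
```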

&lt;p&gt;&lt;strong&gt;Initialization of the OFX Plugin&lt;/strong&gt;&lt;br&gt;
We will begin by defining the class &lt;code&gt;imetalling::falsecolor::Factory&lt;/code&gt;. In this class we will register a single parameter - the status of the monitor (on or off) - which is all our example needs.&lt;/p&gt;

&lt;p&gt;We will inherit from &lt;code&gt;OFX::PluginFactoryHelper&lt;/code&gt; and overload five methods:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;load()&lt;/strong&gt;: This method is invoked to globally configure the instance when the plugin is first loaded. Overloading this method is optional.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;unload()&lt;/strong&gt;: This method is invoked when the plugin is unloaded, for example to free memory. Overloading this method is also optional.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;describe(ImageEffectDescriptor&amp;amp;)&lt;/strong&gt;: This is the second method that the OFX host calls when the plugin is loaded. It is virtual and must be defined in our class. In this method, we need to set all the properties of the plugin, regardless of its context type. For more details about the properties, refer to the &lt;code&gt;ImageEffectDescriptor&lt;/code&gt; code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;describeInContext(ImageEffectDescriptor&amp;amp;,ContextEnum)&lt;/strong&gt;: Similar to the &lt;code&gt;describe&lt;/code&gt; method, this method is also called when the plugin is loaded and must be defined in our class. It should define properties associated with the current context. The context determines the type of operations the application works with, such as filter, paint, transition effect, or frame retimer in a clip.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;createInstance(OfxImageEffectHandle, ContextEnum)&lt;/strong&gt;: This is the most crucial method that we overload. It returns a pointer to an object of type &lt;code&gt;ImageEffect&lt;/code&gt; - in other words, our &lt;code&gt;imetalling::falsecolor::Plugin&lt;/code&gt;, in which we define all the functionality, both handling user events in the host program and rendering (transforming) the source frame into the target one:&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

OFX::ImageEffect *Factory::createInstance(OfxImageEffectHandle handle, OFX::ContextEnum) {
  return new Plugin(handle);
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Handling Events&lt;/strong&gt;&lt;br&gt;
At this stage, if you compile a bundle with the OFX module, the plugin will already be available in the host application, and in DaVinci, it can be loaded onto the correction node. However, to work fully with a plugin instance, you need to define at least the interactive part and the part associated with processing the incoming video stream. To do this, we inherit from the &lt;strong&gt;OFX::ImageEffect&lt;/strong&gt; class and overload virtual methods:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;changedParam(const OFX::InstanceChangedArgs&amp;amp;, const std::string&amp;amp;)&lt;/strong&gt; - This method lets us define the logic for handling a parameter-change event. The event type is determined by the value of OFX::InstanceChangedArgs::reason and can be eChangeUserEdit, eChangePluginEdit, or eChangeTime: the property was edited by the user, changed by a plugin or the host application, or changed as a result of a move along the timeline. The second argument is the string name we assigned at the plugin initialization stage; in our case there is a single parameter: &lt;strong&gt;&lt;a href="https://github.com/dnevera/ImageMetalling/blob/master/ImageMetalling-17/ofx/plugin/Names.h" rel="noopener noreferrer"&gt;false_color_enabled_check_box&lt;/a&gt;&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;isIdentity(...)&lt;/strong&gt; - This method lets us tell the host whether the effect currently changes the image at all. Returning true means the output is identical to the source, so the host can skip rendering; this is a way to optimize and avoid unnecessary calculations.&lt;/li&gt;
&lt;/ul&gt;
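&lt;p&gt;A hypothetical sketch of this event handling (all names here are mine, and the logic is simplified: the real code re-reads the parameter value from the host rather than toggling a flag):&lt;/p&gt;

```cpp
#include <string>

// Stand-in for OFX::InstanceChangedArgs::reason.
enum class Reason { UserEdit, PluginEdit, Time };

struct InteractionSketch {
    bool enabled_ = false;

    // React only to edits of our single checkbox; timeline moves are ignored.
    void changedParam(Reason reason, const std::string &name) {
        if (reason == Reason::Time)
            return; // a move along the timeline did not edit our parameter
        if (name == "false_color_enabled_check_box")
            enabled_ = !enabled_; // simplified: the real code fetches the value
    }

    // This plugin always renders (PassKernel when disabled), so it never
    // declares itself an identity; a stricter plugin could return !enabled_.
    bool isIdentity() const { return false; }
};
```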

&lt;p&gt;You can read the implementation of the interactive part in the &lt;strong&gt;&lt;a href="https://github.com/dnevera/ImageMetalling/blob/master/ImageMetalling-17/ofx/plugin/Interaction.cpp" rel="noopener noreferrer"&gt;Interaction.cpp&lt;/a&gt;&lt;/strong&gt; code. As you can see, we receive pointers to the clips: the source clip and the memory area into which we will put the target transformation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implementation of Rendering Launch&lt;/strong&gt;&lt;br&gt;
We will add another logical layer on which we will define all the logic for launching the transformation. In our case, this is the only method for overriding so far:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;render(const OFX::RenderArguments&amp;amp; args)&lt;/strong&gt; - Here, you can find out the properties of the clips and decide how to render them. Also, at this stage, the Metal command queue and some useful attributes associated with the current timeline properties become available to us.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Processing&lt;/strong&gt;&lt;br&gt;
At the launch stage, an object with useful properties becomes available to us: at a minimum, a pointer to the video stream (more precisely, a memory area with the frame's image data) and, most importantly, a Metal command queue. Now we can construct a generic class that brings us closer to a simple way of reusing kernel code. The OpenFXS extension already has such a class, OFX::ImageProcessor; we just need to subclass it. Its constructor takes an OFX::ImageEffect parameter, through which we receive not only the current state of the plugin parameters but everything necessary for working with the GPU. At this stage we only need to override the processImagesMetal() method and launch the processing of the kernels already implemented in Metal.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

Processor::Processor(
            OFX::ImageEffect *instance,
            OFX::Clip *source,
            OFX::Clip *destination,
            const OFX::RenderArguments &amp;amp;args,
            bool enabled
    ) :
            OFX::ImageProcessor(*instance),
            enabled_(enabled),
            interaction_(instance),
            wait_command_queue_(false),
            /// grab the current frame of a clip from OFX host memory
            source_(source-&amp;gt;fetchImage(args.time)),
            /// create a target frame of a clip with the memory area already specified in OFX
            destination_(destination-&amp;gt;fetchImage(args.time)),
            source_container_(nullptr),
            destination_container_(nullptr)
    {
 
      /// Set OFX rendering arguments to GPU
      setGPURenderArgs(args);
 
      /// Set render window
      setRenderWindow(args.renderWindow);
 
      /// Place source frame data in Metal texture
      source_container_ = std::make_unique&amp;lt;imetalling::Image2Texture&amp;gt;(_pMetalCmdQ, source_);
 
      /// Create empty target frame texture in Metal
      destination_container_ = std::make_unique&amp;lt;imetalling::Image2Texture&amp;gt;(_pMetalCmdQ, destination_);
 
      /// Get parameters for packing data in the memory area of the target frame
      OFX::BitDepthEnum dstBitDepth = destination-&amp;gt;getPixelDepth();
      OFX::PixelComponentEnum dstComponents = destination-&amp;gt;getPixelComponents();
 
      /// and original
      OFX::BitDepthEnum srcBitDepth = source-&amp;gt;getPixelDepth();
      OFX::PixelComponentEnum srcComponents = source-&amp;gt;getPixelComponents();
 
      /// show a message to the host system that something went wrong
      /// and cancel rendering of the current frame
      if ((srcBitDepth != dstBitDepth) || (srcComponents != dstComponents)) {
        OFX::throwSuiteStatusException(kOfxStatErrValue);
      }
 
      /// set in the current processor context a pointer to the memory area of the target frame
      setDstImg(destination_.get_ofx_image());
    }
 
    void Processor::processImagesMetal() {
 
      try {
 
        if (enabled_)
          FalseColorKernel(_pMetalCmdQ,
                           source_container_-&amp;gt;get_texture(),
                           destination_container_-&amp;gt;get_texture()).process();
        else
          PassKernel(_pMetalCmdQ,
                           source_container_-&amp;gt;get_texture(),
                           destination_container_-&amp;gt;get_texture()).process();

        ImageFromTexture(_pMetalCmdQ,
                         destination_,
                         destination_container_-&amp;gt;get_texture(),
                         wait_command_queue_);
 
      }
      catch (std::exception &amp;amp;e) {
        interaction_-&amp;gt;sendMessage(OFX::Message::eMessageError, "#message0", e.what());
      }
    }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Building the project&lt;/strong&gt;&lt;br&gt;
To build the project you will need CMake version 3.15 or later, as well as Qt 5.13, which makes it easy to assemble the bundle with the plugin installer for the system directory. To run cmake, first create a build directory; from inside it you can execute the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

cmake -DPRINT_DEBUG=ON -DQT_INSTALLER_PREFIX=/Users/&amp;lt;user&amp;gt;/Develop/QtInstaller -DCMAKE_PREFIX_PATH=/Users/&amp;lt;user&amp;gt;/Develop/Qt/5.13.0/clang_64/lib/cmake -DPLUGIN_INSTALLER_DIR=/Users/&amp;lt;user&amp;gt;/Desktop -DCMAKE_INSTALL_PREFIX=/Library/OFX/Plugins .. &amp;amp;&amp;amp; make install


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
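&lt;p&gt;Note that the command above is meant to be run from inside the build directory, and paths such as /Users/&amp;lt;user&amp;gt;/... are placeholders for your own. Creating that directory from the project root looks like:&lt;/p&gt;

```shell
# From the project root: create the out-of-source build directory that
# the "cmake .." invocation above expects, then enter it.
mkdir -p build
cd build
```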

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5gnfbi3zar7z77jhx3d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5gnfbi3zar7z77jhx3d.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Afterwards, the installer, called IMFalseColorOfxInstaller.app, will appear in the directory that you specified in the PLUGIN_INSTALLER_DIR parameter. Let's go ahead and launch it! Once the installation is successful, you can start DaVinci Resolve and begin using our new plugin. You can find and select it in the OpenFX panel on the color correction page, and add it as a node.&lt;/p&gt;

&lt;p&gt;An example of False Color work:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2jdak56x87jsmnurkbmy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2jdak56x87jsmnurkbmy.png" alt="example of False Color work"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;External links:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/dnevera/ImageMetalling/tree/master/ImageMetalling-17" rel="noopener noreferrer"&gt;False Color OFX Plugin Code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://openeffects.org/" rel="noopener noreferrer"&gt;The Open Effects Association&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.blackmagicdesign.com/products/davinciresolve" rel="noopener noreferrer"&gt;Download DaVinci Resolve&lt;/a&gt; - OFX header file version and OFXS library code under Resolve + examples&lt;/p&gt;

</description>
      <category>appdevelopment</category>
      <category>plugins</category>
      <category>imageprocessing</category>
      <category>programming</category>
    </item>
  </channel>
</rss>
